Artificial Intelligence and Ownership
Doghouse – AI
If an artificial intelligence produces something new, who owns the new creation?
Some free software people do not believe in intellectual property and copyrights. I am not one of them. I do believe that people have the right to say what happens to their ideas and work, whether those are licensed as free and open source or whether they are closed and proprietary.
As such, I do not "spit on" people who decide to close their code and sell it, but I do believe that the best way of producing code for the end user is the free software model, which gives the end user the ability to maintain their system for as long as it is feasible.
Recently there have been more and more people asking me about the effects of artificial intelligence (AI) on the programming job market. They ask me if I think that AI will take over and put programmers out of work. My answer might not be popular, but if you take AI to its ultimate end, the answer must be "yes."
I have been hearing about "artificial intelligence" since the 1950s, from science fiction books like I, Robot to movies and TV shows like Star Trek: The Next Generation (STNG), with its android, Mr. Data. I have seen computers become faster, logically larger, physically smaller, and more complex. I have seen more people work on and produce what they consider artificial intelligence, and I am sure that someday we will find the algorithm that allows the computer we call the human brain to learn and gain knowledge, and apply that to inorganic intelligence (what I prefer to call AI).
It is inevitable.
However, we have to think about what happens when this artificially intelligent artificial human (yes, there will probably first be AI dogs and AI birds) creates something new. Who owns that new thing? The artificial human? The "owner" of the artificial human? And if the artificial human is owned, is that slavery? Many of the same questions were asked, and somewhat answered, with regard to the android Data on STNG, as well as in many science fiction stories dating back to the 1950s.
But we may have a crisis a lot sooner, even without an artificial human.
Microsoft's Copilot, supposedly AI software, has been trained on FOSS software that is both under copyright and under software licenses. The authors of this FOSS software probably did not anticipate or license the use of their software for training an AI, nor did they consider that some AI "mind" would use their software to generate its own code. This is causing consternation among some FOSS developers regarding attribution.
The creation of new and unique code, by itself, should not cause many problems, because human programmers also look at existing code, learn from it, and then write new code based on that knowledge. Students have been doing this for decades, but we also teach students about plagiarism and how to "sandbox" what they have read so they do not copy the code verbatim.
One issue, with both flesh-and-blood and inorganic intelligence, arises when the output is exactly (or very, very close to) what was first written, without the attribution requested by many licenses. In many places, this is known as plagiarism, and it could be a violation of copyright law unless the code is properly licensed and attributed.
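To make the verbatim-copy problem concrete, here is a minimal sketch (in Python, with invented code snippets) of how a tool might flag generated output that matches existing code up to comments and whitespace. The function names and regular expressions are illustrative assumptions, not part of any real product:

```python
import hashlib
import re

def normalize(code: str) -> str:
    """Strip Python-style comments and collapse whitespace so that
    trivially reformatted copies reduce to the same text."""
    code = re.sub(r'#.*', '', code)   # drop comments (naive: ignores '#' in strings)
    code = re.sub(r'\s+', ' ', code)  # collapse all runs of whitespace
    return code.strip()

def fingerprint(code: str) -> str:
    """Return a stable fingerprint of the normalized code."""
    return hashlib.sha256(normalize(code).encode()).hexdigest()

original  = "def add(a, b):\n    return a + b  # sum two values\n"
generated = "def add(a, b):\n    return a + b\n"

# The two snippets differ only in comments and whitespace,
# so they produce the same fingerprint and would be flagged.
print(fingerprint(original) == fingerprint(generated))  # True
```

Real duplication checkers go further, comparing token sequences or syntax trees to catch renamed variables, but even a crude fingerprint like this illustrates that near-verbatim copying is mechanically detectable.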
The user of Microsoft's Copilot, which was trained on FOSS source code, may not even realize that the code Copilot outputs is an exact duplicate of a FOSS program, and the AI program might not even be "aware" that it created that exact copy. Therefore, in a court of law, when the original copyright holder brings a copyright infringement claim against the holder of the duplicate code, how does the Copilot user prove that it was an innocent copy, and what happens to that copied code? If Copilot is true AI, then even running Copilot with the same commands and the same input might not produce the same output, making it difficult to prove that Copilot generated the code in question.
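The last point, that identical input can yield different output, follows from how generative models typically sample their next word or token from a probability distribution rather than always picking the most likely one. A toy sketch, where the tokens, probabilities, and temperature value are all invented for illustration:

```python
import random

# Hypothetical next-token distribution from a code-generating model;
# these tokens and probabilities are made up for illustration.
next_token_probs = {"return": 0.5, "yield": 0.3, "print": 0.2}

def sample_token(temperature: float) -> str:
    """Sample one token; any temperature > 0 makes the choice stochastic."""
    tokens = list(next_token_probs)
    # Higher temperature flattens the distribution; lower sharpens it.
    weights = [p ** (1.0 / temperature) for p in next_token_probs.values()]
    return random.choices(tokens, weights=weights, k=1)[0]

# Two runs with identical "input" (the same distribution) can disagree.
run1 = [sample_token(0.8) for _ in range(5)]
run2 = [sample_token(0.8) for _ in range(5)]
print(run1)
print(run2)
```

Because each run draws fresh random samples, reproducing a disputed piece of generated code on demand may simply not be possible, which is exactly the evidentiary problem described above.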
Does the AI system have access to all relevant patents? What happens when the AI system inserts a patented algorithm without knowing it? Of course this could happen with a human coder too, but checking for this should be built into something like Microsoft's Copilot or any other AI "creative" system.
A person by the name of Matthew Butterick has been asking these questions, and many more [1], and it may behoove us to think about companies inserting these types of tools into platforms (such as GitHub) that FOSS developers use all the time. It is not necessarily bad for developers to use these tools, but there should be some discussion and understanding of the legality and impact of using them.
Infos
- Matthew Butterick on Copilot: https://githubcopilotinvestigation.com/