Programmer/designer Matthew Butterick filed the case Thursday in San Francisco, saying it was brought on behalf of millions of GitHub users potentially affected by the $10-a-month Copilot service. The lawsuit challenges the legality of GitHub Copilot, as well as OpenAI Codex, the model that powers the AI tool, and has been filed against GitHub, its owner Microsoft, and OpenAI…. “By training their AI systems on public GitHub repositories (though based on their public statements, possibly much more), we contend that the defendants have violated the legal rights of a vast number of creators who posted code or other work under certain open-source licences on GitHub,” said Butterick.
These include a set of 11 popular open-source licences that all require attribution of the author’s name and copyright, among them the MIT licence, the GNU General Public Licence, and the Apache licence. The case claimed that Copilot violates and strips away these licences offered by thousands, possibly millions, of software developers, and is therefore committing software piracy on an unprecedented scale.
Copilot, which runs entirely on Microsoft Azure, often simply reproduces code that can be traced back to open-source repositories or licensees, according to the lawsuit. The code never contains attribution to the underlying authors, in violation of those licences. “It is not fair, permitted, or justified. On the contrary, Copilot’s goal is to replace a huge swath of open source by taking it and keeping it inside a GitHub-controlled paywall….” The case further stated that the defendants have violated GitHub’s own terms of service and privacy policies, Section 1202 of the DMCA, which forbids the removal of copyright-management information, and the California Consumer Privacy Act.
The lawsuit also accuses GitHub of monetizing code from open source programmers, “despite GitHub’s pledge never to do so.”
And Butterick told IT Pro that “AI systems are not exempt from the law… If companies like Microsoft, GitHub, and OpenAI choose to disregard the law, they should not expect that we the public will sit still.” Butterick believes AI can only elevate humanity if it’s “fair and ethical for everyone. If it’s not… it will just become another way for the privileged few to profit from the work of the many.”
Reached for comment, GitHub pointed IT Pro to its announcement Monday that, starting next year, suggested code fragments will be able to identify when they match, or closely resemble, other publicly available code.
The article adds that this lawsuit “comes at a time when Microsoft is looking at developing Copilot technology for use in similar programmes for other job categories, like office work, cyber security, or video game design, according to a Bloomberg report.”