
GitHub Copilot Fears Copyright Lawsuits Among Developers


With the introduction of GitHub Copilot, Microsoft wanted to make life easier for developers on its programming platform. Instead, developers increasingly fear being confronted with copyright lawsuits after using the tool.

The idea sounds good: Microsoft added an AI component to the code-completion function. It can make significantly more extensive suggestions for completing code and thus take over more of the tedious everyday tasks. The AI is trained on a large amount of freely accessible source code, found both on GitHub itself and elsewhere on the web.

But one point was apparently not sufficiently considered: open-source software is also subject to copyright protection. Now that cases have emerged in which the AI inserted passages into developers' code that it simply copied from other programmers, the first lawyers are preparing to take on such cases, as the British magazine The Register reports.

For example, Matthew Butterick, who is both a developer and a lawyer himself, is considering a lawsuit. According to reports, he is currently examining two lines of attack: Has GitHub Copilot illegally trained on open-source code that was not approved for it? And does the tool improperly inject someone else’s copyrighted work—pulled from the training data—into the new code?

The First Cases Have Emerged

Butterick criticized Copilot when the feature launched. In June, he published a post arguing that “any code generated by Copilot may have a license or copyright infringement lurking in it,” and even then recommended against using the feature. The organization Software Freedom Conservancy (SFC) also announced at the time that it would stop using GitHub so as not to expose itself to such risks.

It is now apparent that these fears were justified. Tim Davis, a professor of computer science and engineering at Texas A&M University, reported that Copilot was reproducing his copyrighted code for transposing a sparse matrix. However, he decided to contact Microsoft directly first rather than go to court.
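To give a sense of the kind of routine at issue: transposing a sparse matrix stored in compressed sparse row (CSR) form is a classic, well-known algorithm based on counting entries per column and scattering them into place. The sketch below is a generic illustration of that technique, not Davis's actual code or the output Copilot produced.

```python
def transpose_csr(n_rows, n_cols, row_ptr, col_idx, vals):
    """Transpose a sparse matrix stored in CSR form.

    Returns (t_row_ptr, t_col_idx, t_vals): the transpose,
    again in CSR form, with n_cols rows. Illustrative sketch only.
    """
    nnz = len(vals)

    # Count the entries in each column of the original matrix;
    # these become the row lengths of the transpose.
    t_row_ptr = [0] * (n_cols + 1)
    for j in col_idx:
        t_row_ptr[j + 1] += 1

    # Prefix sum turns the counts into row pointers.
    for j in range(n_cols):
        t_row_ptr[j + 1] += t_row_ptr[j]

    # Scatter each entry into its slot in the transpose.
    t_col_idx = [0] * nnz
    t_vals = [0] * nnz
    next_slot = list(t_row_ptr[:-1])
    for i in range(n_rows):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            j = col_idx[k]
            dest = next_slot[j]
            t_col_idx[dest] = i
            t_vals[dest] = vals[k]
            next_slot[j] += 1

    return t_row_ptr, t_col_idx, t_vals
```

Because an implementation like this is short and largely dictated by the data structure, independently written versions can look very similar, which is exactly what makes allegations of verbatim reproduction notable.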

Microsoft and GitHub did not comment on the developments when asked. Redmond is well aware of the problem, however: the GitHub documentation warns that the generated code may contain “undesirable patterns” and places the responsibility for intellectual property infringement on the Copilot user.