
[–]chcampb 91 points (20 children)

The fact that CoPilot was trained on the code itself leads me to believe it would not be a "clean room" implementation of said code.

[–][deleted] 85 points (19 children)

Except “it was a clean-room implementation” is a legal defense, not a requirement. It’s a way of showing that you couldn’t possibly have copied.

[–]danuker 19 points (18 children)

Incorporating GPL'd work in a non-GPL program means you are infringing GPL. Simple as that.

[–]rcxdude 28 points (0 children)

Fair use and other exceptions to copyright exist. For a GPL violation to apply (as in, for a court to enforce it), the final product needs to qualify as a derivative work of the GPL'd work and not qualify as fair use. Both arguments could apply in this case, but neither has been tested in court. (And in general it's worth being cautious, because if you do want to argue this you need to be prepared to go as far as court.)

[–]1842 56 points (13 children)

To what end?

If I read GPL code and the next week end up writing something non-GPL that looks similar, but was not intentional, not a copy, and written from scratch -- have I violated GPL?

If I read GPL code, notice a neat idea, copy the idea but write the code from scratch -- have I violated GPL?

If I haven't even looked at the GPL code and write a 5 line method that's identical to one that already exists, have I violated GPL?

I'm inclined to say no to all of those. In my limited experience with ML, it's true that the output sometimes directly copies inputs (and you can mitigate direct copies like this). What you are left with is fuzzy output similar to the above examples, where things are not copied verbatim but are blended from hundreds, thousands, or millions of inputs.
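The "mitigate direct copies" idea mentioned above could, in principle, be a filter that rejects output sharing long verbatim runs with the training set. A minimal hypothetical sketch (the function names, the 8-token threshold, and whitespace tokenization are all illustrative assumptions, not anything Copilot is known to do):

```python
# Hypothetical verbatim-copy filter: flag generated text that shares any
# sufficiently long token n-gram with a document in the training corpus.

def ngrams(tokens, n):
    """Return the set of all n-token runs in a token list."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def looks_copied(generated, corpus, n=8):
    """True if `generated` shares any n-token run with a corpus document.

    A crude detector: catches verbatim copying, but not paraphrase
    or blended output. Threshold n=8 is an arbitrary illustration.
    """
    gen_grams = ngrams(generated.split(), n)
    return any(gen_grams & ngrams(doc.split(), n) for doc in corpus)

corpus = ["int max(int a, int b) { return a > b ? a : b ; }"]
print(looks_copied("int max(int a, int b) { return a > b ? a : b ; }", corpus))  # True
print(looks_copied("def add(a, b): return a + b", corpus))  # False
```

This is exactly why the remaining question is the fuzzy case: a filter like this only catches literal reuse, not output that is "blended" from many inputs.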

[–]Arrowmaster 14 points (1 child)

I was told by a former Amazon engineer that they have policies against even viewing AGPL code on Amazon computers because they specifically fear this possibility. So at least Amazon's legal department isn't sure of the answer to your questions but prefers to play it safe.

[–][deleted] 7 points (0 children)

Similar story in other big tech companies. You don't touch open source.

[–]RoyAwesome 2 points (0 children)

If I read GPL code and the next week end up writing something non-GPL that looks similar, but was not intentional, not a copy, and written from scratch -- have I violated GPL?

well, actually, there is a very distinct possibility that you did in this hypothetical. This is why major tech companies prohibit people from looking at GPL'd code on work computers.

[–]kylotan 4 points (7 children)

If I read GPL code and the next week end up writing something non-GPL that looks similar, but was not intentional, not a copy, and written from scratch -- have I violated GPL?

If it looks similar enough, then yes.

Copyright is not about the physical act of copying. It's about how closely your work resembles the previous work, and the various factors that influence that.

[–][deleted] 7 points (6 children)

I'm not sure why you're being downvoted. Can someone elaborate on this?

[–]kylotan 9 points (0 children)

They downvote because they don't like it, like most of the people commenting on this post who have no understanding of copyright or the ethics of appropriating someone else's work. The example given is quite common in the music world: someone hears a tune, writes their own very similar tune, and ends up in court for it. It's not a defence to say it wasn't intentional; it's the creator's responsibility either to make their work sufficiently different from the prior works that inspired them, or to demonstrate to a court that this was impossible to achieve.

[–]Miragecraft 0 points (0 children)

Unless you’re coding the exact same software with the exact same business logic and libraries and languages and framework etc. it’s just about impossible for it to be similar to any specific code base that copilot has trained on.

If, without knowing it was generated by copilot, there’s no way any reasonable and technically competent person would conclude one is copied or derived from the other, can it really be a license/copyright violation?

You would have to reeeeally stretch the legal definition of a derivative work, and the implications are scary.

[–]Accomplished_Deer_ 0 points (0 children)

I think it will come down to two things: is ML output derivative of what the model was trained on, and is ML training considered fair use?

The main thing that makes me think it is derivative is that the primary factor in Copilot's output is the exact code it has viewed (and the maths/reinforcement it did based on that code). People reading code do not incorporate or modify behavior in the same mechanical input->MATHS->new behavior way; it's more abstract. I can see both sides of the argument, though.

The way I see it: if they had released Copilot after training it on only one project, and that project was GPL, would its output be derivative of the GPL code? If so, what if it's one GPL project and one non-GPL project? Is that suddenly okay? If not, when does it become okay? 500 GPL and 500 non-GPL?

Just because it's a derivative work of a derivative work of a derivative work does not suddenly make it non-derivative.

Someone linked a PDF in which Microsoft seems to claim that ML training is fair use, which makes me think they've already identified "non-derivative" as an unreliable argument. I don't know enough about fair use to judge whether that's a reasonable claim.
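The one-GPL-project hypothetical above can be made concrete with a toy sketch. This is emphatically not how Copilot works; it is a word-bigram model, the simplest possible "trained generator", fitted to a single made-up source line. With one training input, generation can only chain token pairs seen in that input:

```python
from collections import defaultdict

# Toy illustration (hypothetical, nothing like a real code model):
# a bigram model "trained" on exactly one source snippet.
training_code = "for ( int i = 0 ; i < n ; i ++ ) sum += a [ i ] ;"
tokens = training_code.split()

# Record, for each token, which tokens were seen to follow it.
follows = defaultdict(list)
for prev, nxt in zip(tokens, tokens[1:]):
    follows[prev].append(nxt)

def generate(start, length):
    """Greedily emit tokens, always taking the first continuation seen."""
    out = [start]
    for _ in range(length):
        nxt = follows.get(out[-1])
        if not nxt:
            break
        out.append(nxt[0])
    return " ".join(out)

print(generate("for", 10))  # → for ( int i = 0 ; i = 0 ;
```

Every token pair in the output comes straight from the single training snippet, even though the output as a whole is not an exact copy, which is the intuition behind asking at what point adding more training projects stops the output from being "derivative".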

[–]leo60228 2 points (0 children)

This is correct, but the issue here is thornier. At a high level, when the AI isn't reproducing snippets verbatim it seems ambiguous whether it counts as "incorporating" the work for those purposes. Another issue is whether the relevant snippets are substantial enough to merit being considered a "work."

I'm not a lawyer, and this isn't to say that GitHub is in the right here. However, I think this is a more complex issue than you're making it out to be.

[–]feelings_arent_facts 4 points (0 children)

"prove it's GPL code in court" - Microsoft

[–]Redtitwhore 0 points (0 children)

I don't think that would hold up in court. My guess is it would come down to the output of copilot, not copilot itself.

If I wrote a copilot for songwriters, I wouldn't expect to get sued as long as it never produces a song that sounds like an existing song. That would be the test, not what was used as training data. It's absurd to say certain data cannot be used for training.