Now we are facing unprecedented growth of AI as a whole. Do you think it is time for the FSF to draft a new version of the GPL that addresses the challenges AI poses to software development, to keep protecting users' freedom?

  • John Colagioia
    11 months ago

    I keep saying “no” to this sort of thing, for a variety of reasons.

    1. “You can use this code for anything you want as long as you don’t work in a field that I don’t like” is pretty much the opposite of the spirit of the GPL.
    2. The enormous companies slurping up all content available on the Internet do not care about copyright. The GPL already forbids adapting and redistributing code without licensing under the GPL, and they’re not doing that. So another clause that says “hey, if you’re training an AI, leave me out” is wasted text that nobody is going to read.
    3. Making “AI” an issue instead of “big corporate abuse” means that academics and hobbyists can’t legally train a language model on your code, even if they would otherwise comply with the license.
    4. The FSF has never cared about anything unless Stallman personally cared about it on his personal computer, and they’ve recently proven that he matters to them more than the community, so we probably shouldn’t ever expect a new GPL.
    5. The GPL has so many problems (because it’s been based on one person’s personal focuses) that the FSF either ignores or isolates in random silos (like the AGPL, as if the web were still a fringe thing) that AI barely seems relevant by comparison.

    I mean, I get it. The language-model people are exhausting, and their disregard for copyright law is unpleasant. But asking an organization that doesn’t care to add restrictions to a license that the companies don’t read isn’t going to solve the problem.