Is FSF's stance on AI actually clear? I thought they were just upset it was made by Microsoft.
Creative Commons has been fairly pro-AI -- they have been quite balanced, actually, but they do say that mandatory opt-in is not acceptable; it should be opt-out at most. The EFF is fairly pro-AI too -- at least, they're against using copyright to legislate against it.
You shouldn't discount progress in the open model ecosystem. You can sort of pirate ChatGPT by fine-tuning on its responses, there are GPU-sharing initiatives like Stable Horde, there's TabbyML which works very well nowadays, and Stable Diffusion is still the most advanced way of generating images. There's very much an anti-IP spirit going on there, which is a good thing -- it's what copyleft is there for in spirit, isn't it?
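For the curious: the "fine-tuning on its responses" trick is just distillation. You collect (prompt, response) pairs from the stronger model and write them out in the JSONL instruction format most open fine-tuning stacks accept. A minimal sketch (the pairs here are hard-coded stand-ins; a real run would call the upstream API, and the field names vary by toolkit):

```python
import json

# Stand-in (prompt, response) pairs; in practice these would be
# harvested from the stronger model's API.
collected = [
    ("Explain copyleft in one sentence.",
     "Copyleft licenses require derivative works to stay under the same license."),
    ("What is fine-tuning?",
     "Fine-tuning continues training a pretrained model on a smaller, task-specific dataset."),
]

# One JSON object per line -- the common "instruction tuning" layout.
with open("distill.jsonl", "w") as f:
    for prompt, response in collected:
        f.write(json.dumps({"instruction": prompt, "output": response}) + "\n")
```

Feed that file to whatever open fine-tuning harness you like and you've "borrowed" a sliver of the bigger model's behavior.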
The Software Freedom Conservancy has been complaining about GitHub Copilot since 2022[0]. They specifically cite Copilot's use of training data in ways that violate the copyleft and attribution requirements of various FOSS licenses. Hector Martin (the guy porting Linux to MacBooks) also agrees with this. It's also important to note that the first AI training lawsuit was specifically to enforce GPL copyleft[1].
The EFF's argument has come across to me less like "AI is cool and good" and more like "copyright doesn't do a good job of protecting artists against AI taking their jobs". Cory Doctorow has taken a similar position, arguing that unions are better at protecting against AI than copyright is -- e.g., the WGA was able to get contractual provisions preventing workers from being replaced with AI.
This is a different vein of opposition to AI from what we saw the following year, in 2023, with artists and writers, though. Even then, those artists and writers aren't suddenly massively pro-copyright[2]; most consider it more a means to fatally wound AI companies[3]. In contrast, big businesses that own shittons of copyright have been oddly quiet about AI. Sure, you have Getty Images and The New York Times suing Stability and OpenAI, but where's, say, Disney's or Nintendo's litigation? These models can draw shittons of unlicensed fanart[4], and nobody cares. Wizards and Wacom made big statements against AI art, but then immediately got caught using it anyway, because stock image sites are absolutely flooded with it.
My personal opinion is that generative AI creates enough issues that we can't boil them down into neat "pro-copyright" vs. "anti-copyright" arguments. People who share their work for free online are complaining about it, while people who expect you to pay money for their work are oddly ambivalent. AI is orthogonal to copyright.
I will give you that the open model community is doing cool shit with their stolen loot. However, that's still something large corporations can benefit from (e.g. Facebook and LLaMA).
[3] Their actual argument against AI is based on moral grounds, not legal ones. I don't think any artist is going to accept licensing payments for training data, they just want the models deleted off the Internet, full stop.
[4] OpenAI tried to ban asking for fanart by name, but if you ask for something vaguely related (e.g. "red videogame plumber" or "70s sci-fi robot") you'll get fanart every time.