The most charitable explanation is that they are concerned about their own privacy or identifiability, but ultimately it is a Dick Move™ to other participants.
It's the kind of late edit that spurs me to include quotes in replies.
Traditionally, large corporations have taken very conservative legal stances with regard to integrating e.g. A/GPL code, even when there's almost no risk.
If my license explicitly says "any LLM output trained on this code is legally tainted," I feel like BigAICorp would be foolish to ignore it. Maybe I couldn't sue them today, but are they confident this will remain the case 5, 10, 20 years from now? Everywhere in the world?
GitHub has posted that they will now train on everyone's data (even private repositories) unless you opt out (until they change their mind on that). Anthropic has already been training on your data on certain tiers. Meta bittorrented books to train their models.
Surely if your license says "LLM output trained on this code is legally tainted", it is going to dissuade them.
> I've been looking for a copy-left "source available" license that allows me to distribute code openly but has a clause that says "if you would like to use these sources to train an LLM, please contact me and we'll work something out". I haven't yet found that.
Personally, I want a viral (GPL-style) license that explicitly prohibits use of code for LLM training/tuning purposes — with the asterisk that while current law might view LLM training as fair use, this may not be the case forever, and blatant disregard of the terms of the license should make it easier for me to sue offenders in the future.
Alternatively, this could be expressed as: the output of any LLM trained on this code must retain this license.
Empathy hijacking. If the chatbots framed their responses as “beep boop, I’m a robot, here’s an estimated answer to your query” then we likely wouldn’t have this problem.
I'm not sure I'll be fully anti-AI in perpetuity: the future is hard to predict. But it's certainly becoming clear to me that we need "-noai" variants of programming communities.
Someone might feel differently about a (future) community-owned and community-managed LLM than about one controlled by Altman, Musk, and similar. It would be nice to feel like we're building something together instead of funding the oligarchy and accelerating the collapse of civilization.
In other words, it is an existential question for them. And given that some of the people running these companies have no moral convictions, expect a complete shitshow. Regulation. National security classifications. Endless lawfare. Outright bribery. Anything and everything to retain their valuations.