All right, I am not a lawyer but I’ve been around the internet long enough to know there is arguably a right to control learning and training. Because fair use in copyright law SPECIFICALLY allows for educational use. That means the default is that otherwise, it would not be allowed.
A judge could easily rule that AI training is not covered under Fair Use, as it is being used to create a profitable tool.
That’s a right to make copies and distribute them for educational purposes. AI training specifically doesn’t involve distribution of any kind. Arguably copyright law doesn’t even apply, and even under the broader umbrella of “intellectual property” the claim doesn’t hold up, even without making a comparison between human learning and AI training (which is more of an analogy anyway).
Edit: and to be fair, I’m not a lawyer either, but IP law (especially regarding how terrible it has become) is kind of a hobby of mine. But I can’t claim to be any type of authority on it.
Okay, well my hobby is ethics.
And the thing is, if they are using works written by others to build an AI for profit without permission, that’s exploitation. Copyright law is horrible and exploited by corporations constantly. That doesn’t mean we shouldn’t cheer on the little guy when they try to use it to defend against exploitation by corporations. Because the big tech companies are exploiting creatives in their drive to build and sell this tool. They are exploiting creatives to make their replacements. So yes, I’m going to push back on any human-learning comparison analogy.
Whatever the actual basis of the lawsuits against the AI companies, actual lawyers do think there’s a basis in IP law to sue, because a few high-profile lawsuits have been filed. And clearly there is some legal basis to sue if they use AI to create from performers’ performances, or this contract would not have been proposed.
If you’re leaning on morality, then the comparison to humans becomes relevant again.
Lawyers taking a high profile case is not any indication to go by.
I could be off base here, but are you financially impacted if AI starts making commercial art? Like, is that how you make income, too?
I have skills besides technical writing, but it’s one of the things I rely on to get hired. So yeah, I’m partially on the chopping block, even before creative writing is. And it’s a serious problem that all the writing I’ve done on the internet is being used to train AI.
But the thing about the moral comparison to humans is that there’s a line that eventually gets crossed. And once that line is crossed, you can’t OWN an AI anymore, and you certainly can’t sell it. Up until then, you have to treat it as a tool.
The end solution is going to be something along the lines of a Creative Commons license where you specify whether your work can be used to train AI, can’t be used at all, or can only be used to train non-profit AI.
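To give a rough sense of what I mean, the closest thing that exists today (and to be clear, it’s just a crawl opt-out, not an actual license) is a robots.txt entry, since OpenAI’s GPTBot and Google’s Google-Extended token both honor it:

    # robots.txt - opt this site's content out of AI training crawls
    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

A real training license would have to go further than that, e.g. distinguishing non-profit from commercial training, but that’s the general shape.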
I don’t follow why calling it a tool matters. If a Python script renders someone’s job redundant (hypothetically; this is unlikely in reality), does it matter whether the script was written by a human or an LLM?
@effingjoe I imagine it matters to the person who wrote it. Were THEY paid for this?
I mean, it’s a shitty thing when consultants and such remove jobs, but at least the exploitation there is only on one side: the poor guy getting kicked out. If an LLM is removing someone’s job, then the people whose work was used to train the LLM are getting exploited too.
Plus, a certain amount of the law is for deterrence. We don’t want the companies replacing creatives with AI. It would be beneficial to discourage that. We DO want things like fruit-picking and weeding and other backbreaking manual labor replaced by AI, so we can push for laws that encourage THAT. But right now they are trying to replace the wrong end.
You’re going to need to strictly define “exploited”, I think. I don’t know what you mean when you use that term.
If I read a book on Python and write a script to replace someone’s job, did I exploit the person who wrote the book? What about the people who created and/or maintain Python?
Why don’t we want companies replacing creatives with AI? Should we roll back other technological advances that resulted in fewer humans being employed? No human routes phone calls anymore, but they used to. Should their jobs be protected, too? What about people that used to carve ice out of mountain lakes and deliver it to businesses? Should refrigeration technology be held back by the law to protect those jobs? If not, why artists? What makes them more deserving of being protected?
Intent is a big deal in this one. Your Python book writer intended for people to read it to learn Python.
A romance book writer does not intend for an AI to use it to learn to generate sentences. But because there was no obvious barrier and they could get away with it, the companies grabbed the romance book and used it. That’s an exploit.
And again, you’re ignoring the quality of labor. Back-breaking jobs that hurt people’s health should be improved with technology. A migrant worker might lose his job to a mechanical fruit picker, but he’s likely bilingual and eligible for a translator job. Unless that job, which is better for health and longevity and allows someone to stay in one place, is taken by an AI.
The promise of automation was that it would RAISE the quality of human life. Taking away the jobs of creatives lowers the quality of human life. Using automation to carve ice out of mountain lakes raises the quality of human life. Things are not neutral here.
The large companies want to keep manual labor in human hands and put creative work and decision making in AI hands. This is going to make life worse for everyone.