a screenshot of the text:
Tech companies argued in comments on the website that the way their models ingested creative content was innovative and legal. The venture capital firm Andreessen Horowitz, which has several investments in A.I. start-ups, warned in its comments that any slowdown for A.I. companies in consuming content “would upset at least a decade’s worth of investment-backed expectations that were premised on the current understanding of the scope of copyright protection in this country.”
Underneath the screenshot is the “Oh no! Anyway” meme, featuring two pictures of Jeremy Clarkson saying “Oh no!” and “Anyway”.
The screenshot (copied from this Mastodon post) is of a paragraph from the NYT article “The Sleepy Copyright Office in the Middle of a High-Stakes Clash Over A.I.”
We need copyright reform. Life of author plus 70 for everything is just nuts.
This is not an AI problem. This is a companies literally owning our culture problem.
Going one step deeper, at the source it’s oligarchy: companies owning the law and, as a consequence, its enforcement too.
We do need copyright reform, but also fuck “AI.” I couldn’t care less about them infringing on proprietary works, but they’re also infringing on copyleft works and for that they deserve to be shut the fuck down.
Either that, or all the output of their “AI” needs to be copyleft.
Not just the output. One could make the argument that training your model on GPL content, which would have it create GPL content, means that the model itself is now also GPL.
It’s why my company calls GPL parasitic, use it once and it’s everywhere.
This is something I consider to be one of the main benefits of this license.
So if I read a copyleft text or code once, because I understood and learned from it any text I write in the future also has to be copyleft?
HOLY SHIT!
Doctor here, I’m sorry to inform you that you have a case of parasitic copyleftiosis. Your brain is copyleft, your body is copyleft, and even your future children will be copyleft.
GPL. Not even once!
It already is. God you uninformed people are insufferable.
It already is.
If you mean that the output of AI is already copyleft, then sure, I completely agree! What I meant when I wrote that we “need” it is legal acknowledgement of that factual reality.
The companies running these services certainly don’t seem to think so, however, so they need to be disabused of their misconception.
I apologize if that was unclear. (Not sure the vitriol was necessary, but whatever.)
There have already been cases decided… That’s enough
If this is what it takes to get copyright reform, just granting tech companies unlimited power to hoover up whatever they want and put it in their models, it’s not going to be the egalitarian sort of copyright reform that we need. Instead, we will just get a carve-out for this one thing, which is ridiculous.
There are small creators who do need at least some sort of copyright control, because ultimately people should be paid for the work they do. Artists who work on commission are the people in the direct firing line of generative AI, both in commissions and in their day jobs. This will harm them more than any particular company. I don’t think models will suffer if they can only include works in the public domain, if the public domain starts in 2003, but that’s not the kind of copyright protection that Amazon, Google, Facebook, etc. want, and that’s not what they’re going to ask for.
I’m gonna play them a song on the world’s smallest violin.
And I’m gonna post this here for the lucky 10,000
“You can’t just decide we were wrong about IP, that would make us broke!”
Can we just put all the media and technology executives in an alley where they can fight it out like the scene from Anchorman?
Let’s just find an island and Australia them.
I feel like we could improve the situation by making kangaroos carnivorous and predatory
And the emus!
Piracy / stealing content is OK for big corps.
Piracy / stealing content is punishable by life in prison for us proletarians.
This is simply not stealing. Viewing content has never ever ever been stealing.
There is no view right.
Tech illiterate guy here. All these ML models require training data, right? So all these AI companies that develop new ML-based chat/video/image apps require data. So where exactly do they get it? It can’t be that their entire dataset is licensed, can it?
If so, are there any firms suing these orgs for data theft? How do you know if a model has been trained on your data? Sorry if this is not the right place to ask.
Could say piracy is just running a program that “views” the content, and then regurgitates its own interpretation of it into local data stores.
It’s just not very creative, so it’s usually very close.
That’s one thing, but I think regurgitating it and claiming it as your own is a completely different thing.
Also, I’m pretty sure the argument is more about the unequal enforcement of the law. Copyright should be either enforced fairly or not at all. If AI is allowed to scrape content and regurgitate it, piracy should also be legal.
Again that’s not what’s happening here
They are downloading the data so their LLM can “view” it. How is that different from downloading movies to view them?
They’re not downloading anything tho. That’s the point. At no point are they possessing the content that the AI is viewing.
This is LESS intrusive than a Google web scraper. No one is trying to sue Google for copyright over Google searches.
What? Of course they are downloading, the content still has to reach their networks and computers.
Go look up how AI works. There is no download lol. It’s the exact same principle as web scrapers, which have been around for literally decades.
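For what it’s worth, here is a minimal sketch (in Python, with a hypothetical target URL) of what a scraper-style “view” looks like at the network level, as argued in the exchange above: the response bytes are fetched onto the viewing machine, something is derived from them, and the temporary local copy is then discarded.

    # Minimal sketch of a scraper-style "view" (hypothetical target URL).
    # The point argued above: the page's bytes do reach the viewing machine
    # before anything is derived from them, even if no permanent copy is kept.
    import urllib.request

    def view_page(url: str) -> int:
        with urllib.request.urlopen(url) as response:
            body = response.read()        # full response body now in local memory
        word_count = len(body.split())    # derive something from the content...
        return word_count                 # ...and let the temporary copy be freed

    if __name__ == "__main__":
        print(view_page("https://example.com/"))  # hypothetical target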
That’s what would be called “a swing and a miss”
It’s almost like speculating has risks
Bu-bu-but you didn’t think of my investors!
I mean, I won’t deny that small bit of skill it took to construct a plausible sounding explanation for why the public should support your investment, because it’s “not illegal (yet)”.
“technically this thievery isn’t covered by law”
"technically this what?”
"OBJECTION!”
You stole my post by looking at it. Pay me.
They have chosen to think that if it runs through AI, it is no longer a derivative work. It is. If I put Disney and Amazon together as a prompt, things come out very similar to their logos, and it’s obviously copyright infringement. The worst part of this: they’ll still steal from all of the small artists and avoid the larger ones.
Don’t forget that they will then switch sides and try to copyright “their work”, preventing others from even thinking about their work without paying the toll.
Hey, what if I were to draw two circles…
COPYRIGHT INFRINGEMENT!
I hadn’t thought of that, we truly are fucked.
I don’t give a shit about copyright for training AI.
But I don’t give a shit about investors, either.
Copyright should cease to exist and sharing digital copies of any content should be a protected right. The best software is FOSS anyway.
But if I can’t have that, I will settle for tech bros going to jail for mass theft. Either the law is equal or it is unlawful.
Nah. Even in its current stupid state, copyright has to recognize that sifting through the entire internet to get a gigabyte of linear algebra is pretty goddamn transformative.
No kidding the machine that speaks English read every book in the library. Fuck else was it gonna do?
That’s the point of money: if you have enough, you can simply sue or bribe in order not to lose it.
A’ight. Time to self-host the entire internet on a server and do machine learning with the content I stored. :)
Can someone rephrase this for me? I’ve read it twice and I really don’t get it.
A scammer made unreasonable promises to investors and is now warning everyone that their victims/investors are going to lose money when the process of making fair laws takes the typical amount of time that it always takes.
As Robert Evans put it: “If we can’t steal every book ever written, we’ll go broke!”
They made investments and projections on their business based on the current laws and they’ll be sad if the laws change now.
“warned”
Or what? I want to see that bluster called.
Or their victims will realize they got scammed.