Hey fellow Lemmies! I recently came across a YouTube channel that posts 10 AI-generated videos every day, and it got me thinking. With the rapid growth of AI technology, some experts predict that as much as 90% of online content could be AI-generated by 2025. That would mean the amount of content on YouTube and other platforms increases dramatically. But will YouTube be able to store and handle all that data?
How do you feel about this trend? Do you think YouTube and other platforms should embrace AI-generated content, or do you have concerns? Let’s discuss and share our thoughts!
Now that downvotes aren't shown, it's just a guess whether a video is any good or telling the truth…
Imagine if everyone uploaded videos like this; how long would YouTube last? Just 8 hours of nothing in 4K.
Ogie’s Revenge: 4k AI Video Art [8+ Hours AI Generated video]
Judging from that video, Google will just have to lobby to legalize shrooms and MDMA and they'll be good to go.
I wouldn't call this nothing, though. Low substance, maybe, but especially as background for music they're pretty cool.
Now those AI newscasters that are just an AI reading GPT-generated text under clickbait titles… those are truly awful and spread misinformation.
If everyone uploaded videos like this, they wouldn't get views.
It’s just a neat video.
The problem is that YouTube would still have to store all of it.
Not really; YouTube could just start deleting the videos if they get out of hand.
I think it would be against my convictions to knowingly have my worldview molded by machines created by people I think are going to destroy modern civilization and partially eradicate humans.
How could you avoid it, though? You can currently tell the difference, but it may not be possible in a few years.
I'll start by maintaining my current list of vetted video creators, and when they quit or die off to the point where I lack content, I'll probably stop watching internet videos altogether.
A video I recently saw about this problem: YouTube’s Science Spam Crisis
Spam videos have always been a huge problem, even just from simple automated scripts churning out bulk quantities of useless videos. But now, with AI, the videos are a bigger problem: they seem real enough at first glance, so people are less likely to click away immediately and end up wasting more of their time.
YouTube is already awful, so whatever.
Most of the stuff on YT Kids already feels like it's AI-generated. I can't imagine what it's going to turn into once they go full AI, or what kind of damage it may cause.
To a certain extent, watching content that someone made helps me identify with them and their message. I'm not aware of having watched any AI-generated content, but maybe I have; without knowing about it, I'm not sure whether it influences my opinion.
Also not sure if AI-generated videos/content are any better than those produced by fleshies, though. OP, what's the link to the channel you mentioned?
As long as they don't start telling me to smash the like button and shit, it would be an improvement. But if they do, we've got a fight on our hands.
From my experience it STILL takes about a billion retries to get anything half-decent (let alone legible) out of those so-called "self-learning" AIs.
Now? Fuck this low-quality shit. When we get to the point where it's actually good? Count me in!
I feel like the AI programs would just optimize every aspect of the video to game the YouTube algorithm. It would be ranked very well, but probably won’t be very good content for the average human to view.
“probably won’t be very good content for the average human to view”
Well, no changes there then.
It's a step up from human-generated videos based on instructions given by AI, which is the current norm.
Who uses YouTube much anyway?
Everyone.
Everyone except me is not everyone.
Needing AI to curate your intake.
Filter wars.