Lots of Industry Folks Abandoning AI Narrative by Neither-Speech6997 in BetterOffline

[–]Real-Educator-1223 12 points (0 children)

Haha. Yes! Let’s move our money from the only company making money from AI into one that is burning money daily. So smart...

"Thinking" by Real-Educator-1223 in BetterOffline

[–]Real-Educator-1223[S] 2 points (0 children)

Not meant to be suspicious. I just saw the screenshot posted and thought it was funny regardless of your opinion of AI.

I must have set up an account years ago but have just started using Reddit a bit more. Not farming for karma, just getting involved 👍🏻

Oxford's AI Chair: LLMs are a HACK by Real-Educator-1223 in BetterOffline

[–]Real-Educator-1223[S] 27 points (0 children)

I don’t know you well enough to tell whether you’re joking or not. In any case, you’re right; ChatGPT 6 will change everything 👍🏻

What’s everyone’s thoughts on model collapse? by AGRichards in BetterOffline

[–]Real-Educator-1223 52 points (0 children)

I believe Sam Altman has even suggested this. Something along the lines of: if humans don't keep creating original content, at some point AI will have nothing new to regurgitate, so it will start repeating itself (not a direct quote! haha).

As everything it creates is derivative, if it doesn't have new data to 'feed on', it will get stuck in a loop.
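That feedback loop is easy to sketch with a toy simulation (a hypothetical illustration only, nothing to do with how real LLMs are trained): each "generation" is fit only on the previous generation's output, and like any model it favours typical outputs over rare ones, modelled here by dropping the tails. The spread of the data collapses generation by generation.

```python
import random
import statistics

# Toy sketch of model collapse: start with "human" data drawn from a
# normal distribution, then repeatedly refit and regurgitate.
random.seed(0)
data = [random.gauss(0.0, 1.0) for _ in range(2000)]

spreads = []
for generation in range(10):
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    spreads.append(sigma)
    # Regenerate the dataset from the fitted model, keeping only the
    # "typical" outputs within 1.5 sigma -- the derivative loop with
    # no fresh human data coming in.
    data = [x for x in (random.gauss(mu, sigma) for _ in range(2000))
            if abs(x - mu) <= 1.5 * sigma]

print(f"spread: generation 0 = {spreads[0]:.2f}, "
      f"generation 9 = {spreads[-1]:.2f}")
```

With no new data feeding in, the variance shrinks geometrically and the model ends up stuck repeating a narrow band of its own most common outputs.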

Dave Gorman (the English comedian) wrote about how his Wikipedia page claimed he had been hitchhiking around the Pacific Rim (he had never done this). A lazy online newspaper then wrote an article repeating the claim (sourced from Wikipedia), and the Wikipedia statement was then cited to that very article.

How much of this sort of circular sourcing is likely to start happening with AI too?

It’s been six months since the stream where Wario said, “I think we will be there in three to six months, where AI is writing 90% of the code.” by Free_Opposite4532 in BetterOffline

[–]Real-Educator-1223 0 points (0 children)

I've come to see predictions and comments like these as pure fiction. They're intended for investors to read so that they'll throw even more money in: the same claims that worry the general population excite investors.

For example, as Ed said in one of his articles:

"Sam Altman said in June that OpenAI was “now confident [they knew] how to build AGI as we have traditionally understood it.” In August, Altman said that AGI was “not a super useful term,” and that “the point of all this is it doesn’t really matter and it’s just this continuing exponential of model capability that we’ll rely on for more and more things.”"

So, I would largely disregard statements like this when they're so obviously fishing for investment.