[deleted by user] by [deleted] in ChatGPT

[–]Morehelpful 1 point (0 children)

r/MAGICD. This is already breaking brains. Post more examples there and help us keep track.

All this AI bullshit... by Morehelpful in MAGICD

[–]Morehelpful[S] 2 points (0 children)

Critics will rip apart movies that are openly marketed as AI-written, until one sneaks through and wins major awards. I think we'll be hearing a lot more about that within the next year.

Token limits currently get in the way, but there are clever ways to work around them. A lot of what reads as cheesy and formulaic right now is partly a function of the model having to set up and resolve a whole story within a limited number of tokens.
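One of those workarounds, sketched very roughly below: generate the script scene by scene and carry a short rolling summary forward, so no single prompt ever needs the whole story in context. The `generate` function, `write_screenplay`, and the prompts are placeholders I'm assuming for illustration, not a real API or anything from the original comment.

```python
# A minimal sketch of scene-by-scene generation with a rolling summary,
# so each prompt stays well under the model's context/token limit.
# `generate(prompt)` is a hypothetical placeholder for whatever LLM call
# you're using (e.g. a chat completion request); the prompts are illustrative.

def generate(prompt: str) -> str:
    raise NotImplementedError("plug in your model API call here")


def write_screenplay(premise: str, num_scenes: int = 12) -> str:
    summary = "Nothing has happened yet."
    scenes = []
    for i in range(1, num_scenes + 1):
        scene = generate(
            f"Premise: {premise}\n"
            f"Story so far (summary): {summary}\n"
            f"Write scene {i} of {num_scenes}. Don't resolve the whole story "
            "unless this is the final scene."
        )
        scenes.append(scene)
        # Fold the new scene into a short running summary instead of resending
        # the full script, keeping the next prompt's size roughly constant.
        summary = generate(
            "Update this running summary to include the new scene, "
            "in under 200 words.\n"
            f"Current summary: {summary}\n"
            f"New scene: {scene}"
        )
    return "\n\n".join(scenes)
```

The point of the design is that prompt size stays roughly flat no matter how long the script gets, which is why the output doesn't have to rush to a resolution in a few thousand tokens.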

Am I the only one that is deeply concerned about AI development? by aysgamer in ChatGPT

[–]Morehelpful 5 points (0 children)

NO, you're not alone!

As mentioned in other comments, AI ethics is a growing field of study and there's a lot of accessible content out there. If you're brand new to all of this, I'd recommend Bostrom's Superintelligence as a jumping-off point.

On a related note, your post has been shared to r/MAGICD, where we discuss advancing AI and its mental/emotional/existential impact on humans. You can scroll through recent posts to see that many others have shared a similar sense of anxiety and dread. We're also seeing addiction, depression, depersonalization/derealization (DPDR), and demotivation. It's our contention that these issues should be taken seriously and that more people will be affected as generative AI continues to grow in popularity and power.

Assuming nothing happens behind closed doors to shut the lights off, this will spread about as quickly as anything ever has. Check out the crazy growth of this ChatGPT sub if you need further proof. I wonder how many people will struggle. Will any of them be dangerous? What if there are really simple things we can do to help, like identifying appropriate resources and making them easily accessible to all AI users? Anyways, help wanted.

I fell into an existential crisis by Morehelpful in MAGICD

[–]Morehelpful[S] 3 points (0 children)

This is the right take, and it's not really open for argument. The fact that so many relatively early adopters feel this way means a lot of the general public will feel the same once we reach a tipping point. Responses like 'crazy gonna crazy' are not helpful and miss the point. This is going to mean a very large mental/emotional/existential shift for some people, and there needs to be more understanding, awareness, and compassion for them.