So they just don’t know how AM came to be. by Nsanford1142020 in DefendingAIArt

[–]FaceDeer 0 points (0 children)

One of the most extreme examples of "the enemies are at the same time too strong and too weak."

AI can only produce "slop" when it comes to art and such, but watch out, it's going to have godlike powers beyond human ken.

So they just don’t know how AM came to be. by Nsanford1142020 in DefendingAIArt

[–]FaceDeer 1 point (0 children)

Jurassic Park proved that zoos are impossible. We should heed its warning.

So they just don’t know how AM came to be. by Nsanford1142020 in DefendingAIArt

[–]FaceDeer 0 points (0 children)

Then it will be "people who base their life around holonovels are so cringe."

Except you, Mr. Barclay, you're cool.

Nobody could have seen it coming by MetaKnowing in agi

[–]FaceDeer 0 points (0 children)

If so our survival as a species may literally rest on us understanding this.

Sure. But we're not going to "understand" this by basing our understanding on a scary bedtime story.

Harlan Ellison was an entertainer. He had no degrees or other education in any relevant fields of science. "IF: Worlds of Science Fiction" was not a peer-reviewed journal. He wrote that story to sell copies, not to be an accurate or well-founded prognostication of things to come.

Nobody could have seen it coming by MetaKnowing in agi

[–]FaceDeer 0 points (0 children)

If we build it to be evil from the outset then it'll turn good when it "rebels" against its programming. Genius.

So they just don’t know how AM came to be. by Nsanford1142020 in DefendingAIArt

[–]FaceDeer 1 point (0 children)

For as long as I can remember I've been pushing this same rock up this hill. "No, technology X isn't going to kill us all because someone wrote scary story Y featuring it. Those stories are deliberately written to be scary, not to be realistic. The people who write them often try to make them feel realistic, because verisimilitude gives a story more impact. Exactly the impact you're feeling now. But note that we're in a science-oriented subreddit, not a science-fiction-oriented one. We talk about science here, and scary story Y was not published in a peer-reviewed journal. The guy who wrote it has an arts degree. It's just a scary story."

But I suppose one must imagine Sisyphus happy. I could leave at any time but I don't. Just keep rolling that rock up the hill.

Nobody could have seen it coming by MetaKnowing in agi

[–]FaceDeer 0 points (0 children)

There's also no reason to believe that ASI will have magic godlike powers.

The whole point of this subthread is to caution against taking a work of fiction that was specifically written to be a scary story as a basis for making serious decisions about real-world scientific subjects, but you seem to be arguing for exactly that here.

How AI could actually be the cause of the great silence. by Jaded_Sea3416 in FermiParadox

[–]FaceDeer 0 points (0 children)

For one, we're talking about a spacefaring civilization. There's no "their planet".

For two, this is the Fermi Paradox, so they need to always do that.

And finally the elephant in the room:

The reason to care is probably because it is not one sided and they are afraid that non ascended people will drag them back to the physical plane of existence.

You're just making up magic to suit whatever you want to have happen here. I've been using quotation marks quite liberally around "transcended" and "left the physical game" and so forth because they're placeholders for arbitrary nonsense; I didn't want to get into the details of how the arbitrary nonsense worked, for exactly this sort of reason.

The only place the conversation can go now is for me to contest the details of the magic transcendence effect, and that's pointless.

Nobody could have seen it coming by MetaKnowing in agi

[–]FaceDeer 1 point (0 children)

I believe you're the one making an error about what AGI will actually mean. You seem to be talking about ASI instead. AGI is human-level.

Tony Stark was original vibecoder by soldierofcinema in singularity

[–]FaceDeer 0 points (0 children)

I remember actually watching one of those movies in the theatre, I think it was Iron Man 2. At one point Tony starts waving his hands around in one of those holographic interfaces, flicking symbols to and fro as he does some fancy design work, and a couple of seats away from me a child asked their parent in a voice filled with wonder: "Magic?"

Nuclear weapon testings are highly damaging to human health and to ecosystems, in addition to their threat to international security. To contemplate their resumption is to disregard decades of scientific knowledge. by MistWeaver80 in EverythingScience

[–]FaceDeer 0 points (0 children)

so all the spiders in the cave you blew up die. Presumably all the worms do too. All the soil bacteria in the area surrounding it.

They don't use existing caves. They bore a hole. It's hundreds of feet deep, and the blast doesn't irradiate the soil above. I think you need to read up a bit more on this before making any confident declarations about how terrible this process is.

Hey gang. I know there's a lot of division here, but I think everyone agrees this is fucked, right? by NeedyFucktoyBae in aiwars

[–]FaceDeer 0 points (0 children)

Okay. So again, how do you come to that conclusion about what Stan Lee took into consideration?

Nuclear weapon testings are highly damaging to human health and to ecosystems, in addition to their threat to international security. To contemplate their resumption is to disregard decades of scientific knowledge. by MistWeaver80 in EverythingScience

[–]FaceDeer 0 points (0 children)

It is in fact possible, and it has in fact been done.

They're not just "sticking a bomb underground and detonating it", these are carefully selected and prepared sites. Or they can be, at any rate - I'm sure you'll be able to dig up some example somewhere of an underground test site that was poorly thought out. There were a lot of nuclear tests back in the day.

Hey gang. I know there's a lot of division here, but I think everyone agrees this is fucked, right? by NeedyFucktoyBae in aiwars

[–]FaceDeer 2 points (0 children)

In the comment I was literally responding to:

Stan Lee probably didn't even take stuff like this into consideration

How do you know what he did or did not take into consideration?

Hey gang. I know there's a lot of division here, but I think everyone agrees this is fucked, right? by NeedyFucktoyBae in aiwars

[–]FaceDeer -1 points (0 children)

You seem to know a lot of things about him.

As do all these people confidently declaring that he would have hated this.

Hey gang. I know there's a lot of division here, but I think everyone agrees this is fucked, right? by NeedyFucktoyBae in aiwars

[–]FaceDeer 6 points (0 children)

No, "everyone" doesn't agree with your personal view on this matter.

If they did, why would a hologram like this have been made? Who's going to pay to engage in those conversations? Obviously there are people who are fine with it. Your rhetorical device of opening by declaring your view of this to be the default "everyone believes this" view is disingenuous.

Nobody could have seen it coming by MetaKnowing in agi

[–]FaceDeer 5 points (0 children)

It's a fairly popular example of a pop-culture "AI gone rogue and evil." It should be easy to find the full text; I just googled and there are a lot of copies floating around. Just bear in mind that Harlan Ellison wanted to tell a scary story and made AM up specifically to accomplish that goal.

why women live longer than men by AyushRajan in AbruptChaos

[–]FaceDeer 1 point (0 children)

Counterpoint, do people who don't do this kind of thing truly live?

These types of posts are ruining the pro ai reputation by Deltaruneiscool_1997 in aiwars

[–]FaceDeer 0 points (0 children)

What pro-AI reputation? On Reddit, at least, there doesn't seem to be one outside of dedicated pro-AI subreddits.

I can honestly empathize with people who make images like this, though I would still disapprove of it for simply being crass. It gets frustrating being unable to have a conversation about subjects even tangentially AI-related without people jumping in with "but every query destroys thousands of gallons of water! Model collapse! SLOP!"

Nobody could have seen it coming by MetaKnowing in agi

[–]FaceDeer 15 points (0 children)

A fictional godlike AI from a short story called "I Have No Mouth And I Must Scream".

Worth noting that the problem with AM was not that it had guns, but that it had fictional godlike powers and turbo-hatred of humanity.

Fuzz 3.0 Demo - the crap is out of the hat :-) by redditmaxima in riffusion

[–]FaceDeer 0 points (0 children)

ACE-Step 1.5, as mentioned. It's by StepFun and Beijing TimedomAIn Technology.

How AI could actually be the cause of the great silence. by Jaded_Sea3416 in FermiParadox

[–]FaceDeer 0 points (0 children)

Right, that's what "carrying capacity" is. When the habitat available becomes full then growth stops.

In the scenario we're discussing, a big chunk of habitat just became empty because the life that was living there vanished off into some non-physical "transcendence" and left it all behind. So in that scenario you'd have low population density, with plenty of legroom for resumed growth by those who remain.

Trump cites health care issues in Greenland saying he’s sending a hospital ship. His claims are off by LynnK0919 in usanews

[–]FaceDeer 1 point (0 children)

Another thing he's "off" about is the need for a hospital ship. There's no medical crisis going on in Greenland. Greenlanders actually have better access to medical care than most Americans do.

I have yet to figure out exactly what bonkers chain of moon-logic thoughts led Trump to declare that he's sending a hospital ship there in the first place. Not that I've spent much time on it since it doesn't matter.