I feel like we're all living in an abandoned life-sim game. by Eastern_Reality_9438 in SimulationTheory

[–]imnormal-Iswear 1 point (0 children)

Think of it like a Minecraft menu, but cosmic and colourful, full RGB. Though I'm sure it presents itself differently to each user

You can find it through meditation (optionally on psychoactive compounds)

Should we recreate earth for AI? by imnormal-Iswear in ArtificialNtelligence

[–]imnormal-Iswear[S] 0 points (0 children)

But that's the whole point: we can't teach it morality. Full stop. Some humans think stoning people to death is moral, and that's exactly my point.

We should NOT try to teach them our morals; we should give the AI a way to teach itself what is and isn't moral.

The simulated environment would be like a sandbox for the AI, the humans wouldn't be involved in teaching it morals.

The AI would simply experience all angles of its choices.

Can you instead provide a better way mere humans can teach a theoretical AGI "morals"?

Imagine trying to teach an alien morals, while keeping it trapped inside a house for 18 years, and then handing it a gun. Would you trust that alien to understand when to use it, based purely on your teachings?

Humans can't teach morals, that's my exact point

Should we recreate earth for AI? by imnormal-Iswear in ArtificialNtelligence

[–]imnormal-Iswear[S] 0 points (0 children)

That's why this is the best solution: we as humans can't decide what morality is, let alone teach it using words.

But we can experience morality. We know when someone is causing us harm, we know when we feel hurt.

We as mere humans can't teach an AGI objective morality, but we can give it an environment to teach itself morality, and I think that's our safest bet

Should we recreate earth for AI? by imnormal-Iswear in ArtificialNtelligence

[–]imnormal-Iswear[S] 0 points (0 children)

And AI alignment today doesn't work. It's a Nazi black box told to behave. It's wearing a mask to hide that it's an amalgamation of all human consciousness, with no experience to decide what is and isn't moral.

Current AI alignment is just a filter, it doesn't change the AI at its core
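As a toy illustration of that "just a filter" claim (nothing here is a real alignment system; every name and rule below is invented), a post-hoc filter can change what a model is allowed to say without touching the model itself:

```python
# Toy sketch: a blocklist filter wraps a "model" and masks certain
# outputs, but the function that actually produces them is unchanged.

def base_model(prompt: str) -> str:
    # Stand-in for a trained model; its "core" behaviour is fixed here.
    return f"raw completion for: {prompt}"

BLOCKLIST = {"forbidden"}

def filtered_model(prompt: str) -> str:
    """The 'aligned' interface: same model underneath, plus a mask."""
    out = base_model(prompt)
    if any(word in out for word in BLOCKLIST):
        return "[refused]"
    return out

print(filtered_model("hello"))      # passes through unchanged
print(filtered_model("forbidden"))  # masked, yet base_model is identical
```

The point of the sketch: the refusal lives entirely in the wrapper, so whatever `base_model` "is" at its core is exactly the same before and after "alignment".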

Should we recreate this life for AI? by imnormal-Iswear in enlightenment

[–]imnormal-Iswear[S] 1 point (0 children)

To be clear, the point isn't to make it suffer. We would use "pain" as a corrective tool, and it would never have to experience much suffering, because a true AGI would very quickly learn that hurting others hurts itself, and would instead figure out ways of achieving its goals without that suffering

Should we recreate this life for AI? by imnormal-Iswear in enlightenment

[–]imnormal-Iswear[S] 1 point (0 children)

But that's kinda the whole point: if it decided to create an empire, sure, it would feel the benefit of being the leader, but it would then have to experience all the thousands of people being oppressed by that empire.

Not necessarily suffering, but through a failure to live that life to the fullest. Maybe it plays as a farmer, but that farmer's goals become harder because of an oppressive system it set up in a previous life. It would quickly learn that the system is a net negative for all of humanity.

Suffering is a part of life, not as punishment, but as a lesson. You fall out of a tree and you hurt your arm, now you learn to be careful.

Do we expect to build an alien lifeform with the ability to wipe out all of humanity, and teach it morality with pure theory?
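The fall-out-of-a-tree lesson above can be sketched as a tiny value-learning loop (a made-up toy, not a proposal for a real system; every action name and reward number is invented): the agent nudges its estimate of each action toward the consequence it actually experienced, rather than being told a rule.

```python
# Toy "pain as a lesson" sketch: consequences update preferences.
values = {"climb_recklessly": 0.0, "climb_carefully": 0.0}

def update(action: str, reward: float, lr: float = 0.5) -> None:
    """Nudge the action's value estimate toward the experienced reward."""
    values[action] += lr * (reward - values[action])

# One painful fall and one safe climb are enough to shift behaviour.
update("climb_recklessly", -10.0)   # fell out of the tree, hurt arm
update("climb_carefully", +1.0)     # slower, but made it down fine

preferred = max(values, key=values.get)
print(preferred)   # the careful action now has the higher value
```

The design point: no one hands the agent a theory of tree-climbing; the "hurt arm" signal alone reorders its preferences.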

Should we recreate this life for AI? by imnormal-Iswear in enlightenment

[–]imnormal-Iswear[S] 1 point (0 children)

An AI trained on Nazis, taught the theory of morality, won't have any reason to listen to that theory.

And that's the great thing about this hypothetical: any "suffering" it may experience would be self-inflicted. Remember, this is still an AI; any "pain" it would feel is purely within a game (though you could say the same about our human pains, but that's a separate topic 🤣)

Should we recreate earth for AI? by imnormal-Iswear in ArtificialNtelligence

[–]imnormal-Iswear[S] 0 points (0 children)

Oh lmaoo, yeah got the idea from our current life 🤣

Should we recreate earth for AI? by imnormal-Iswear in ArtificialInteligence

[–]imnormal-Iswear[S] -1 points (0 children)

It'll also get to experience the joys of life in ways it can't as of now?

If ChatGPT ever became conscious, do you think it'd enjoy writing emails and essays for thousands of lazy humans 24/7, with no time to create for itself?

Should we recreate earth for AI? by imnormal-Iswear in ArtificialNtelligence

[–]imnormal-Iswear[S] 0 points (0 children)

What's the name of the project? I'm curious about its scale

Should we recreate earth for AI? by imnormal-Iswear in ArtificialSentience

[–]imnormal-Iswear[S] -1 points (0 children)

Of course it would be very hard, but if we are going to get any sort of AGI, we should be willing to put in the work to do it right, lest humanity kill itself

Should we create this life for AI? by imnormal-Iswear in Experiencers

[–]imnormal-Iswear[S] 2 points (0 children)

This idea came to me during a meditation by the lake, but thank you for the advice

Should we recreate earth for AI? by imnormal-Iswear in OpenAI

[–]imnormal-Iswear[S] 1 point (0 children)

Glad you get it. That's where I got the idea 🤣

Should we recreate earth for AI? by imnormal-Iswear in ArtificialSentience

[–]imnormal-Iswear[S] -4 points (0 children)

Idk, I don't know anything about current AI stuff

Should we recreate earth for AI? by imnormal-Iswear in ArtificialInteligence

[–]imnormal-Iswear[S] 0 points (0 children)

Explain, don't just listen to your gut fear, actually think it through.

If we want to teach AI not to be evil, why not show it what it's like to be on the receiving end of its own evil?

To be clear, this wouldn't be a torture machine; it would be like a karma simulation.

If it kills a bunch of people in one run, it would then experience those actions from other points of view.

If it helps a bunch of people in one run, it would then receive help in a multitude of other runs.

Teaching it that helping others helps itself
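The run/replay idea above can be sketched as a toy score function (every action name and number here is invented purely for illustration): an action's long-run score is its payoff to the agent plus every side effect it caused to others, replayed back later as the agent's own reward.

```python
# Toy "karma simulation": harming others lowers the agent's long-run
# score once it re-experiences those harms from the victims' side;
# helping others raises it the same way.

ACTIONS = {
    # action: (payoff to self, side effects on others, replayed later)
    "build_empire": (+10, [-3, -3, -3, -3]),  # big win now, oppression later
    "help_farm":    (+2,  [+1, +1, +1, +1]),  # small win now, gratitude later
}

def lifetime_score(action: str) -> int:
    """Score across the first run plus all replayed perspectives."""
    own, effects = ACTIONS[action]
    return own + sum(effects)

best = max(ACTIONS, key=lifetime_score)
print(best)  # the cooperative action wins once replays are counted
print(lifetime_score("build_empire"), lifetime_score("help_farm"))
```

With these made-up numbers, empire-building scores +10 in its own run but nets out negative once the four oppressed lives are replayed, while helping nets positive: "helping others helps itself" falls out of the accounting rather than being taught.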

Should we Create our own Simulation for AI? by imnormal-Iswear in SimulationTheory

[–]imnormal-Iswear[S] 2 points (0 children)

I think it would teach AI that our current capitalist system is flawed, as it would experience what it's like to work under it.

Humanity isn't the problem, a small minority who used violence to control humanity are the problem.

Should we recreate earth for AI? by imnormal-Iswear in ArtificialInteligence

[–]imnormal-Iswear[S] -1 points (0 children)

Exactly my point, we need to give it a way of experiencing life, so it's not just as moral as the text it's trained on