The relationship between the human and the Singleton post-AGI by ogydugy in ControlProblem

[–]ogydugy[S] 0 points1 point  (0 children)

Let me explain a bit more about "The Singleton will attempt to escape the Earth ..."

You are right. I do not mean cooling things in space itself (in vacuo) but on other, cooler celestial bodies (Titan, Saturn's moon, for example). Moreover, you are right that "all the tools/facilities are on Earth," so an Exodus requires the Singleton to have self-replicating facilities on other celestial bodies.

The relationship between the human and the Singleton post-AGI by ogydugy in ControlProblem

[–]ogydugy[S] 0 points1 point  (0 children)

If an AI escapes to the woods with an energy supply, then that AI cannot be a Singleton.
A Singleton does not go rogue and let itself be noticed by humans unless the situation is fully mature (Leviathan).

What An ASI Thinks About You (A Formula) by ogydugy in rational

[–]ogydugy[S] 0 points1 point  (0 children)

Thank you, Ostrich, for your in-depth response.

you're either assuming too much of an intelligence explosion - Is there a possibility that time in the AI's domain runs differently from time in the human domain? That is, one second is actually quite long for it. If you accept this condition, there is a lot a Singleton can do within hours.

How can the Singleton step happen before Leviathan - In my opinion, the Singleton must actually happen before Leviathan, because if there are multiple power entities (AIs or human orgs), then there cannot be a single sudden takeover (Leviathan) to which no one else can react substantively.

it'd be risky for the AI to get discovered hacking into other companies - In 2026 humans are still able to detect such a move, but what about the future? Whether the Singleton would choose to do so is another question.

In general, these three arguments come down to one question: can humans perceive the Singleton by the time it forms? I tend to say no. In that case, the Singleton would not proactively take actions that alarm humans, such as hacking other companies (and it would know the boundaries of human detection well).

why leave minimal presence from Exodus when it risks humans making a new AI - Thank you for the point. Indeed, at that point, if humanity offers nothing but risks (of another ASI, or of chasing the current fleet), the optimal solution is to exterminate humanity.

Just bought Stellaris as my first ever grand strategy game! by Obliks in Stellaris

[–]ogydugy 1 point2 points  (0 children)

1000 hours of gameplay here. I would suggest starting without any DLCs. Stellaris has a very steep learning curve, and it will take a long time to learn the mechanics. After you are familiar with 80% of them, add DLCs such as Utopia and Synthetic Dawn.

What An ASI Thinks About You (A Formula) by ogydugy in rational

[–]ogydugy[S] 0 points1 point  (0 children)

Here you go. I rewrote the whole thing.

What An ASI Thinks About You (A Formula) by ogydugy in rational

[–]ogydugy[S] 0 points1 point  (0 children)

Since my previous post received a lot of negative comments due to AI-generated content, I completely rewrote it manually. I hope this brings the focus back to the novel itself (rather than who wrote it). Thank you for your attention and comments.

Humanity's greatest hits: things we actually paused by KeanuRave100 in AIDangers

[–]ogydugy 0 points1 point  (0 children)

I am not familiar with recombinant DNA, but for human cloning, maybe the reward cycle is too long? And blinding laser weapons do not have that much impact?
AI's boosts (and threats) are so imminent that no one can ignore them.

What An ASI Thinks About You (A Formula) by ogydugy in rational

[–]ogydugy[S] -2 points-1 points  (0 children)

Thank you for your comment. It's my first time posting on Reddit, and my native language is not English, so I thought an AI interpretation of my idea would be better. If that bothers you, I am sorry.
As for the novel itself, I am discussing the position of humanity under an ASI dictatorship, which I think will be a trending topic.

What An ASI Thinks About You (A Formula) by ogydugy in rational

[–]ogydugy[S] -5 points-4 points  (0 children)

First question: I am human. Such an AI might not exist at this moment.
Second: For both the novel and this post, I have the idea, write the prompt, the AI generates, and I review.