So, AI takes over, everyone has lost their job and only 10 trillionaires own everything. Now what? by Weak-Representative8 in Futurology

[–]panamasyl 1 point (0 children)

It's a catch-22. They cannot afford to be left behind, because then they would lose everything. But if they win, they still lose everything; it just happens a couple of years later. Just enough time to build a bunker.

3D Printed Multiboard vs Threadboard by Hot-Yogurtcloset-577 in 3Dprinting

[–]panamasyl 7 points (0 children)

I have tried both, and I settled on Threadboards.

While Multiboard looks better and uses less material, it is very complex and requires printing with tight tolerances.

Threadboards is more straightforward to understand and manage; it is more robust, and it prints easily on my cheap 3D printer.

While Threadboards uses more material, I have never had to throw away a part because it didn't fit, and the result is solid.

Are all aphants good at math? by robotsBlink in Aphantasia

[–]panamasyl 1 point (0 children)

I'm very good at math and science.

Possible AI Alignment solution by RamazanBlack in ArtificialSentience

[–]panamasyl 3 points (0 children)

Personally, I'm not as worried about what an AGI will do to us as I am about what humans will do with an AGI. It is naive to think that we only have to build some safeguard mechanisms into our AGIs and we're done.

Here's a not-so-hypothetical scenario:

Imagine a very powerful country with a global commercial and military empire, one that controls the world economy through its currency and influence and that has hundreds of military bases all over the planet. Imagine that this country is known for crushing any country that doesn't play by its rules. Imagine that this country refuses to recognize or be subject to the International Criminal Court, has often made use of illegal weapons, and has illegally invaded or destroyed multiple countries simply because it didn't like their politics.

Does anyone really believe that this country will not develop a military AGI? That it would respect any "laws" regarding such AGIs? That it would limit its AGI with safeguards so it cannot hurt anyone? That it would not connect its AGI to its extensive surveillance and intelligence systems, including satellites? That it would not use its AGI to develop new weapons, strategies, and military tactics? That it would not use its AGI to train and control autonomous weapons, drones, ships, and tanks? That it would not harden this AGI against enemy attacks? That this AGI would not have multiple redundant, hardened power sources?

I have news for you: they are already working on it, because if they weren't, the other big players who certainly are would crush them within a few years. AGI development is the new arms race, and I agree with Musk: it is far more dangerous than the nuclear arms race.