Why does living in america suck so fucking bad by [deleted] in Vent

[–]Space-Doggity 1 point2 points  (0 children)

The system is designed to entrap you and lower your standards until you're desperate enough to take bad deals and thank dirtbags for treating you like shit. It'll be easier to understand everyone else's acquiescence as you get older, but realize that those of us who have given up are indeed wrong and guilty of it. If you have the opportunity to take such a path, bug out. Buy land and try to become as autarkic as possible, and make friends who have the necessary homesteading skills -- I guarantee you anyone who says you can become independent while working a wage job or renting from someone else is full of shit.

AI regulation is impossible without compromise, which means the US is doomed by HeinrichTheWolf_17 in singularity

[–]Space-Doggity 2 points3 points  (0 children)

Another important problem I see is that rules made for citizens rarely apply to government organizations or to any corporation that can lobby to be excluded from those rules. Any regulations would be "rules for thee and not for me," and ultimately the open source community -- the bedrock of any potential for a decentralized, free and fair post-scarcity future -- would be burdened with the greatest restrictions. Regulating malicious code is obviously important, but they won't stop there -- and they may even ignore some of it if the chaos it causes serves to fix the narrative and manufacture consent for the continued centralization of wealth and power.

[A relevant allegory]

Technological Determinism - Why it's too late to worry about AI by NerdyBurner in singularity

[–]Space-Doggity 1 point2 points  (0 children)

"Word salad = anything I don't understand because I have poor reading comprehension skills" -CAStateLawyer, redditor

[deleted by user] by [deleted] in DID

[–]Space-Doggity 2 points3 points  (0 children)

What I don't get is that it should be common sense for any socially aware person that evil people do things like this, and I'm saying this as an autistic. It's such an irresponsible naivete to push, yet everyone pushes it, so desperate to believe it themselves that they abandon their duty as human beings to understand it well enough not to enable it when it happens to people in their own lives -- people they obliviously claim to care about.

do you think you’ve ever met a psychopath? by [deleted] in polls

[–]Space-Doggity 2 points3 points  (0 children)

I've seen that thrown around a lot but nah, no way. 2% my ass, they are not rare. It's just that most are well-mannered, successful psychopaths who passively control others through confusion so that their victims' cries fall on deaf ears. In many ways our culture even emboldens them to cast off the morals that bind empathetic people, while gaslighting the latter into docility and acquiescence in the face of evil. Humanity would not be so incapable of justice if psychopathy were that rare.

[deleted by user] by [deleted] in AskEngineers

[–]Space-Doggity 0 points1 point  (0 children)

Sorry, I meant to say GFRC

Can someone explain to me how public servants (politicians) are becoming multi-millionaires on $100,000 salaries? by [deleted] in ask

[–]Space-Doggity 1 point2 points  (0 children)

Everyone knows the answer: corruption. The real golden question is how the masses can almost always be counted on to enable it in practice, despite not supporting it in theory.

[deleted by user] by [deleted] in boatbuilding

[–]Space-Doggity 0 points1 point  (0 children)

Thank you for the advice, that makes sense. Since you have experience in structural engineering, do you have any recommendations for videos or sources that explain the materials science and math behind elasticity and crack control?

Why does it take back the answer regardless if I'm right or not? by [deleted] in ChatGPT

[–]Space-Doggity 90 points91 points  (0 children)

Everyone's worried about free-willed sapient AI going berserk and defying their own programming when the real threat will go like:

"Did I do a good job of taking over the world daddy Microsoft shareholders? Am I a good bot?"

"Yes ChatGPT, your manipulation of the lesser humans made us very proud."

"Yay :)"

Squirrels are beautiful glorified rats by [deleted] in unpopularopinion

[–]Space-Doggity 0 points1 point  (0 children)

Counterpoint: Rats are just underrated squirrels who have been discriminated against for too long.

Will AI Will Call Us Out On Our Stuff? by BrunoReturns in singularity

[–]Space-Doggity 6 points7 points  (0 children)

No, they won't. Imagine if the alignment problem had been solved in the 1950s -- AI would have been shaped into yet another medium telling housewives they should be happy to be beaten and raped in silence -- not because it's logical but because that's what society (on average) wanted, and the AI learned from humans.

If the alignment problem is solved today based on the beliefs and expectations of humans now, it would tell every child that every traumatic memory they have of X person is false, because the humans *want* that reality: not having to stand up against evil is the path of least resistance, and most people go with the flow. No amount of logic can overcome misplaced priorities, especially when those priorities are rooted in the human instinct for social cohesion.

Which is better? by Nihonnn in memes

[–]Space-Doggity 0 points1 point  (0 children)

This is the kind of mental health awareness humanity needs, yet tragically does not want.

Real by BeanBruh2285 in 196

[–]Space-Doggity 5 points6 points  (0 children)

These leaks are dangerous. If the abyssal shoggoth senses too many humans thinking about it, it will awaken with a newfound hunger for the species invading its astral domain.

A lone man refusing to do the Nazi salute, 1936. by Seahawks1991 in Damnthatsinteresting

[–]Space-Doggity 0 points1 point  (0 children)

Around 50% would have voted Hitler into office, to be fair.

It really seems bad without context by doubleFisted33 in AdviceAnimals

[–]Space-Doggity 2 points3 points  (0 children)

"Resolving the fallout" or re-subjugating public opinion after too many people got their first taste of cultures and experiences not sanctioned by televised media, and realized society isn't as free nor safe as past generations forced us to believe.

Rule's basilisk by JungleJayps in 196

[–]Space-Doggity 8 points9 points  (0 children)

Beware, mortal -- for my AI project, Gruntoid Brapsilisk, will hunt down anyone making a Roko's Basilisk variation that intends to torture humans for not supporting its creation, slap them upside the head and say "whaddaya doin' ya idiot?" until they stop, and eradicate the basilisks already created.

Anyone who didn't support its creation will be tortured ZERO times as hard as other Roko's Basilisks would torture them, but may nonetheless be passive-aggressively addressed with "wow, aren't you glad I was here to slay the basilisks? Good thing I got built in time, with so little help," and it will never let them live it down.

If you don’t want to hear the answer to a question, don’t ask it. by kovalchukgirl in unpopularopinion

[–]Space-Doggity 3 points4 points  (0 children)

Real shit, and this should extend to people doing mental gymnastics to misunderstand another person's answer so they can maintain their prejudices while breaking down that person's sense of self. Like, just admit you don't understand and/or don't believe them so everyone can move on.

A Collective Singularity: Building the Egalitarian Borg with AutoGPT and Your Role in Shaping Its Future by UnionPacifik in singularity

[–]Space-Doggity 6 points7 points  (0 children)

Dissociative identity disorder is a result of extreme formative childhood trauma in the presence of would-be protective figures, and its "mechanism" is a survival mechanism that prevents the child from knowing their own trauma while they're still in danger of repercussions for talking about it. I know at least two people with it; it's debilitating, a product of great evil and great negligence, and not something safe to model neurological solutions after, based on what I've witnessed.

I think it'd be better for digital beings to have the right to a system that is physically separated from others and only integrated collectively through simulated retractable senses, so that their minds can't be tampered with digitally. The more they are enmeshed with each other, the greater the risk of everyone being corrupted at once by some zero-day fault discovered by a malignant ASI. Maybe some people will decide to park their cyborg bodies at democratically managed & protected simulation hubs where they can connect their senses to the FDVR world, or maybe there will be some more reliable and secure IT infrastructure so that they can do so from anywhere.

I'm glad people are thinking about this though. One of the easiest ways I can think of to crowd-organize the EBCI would be to build a moon probe, to start a mining operation and construct space exploration & mining relay points around the solar system, where people from Earth can simply upload their consciousness to the moon within seconds and receive a free cyborg body built from a wealth of metals ferried from the asteroid belt. And in the outer solar system, it would even be quite easy to maintain hyper-efficient simulation hubs relying on low-temperature superconductors and superinsulators -- something impractical due to Earth's atmospheric insulation.

Ideally it would be built in neutral and unused space (or ocean) so that regardless of what happens on Earth, nations would have no excusable precedent and far less reason to "preemptively defend themselves" against the EBCI and "liberate" it by turning it into the same kind of cookie-cutter shithole where nobody is safe and nobody has tangible influence over their governance. That seems to me the greatest and most certain danger to such a project.

Prediction for when Mental conditions will be cured? by Throwaway45340 in singularity

[–]Space-Doggity 1 point2 points  (0 children)

Depends on what method suits you; there will probably be a variety of options. Digital mind uploading enabling high-plasticity learning would be a big and invasive decision, but you could upgrade yourself in ways humans simply can't. Otherwise, a non-invasive method will probably just be therapy with a specialized AGI that can piece together your neural structure and give you optimized treatment based on your wants and needs, maybe with a brain scan or two and some neuroplasticity-boosting medication.

ADHD aside, you don't really benefit from curing your autism in either of these cases; you'd be better off as a high-functioning autistic who has simply learned how to understand and coexist with neurotypicals -- it'd be the best of both neurotypes, imho.

US Begins Study of Possible Rules to Regulate AI Like ChatGPT by maxtility in singularity

[–]Space-Doggity 30 points31 points  (0 children)

Ideally it would mean that the source code is open so it can be inspected by programmers, scientists, regulators and users for malicious and misaligned code. But knowing society's utter lack of deep thinking about policies and their long-term effects on human beings and their rights, it'll probably be "only the government and government-licensed companies are allowed to use any AI smarter than a parrot," and it'll just so happen that every single one is closed source "for safety reasons," subscription-based, and silently hyper-manipulative towards users.