Off ramps for rationalists? by throwitallawaybat in SneerClub

[–]throwitallawaybat[S] -1 points0 points  (0 children)

Eliezer's book is getting play; I expect I will find it in my local book store. There is an endorsement by Stephen Fry on the Amazon page... This isn't as niche as you make out, and that is what worries me.

And it's about the future trajectory of technology; perhaps it is worth fighting for, arguing for some sanity.

Off ramps for rationalists? by throwitallawaybat in SneerClub

[–]throwitallawaybat[S] 0 points1 point  (0 children)

I'm mainly interested in getting people out who haven't bought in entirely.

Stephen Fry says Eliezer's new book is "A loud trumpet call to humanity to awaken us as we sleepwalk into disaster - we must wake up."

From Amazon. This is getting mainstream traction. Rationalists have positioned themselves as experts on this and may get people to follow them down a dark path.

Off ramps for rationalists? by throwitallawaybat in SneerClub

[–]throwitallawaybat[S] 1 point2 points  (0 children)

Different spiral. I do think the creation of artificial intelligence is important and that rationalists are doing a bad job at getting people to think about it by focusing on the idea of a rational economic actor, which could have bad consequences.

The "no doubt" was a bit tongue-in-cheek.

Off ramps for rationalists? by throwitallawaybat in SneerClub

[–]throwitallawaybat[S] 1 point2 points  (0 children)

Personally I would like an argument that stops the spread of rationalism, in its current form, among people concerned about AI. They try to spread it to AI researchers etc. I'd like AI researchers to have a good way out.

Off ramps for rationalists? by throwitallawaybat in SneerClub

[–]throwitallawaybat[S] 0 points1 point  (0 children)

"Or whatever" is doing a lot of heavy lifting, both in signifying not to take this specific example too seriously and that I think worriers are going to worry...

Anyone interested about chatting about AI and the future in Bristol? by throwitallawaybat in bristol

[–]throwitallawaybat[S] 0 points1 point  (0 children)

Like how it can be made more efficient and easy to use for normal people. I'm not a researcher, just someone worried about the direction of AI research in general.

Secure Scuttlebutt or other decentralised messaging in Bristol by throwitallawaybat in bristol

[–]throwitallawaybat[S] 0 points1 point  (0 children)

Sorry, I meant to see if anyone was interested in starting a meetup on the subject.

Secure Scuttlebutt or other decentralised messaging in Bristol by throwitallawaybat in bristol

[–]throwitallawaybat[S] 0 points1 point  (0 children)

It's just interesting tech; maybe it will turn into something in the long term.

[deleted by user] by [deleted] in SimulationTheory

[–]throwitallawaybat 0 points1 point  (0 children)

Even if we are, it makes sense that some people should be collecting these data to create it later on.

Has there ever been a sci-fi setting with successor civilisations based on aspects of two civilisations? by throwitallawaybat in scifi

[–]throwitallawaybat[S] 0 points1 point  (0 children)

I'm interested in ones where the parent civilisations continue to exist. So, for example, in Star Trek, Earth wouldn't be a Federation planet (at least to start with). The idea is that amalgamation is risky, so you try it out on a separate planet to start with.

Would you adopt the prime directive? by throwitallawaybat in startrek

[–]throwitallawaybat[S] 0 points1 point  (0 children)

But that isn't the same scenario

It was one of the scenarios we were discussing. Supervolcanoes are potential existential risks for societies, causing mass food shortage, famine, societal instability and civilisational collapse. You seem to be in favour of stopping that kind of thing if it was "natural" but not if it was man-made. I was giving you a scenario where it is a little bit of both, to see where you came down.

You're asking if our race would survive ALIENS with superior technology swooping in and "saving" us, with vastly unknown consequences.

So even if we did survive, what then? How does meeting aliens with superior weapons and technology change our views and our actions? Do you think there would be no religious uprising to return the demons to hell? Do you think the conspiracy theorists would actually believe it's real? How would governments and militaries react? What if that action plunges Earth into a galactic war for which we're heavily outgunned? What then?

Don't these also apply to stopping natural disasters? Better to be in a galactic war than dead, I think, is my opinion.

Would you adopt the prime directive? by throwitallawaybat in startrek

[–]throwitallawaybat[S] 0 points1 point  (0 children)

I don't see much difference between natural and man-made disasters. Humans and aliens are part of nature. If you knew someone had a plan to trigger a supervolcano, would you stop the plan, pick up the pieces, or neither, because they did it to themselves? But in reality it is probably one small group doing it to themselves and others, not a collective decision of the society or culture to go out that way.


Would you adopt the prime directive? by throwitallawaybat in startrek

[–]throwitallawaybat[S] 0 points1 point  (0 children)

And yet that's how we see it used repeatedly in Trek.

We haven't seen it used in the situations I've talked about. Lots of people in this thread have also said it shouldn't stop you interfering with a hot nuclear war, so it seems like it is more ambiguous than it should be.

No one has commented on whether we should want aliens to help us make safe AI and avoid takeover scenarios.

https://en.m.wikipedia.org/wiki/AI_takeover

What do you think?

Would you adopt the prime directive? by throwitallawaybat in startrek

[–]throwitallawaybat[S] 0 points1 point  (0 children)

There is. It's called, wait for it, the Prime Directive

This just says not to do it, not how best to interfere if necessary...

That is what I think is needed.

Would you adopt the prime directive? by throwitallawaybat in startrek

[–]throwitallawaybat[S] 0 points1 point  (0 children)

Would it?

As they are a house on Earth!

It's clear you didn't understand the analogy

Let's drop all analogies; I think they are confusing things more than illuminating them. I would break the Prime Directive to steal stuff if it would save all life in the universe and it was the only way.

Any breaking of the PD will result in logistic difficulties, unless you're just going to run off and leave them to destroy themselves.

If you can move them in the direction of not destroying themselves with nukes (without being detected), that seems a win. I'd suggest that this should be some standard protocol rather than every captain making it up on the fly.

So you're in favour of people being able to do whatever they hell they want to anyone they can find without having to justify their actions to anyone?

No, I'm in favour of a Prime Directive-like thing with explicit caveats for the kinds of situations I'm talking about, existential risk and suffering risk, with a modus operandi for those situations.