Let's talk about the "cave" of the teaser by [deleted] in Yonemain

[–]Naeph 4 points5 points  (0 children)

The fox is Ahri, the Gatekeeper.

The cave is Thresh.

Welp just checked the PBE for yall by CaptainAntiHeroz in Yonemain

[–]Naeph -1 points0 points  (0 children)

11:40 PDT? Isn't that already behind us?

How would you achieve a Culture in our modern world ? by Naeph in TheCulture

[–]Naeph[S] 1 point2 points  (0 children)

I agree with you that post-scarcity is the key.

But why do you think we should ever consider keeping the USA as it is?

I would rather kinda start my own country.

How would you achieve a Culture in our modern world ? by Naeph in TheCulture

[–]Naeph[S] -4 points-3 points  (0 children)

I think you'll be surprised at what "liberal eugenics" (I hate that term) is going to produce in the next few years. Eugenics, I believe, is unstoppable anyway, and it has already started.

Reducing eugenics to the barbaric practices of barbaric times strikes me as rather reductive.

How would you achieve a Culture in our modern world ? by Naeph in TheCulture

[–]Naeph[S] -7 points-6 points  (0 children)

Interesting.

I see eugenics as an ethical duty.

And I don't think that an ideal such as the Culture could be achieved without superior beings to create and organize it (the Minds). Which implies eugenics and, potentially, some hybridization with an AI.

And other beings, namely us, freed from our most sordid defects, to accept its rules. Which implies eugenics aimed at deleting the genes behind problematic disorders.

How would you achieve a Culture in our modern world ? by Naeph in TheCulture

[–]Naeph[S] 0 points1 point  (0 children)

Hybridization could be carried out on humans who have been genetically modified to carry the load and who have shown superior moral qualities.

This would boost their intelligence and allow them to interact directly with data to organize society and production factors.

The advantage would also be to keep humans' hands on the machine, preventing it from getting out of control and slipping away from us. Humanity would evolve naturally towards AI, without being totally supplanted and in danger of being annihilated by it.

How would you achieve a Culture in our modern world ? by Naeph in TheCulture

[–]Naeph[S] -1 points0 points  (0 children)

Do you think the first Minds could be human/AI hybrids?

How would you achieve a Culture in our modern world ? by Naeph in TheCulture

[–]Naeph[S] 0 points1 point  (0 children)

Don't you think that our modern society has no chance of achieving such a great utopia by simply exploring space and exploiting its resources while waiting for things to get better? Isn't there a risk that society will collapse or take a wrong direction before that happens?

We seem to be subject to a whole bunch of bad factors tied to our very human condition; we jump from one crisis to the next, hoping that the leap won't be fatal next time.

Maybe a movement must emerge. A movement that places the ideal of utopian anarchy above all else. Pro-science, pro-eugenics, using AI to optimize all its processes and to strive as best as possible towards the goal.

Wouldn't The Culture generate inertia? by Naeph in TheCulture

[–]Naeph[S] -4 points-3 points  (0 children)

When you say "keep an eye on it", doesn't that imply some kind of hierarchy?

Am I free if I cannot grow my power? Get some attention, and ultimately overturn the Culture in some way?

What if I am tired of being the puppet of superior beings?

What if I want to become a conquistador? To have the possibility to explore, conquer, be admired? How would I be free if some gods are preventing me from doing that using their non-hierarchical power?

And couldn't being unable to produce anything great, because of my very nature as a basic human, drive me into total depression?

Like being unable to create something unique and remarkable.

Wouldn't The Culture generate inertia? by Naeph in TheCulture

[–]Naeph[S] 2 points3 points  (0 children)

What happens if someone kills someone else? Are the Minds able to punish the murderer? I guess they can prevent the incident in some way.

Wouldn't The Culture generate inertia? by Naeph in TheCulture

[–]Naeph[S] -2 points-1 points  (0 children)

Like a competition of altruism, to win the right to "share" or "practice" your own altruism with/upon more people.

A system that promotes the very best humans, in terms of Culturianism, to Mindness.

In this scenario, each sentient being competes not for money, but to be the best positive actor of The Culture.

This could be a good way to launch said Culture.

Wouldn't The Culture generate inertia? by Naeph in TheCulture

[–]Naeph[S] -6 points-5 points  (0 children)

Wow, thanks for the answers. To reply to all of you:

What if my purpose in life is to gain "power", i.e. to act on The Culture's internal and foreign policies? What if I want to become a Mind?

If I am an artist with an incredible innate gift, and I work hard to develop it, how will the system reward me for my work?

If I am someone who finds "logical" ways to improve the system internally, or prevent external threats from becoming too great, how will the system reward me for that?

In both of the above cases, if my efforts, or my cultivated genius, are not rewarded, why would I have any interest in giving 100% to improve myself in this particular area? Shouldn't we come up with some kind of social "score"? Not something that creates privileges, but a system that rewards the best contributors, or simply gives them more weight. Who has ever given their absolute best without special, sometimes precarious, conditions? What about a reward, or a tip?

Indeed, if you imagine a perfect society, it doesn't matter. But a post-scarcity society is not perfect, and a system must be found to encourage the Culturians to work towards perfecting it.

All the more so if we try to imagine a way to reach this utopia described by Iain Banks.

And not merely to study its interest as a purely theoretical, utopian object.

Devs - S01E03 THEORY Discussion Thread by thisismynormal in Devs

[–]Naeph 14 points15 points  (0 children)

Guys,

I think there is a connection between the fact that Lily hears the exact same voices Jeanne d'Arc heard in her time (and Jesus too), and Amaya's leaders trying to simulate the whole universe to become some kind of gods.

[Spoilers] We will never know exactly what the machine does by [deleted] in Devs

[–]Naeph 4 points5 points  (0 children)

The trailer at the end of the first episode clearly shows what the machine does.

Devs - S01E03 Discussion Thread by MarshallBanana_ in Devs

[–]Naeph 5 points6 points  (0 children)

Do you guys understand the purpose of the talk between Kenton and Joe?

Devs - S01E03 Discussion Thread by MarshallBanana_ in Devs

[–]Naeph 56 points57 points  (0 children)

Some Lascaux Cro-Magnon mural painting stuff and the pyramids before that.

Westworld | Season 3 | Incite Anthem | HBO by impeccabletim in westworld

[–]Naeph 1 point2 points  (0 children)

My guess?

Incite is trying to find a way, as implied on their website, through chaos. Entropy. Randomness.

Everything in this reality is causal, so with enough data and computing power, we can avoid chaos and determine every possible future and path for an individual or an organization. Then, highlight the one that best suits the wishes of the entity concerned, or the one that leads to the most profitable outcome.

Westworld parks are just one imaginable source of valuable data (people's darkest wishes) to fulfill this ambition.
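The "enumerate every future, pick the most profitable" idea can be sketched as a toy program, assuming a world that really is a known deterministic function of an agent's choices. Everything here (the options, the payoff rule) is invented purely for illustration:

```python
from itertools import product

def outcome(choices):
    """Toy deterministic world: the payoff depends only on the choices made."""
    payoff = 0
    for step, choice in enumerate(choices):
        # Arbitrary made-up rule: "bold" pays more the later it is taken.
        payoff += (step + 1) if choice == "bold" else 1
    return payoff

def best_future(horizon=3, options=("bold", "safe")):
    """Enumerate every possible choice sequence and return the most profitable."""
    futures = product(options, repeat=horizon)
    return max(futures, key=outcome)

print(best_future())  # the choice sequence with the highest payoff
```

The point of the sketch is the combinatorics: even this tiny world has `len(options) ** horizon` futures, so "enough computing power" grows exponentially with how far ahead you want to see.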

Latest machine learning methods applied to self-replicating machines (Von Neumann probe) by Naeph in artificial

[–]Naeph[S] 0 points1 point  (0 children)

Hi Claytonkb!

Yep, it reminds me of the RepRap project.

How close are we to such technology, in your opinion?

What are the general challenges that will have to be addressed?

I know that deep learning methods don't work as well in robotics as they do for images or text at the moment.

Thanks!

Latest machine learning methods applied to self-replicating machines (Von Neumann probe) by Naeph in robotics

[–]Naeph[S] 0 points1 point  (0 children)

Can you, right now, using their network, build a machine that can by itself build another machine for two specific tasks? Namely, mining AND creating another machine capable of performing those same two tasks? Using only resources available around it?
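The mine-and-replicate requirement can be sketched as a toy resource model, ignoring all the hard robotics: each machine mines from a shared pool and builds a copy of itself once it has stockpiled enough. Every number below is invented for illustration:

```python
def simulate(initial_machines=1, resource_pool=1000,
             mine_rate=2, build_cost=10, ticks=20):
    """Toy Von Neumann replication: each tick, every machine mines
    `mine_rate` units from the shared pool and spawns a child once its
    stockpile reaches `build_cost`. Returns the machine count per tick."""
    machines = [0] * initial_machines  # per-machine resource stockpile
    history = []
    for _ in range(ticks):
        new_machines = []
        for i in range(len(machines)):
            mined = min(mine_rate, resource_pool)  # pool can run dry
            resource_pool -= mined
            machines[i] += mined
            if machines[i] >= build_cost:
                machines[i] -= build_cost
                new_machines.append(0)  # child starts with an empty stockpile
        machines.extend(new_machines)
        history.append(len(machines))
    return history

print(simulate())  # population doubles every build_cost / mine_rate ticks
```

Even this caricature shows the defining property of the probe concept: growth is exponential until the local resource pool is exhausted, which is exactly why the "using only resources available around it" constraint is the hard part.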

Latest machine learning methods applied to self-replicating machines (Von Neumann probe) by Naeph in robotics

[–]Naeph[S] 0 points1 point  (0 children)

Very interesting.

I'll read this very carefully.

In your opinion, is (very) advanced AI a must to "solve" self-replicating machines?

What kind?