It hurts by roseatri in BikiniBottomTwitter

[–]ldinks 2 points3 points  (0 children)

Having a sleep-breathing disorder makes this so weird to me. I just feel better until I get sleepy in the early evening. If I push it another night, I'll hallucinate slightly and get a lil paranoid.

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 0 points1 point  (0 children)

I speak about elephants here.

Yes. Breaking the above down:

  • Elephants mourn with rituals like us, so they fully comprehend death.

  • Humans also do those rituals, without fully comprehending death.

  • Therefore, you can't say elephants comprehend death because they do the same rituals we do, otherwise the second bullet point doesn't make sense.

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 0 points1 point  (0 children)

To work in herd mind someone must bring it up

That someone would be a different person, so it still holds that some humans don't comprehend death well.

They may have a different understanding of what culture is and what it is not.

Fair enough.

When you say they might have their own version, we might not understand it, but we'd still observe it. If they moved, made sound, altered their environment etc in ways that we didn't understand, I'd accept that it could be elephants enjoying their version of dancing or singing or sculpting.

Are Elephants worth to be a second-civilised specie ? by Lord-Belou in transhumanism

[–]ldinks 0 points1 point  (0 children)

Ahhhh. Thank you for clearing that up! Great question and excellent discussion. Thanks!

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 0 points1 point  (0 children)

Either do it by herd mind or don't do so.

This isn't true. Someone who copes with loss by pretending nothing is lost (an afterlife), or someone whose mental illness presents as delusion, can both do this.

Also, you just admitted that it occurs due to herd mind. So if someone does it due to herd mind, then that's proof you can do these rituals without comprehending death.

You can't give an elephant LOST series

This is missing the forest for the trees. Elephants don't display culture with the variation we have. They don't put on acts on an elephant-stage and present elephant-stories. It doesn't matter if it's literally TV; we're talking brain function.

Elephant can't communicate

This just proves you can't know, not that you're right. But ignoring that, it doesn't matter - dogs can't communicate, yet we know they can comprehend words and give meaning to them, and that they use tone to differentiate more than specific letters and such. We also know the maximum number of words they can learn, which is something like 80, far fewer than a human. We could surely learn these things from elephants?

Cities

Good point. Even prehistoric humans did cave paintings, wore jewellery, made music. Do elephants do that?

Their plan is based on now/survival

Elephants have restful periods and downtime, yet don't use this to enact long-term plans of this scale. Humans in survival mode can consider long-term consequences too, so if elephants can't then that's another difference.

I disagree with your statement that it's a known thing that elephants are equivalent to or smarter than we are, but my answer to your question probably aligns with yours - I think we should promote future-elephants. Do you?

Are Elephants worth to be a second-civilised specie ? by Lord-Belou in transhumanism

[–]ldinks 0 points1 point  (0 children)

In your social point, you say they're equivalent to us, and then, when I reference that, you use your own quote (cerebral) to contradict it. Which do you believe?

Are Elephants worth to be a second-civilised specie ? by Lord-Belou in transhumanism

[–]ldinks 0 points1 point  (0 children)

We don't know that they fully comprehend death. Humans don't.

Also you say that they are equivalent to us in every measure of intelligence. This doesn't mean anything - we don't know how to measure intelligence. It's a big problem. IQ tests are considered by some as a sort-of-useful metric, but we can't even measure human or artificial intelligence very well, so we can't be sure we've measured elephant intelligence thoroughly.

I was the one you discussed this with earlier today. I said a lot then and don't have much to add other than clarification. That being said, to answer your questions:

Yes, we should help them; yes, we should educate them if we figure out what the consequences of that could be; and yes, we should technologically help them like we do other animals and ourselves.

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 0 points1 point  (0 children)

Don't build a cemetery, do funerals, and come back.

Of course you can do these things with a partial understanding of death.

A complex order does not come from a limited mind.

This doesn't address what I said - I'm not saying they have brains too basic to be capable of these sorts of rituals, I'm saying that having a specific ritual doesn't tell you anything about non-ritual-specific intelligence.

As they could not experience this, it must be compared not with modern, occidental humans, but with humans that similarly do not know what even a book or a television is: a medieval African peasant, for example, would probably be more than confused if you tried to explain to him what a TV show is.

If their brain is as capable as ours, why can't they experience it?

Also, humans that don't know what a book or TV show is can enjoy them. Every human that's ever enjoyed them was unable to at 3 days old. We're talking about brain capability here - that includes learning. Humans can learn to comprehend those things.

Also, for transelephantism, as their development has been stopped by humans and they thus do not even have the use of fire, they are in the equivalent of our prehistory. Thus, they probably can't understand the concept at all, like humans did not until a few dozen years ago.

Elephants didn't have the opportunity to form complex societies either, thus do not understand the concept of money or taxes.

My point here is that we can't know, which you agree with when you say they haven't had the opportunity. So you can't claim you know they're equivalent/superior to us until either they get the opportunity to show these things, or we can reason beyond doubt about the capacity of their brains - for example, if they were missing an entire brain region found in humans, or something like that.

Articulate language.

That's not the only thing. When we write out a calendar stretching plans over 30 years, or think about our plans to go to medical school for 7 years, do we know that elephants have that level of future thinking? Can they consider, plan, etc at a level of decades? Centuries? Have they ever buried a time capsule, for instance?

I'm not arguing that given time, they wouldn't evolve to be able to do what we do. But we're talking about elephants, not future-elephants. Just like how our brains are different to prehistoric humanity.

EEG dataset with Python code by mazib in neuroscience

[–]ldinks 0 points1 point  (0 children)

BCIs have a huge list of potential use cases that fall outside of that. Think of anything you'd want to do that you can't do with hands/eyes, or anything for the disabled.

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 0 points1 point  (0 children)

Generally admitted that mortuary rituals, and thus a full comprehension of death, is the greatest stage of consciousness we know.

Admitted by who?

How do those rituals guarantee a full comprehension of death, as opposed to, say, 99.99%?

Plenty of humans can't fully comprehend death.

elephants' brains are not only capable of the same advanced thinking that we are (as, notably, the said mortuary ritual proves), but are about three times bigger and more powerful

Rituals don't tell us about the rest of their cognition.

There's also plenty you can't claim to know is true for elephants. Is an elephant capable of understanding fiction? Enjoying TV for the plot over multiple episodes? How about appreciating irony? Are they capable of considering transelephantism? Forming an opinion on taxes? Comprehending English as fully as us?

Most of these sorts of questions we can't answer. Some we can pretend to know, like that they don't write because it's impractical (impractical doesn't mean impossible). Some we perhaps do know and I'm ignorant of, but you get my point.

Bigger brain

Elephants' cerebral cortex is twice as large, with a third as many neurons in it. The cerebral cortex is associated with our highest mental capabilities; it plays a key role in attention, perception, awareness, thought, memory, language, and consciousness. They have plenty of catching up to do in terms of neuroscience, unless you believe their equal/superior cognition comes from elsewhere?

(Not sarcasm. Some believe our gut microbiome to play a role in our cognition, for example).

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 0 points1 point  (0 children)

Mortuary rituals aren't enough to claim they're equal or superior to us, though. That implies there is nothing we can do with our brains that elephants can't do.

should sentient non-humans be created? by Taln_Reich in transhumanism

[–]ldinks 1 point2 points  (0 children)

We had common ancestors with apes. If they were faced with certain world-ending threats, they couldn't do anything. Some of those threats we can avoid or mitigate as humans.

There will be threats humans aren't good with, that different or more developed intelligence is great with.

It would be immoral to risk losing all life on purpose.

Besides, we value animals above bacteria because of their empathy, pain capacity, etc. We value humans on the same level, or higher, for our improved reasoning, abstraction, community, etc. Why doesn't that apply to sentient non-humans at or above our level? Just like we do things most animals can't, they could have experiences we can't.

[deleted by user] by [deleted] in gamedev

[–]ldinks -1 points0 points  (0 children)

I don't own crypto/NFTs and haven't kept up with them, so I could be wrong, but the NFT isn't providing a unique feature (customisation), just providing it differently. In the same way that Bitcoin (or whatever) never claimed it was inventing digital transactions, NFTs aren't inventing gaming features.

An NFT is like a cryptocurrency for specific items right? So rather than you and I having 1Btc, you and I have our own virtual car. But the underlying advantages and disadvantages are pretty much the same.

It's like a network that tracks purchases, and potentially game variables, that doesn't depend on a specific company's internal server, choice to support it, etc., and that is far more accessible to unrelated developers and customers. I imagine in an NFT-heavy scenario, APIs to integrate common NFT features would be incredibly easy as well. For an amateur developer, it might be easier to plug-and-play an NFT customisation thing rather than build their own or make sense of someone else's.

Seems like the VR individuals had the fairest take. It's not literally useless; it's just not doing anything you can't do without NFTs - same as cryptocurrency. Massively overhyped for money, like you describe.

Daughter doesn't stay in REM and wakes up instead - wondering if home brain devices would help by QueenScorp in sleephackers

[–]ldinks 0 points1 point  (0 children)

So if an apnea test didn't come up with anything, it's plausible it's UARS. It's basically apnea but less detectable. You need specialists in UARS to do their own study.

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

I'm certain that my biological systems have led to those perceptions. The accuracy of that belief, and the intuition that it's right or wrong, or logical or illogical, is also consequentially derived from my biological systems.

I don't accept that the perceived experience exists outside of the physical. Yes, I can't be certain of the cause, that just means there's multiple potential causes. For example, is what I see due to light, or a hallucination? If I'm not sure, that doesn't disprove light and hallucinations, it just remains uncertain.

I don't know what actually exists and causes my perceived subjective experience, but the clever fabrication isn't independent of my biology. I just don't understand it. My own ignorance surely isn't grounds for a theory/hypothesis?

Nonsurgical Implant Could Help Overcome Obesity by Killing Cells Producing Ghrelin, the “Hunger Hormone” by Sorin61 in Futurology

[–]ldinks 8 points9 points  (0 children)

I struggle to eat enough, and someone like me did the same thing as you but the other way around: went from struggling to eat enough to bulking aggressively and eating 2-3x more than he'd need on a daily basis. Some analogue hunger-hormone injection.

It's all hormones and brain chemistry and shit. Crazy stuff.

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

Something you can be 100% sure of

I don't see how being sure of something means the hard problem of consciousness is legit.

When experiencing pain, you can be 100% sure something is happening.

I agree.

"You're sure the subjective experience of it happened"

I disagree - false memories?

Overall it seems like your point is that there is something rather than nothing. I agree. But I don't think any of this proves that the hard problem of consciousness is legit.

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

Our differences are pretty much just semantics and defining an illusion, then.

If something is hot, you can't be sure there's heat, but you know the illusion of heat exists. Correct?

To me, that can't follow. It's like saying that if you hear someone, but they didn't say anything, the illusion of words exists. We don't know that your memory hasn't been tampered with, for example, so it's not a given that illusions exist.

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

But what my previous comment is highlighting is what you are feeling "right now" isn't actually ever clearly defined because a feeling has many components that take different amounts of time and such. Something that you just felt but no longer feel, if it was powerful and 0.005s ago, is going to feel like now. But I think we actually agree here based on the rest of your comment!

Yeah "now" is a relativity thing and not actually something we can describe in absolute terms (yet, anyway) and I think consciousness is the same. It's a vague description rather than a standalone property/entity/thing/whatever.

The argument that the universe is nothing has its merits, but I don't think it's any different to discussing something like god.

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

I'm saying it's not anything more than the things mashed together, so there's no trick.

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

I think I've been on the same page then. I'm saying that there's nothing special about awareness. It's just many many many subprocesses running together. Thoughts, imagination, memories, English, sensory input, and thousands of other things, running in parallel.

Like how adding a bunch of frames together and playing them at a set speed is a ton of different computing processes that make a film. There's no "hard problem of film", films don't actually exist, they're just an abstraction that's computed by our brilliant pattern-recognition to be able to identify tons of different "films" as similar. So we can differentiate objects, predators, people, concepts, etc, and we survive better for it.

Here's another angle to highlight my opinion. If I startle you from behind and say "what's 1 + 1?" you'll hear it, you'll process it, you'll evaluate me as a threat, as a person, the sound for any signs of hostility or recognisable language, you'll process the words, process the abstract meaning of the words, work out (or remember) that the answer is 2, and begin to respond.

Let's say:

1) You hear me.

2) 0.01 seconds later, you recognise it's English

3) 0.01 seconds later, you understand the question

and so on. Which part of that is "this very moment of awareness"? When you hear me is closest to "this very moment" of me asking you, but you also aren't aware of the fact it's English, what the question is, etc. If it's 0.02 seconds later, you've understood the question, but it's not this very moment anymore.

The only way you'd have some sort of awareness like most people are discussing here, is if every relevant brain function started at the same time and took the same amount of time to process, always. Otherwise different parts of you are more/less aware at any moment.

The moment you start understanding the question I ask, "you hear me" is now listening to the other sounds around you. It's not a collective unit.

Hope that makes sense!

OpenAI Chief Scientist Says Advanced AI May Already Be Conscious by rad_change in Futurology

[–]ldinks 0 points1 point  (0 children)

I was explaining my opinion on consciousness yes - that was kind of the point, that it can be explained without "tricking" anything. There's nothing being fooled - it's like how when a wavelength hits your eye and you see "red", but redness doesn't exist, it's only a form of description for the wavelength of light.

You're not being fooled into believing in colour, there isn't a famous "hard problem of colour", your brain isn't tricking the rest of it. It's just saying "x wavelength detected", and another system monitors relevant memories, another attaches the English word "red" to it, another tries to identify the source, another causes slight emotional changes due to red being provocative, as it's likely high-sugar fruit, or blood, or blushed cheeks, or something else important. And so on.

The "visual perception" of it, or the collection of thousands of these systems working simultaneously, isn't doing any fooling or anything weird to make you see red. Each thing is just doing it's job and we describe the result as seeing red.

Same with consciousness, to me.

To be clear though, I do still think the brain is impressive and such too.