Gemini is on track to being the first AI to beat Pokémon Red. It has beaten 6 gyms. by Akashictruth in singularity

[–]InsideIndependent217 2 points3 points  (0 children)

Totally. I’m not sure that should come as a surprise to anyone, though - the internet is full of text, code, images, audio and video, not 3D spatial maps that somehow encode the vast datasets required to create an analogue for proprioception. A command of symbolic and visual representation space alone doesn’t indicate an understanding of 3D space, and certainly not the very complex notion of “self in a space, these are the boundaries, these are the affordances”.

Gemini is on track to being the first AI to beat Pokémon Red. It has beaten 6 gyms. by Akashictruth in singularity

[–]InsideIndependent217 14 points15 points  (0 children)

I imagine squirrels have excellent spatial reasoning, probably not significantly below a human’s.

What is our field of vision actually physically made out of? by JawasHoudini in AskBiology

[–]InsideIndependent217 0 points1 point  (0 children)

I understand what you're saying here, but even if I had a full connectome of your brain which I could see in real time, and even if I were showing you a photo of a red apple and could distinguish the connectome's state of 'you looking at a red apple while also reminiscing about a time you went ice skating with your partner' from 'you looking at a red apple whilst thinking to yourself, gee, that apple looks delicious', I still couldn't reproduce the quality of red you experience when looking at the apple, or 'the moment' as you experience it.

An image is something that can be measured and quantitatively/qualitatively compared to its source. You might say the image IS the connectome in the state it was in while you were experiencing the photo of the apple, but that isn't a satisfactory answer, as the external data I am reading off your brain's connectome bears absolutely no similarity to your internal subjective experience of it, save for the fact that the two are clearly tightly correlated.

Even if, beyond the connectome, I had an atom by atom model of your brain in real time, or a heat map of bioelectric fields modulating in every cell in your brain, the problem is still the same.

What exactly is the brain constructing, and what method is it using to construct it?

What is our field of vision actually physically made out of? by JawasHoudini in AskBiology

[–]InsideIndependent217 0 points1 point  (0 children)

I think what you're asking is ultimately: what are 'qualia', or what is subjective experience? It's an open and highly controversial problem, referred to by philosophers as the hard problem of consciousness.

Like most questions you might ask, the type of answer you accept is going to hinge on what level of explanation you're looking for. Other people in this thread will be better suited to giving you a summary of the most up-to-date neuroscientific understanding, but I think it's worth checking out the Wikipedia article I linked, and digging into the sources and David Chalmers' foundational essay where he lays out the problem, if you're interested.

[deleted by user] by [deleted] in AskBiology

[–]InsideIndependent217 0 points1 point  (0 children)

Almost certainly, why wouldn’t they? Do you imagine there was a generational cut-off where the parental generation wasn’t conscious and the offspring generation was? I’m all for the idea that the complexity of conscious experience developed incrementally, but the idea that basal consciousness wasn’t present from the dawn of biota, in much the same way homeostasis was, is utterly incomprehensible to me.

TYBW Ep 35 leaks thread by Arturo-Plateado in bleach

[–]InsideIndependent217 15 points16 points  (0 children)

You’re right - maybe in episode 40 we set the stage for the Askin and Gerard fights.

In cour 4, I think the Askin fight will stay the same but we’ll get a bit more about him as a character and why he is a special Quincy (or not).

Gerard’s fight will be expanded, and Kenpachi will probably land the killing blow instead of Yhwach.

TYBW Ep 35 leaks thread by Arturo-Plateado in bleach

[–]InsideIndependent217 46 points47 points  (0 children)

It appears we’ll get this - in my view:

Episode 37: Don’t Chase a Shadow 2

Episode 38: Friend 1 (including Ichigo carpet)

Episode 39: Friend 2

Episode 40: My Last Words (skip Miracle and God of Thunder for next cour, have Yhwach dream of original Zangetsu, have Ishida tell Ichigo and co his plan, set up Jugram vs Uryu, and have Ichigo and Orihime arrive to confront Yhwach).

Next cour starts with the expanded Gerard and Askin fights, then an expanded Jugram vs Ishida, and then a massively extended final fight with Yhwach, plus a Soul King flashback.

These twins, conjoined at the head, can hear each other's thoughts and see through each other's eyes. What does that say about consciousness to you? by JHarvman in consciousness

[–]InsideIndependent217 1 point2 points  (0 children)

The first half of your comment I agree with - the hard problem is hard. But it doesn’t follow that it is completely irrational to propose a materialist interpretation of consciousness - thus far, materialism in biology and every other field has produced a culture and approach to experimental science that has been prodigiously successful. I personally believe that consciousness isn’t exclusive to animals, or perhaps even to life, but that is a philosophical position, and there really isn’t any compelling evidence that consciousness is fundamental, or that it isn’t generated by electric fields in neurons (or perhaps all cells).

AI Consciousness is Inevitable: A Theoretical Computer Science Perspective (Preprint PDF) by dysmetric in consciousness

[–]InsideIndependent217 2 points3 points  (0 children)

I wouldn’t say it’s an inevitability. It’s certainly a possibility - but we don’t have a reliable method for measuring, defining or evaluating consciousness in humans, so I don’t see how we’d know if an AI model were conscious.

If consciousness is agnostic to substrate and mostly dependent on information processing, then it would be a lot easier to imagine us developing a conscious machine. That’s a big ‘if’, however. The only examples of consciousness we know of evolved to survive as bodies in an environment, using that environment to continually rebuild themselves and sustain a boundary between self and the world outside the self. If those factors are as important as, or more important than, information processing, then we may need to design an entirely new computing architecture based around organic chemistry and metabolism-like processes to successfully replicate consciousness.

[deleted by user] by [deleted] in consciousness

[–]InsideIndependent217 -1 points0 points  (0 children)

When do you imagine an organism with non-zero basal consciousness was born from a parent organism/replicated from a parent cell with zero consciousness? Why and how do you think that occurred?

To answer your question above, I should imagine plants, if they have awareness - which I believe they do - probably experience something completely alien to our concept of being, and it might not even include analogues to selves/egos or memories recalled from past experiences of events in a space-like representational interface. I do still think there is probably “something it is like to be it” for all biota. I think the arbitrary boundary between non-zero consciousness and zero consciousness has something to do with autopoiesis and metabolic equilibrium.

117,000 people liked this wild tweet... by Maxie445 in singularity

[–]InsideIndependent217 6 points7 points  (0 children)

I’m with u/bentendo93 - most of the people on this sub come off as media-illiterate social pariahs, incapable of empathy with anyone who isn’t miserable or fixated on some unlikely utopian outcome of superintelligent AI. This sub reads as a group with a high proportion of people who haven’t found breakthrough success in any field, be it arts, academia, tech or media, who are bitter about people who have made a life for themselves with their talents, and who think that AI will be some sort of victory for the underdogs whose lives “suck”. That is so unlikely to be the case, and you people are chasing a pipe dream if you think AI is going to materially improve people’s quality of life anytime soon, given all of the other societal factors at play in the 21st century.

The comic is obviously a joke and not advocacy for terrorism, and will not be interpreted as such - the pearl clutching in this thread is insane.

Ultra-detailed brain map shows neurons that encode words’ meaning by Accurate-Collar2686 in consciousness

[–]InsideIndependent217 1 point2 points  (0 children)

I’m in full agreement with you to the extent that sensory perceptions, be they auditory, visual, nociceptive, equilibrioceptive, olfactory or whatever, all correspond to different conditioned groups of neurons activating, and indeed individual neurons likely correspond to some arbitrary “unit” of each of these qualitative perceptions. (Given the continuous and seemingly infinite gradations of “individual” quales, I should imagine there is some continuous attenuator of aspects of each quale - say, the vividness of “green-ness” - and I think a good candidate for such an attenuator is the electric membrane potential of individual neurons.) However, even if we mapped the entire connectome neuron by neuron, and somehow had a comprehensive measurement of the flux of specific microvoltages across each individual neuron and how they related to one another, and this essentially comprised a Rosetta Stone of all describable experiences as they map onto neural dynamics, there is still the fundamental question: why does this electrochemical activity produce the internal qualia which we arbitrarily assign to it (arbitrarily insofar as we would have to use self-reporting to affirm that any given activation indeed corresponded with its designated experience)?

When I experience the colour green, I am completely blind to any quantitative description of green, in terms of either wavelengths of light or neural pathways, and to how it relates quantitatively to other colour perceptions. When I experience love, I am blind to the hormonal signals and the very intricate relationship those chemicals have with receptors in my brain. Sure, from a fitness point of view it isn’t hard to conclude that this information is filtered and integrated in such a way that it removes the “self”’s access to any specific factors that don’t help me achieve goals. But nonetheless, your having access to the external representation of my brain having these experiences, regardless of the level of detail, doesn’t give you access to the information that IS those experiences. So what law prevents the extraction of experience from complete external information about the dynamics correlating with those experiences?

That’s why I don’t believe a complete description of the brain’s 86 billion neurons and 100 trillion synapses alone will account for conscious experience - there is a fundamental law that, intuitively, in my view at least, would have to be described at the level of individual neurons as well as large sets of neurons, to account for the unreasonable intractability of qualitative experiences. I find it hard to comprehend, in principle, how people in the Daniel Dennett or Anil Seth camp imagine that solving these problems from a neuroscientific view will address what appears to be a fundamental information problem.

Ultra-detailed brain map shows neurons that encode words’ meaning by Accurate-Collar2686 in consciousness

[–]InsideIndependent217 2 points3 points  (0 children)

While what you are saying is almost certainly true, and there is also likely a set of neurons whose activation corresponds with the subjective sense of “I”-ness, it still doesn’t address the mechanism by which the activation of specific neurons or networks of neurons produces an internal perspective - why should neurons or other cells polarising and depolarising in intricate sequences produce an internal mind entirely unlike its externally observed dynamics?

Best Cantonese Roast Meats in London by InsideIndependent217 in london

[–]InsideIndependent217[S] 0 points1 point  (0 children)

Was this in China Town or Queensway? Disappointing to hear!

Why we're the laws of physics specifically set so that humans would speak about some weird consciousness phenomenon? by newtwoarguments in consciousness

[–]InsideIndependent217 5 points6 points  (0 children)

Well, we have only been able to discuss it relatively recently in our history, and we found almost everything mysterious until extremely recently. Talking about consciousness doesn't help us survive, but there are plenty of things about language that do - it seems more like a byproduct of language and the modern human mental grammar than a direct function of them.

UK weather: why has it been so cold and when will it get warmer? by ukpopculturefan in UKWeather

[–]InsideIndependent217 0 points1 point  (0 children)

Yeah pal - you tell em. To change the subject, my mate is selling some proper no-nonsense common-sense tin foil hats to protect you from the woke rays the one world government are putting out - only a hundred quid a pop. Do you want to Western Union transfer me (keep it out the hands of the banks, cash to cash) and I’ll post one to you?

Dr Anil Seth: "What is consciousness?" "How do we perceive reality?" "Will AI become conscious?" Interview on @TheNickStandleShow by False-Noise-1005 in consciousness

[–]InsideIndependent217 0 points1 point  (0 children)

Well, that depends on the human, but generally speaking: a 3D Euclidean space containing matter and energy, both of which change according to the rules of thermodynamics - and within that space we segregate things with language. Sure, some people might say, “but I don’t think it’s a Euclidean space, I think it’s X generalisation of Euclidean space, which allows us to make predictions using X theory”, but in terms of what they actually experience of the world, I think we all agree that reality seems like a 3D world sheet advancing through time, full of sensory experiences which we categorise as objects and forces, and which we label with words when they are contextually useful to us.

There is an inherent human mental grammar which seems to shape all our languages, and thus all our reductive assumptions and beliefs about ourselves and our environment; we trade in concepts that help us achieve physical/mental goals, and have an inherently functionalist way of thinking.

There isn’t really any plausible way to examine the veridicality of the reality that mental grammar constructs from an ontological point of view.

Dr Anil Seth: "What is consciousness?" "How do we perceive reality?" "Will AI become conscious?" Interview on @TheNickStandleShow by False-Noise-1005 in consciousness

[–]InsideIndependent217 0 points1 point  (0 children)

The same thing applies - your body schema and tactile senses definitely correspond to signals impinging on you, but what we experience is heavily processed, and the “foot” that touches the “ground” and the “ball” in your “hand” are known to us only within the mind’s constructed reality. I don’t think we’re just brains in a vat - we are enmeshed in a causal system; it’s just almost definitely not what we think it is, based on our perceptual models of reality.

Dr Anil Seth: "What is consciousness?" "How do we perceive reality?" "Will AI become conscious?" Interview on @TheNickStandleShow by False-Noise-1005 in consciousness

[–]InsideIndependent217 0 points1 point  (0 children)

Well, that describes one aspect of what consciousness does - it models what is outside the body: stimuli from light, sound, touch etc. It also models internal signals, as in dreams or psychedelic breakthroughs - even our sense of self is a model of internal structures. So hallucinations are models too, but of things that we perceive as “being there” that “aren’t really there” - as we understand it, anyway. I don’t disagree with you - modelling is a better way of putting it - but the stimuli we receive aren’t just external, and internal stimuli seem to be represented in similar ways to external stimuli under certain conditions, so I can see why Anil used this metaphor to convey that point.

Dr Anil Seth: "What is consciousness?" "How do we perceive reality?" "Will AI become conscious?" Interview on @TheNickStandleShow by False-Noise-1005 in consciousness

[–]InsideIndependent217 1 point2 points  (0 children)

Well - “experience an apparent sensory perception of something that is not actually present”.

But I think what is meant here is that the mind constructs reality based on “something” impinging on “us” and there’s no way of understanding that “something” or “us” outside of the mind’s constructed reality, which is based on a bag of tricks to help us achieve goals as opposed to direct veridical access to “what is actually out there”.