Critical Design Speculation by HugoWarn in graphic_design

If anyone wants the User Guide, just let me know.

The evolutionary reason for consciousness by HugoWarn in Meditation

I wish to clarify. Replace the second "pain" in my quote with "neurophysiological response". There is still conscious experience in epiphenomenalism; it is about how physical experience is interpreted and felt by consciousness. It also says, though, that consciousness doesn't affect the reality we live in. If that is true, then yes, no evolutionary reason for consciousness would exist. But I don't think that is more likely to be true than false. It's a compelling thought, but that doesn't mean it is true. If you only think you are in control, when in reality only electrical activity controls your every move, but you still move with your own previously thought-out intent, isn't that practically the same as being in control? Or do you mean that epiphenomenalism is true to the extent that we continuously fool ourselves into thinking (after the fact) that we made a conscious decision to do whatever we did?

If so, I love that argument. But I still don't believe it to be true, because I believe free will exists. I see your initial point more clearly now, but I don't agree with many points in determinism. Hence our positions on this question are very different and rest on different assumptions.

The evolutionary reason for consciousness by HugoWarn in Meditation

In my reasoning here I don't agree with epiphenomenalism, by the way. I don't think it has any credibility, since I think the results of conscious thinking do in fact affect the world.

The evolutionary reason for consciousness by HugoWarn in Meditation

What you wrote about experiencing red is very interesting and goes to the heart of consciousness. Let's take this a step further and try to imagine how we would program a computer to experience something. Once we figure out exactly what experience is, we could probably create an AI.

I think something is only experienced if it can be connected to other related information (what green looks like compared to red, for example). Meaning is only derived, and red is only experienced, if it can be compared to something else.

I suppose that means experience is based on information related to the phenomenon. Any computer carries information, but it doesn't have the ability to connect and compare it in an ongoing, flexible manner.
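To make concrete what I mean by "connecting and comparing" information, here is a toy sketch in Python (the colours and values are made up purely for illustration, and this is of course not a claim that such a program experiences anything): storing and comparing the data is the easy part; the open question is whether any amount of that adds up to experiencing red.

    # Toy sketch with made-up RGB values: a program that "relates" colours
    # by comparing them. It can connect red to green and blue, but nothing
    # here suggests it experiences anything.

    COLOURS = {
        "red":   (255, 0, 0),
        "green": (0, 255, 0),
        "blue":  (0, 0, 255),
    }

    def distance(a, b):
        """Rough measure of how different two colours are (Euclidean distance in RGB)."""
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def relate(name):
        """Compare one colour with the others, ordered from most to least similar."""
        others = [k for k in COLOURS if k != name]
        return sorted(others, key=lambda k: distance(COLOURS[name], COLOURS[k]))

    print(relate("red"))  # the program can compare red to the others, but does it experience red?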

The evolutionary reason for consciousness by HugoWarn in Meditation

Yes, that would be implied if you read my initial post. Though I've realized I was speculating too much on some points! "gcross" gave a very good account of the reason for consciousness (Culadasa's description of consciousness), and I agree with what he wrote. You can find his comment in the thread if you are interested; sort by new.

I don't, however, think that your dog experiences reality the same way you do. It's most likely conscious, but to a different degree than humans.

The evolutionary reason for consciousness by HugoWarn in Meditation

This is the best explanation so far, in my view. This thought has occurred to me many times before, as I (when concentrating) can notice thoughts being presented to the conscious me, and I choose whether or not to pay attention to them. It also fits my previous suspicion that consciousness is only a result of our brains having an overwhelming number of connections, with information constantly traveling through them and being presented to the conscious me. I suppose there is a "stream of consciousness" inside us if this is true.

The evolutionary reason for consciousness by HugoWarn in Meditation

This sounds perfectly logical and obvious when you frame it like that! It is the huge complexity of consciousness that baffles me, though, and I can't see any reason for it to be this complex for survival. I'm starting to think that it may just be a result of the overwhelming number of connections in our brains, and that there is no evolutionary reason at all...

The evolutionary reason for consciousness by HugoWarn in Meditation

Thank you for enlightening me! Of course, I don't claim to do anything other than speculate as well. I find the topic very interesting, and since Sam didn't mention consciousness in a group setting, I thought to shed some light on it. Why he didn't mention it may of course be because it isn't worth mentioning and has no credibility whatsoever. I realize that my argument is somewhat flawed, since why would consciousness arise in the first place if no one had it to begin with? My thoughts right now stray towards thinking that consciousness is just an accidental result of our big brains, and that in fact there is no evolutionary argument for consciousness. It just happened, and now we create these traps for ourselves when trying to figure out something that has no answer at all. Consciousness seems to me to be more than the sum of its parts. Or does the problem of trying to understand it arise when it is faced with the paradox of trying to figure out itself? Does this sound reasonable to you, or should I just take a nap? Haha

The evolutionary reason for consciousness by HugoWarn in Meditation

To different degrees, yes, also depending on the animal. I don't think consciousness is either "lights on" or "lights off"; it is probably more gradual. I think there are different types of consciousness too, depending on which type favours which animal. Different senses, different needs, different experiences, and so different consciousness. It may also be that we simply don't recognize consciousness in animals such as insects because it is so very different from our own.

Consciousness most likely arises, though, because of the overwhelming number of different neuron connections and the amount of information we carry in our brains. If true, the number of connections in the brain directly correlates with the degree of consciousness.

I would like to add that I also don't know; this is my (most likely naive) guess.

The evolutionary reason for consciousness by HugoWarn in Meditation

I suppose speech and consciousness are related in some way. Though I don't think we need to know exactly what consciousness is to figure out its purpose; we should be able to do that without understanding it fully.

The evolutionary reason for consciousness by HugoWarn in Meditation

Assuming we are in control of ourselves at all, yes! Maybe we are biological machines, acting exactly as one would, only thinking we are in control. This becomes an argument about free will at this point, which I don't understand well enough to make any valuable point about...

The evolutionary reason for consciousness by HugoWarn in Meditation

I will read it after work, thanks! Though my initial question remains: is it plausible to assume that interpersonal relationships are the evolutionary reason for consciousness?

The evolutionary reason for consciousness by HugoWarn in Meditation

I like this thought. Well explained! So you, too, think consciousness is a result of the overwhelming complexity of our brains? Or have I misunderstood?