How did the Brain Come into Existence?

In summary, the brain is just an information processing device and its main purpose is to integrate sensory information and to move the body in the environment on the basis of that information.
  • #1
Iacchus32
So just how did the brain come into existence physically? ... To exploit an available resource perhaps? :wink: Which, I think, is the whole point: why does anything evolve, if not to exploit the resources at hand? Why did our eyes evolve, if not to exploit the available source of light? What would be the difference between that and, say, designing an amplifier in order to reproduce sound? Wouldn't it be fair to say that the amplifier exists solely to reproduce the sound? So why should it be any different with the brain, which is just a means by which to capture/contain consciousness?
 
  • #2
So just how did the brain come into existence physically? ... To exploit an available resource perhaps?
How, or why?

What would be the difference between that and say designing an amplifier in order to reproduce sound?
We can be certain the amplifier had a creator.

Wouldn't it be fair to say that the amplifier exists solely to reproduce the sound?
An ideal amplifier, yes, but this isn’t an ideal world and so unfortunately in practice they tend also to produce...

So why should it be any different with the brain which, is just a means by which to capture/contain consciousness?
We can be certain the amplifier had a creator.
 
  • #3
Everything which is alive has at least a rudimentary form of consciousness, whether it has a brain or not. So guess what? The brain is not necessarily the source of consciousness.

Also, we have eyes, which are the receptacle for light; we have ears, which are the receptacle for sound; we have noses, which are the receptacle for smell; we have skin, which is the receptacle for touch; and we have taste buds, which are the receptacle for taste, all of which are externally based sensations. So why should it be any different with the mind, the receptacle for consciousness?
 
  • #4
Iacchus32 said:
Everything which is alive has at least a rudimentary form of consciousness, whether it has a brain or not.

On what grounds do you make this claim?

Also, we have eyes, which are the receptacle for light; we have ears, which are the receptacle for sound; we have noses, which are the receptacle for smell; we have skin, which is the receptacle for touch; and we have taste buds, which are the receptacle for taste, all of which are externally based sensations. So why should it be any different with the mind, the receptacle for consciousness?

In a physicalist evolutionary view, the brain is just an information processing device whose basic purpose is:

1) to integrate all the sensory information gathered by the eyes, ears, etc., into a coherent representation of the world; and
2) to move the body in the environment on the basis of that representation, in ways that are generally advantageous to the organism.

Without a brain, the eyes, ears, etc. wouldn't be of much use to organisms. They could still produce reflexive behaviors, but not flexible, intelligent ones. Thus the brain provides obvious evolutionary advantages, and its evolution was favored by natural selection.

I happen to believe that there is more to nature than the physicalist view indicates, but nonetheless it certainly assigns a coherent and useful purpose to the brain without having to invoke consciousness. To provide a successful critique of physicalism, you will have to pose a dilemma that physicalism cannot resolve satisfactorily. Unfortunately, the dilemma you pose in this thread is quite satisfactorily resolved by physicalism.
 
  • #5
Actually I have no problem with physicalism per se, I just see it as being incomplete, since it merely addresses the mechanistic side of things, and fails to take into account the experience of the (alleged) operator who is in control. Yet even in that sense it's hard to refute, since we're all bound to this temporal physical body while we're alive.
 
  • #6
consciousness must surely exist on a huge range of levels. where or why we as humans judge what it is to BE conscious is a very interesting question.

for example, an extremely well programmed computer, or a planet, or any complex system for that matter, seem to have some sort of 'overriding' properties that in one way or another suggest something similar to consciousness, or if not, a constructor or 'higher' knowledge if you will.

somebody please tell me how we as humans, with our collectively subjective view of the universe, could possibly 'know' for sure whether any other entity was 'conscious'. i imagine the only way to 'know' something is to 'be' it.

the physicalists have a point, but i suppose it is in the interest of those who know more about existence, to try and 'enlighten' such a limited perspective.



<-------i wiggle my finger and am affecting matter with my mind------->
 
  • #7
Iacchus32 said:
Actually I have no problem with physicalism per se, I just see it as being incomplete, since it merely addresses the mechanistic side of things, and fails to take into account the experience of the (alleged) operator who is in control.

I agree with you on this. However, the point is that physicalism already has a great theory of how the brain came to be, without invoking 'mind' (in the conscious sense) at all. There is no compelling reason to accept your mind-based argument of how the brain came to be, since physicalism describes the process more rigorously without having to mention mind at all (thus being more parsimonious as well as more rigorous).

In short, asking questions about the brain and how it came to be is not going to get you any traction on the problem of consciousness. We already have a theory of the brain (loosely speaking) that makes perfect sense, and the fact that it makes no mention of subjective, experiential consciousness is a good indicator that this is not a fruitful approach to attacking the problem.
 
  • #8
hypnagogue said:
I agree with you on this. However, the point is that physicalism already has a great theory of how the brain came to be, without invoking 'mind' (in the conscious sense) at all. There is no compelling reason to accept your mind-based argument of how the brain came to be, since physicalism describes the process more rigorously without having to mention mind at all (thus being more parsimonious as well as more rigorous).
And yet why wouldn't the brain evolve, if not to become the vessel of consciousness?


In short, asking questions about the brain and how it came to be is not going to get you any traction on the problem of consciousness. We already have a theory of the brain (loosely speaking) that makes perfect sense, and the fact that it makes no mention of subjective, experiential consciousness is a good indicator that this is not a fruitful approach to attacking the problem.
And of course without the medium, consciousness itself (the fact that we're alive and aware), we wouldn't have the means by which to examine any of this now, would we? In which case I think Science is putting the cart before the horse when it says subjective experience has little or no bearing on the matter.
 
  • #9
magus niche said:
consciousness must surely exist on a huge range of levels. where or why we as humans judge what it is to BE conscious is a very interesting question.

for example, an extremely well programmed computer, or a planet, or any complex system for that matter, seem to have some sort of 'overriding' properties that in one way or another suggest something similar to consciousness, or if not, a constructor or 'higher' knowledge if you will.

somebody please tell me how we as humans, with our collectively subjective view of the universe, could possibly 'know' for sure whether any other entity was 'conscious'. i imagine the only way to 'know' something is to 'be' it.

I agree. A given person can have no solid empirical grounds to declare the existence of consciousness in systems other than his or her own self (the old problem of other minds). Therefore, we must resort to using various tactics of logical induction and deduction to try to at least pose coherent and informed guesses as to what systems do and do not have consciousness. It is important that we try to place this process on sure logical footing, rather than simply assume the existence or non-existence of consciousness in various circumstances, as Iacchus did in his previous post. Assuming consciousness in fellow human beings is not much of a logical leap, and surely is the minimal assumption we must make in order to get anywhere, but extending the analysis to other systems becomes a dicier proposition the further we get from human-like systems.

the physicalists have a point, but i suppose it is in the interest of those who know more about existence, to try and 'enlighten' such a limited perspective.

I don't think this is a fair portrayal. Physicalists are humans, and so they are just as knowledgeable about the basic experiential aspect of consciousness-- in terms of what it is to be conscious, what it feels like to visually experience a sunset, etc.-- as any other person. They interpret the problem of consciousness differently, and some even try to write it off as a non-existent illusion, but even these people (I'm willing to bet) are just as conscious as you or I. They are almost definitely not zombies, all 'dark inside,' and thus they are just as knowledgeable and 'enlightened' in the important respects of the problem. The point of divergence is not the base facts, but what each party considers to be the logical consequences of those facts.


<-------i wiggle my finger and am affecting matter with my mind------->

Somewhat like Iacchus's effort, this is not the way to proceed. If you try to impress a physicalist with a statement like this, you will be summarily written off, due to the simple fact that the movement of your finger is entirely explicable in physicalist terms. The argument against the physicalist position must center on those aspects of reality that are not explicable by physicalism, and the very existence and nature of consciousness is plausibly one such aspect (whereas the evolution of brains and the movement of body parts are not).
 
  • #10
Iacchus32 said:
And yet why wouldn't the brain evolve, if not to become the vessel of consciousness?

I already explained this; see points 1) and 2) in my post above.

And of course without the medium, consciousness itself (the fact that we're alive and aware), we wouldn't have the means by which to examine any of this now, would we? In which case I think Science is putting the cart before the horse when it says subjective experience has little or no bearing on the matter.

It is logically coherent to imagine that we would have the means to examine the world without having subjective experience. Most people do not consider a desktop computer to be conscious, though functionally it does the same type of thing we do when we examine the world-- it performs logical operations on incoming information (inputs), and produces outputs.

The concepts of computation, examination, awareness, etc., do not require any mention of conscious experience in order to be coherent. We know that conscious experience exists, and that it is intimately related to what we consider to be our own awareness, examination, etc. of the world, but the fact that these notions are logically coherent on their own changes the way we must approach the problem.

The problem presents itself as one of how we can conceptually incorporate consciousness into an otherwise coherent and complete physical picture of the world, not one of how consciousness completes what used to be an incomplete or incoherent framework. To approach the problem in this latter way will not get one anywhere without a great deal of prior theoretical motivation, and vague intuitions will not do the job here. We must approach the problem with great respect for the integrity of the physicalist picture where it has already been well established.
 
  • #11
hypnagogue said:
The problem presents itself as one of how we can conceptually incorporate consciousness into an otherwise coherent and complete physical picture of the world, not one of how consciousness completes what used to be an incomplete or incoherent framework. To approach the problem in this latter way will not get one anywhere without a great deal of prior theoretical motivation, and vague intuitions will not do the job here. We must approach the problem with great respect for the integrity of the physicalist picture where it has already been well established.
And yet this coherent picture you're talking about here would never have come about unless there were conscious beings to establish it. The same goes for the computer. All these things we are discussing here are by-products of the fact that we're conscious. So I don't see how you could possibly discount one's subjective experience, because that's really all we've got.
 
  • #12
Iacchus32 said:
And yet this coherent picture you're talking about here would never have come about unless there were conscious beings to establish it. The same goes for the computer. All these things we are discussing here are by-products of the fact that we're conscious. So I don't see how you could possibly discount one's subjective experience, because that's really all we've got.

This is not a good argument unless you can establish that it is logically necessary for us to have subjective experience in order for the types of things we have done to have been done. The very existence of a coherent physicalist ontology that makes no mention of subjective experience undermines any attempt to establish that logical necessity.
 
  • #13
hypnagogue said:
Somewhat like Iacchus's effort, this is not the way to proceed. If you try to impress a physicalist with a statement like this . . . [<-------i wiggle my finger and am affecting matter with my mind------->] . . . you will be summarily written off, due to the simple fact that the movement of your finger is entirely explicable in physicalist terms. The argument against the physicalist position must center on those aspects of reality that are not explicable by physicalism, and the very existence and nature of consciousness is plausibly one such aspect (whereas the evolution of brains and the movement of body parts are not).

I don't think finger movement is "entirely explicable in physicalist terms." From brain to finger can be explained, but how the will of consciousness can trigger the brain is not understood. Because of this, I think Iacchus might be considered right to say "Everything which is alive has at least a rudimentary form of consciousness . . ." IF he hadn't added " . . . whether it has a brain or not" since life without some sort of nervous system seems to have no will. But even an amoeba, as primitive as its nervous system is, can will its body to move.
 
  • #14
hypnagogue said:
This is not a good argument unless you can establish that it is logically necessary for us to have subjective experience in order for the types of things we have done to have been done. The very existence of a coherent physicalist ontology that makes no mention of subjective experience undermines any attempt to establish that logical necessity.
What are you saying, that we're just to pretend like people don't exist? That doesn't sound very human to me? In fact it kind of takes the fun out of life. :confused:
 
  • #15
Les Sleeth said:
I don't think finger movement is "entirely explicable in physicalist terms." From brain to finger can be explained, but how the will of consciousness can trigger the brain is not understood.

The point is that physicalism can tell a complete, causally closed story about the finger movement without invoking consciousness. So trying to 'enlighten' a physicalist by saying "I move my finger with my mind" is somewhat like trying to impress a physicist by saying "God caused that ball to roll down the hill!" (I say somewhat, of course, because we actually do have very good reasons for believing in consciousness-- but that is beside the point in this specific instance.) Or, in a more metaphorical sense, it's like telling a community of well-off farmers, "Good news! I found a source of food!"

The way to undermine an ontology is not to propose solutions to problems that that ontology has already solved. The way to go is to pose a problem that that ontology cannot solve, even in principle.

As you and I both believe, the problem of consciousness is just such a problem that physicalism cannot solve, even in principle. To press the problem against physicalism, though, one must naturally focus on the aspects of the problem for which physicalism has no answer: why it is that anything like subjective experience should exist at all, why physical systems should have some important place in the picture, etc. When it comes to questions of causality, physicalism is already robust, and attempting to critique its causal picture of the world is not an effective way to go.

Because of this, I think Iacchus might be considered right to say "Everything which is alive has at least a rudimentary form of consciousness . . ." IF he hadn't added " . . . whether it has a brain or not" since life without some sort of nervous system seems to have no will. But even an amoeba, as primitive as its nervous system is, can will its body to move.

Actually, I consider myself a panexperientialist, meaning that I believe that something like consciousness (albeit in very primitive and alien forms) really does (loosely speaking) 'belong to' all physical phenomena. So I do not disagree with Iacchus on this basic point; I just don't want claims being thrown about without some degree of consideration.

I believe high-level cognitive phenomena such as 'will' or high-level physical phenomena such as nervous systems are not the kinds of things to look for when trying to uncover something that is proposed to be a fundamental aspect of nature, but that is a topic best left for another thread.
 
  • #16
Iacchus32 said:
What are you saying, that we're just to pretend like people don't exist? That doesn't sound very human to me? In fact it kind of takes the fun out of life. :confused:

I'm not denying that human beings are conscious. All I'm saying is that there is a logical possibility from physicalism that sophisticated cognitive systems could exist without being conscious. You said:

Iacchus32 said:
And yet this coherent picture you're talking about here would never have come about unless there were conscious beings to establish it. The same goes for the computer. All these things we are discussing here are by-products of the fact that we're conscious.

You seem to be trying to establish that consciousness is essential for things such as philosophies and computers to be created. I'm saying that it could be the case that these things could just as well have been produced by non-conscious agents.

The motivation for this claim is roughly as follows:
1) Physicalism gives us a complete and coherent causal story of nature.
2) Physicalism does not need to invoke consciousness in order to sketch a complete and coherent causal story of nature.
3) We can tell a complete causal story of all human history without mentioning consciousness (from 1 and 2). [Note that I say that we can tell a complete causal story, not a complete overall story. This distinction is important.]
4) Therefore, it is possible (perhaps in our world, or perhaps only in a world with suitably adjusted laws) that the causal history of humans (and thus all the physical accomplishments of humans) could be duplicated by beings without consciousness (from 3).

Even if we take humans to be an example of an instance where things have been created in virtue of consciousness, that does not imply that all such acts must be aided in some sense by consciousness.
 
  • #17
hypnagogue said:
Even if we take humans to be an example of an instance where things have been created in virtue of consciousness, that does not imply that all such acts must be aided in some sense by consciousness.
Except how do we know, when all we really know is that we're conscious? Do you really know anything outside of the fact that you're conscious? Can you think of anything outside of the fact that you're conscious? Try it some time ... and you'll soon realize that unless you're conscious, it's not possible.
 
  • #18
hypnagogue said:
The point is that physicalism can tell a complete, causally closed story about the finger movement without invoking consciousness. . . . The way to undermine an ontology is not to propose solutions to problems that that ontology has already solved.

I'd grant it might not be the best way to argue for a non-physicalistic theory of consciousness, but I also don't think there is a physicalist explanation that solves the problem of the interaction of consciousness with the brain. Physicalists may not admit their theory falls short, but I believe it does.

To solve the problem we'd have to observe a physical system which can move the finger, and we can observe that; and we'd have to observe the will that triggers the brain. As far as I know, will has never been observed except behaviorally (i.e., not in essence, as we can with the nervous system), yet it plays a key role in moving the body around.

Looked at another way, if will is absent from the body, as when someone is in a coma, the body moves only by biological activity, such as from autonomic influences. If we remove nervous system responsiveness, as when a neck injury paralyzes someone, the person's will can still be fully intact but just unable to physically demonstrate it below the neck. So the existence of one isn't dependent on the other. Therefore, by empiricism's own standards, there is evidence missing that's needed to claim willed movement is purely physicalistic, since we don't know if will is physical (because its constitution can't be observed), and we have no verifiable explanation for how will and the nervous system interact.
 
  • #19
Iacchus32 said:
Except how do we know, when all we really know is that we're conscious? Do you really know anything outside of the fact that you're conscious? Can you think of anything outside of the fact that you're conscious? Try it some time ... and you'll soon realize that unless you're conscious, it's not possible.

Again, physicalism gives a straightforward example of how it is logically coherent (and thus, how it might be possible under suitable circumstances). Physicalism, in principle, can tell us a complete and coherent causal story of how philosophies and computers are made, and importantly, this complete and coherent causal story does not invoke experiential consciousness.

The recipe is already there, given by physicalism; all it would take would be instantiation of the 'recipe' under suitable natural laws in order for there to exist intelligent agents with the same causal propensities as humans, but nonetheless without experiential consciousness. The only way around this claim is if we can show that the type of brain described by physicalism must necessarily give rise to experiential consciousness, under all existential circumstances (whether actually existent or only logically possible) which physicalism describes accurately.

Of course, such a tight equivalence between physical brain function and experiential consciousness quite plausibly has not been coherently established in any theory put forth, and there is further good reason to think it cannot be established even in principle. That is to say, there is good reason to believe that physicalism alone cannot tell the complete story about experiential consciousness. If this is true, then it admits the logical possibility of intelligent cognitive agents who can do everything a human can do, but nonetheless are not conscious in the experiential sense.
 
  • #20
Les Sleeth said:
I'd grant it might not be the best way to argue for a non-physicalistic theory of consciousness, but I also don't think there is a physicalist explanation that solves the problem of the interaction of consciousness with the brain. Physicalists may not admit their theory falls short, but I believe it does.

We should be very careful about how we phrase the problem. Although physicalists might deny it, I would argue that physicalism indeed does not solve the general problem of the relationship between consciousness and the brain.

But to phrase the problem specifically in terms of the interaction of consciousness with the brain is to already devote oneself to a rather strong theoretical position that may or may not be viable, at the expense of other possible theories.

For instance, consider epiphenomenalism (the doctrine that experiential consciousness is caused in some sense by the brain, but does not affect the brain's causal dynamics). Epiphenomenalism is rather ugly and counterintuitive as metaphysical pictures go, and I don't believe a word of it. But it does appear to be logically consistent, and has the added benefit of respecting the causal closure of the physical, and so cannot be written off out of hand-- it could, perhaps, be true. If epiphenomenalism is right, then there is no problem of interaction of consciousness with the brain that physicalism must consider, although the problem of how the brain causes such experience is still a very real and substantial one.

It is important to note here that there is at least one substantial theory of experiential consciousness (Gregg Rosenberg's Theory of Natural Individuals) that describes consciousness as causally relevant (i.e. not epiphenomenal) without assuming an interactionist ontology-- so it is not necessarily the case that one must choose between either epiphenomenalism or interactionist dualism.
 
  • #21
Physicalism, in principle, can tell us a complete and coherent causal story of how philosophies and computers are made, and importantly, this complete and coherent causal story does not invoke experiential consciousness.

so what is the role of the physicalist? to manifest a logical theory of nature in his/her eyes? or using his/her technology (which he/she constructed purposefully)?

so the physicalist would be constructing a philosophy to be evolutionarily fitter and to survive? more logical theories.

how can one deny all these subjective enterprises? i mean logic was invented by consciousness, no? for survival? more logical theories.

logic is artificial, so the very means by which western knowledge has been deduced is culturally dependent, culturally subjective.
 
  • #22
hypnagogue said:
Again, physicalism gives a straightforward example of how it is logically coherent (and thus, how it might be possible under suitable circumstances). Physicalism, in principle, can tell us a complete and coherent causal story of how philosophies and computers are made, and importantly, this complete and coherent causal story does not invoke experiential consciousness.
Except that physicalism does not tell us anything about the experience of consciousness, which is the very thing that tells us about physicalism. So doesn't this suggest to you that, for us human beings, the medium which first and foremost allows us to decipher the very nature of existence is consciousness? And doesn't that also suggest that the structure of reality is "bound" within consciousness, rather than saying that the structure we find in consciousness arises from the brain? There's a big difference there. Which, of course, if correct, would let us see how consciousness permeates everything, and then see how the mind/brain evolved expressly in order to obtain this.

The portal to understanding our Universe is consciousness.
 
  • #23
And another thing, if none of us were conscious, we wouldn't be holding this conversation, would we? Why? Because it's like the difference between night and day, as if somebody were switching the lights on and off. In which case consciousness is a lot like the light of the sun (as a medium), which illuminates everything and allows us to discern the truth of the matter.
 
  • #24
Iacchus32 said:
Except that physicalism does not tell us anything about the experience of consciousness, which is the very thing that tells us about physicalism.

I agree that physicalism tells us nothing about the experience of consciousness. I disagree that the experience of consciousness is necessary for an ontological theory such as physicalism to be created, applied, understood, and so on.

Think of what the creation, application, understanding, etc. of a theory mean. They all have experiential aspects (what it consciously feels like to participate in these processes), and they all have functional aspects (the causal mechanisms that are responsible for generating and describing them).

An analysis rooted purely in a physicalist ontology can tell us a complete story of the functional side of things. It can tell us how theories are created (a complex, dynamic relationship between the environment and the neurons in a person's brain). It can tell us whether or not a person understands a theory (using results from a written test, or an involved conversation, or studying one's life work in a field, or even examining the mental model created by the neural structure in a person's brain, and so on).

Since a physicalist analysis can tell us the complete causal story of how theories are created, refined, etc., and the complete conditions under which a person can be said to understand a theory, and so on, then the ontology of physicalism must be sufficient to produce these things. Since physicalism is not sufficient to account for experiential consciousness, it thus follows that an account of experiential consciousness is not necessary to provide an account of the construction, understanding, etc. of a theory.

Suppose there exists a system S that is functionally identical to a human body (including the brain of course), but that S does not have experiential consciousness. Because S is functionally identical to a human, physicalism can describe the causal dynamics of S just as well as it could that of a human. From the above considerations, it follows that a society of 'S people' in an environment like that of the Earth could, in principle, create a theory called 'physicalism' despite not having experiential consciousness.
 
  • #25
hypnagogue said:
I agree that physicalism tells us nothing about the experience of consciousness. I disagree that the experience of consciousness is necessary for an ontological theory such as physicalism to be created, applied, understood, and so on.
That would be like admitting that you don't have a mother and a father, or perhaps that they died in a plane crash when you were very young. No, you don't need to own up to the fact that they existed in order to acknowledge that you're here. However, that "is" how you got here, and there is something fundamental about having parents. There's no escaping that. So, is it okay to forsake your parents for the sake of what you deem to be progress? Indeed, we very well could have a problem here ...


Think of what the creation, application, understanding, etc. of a theory mean. They all have experiential aspects (what it consciously feels like to participate in these processes), and they all have functional aspects (the causal mechanisms that are responsible for generating and describing them).

An analysis rooted purely in a physicalist ontology can tell us a complete story of the functional side of things. It can tell us how theories are created (a complex, dynamic relationship between the environment and the neurons in a person's brain). It can tell us whether or not a person understands a theory (using results from a written test, or an involved conversation, or studying one's life work in a field, or even examining the mental model created by the neural structure in a person's brain, and so on).
Sure, you can understand all the fundamentals about what went into the design of the car, and yet have no idea about who the operator is, which is "me." The two are not the same. Also, if it wasn't for the fact that there was an operator (similar to the relationship between the mind and the brain), there would be no car to speak of. Why? Because the car was designed specifically to serve the operator.


Since a physicalist analysis can tell us the complete causal story of how theories are created, refined, etc., and the complete conditions under which a person can be said to understand a theory, and so on, then the ontology of physicalism must be sufficient to produce these things. Since physicalism is not sufficient to account for experiential consciousness, it thus follows that an account of experiential consciousness is not necessary to provide an account of the construction, understanding, etc. of a theory.
Yes, it's very easy to acknowledge that something has structure; all you need do is open your eyes and observe. And yet that tells us very little about how that structure got there.


Suppose there exists a system S that is functionally identical to a human body (including the brain of course), but that S does not have experiential consciousness. Because S is functionally identical to a human, physicalism can describe the causal dynamics of S just as well as it could that of a human. From the above considerations, it follows that a society of 'S people' in an environment like that of the Earth could, in principle, create a theory called 'physicalism' despite not having experiential consciousness.
What, are you referring to automatons here? I don't see how such a thing is possible, not yet anyway. Where is the experiential "essence" in these folks? And where is the experiential "essence" in the Science which is observing it? Scientists are human beings too, you know. These things don't just happen out of thin air, unless of course the structure was already bound within it. :wink:

Also, if something mimics the real thing, via your limited physical description of it, does that mean you have the real thing?
 
  • #26
Iacchus32 said:
That would be like admitting that you don't have a mother and a father, or perhaps that they died in a plane crash when you were very young. No, you don't need to own up to the fact that they existed in order to acknowledge that you're here. However, that "is" how you got here, and there is something fundamental about having parents. There's no escaping that. So, is it okay to forsake your parents for the sake of what you deem to be progress? Indeed, we very well could have a problem here ...

This analogy is a bit off the mark, so let me try to straighten it up. It is indeed a contingent fact that all humans have come to exist via the meeting of a father's sperm and a mother's egg. Is it necessarily the case? No, it's not necessarily the case. We have mapped out the human genome, and thus in principle it should be possible to create a human entirely 'from scratch,' without needing a combination of sperm and egg to create one. So even though it just so happens that most (all) of us have come about in this way, it is in fact not necessarily the case that this should be so. It is possible for a human to come into existence without the meeting of egg and sperm.

I am making a similar argument about consciousness. We are in fact conscious, true, but physicalism's ability to separate our causal propensities from our experiential consciousness implies that it is at least logically possible that some system have the causal propensities of a human without having experiential consciousness. It is logically possible precisely because the one does not seem to depend critically on the other in a theoretical, non-contingent sense.

Perhaps in a different universe, with different laws of nature, there could exist a system physically identical to a human yet without experiential consciousness. We cannot rule out this logical possibility unless we can show that the causal propensities of a human require or necessitate experiential consciousness somehow, and physicalism's ability to describe these causal propensities without mention of experiential consciousness suggests that such a necessary link does not exist. (If such a necessary link did exist, then physicalism should not be able to adequately describe all the causal mechanisms involving human behavior without mentioning experiential consciousness, but it appears as if it does just that.)

Sure, you can understand all the fundamentals about what went into the design of the car, and yet have no idea about who the operator is, which is "me." The two are not the same. Also, if it wasn't for the fact that there was an operator (similar to the relationship between the mind and the brain), there would be no car to speak of. Why? Because the car was designed specifically to serve the operator.

Physicalism can make the case that the brain is its own operator. That is, it is not at all mysterious how the brain does what it does according to physicalism, since in principle all the brain's actions can be described as the consequence of physical laws.

Thus your analogy here is off the mark. You are trying to make the case that the operation of the brain without consciousness is just as inscrutable as the operation of a car without a driver. But to make your analogy line up here, you would have to suppose that the operation of the car nonetheless falls completely in line with a natural, physical description based on physical laws (perhaps as in the case where a computer is 'driving' the car). And if the operation of the car is amenable to a straightforward physicalist description, then there is no paradox or failure of description, despite what appearances might seem to indicate.

Yes, it's very easy to acknowledge that something has structure; all you need do is open your eyes and observe. And yet that tells us very little about how that structure got there.

Again, the explanation of how the structure of the brain got there is subsumed under evolution and natural selection. We need not look for explanations outside of physicalism to coherently explain what needs to be explained here.

What, are you referring to automatons here? I don't see how such a thing is possible, not yet anyway.

It may or may not be possible, in our universe, to create an entity physically or functionally identical to a human that nonetheless does not have experiential consciousness. In fact, it's a good bet that such a feat is not possible in our universe. Supposing this feat is not possible in our universe, as a consequence of our universe's natural laws, we would say that it is not nomically possible.

However, that does not rule out metaphysical possibility. (If X is metaphysically possible, then X could be the case in a world whose natural laws are suitably adjusted.) It may be the case that, if the universe's natural laws were to be adjusted appropriately, we could have an entity physically identical to a human but without experiential consciousness. Such a metaphysical possibility could only obtain if we could show that, strictly speaking, experiential consciousness is not necessary in order to have a fully functioning causal system such as a human brain. (The supposition that all normal human brain activity in our universe is conscious does not imply that this is necessarily the case in all metaphysically possible universes; it could just be a contingent fact that arises due to natural laws particular to our universe, which could perhaps have been different.)

Again, the fact that physicalism appears to provide a complete causal description of the human brain without factoring in experiential consciousness suggests that such a necessary link between the two in fact does not obtain, which in turn implies that it is metaphysically possible to have a completely functional physical human brain without experiential consciousness.
 
  • #27
Hypnagogue

"... physicalism's ability to separate our causal propensities from our experiential consciousness implies that it is at least logically possible that some system have the causal propensities of a human without having experiential consciousness."

I'm not so sure about this. Wasn't it Chalmers who pointed out that a system without experiential consciousness is very unlikely to spend its time writing about consciousness, or undertaking research in order to explain it?
 
  • #28
There are all kinds of more or less persuasive arguments that Zombies (as these conscious but not experiential beings are called) could practically exist, particularly when, as in most discussions, they are expected to exactly simulate the behavior of experiential beings. Why would a zombie spend hours listening to late Beethoven quartets if he wasn't aware of enjoying them? Why fix himself a fancy meal or fuss about her clothes?
 
  • #29
selfAdjoint said:
Why would a zombie spend hours listening to late Beethoven quartets if he wasn't aware of enjoying them? Why fix himself a fancy meal or fuss about her clothes?
As a matter of fact, they don't. Just watch "Braindead" or "Dawn of the Dead".
 
  • #30
selfAdjoint said:
There are all kinds of more or less persuasive arguments that Zombies (as these conscious but not experiential beings are called) could practically exist, particularly when, as in most discussions, they are expected to exactly simulate the behavior of experiential beings. Why would a zombie spend hours listening to late Beethoven quartets if he wasn't aware of enjoying them? Why fix himself a fancy meal or fuss about her clothes?
Now why does this sound so much like Commander Data on Star Trek the Next Generation? :wink: Yeah, he would sit there and listen to the music, even twenty channels at once, but it was obvious he wasn't getting anything out of it.

Also, when speaking of how the brain evolved, we're speaking of something which occurred naturally and of its own accord, not of something which merely "mimics" consciousness by means of a robot which, for all intents and purposes, could never have been devised by nature.
 
  • #31
Iacchus32, the Zombies, or beings who are conscious but don't experience or "feel" it, are constructs used by philosophers to raise and discuss problems with theories of consciousness. The question I raised is a well known one tending to deny that they are valid constructs.

If we are going to stick with evolution, which I greatly favor, then all this and other philosophical baggage goes by the board.
 
  • #32
selfAdjoint said:
There are all kinds of more or less persuasive arguments that Zombies (as these conscious but not experiential beings are called) could practically exist, particularly when, as in most discussions, they are expected to exactly simulate the behavior of experiential beings.

To be as clear as possible, we should be mindful of the words we use. 'Consciousness' is really a mongrel concept, insofar as it picks out a relatively large range of phenomena. To call a zombie 'conscious but not experiential' obfuscates things a bit, as experience is one of the phenomena generally included in the lumped-together package of concepts we call 'consciousness.'

It would be clearer if we divided consciousness broadly into its functional aspects, which we can call 'awareness,' and its phenomenal/experiential aspects, which we can call 'subjective experience' or just 'experience.' So for clarity's sake, let's say a zombie is 'aware' in the sense that it can act rationally on the basis of sensory input and so on, but not 'experiential' in the sense that there is nothing it is like to be a zombie.

Why would a zombie spend hours listening to late Beethoven quartets if he wasn't aware of enjoying them? Why fix himself a fancy meal or fuss about her clothes?

The answers to these questions are quite simply the ones that any physicalist description will give us. For example, a zombie would spend hours listening to Beethoven because the kind of auditory inputs it so receives will tend to release neurotransmitters such as dopamine (or serotonin, or whatever) which tend to reinforce the given behavior, because of the function dopamine plays in the neural dynamics of the brain.

This explanation is sufficient, in principle, to account for the given behavior in question. So, the only way your question poses a dilemma for the viability of zombies is if we can establish that the kind of functional/physicalist explanation given above must automatically entail the kind of subjective experience that a person would normally attribute to be the cause of this behavior. If this could be shown to be the case, then it would indeed be impossible to have a creature that behaves like a human but does not experience like a human. There is really no known way to settle this issue empirically, but there are persuasive arguments that suggest that experience cannot be explained purely in terms of mechanics. One's stance on zombies will really come down to whether one accepts these arguments or not.
 
  • #33
selfAdjoint said:
Iacchus32, the Zombies, or beings who are conscious but don't experience or "feel" it, are constructs used by philosophers to raise and discuss problems with theories of consciousness. The question I raised is a well known one tending to deny that they are valid constructs.

If we are going to stick with evolution, which I greatly favor, then all this and other philosophical baggage goes by the board.
So, in what would be more in tune with Canute's reply, how did philosophy evolve then? Certainly it must be due in part to the fact that we have enlarged brains, correct?


Canute said:
I'm not so sure about this. Wasn't it Chalmers who pointed out that a system without experiential consciousness is very unlikely to spend its time writing about consciousness, or undertaking research in order to explain it?
 
  • #34
selfAdjoint said:
If we are going to stick with evolution, which I greatly favor, then all this and other philosophical baggage goes by the board.

Buried in Iacchus's original questions and conjectures is the pertinent issue of what role subjective experience has played in the evolution of the brain. I raised the point of the 'separability' of subjective experience and physical brain dynamics precisely to strengthen the purely physicalist, evolutionary answer to the question 'how did the brain come into existence?'

In a nutshell, if a physicalist account of brain dynamics is a complete account of brain dynamics, then we can put forth the standard evolutionary (and ultimately, entirely mechanistic) explanation rather straightforwardly without having to worry about how experience factors in.

If one accepts that subjective experience is a consequence of purely physical laws, then the above explanation is already sufficient to answer the question of the brain's existence, for surely experience is already buried somewhere in the physicalist evolutionary account.

If one believes that subjective experience is not a consequence of physical laws alone, as do Iacchus and I, one must motivate the further point that the physicalist account tells us everything there is to know about the brain's causal dynamics if one is to accept the physicalist evolutionary account as an entirely sufficient explanation. To motivate this further point, one must support the metaphysical possibility of zombies.
 
  • #35
Greetings, Canute. :smile: Good to hear from you again.

Canute said:
"... physicalism's ability to separate our causal propensities from our experiential consciousness implies that it is at least logically possible that some system have the causal propensities of a human without having experiential consciousness."

I'm not so sure about this. Wasn't it Chalmers who pointed out that a system without experiential consciousness is very unlikely to spend its time writing about consciousness, or undertaking research in order to explain it?

It wouldn't be unlikely if such a system mistakenly believed it had subjective experience. (Attributing belief to a zombie is largely unproblematic, since belief can plausibly be characterized purely in functional or dispositional terms. There is no 'hard problem' of belief.)

In fact, this is more or less precisely Dan Dennett's position. Dennett is a hard-line physicalist, but rather than denounce the metaphysical possibility of zombies as one might expect, he essentially holds that we are zombies who mistakenly believe that we have subjective experience.
 