moving finger
Tournesol said:
In the absence of an objective explanation there is no objective way of testing for consciousness. Of course there is still a subjective way; if you are conscious, the very fact that you are conscious tells you you are conscious. Hence Searle puts himself inside the room.

This subjective "test" that you suggest only allows the subject to determine whether "itself" is conscious. It says nothing about the consciousness of anything else.
Tournesol said:
That consciousness is part of understanding is established by the definitions of the words and the way language is used.

As I said, a circular argument.

"Consciousness is required for understanding" is a proposition (or statement), which can be either asserted or denied. Whether it is an analytic or a synthetic statement is open to question, and depends on which definition of understanding one accepts. According to MF's definition of understanding, the statement is clearly synthetic.

To simply assert that the statement "consciousness is required for understanding" is true, and then to use it as a premise in an argument which concludes "understanding is not possible without consciousness", results in a circular argument. A circular argument shows nothing except that "whatever we assert to be true is true". If the conclusion of an argument is already contained within its premises, then the argument is fallacious. You may not like the idea, but that is accepted logic.
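To make the circularity explicit in schematic form (writing $C$ for "consciousness is required for understanding"), the argument is simply $C \vdash C$. This is formally valid, since any statement entails itself, but it is uninformative: the conclusion was assumed at the outset, which is exactly why begging the question is classed as a fallacy. Contrast a non-circular argument $P_1, P_2 \vdash C$, where the premises can be established independently of the conclusion.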
The basic problem is that to engage in any rational debate about anything, we need a common language. We clearly do not have a common language, since "understanding" does not mean the same thing to you as it does to me.

An example: if the term "person" means "human being" to you, but to me "person" means "humanoid", then the statement "all persons are examples of the species Homo sapiens" would be an analytic statement to you, but NOT to me.

Until we can agree on the language we are using, we will continue to disagree over whether the statement "understanding requires consciousness" is analytic or not.
Tournesol said:
As I have stated several times, the intelligence of an artificial intelligence needs to be pinned to human intelligence (albeit not in a way that makes it trivially impossible) in order to make the claim of "artificiality" intelligible. Otherwise, the computer is just doing something -- something that might as well be called information-processing, or symbol manipulation.

Imho this is just what the human brain does: information-processing, or symbol manipulation.
Tournesol said:
No-one can doubt that computers can do those things, and Searle doesn't either. Detaching the intelligence of the CR from human intelligence does nothing to counteract the argument of the CR; in fact it is suicidal to the strong AI case.

Where has anyone suggested "detaching the intelligence of the CR from human intelligence"? (whatever this might mean)
Tournesol said:
But consciousness is a definitional quality of understanding, just as being unmarried is a definitional quality of being a bachelor.

Consciousness is not a definitional quality of understanding in my definition.
Tournesol said:
Is "bachelors are unmarried because bachelors are unmarried" viciously circular too? Or is it -- as every logician everywhere maintains -- a necessary, analytical truth?

The difference between an analytic statement and a synthetic one is that the former is true "by definition"; therefore to claim that something is an "analytical truth" is a non-sequitur. Analytic statements are essentially uninformative tautologies.
However, whether the statement "consciousness is necessary for understanding" is analytic or synthetic is open to debate. In my world (where I define understanding such that consciousness is not necessary for understanding), it is synthetic.

I guess that Tournesol would claim the statement "all unicorns eat meat" is synthetic, and not analytic? But if I now declare "I define a unicorn as a carnivorous animal", then (using your reasoning) I can claim the statement is now analytic, not synthetic. According to your reasoning, I can now argue "all unicorns eat meat because I define a unicorn as a carnivorous animal", and this is a sound argument? This is precisely what the argument "understanding requires consciousness because I define consciousness as necessary for understanding" boils down to.
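To put the unicorn example in schematic form (the predicate letters are my own shorthand, not anything Tournesol wrote): if "unicorn" is defined as a carnivorous horned horse, i.e. $U(x) \equiv H(x) \land \mathrm{Carn}(x)$, then $\forall x\,(U(x) \rightarrow \mathrm{Carn}(x))$ follows from the definition alone. "All unicorns eat meat" has become analytic, and in becoming analytic it has stopped saying anything about the world; the same trade is made when consciousness is built into the definition of understanding.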
Tournesol said:
If you understand something, you can report that you know it, explain how you know it, etc.

Not necessarily. The ability "to report" requires more than just "understanding Chinese".
Tournesol said:
That higher-level knowing-how-you-know is consciousness by definition.

I suppose this is your definition of consciousness? Is this an analytic statement again?
Tournesol said:
The question is whether syntax is sufficient for semantics.

I'm glad that you brought us back to the Searle CR argument again, because I see no evidence that the CR does not understand semantics.
Tournesol said:
If it is a necessary but insufficient criterion for consciousness, and the CR doesn't have it, the CR doesn't have consciousness.

Not very relevant, since I am not claiming the CR does have consciousness.
Tournesol said:
How else would you do it? Test for understanding without knowing what "understanding" means? Beg the question in the other direction by re-defining "understanding" to not require consciousness?

Are you suggesting the "correct" way to establish whether "understanding requires consciousness" is "by definition"?

The correct way to do it is NOT by definition at all. All that achieves is the equivalent of the ancient Greeks deciding how many teeth are in a horse's mouth "by debate" instead of by experiment.

In simple summary, here is the correct way:

1. Hypothesis: understanding requires consciousness.
2. Develop the hypothesis further: what predictions would this hypothesis make that could be tested experimentally?
3. Carry out experimental tests of the hypothesis (try to falsify it).
Tournesol said:
I am suggesting that no-one can write a definition that conveys the sensory, experiential quality.

"Experiential quality" is not "understanding". I do not need the "sensory experiential quality" of red to understand red, any more than I need the "sensory experiential quality" of x-rays to understand x-rays, or the "sensory experiential quality" of flying to understand aerodynamics.
Tournesol said:
You seem to be saying that non-experiential knowledge ("red light has a wavelength of 700nm") *is* understanding, and all there is to understanding, and experience is something extraneous that does not belong to understanding at all (in contradiction to the conclusion of "What Mary Knew").

The conclusion to "What Mary Knew" is disputed.
Tournesol said:
It is perfectly reasonable to suggest that anyone needs normal vision in order to fully understand colour terms in any language.

Perhaps reasonable to you, not to me.
moving finger said:
I disagree. I do not need to have the power of flight to understand aerodynamics.

Tournesol said:
To theoretically understand it.

Sense-experience is not understanding.
moving finger said:
Vision is simply an access to experiential information; a person who "sees red" does not necessarily understand anything about "red" apart from the experiential aspect (which imho is not "understanding").

Tournesol said:
How remarkably convenient. Tell me, is that true analytically, by definition, or is it an observed fact?

Yes, isn't it convenient? Just as convenient as concluding that "understanding requires consciousness because I define consciousness as necessary for understanding"? Shall we start debating again whether particular statements are analytic or synthetic?
Tournesol said:
It doesn't have any human-style senses at all. Like Wittgenstein's lion, but more so.

Information and knowledge are required for understanding, not senses.
Tournesol said:
they understand just what Mary doesn't: what it looks like.

"What it looks like" is sense-experience; it is not understanding.
Tournesol said:
However, I do not need to argue that non-experiential knowledge is not knowledge.

Why not? Is this perhaps yet another analytic statement?
Tournesol said:
However, if you can do both you clearly have more understanding than someone who can only do one or the other or neither.

It's not at all "clear" to me; or perhaps you also "define" understanding as "requiring sense-experience"? Analytic again?
Tournesol said:
(Would you want to fly in a plane piloted by someone who had never been in the air before?)

The question is irrelevant, because "ability to fly a plane" is not synonymous with "understanding flight".

Are you saying you only put your trust in the pilot because he "understands"? If the same plane is now put onto autopilot, would you suddenly want to bail out with a parachute because (in your definition) machines "do not possess understanding"?
Tournesol said:
They don't know what Mary doesn't know.

We are talking about "understanding", not simply an experiential quality.

What is it that you think Mary "understands" once she has "experienced seeing red" that she necessarily did NOT understand before she had "experienced seeing red"? (Remember: by definition Mary already "knows all there is to know about the colour red", and sense-experience is sense-experience; it is not understanding.)
Tournesol said:
I claim that experience is necessary for a *full* understanding of *sensory* language, and that an entity without sensory experience therefore lacks full semantics.

And I claim they are not. The senses are merely "possible conduits" of information.

Tournesol said:
If you are going to counter this claim as stated, you need to rise to the challenge and show how a *verbal* definition of "red" can convey the *experiential* meaning of "red".

There is no reason why all of the information required to "understand red", or to "understand a concept", cannot be encoded directly into the computer (or CR) as part of its initial program. In principle, no sense-receptors are needed at all. The computer or CR can be totally blind (i.e. have no sense receptors) but still incorporate all of the information needed in order to understand red, syntactically and semantically. This is the thesis of strong AI, which you seem to dispute.

My claim (and that of strong AI) is that it is simply information, and not necessarily direct access to information from sense-receptors, that is required for understanding. Senses in humans are a means of conveying information, but that is all they are. This need not be the case in all possible agents, and is not the case in the CR. If we could "program the human brain" with the same information then it would have the same understanding, in the absence of any sense-receptors.
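As a toy sketch only (my own illustration, not anything from Searle or from this thread): the strong-AI claim above amounts to saying that a program can carry all of its "knowledge of red" in its initial state, with no sensory channel anywhere. The KNOWLEDGE table and answer() function below are hypothetical, invented purely to make that picture concrete.

# A minimal sketch, assuming the strong-AI claim above: everything the
# agent uses about "red" is encoded in its initial program state.
# No sense-receptors are involved anywhere; the facts are illustrative.
KNOWLEDGE = {
    "red": {
        "is_a": "colour",
        "wavelength_nm": (620, 750),  # approximate visible range for red
        "typical_bearers": ["blood", "ripe tomatoes", "fire engines"],
    },
    "blue": {
        "is_a": "colour",
        "wavelength_nm": (450, 495),
        "typical_bearers": ["a clear sky", "deep water"],
    },
}

def answer(question: str) -> str:
    """Answer a small fixed set of question patterns from stored facts alone."""
    q = question.lower()
    for term, facts in KNOWLEDGE.items():
        if term in q:
            if "wavelength" in q:
                lo, hi = facts["wavelength_nm"]
                return f"{term} light spans roughly {lo}-{hi} nm"
            if "example" in q:
                return f"typical {term} things: " + ", ".join(facts["typical_bearers"])
    return "no stored knowledge matches"

print(answer("What wavelength is red?"))            # red light spans roughly 620-750 nm
print(answer("Give an example of something red."))  # typical red things: blood, ...

Whether retrieving such stored facts amounts to "understanding" is, of course, exactly what is in dispute here; the sketch only illustrates that the information itself can be resident in the program without any sense-receptors.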
with respect
MF