- #176
moving finger
- 1,689
- 1
Tisthammerw said:
If an entity has a state of being such that it includes the characteristics I described, the entity has consciousness (under my definition of the term).

I understand. Your definition of consciousness is thus "any agent which possesses all of the characteristics of sensation, perception, thought and awareness is by definition conscious", is that it?
We would then have to ask for the definitions of each of those characteristics – what exactly do we mean by "sensation, perception, thought, and awareness"? (And none of these words may be defined in terms of consciousness, otherwise we simply have a tautology.)
Tisthammerw said:
Understanding (as I have defined it) requires that the entity be aware of what the words mean (this would also imply a form of perception, thought etc.). This would imply the existence of consciousness (under my definition of the term).

If, as you say, perception, thought etc are implicitly included in "awareness", are you now suggesting that "awareness alone necessarily implies consciousness"? Should we revise the definition of consciousness above?
If you ask the CR (in Chinese) "are you aware of what these words mean?", its reply will depend on how it defines "awareness". If awareness is defined as "conscious awareness" then (if it is not conscious) it will necessarily reply "no". But defining awareness as "conscious awareness" makes the definition of "consciousness in terms of awareness" a tautology ("consciousness is characterised by a state of conscious awareness"), and therefore not very useful in terms of our epistemology.
All we achieve with a tautology is the following:

"If I define understanding as requiring conscious awareness then it follows that understanding requires consciousness."

This doesn't really tell us very much, does it?
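Schematically (my own notation, just to make the circularity explicit): write U for "the agent understands", G for "the agent grasps the meaning" and C for "the agent is consciously aware". Then the definition itself guarantees the conclusion:

$$\big(U \leftrightarrow (G \wedge C)\big) \rightarrow (U \rightarrow C)$$

This is a propositional tautology: once "understanding" is defined with conscious awareness as a conjunct, "understanding requires consciousness" follows from the definition alone and establishes nothing about any actual agent.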
The problem is that I do not agree that understanding requires conscious awareness. Thus we disagree at the level of your initial assumptions.
Tisthammerw said:
As I mentioned in post #171, I said, "But he doesn't know the meaning of any Chinese word! Are you saying he knows the meaning of the words without knowing the meaning of the words? That isn't logical." To which you replied, "Imho it is completely logical." And I thus said, "Imho you need to look up the law of noncontradiction" in post #169.

In which case I humbly apologise for this error on my part. My statement "Imho it is completely logical" was intended to respond to what I took to be the implication that my own argument was not logical. What I should have said is that I do not agree with your assumption "he knows the meaning of the words without knowing the meaning of the words", therefore (since the premise is disputed) your conclusion "that isn't logical" is invalid.
moving finger said:
With respect, you have not shown how you arrive at the conclusion "my pet cat possesses consciousness", you have merely stated it.

Tisthammerw said:
Not at all. My argument went as follows (some premises were implicit):
1. If my cat possesses key characteristic(s) of consciousness (e.g. perception) then my cat possesses consciousness (by definition).
2. My cat does possess those attribute(s).
3. Therefore my cat has consciousness.

Your argument is still based on an implicit assumption – step 2.
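In schematic form (my notation), with P for "my cat possesses the key characteristic(s)" and Q for "my cat possesses consciousness", the argument is plain modus ponens:

$$P \rightarrow Q, \quad P \;\; \therefore \;\; Q$$

The inference is valid, but the conclusion is only as secure as premise 2 – and premise 2 is precisely what has not been shown.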
If we assume your definition of consciousness is sufficient (I dispute that it is), then how do you “know” that your cat is aware?
Your earlier argument (as I have pointed out) already implies that "perception, thought etc" are subsumed into "awareness" – thus the acid test of consciousness (according to your own definition) should be the presence not of perception alone, but of awareness alone. Can you show that your cat is indeed "aware"? (You would need to define "aware" first.)
Tisthammerw said:
So where does this alleged understanding take place if not in Searle's brain? His arm? His stomach? What?

It could be, but then it's not my thought experiment. If someone tells me he has internalised the rulebook, it is surely not up to me to guess where this internalised rulebook sits, is it?

Tisthammerw said:
In the physical plane, it would be the brain, would it not?
Tisthammerw said:
The part that has internalized the rulebook is his conscious self.

I disagree. His conscious self may have "participated in the process of internalisation", but once internalised, the internalised version of the rulebook exists within Searle but not as a part of his consciousness. Consciousness is not a fixed or a physical object, it cannot "contain" anything in permanent terms, much less a rulebook or the contents of a rulebook. Consciousness is a dynamic and ever-changing process, and as such it may gain access to information contained in physical objects (such as a rulebook, or in memories, or in sense perception), but it does not contain any such objects, and it does not contain any permanent information.
Tisthammerw said:
Perhaps we are confusing each other's terms. When I say he consciously internalized the rulebook, I mean that he has consciously memorized the rulebook, consciously knows all the rules, and consciously applies those rules to the input he receives. What do you mean by it?

His consciousness "participated in" the physical process of internalisation of the rulebook, but the rulebook does not sit "in his consciousness". Consciousness is a dynamic and ephemeral process, it is not something that can "internalise something within itself". What happens if we knock Searle unconscious, is the rulebook destroyed? No, it continues to exist. When Searle regains consciousness, he can once again access the rulebook, not because his consciousness recreates it from nothing but because the rulebook now physically exists within his entity (but not in his consciousness).
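A loose software analogy may help (my own illustration, not part of the original argument): the internalised rulebook is like data in persistent storage, while consciousness is like a transient process that can read that data; stopping the process does not destroy the data.

```python
# Illustrative sketch only: stored rules vs. a transient process that reads them.

rulebook = {"ni hao": "hello"}  # persistent data, analogous to the memorised rules

def conscious_process(query: str) -> str:
    """A transient process that gains access to the stored rulebook."""
    return rulebook.get(query, "no rule found")

print(conscious_process("ni hao"))  # -> hello
# "Knock the process out" (simply stop calling it): the rulebook dict still exists.
# When the process runs again it accesses the same rulebook; nothing is recreated
# from nothing - the rules persist outside the process that consults them.
print(conscious_process("ni hao"))  # -> hello
```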
moving finger said:
What part of "grasp the meaning of a word in Chinese" (i.e. an understanding of Chinese, by your own definition) would necessarily mean that an agent could respond to a question in English?

Tisthammerw said:
Because understanding Chinese words (as I have defined it) means he is aware of what the Chinese words mean, and thus (since he knows and understands English) he can tell me in English if he understands Chinese.

We have the same problem. By "aware" you implicitly mean "consciously aware". If you define "awareness" as "conscious awareness" then I dispute that an agent needs to be consciously aware in order to have understanding. The internalised rulebook does NOT understand English (it is another part of Searle which "understands English"). Asking the internalised rulebook a question in English would be a test only of whether it understands English, not a test of whether it understands per se.
Tisthammerw said:
There is no part of Searle--stomach, arm, liver, or whatever--that is aware of what the Chinese words mean.

I think we keep covering the same ground. The basic problem (correct me if I am wrong) is that you define understanding as requiring conscious awareness. I dispute that. Most of our disagreement stems from that.
The whole point is that I disagree with the basic premise that “consciousness is a necessary pre-requisite of understanding”.
Tisthammerw said:
Then do you also disagree with the belief that all bachelors are unmarried? Remember what I said before about tautologies...

Are you asking whether I agree with the definitions of your terms here, or with your logic, or with your conclusion?
If we agree on the definitions of our terms, then by following the same logic it is a foregone conclusion that we will agree on the conclusion. The problem is that in the case of understanding and awareness we do not agree on the definitions of terms.
Tisthammerw said:
the Chinese room (and its variants) strongly support my claim that programmed computers (under the model we're familiar with; i.e. using a complex set of instructions acting on input to produce "valid" output)--even when they pass the Turing test--cannot literally understand (using my definition of the term); i.e. computers cannot perceive the meaning of words, nor can computers be aware of what words mean. Do we agree on this?

Do we agree on what exactly?
I agree with your logic, but I disagree with your definition of the term understanding (which you define as requiring conscious awareness, rather than showing that it requires conscious awareness), therefore I disagree with your conclusion.
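For what it is worth, the model described in your quote ("a complex set of instructions acting on input to produce 'valid' output") can be sketched in a few lines (my own toy example, with invented phrases and rules): the program emits well-formed replies while storing nothing that corresponds to the meaning of either input or output.

```python
# A toy "rulebook": purely syntactic symbol manipulation, as in the Chinese Room.
# The mapping below is invented for illustration; the point is only that the
# program pairs input symbols with output symbols without representing meaning.

RULEBOOK = {
    "Ni hao ma?": "Wo hen hao.",        # "How are you?" -> "I am fine."
    "Ni chi le ma?": "Chi le, xiexie.", # "Have you eaten?" -> "Yes, thanks."
}

def chinese_room(symbols: str) -> str:
    """Look up the input symbols and emit the prescribed output symbols."""
    return RULEBOOK.get(symbols, "Qing zai shuo yi bian.")  # "Please say it again."

print(chinese_room("Ni hao ma?"))  # a "valid" reply, produced without understanding
```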
With respect
MF