Should PF Implement Formal Debates? Share Your Thoughts!

  • Thread starter Jameson
In summary: People in general are not intelligent enough to debate. Most debates take the sole form of "well here's what I think." One person presenting his point of view followed by the other person presenting his point of view does not constitute a debate. Tossing points back and forth productively requires powerfully accurate comprehension and reasoning skills, which people generally do not have. When this reasoning inadequacy becomes compounded with a psychological inability to admit error, debate is futile.
  • #36
P.S. Evo was right.
 
  • #37
By the way, I think this debate idea is great. I had no idea it was being considered before; you guys need to take a few lessons in how to advertise effectively.
 
  • #38
hypnagogue said:
If you take 'sentience' to mean 'phenomenal consciousness' or 'subjective experience' or 'qualia,' it really is not such a clear-cut issue, basically for the reasons BicycleTree listed in post #11. I would be shocked if a baseball had some sort of primitive but unitary subjective experience, but we can't rule it out definitively as you would like to.
To be blunt, it's arguments like this example that are the reason people have no respect for philosophy. C'mon, at a minimal level, you have to at least be alive and capable of some manner of sensing your environment to have any sort of "subjective experience." An inanimate object is not sentient. And if philosophers want to twist around the definition of sentience because they can't support their argument based on the more commonly used definition, then to what purpose? You're no longer even having the same discussion if one person is arguing for it based on one definition and another is arguing against it based on a different one. That would be like debating whether oranges are the color orange after establishing what the fruit "orange" is and what the color "orange" is, and then someone wanting to refute the claim by redefining the fruit category to include lemons and arguing that lemons aren't orange. A debate like that would be one big exercise in futility.
 
  • #39
Moonbear, that would be true if anyone were arguing that inanimate objects are conscious in the same way that humans are conscious, which obviously is not so. The analogy is more like: oranges are brightly colored, and perhaps lemons are also brightly colored. The objects are brightly colored in different ways. The fundamental type of experience of consciousness--the experience of the blueness of blue and the loudness of loud--could apply to anything, though the concepts involved would probably not be "blue" or "loud" but rather some inhumanly strange sensation that reflects the internal conditions of the object.

Zooby, baseballs do have input and they do perform computation. A baseball struck by a bat receives the force of the bat and the resistance of the air as input, and performs a vast amount of computation in the form of momentum change, internal vibrations, spin rate, and slight damage, computations involving all the atoms in the baseball and the physics that govern them. No man-made computer can fully simulate even an appreciable fraction of that computation (though rough and fairly accurate models can be made, the combined responses of the individual baseball atoms are beyond the computer's reach). So given that baseballs have input and perform computations on that input, they may be conscious in some way. It remains an open issue.
 
  • #40
BicycleTree said:
Zooby, baseballs do have input and they do perform computation. A baseball struck by a bat receives the force of the bat and the resistance of the air as input, and performs a vast amount of computation in the form of momentum change, internal vibrations, spin rate, and slight damage, computations involving all the atoms in the baseball and the physics that govern them. No man-made computer can fully simulate even an appreciable fraction of that computation (though rough and fairly accurate models can be made, the combined responses of the individual baseball atoms are beyond the computer's reach). So given that baseballs have input and perform computations on that input, they may be conscious in some way. It remains an open issue.
Uhm...NO.

Dictionary time.

From Merriam Webster online

Sentient

1 : responsive to or conscious of sense impressions
2 : AWARE (having or showing realization, perception, or knowledge)
3 : finely sensitive in perception or feeling
 
  • #41
Is someone here trying to assert that we humans have created artificial sentient life, à la Frankenstein, by assembling pieces of dead cows and cotton plants and liquefied dinosaurs into a sphere? This should be in one of the cutting-edge biology sections.
 
  • #42
Evo said:
Uhm...NO.

Dictionary time.

From Merriam Webster online

Sentient

1 : responsive to or conscious of sense impressions
2 : AWARE (having or showing realization, perception, or knowledge)
3 : finely sensitive in perception or feeling
From Dictionary.com (The American Heritage® Dictionary of the English Language, Fourth Edition)

1. Having sense perception; conscious: “The living knew themselves just sentient puppets on God's stage” (T.E. Lawrence).
2. Experiencing sensation or feeling.

Either of which is essentially my concept.

But whichever definition of sentience you use, I am talking about the type of consciousness that is about the blueness of blue or the redness of red. Your original statement about people arguing that baseballs are sentient was probably using the version of sentient that I mean (perceiving things like the blueness of blue or redness of red).

If you really only meant sentient as "thinking in the way that humans do," then no, baseballs are not sentient, but I don't think you could find anyone interested in philosophy who would argue that baseballs are sentient in that manner.
 
  • #43
Moonbear said:
To be blunt, it's arguments like this example that are the reason people have no respect for philosophy.

I can appreciate why one would respond in this way, but given a full survey of the philosophical issues surrounding subjective experience, it turns out to be more plausible than it might seem at first. The fundamental observation here is that we have very strict empirical limitations on observing subjective experience-- a given individual can only know his own internal experiences. Strictly speaking, I cannot even be sure that you subjectively experience, because I cannot empirically verify it; I cannot just 'jump' inside your head and feel what you feel. From where I stand, respect for empirical considerations seems like a worthy virtue to uphold.

Short of directly observing the existence of subjective experience in systems other than myself, I can start off with reasonable assumptions-- e.g. other cognitively normal humans subjectively experience in roughly the same way I do-- and from there, try to derive general principles that might govern the existence and nature of subjective experience. Of course, given the profound epistemic limitations with which we are faced, this will be an incredibly difficult undertaking. At what point can I be confident that the psychophysical laws I have derived really are accurate reflections of how subjective experience is governed in nature? Does an experiencing system need to have a brain? Does it need to have neurons? Does it need to be biological? Does it need to be a cognitive system? What exactly is a cognitive system anyway? In the end, what are the true necessary and sufficient conditions for the existence of some kind of subjective experience, no matter how minimal and alien it might appear from our perspective, if we could but 'see' it? What if these conditions are rather more basic and ubiquitous than we otherwise might have thought, and the specific, high-level, human cognitive context we find ourselves in is somehow biasing our first person evidence and intuitions about these matters? These are by no means trivial questions. To trivialize them, I suspect, is just to be the assenting victim of powerful but nonetheless questionable intuitions and assumptions.

C'mon, at a minimal level, you have to at least be alive and capable of some manner of sensing your environment to have any sort of "subjective experience."

What is it to be alive? Is life really a fundamentally delineated category of nature upon which basic psychophysical laws might rest? Is it not more of a fuzzy gradient with no clear universal criteria? If we could construct a close functional copy of a human brain using silicon chips, would this computer be alive? Would it subjectively experience?

What counts as sensing the environment? Is this kind of sensing a fundamental category of nature upon which basic psychophysical laws might rest? Doesn't any system 'sense' (feel the causal effects of) its environment in some way or another? What distinguishes the relevant kind of sensing in such a way to make it a plausible basis for subjective experience, as opposed to other, implausible kinds of 'sensings'?

Some inanimate object is not sentient.

It depends on what we mean by 'sentient.' This is a rather ambiguous word, as is the word 'conscious.' It is helpful here, I think, to employ Ned Block's distinction between access consciousness (a-consciousness) and phenomenal consciousness (p-consciousness). A-consciousness refers to that collection of elements in a cognitive system that are globally available for the purpose of guiding reasoning, action, and verbal report. It is basically an umbrella term for the functional aspects exhibited by a cognitively normal human being. Terms that are usually used synonymously include self-awareness, awareness, agency, and sentience.

P-consciousness refers not to the functional, but rather to the qualitative nature of consciousness: the way it feels to take a cold shower or see the redness of an apple or to feel sad, 'qualia' or 'raw feels,' in general the nature of "what it is like" to be a subject of experience (see Thomas Nagel's paper, "What Is It Like to Be a Bat?") Terms that are usually used synonymously include subjective experience, experience, phenomenal content, qualia, and raw feels.

Now that we've cleared things up a bit, my response to your statement above is: yes, it is undoubtedly the case that just any old inanimate object is not a-conscious. For a system to be a-conscious, it requires certain kinds of sophisticated physical structures in order to instantiate the relevant kinds of high-order cognitive/mental structures and functions definitive of a-consciousness. (I hold to this position very strongly, but some, e.g. Les Sleeth, do not.)

Now, is it the case that just any old inanimate object is not p-conscious? That is an infinitely more complex question. I would tend to doubt that just any system could be p-conscious as a system, in the same way that (some subset of) our brains seem to be p-conscious as a system, as a unified whole. Ultimately, it depends on the nature of the system in question, the nature of the kinds of psychophysical laws we can derive, and the extent to which those laws can really be trusted; but no matter what, this is a much more difficult and more substantial question than the question of a-consciousness. We can pretty much evaluate the presence of a-consciousness by means of third person investigation, but the epistemic problems surrounding p-consciousness -- the seemingly fundamental privacy of subjective experience -- prevent us from making such a facile third person judgment.

And if philosophers want to twist around the definition of sentience because they can't support their argument based on the more commonly used definition, then to what purpose?

You have it backwards. I was not trying to twist the definitions of sentience in my previous post, but rather was just trying to point out a possible point of confusion: zooby and BT seemed to be using different senses of the word. zooby seemed to be using 'sentience' synonymously with 'a-consciousness' whereas BT seemed to be using it synonymously with 'p-consciousness.' (Usually, from what I've seen, 'sentience' is used in the former way rather than the latter.)

For what it's worth, the distinction between a- and p-consciousness is a widely recognized one in consciousness studies. The point of creating two new terms is not to cause confusion or obfuscation, but rather to clarify things by pinpointing exactly what it is we're talking about. Often when one uses just the word 'conscious,' it is not clear if one means to refer to a-, or p-, or both at once. Of course, this can lead to substantial confusions, so there are obvious benefits in creating a more technical vocabulary.
 
  • #44
Myself, I generally use "sentience" to mean a-consciousness in your terminology, but I was making an exception here because of the way Evo used it. I don't think that a-consciousness is a philosophical topic so much as it is a philosophical misconception, worth mentioning only because it is often confused with p-consciousness.
 
  • #45
So guys, we have gotten a little sidetracked, but what's the verdict? Should we take a poll? PM the Admins? I want to take action. I'm more of a doer than a talker. I'll set up a poll.

Jameson
 
  • #46
Jameson said:
So guys, we have gotten a little sidetracked, but what's the verdict? Should we take a poll? PM the Admins? I want to take action. I'm more of a doer than a talker. I'll set up a poll.

Jameson
I'm interested to know what function the moderator performs.
 
  • #47
zoobyshoe said:
I can.

...

Okay, we're done. :rolleyes:
 
  • #48
Ivan Seeking said:
Okay, we're done. :rolleyes:
I'm sorry, but some things are just too assinine.
 
  • #49
zoobyshoe said:
I'm sorry, but some things are just too assinine.
:smile: :smile: You have to love misspelled insults of major issues in Western philosophy... it's just so great...
 