Computationalism faces a number of apparent inconsistencies that have never been satisfactorily resolved. The purpose of this thread is to discuss one of them, to see whether there is a way out of the problem, and to gather perspectives on it. I’ll call that problem the “special signal problem”.
In a paper entitled “The Story of a Brain”,* Arnold Zuboff presents a thought experiment built around a brain in a vat. The brain is given all the same inputs and outputs it would have had in the person’s head; as in The Matrix, the brain sits in a jar while the apparatus supplies the same signals it would have received in a head. Given those signals, we can safely assume the brain undergoes the same changes of state it would otherwise go through, so the experiences are assumed to be identical as well. I’d recommend reading the story; it’s very entertaining.
The first twist in the story is to cut the brain in half. Zuboff then supposes that the neurons at the cut are given the same signals they would have received from the opposite half, except that the opposite half no longer produces them. Instead, the signals come from an “impulse cartridge”. What exactly that is doesn’t matter; all that matters is that it can simulate the connection at each severed synapse or other fractured surface. It provides a “signal” that allows each half of the brain to continue firing just as if the two halves were still connected as a single brain.
Zuboff then divides the brain into ever smaller sections, again using impulse cartridges to simulate the interactions at each break, until finally the brain is separated into individual neurons. We can then ask whether the brain still experiences everything as it did when it was a complete brain in a vat. The implied assumption is that the brain can no longer be consciously aware of anything, even though the individual neurons are still operating exactly as if they were together. Certainly computationalism would predict that the fully dissociated brain no longer experiences anything.
To explain why the brain might no longer be able to support phenomenal consciousness, a character in the story, Cassander, suggests four guiding principles, any one of which, when violated, might prevent phenomenal experience from occurring. They are:
- The condition of proximity: The cells are no longer close together as they were in the brain.
- The condition of no actual causal connection: Also called the problem of “counterfactual information”, the lack of a causal connection between the parts of the brain is the issue most philosophers focus on. That is, even though a duplicate signal is present, it is the actual signal that is needed to maintain phenomenal consciousness. This actual signal is the “special signal”.
- The condition of synchronization: The cells may no longer be in sync.
- The condition of topology: The cells may no longer be in the same basic spatial relationship, or pointing in the right direction.
This thread will concentrate on the counterfactual sensitivity concern, since that is the focus in the literature.
The problem Zuboff writes about has also been raised in different forms by others, including Maudlin, Putnam, and Bishop. However the problem is approached, defenders of computationalism focus on why the system in question must be able to support counterfactual information and thus be capable of performing other computations. That is, regardless of whether a given portion of a system is needed for the actual computation, if the system cannot support the right counterfactual information, it cannot instantiate a program and therefore cannot support phenomenal consciousness.
Note that in Zuboff’s story, only the subjective experience is compromised in moving from a causally connected brain to one that is NOT causally connected. In the causally disconnected brain, all objectively measurable phenomena still occur within every part of the brain exactly as they did in the connected one, since the signals are still provided. This seems to suggest that the duplicate signal provided by the impulse cartridge is not sufficient; only the original signal suffices to create the phenomenon of consciousness. The duplicate signal may have all the same properties, may be indistinguishable from the original, and may maintain all the same objectively measurable effects throughout the disconnected brain. But per computationalism, the duplicate signal is not sufficient to support consciousness. We need a special signal. We need the original one.
From this analysis, it is clear that any theory of mind must address how and why counterfactual alternatives are crucial to consciousness. We need to understand what is so special about that particular signal.
*The story is available on the web at http://themindi.blogspot.com/2007/02/147.html and appears in the book edited by Hofstadter and Dennett, “The Mind’s I: Fantasies and Reflections on Self and Soul”.