- #316
loseyourname
Staff Emeritus
Gold Member
- 1,830
- 5
vanesch said: I understood that by reading the previous posts in this monster thread. I have no hard opinion on the issue, but would like to raise a point. If something is to have evolutionary value, is it not intelligence rather than consciousness? Which then links to another question: is consciousness necessary for intelligence? Personally, I don't think so, but I have no strong arguments. Intelligence is much more "measurable" than consciousness, and I think that in 20-30 years we will have very intelligent machines, intelligent in the sense that they can do lots of "smart things". But to me, consciousness is that "other" aspect of our being, namely, as I wrote earlier, our "awareness", our "feelings", the fact that "pain hurts". I know this is vague, but that's exactly the problem!
A couple of things about this. First, remember that a trait does not necessarily need to be of value to fit into an evolutionary framework. There are other mechanisms I've spoken of by which relatively superfluous traits may evolve. That said, I do think that consciousness is of value. If you go back to my example of a functionally conscious machine - that is, a machine that is capable of performing all of the functions that a conscious human can, but without having any subjective experience - you'll see a point I made about computing power. The ability to strategize holistically still seems to be largely lost on computers, as evidenced by the fact that we have yet to design a program that can defeat a human skilled at the game of Go. I'm sure there are other examples, but that is the best one I can think of. A program capable of doing this - that is, strategizing holistically about this very complex game at least as well as a human can - would require even more computing power than is currently available, yet the most powerful computers already have far more raw computing power than does the human brain. So I think it is clear that being conscious gives us the ability to economize - to perform these functions with minimal computing power. Without consciousness, I'd imagine our heads would be much larger and use up far more oxygen and ATP than they already do, which in and of itself is a disadvantage.
There is something ethical about consciousness, because "pain hurts". Once we know that certain machines are conscious, I think they should ethically have fundamental rights, such as the right not to be tortured.
On the other hand, I don't think you can do unethical things to your PC. But maybe one day I will stand trial because I made a big, conscious computer in the basement, which I then tortured for years because I'm a perverted lunatic :-)
You know, I hadn't even thought of that. There is a clear advantage in having the members of a species instilled with a certain sense of ethics, and that cannot be achieved unless individuals of that species are capable of "feeling" subjectively.