someoneis777
Obviously the topic isn't new...but...
What do you think? In my opinion, computers have the potential to be much smarter than us, and the reason this hasn't happened yet isn't a lack of hardware...but simply that perhaps we don't know yet how to program them. To quote Bill Gates: "I've never doubted the magic of software." I'm a firm believer that AI has mostly to do with how the program is written, not with the number of transistors - actually, the whole transistor argument seems kind of laughable to me, now that I've thought about it for...days and weeks. Hence, the real question in my opinion is not "Will they become that smart?" but "Is it wise to make them that smart?" What are the real implications - could computers destroy Earth once we make them so smart? I think the chance of a dramatic collision between humans and AI is relatively big. It's not the safest bet...but...we wouldn't be human if we didn't try.