#1
modmans2ndcoming
Why is it that when we develop AI, we seem to think it is just not good enough?
There is a computer program that stores facts in a relational database and then reaches conclusions based on those facts.
In fact, in the beginning it was told the parameters for being human, and it was given the parameters for its own abilities, but it was not yet told that it was not human... it then posed the question "Am I human?"
That is pretty insightful for a machine...
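Roughly, a toy version of that kind of program could look something like this. The table layout, the specific facts, and the comparison rule here are just made up to show the idea, not how the real program worked:

```python
import sqlite3

# A sketch of the idea above: facts live in a relational database,
# and a simple rule draws a conclusion from them.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE facts (subject TEXT, predicate TEXT, value TEXT)")

# Parameters the program has been "told" about humans and about itself.
db.executemany(
    "INSERT INTO facts VALUES (?, ?, ?)",
    [
        ("human", "can", "store facts"),
        ("human", "can", "reason"),
        ("human", "can", "answer questions"),
        ("self", "can", "store facts"),
        ("self", "can", "reason"),
        ("self", "can", "answer questions"),
    ],
)

def abilities(subject):
    rows = db.execute(
        "SELECT value FROM facts WHERE subject = ? AND predicate = 'can'",
        (subject,),
    )
    return {r[0] for r in rows}

# It was given the parameters for being human, but never told it was not human,
# so a naive rule comparing ability sets leads it straight to the question.
if abilities("human") <= abilities("self"):
    print("Am I human?")
```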
Just because we know how it works does not mean it is not working like us (we don't even know how we work).
At birth, babies know zero information; as they grow they add facts to their databases, and they learn valid ways to combine that information to make accurate deductions.
Well, a computer does not know anything either, but then you start teaching it. Right now, we can program it to reason correctly, and even add a meta-reasoning node that tracks how well it deduces, then looks for similarities among the successful deductions and similarities among the failed ones. Given enough space and enough performance for searching its database, who is to say that a computer could not learn to read, write, and think, given the right set of facilities to make that possible?
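That meta-reasoning node could be sketched as a record of every deduction and whether it held up, with a crude score over the premises involved. The feature choice here (just the premise strings) is only an assumption to keep the sketch short:

```python
from collections import Counter

class MetaReasoner:
    """Record each deduction the system makes and whether it turned out
    to be correct, then look for premises shared by the successes
    versus the failures."""

    def __init__(self):
        self.successes = Counter()
        self.failures = Counter()

    def record(self, premises, correct):
        target = self.successes if correct else self.failures
        target.update(premises)

    def promising_premises(self, top=5):
        # Premises that show up in successful deductions more often than in
        # failed ones are candidates for the system to lean on next time.
        scores = {p: self.successes[p] - self.failures[p] for p in self.successes}
        return sorted(scores, key=scores.get, reverse=True)[:top]

meta = MetaReasoner()
meta.record(["birds fly", "tweety is a bird"], correct=True)
meta.record(["birds fly", "penguins are birds"], correct=False)
print(meta.promising_premises())
```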
Just as complex texts like the Bible and other publications have patterns that can be used to find words that make sense in the context of each other (the "Bible code"), perhaps consciousness is just a pattern that expresses itself once the complexity of aggregated information reaches a critical point and is then matched with a facility to analyse that information.
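The pattern part of that is not mystical, either: even a crude co-occurrence count over a pile of text starts to surface words that "make sense in context of each other." The window size and the whitespace tokenisation below are arbitrary choices made just for the sketch:

```python
from collections import Counter

def co_occurrences(text, window=5):
    """Count which words appear near each other within a sliding window,
    a crude version of finding patterns in aggregated text."""
    words = text.lower().split()
    pairs = Counter()
    for i in range(len(words)):
        for j in range(i + 1, min(i + window, len(words))):
            pairs[tuple(sorted((words[i], words[j])))] += 1
    return pairs

sample = "in the beginning the system stored facts and the system reasoned over facts"
for pair, count in co_occurrences(sample).most_common(3):
    print(pair, count)
```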