JamesN
Many people out there today seem to think that we'll soon have computers powerful enough to simulate the physical world well enough that we'll be able to upload ourselves and live in such a simulation. People really seem to think a Matrix situation is possible. Some, like Nick Bostrom, have argued that we might in fact already live in a simulation. I find this all very, very doubtful, for a number of reasons.
I am thinking of writing up a little piece that tries to convey a sense of just how much computational power it would take to simulate even a small amount of matter with a fine enough grain to make it experimentally indistinguishable from the real thing.
If our world is a simulation, how much computation is necessary to produce it?
It seems to me very unlikely that it would be possible to simulate the world only at a very coarse grain most of the time, giving just enough detail to fool people when they aren't looking too closely, and only to do the calculations for very fine details when people happen to be looking in microscopes, measuring very short intervals of time, and that sort of thing. I suspect that everything at the smallest scales and the shortest time intervals would have to be calculated all the time to give rise to all the behavior we see at larger scales.
It has to be bottom-up. If not, I think that any careful scientific investigation would reveal many inconsistencies. In order for every possible microphysical observation to be perfectly consistent with every other possible observation, microscopic or macroscopic, I tend to think that it would be necessary to simulate all of the subatomic particles all the time. Simulating only what we happen to be looking at wouldn't work. The story would fall apart.
Biological processes, to behave as they do, rely on a huge amount of stuff going on at very tiny scales. Our computers similarly rely on very small things. And of course, we can go to Mars, or anywhere else for that matter, examine things microscopically, and make observations consistent with a long history of microscopic processes. And it is all a very tight story. We'll never discover any inconsistencies.
If the story given to us were constructed from the top down, with a degree of simulation detail only fine enough to match our observational detail at any given time, then as we drilled down in our investigations from different angles, we would surely find that the story doesn't make sense, that there are inconsistencies of some kind. It would be hard to explain the success of science and the degree of consilience that we find if the world were not processed bottom-up. Everything at macroscopic scales seems to be fully accounted for by the microphysical details.
We can find a wealth of microscopic evidence for a long history of physical processes that preceded our existence. You can't just start the simulation with the first moment a human made an observation! To have a fully consistent history that would make sense to careful scientific investigators, you'd have to calculate the whole history.
And contrary to what some think, it wouldn't suffice to just simulate our brains at a coarse grain like a simple neural net! That's crazy! You have to also simulate the environment if you want to give us the sort of experiences that we have, with all the information available to us through any possible investigation of that environment. Basically, it would require a complete simulation of at least the environment available to us, including this planet's surface and possibly the rest of the solar system.
I think that if I can get a realistic idea of the number of calculations it would take to fully simulate something as small as a grain of salt, and show how large a computer it would take to do that, it would show all this simulated reality stuff to be the nonsense that it is.
Okay, so what would it take? Does the simulation need to go all the way down to the Planck length, with a time step of Planck time? Or can we just treat each subatomic particle as a simple entity with a handful of numbers associated with it? Or do you think it would be adequate to ignore particles like quarks and just simulate protons, neutrons, electrons, and so on?
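To get a feel for how much the choice of granularity matters, here is a rough particle count for a single grain of salt at each level. The grain mass (~0.3 mg) is my own assumption for a "typical" grain, and the sea quarks and gluons inside each nucleon have no fixed number, so only valence quarks are counted:

```python
# Back-of-envelope particle counts for a ~0.3 mg grain of salt (NaCl).
# The grain mass is an assumption; the rest is standard chemistry.

AVOGADRO = 6.022e23        # formula units per mole
MOLAR_MASS_NACL = 58.44    # g/mol
grain_mass_g = 3e-4        # ~0.3 mg

formula_units = grain_mass_g / MOLAR_MASS_NACL * AVOGADRO
atoms = 2 * formula_units                  # one Na and one Cl per unit

# Nucleons per formula unit: Na-23 plus Cl at average mass ~35.5
nucleons = formula_units * (23 + 35.5)
electrons = formula_units * (11 + 17)      # Na has 11 electrons, Cl has 17

# Ignoring sea quarks and gluons, each nucleon has 3 valence quarks
valence_quarks = 3 * nucleons

print(f"formula units:  {formula_units:.1e}")
print(f"atoms:          {atoms:.1e}")
print(f"nucleons:       {nucleons:.1e}")
print(f"electrons:      {electrons:.1e}")
print(f"valence quarks: {valence_quarks:.1e}")
```

So even the coarsest per-particle scheme (nucleons plus electrons, no quarks) already means tracking a few times 10^20 entities, and dropping to whole atoms only buys about two orders of magnitude.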
How many calculations per time step would be required for each particle? It seems to me that it wouldn't make sense to calculate the forces acting on each particle by adding up all the forces from every other particle. Rather, it might be more efficient to just keep track of the fields, calculating each particle's local effect on the fields and the effects of the fields on the particles. Does this sound right?
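That intuition matches how large molecular-dynamics and plasma codes actually work: direct pairwise summation costs O(N^2) per step, while depositing each particle onto a field grid and updating the grid (particle-in-cell style) costs O(N + cells). A toy comparison, with illustrative per-operation constants that are my own guesses:

```python
# Rough operation counts per time step: direct pairwise forces
# vs. a particle-in-cell style field update. The constants
# (ops per pair, per particle, per cell) are illustrative guesses.

def pairwise_ops(n, ops_per_pair=20):
    # every unordered pair contributes one force evaluation
    return n * (n - 1) // 2 * ops_per_pair

def field_ops(n, cells, ops_per_particle=50, ops_per_cell=30):
    # deposit charge onto the grid, update the field, interpolate back
    return n * ops_per_particle + cells * ops_per_cell

n = 10**6       # a toy system, vastly smaller than a grain of salt
cells = 10**6   # one grid cell per particle, a common rough choice

print(f"pairwise: {pairwise_ops(n):.1e} ops/step")
print(f"field:    {field_ops(n, cells):.1e} ops/step")
```

For a mere million particles the pairwise approach is already about five orders of magnitude more expensive, and the gap widens linearly with N, so yes: tracking fields locally is almost certainly how any such simulation would have to be organized.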
I was thinking that it might make sense to use the holographic principle and just keep track of each bit that is written on each Planck area of the surface of the sphere bounding the region being simulated, maybe with a time step of Planck time. This would simplify the estimation of required computing power greatly, but I fear it might be too fine-grained and might greatly exaggerate the computational requirements.
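For what it's worth, the holographic (Bekenstein-Hawking) bound of one bit per 4 ln 2 Planck areas gives a concrete ceiling. Taking a bounding sphere of radius ~0.3 mm around the grain (my own rough figure) and a Planck-time step:

```python
import math

# Holographic upper bound on bits for a sphere enclosing a grain of
# salt. The 0.3 mm radius is a rough assumption for the grain size.

PLANCK_LENGTH = 1.616e-35   # m
PLANCK_TIME = 5.391e-44     # s
r = 3e-4                    # m, bounding sphere radius

area = 4 * math.pi * r**2
# Bekenstein-Hawking bound: S_max = A / (4 l_p^2) nats
#                                 = A / (4 l_p^2 ln 2) bits
bits = area / (4 * PLANCK_LENGTH**2 * math.log(2))
steps_per_second = 1 / PLANCK_TIME

print(f"max bits on surface:     {bits:.1e}")
print(f"Planck steps per second: {steps_per_second:.1e}")
print(f"bit-updates per sim sec: {bits * steps_per_second:.1e}")
```

That comes out around 10^63 bits and over 10^106 bit-updates per simulated second, which supports the worry: the holographic bound is a maximum information capacity, not an estimate of what ordinary matter actually uses, so it likely overstates the requirement by many orders of magnitude.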
According to Wikipedia,
As of May 2010, the smallest time interval uncertainty in direct measurements is on the order of 12 attoseconds (1.2 × 10^−17 seconds), about 3.7 × 10^26 Planck times.
Is there any reason to think that we might still need a time interval shorter than this?
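If the time step only has to beat the finest direct time measurement rather than go all the way to the Planck time, the savings are enormous. A quick comparison, taking 12 attoseconds as the target resolution:

```python
# Time steps needed per second of simulated time at two resolutions:
# the finest directly measured interval (~12 attoseconds) vs. the
# Planck time.

PLANCK_TIME = 5.391e-44        # s
measured_resolution = 1.2e-17  # s, ~12 attoseconds

steps_measured = 1 / measured_resolution
steps_planck = 1 / PLANCK_TIME

print(f"steps/s at 12 as:     {steps_measured:.1e}")
print(f"steps/s at Planck t:  {steps_planck:.1e}")
print(f"savings factor:       {steps_planck / steps_measured:.1e}")
```

So a 12-attosecond step cuts the step count by roughly 26 orders of magnitude relative to Planck-time stepping, which is why the choice of time resolution dominates any estimate of this kind.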
I am not sure how to even begin thinking about how to simulate all the quantum mechanical behavior.
Any ideas would be appreciated.