ricardo81
Disclaimer: I'm just an amateur enthusiast who is a programmer during the day
I suppose everyone who takes an interest in physics without the mathematical tools to further their enquiries ends up writing posts like mine. At the very least it's nice to see where the ideas sit amongst (or outside of) what is more mathematically sound, or simply to hear the wisdom of people who know better.
I like the idea that the universe is a computational machine, either conceptually or in actuality (I prefer the latter). I also like (or believe) that the universe is indeed deterministic, that there is a granularity to its fundamental units, and that its emergent property is information. After reading Seth Lloyd's "Programming the Universe", it seems he's happy with the conceptual angle (and wouldn't go as far as to say the universe is actually computing "something" beyond itself).
I would very much appreciate it if you could humour me on this and see whether there's any potential truth in, or refutation of, my statements.
- According to Einstein, gravity is not a force but a property of spacetime itself. Could it be that, in a computational sense, gravity acts as a barrier preventing computational complexity from exceeding a 'limit' within a region of spacetime? I.e. time runs relatively slower where there is more information, so the number of bit flips required within a given time frame never exceeds a limit enforced by the effect of gravity.
- I can get my head round wave-particle duality if the fundamental unit is information and matter is only emergent when it is interacting with other matter, i.e. matter/information is only observable when it needs to be. The probability wave only needs to collapse when the information is forced to reveal itself at a particular place and time. Could this preferred form perhaps be an 'insurance' that a given locality does not exceed a computational complexity limit either?
- Is it possible that matter/information falling into a black hole is equivalent to the universe writing data onto its spinning disk? I.e. every unique bit of information is written there linearly (and, I assume, eventually written back out, though perhaps encoded differently, or else the data is effectively wiped, which I understand is considered bad form if the process cannot be reversed).
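For what it's worth, the first and last bullets do have quantitative counterparts in the literature: the Margolus-Levitin theorem bounds the rate of elementary operations a system can perform by its energy (Lloyd uses it in his "ultimate laptop" paper), and the Bekenstein-Hawking entropy counts the bits a black hole's horizon can hold. A back-of-envelope sketch (my own illustration, not taken from Lloyd's book):

```python
import math

# Physical constants (SI units, CODATA values)
HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2

def max_ops_per_second(energy_joules: float) -> float:
    """Margolus-Levitin bound: maximum elementary operations per
    second for a system of given energy, 2E / (pi * hbar)."""
    return 2.0 * energy_joules / (math.pi * HBAR)

def black_hole_bits(mass_kg: float) -> float:
    """Bekenstein-Hawking entropy of a Schwarzschild black hole,
    in bits: horizon area / (4 * Planck area * ln 2)."""
    r_s = 2.0 * G * mass_kg / C**2   # Schwarzschild radius, m
    area = 4.0 * math.pi * r_s**2    # horizon area, m^2
    planck_area = G * HBAR / C**3    # Planck length squared, m^2
    return area / (4.0 * planck_area * math.log(2))

# Lloyd's "ultimate laptop": 1 kg of matter, all mass-energy used
print(f"1 kg 'computer': ~{max_ops_per_second(1.0 * C**2):.1e} ops/s")
# A solar-mass black hole stores on the order of 10^77 bits
print(f"Solar-mass black hole: ~{black_hole_bits(1.989e30):.1e} bits")
```

So a 1 kg "computer" tops out around 5 x 10^50 operations per second, and a solar-mass black hole's horizon holds roughly 10^77 bits, which is the sense in which it really is the densest possible "disk".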
My own cobbled-together notion is that the universe can be thought of as a series of computational 'cores' of a size within a few orders of magnitude of the Planck scale, though I'm slightly confused about Planck units. My understanding is that the Planck constant is the most fundamental "unit of action", and perhaps in the computational sense a unit of action could be considered at two different scales.
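On the Planck-scale confusion: the constant ħ itself carries the units of action, while the Planck length and Planck time are the scales you get by combining ħ with G and c. They follow directly from the constants:

```python
import math

# Physical constants (SI units, CODATA values)
HBAR = 1.054571817e-34  # reduced Planck constant (carries units of action), J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
C = 2.99792458e8        # speed of light, m/s

# Planck length: the scale where quantum gravity effects are expected
planck_length = math.sqrt(HBAR * G / C**3)  # ~1.6e-35 m
# Planck time: the time for light to cross one Planck length
planck_time = planck_length / C             # ~5.4e-44 s

print(f"Planck length: {planck_length:.3e} m")
print(f"Planck time:   {planck_time:.3e} s")
```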
Thanks for reading