Fra
jal said: "5. In the beginning, it appeared that our degrees of freedom were limited to 2 and that we were organized so that we could only move from a cubic to a hex. pattern."
Roughly, the simplest way I can picture 2D "spacetime" emerging from evolving discrete complexions is this.
Consider an observer that has a finite information capacity (memory) and can distinguish only ONE boolean event. Consider a counter that simply encodes/stores the historical counts, indexed by 0 and 1.
At each instant all there is, is a counter state.
In the high-complexity limit, when the counter structure becomes sufficiently complex, the state space of the counter converges to fill [0,1]. So the state is almost a real number (but the further construction can only be understood if one acknowledges that the limit is never reached).
The state of this counter is constantly challenged by new events, and when the counter is saturated a decision problem appears: an existing count needs to be erased from memory in order to make room for fresh data. What is the optimal information update here? I conjecture that data is erased randomly!
(This means the erased data is randomly distributed with respect to the emitter, but not necessarily with respect to the receiver; compare this to black-body radiation and the information content of Hawking radiation.)
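As a purely illustrative toy of this saturated-counter picture (a minimal sketch under my own assumptions: a fixed `capacity`, a synthetic biased event stream, and random erasure over the stored records), one could write:

```python
import random

class SaturatingCounter:
    """Toy counter for a single boolean event with finite memory.

    Historical counts are stored indexed by 0 and 1. When the total
    stored count reaches `capacity`, one stored record is erased at
    random to make room for the fresh datum (the conjectured random
    erasure rule).
    """

    def __init__(self, capacity=50):
        self.capacity = capacity      # finite information capacity
        self.counts = {0: 0, 1: 0}    # historical counts indexed by 0 and 1

    def record(self, event):
        if sum(self.counts.values()) >= self.capacity:
            # Saturated: pick one stored record uniformly at random and erase it.
            erased = random.choices([0, 1], weights=[self.counts[0], self.counts[1]])[0]
            self.counts[erased] -= 1
        self.counts[event] += 1

    def state(self):
        """Relative frequency of '1': a point in [0,1], the counter state."""
        total = sum(self.counts.values())
        return self.counts[1] / total if total else 0.5

# Feed a biased boolean stream; the state hovers near the bias but never
# settles, because saturation keeps forcing erasures.
c = SaturatingCounter(capacity=50)
for _ in range(1000):
    c.record(1 if random.random() < 0.7 else 0)
print(c.state())
```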
As the complexity of the observer increases (getting close to the continuum), more possibilities for re-encoding the microstructure appear! For example, one can consider histories of counter states, effectively considering a history of real numbers. This is the first dimension.
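A sketch of this first re-encoding, continuing the toy above (the window length and the reuse of the `SaturatingCounter` from the previous sketch are illustrative choices only):

```python
import random

def run_history(stream, capacity=50, history_length=64):
    """Record a finite history of counter states: a 'string' of (almost-)real
    numbers, the first dimension of the construction."""
    c = SaturatingCounter(capacity=capacity)
    history = []
    for event in stream:
        c.record(event)
        history.append(c.state())
        # The history is itself a finite-memory record: old states drop out,
        # just as old counts drop out inside the counter.
        if len(history) > history_length:
            history.pop(0)
    return history

states = run_history(1 if random.random() < 0.7 else 0 for _ in range(1000))
```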
This can then be repeated. But clearly the stability of these higher-dimensional records depends on the complexity. At low complexity, the idea is that they are unlikely to appear, for statistical reasons. They are not forbidden at all; they just don't happen, since they are unstable.
But in parallel to this simple cobordism type of generation of dimensions, there are OTHER, maybe more interesting developments, such as more complex recodings... cobordism is extremely SIMPLE. A more complex thing is the formation of non-commutative structures, such as a Fourier-like transform of the first "string" of real numbers. This would encode the state of change, and thus increase the predictivity and stability of the entire measure complex.
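A purely illustrative stand-in for such a "Fourier-like" recoding (the choice of a plain discrete Fourier transform is my assumption; position-like and frequency-like descriptions are the familiar example of non-commuting encodings):

```python
import cmath

def fourier_recode(history):
    """Discrete Fourier transform of the history of counter states.

    The amplitudes at nonzero frequencies summarize how the state is
    changing: a parallel record of 'change' alongside the positional
    history itself.
    """
    n = len(history)
    return [sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                for t, x in enumerate(history))
            for k in range(n)]

modes = fourier_recode(states)
# Low-frequency modes ~ slow drift of the counter state;
# high-frequency modes ~ rapid fluctuation (the 'state of change').
```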
So dimensional creation and the creation of non-commutative structures are really both just different types of recoding of the data. The selection of WHICH of these recodings is most stable is the challenge.
IF you start from the low-complexity end, one can use combinatorics and look at things explicitly.
Also, the cobordism type of development (histories of states by recursion) and the development of parallel non-commutative structures are in equilibrium, since both processes are constrained by the same complexity. Inflating higher dimensions is extremely complexity-demanding, but even creating parallel non-commuting structures is... but at the same time this entire complex of structures is constantly challenged by its environment... and if you picture an idea where ALL these possibilities are randomly tried, what emerges in evolution is the optimally fit decomposition into external dimensionality and internal non-commuting structures. There is some equilibrium condition we seek here. This is how I see it.
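A toy version of this selection picture, continuing the sketches above, with everything beyond the original idea being my own assumptions (the fixed complexity budget, the two recodings on offer, and one-step prediction error as the stand-in for "fitness"):

```python
def reconstruct(kept_modes, n, t):
    """Inverse DFT at index t using only the kept low-frequency modes."""
    return sum(m * cmath.exp(2j * cmath.pi * k * t / n)
               for k, m in enumerate(kept_modes)).real / n

def prediction_error(history, depth, n_modes):
    """Average one-step prediction error when the budget buys `depth`
    remembered states and `n_modes` retained Fourier modes."""
    errors = []
    for t in range(depth, len(history)):
        window = history[t - depth:t]
        kept = fourier_recode(window)[:n_modes]
        # Crude forecast: the smoothed value at the end of the window.
        pred = reconstruct(kept, depth, depth - 1)
        errors.append(abs(history[t] - pred))
    return sum(errors) / len(errors)

# Try every split of a fixed complexity budget between the two recodings
# and keep the most predictive (most 'fit') decomposition.
budget = 32
best_split = min(((d, budget - d) for d in range(2, budget - 1)),
                 key=lambda dm: prediction_error(states, *dm))
print(best_split)
```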
I'm working on this, and all along the guiding principle is: no ad hoc actions; all actions are rational random actions. The point is that what is just entropic dissipation in a simple microstructure will generate highly nontrivial actions when you combine it with higher dimensions (i.e. more than one :) and non-commuting structures.
/Fredrik