Let me write a brief introduction to my question, just to clarify what I am not asking here, since it has already been discussed elsewhere.
[1] intro
It is well known how the classical world of our experience can emerge from quantum mechanics through decoherence, but decoherence doesn't seem to me to be the only approach to emergence.
Another well-known example is how we recover classical physics by taking the limit of Planck's constant to zero, and similarly with relativity and other deformations (where the classical limit involves small velocities, small masses, etc.).
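For concreteness, one standard way to phrase these two limits (the $\hbar \to 0$ limit of the commutator and the non-relativistic limit of the Lorentz factor) is:

$$\lim_{\hbar \to 0} \frac{1}{i\hbar}\,[\hat A, \hat B] = \{A, B\}_{\mathrm{PB}}, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \xrightarrow{\; v/c \to 0 \;} 1 .$$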
[2] typical renormalization approach and its limits
Unlike in systems biology, physics is dominated by effective models and theories.
This is in large part due to the great success of continuum limit arguments and RG procedures.
The problem is that, while the renormalization group operates on field theories with scale invariance or conformal symmetry, biochemical networks (and much of what constitutes the classical world) are particularly challenging because of their inhomogeneity and lack of symmetries.
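To make explicit what such an RG procedure looks like in the simplest possible setting, here is a minimal sketch of the textbook real-space decimation for the 1D Ising chain in zero field, where the recursion $K' = \tfrac{1}{2}\ln\cosh(2K)$ drives the coupling to the trivial fixed point. Note that the step relies entirely on the homogeneity of the lattice, exactly the kind of symmetry a biochemical network lacks.

```python
import numpy as np

# Minimal sketch, assuming the textbook 1D Ising model in zero field:
# decimating every other spin maps the dimensionless coupling K = J/(k_B T)
# to K' = (1/2) * ln(cosh(2K)).  Iterating the map shows the RG flow toward
# the trivial high-temperature fixed point K = 0.
def decimate(K):
    return 0.5 * np.log(np.cosh(2.0 * K))

K = 1.5  # arbitrary starting coupling, for illustration only
for step in range(8):
    print(f"step {step}: K = {K:.6f}")
    K = decimate(K)
```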
[3] my question
Machine-learning methods that search for real-world patterns are an increasingly common feature of current research. In fields like ecology, systems biology, and macroeconomics, grossly simplified models capture important features of the behavior of incredibly complex interacting systems.
Instead of carefully and independently measuring each parameter, we can constrain the model parameters with system-level measurements similar to the kinds of measurements we wish to predict. Somewhat counterintuitively, from the perspective of sloppy models, estimating precise parameter values is of little use, while useful predictions of interest can be made without precisely knowing any single parameter.
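As a toy illustration of this "sloppiness" (in the spirit of Sethna and collaborators, with made-up rates and observation times), one can fit a sum of decaying exponentials and look at the eigenvalues of the Fisher information matrix: a few stiff directions dominate, while most parameter combinations are barely constrained.

```python
import numpy as np

# Toy sloppy model: y(t) = sum_i exp(-k_i t).  Rates and observation times
# below are hypothetical, chosen only to illustrate the eigenvalue spread.
rates = np.array([1.0, 1.3, 1.7, 2.2, 2.9, 3.7])
times = np.linspace(0.1, 5.0, 50)

# Sensitivities with respect to log(k_i), so parameters are dimensionless.
J = np.array([[-k * t * np.exp(-k * t) for k in rates] for t in times])

# Eigenvalues of the (Gauss-Newton) Fisher information matrix J^T J.
eigvals = np.linalg.eigvalsh(J.T @ J)
print(eigvals / eigvals.max())
# The normalized eigenvalues typically span many orders of magnitude:
# predictions are controlled by a few "stiff" parameter combinations,
# while individual parameters remain poorly determined.
```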
Therefore my question is: how can one exclude that the observed, effective laws (of chemistry, biology, etc.) are, at least partially, independent of the established, conventional physics models (e.g. the Standard Model and general relativity)?