Universal information: How much is enough?

  • #1
stuartmacg
A thought (it is probably as old as the hills, but it is new to me):

Physics advances by finding rules that our observations obey. A good theory reduces the information needed to describe what has been observed.
With each advance (or most of them?) we find that the universe actually contains, or can be defined by, less information than we had previously and implicitly assumed. When this stops being true, science is complete.

If we take it as a sort of meta-law that the universe contains much less information than we think, then we could have expected that a quantized physics would be required, in order to limit that information ...

It amused me anyway!
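The "a good theory compresses observations" idea above can be made concrete with a toy sketch (mine, not the poster's): data generated by a simple rule compresses far better than patternless data, so possessing the rule is, in effect, possessing a much shorter description of the observations.

```python
import zlib
import random

random.seed(0)

# A "law-governed" dataset: 10,000 values following a simple rule
structured = bytes((i * 7) % 256 for i in range(10_000))

# A dataset with no underlying rule: 10,000 random bytes
chaotic = bytes(random.randrange(256) for _ in range(10_000))

# A general-purpose compressor stands in for "finding the rule"
ratio_structured = len(zlib.compress(structured)) / len(structured)
ratio_chaotic = len(zlib.compress(chaotic)) / len(chaotic)

print(f"rule-governed data compresses to {ratio_structured:.1%} of original size")
print(f"patternless data compresses to {ratio_chaotic:.1%} of original size")
```

The rule-governed bytes shrink to a tiny fraction of their raw size, while the random bytes barely compress at all: the fewer independent surprises a dataset contains, the less information is needed to pin it down.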
  • #2
Given the huge dataset of our observations (much of which is redundant), what is the minimum set of equations needed to describe that dataset?
How do we know it is the minimum?
Why do we need to keep making observations?
 
  • #3
From a commercial point of view, the reasons for making high-energy particle observations seem to be shrinking :-) - but the counterargument is basically "they thought it was all finished in 1900, and look what happened".

Humans - the information animals - have dominated the world by finding and exploiting good models (understanding) of what is going on, so we are not going to stop trying any time soon.

It seems to be in our DNA.
 

Related to Universal information: How much is enough?

1. How can we measure the amount of universal information?

We can gauge the amount of information in data by looking at its complexity and organization. This can be done through mathematical measures such as Shannon entropy, which quantifies the average information content per symbol, or Kolmogorov complexity, which measures the length of the shortest description that reproduces the data.
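As a minimal illustration of the Shannon entropy just mentioned (a sketch; the function name `shannon_entropy` is my own, not a standard library call):

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Average bits of information per symbol in `data`."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A constant string carries no surprise; more symbol variety means
# more bits are needed per symbol to specify the sequence.
for s in ["aaaaaaaa", "abababab", "abcdefgh"]:
    print(s, shannon_entropy(s))
```

Two equally likely symbols give exactly 1 bit per symbol, and eight equally likely symbols give 3 bits, since 2^3 = 8.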

2. Is there a limit to how much universal information exists?

It is not clear if there is a limit to the amount of universal information that exists. As our understanding of the universe and technology advances, we may discover new sources of information that were previously unknown.

3. Why is universal information important?

Universal information is important because it helps us understand the world around us and make informed decisions. It can also aid in solving complex problems and advancing scientific research.

4. How does universal information relate to the concept of entropy?

Universal information and entropy are closely related concepts. Entropy is a measure of disorder or randomness in a system, and in Shannon's sense it quantifies how much information is needed, on average, to specify the system's state. The more random a system is, the more information its full description requires; a highly ordered system, by contrast, can be described compactly by a simple rule.

5. Can we ever have enough universal information?

It is unlikely that we will ever have "enough" universal information, as there is always more to learn and discover. The pursuit of knowledge is a never-ending journey that drives scientific progress and innovation.
