Problem with randomness and uncertainty

In summary, the book suggests that there is a fundamental randomness in nature which reflects the uncertainty of quantum mechanics. This randomness, it argues, leaves room for free will in nature.
  • #36
Frank makes a still stronger, though more technical, case in this latest paper.
http://arxiv.org/PS_cache/arxiv/pdf/1010/1010.2267v1.pdf

In the first paper of his that I referenced, he makes the point that it does not matter whether the small-scale processes, or degrees of freedom, are random or determined. Viewed from a sufficient distance (i.e., stably constrained within some specified global context), it all looks the same statistically. What varies is the nature of the global constraints (the information preserved at that level).
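A small numeric sketch of this point (my own illustration, not from Frank's paper; the map, seed, and block size are arbitrary choices): feed deterministic chaos and a genuine random number generator through the same coarse-graining, and the aggregates become statistically hard to tell apart.

```python
import numpy as np

rng = np.random.default_rng(0)
n_blocks, block = 2000, 100

# Deterministic micro-dynamics: the chaotic logistic map x -> 4x(1-x)
x = 0.123
det = np.empty(n_blocks * block)
for i in range(det.size):
    x = 4.0 * x * (1.0 - x)
    det[i] = x

# Genuinely random micro-dynamics with the same long-run mean (0.5)
ran = rng.random(n_blocks * block)

# "Viewed from a distance": sum each block of 100 micro-steps
det_sums = det.reshape(n_blocks, block).sum(axis=1)
ran_sums = ran.reshape(n_blocks, block).sum(axis=1)

def skew(a):
    """Sample skewness; near zero for Gaussian-looking data."""
    a = (a - a.mean()) / a.std()
    return (a ** 3).mean()

# Both aggregates center on the same macroscopic mean...
print(round(det_sums.mean()), round(ran_sums.mean()))  # 50 50
# ...and both are near-symmetric, Gaussian-looking distributions
print(skew(det_sums), skew(ran_sums))
```

Whether the micro-level was a deterministic iteration or a random draw is invisible at the block-sum level; only the global constraint (the shared long-run mean) shows through.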

Most will find this scale-indifference to the fine details counter-intuitive. But it does show that the context is critical to what is observed. Like beauty, probability is in the eye of the beholder!

In this new paper, Frank says this more clearly. The measurement process imposes the constraints. All that needs to exist at the local level are symmetries or degrees of freedom.

So local scale = an ensemble of symmetries - which the imposed measurement process then breaks in some direction.

And generally there are two kinds of symmetry breaking: one that describes a static situation (one with extra constraints, like a boxed collection of particles, giving a Gaussian), and one that describes a dynamic or freely expanding situation (like an open box that lets the particles fly away, giving a power law).
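The open, power-law case can be made concrete with a small numeric sketch (my illustration; the exponent alpha = 2.5 and the sample size are arbitrary choices). In logarithmic coordinates a power law is just an exponential distribution, which is why the freely expanding case is tied to a geometric mean, i.e., a fixed mean of ln(x), rather than an arithmetic one:

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 2.5        # power-law exponent (arbitrary choice for the demo)
n = 200_000

# Inverse-transform sampling of a unit-scale Pareto: X >= 1, pdf ~ x^-(alpha+1)
u = rng.random(n)
x = (1.0 - u) ** (-1.0 / alpha)

# In log coordinates the power law becomes an exponential distribution:
# ln X ~ Exp(alpha), so the "geometric mean" constraint E[ln X] = 1/alpha
# plays the role that a fixed arithmetic mean plays in the closed case.
logx = np.log(x)
print(logx.mean())  # ≈ 1/alpha = 0.4
print(logx.var())   # ≈ 1/alpha**2 = 0.16
```

So the same maximum entropy machinery covers both cases; only the constraint (arithmetic-mean scale versus log scale) differs.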

So...

Our unified explanation of the different commonly observed distributions in terms of measurement points the way to a deeper understanding of the relations between pattern and process. We develop the role of measurement through maximum entropy expressions for probability distributions.

We first note that all probability distributions can be expressed by maximization of entropy subject to constraint. Maximization of entropy is equivalent to minimizing total information while retaining all the particular information known to constrain underlying pattern [7–9]. To obtain a probability distribution of a given form, one simply chooses the informational constraints such that maximization of entropy yields the desired distribution.

(addendum): To me, this is making probability theory less philosophical and more physical!

The dichotomy of chance and necessity (automaton and ananke) goes back, of course, to Ancient Greece. And we know how vague and qualitative that division is.

But moving the whole understanding instead to one of measuring observers, physical scale, and symmetry breaking puts it squarely on modern physical foundations.

In another 30 years, it might even finally kill off the endless PF debates about the existence of free will.
 
  • #37
ZapperZ said:
I'm using words to argue a scientific concept? I thought you were the one making a word play on this, by your own admission? How am I not using a scientific method? The words that I used have clear, unambiguous definitions with an underlying mathematical description (i.e. "canonical ensemble"). Can you say the same thing about the words you were using?

Again, you simply say things without any justification. I could easily say that you're making things up and leave it at that.

Zz.


Using technobabble is not the same thing as using a scientific method. I'll leave it to Apeiron and Pythagorean to try to get that point across, since they are much more familiar with such things.
 
  • #38
apeiron said:
I'm using well-defined philosophical terminology (strange that on a philosophy sub forum).

And if you read my post with any care, you will see that the essential point both wuliheron and I have been making is that randomness, like singularities, does not come naked in practice. It always has a specified context, a set of boundary constraints, that determines its probabilities.

What is the boundary of "context" for a function such as F=1/z?

As is quite clear, I was talking about all statistical modelling. Again, read the paper.

A simple example is an ideal gas. The box holding the particles provides the boundary constraints that reflect the particles back into the system and produce a definite single-scale (Gaussian) macrostate.

But lift the lid on the box, remove that constraint, and the distribution of the particles becomes a power law, characterized by a geometric mean.

A simple example of the way constraints specify the randomness observed.

Ah, but this is where you make your mistake. When you TALK about "free gas" or "ideal gas", you ARE already specifying a well-defined situation. If I say that I have an ideal gas occupying an infinite volume, there's nothing here that says that I have a specific boundary condition. In fact, in many cases in solid state physics, the boundary is taken to be at infinity and one still gets an accurate description of the system. In an ideal gas, the ONLY criterion for the boundary is that it plays NO role in changing the statistical distribution of the ideal gas (i.e. it makes elastic collisions with the particles, etc...). It is why the volume of the "container" doesn't come into play. It is the unit volume of the gas that has any meaning here.

But I don't know why we are arguing this. How is this related to the point that I was countering in the beginning? I still haven't been shown ONE physical example where "... On the one hand the statements insist something is random, while on the other they insist it is predictably random..." doesn't actually occur, or is even contradictory. I've shown examples where this IS the case. Even the ideal gas example supports my argument! So what gives?

Zz.
 
  • #39
ZapperZ said:
If I say that I have an ideal gas gas occupying an infinite volume, there's nothing here that says that I have a specific boundary condition.

Eh? What do you call the standard temperature and pressure? What do you call the maximum entropy condition?

ZapperZ said:
But I don't know why we are arguing this. How is this related to the point that I was countering in the beginning? I still haven't been shown ONE physical example where "... On the one hand the statements insist something is random, while on the other they insist it is predictably random..."

Is that what was being said? Or was it being said that randomness always in practice has a specified context?

Right from the start, people were responding to the OP statement - "Probabilities in quantum theories reflect a fundamental randomness in nature". And as the OP protested, something seems contradictory because QM randomness is exactly predictable. We were explaining that randomness is never naked; it is always specified by its boundary constraints.

For some reason you decided to misunderstand the discussion.
 
