Understanding Physics Through Randomness: A Curious Adult

  • Thread starter weisenhm
  • Tags
    Randomness
In summary, entropy is not about "disorder" but rather about the level of specificity in defining a physical system. It is a measure of our ignorance about the system, with higher entropy indicating a vaguer definition of the system. In statistical mechanics, we reduce that lack of specificity to a count of the "most specific distinct descriptions" of the system. The aggregate of constraint forces can be seen as the "thermodynamic force" driving the system toward certain states. In quantum mechanics, we count dimensions of the Hilbert space instead of units of area in phase space. Entropy can also be derived from a probability distribution, where we maximize entropy under certain constraints, such as a fixed expectation value of the energy.
  • #1
weisenhm
Disclaimer: I am a PhD-level neurobiologist and, like most of my peers, a crappy physicist. I went back to better complete my scientific understanding, and on my own I am giving physics a crack as a curious adult; in the process I have gotten hooked on the beauty of physics. I am working at the level of the first calculus-based course. I am especially blown away by the thermo and the quantum stuff. I feel like I can grasp each small piece of these topics, but I must confess that the deep understanding, or clarity, is something I am still working towards.

As an example, I can study the Carnot cycle and understand entropy as it is introduced in the thermodynamic sense. But it is not obvious to me how this relates to the statistical derivation, aside from seeing that they give the same result. The mathematical equivalence doesn't help me figure out why they are related.

My question deals with the statistical derivation and probabilities. In the maximization of microstates, the notion of probability once again comes into play as a driver of physical processes. Is this just because the most likely events are the ones that happen, and there is nothing mysterious about it? Otherwise, how does the universe "know" which state is the most likely? And is this probability related to the probabilities in quantum mechanics? I know these are really big, and perhaps silly, questions (see disclaimer), but I'd appreciate any feedback.
 
  • #2
I found it helpful when I realized that entropy isn't about "disorder," since "disorder" is a subjective judgment. It is rather about the (paradoxically less subjective) idea of our knowledge about the system.

The key is in what we mean by a given physical system, such as an amount of gas. For each constraint on a quantity in the system's specification there is a corresponding physical constraint. If we consider a specific volume of gas, we must have rigid walls holding the gas in. If we specify an amount of gas at a specific pressure, we must have movable walls designed to press on the gas with constant force per unit area. If we refer to an unspecified amount of gas with constant pressure and volume, we are allowing the number of gas particles to change, say via a pressure valve, to maintain that pressure within a volume of rigid walls.

If we're careful physicists, all the terms we use have some operational meaning or some specifically mathematical meaning and there's no subjectivity.

Now in statistical mechanics we consider how specific we can get in describing a system. Start with, say, a fixed volume of gas consisting of a fixed number of particles, which we have allowed to exchange energy randomly with a constant-temperature heat bath for long enough to be in approximate equilibrium as far as random heat exchange is concerned. In statistical mechanics we may ask the question "How much more specific could we be in defining/constraining this system?" We try to reduce the lack of specificity to a count of the "most specific distinct descriptions," where "distinct" means physically distinguishable (and so we ignore swapping of indistinguishable particles). Classically these numbers are uncountable, but we can define a phase-space volume unit (an action unit) and get a relative number.
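As a toy illustration of counting "most specific distinct descriptions," here is a sketch in Python using a hypothetical Einstein-solid model (energy comes in identical quanta shared among distinguishable oscillators); the model choice is just an assumption for illustration:

```python
from math import comb

def multiplicity(n_oscillators: int, n_quanta: int) -> int:
    """Count the distinct ways to distribute indistinguishable energy
    quanta among distinguishable oscillators: the number of "most
    specific distinct descriptions" consistent with the constraint
    that the total energy equals n_quanta units."""
    return comb(n_quanta + n_oscillators - 1, n_quanta)

# A tiny solid: 3 oscillators sharing 4 quanta of energy.
print(multiplicity(3, 4))  # 15 distinct microstates
```

With one oscillator there is only one way to hold any amount of energy, so `multiplicity(1, q)` is always 1: maximal specificity, zero ignorance.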

We see this number behaving multiplicatively when we combine two systems, so we look at its logarithm (which is then additive) and call it entropy. It is our measure of how non-specific we are being in constraining our system. The higher the entropy, the vaguer we are in restricting the range of states which count as examples of "the system". In a laboratory we would reject experiments where the system violated our constraints, e.g. where the wall failed to stop the gas from leaking out.
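The multiplicative counts and additive logarithms can be checked directly. A small sketch, again with an Einstein-solid count standing in for the multiplicity (a hypothetical model, used only for illustration):

```python
import math
from math import comb

def omega(n_osc: int, n_quanta: int) -> int:
    # Multiplicity (count of distinct descriptions) of an Einstein solid.
    return comb(n_quanta + n_osc - 1, n_quanta)

# Two independent systems considered jointly: the counts multiply...
w_a, w_b = omega(3, 4), omega(5, 2)
w_joint = w_a * w_b

# ...so the logarithm (entropy, in units of Boltzmann's constant) adds.
s_a, s_b = math.log(w_a), math.log(w_b)
assert math.isclose(math.log(w_joint), s_a + s_b)
print(s_a + s_b)
```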

This clarifies that entropy is not a function of the system's state but rather of the constraints we impose on the system in defining it. Indeed, "the system" can be a single particle, but a particle constrained to be, say, in a room of some volume. And even though classically that particle is always in exactly one state, our constraints allow that state to be one of many, so the entropy is likewise non-zero.

The system really isn't just the particle but the particle plus its environment. And so we can talk about the "thermodynamic force" driving the particle from a small room to a bigger room. That force is nothing more than the aggregate of constraint forces imposed by the walls, which, when added up over equally probable cases, has a net statistical component pointing toward the bigger room.
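That statistical push can be made concrete with a back-of-the-envelope simulation: if each of N particles independently lands anywhere in the combined volume, configurations with everything crammed back into the small room are exponentially rare. A rough sketch (all the numbers are made up for illustration):

```python
import random

# A small room of volume 1 joined to a bigger room of volume 3,
# so each particle independently lands anywhere in the combined volume 4.
V_SMALL, V_TOTAL, N = 1.0, 4.0, 20
p_small = V_SMALL / V_TOTAL

trials = 100_000
rng = random.Random(0)
all_in_small = sum(
    all(rng.random() < p_small for _ in range(N)) for _ in range(trials)
)

# The analytic probability (1/4)**20 is about 9e-13, so with 20 particles
# we essentially never see them all back in the small room; that lopsided
# count of equally probable cases is the statistical "force".
print(all_in_small, p_small ** N)
```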

Now abstract these "rooms" to any regions in the system's state space (phase space) and you have the kernel of classical statistical thermodynamics in a nutshell. Quantum mechanics adds some features in that it is easier to justify our counting methods due to quantization. We end up counting dimensions of the Hilbert space (since it projectively defines sharp modes as 1-dim subspaces which we identify with vectors which span them) instead of arbitrary units of area in phase space.
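As a minimal sketch of that counting, assuming we know nothing beyond the constraint that the state lives in a Hilbert space of a given dimension, the entropy is just the logarithm of that dimension:

```python
import math

def entropy_from_dim(hilbert_dim: int) -> float:
    """Entropy (in units of Boltzmann's constant) when our constraints
    allow any state in a Hilbert space of the given dimension: S = ln(dim)."""
    return math.log(hilbert_dim)

# N two-level systems (qubits) span a 2**N dimensional space, so the
# entropy of "N qubits, nothing else known" is N*ln(2): additive per qubit.
N = 10
s = entropy_from_dim(2 ** N)
assert math.isclose(s, N * math.log(2))
print(s)
```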

I have not mentioned probabilistic constraints where we impose "soft" conditions on the system so that we can define a probability distribution over the state space. Think of the above example as a "crisp" constraint where we've specified a uniform distribution over some subset ("room") of state space. And then generalize from there.

We can derive a formula for entropy in terms of the probability distribution. We can consider hypothetical distributions and then find the ones that maximize entropy (subject to some less direct constraints such as that the expectation value of the energy be some value). We then simply assume that we are just as ignorant of the system as we say we are (that the two entropy formulas must agree) and we can calculate probabilities from the principle of maximum entropy.
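As a rough numerical sketch of that last step (with hypothetical energy levels), one can solve for the maximum-entropy distribution subject to a fixed expected energy, which for this kind of constraint is the Boltzmann form p_i ∝ exp(−βE_i), and check that it has higher entropy than another distribution with the same mean energy:

```python
import math

def boltzmann(energies, beta):
    # Gibbs/Boltzmann weights p_i ∝ exp(-beta * E_i): the maximum-entropy
    # distribution subject to a fixed expectation value of the energy.
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights]

def entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0)

def mean(energies, p):
    return sum(e * q for e, q in zip(energies, p))

# Toy system (hypothetical energy levels) with a target mean energy.
energies = [0.0, 1.0, 2.0, 3.0]
target = 1.0

# Solve <E>(beta) = target by bisection; mean energy decreases with beta.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean(energies, boltzmann(energies, mid)) > target:
        lo = mid
    else:
        hi = mid
p_max = boltzmann(energies, (lo + hi) / 2)

# Any other distribution with the same mean energy has lower entropy,
# e.g. putting all the weight on the single level with E = target:
p_alt = [0.0, 1.0, 0.0, 0.0]
assert entropy(p_max) > entropy(p_alt)
print(entropy(p_max))
```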
 

