Entropic force: I don't get it

  • Thread starter: Coin
  • Tags: Force
In summary, the conversation discusses the discomfort of someone trying to understand the concept of entropic gravity without a formal physics background. They mention Verlinde's paper on the topic and express concern about the lack of references and detail in his explanation of entropic force. They also question the legitimacy of discussing entropic force in the absence of other forces and how it can perform work on an external system. The conversation also touches on the need for a new formulation of information theory that is independent of external references and the importance of understanding the polymer example in Verlinde's paper.
  • #1
Coin
Speaking as someone who tries to follow arxiv-y physics without having a formal physics background, I get really uncomfortable whenever I see an argument presented in a way that I don't feel like I can stop and look up and verify every single step in the argument. I know I don't know enough about physics to spot the holes in an argument myself, so I want to be able to make sure that I'm accepting an argument because it's well-founded and not because it "sounds right".

And when I look at this "entropic gravity" stuff that's being discussed here and elsewhere a lot this last month, in this sense I get really, really uncomfortable. A lot of the discussions, in particular Verlinde's original paper, seem to take big leaps between extremely general ideas and I get lost very quickly. So I'm trying to go back and read Verlinde's original paper and make sure I understand the basic idea being discussed here from the beginning before I try to make sense of the other discussions. I'm getting stuck on like page three.

The big thing I'm having a problem with is at the very beginning, where the first thing Verlinde does after the introduction (section 2) is try to explain what an entropic force is. And the interesting thing, if you'll indulge me ranting for a moment, is that he somehow manages to get through his whole two-page explanation with no references. He does have one footnote giving a source for a couple of the equations; the source is two people who, I assume, provided the information in conversation. This seems really fundamental and critical to get right before accepting anything that follows: if I'm going to be convinced that space has a kind of structure that can produce an entropic force, or that it makes sense to model gravity as an entropic force, then I want to be able to understand in great detail what an entropic force is, how it works, and where I can read up on it to be sure Verlinde is using it in a way consistent with how other physicists use it.

So here is what I would like to ask:

Is the entropic force a well-recognized enough concept that someone with more of a physics background would be comfortable seeing introduced without much detail, as happens in Verlinde's section 2?

Where should someone interested in learning about entropic forces rigorously go? (Google turns up only some sketches of the concept, like on wikipedia, even less detailed than Verlinde's.)

Also, is it actually legitimate to talk of the "entropic force" in the absence of any other force (as would seem to be necessary to treat the entropic force as a fundamental thing that other forces emerge from)? Like, Verlinde gives the example of an uncoiling polymer, and seems to act like you can just treat the polymer as a complete mathematical abstraction, like a series of points joined by links, with the points moving in brownian motion. Can you really, just like that, separate the behavior of the polymer from, say, the chemical forces that hold it together, or its interactions with the medium it is suspended in?

And if the "entropic force" is really just an abstract tendency for points to cluster in certain locations once they start moving in brownian motion, as Verlinde seems to describe it, then how can it perform work on an external system?

Thanks.
 
  • #2
Coin said:
Speaking as someone who tries to follow arxiv-y physics without having a formal physics background, I get really uncomfortable whenever I see an argument
...
Also, is it actually legitimate to talk of the "entropic force" in the absence of any other force
...

I'm no professional, but I have a physics education and an interest in information-theoretic angles; in particular, a reconstruction of intrinsic information theory from first principles is one of my personal toy projects. I also find Verlinde's paper unsatisfactory, although I don't know if we have the same objections. It's not that I don't know what an entropic force is; it's rather that the "classical" understanding of an entropic force is way too simplistic and realist-minded to make sense as a new framework for physics. Entropic forces work fine in the usual domains, but here he aims to probe a new level of understanding.

But I like to read this paper as a sketch that must be improved and deepened to qualify as the deeper insight we seek.

The foundations of these entropy methods (the degrees of freedom, microstructures and priors) are what is unacceptable as fundamental, realist starting points. Current models amount to an extrinsic information theory. What I think we lack is a context where these things evolve, so that we describe information from the inside.

We would need a formulation of information theory that is independent of external references. That may seem impossible, but the natural escape is an evolving model: the evolution is forced upon it by the incompleteness that comes from the lack of external references.

/Fredrik
 
  • #3
Coin said:
... Like, Verlinde gives the example of an uncoiling polymer, and seems to act like you can just treat the polymer as a complete mathematical abstraction, like a series of points joined by links, with the points moving in brownian motion. Can you really, just like that, separate the behavior of the polymer from, say, the chemical forces that hold it together, or its interactions with the medium it is suspended in?
...

It's worth understanding the polymer example primarily for its own sake. Coin, in your whole post opening the thread, you didn't mention temperature.

You talk as if Verlinde's discussion of the polymer example ignores interactions with the medium. Those interactions with the heat bath are the key to understanding.
Take a kinked-up chain: as you stretch it out, its nodes will strike units of the medium and deliver heat energy to the medium.

Then when you release the straightened chain, units of the medium will deliver random kicks to the nodes of the chain and cause it to kink up in one of hundreds of different possible ways it could kink and fold and contract, so now the medium is delivering heat energy to the chain.
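If it helps to see the combinatorics in numbers: the standard toy model behind this is the freely jointed chain, where the entropy at extension x is just S(x) = k_B ln Ω(x), with Ω(x) the number of chain configurations that reach that extension, and the restoring pull is F = T dS/dx. Here is a minimal sketch in Python; the chain length and link size are made-up illustrative values, not anything taken from Verlinde's paper:

Code:
# Minimal sketch: entropic (restoring) force of a 1D freely jointed chain.
# N links of length b; n_plus links point right, n_minus = N - n_plus point left.
# Omega = C(N, n_plus) microstates share the extension x = (2*n_plus - N)*b.
# S = kB ln Omega, and the entropic force is F = T * dS/dx (it pulls back toward x = 0).

from math import comb, log

kB = 1.380649e-23   # J/K
T  = 300.0          # K, temperature of the surrounding heat bath
N  = 1000           # number of links (illustrative)
b  = 1e-9           # link length in metres (illustrative)

def entropy(n_plus):
    """Boltzmann entropy of all configurations with n_plus right-pointing links."""
    return kB * log(comb(N, n_plus))

def entropic_force(x):
    """Finite-difference estimate of F = T dS/dx at extension x."""
    n_plus = round((x / b + N) / 2)
    dS = entropy(n_plus + 1) - entropy(n_plus - 1)   # central difference
    dx = 4 * b                                       # n_plus +/- 1 spans a 4b change in x
    return T * dS / dx

for frac in (0.05, 0.10, 0.20):
    x = frac * N * b
    print(f"x = {frac:.0%} of full length: F ≈ {entropic_force(x):+.2e} N "
          f"(Gaussian-chain estimate {-kB*T*x/(N*b*b):+.2e} N)")

The force comes out proportional to temperature, which is the point: switch off the heat bath (T = 0) and the "force" is gone.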

Coin, did you read Verlinde's blog posts about this? They explain and clarify a lot, especially as involves this example.

Don't misunderstand what I'm saying. The polymer example doesn't justify the application to gravity. It should be understood for its own sake.

I think entropic force is a legitimate concept in the context of organic chemistry; indeed I gather that it is encountered there, or at least that some organic chemists would understand the idea. Verlinde blogged about a relevant biochem lecture he attended.

In order to justify the concept in application to gravity one should, I think, identify the degrees of freedom that experience heat, and have temperature.
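For reference, the schematic bookkeeping in Verlinde's paper (I'm paraphrasing from memory here, so check section 3 of the paper for the exact statements) is that once you grant those degrees of freedom a temperature, Newton's second law drops out:

\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x ,
\qquad
k_B T = \frac{1}{2\pi}\frac{\hbar a}{c}\ \ \text{(Unruh)} ,
\qquad
F\,\Delta x = T\,\Delta S \;\Rightarrow\; F = m a .

Adding the holographic counting N = A c^3 / (G \hbar) and equipartition E = \tfrac{1}{2} N k_B T with E = M c^2 then turns the same bookkeeping into F = G M m / R^2. None of that tells you what the degrees of freedom actually are, which is exactly the gap.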

From my perspective this identification of the DoF is work in progress and interesting to watch. As I see it, the paper which makes the most headway in that direction is the one by Kowalski-Glikman. I started a thread about it.
https://www.physicsforums.com/showthread.php?t=377015
 
  • #4
marcus said:
In order to justify the concept in application to gravity one should, I think, identify the degrees of freedom that experience heat, and have temperature.

I certainly agree with this, but I would also like to answer this:

Is the abstraction "degrees of freedom" ultimately to be seen as a realist structure, or could it be that sometimes two observers simply cannot agree about the "ultimate degrees of freedom", and if so, what does this mean in terms of their interactions?

Without asking this latter question, I think we are basically just doing old style realist statistics, which I think won't do the job. It's probably possible to make progress, but is it really satisfactory?

Also, if we think of the deepest distinguishable degrees of freedom in a realist sense, is that because it's likely to be right, or because it's hard to see any other reasonable way to do the modelling unless there are fundamental degrees of freedom?

/Fredrik
 
  • #5
One issue that one can understand even classically is the way entropy is often introduced.

If one considers the Boltzmann definition, then apart from Boltzmann's constant, entropy is more or less a pure combinatorial, dimensionless quantity: simply a measure of the probability of a distribution, given a prior.
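To be concrete about what I mean by "dimensionless": in the Boltzmann form

S = k_B \ln W , \qquad W = \frac{N!}{\prod_i n_i!} ,

W is a pure count of the complexions compatible with the occupation numbers n_i, so S/k_B is dimensionless combinatorics; the only place a physical scale enters is through k_B itself.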

But Boltzmann's constant represents a factor of specific complexity capacity (~heat capacity) per degree of freedom (dimension). It is just introduced as a constant, experimentally determined to match the micro with the macro. But if one seeks a reconstruction from combinatorial principles (which I consider part of the foundations of statistics, more fundamental than continuum probability), then there should be a relation between this constant and the specific complexity per dimensionless complexion (which can perhaps be thought of as the confidence in the complexions), which at the low-complexity limit should be the minimal value 1. So one would expect dimensionality to be emergent too, with the unit measures of the dimensions having some kind of relation that follows from this construction.

Thus there is a missing link between micro and macro that is usually just replaced by experimentally determined effective constants.

I also see this as related to our confidence level in a given statistical distribution inferred from an interaction history, i.e. how many individual data points do we have per distribution? This may leave the distribution itself indifferent, but it makes the difference between the distribution itself being just a statistical fluctuation, or being backed up by so much evidence that it's extremely unlikely to be a fluctuation.

In this sense, distinguishable particular probability distributions can give rise to new complexions, and the confidence of the distribution, in the above sense, would then be analogous to Boltzmann's constant. So while the microscopic details are hidden from the macro limit, the total complexity (~energy) cannot be hidden.

/Fredrik

Edit: this is also related to something some people don't like: the inductive construction of "probability of probability", and, quantum mechanically, also to n'th quantization, where quantization really is just an induction step. At each level there is always the "classical side" and the quantized side, but the classical side can play the QM side at another level.
 
  • #7
Coin, have you found answers to the questions you asked here? We didn't do a very thorough job of responding, but you may have other sources outside PF.
Coin said:
...
So here is what I would like to ask:

Is the entropic force a well-recognized enough concept that someone with more of a physics background would be comfortable seeing introduced without much detail, as happens in Verlinde's section 2?

Where should someone interested in learning about entropic forces rigorously go? (Google turns up only some sketches of the concept, like on wikipedia, even less detailed than Verlinde's.)

Also, is it actually legitimate to talk of the "entropic force" in the absence of any other force (as would seem to be necessary to treat the entropic force as a fundamental thing that other forces emerge from)? Like, Verlinde gives the example of an uncoiling polymer, and seems to act like you can just treat the polymer as a complete mathematical abstraction, like a series of points joined by links, with the points moving in brownian motion. Can you really, just like that, separate the behavior of the polymer from, say, the chemical forces that hold it together, or its interactions with the medium it is suspended in?

And if the "entropic force" is really just an abstract tendency for points to cluster in certain locations once they start moving in brownian motion, as Verlinde seems to describe it, then how can it perform work on an external system?

Thanks.

I liked Bee Hossenfelder's reaction, which was initially "hunhh? I don't get it" and then about two weeks later she posted this quote in the Woit discussion of entropic force papers.

“The important thing in science is not so much to obtain new facts as to discover new ways of thinking about them.” ~Sir William Bragg

http://www.math.columbia.edu/~woit/wordpress/?p=2720&cpage=1#comment-53229

You may have already answered your own question "how can it perform work?"
In the polymer example the energy comes from the heat bath, which keeps banging the chain into increasingly random configurations.

But the most useful intuition is what you would pick up, not in a freshman physics course, but in a freshman chemistry course, where you study reactions driven by entropy.
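The freshman-chem way to make that quantitative is the Gibbs criterion ΔG = ΔH − TΔS: a process can soak up heat and still run spontaneously if the entropy gain is large enough. A quick sketch, using approximate textbook values for ammonium nitrate dissolving in water (the cold-pack process), purely as an illustration:

Code:
# Sketch of an entropy-driven process via Delta_G = Delta_H - T * Delta_S.
# Approximate textbook values for NH4NO3 dissolving in water (illustration only).

dH = +25.7e3    # J/mol, endothermic: the solution gets colder
dS = +108.0     # J/(mol K), large entropy gain as the ions disperse

def gibbs(T):
    """Gibbs free energy change at temperature T in kelvin; negative = spontaneous."""
    return dH - T * dS

for T in (220.0, 298.0, 350.0):
    dG = gibbs(T)
    verdict = "spontaneous (entropy wins)" if dG < 0 else "not spontaneous"
    print(f"T = {T:5.1f} K: dG = {dG/1000:+6.1f} kJ/mol -> {verdict}")

# Crossover temperature above which the entropy term dominates:
print(f"crossover near T = dH/dS ≈ {dH/dS:.0f} K")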

Something that comes to mind is a machine that generates electric power based on the difference between fresh water and salt water. It can be installed wherever a river runs into the ocean (where the entropic power is currently simply being wasted). Not yet economical, I believe, but technically feasible.

It uses osmotic pressure across a membrane that passes water molecules but not salt. It raises a column of water above sea level.

A type of desalination plant would run the process backwards, using electric power to put salt water under pressure and obtain fresh water.
 
  • #8
http://arxiv.org/abs/cond-mat/9711071
The 'Friction' of Vacuum, and other Fluctuation-Induced Forces
Mehran Kardar, Ramin Golestanian

"Fluctuation-induced forces are ubiquitous in nature, covering many topics from biophysics to cosmology [1–5]. There are two basic ingredients in these phenomena: (i) A fluctuating medium, such as the electromagnetic (EM) field; and (ii) External objects whose presence suppresses (or in some way modifies) the fluctuations, such as dipoles or conductors. The overall strength of the interaction is proportional to the driving energy of fluctuations (kT and h for thermal and quantum fluctuations, respectively)"

"Interactions between a collection of charged macroions in an aqueous solution of neutralizing counterions, with or without added salt ... The attractive interaction is induced by the diminished charge-fluctuations close to the macroions, (due to the condensation of counterions [26–28], and in this sense related to the effects discussed in the previous sections. Since the connection between the entropic attraction of charged macroions and the general class of fluctuation– induced forces, is seldom made explicit, in Appendix II we present a path integral formulation that makes this analogy more transparent."

"Although less well known than its static counterpart, the dynamical Casimir effect, describing the force and radiation from moving mirrors has also garnered much attention [52–58]. This is partly due to connections to Hawking and Unruh effects (radiation from black holes and accelerating bodies, respectively), suggesting a deeper link between quantum mechanics, relativity, and cosmology [59,5]."
 
  • #9
Thanks Atyy, more examples!
I just googled "osmotic power"
http://en.wikipedia.org/wiki/Osmotic_power
and saw that a 4kW plant was opened in Norway in 2009.

The economic obstacle is the cost of the membrane. It works well enough.

The concept is called "pressure retarded osmosis" (PRO). Osmosis raises the column of water on the saline side of the membrane, creating a hydropower "head" of brackish water with gravitational potential energy.
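To get a feel for the size of that head: the van 't Hoff estimate Π ≈ icRT for seawater against fresh water comes out around 26 bar, i.e. a water column of a couple of hundred metres. Rough numbers only; the salinity and the pressure-retarded-osmosis rule of thumb below are ballpark assumptions, not data from the Norwegian plant:

Code:
# Rough estimate of the osmotic "head" between fresh water and seawater,
# using the van 't Hoff relation Pi = i * c * R * T (ballpark numbers only).

R   = 8.314     # J/(mol K)
T   = 288.0     # K, roughly 15 C seawater
i   = 2         # NaCl dissociates into two ions
c   = 0.55e3    # mol/m^3, ~0.55 mol/L NaCl, an approximate seawater salinity
rho = 1.0e3     # kg/m^3, water density
g   = 9.81      # m/s^2

Pi   = i * c * R * T        # osmotic pressure in pascal
head = Pi / (rho * g)       # equivalent height of a water column in metres

print(f"osmotic pressure ≈ {Pi/1e5:.1f} bar")
print(f"equivalent head  ≈ {head:.0f} m of water")

# In pressure-retarded osmosis only part of this is usable: the saline side is
# typically pressurized to roughly half of Pi to maximize power per unit of
# membrane area, so the working head is correspondingly smaller.
usable = 0.5 * Pi
print(f"usable pressure  ≈ {usable/1e5:.1f} bar (~{usable/(rho*g):.0f} m head)")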

The article has some photographs and diagrams. Osmotic power is being developed both in Norway and in Holland. The Dutch do not use PRO, however; they have some other means of extracting entropic power.

But you learn in beginning Chem that it's normal for industrial chemical reactions to be reversible and to be driven by differences in concentration, or differences in temperature; essentially by statistical likelihood. BTW it's also probably how some fuel cells work. Some types have membranes.
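The "differences in concentration" part can be made concrete with a concentration cell: two identical electrodes sitting in solutions of different concentration develop a Nernst voltage E = (RT/nF) ln(c_high/c_low), with no net chemistry being consumed; the driving "force" is just the mixing entropy. A little sketch with made-up concentration ratios (ideal solutions assumed):

Code:
# Sketch: voltage of a concentration cell driven purely by mixing entropy.
# E = (R*T)/(n*F) * ln(c_high / c_low)   (Nernst equation, ideal solutions)

from math import log

R = 8.314       # J/(mol K)
F = 96485.0     # C/mol, Faraday constant
T = 298.0       # K
n = 1           # electrons transferred per ion (e.g. Ag+/Ag half-cells)

def cell_voltage(c_high, c_low):
    """Open-circuit voltage of the concentration cell, in volts."""
    return (R * T) / (n * F) * log(c_high / c_low)

for ratio in (10, 100, 1000):
    print(f"concentration ratio {ratio:>4}:1 -> E ≈ {cell_voltage(ratio, 1.0)*1000:.0f} mV")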
 

FAQ: Entropic force: I don't get it

What is entropic force?

Entropic force is a concept in physics that relates to the tendency of a system to increase its disorder, or entropy. It is an effective force that arises from the second law of thermodynamics, which states that in any isolated system the total entropy never decreases over time.

How does entropic force work?

Entropic force works by driving systems towards states of higher disorder, or higher entropy. This force arises due to the statistical tendency of systems to increase their entropy, and it is particularly important in systems with many degrees of freedom, such as gases and liquids.
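In thermodynamic terms, if the internal energy of the system does not depend on a coordinate x, the force along x at temperature T comes entirely from the entropy gradient:

F = T \left( \frac{\partial S}{\partial x} \right)_{E} , \qquad \text{equivalently} \qquad F = -\frac{\partial A}{\partial x} \ \ \text{with} \ \ A = U - TS .

This is the same relation, F \Delta x = T \Delta S, that Verlinde's paper takes as its starting point.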

How is entropic force different from other fundamental forces?

Entropic force is different from other fundamental forces, such as gravity and electromagnetism, in that it is not a result of an interaction between particles. Rather, it arises from the statistical behavior of particles and their tendency to increase entropy.

What are some real-world applications of entropic force?

Entropic force has many real-world applications, including in nanotechnology, where it can be used to manipulate and control small particles. It is also important in understanding the behavior of polymers, which are long chains of molecules that exhibit strong entropic forces.

How does entropic force relate to the arrow of time?

Entropic force is closely related to the arrow of time, which describes the one-way direction of time from the past to the future. This is because entropic force is responsible for the increase of disorder and entropy in a closed system, which is in line with the second law of thermodynamics and the direction of time.
