Is Information Energy? Exploring Landauer's Principle

In summary, Landauer's principle indicates that any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment.
  • #36
cesiumfrog said:
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)

You'd have to be a dualist to assume that our knowledge and all forms of human-human communication are somehow removed from physics. You actually physically receive information and your brain state physically changes to a new state when you receive information. How it does so is complicated, but it has been proposed several ways. Here's one:
http://books.google.com/books?hl=en...pyramidal neurons bayesian statistics&f=false
 
  • #37
Pythagorean said:
You'd have to be a dualist to assume that our knowledge and all forms of human-human communication are somehow removed from physics. You actually physically receive information and your brain state physically changes to a new state when you receive information. How it does so is complicated, but it has been proposed several ways. Here's one:
http://books.google.com/books?hl=en...pyramidal neurons bayesian statistics&f=false

I don't mean to speak for cesiumfrog, but that's not the interpretation I get from that comment. I take it to mean that human to human communication is a part of physics, but it should not be confused with physical information. This particular discussion is aimed at physical information.

Incidentally, everyone is dualistic. Is nothing you say, do, feel, or think contradictory?
 
  • #38
cesiumfrog said:
I'm not sure what you mean by "extract free energy from information"?

I understand that various computational processes, by their inherent irreversibility, have a thermodynamic cost that can be measured in energy per information bit.

But energy contained in the information? Do you only mean loosely, like in the context of Maxwell's demon (if the information is known to correspond to the microstate of a gas, then having the information allows us to harness part of the thermal energy of the gas for free, say by informing the control of a shutter mechanism in order to compress the gas against a piston without the expenditure of effort that would usually be required)? But it seems like you're abusing/confusing the terminology by in the same breath discussing communication of knowledge (of meta-data of what the information corresponds to), or to ascribe energy content to that knowledge. (Such can't even be analysed in the framework of a closed cycle.)

If I understand what you are asking, my answer is 'yes'. That is, thermodynamics is a theory regarding the various forms of energy, the ways energy can transfer between two systems, and the allowed processes by which one form of energy can be converted into another. There are many forms of energy: mechanical, thermal, electromagnetic, chemical..., to which I add 'information'.

It's not as radical as it may sound. For example, a folded protein has a different energy than an unfolded protein. Where is this energy 'stored'? Microscopically, we may try to assign the difference to detailed structural interactions, just as we sometimes try to ascribe thermal energy to a detailed description of molecular motion. And we know that sometimes that works, other times it fails- dissipative processes can't readily be described using conservative forces.

More economically, we can also say the two protein states have a different 'conformation', 'configuration', or some similar term that ignores the (currently) unmeasurable microscopic picture. What is the 'conformation' of the protein? It's information about the shape.

So I can either treat 'information' as a preferred class of physical properties that (for some reason) cannot be treated as a physical variable. Or, I can accept that information is a form of energy- and then (for example), protein folding becomes a tractable problem.

Here's another example- copying in a lossy environment. Take the 'scratch on a metal' thread. You make a scratch, and I want to make an identical scratch. How much information do I need to do that? For a low-resolution copy, all I need is a few parameters: length, depth, maybe the tool you used and how much pressure you applied. That's not an exact copy- in order to make a medium resolution copy, I need more information: shape of the cutting tip, orientation of the tip and sample, rate of deformation, ... And to make a *perfect* copy, I need atomic-level information about the positions and momenta of all the atoms involved.

Thermodynamics gives us a way to *quantify* this.
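As a rough sketch of what 'quantify' could mean for the scratch example (a toy illustration of my own, not a real model of machining): if each parameter is known to lie in some range and must be pinned down to a given resolution, the information cost is about log2(range/resolution) bits per parameter, and it grows as the copy gets more faithful.

```python
import math

def bits_needed(value_range, resolution):
    """Bits required to specify one parameter to the given resolution."""
    return math.log2(value_range / resolution)

# Hypothetical numbers: a scratch length somewhere in a 10 mm range,
# specified to 1 mm (low-res copy) versus 1 micrometre (high-res copy).
low_res = bits_needed(10.0, 1.0)      # ~3.3 bits
high_res = bits_needed(10.0, 0.001)   # ~13.3 bits
print(low_res, high_res)
```

The point of the sketch is only that information content scales with the fidelity demanded of the copy, which is what the thermodynamic accounting is supposed to capture.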
 
  • #39
cesiumfrog said:
(Aren't cryptography and compression closely related?)

I had to think about this for a bit. No, I don't think they are that related, although there are similarities.

Superficially, they may appear similar- codecs are used to 'package' the original information (e.g. the MPEG-4 codec, PGP), but there is at least one crucial difference:

Cryptography requires use of a key; this is not the same thing as distributing a document containing the process for encrypting the data- I can have a public-key cryptographic scheme:

http://en.wikipedia.org/wiki/Public-key_cryptography

This scheme is *way* too complex for me to understand right now- I don't have the energy (pun definitely intended).
 
  • #40
adaptation said:
I don't mean to speak for cesiumfrog, but that's not the interpretation I get from that comment. I take it to mean that human to human communication is a part of physics, but it should not be confused with physical information. This particular discussion is aimed at physical information.

Incidentally, everyone is dualistic. Is nothing you say, do, feel, or think contradictory?

But human communication does pertain to physical information. There's no other way to transfer and store information.

And I meant the 'philosophy of mind' dualism that posits that mind is separate from the physical universe. If mind were separate from the physical universe then cesiumfrog's complaint would have merit and human communication would somehow be void of physical information. That's not the case though.
 
  • #41
Pythagorean said:
But human communication does pertain to physical information. There's no other way to transfer and store information.

I don't think anyone has disagreed with this. You're saying, "Apples and oranges are both types of fruit." I'm saying, "Yes, I agree with you, but let's talk about apples for the time being because talking about both of them at the same time can confuse the issues."

I don't know of any way to quantify human communication so that it is useful to our discussion. You can talk about bits, but they are irrelevant to the amount of work you can extract from a message. The data ultimately must be interpreted by the human brain. When dealing with human communication you have to think about qualia. For every human that exists there is a different way to interpret a bit.

I hope that makes it more clear.

Pythagorean said:
And I meant the 'philosophy of mind' dualism that posits that mind is separate from the physical universe.
That makes a lot more sense! Thanks for clarifying.
 
  • #42
adaptation said:
I don't think anyone has disagreed with this. You're saying, "Apples and oranges are both types of fruit." I'm saying, "Yes, I agree with you, but let's talk about apples for the time being because talking about both of them at the same time can confuse the issues."

Yeah, I mirrored that sentiment. I just wanted to state that I still think it's valid; it's just not pedagogically efficient.
 
  • #43
Lucien Hardy has written some interesting papers on this subject. See, for example:

arXiv:0910.1323
Entropy for theories with indefinite causal structure

His underlying notion is the universe as a quantum computer.
 
  • #44
Vote to move this topic to PF Lounge "Skepticism and Debunking".
 
  • #45
I think a discussion on the meaning of "information" is worthwhile, especially how quantum information differs from information in the conventional sense. Up to this point, I've assumed the term was being used in the conventional way, but that's clearly not the intent. I think this is where the confusion is coming from.

From http://en.wikipedia.org/wiki/Information#As_a_property_in_physics:
In 2003, J. D. Bekenstein claimed there is a growing trend in physics to define the physical world as being made of information itself (and thus information is defined in this way) (see Digital physics). Information has a well defined meaning in physics. Examples of this include the phenomenon of quantum entanglement where particles can interact without reference to their separation or the speed of light. Information itself cannot travel faster than light even if the information is transmitted indirectly. This could lead to the fact that all attempts at physically observing a particle with an "entangled" relationship to another are slowed down, even though the particles are not connected in any other way other than by the information they carry.

Another link is demonstrated by the Maxwell's demon thought experiment. In this experiment, a direct relationship between information and another physical property, entropy, is demonstrated. A consequence is that it is impossible to destroy information without increasing the entropy of a system; in practical terms this often means generating heat. Another, more philosophical, outcome is that information could be thought of as interchangeable with energy. Thus, in the study of logic gates, the theoretical lower bound of thermal energy released by an AND gate is higher than for the NOT gate (because information is destroyed in an AND gate and simply converted in a NOT gate). Physical information is of particular importance in the theory of quantum computers.
 
  • #46


I hope you like this. :smile:

adaptation said:
I'm not concerned with the energy in the medium in which the information is stored or transmitted. I'm concerned with the information itself. If it is energy, how can I extract or transform it to do work?

I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

We may all observe this information on different types of monitors, printers, text readers etc. I think this may help discuss the idea of energy of the pure information without regard to the media.

There is information in the above string. Some may get it, some may not. Some may understand that I've sent a series of dots and dashes. Some may understand that it represents "soy flojo". Some may understand that it means "I'm lazy". Even though I sent the same information, different levels of information may be interpreted by different readers. As a bonus, some readers may understand more than the bland information that I sent and understand it as an insightful joke, because the human knowledge that "I'm lazy" doesn't lend itself to being transformed into useful work.

I don't suspect the data changes its physical energy based on the various levels of understanding and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.
 
  • #47


kwestion said:
I hope you like this. :smile:



I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

We may all observe this information on different types of monitors, printers, text readers etc. I think this may help discuss the idea of energy of the pure information without regard to the media.

There is information in the above string. Some may get it, some may not. Some may understand that I've sent a series of dots and dashes. Some may understand that it represents "soy flojo". Some may understand that it means "I'm lazy". Even though I sent the same information, different levels of information may be interpreted by different readers. As a bonus, some readers may understand more than the bland information that I sent and understand it as an insightful joke, because the human knowledge that "I'm lazy" doesn't lend itself to being transformed into useful work.

I don't suspect the data changes its physical energy based on the various levels of understanding and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.

The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols. I used hardly any glucose because I only know two letters in Morse code, S and O, so I didn't even try to understand the phrase.

You have to consider the whole system, not just half of it.
 
  • #48


Chronos said:
Lucien Hardy has written some interesting papers on this subject. See, for example:

arXiv:0910.1323
Entropy for theories with indefinite causal structure

His underlying notion is the universe as a quantum computer.

I'm not going to lie. I didn't understand half of that paper. The maths were way beyond me. It had a really interesting premise that I'd not heard of before, an indefinite causal structure. Causality becomes probabilistic rather than definite. Really interesting stuff. Unfortunately, I don't understand it well enough to apply it to this discussion.

It would be cool if you had time to dumb it down a bit (no pun), and put it into context.

kwestion said:
I don't suspect the data changes its physical energy based on the various levels of understanding and so suspect that it would have the same energy as if the information was never understood, was never read, or was never in existence. Seems like zero energy.
Nice sample! Hahaha. I think you are right that the energy content doesn't change based on one's understanding of the data. I agree that the message itself contains zero energy. That's why I think we need to stay away from what we colloquially call information. It starts to get really confusing. For example:
Pythagorean said:
The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols. I used hardly any glucose because I only know two letters in Morse code: S and O so I didn't even try to understand the phrase.

You have to consider the whole system, not just half of it.
Clearly, what Pythagorean is talking about here is chemical energy. That's what glucose is to us. Then we start getting into subjective stuff (as previously suggested) like how useful the information is to different receivers. I just don't see how we can make any progress that way, and yet we keep coming back to it, and people keep confusing types of information.
Q_Goest said:
I think a discussion on the meaning of "information" is worth while, especially the difference between quantum information and information in the conventional sense and how they differ. Up to this point, I've assumed the term was being used in the conventional way, but that's clearly not the intent. I think this is where the confusion is coming from.
From http://en.wikipedia.org/wiki/Information#As_a_property_in_physics:
Thanks for the input, Q_Goest. Philosophically, I consider information to be energy. But I can't seem to reconcile that with what I know about physics. In the context of physics, I'm comfortable saying that energy is information. I just can't quantitatively say that information is energy.

Do you know of any scientific source that states that physical information is energy? Or do you know if this is generally accepted within the scientific community? Or do you know of anyone who is currently working on this? Thanks!
 
  • #49


Pythagorean said:
The energy (information) doesn't exist in the symbols, it exists between the reader and the symbols. [...]
You have to consider the whole system, not just half of it.

That seems reasonable, but my understanding is that this topic spun off of the question that regarded half of the system you are referring to. That is, whether a memory device has a different weight based on the information that is stored on it. The weight change due to the information was separated into two pieces: a) the weight due to the physical technique of storing the information, and b) the weight of the information itself. I understand the new topic to be focused on (b). The original question did not involve a reader. Here it sounds like you would say that if there is no reader, there's no information in (b) and hence no weight due to (b). Does that correctly represent your idea?

Surely you don't think, for instance, that we all used exactly the same amount of glucose reading those symbols.
No, I was suggesting that the energy of the information itself was zero in all cases, regardless of the reader's ability to extract different levels of information from the same raw data.
 
  • #50


kwestion said:
I'd like to follow up on that thought by sharing some sample information:

"... --- -.-- / ..-. .-.. --- .--- ---"

I've addressed this already. There are two equivalent ways of quantifying the entropy of this message:

1) Shannon entropy. This measures the average unpredictability of the symbols: the closer the message is to a random sequence, the larger the entropy and the *more* information is contained in the message. That's the motivation for introducing the term 'negentropy'.

2) Kolmogorov complexity (algorithmic entropy). This measures the length of the shortest program that can reproduce the message, i.e. how much information is intrinsic to the message itself.

So, while I may not be able to understand the message, I think you would agree I can make a copy of the message. The ease or difficulty of copying the message is an equivalent measure of the information content of the message.
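As a concrete sketch of (1), here is a minimal computation of the per-symbol Shannon entropy of the quoted string, treating each character as an independent symbol. That independence assumption is a simplification, and the code is my own illustration rather than anything established in this thread:

```python
from collections import Counter
from math import log2

def shannon_entropy(message):
    """Per-symbol Shannon entropy (in bits) from empirical character frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

msg = "... --- -.-- / ..-. .-.. --- .--- ---"
h = shannon_entropy(msg)
# Four distinct symbols ('.', '-', ' ', '/'), so h lies between 0 and log2(4) = 2.
print(f"{h:.3f} bits/symbol")
```

A fully repetitive message scores zero, while a string whose symbols are equally frequent scores the maximum for its alphabet, matching the "closer to random, more information" point above.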
 
  • #51
It's an erasure of a bit that requires energy, not the creation of a bit...

Landauer's Principle, first argued in 1961[1] by Rolf Landauer of IBM, holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information bearing degrees of freedom of the information processing apparatus or its environment". (Bennett 2003)[2].

Specifically, each bit of lost information will lead to the release of an amount kT ln 2 of heat, where k is the Boltzmann constant and T is the absolute temperature of the circuit. On the other hand, if no information is erased, computation may in principle be achieved which is thermodynamically reversible, and require no release of heat. This has led to considerable interest in the study of reversible computing.

The principle is widely accepted as physical law; but in recent years it has been challenged...


...For a computational operation in which 1 bit of logical information is lost, the amount of entropy generated is at least k ln 2, and so the energy that must eventually be emitted to the environment is E ≥ kT ln 2.



http://en.wikipedia.org/wiki/Landauer's_principle
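To put numbers on the quoted bound (using only the standard constants, nothing specific to this discussion), the minimum heat released per erased bit at room temperature is:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer limit: minimum heat released per bit of information erased.
E_landauer = k_B * T * math.log(2)
print(f"{E_landauer:.3e} J per bit")  # ~2.87e-21 J
```

That is around twenty orders of magnitude below what present-day hardware dissipates per logic operation, which is why the limit matters in principle but not yet in engineering practice.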
 
  • #52


Andy Resnick said:
The ease or difficulty of copying the message is an equivalent measure of the information content of the message.

If I use a pen and paper, the "difficulty" of copying the message changes. If I cut and paste it with the handy tools built into my web browser, it's easier. I could represent this message with my fingers and toes, with sticks and stones, black and red ants, planets and stars. Or I could simply remember the pattern. The "difficulty" in copying the message is not equivalent to the information content in the message. You also need to look at the method of copy and the storage medium.

As I have stated repeatedly, storage media are not relevant here. As I said before, computational processes are a form of work. The fact that work produces heat is not strictly relevant here. Although, I would be glad to learn of any computational process that allowed me to extract work from information regardless of its storage medium. That would certainly be relevant.

You either continue to misunderstand the point of this discussion or refuse to acknowledge it. You still have not provided the source I requested or even acknowledged your failure or refusal to do so. You are not adding to this conversation.
 
  • #53


kwestion said:
That seems reasonable, but my understanding is that this topic spun off of the question that regarded half of the system you are referring to. That is, whether a memory device has a different weight based on the information that is stored on it. The weight change due to the information was separated into two pieces: a) the weight due to the physical technique of storing the information, and b) the weight of the information itself. I understand the new topic to be focused on (b). The original question did not involve a reader. Here it sounds like you would say that if there is no reader, there's no information in (b) and hence no weight due to (b). Does that correctly represent your idea?

Not quite. The "reader" is now the hard drive. It has a state previous to measuring the information (being written to) and it must change states to make that measurement. Of course, hard drives are simple; they only store one kind of information (i.e. they only have one kind of memory) and they only take one kind of measurement. This isn't subjective information; it's defined by the complexity of the states within the hard drive. If we randomly assign states to the hard drive, it will have near maximum information, but none of it will be useful to a human.

Anyway, my point is that there is no case of information that doesn't physically exist. I'm taking the reductionist view, but I'm not anti-integration by any means.

@Andy:

So would information be considered a kind of potential energy in the integrative view?
 
  • #54


Pythagorean said:
@Andy:

So would information be considered a kind of potential energy in the integrative view?

I guess... it's a contribution to the internal energy 'U' (E = K.E. + P.E. + U).

The fact that it's free energy would imply that negentropy (the 'official' term) is a form of potential energy, like how the chemiosmotic mechanism stores energy by maintaining a system away from equilibrium.
 
  • #55
Naty1 said:
It's an erasure of a bit that requires energy, not the creation of a bit...

Yes, and the use of information to create a reversible engine implies that copying the information increases the available free energy. Thus, energy is conserved when the information is erased.
 
  • #56
I found something that might help a bit:

selfAdjoint said:
Information producing energy? Your discussion above has information as processed and acted on by humanity as producing energy - it causes conversion of energy into heat (unavailable energy) that way too; teach someone to drive and you have one more driver on the road using up stored fossil energy and producing thermally distributed heat.

Let's consider a more basic level. Quantum mechanics says the components of the world, quanta, are defined by amplitudes which are represented in the mathematical formalism as equivalence classes of vectors with complex number components (could be more complicated linear objects too, but it doesn't change the argument). Now in certain situations - interactions with another quantum system - these complex amplitudes can be reduced to a set of real number vectors, with each vector coming with a probability (the vectors are called eigenstates and the probabilities are called eigenvalues), and just one of those vectors will be realized in actual spacetime. And these realized eigenstates are all there is.

Now treating those complex amplitudes as really-existent things has problems and so some physicists prefer to think of them as just information. If this is so, then THE WHOLE WORLD at a sufficiently deep (quantum) level is just information. Note that all energy comes from quantum interactions (including gravitic energy if the physicists are right and gravity is quantum underneath it all).
 
  • #57


Andy Resnick said:
I guess... it's a contribution to the internal energy 'U' (E = K.E. + P.E. + U).

The fact that it's free energy would imply that negentropy (the 'official' term) is a form of potential energy, like how the chemiosmotic mechanism stores energy by maintaining a system away from equilibrium.

So generalizing, negentropy is the "opposite" of entropy, and entropy is a measure of disorder. Is negentropy a measure of order?
 
  • #58


Pythagorean said:
So generalizing, negentropy is the "opposite" of entropy, and entropy is a measure of disorder. Is negentropy a measure of order?

The term 'negentropy' and other similar words were coined to give a more familiar sense to information and meaning:

http://en.wikipedia.org/wiki/Negentropy

Negentropy is a measure of distance from equilibrium and can sometimes correspond to our intuitive sense of 'order' in a system. I guess it's also similar to exergy:

http://en.wikipedia.org/wiki/Exergy
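A minimal numerical sketch of that idea (my own illustration, using one common convention): for a discrete probability distribution, negentropy can be taken as the distance of its entropy from the maximum-entropy (uniform, i.e. equilibrium) case, so a uniform distribution has zero negentropy and a fully ordered one has the most.

```python
from math import log2

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

def negentropy(p):
    """Distance from maximum entropy: J = log2(n) - H(p)."""
    return log2(len(p)) - entropy(p)

print(negentropy([0.25] * 4))            # uniform: 0.0 (no 'order')
print(negentropy([1.0, 0.0, 0.0, 0.0]))  # deterministic: 2.0 bits (maximal 'order')
```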
 
  • #59
kwestion said:
No, I was suggesting that the energy of the information itself was zero in all cases, regardless of the reader's ability to extract different levels of information from the same raw data.
And this is why that type of information is not my focus. Any value in the sentence "I'm lazy." is completely subjective. I just don't see how it can be considered energy.
Naty1 said:
It's an erasure of a bit that requires energy, not the creation of a bit...
http://en.wikipedia.org/wiki/Landauer's_principle
Landauer's principle was addressed in the very first post of this discussion.
 
  • #60
adaptation said:
And this is why that type of information is not my focus. Any value in the sentence "I'm lazy." is completely subjective. I just don't see how it can be considered energy.

Landauer's principle was addressed in the very first post of this discussion.

The funny thing is that the sentence/information "I'm lazy" will evoke a response in an observer. This implies that it is energy, or at the least it is the result of energy, and lends itself to an energetic reaction like "get off the couch" or "you sure are!".
 
  • #61
baywax said:
The funny thing is that the sentence/information "I'm lazy" will evoke a response in an observer. This implies that it is energy, or at the least it is the result of energy, and lends itself to an energetic reaction like "get off the couch" or "you sure are!".

You got me thinking. Information can be the result of energy. Can energy be the result of information?

The more precise information I have about a system, the more energy I am able to extract from it. (This assumes that my measurements are nondestructive.) So there is definitely a relationship between the quality of information, quality here meaning how accurate the information is, and the work it can produce. This should hold true for any system that contains free energy.

Is this correct? Is there a way to quantify this?
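One textbook way to put numbers on this (a standard result, not something settled in this thread): for a Szilard-type engine, a one-particle version of the Maxwell's demon setup discussed earlier, the average work extractable per measurement is bounded by kT ln 2 times the mutual information between the measurement outcome and the particle's actual position. A noiseless measurement yields the full kT ln 2; as the measurement gets less accurate, the bound shrinks to zero. A sketch, where the symmetric error model is my own simplifying assumption:

```python
from math import log, log2

k_B = 1.380649e-23  # Boltzmann constant, J/K

def binary_entropy(eps):
    """H(eps) in bits."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * log2(eps) - (1 - eps) * log2(1 - eps)

def max_work(T, error_rate):
    """Upper bound on extractable work per cycle of a binary Szilard engine
    whose measurement is wrong with probability error_rate.
    Mutual information I = 1 - H(error_rate) bits; W_max = k_B * T * ln(2) * I."""
    mutual_info_bits = 1.0 - binary_entropy(error_rate)
    return k_B * T * log(2) * mutual_info_bits

print(max_work(300.0, 0.0))  # perfect measurement: ~2.87e-21 J per cycle
print(max_work(300.0, 0.5))  # useless measurement: 0 J
```

So on this picture the "quality" of the information does map onto extractable work in a quantifiable way, at least for this idealized engine.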
 
  • #62
adaptation said:
You got me thinking. Information can be the result of energy. Can energy be the result of information?

The more precise information I have about a system, the more energy I am able to extract from it. (This assumes that my measurements are nondestructive.) So there is definitely a relationship between the quality of information, quality here meaning how accurate the information is, and the work it can produce. This should hold true for any system that contains free energy.

Is this correct? Is there a way to quantify this?

First of all, is "potential energy" similar, the same or unrelated to "free energy"? If energy does not do work is it really there?

Second... the "quality" of the information (as you say, the accuracy, efficiency, and "correctness") certainly would determine the amount and the "quality" of the energy being passed along.

When someone uses "x" amount of energy to tell me to look out for a falling rock... it would seem that "x" would be the amount of energy I would use to move out of the way... as in Newton's "every action" axiom.
 
  • #63
Could information, for instance, be responsible for dark matter? Is it that real? Should it contribute to the total energy of the universe?
 
  • #64
baywax said:
First of all, is "potential energy" similar, the same or unrelated to "free energy"?
There are actually two common types of free energy, Gibbs and Helmholtz. They are two of the thermodynamic potentials. In that way, you could consider free energy a type of potential energy. I don't claim to be an expert though.
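For reference, the two free energies differ only in which variables are held fixed; here is a minimal sketch with made-up numbers purely for illustration:

```python
def helmholtz(U, T, S):
    """A = U - T*S: work extractable at constant temperature and volume."""
    return U - T * S

def gibbs(H, T, S):
    """G = H - T*S: work extractable at constant temperature and pressure
    (H is enthalpy, H = U + p*V)."""
    return H - T * S

# Illustrative values only (joules, kelvin, joules per kelvin):
print(helmholtz(1000.0, 300.0, 1.5))  # 550.0 J
print(gibbs(1200.0, 300.0, 1.5))      # 750.0 J
```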
baywax said:
If energy does not do work is it really there?
Yes. I could throw a piece of coal into my train's engine. I could get work from the heat produced by the chemical reaction of burning the coal. The train would move. If I didn't burn the coal, it would still contain energy. The coal's energy is not dependent on me getting work from it.

Or do you mean in some kind of philosophical sense of the word real? And I suppose it might also depend on what you mean by work...
baywax said:
When someone uses "x" amount of energy to tell me to look out for a falling rock... it would seem that "x" would be the amount of energy I would use to move out of the way... as in Newton's "every action" axiom.
Dodging a falling rock will almost certainly use more energy than warning someone about the rock. It would need to be a really passionate warning to compare to quickly moving your entire body out of the way of a deadly peril with adrenalin pumping through your veins, heavy breathing, and all that.
Pythagorean said:
Could information, for instance, be responsible for dark matter? Is it that real? Should it contribute to the total energy of the universe?
I would say that if dark matter exists, it contains information. If information really is energy, it should be considered as part of the total energy of the universe. But is it energy?

I'm going to go and rub my three good brain cells together and see if I can apply some numbers to any of this.
 
  • #65
baywax said:
First of all, is "potential energy" similar, the same or unrelated to "free energy"? If energy does not do work is it really there?

"Potential" energy is (usually) a function of mechanical coordinates- the position of a particle in a well, for example. "Free" energy is a more general concept, and refers to the amount of energy that can be converted into work (that is, the total energy less the product of temperature and entropy). The free energy contains a contribution from the potential energy.



baywax said:
When someone uses "x" amount of energy to tell me to look out for a falling rock... it would seem that "x" would be the amount of energy I would use to move out of the way... as in Newton's "every action" axiom.

That's an incorrect application of the principle.
 
  • #66
adaptation said:
The more precise information I have about a system, the more energy I am able to extract from it. (This assumes that my measurements are nondestructive.) So there is definitely a relationship between the quality of information, quality here meaning how accurate the information is, and the work it can produce. This should hold true for any system that contains free energy.

Is this correct? Is there a way to quantify this?

This is incorrect. I was confusing knowledge about information and physical information itself. In actuality there is a relationship between the quality of information and the amount of work I can use it to produce. The accuracy of the information does not increase the system's ability to do work. It increases the ability of my process to extract the work.

Exploiting the accuracy of physical information requires consciousness to act on the knowledge of that information.

*returns to drawing board*
 
  • #67
adaptation said:
I'm going to go and rub my three good brain cells together and see if I can apply some numbers to any of this.

Please do that, because you want to quantify information = energy and prove it (at least mathematically), and that's the only way to get there. My meager attempt at the quantification with the warning = avoiding rock idea didn't work.

If you can prove info is energy it would be like saying a road sign contains potential energy in that it is continuously warning 24/7 and the "energy" does its work only when a passing driver reads and deciphers the message on the road sign.

––––––––––––––––

I think it can be safely said that information is an indication that energy is or was present at one time.

For example a 40,000 year old pictograph or petroglyph on a cliff face is an indication that quite a lot of energy was spent at that site at one time.

The interpretation of that "information" also takes quite a load of energy... the energy it takes to get to the site... the energy it took to complete an Archaeology degree... etc etc... but, is the pictograph or the petroglyph actually energy in its own right?... I am intuitively doubtful.
 
  • #68
Andy Resnick said:
"Potential" energy is (usually) a function of mechanical coordinates- the position of a particle in a well, for example. "Free" energy is a more general concept, and refers to the amount of energy that can be converted into work (that is, the total energy less the product of temperature and entropy). The free energy contains a contribution from the potential energy.

That's an incorrect application of the principle.

Thank you!
 