Does a full 3TB hard drive weigh more than an empty one?

In summary, a hard drive does not weigh more when data is stored on it. Any mass difference is far below the error of measurement. A book, however, may contain more or less information depending on how well it has been compressed.
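A back-of-the-envelope check makes the "below the error of measurement" claim concrete. Here is a sketch (assuming room temperature, generously crediting every bit with the full Landauer energy kT ln 2, and converting via E = mc²):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
c = 2.998e8          # speed of light, m/s

bits = 3e12 * 8      # a 3 TB drive, expressed in bits
energy = bits * k_B * T * math.log(2)   # Landauer energy of all bits, J
mass = energy / c**2                    # mass equivalent, kg

print(f"Landauer energy: {energy:.2e} J")   # ~7e-8 J
print(f"Mass equivalent: {mass:.2e} kg")    # ~8e-25 kg
```

Even on this generous estimate the mass change is around 10^-24 kg, many orders of magnitude below what any balance can resolve.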
  • #36
Andy Resnick said:
Hmmm... Let's try this- what do you think I am claiming?

Seriously? We've been in a direct back and forth for several hours now. Asking this question seems argumentative and unnecessary. But in the event you're not trying to backtrack or avoid what you have said earlier, I will answer your question in earnest.

You said in post #17:
Andy Resnick said:
Information is a form of energy, just like heat and pressure.

In another post of yours which I quoted previously (that you have subsequently deleted) you said:
I don't understand why you consider entropy, which has units of J/K, or the entropy of a bit, which is kT ln(2) and has units of Joules, is not energy (or energy/degree). Has introductory physics somehow become irrelevant?

So I "think" you are claiming that information is a form of energy and that it is measured by thermodynamic entropy.

If this is not what you are claiming, please clarify your position.
 
  • #37
adaptation said:
So I "think" you are claiming that information is a form of energy and that it is measured by thermodynamic entropy.

If this is not what you are claiming, please clarify your position.

Yep- you got it. Just checking. Sometimes the threads wander uncontrollably.

Here's the original paper that established the equivalence of information and energy.

http://www.google.com/url?sa=t&sour...stTgBA&usg=AFQjCNEgG29b9aHMFGZ7D1RCM3c70eQ_Vg

And a digested/translated version is here:
http://en.wikipedia.org/wiki/Landauer's_principle

A longer discussion is here:
http://plato.stanford.edu/entries/information-entropy/

Why don't you start with these and decide for yourself?
 
  • #39
Andy Resnick said:
I think what most bothers me about these (selected) responses is the complete and continuing lack of evidence used to justify these outrageous claims. These statements are each contradicted by the entire body of thermodynamics and statistical mechanics.

What the hell are you doing, misrepresenting what I wrote like that? Are you saying I don't know entropy or statistical thermodynamics?

In context, the quote you selected reads "I don't see what would be gained by calling 'information' a form of energy. " - as opposed to using the term entropy. And you did not provide a single reference where people were in fact quantifying energy changes in terms of the amount of 'information'. They were quantifying it in terms of entropy.
 
  • #40
alxm said:
And you did not provide a single reference where people were in fact quantifying energy changes in terms of the amount of 'information'. They were quantifying it in terms of entropy.

Not true- I have provided several references, including a link to the primary reference.
 
  • #41
I may be jumping on this a little late, but I did write a blog post about the original topic of this thread, namely how the weight of a hard drive might theoretically depend on its contents. It only considers magnetic alignment effects, not information entropy, but I think it's relevant. (I may have posted the same link when this issue was previously discussed on PF, I don't remember)
 
  • #42
There is something really wrong here:

For one, entropy in a closed system can increase. Energy cannot change.

And the "information" idea is being tossed around in this thread. The only information that makes sense for decreasing entropy is information ABOUT a given microstate, not just random digits:

Consider the free expansion of an ideal gas with a large number of molecules. The gas has more possible arrangements in the larger volume, and if you WANT, you can equate each microstate with information. The more possible arrangements of molecules, the more "information" you have. However, the energy of the gas did not change during the process, so obviously the "information" gained had no effect.

Same with scratching a piece of metal: you can say there are 50 TB of information in that scratch (and if you made a dictionary and a language that can interpret scratches, there can be...). However, that's not to say you decreased entropy; in fact you increased it, because the information is "useless". In Szilard's engine the information is about the microstate of the particle in the box, not just random BS that you decide to call "information".

You need to be careful what you call information and what you call entropy. You can't take a specific idea and generalize it this radically.
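The free-expansion arithmetic above is easy to verify. A sketch for one mole of ideal gas expanding freely into twice its volume, using the standard ideal-gas result dS = N k ln(V2/V1):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N = 6.022e23          # one mole of molecules

# Free expansion: no work done and no heat exchanged, so the
# internal energy U of an ideal gas is unchanged.
delta_U = 0.0

# The entropy nevertheless increases: dS = N k ln(V2/V1), with V2 = 2*V1.
delta_S = N * k_B * math.log(2)

print(f"Delta U = {delta_U} J")
print(f"Delta S = {delta_S:.2f} J/K")   # ~5.76 J/K
```

Energy unchanged, entropy up by about 5.76 J/K: exactly the point being made here.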
 
  • #43
The OP's question really has been answered:
PaulS1950 said:
If you have five million magnets and you rearrange some does it change what they weigh?

Academic said:
Yes. So a full harddrive could weigh more or less. (as described above)

And I said that if the energy used to write the data is greater than the amount of energy in the drive in its "empty" state, then the drive must weigh more when full since that energy is stored in the magnetic field of the drive.

I think diazona has the most thoughtful answer. Read his/her blog posting. There are actual numbers.
diazona said:
I may be jumping on this a little late, but I did write a blog post about the original topic of this thread, namely how the weight of a hard drive might theoretically depend on its contents. It only considers magnetic alignment effects, not information entropy, but I think it's relevant. (I may have posted the same link when this issue was previously discussed on PF, I don't remember)
 
  • #44
Curl said:
For one, entropy in a closed system can increase. Energy cannot change.

Sigh... these statements are not mutually exclusive.
 
  • #45
Exactly, unless you claim that entropy = energy, which is false.
 
  • #46
Andy Resnick said:
Not true- I have provided several references, including a link to the primary reference.

No, you provided some links to some papers using information theory, as if I were disputing that information theory is useful. They did nothing at all to support your position that "information is a form of energy like heat or pressure". Rather the opposite: they called entropy by its name, not "the amount of information" or some such.

Then you responded by saying that I was ignorant about thermodynamics. - When I clearly wasn't even disputing the thermodynamics in question. I was disputing your sloppy terminology.

To reiterate my point:
1) Information is an abstract concept and its entropy is an abstraction as well. Completely regardless of whether or not entropy existed in physics, entropy would still be a useful, statistical/combinatorial property of information.
2) Information entropy is analogous to physical entropy - when the information is represented by some physical system. - Which it doesn't have to be since it's an abstraction, which is why it's not a good idea to equate the two and conclude that 'information is a form of energy'. Entropy is a broader term. It's not very useful to say steam has 'less information' than water, because nobody ever represents information that way.

Now tell me what in the above you disagree with, instead of constructing straw-men and pretending you're the only one here who knows about information theory and thermodynamics? And answer me this, if 'information' is a form of energy, then is that energy identical to the entropy of a system (or part of it) or not? If no, then I'd like to know where this mysterious 'information energy' is coming from, physically. If yes, then I'd like to know why you're not calling it 'entropy'.
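Point 1) can be illustrated with no physics at all: Shannon entropy is a purely statistical property of a symbol distribution. A minimal sketch (the function is mine, not from any reference above):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: -sum(p * log2(p))
    over the symbol frequencies of the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaa"))   # no uncertainty: 0 bits/symbol
print(shannon_entropy("abab"))   # two equiprobable symbols: 1 bit/symbol
print(shannon_entropy("abcd"))   # four equiprobable symbols: 2 bits/symbol
```

Nothing here mentions temperature or energy; the connection to thermodynamic entropy appears only once the symbols are carried by a physical system.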
 
  • #47
Curl said:
Exactly, unless you claim that entropy = energy, which is false.

Entropy is a form of energy. It's the amount of internal energy not available to do work.
What thermodynamics textbook have you been reading? (or not reading?)
 
  • #48
Then adding information to a system doesn't increase its total energy, just the useful energy in the system. Therefore, no mass added.

3TB hard drive weighs the same before and after, I'm glad you agree.
 
  • #49
Curl said:
Then adding information to a system doesn't increase its total energy, just the useful energy in the system.

If you can magically turn entropy back into work-performing internal energy (Helmholtz) without any work, then I'd love to see you demonstrate this principle with a perpetuum mobile.
(Which, judging from your other posts, you apparently believe is possible as well. At least you're consistent.)

Therefore, no mass added. 3TB hard drive weighs the same before and after, I'm glad you agree.

I don't agree, and suggest you go learn basic physics before you start stating your deluded opinions on what things are and aren't as if they were fact.
 
  • #50
I have not followed the discussion, but I can see lots of disagreement about information and entropy.

I recommend you guys take a look at the original papers by the master, Edwin Thompson Jaynes, about the connection between information theory and statistical physics. Here is the first one: http://bayes.wustl.edu/etj/articles/theory.1.pdf
 
  • #51
As far as I know, no chemical reactions occur in this process, unlike with the book; a hard drive merely rearranges electrons to create storage states, etc. So no, it wouldn't weigh any more than a brand new one.
 
  • #52
The way I understand it, in modern disks a 0 is stored as a charge and a 1 as no charge, hence there could be a difference between an empty/new disk and a filled one. The question is: does a new disk come out formatted as 111111..., as 000000..., or as a completely random 50:50 combination of 1s and 0s?
 
  • #53
alxm said:
Now tell me what in the above you disagree with, instead of constructing straw-men and pretending you're the only one here who knows about information theory and thermodynamics?

Of your entire post, this is basically the only sentence I disagree with- I know hardly any thermodynamics, and even less about information theory.

Let's construct a machine which can directly convert information into work. There will be an apparent paradox, the resolution of which may shed some light on the interrelationship between information and free energy (and entropy).

You and I sit opposite each other, in thermal equilibrium at temperature T. You have a box, full of gas at temperature T, with a partition in the middle. There are N particles of gas in the whole box (N/2 on each side). I send you a binary message, N/2 bits long, encoded with the following information:

If the bit is '0', take a particle from the left side and move it to the right. If the bit is '1', do nothing.

After receiving my message, you have a copy of the message and you give me the box.

But we are not done- we are not yet returned to our starting configuration. There are a few ways to go back- one by simply reversing the steps (you send me the code and I move the particles), another by me allowing the gas to re-equilibrate (either purely dissipatively, or by letting the gas do some work, or perhaps some other method)- but regardless of what happens, we must somehow end up in our starting configuration. Reversing ourselves is boring. More interesting is what I can do with the box of compressed gas.

Here's the apparent paradox: it seems that I can send two messages with the same entropic quantity of information (all '0' or all '1'), and have two different results: if the message is all '0', the gas is fully compressed and I can extract work from it. If the message is all '1', the state of the gas is unchanged and I cannot extract work.

The solution to this paradox lies in the way the information and state of the gas are related. Moving a particle means you performed a measurement on the location of the particle, whereas doing nothing did not require a measurement.

After you had processed the message by moving particles, if you did not forget the message, you now have *two* copies of the information- one is the information in your memory, the other is the distribution of particles in the box. When I allow the gas to re-equilibrate, I have destroyed a copy of the information, consistent with letting the free energy of the gas within the box dissipate. Only then are we back to the starting configuration (to within a parity transformation): you have a single copy of the message, I have a box of gas with N/2 particles per side. In this way, the paradox is similar to Szilard's engine.

Now let's say you forgot the message after moving the particles. Then, when I allow the gas to do work, we are left with *zero* copies of the message: information has been irreversibly converted into work (or heat).

In terms of a heat engine, by taking free energy from a 'hot' source (reading the message) and then deleting the message (the 'cold' reservoir), work can be extracted.
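For concreteness, the energies in this scenario can be tallied for a small illustrative N (the numbers are arbitrary; this is only the ideal-gas and Landauer arithmetic, not a resolution of the bookkeeping discussed above):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature of the box and its surroundings, K
N = 1000             # illustrative particle count

# After the all-'0' message is processed, all N particles sit in
# half the box. Isothermal expansion back to the full volume yields
#   W = N k T ln(V_full / V_half) = N k T ln 2.
W_extract = N * k_B * T * math.log(2)

# Landauer cost of erasing the N/2-bit message afterwards.
erase_cost = (N / 2) * k_B * T * math.log(2)

print(f"Work from expansion:   {W_extract:.2e} J")
print(f"Cost to erase message: {erase_cost:.2e} J")
```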
 
  • #54
so alxm = Maxwell's Demon

he does have a resemblance in the letters of his name.
 
  • #55
Andy, with regard to post #53, are you claiming that the all-zeroes message contains energy and that energy is transferred to the gas?
 
  • #56
alxm said:
If you can magically turn entropy back into work-performing internal energy (Helmholtz) without any work, then I'd love to see you demonstrate this principle with a perpetuum mobile.

I never said that work is not needed, but the work is done elsewhere and energy is NOT ATTACHED to the information.

If I write a letter it could take me 3 hours and I'd burn 30 calories; however, mailing the letter to someone else doesn't mean I'm mailing them 3 hours of work and 30 calories.

I can send a signal using light, say 3 millijoules worth of photons. This message could contain an entire book, or it could contain nothing.

If you suggest I go read a book, then I suggest you buy some logic for yourself.
 
  • #57
Personal attacks don't really add anything to the discussion...
 
  • #58
Curl said:
If I write a letter it could take me 3 hours and I'd burn 30 calories; however, mailing the letter to someone else doesn't mean I'm mailing them 3 hours of work and 30 calories.

Why are you doing this?
Yes, you ARE mailing them 3 hours of work (your work) and 30 calories (your calories).
That the recipient receives it in, say, 1/10th of a second is absolutely meaningless.
 
  • #59
DrGreg said:
Andy, with regard to post #53, are you claiming that the all-zeroes message contains energy and that energy is transferred to the gas?

I don't think it's splitting hairs to say that I claim *information* can encode free energy, and that it costs kTln(2) units of free energy to delete a bit of information.
 
  • #60
pallidin said:
Why are you doing this?
Yes, you ARE mailing them 3 hours of work (your work) and 30 calories (your calories).
That the recipient receives it in, say, 1/10th of a second is absolutely meaningless.

No no no, the energy it took me to write the letter IS NOT carried with the letter; it stays in my room, and all the calories go into increasing the internal energy of my room. The recipient of the letter gets no part of the calories I expended in writing the message.

Similarly, it takes energy to flip the magnets on a hard drive, but after you're done writing data to the hard drive and you let it cool off, the energy of the HD is unchanged. Yes, it requires work, but that work is expended from a different system (and goes into your room's air, increasing the entropy of your room). The hard drive (the carrier of information) doesn't carry any extra energy.
 
  • #61
Curl said:
If I write a letter it could take me 3 hours and I'd burn 30 calories, however if I mail the letter to someone else doesn't mean I'm mailing them 3 hours of work and 30 calories.

Curl said:
No no no, the energy it took me to write the letter IS NOT carried with the letter; it stays in my room, and all the calories go into increasing the internal energy of my room. The recipient of the letter gets no part of the calories I expended in writing the message.

These statements are true. But I think you are confusing the energy required to write a letter with the free energy value of the information contained within the letter. Just because you expended 30 calories to write a letter doesn't mean your letter has 30/(kTln(2)) bits of information. I can't say how much information is in your letter without looking at it (making a copy).
 
  • #62
It seems like there's a lot of hocus pocus going on in this thread.
 
  • #63
Mu naught said:
It seems like there's a lot of hocus pocus going on in this thread.

Don't rat me out. :wink:
 
  • #64
Andy Resnick said:
These statements are true. But I think you are confusing the energy required to write a letter with the free energy value of the information contained within the letter. Just because you expended 30 calories to write a letter doesn't mean your letter has 30/(kTln(2)) bits of information. I can't say how much information is in your letter without looking at it (making a copy).

Okay, help me understand.

In Maxwell's Demon and Szilard's engine, the information the demon receives is all about the state of the particle(s) in the box. It is an instantaneous bit of information, and the reason it increases free energy is because it reverses the effect of entropy (information loss) by the random motion of the particle. It is similar to free-expanding a gas and then re-compressing it.

Keep in mind that the energy extracted (as a consequence of having information available) still comes from the surroundings, and in the case of Szilard's engine, it comes from "allowing the particle to expand back to its original volume", i.e. transferring momentum from the box walls to the particle.

So the information acquired by the demon simply reverses (partially) the information loss caused by an increase in entropy, which is why free energy can increase with that particular information.

However, I fail to see how a hard drive, which stores random bits, can count as entropy reversal. The data on the HD is not about any particular system, and it also has no notion of time, which is crucial when you talk about information loss caused by an increase in entropy. Even the wiki page itself states: "A neat physical thought-experiment demonstrating how just the possession of information might in principle have thermodynamic consequences". Even this type of information is not guaranteed to decrease entropy (and thus increase free energy).
 
  • #65
Curl said:
Okay, help me understand.

In Maxwell's Demon and Szilard's engine, the information the demon receives is all about the state of the particle(s) in the box. It is an instantaneous bit of information, and the reason it increases free energy is because it reverses the effect of entropy (information loss) by the random motion of the particle. It is similar to free-expanding a gas and then re-compressing it.

<snip>

The data on the HD is not about any particular system

Really? The data on my hard drive definitely corresponds to specific things: experimental results, images of things, emails, music... What sort of data do you store on your HD?
 
  • #66
Curl said:
Okay, help me understand.

I'm certainly willing to try- that's what PF is all about...

How much information is contained in your message? It depends on the message.

Let's say your letter, which cost you 30 cal to write, consists of 100 words of English text. If lossless transmission of your message means that I simply read the letter, then because on average English text has 10 bits per word

Mario C. Grignetti, A note on the entropy of words in printed English, Information and Control, Volume 7, Issue 3, September 1964, Pages 304-306, ISSN 0019-9958, DOI: 10.1016/S0019-9958(64)90326-2.
(http://www.sciencedirect.com/science/article/B7MFM-4DX496Y-P/2/2381c3d9d39a607e325a51df589865a2)

your message carries 1000*kTln(2) Joules of free energy. It may help to compare that to the 30 calories you expended writing it.

But perhaps you say "oh no- you need to know the detailed information about where every single atom of ink is on the page in order to completely understand my message". This line of reasoning makes more sense when discussing the transmission of images: say your 30 calorie letter is a drawing of a Laue pattern. How much information does it now contain?

It could be quite a lot- there is a lot of information encoded in a far-field diffraction pattern. Diffraction is actually a very good compression scheme, but it's not lossless:

http://en.wikipedia.org/wiki/Phase_problem
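Putting numbers on this comparison (taking the 30 calories at face value as small calories, and the 10 bits/word figure from the Grignetti note):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

words = 100
bits_per_word = 10                     # average for English text (Grignetti)
bits = words * bits_per_word

info_energy = bits * k_B * T * math.log(2)   # Landauer value of message, J
writing_cost = 30 * 4.184                    # 30 small calories, in joules

print(f"Free-energy value of the message: {info_energy:.2e} J")
print(f"Metabolic cost of writing it:     {writing_cost:.1f} J")
```

The writing cost exceeds the Landauer value of the information by roughly twenty orders of magnitude.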
 
  • #67
Andy, you cannot possibly correlate the energy expended to do work (write a letter) with the information value of the text on a page (one is a subjective construct, the other a physical principle). There is no physical connection; these are entirely different concepts.

A message does not carry "energy", in a theoretical context (forgetting for a moment the energy content of its mass). The equivalence of terms (entropy in this case) is not intended to imply any equivalence of physics. Entropy means different things in an informative system, versus a physical one.

This is really quite silly.

Really? the data on my hard drive definitely corresponds to specific things- experimental results, images of things, emails, music... What sort of data do you store on your HD?

The universe couldn't care less what you think you've stored on your HD. From a thermodynamic perspective, there is no difference between "ordered" and "random" data; the energy content is purely a question of the volume of massive particles, and has nothing to do with their order.
 
  • #68
Andy Resnick said:
Really? the data on my hard drive definitely corresponds to specific things- experimental results, images of things, emails, music... What sort of data do you store on your HD?

The reason I'm confused is that I can't see how you can neglect the sense of time in these entropy calculations. It really bothers me to assign entropy values to an arbitrary bit of information, or even an English word, as in the reference you gave.

I have a little thought experiment that might help you understand my source of confusion:

Consider Szilard's engine where, at some time, the demon gets the information "the particle is on the left". However, instead of closing the shutter/piston and extracting kT ln(2) J of work, it simply eats some cake and comes back to work after a short break. At this point, the 1 bit of information that was given to the demon is still possessed by the demon (he didn't forget "the particle is on the left"). But now the particle has moved, and the kT ln(2) Joules of free energy have vanished, even though the information is preserved... WHAT?

What I'm saying is that information is not directly equal to free energy, and it is not the case that any bit of information means entropy is decreased. Just like not all energy can do work, not all information can decrease entropy. Am I right?
 
  • #69
Curl said:
The reason I'm confused is that I can't see how you can neglect the sense of time in these entropy calculations. It really bothers me to assign entropy values to an arbitrary bit of information, or even an English word, as in the reference you gave.

I have a little thought experiment that might help you understand my source of confusion:

I'm not sure why you are bringing up 'time' all of a sudden, but it's not conceptually difficult to incorporate time into thermodynamics.

As for being 'bothered' that one can assign a quantitative amount of information to a word of English: there's not much I can do about that, other than to encourage you to keep learning until your conceptual conflict is resolved to your satisfaction.

Your thought experiment is flawed: the information about the system is then encoded in cupcakes. The number of cupcakes the demon ate is equal to the number of measurements taken. So the demon made an unauthorized copy of the information.
 
  • #70
Ah, if only all information were encoded in cupcakes... :)
 
