Does a hard drive weigh more when data is stored in it?
saln1 said: Does a hard drive weigh more when data is stored in it?
PaulS1950 said: If you have five million magnets and you rearrange some, does it change what they weigh?
heusdens said: Same question but now for a book. One contains no information but is printed on every page; the other contains a lot of information.
Why should the random print not be considered information?
Andy Resnick said: If I define 'empty' as 'devoid of information' (i.e. all bits set to '0'), and 'full' as 'maximum information' (which would be a random string of 1's and 0's), then because there is a difference in entropy, there is a difference in total energy, and thus a difference in mass. The entropy per bit is kT ln(2), and from that you can calculate the change in mass.
If you have a different definition of 'empty' and 'full', you may get a different result.
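For a sense of scale, here is a rough back-of-the-envelope sketch (my own numbers, not from the thread) of the mass change that a kT ln(2) per bit figure would imply, assuming a hypothetical 1 TB drive at 300 K:

# Mass equivalent of kT*ln(2) per bit. Assumed: 1 TB drive, T = 300 K.
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
c = 2.99792458e8        # speed of light, m/s
T = 300.0               # assumed temperature, K
n_bits = 8e12           # assumed capacity: 1 TB = 8e12 bits

energy_per_bit = k_B * T * math.log(2)    # kT ln(2) per bit, ~2.9e-21 J
total_energy = n_bits * energy_per_bit    # J
delta_m = total_energy / c**2             # kg, via E = m c^2

print(f"energy per bit: {energy_per_bit:.3e} J")
print(f"total energy:   {total_energy:.3e} J")
print(f"mass change:    {delta_m:.3e} kg")   # ~2.6e-25 kg, far below anything measurable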
Academic said: Yes. So a full hard drive could weigh more or less (as described above).
No. As bp_psy's answer implies, you can't pick and choose your definitions of "full" and "empty". You have to use something consistent with the laws of thermodynamics. As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress. That one of them contains information more useful to us isn't relevant.
russ_watters said: As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress.
The fact that you have flipped a coin and gotten "heads" 5 times in a row does not give you the ability to predict what the next flip will be. As a corollary, the fact that you already know the states of a bunch of bits of data and can therefore compress the information doesn't mean you can use that compression algorithm to generate the next bit (that you don't already know).
Andy Resnick said: That is not true- the information content (the "information" entropy) of any discrete signal stream is related to how well you can predict the next value.
So there is a difference between the information content of the signal and the encoding of that information- some compression algorithms (Huffman is one) operate on the principle of "minimum entropy" = lossless compression.
In fact, a completely random string of binary digits has maximum information- you are completely unable to predict the value of the next digit better than 50% of the time- and so the entropy of each bit is a maximum, given by 2kT ln(2) (I erred above).
http://en.wikipedia.org/wiki/Entrop...d_information_theory#Theoretical_relationship
Despite all that, there is an important difference between the two quantities. The information entropy H can be calculated for any probability distribution (if the "message" is taken to be that the event i which had probability p_i occurred, out of the space of the events possible). But the thermodynamic entropy S refers to thermodynamic probabilities p_i specifically.
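To make the H-versus-S distinction concrete, here is a small illustrative sketch (my own, not from the linked article) that computes the Shannon entropy H of two example bit strings and the energy you would get if each bit of H were assigned a kT ln(2) cost; the strings and the temperature are made up:

# Shannon entropy H (bits/symbol) of a bit string, and the kT*ln(2) energy
# scale you get if each bit of H is assigned a thermodynamic cost.
import math
from collections import Counter

def shannon_entropy(s):
    # Shannon entropy in bits per symbol of the empirical distribution of s
    # (max() guards against -0.0 when the string has only one symbol)
    n = len(s)
    return max(0.0, -sum((c / n) * math.log2(c / n) for c in Counter(s).values()))

k_B, T = 1.380649e-23, 300.0        # Boltzmann constant (J/K), assumed temperature (K)

all_zeros = "0" * 64                # H = 0 bits/symbol
mixed = "1011001110001010" * 4      # 8 ones and 8 zeros per block, H = 1 bit/symbol

for name, s in [("all zeros", all_zeros), ("mixed", mixed)]:
    H = shannon_entropy(s)
    E = H * len(s) * k_B * T * math.log(2)   # energy if each bit of H costs kT ln(2)
    print(f"{name:10s}  H = {H:.3f} bits/symbol,  E = {E:.2e} J")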
Whether the internal energy associated with the 0 and 1 states is different is completely irrelevant here, and if you try to use it, you make it easier to falsify the idea that information entropy in a computer carries mass:
Andy said: But that's the crux of the issue, isn't it? In fact, the wooden stick may have very different energies associated with it (if, for example, the height changed and gravity is present). And since energy is required to both read and write information in a memory device, leading to a change in the macrostate of the device (since the two configurations are distinguishable), the internal energy (alternatively, the configurational energy, the information content, the entropy...) of the memory device has been changed. [emphasis added]
russ_watters said: The fact that you have flipped a coin and gotten "heads" 5 times in a row does not give you the ability to predict what the next flip will be. As a corollary, the fact that you already know the states of a bunch of bits of data and can therefore compress the information doesn't mean you can use that compression algorithm to generate the next bit (that you don't already know).
And regardless of this, I'm not seeing that information entropy has a direct relation to mass/energy: http://en.wikipedia.org/wiki/Entrop...d_information_theory#Theoretical_relationship
russ_watters said: Assuming that a 1 and a 0 have different internal energies associated with them leads to the conclusion that a string of 0's and a string of 1's have different energy and therefore different mass. But both contain exactly the same amount of information according to you: none.
Another way to slice it: If you have a string of 1's with a single 0 in it somewhere and you choose to flip a bit (and the energy associated with a flip is the same in each direction), the energy change associated with a bit flip does not depend on which bit you flip, but the "information entropy" does. Thus, the thermodynamic energy of the device and the "information entropy" are not associated with each other.
Alternately, if the internal energy change or external energy required to flip the bits is different, you may end up with a situation where flipping that 1 results in an increase in thermodynamic entropy and a decrease in information entropy. Thus, again, they are not associated with each other.
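A minimal sketch of the bit-flip point above (my own illustration, under the stated assumption that either flip costs the same energy): starting from a string of 1's with a single 0, one flip can either remove the lone 0 or create a second one, and the empirical entropy of the result depends on which bit was flipped:

# One bit flip, same assumed energy cost either way, but a different resulting
# "information entropy" of the string. Illustrative sketch only.
import math
from collections import Counter

def H(s):
    # empirical Shannon entropy, bits per symbol (max() guards against -0.0)
    n = len(s)
    return max(0.0, -sum((c / n) * math.log2(c / n) for c in Counter(s).values()))

start = "1111101111"          # a string of 1's with a single 0
flip_the_zero = "1111111111"  # flip the lone 0: all 1's
flip_a_one = "1011101111"     # flip one of the 1's: two 0's

print(H(start))          # ~0.47 bits/symbol
print(H(flip_the_zero))  # 0.0 bits/symbol
print(H(flip_a_one))     # ~0.72 bits/symbol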
I think another key might be that you are assuming that the ability to represent a string of data with fewer bits makes it actually less information. The problem, though, is that those extra bits don't cease to exist if you apply a compression algorithm to them. So if you take the data on a 3 GB flash drive and compress it to 1 GB, you still have 3 GB of data on the flash drive even if you are no longer interested in using the other 2 GB.
A practical example is that in order to represent a plain black image on a monitor or piece of paper, you need to use the same number of bits of information as a photo of the Sistine Chapel. Though you can store data compressed, in order to use it, it has to be uncompressed. This would imply that a disk with several compressed photos of clear blue sky on it actually contains more data than a photo of the Sistine Chapel that takes up the same amount of space.
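A hedged illustration of the compression point, using Python's zlib with stand-in data (a constant buffer for the plain black image, random bytes for the detailed photo - not real image files): both occupy the same space uncompressed but compress very differently.

# Same uncompressed size, very different compressed size. Stand-in data only.
import os
import zlib

size = 1_000_000                  # 1 MB of raw data each
plain_black = bytes(size)         # all zero bytes, stands in for a blank image
detailed = os.urandom(size)       # random bytes, stands in for a detailed photo

print(len(plain_black), len(detailed))       # both 1,000,000 bytes uncompressed
print(len(zlib.compress(plain_black)))       # roughly 1 kB: compresses enormously
print(len(zlib.compress(detailed)))          # roughly 1 MB: barely shrinks at all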
Andy Resnick said: Information is a form of energy, just like heat and pressure.
alxm said: I don't see what would be gained by calling 'information' a form of energy.
alxm said: Also, depending on the storage medium, there's no reason to assume the two states '0' and '1' are equal in energy, so one can't really assume that the internal energy is determined by entropy alone.
Andy Resnick said: Information theory has provided key insights into a number of systems (in addition to large portions of computer science and digital signal processing), including chemistry.
Of course - if the energy content of a '1' or a '0' is different (say based on a number of electrical charges in a capacitor, or selection of energy level, or something else), then that must be taken into account as well. But we can also encode the information in a symmetric way, such that the information will persist even without external power supplied.
adaptation said: Information is by no means an abstract concept. If it were, we could send information faster than c and violate causality, cause paradoxes, win Nobel Prizes, the works.
Information is basically what makes x different from y; it's the state of a system.
You cannot make this argument by considering ones and zeros. They are just representations of the magnetic states of portions of the drive. They are abstract symbolic constructs. They are irrelevant. You need to consider the physical state of the drive itself.
Andy Resnick said: I don't understand why you consider that entropy, which has units of J/K, or the entropy of a bit, which is kT ln(2) and has units of Joules, is not energy (or energy/degree). Has introductory physics somehow become irrelevant?
adaptation said: I have never had a physics class. Can you point me to a source that says information is equivalent to energy? As I said before, I like the idea, but I have no reason to believe it.
russ_watters said: As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress.
alxm said: I don't see what would be gained by calling 'information' a form of energy.
adaptation said: Information can be measured as entropy rather than as energy. Energy is information, but not the other way around.
adaptation said: The idea of the article and the book is that information theory can be used to describe physical systems. There is no indication that thermodynamic entropy can describe information. It does not work both ways.
Andy Resnick said: Try reading the whole page:
S is reserved for thermodynamics, but H can be applied to any statistical system.
Vanadium 50 said: Can someone explain to me how entropy is entering?
Entropy is the log of the number of microstates for a given macrostate. The macrostate of the drive is specified by its contents - not the microstate.
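As a quick numerical sketch of that definition (my own illustrative numbers, not from the thread): with S = k ln(Ω) and Ω = 2^N for N unconstrained bits, each bit contributes k ln(2) of entropy, which multiplied by T gives the kT ln(2) per bit figure quoted earlier.

# Boltzmann entropy S = k * ln(Omega) for a macrostate that leaves N bits
# unconstrained: Omega = 2**N, so S = N * k * ln(2). Illustrative N only.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
N = 8e12             # assumed number of unconstrained bits (1 TB, illustrative)

S = N * k_B * math.log(2)     # = k_B * ln(2**N)
print(f"S = {S:.3e} J/K")     # ~7.7e-11 J/K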
adaptation said: If you can't find a source to support what you claim
Curl said: Andy, you are missing an important point. In thought experiments of information entropy such as Szilard's engine, the information is ABOUT another given microstate. The information on a hard drive can be "useless" or it can contain a movie, documents, etc. There is no work that can be done with that type of information. However, if you wrote down the arrangements of the magnets of the hard drive on a piece of paper, then yes, you have information about the state of the hard drive.
However, the hard drive doesn't store information about itself. I don't know if I'm explaining this very well, but do you see the point?