Does a full 3TB hard drive weigh more than an empty one?

In summary, a hard drive does not weigh measurably more when data is stored on it; any mass difference is far below the error of measurement. A book, however, may contain more or less information depending on how well its contents can be compressed, even though its mass stays the same.
  • #1
saln1
Does a hard drive weigh more when data is stored in it?
 
  • #2
Since writing to a HD is just modifying magnetic regions, I would say no.
Electrons are not being added or taken away.
 
  • #3
If you have five million magnets and you rearrange some does it change what they weigh?
 
  • #4
is this a trick question or what?

is it some sort of riddle?
 
  • #5
saln1 said:
Does a hard drive weigh more when data is stored in it?

There was an interesting discussion about this exact question some months ago; a critical definition must be made of what exactly constitutes 'empty' and 'full'.

If I define 'empty' as 'devoid of information' (i.e. all bits set to '0'), and 'full' as 'maximum information' (which would be a random string of 1's and 0's), then because there is a difference in entropy, there is a difference in total energy, and thus a difference in mass. The entropy per bit is kT ln(2), and from that you can calculate the change in mass.

If you have a different definition of 'empty' and 'full', you may get a different result.
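As a rough back-of-the-envelope sketch of that calculation (Python; I'm assuming room temperature and taking the kT ln(2) figure per bit at face value):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8   # speed of light, m/s
T = 300.0          # assumed room temperature, K

# Energy scale associated with one bit at temperature T, per the post above
energy_per_bit = k * T * math.log(2)   # ~2.9e-21 J
mass_per_bit = energy_per_bit / c**2   # equivalent mass via E = m c^2

print(f"{energy_per_bit:.2e} J per bit")   # ~2.87e-21 J
print(f"{mass_per_bit:.2e} kg per bit")    # ~3.2e-38 kg
```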
 
  • #6
PaulS1950 said:
If you have five million magnets and you rearrange some does it change what they weigh?

Yes. So a full hard drive could weigh more or less (as described above).
 
  • #7
Has this effect ever been demonstrated in an experiment? Or is the mass difference below the error of measurement?
 
  • #8
Same question, but now for a book. One contains no information even though every page is printed on; the other contains a lot of information. Assume the masses of the two books (paper + ink) are exactly the same.

Could one (in principle) by this method detect the amount of information a book contains, just by measuring the mass and not reading it?
 
  • #9
heusdens said:
Same question, but now for a book. One contains no information even though every page is printed on; the other contains a lot of information.
Why should the random print not be considered information?
 
  • #10
Andy Resnick said:
If I define 'empty' as 'devoid of information' (i.e. all bits set to '0'), and 'full' as 'maximum information' (which would be a random string of 1's and 0's), then because there is a difference in entropy, there is a difference in total energy, and thus a difference in mass. The entropy per bit is kT ln(2), and from that you can calculate the change in mass.

If you have a different definition of 'empty' and 'full', you may get a different result.
Academic said:
Yes. So a full hard drive could weigh more or less (as described above).
No. As bp_psy's answer implies, you can't pick and choose your definitions of "full" and "empty". You have to use something consistent with the laws of thermodynamics. As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress. That one contains information more useful to us isn't relevant.

Consider that you have two bits of data. They might have one of the following four configurations:

00
01
10
11

All four contain exactly the same amount of information regardless of whether one is more useful to you than the others.

A hard drive or any other arrangement of magnets contains the same amount of information regardless of how useful that information is to you. Similarly, two books with the same number of letters and spaces contain exactly the same amount of information, regardless of the arrangement of the letters and spaces.
 
  • #11
russ_watters said:
As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress.

That is not true- the information content (the "information" entropy) of any discrete signal stream is related to how well you can predict the next value.

So there is a difference between the information content of the signal and the encoding of that information- some compression algorithms (Huffman is one) operate on the principle of "minimum entropy" = lossless compression.

In fact, a completely random string of binary digits has maximum information- you are completely unable to predict the value of the next digit better than 50% of the time- and so the entropy of each bit is a maximum, given by kT 2ln(2) (I erred above).
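As a small numerical illustration of the predictability point (a sketch only; the frequency-based entropy estimate below is my own simplification and is just an upper bound for correlated data):

```python
import math
import random

def per_bit_entropy(bits):
    """Empirical Shannon entropy per symbol, in bits, from 0/1 frequencies."""
    p1 = sum(bits) / len(bits)
    h = 0.0
    for p in (p1, 1 - p1):
        if p > 0:
            h -= p * math.log2(p)
    return h

all_zeros = [0] * 10_000
coin_flips = [random.randint(0, 1) for _ in range(10_000)]

print(per_bit_entropy(all_zeros))   # 0.0  -> the next bit is perfectly predictable
print(per_bit_entropy(coin_flips))  # ~1.0 -> the next bit is maximally uncertain
```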
 
  • #12
How is there more "energy" associated with a state of random digits? You are thinking too much in terms of your equations and are neglecting logic.
 
  • #13
Andy Resnick said:
That is not true- the information content (the "information" entropy) of any discrete signal stream is related to how well you can predict the next value.

So there is a difference between the information content of the signal and the encoding of that information- some compression algorithms (Huffman is one) operate on the principle of "minimum entropy" = lossless compression.

In fact, a completely random string of binary digits has maximum information- you are completely unable to predict the value of the next digit better than 50% of the time- and so the entropy of each bit is a maximum, given by kT 2ln(2) (I erred above).
The fact that you have flipped a coin and gotten "heads" 5 times in a row does not give you the ability to predict what the next flip will be. As a corollary, the fact that you already know the states of a bunch of bits of data and can therefore compress the information doesn't mean you can use that compression algorithm to generate the next bit (one that you don't already know).

[edit] Another issue, maybe more relevant: Using lossless compression, you can *perhaps* fit 3 TB of data on a 1 TB disk drive and, depending on the construction, the 1 TB disk drive could be substantially lighter than the 3 TB drive. I don't consider that to be in keeping with the spirit of the question. [/edit]
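For what it's worth, here is a quick sketch of that compressibility difference, using Python's standard zlib on stand-in data (the exact sizes will vary):

```python
import os
import zlib

size = 1_000_000                   # 1 MB stand-in for a disk image
all_zeros = bytes(size)            # highly redundant data (an "empty" drive)
noise = os.urandom(size)           # incompressible data ("full" of random noise)

print(len(zlib.compress(all_zeros)))  # ~1 kB: the redundancy compresses away
print(len(zlib.compress(noise)))      # ~1 MB: random data does not compress
```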

And regardless of this, I'm not seeing that information entropy has a direct relation to mass/energy:
Despite all that, there is an important difference between the two quantities. The information entropy H can be calculated for any probability distribution (if the "message" is taken to be that the event i which had probability p_i occurred, out of the space of the events possible). But the thermodynamic entropy S refers to thermodynamic probabilities p_i specifically.
http://en.wikipedia.org/wiki/Entrop...d_information_theory#Theoretical_relationship
 
  • #15
Ugh. I have no intention of rehashing that whole discussion and we're pretty much on a course to do exactly that, so I've found a quote in there I think is key:
Andy said:
But that's the crux of the issue, isn't it? In fact, the wooden stick may have very different energies associated with it (if, for example, the height changed and gravity is present). And since energy is required to both read and write information in a memory device, leading to a change in the macrostate of the device (since the two configurations are distinguishable), the internal energy (alternatively, the configurational energy, the information content, the entropy...) of the memory device has been changed. [emphasis added]
Whether the internal energy associated with the 0 and 1 states is different is completely irrelevant here and if you try to use it, you make it easier to falsify the idea that information entropy in a computer carries mass:

Assuming that a 1 and a 0 have different internal energies associated with them leads to the conclusion that a string of 0's and a string of 1's have different energy and therefore different mass. But both contain exactly the same amount of information according to you: none.

Another way to slice it: If you have a string of 1's with a single 0 in it somewhere and you choose to flip a bit (and the energy associated with a flip is the same in each direction), the energy change associated with a bit flip does not depend on which bit you flip, but the "information entropy" does. Thus, the thermodynamic energy of the device and the "information entropy" are not associated with each other.
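A toy illustration of that asymmetry (a sketch; the frequency-based per-bit entropy estimate is my own choice of how to quantify it):

```python
import math

def per_bit_entropy(bits):
    """Empirical Shannon entropy per symbol, in bits, from 0/1 frequencies."""
    p1 = sum(bits) / len(bits)
    h = 0.0
    for p in (p1, 1 - p1):
        if p > 0:
            h -= p * math.log2(p)
    return h

n = 1024
mostly_ones = [1] * (n - 1) + [0]     # a string of 1's with a single 0

flip_the_zero = [1] * n               # flip the lone 0 -> all 1's
flip_a_one = [1] * (n - 2) + [0, 0]   # flip one of the 1's -> two 0's

print(per_bit_entropy(mostly_ones))   # ~0.011 bits/symbol
print(per_bit_entropy(flip_the_zero)) # 0.0   -> this flip decreases the entropy
print(per_bit_entropy(flip_a_one))    # ~0.020 -> this flip increases the entropy
```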

Alternately, if the internal energy change or external energy required to flip the bits is different, you may end up with a situation where flipping that 1 results in an increase in thermodynamic entropy and a decrease in information entropy. Thus, again, they are not associated with each other.

I think another key might be that you are assuming that the ability to represent a string of data with fewer bits makes it actually less information. The problem, though, is that those extra bits don't cease to exist if you apply a compression algorithm to them. So if you take the data on a 3 GB flash drive and compress it to 1 GB, you still have 3 GB of data on the flash drive even if you are no longer interested in using the other 2 GB.

A practical example is that in order to represent a plain black image on a monitor or piece of paper, you need to use the same number of bits of information as a photo of the Sistine Chapel. Though you can store data compressed, in order to use it, it has to be uncompressed. This would imply that a disk with several compressed photos of clear blue sky on it actually contains more data than a photo of the Sistine Chapel that takes up the same amount of space.
 
  • #16
russ_watters said:
The fact that you have flipped a coin and gotten "heads" 5 times in a row does not give you the ability to predict what the next flip will be. As a corollary, the fact that you already know the states of a bunch of bits of data and can therefore compress the information doesn't mean you can use that compression algorithm to generate the next bit (one that you don't already know).

And regardless of this, I'm not seeing that information entropy has a direct relation to mass/energy: http://en.wikipedia.org/wiki/Entrop...d_information_theory#Theoretical_relationship

The entropy of information cannot easily be applied to what you already know- the entropy is zero for information you *already* know. The issue is the change of entropy associated with reading the information (alternatively, making a measurement). In fact, it may be more useful to associate (changes to) information entropy with that of making a measurement on a system.

Lossless compression means that the information content of the pre-compressed message is identical to the information content of the compressed image, and clearly lossy compression is associated with the loss of *information*. A lack of information- not knowing what the next measurement will produce- is associated with entropy. Because of this, sometimes people use 'negentropy' to discuss information thermodynamics, as the negentropy is a measure of what you *do* know, not what you *don't* know.

Lastly, energy is energy is energy- a Joule of heat energy is equivalent to a Joule of mechanical energy is equivalent to a Joule of information energy. Energy and mass are likewise equivalent.
 
  • #17
russ_watters said:
Assuming that a 1 and a 0 have different internal energies associated with them leads to the conclusion that a string of 0's and a string of 1's have different energy and therefore different mass. But both contain exactly the same amount of information according to you: none.

Another way to slice it: If you have a string of 1's with a single 0 in it somewhere and you choose to flip a bit (and the energy associated with a flip is the same in each direction), the energy change associated with a bit flip does not depend on which bit you flip, but the "information entropy" does. Thus, the thermodynamic energy of the device and the "information entropy" are not associated with each other.

Alternately, if the internal energy change or external energy required to flip the bits is different, you may end up with a situation where flipping that 1 results in an increase in thermodynamic entropy and a decrease in information entropy. Thus, again, they are not associated with each other.

I think another key might be that you are assuming that the ability to represent a string of data with fewer bits makes it actually less information. The problem, though, is that those extra bits don't cease to exist if you apply a compression algorithm to them. So if you take the data on a 3 GB flash drive and compress it to 1 GB, you still have 3 GB of data on the flash drive even if you are no longer interested in using the other 2 GB.

A practical example is that in order to represent a plain black image on a monitor or piece of paper, you need to use the same number of bits of information as a photo of the Sistine Chapel. Though you can store data compressed, in order to use it, it has to be uncompressed. This would imply that a disk with several compressed photos of clear blue sky on it actually contains more data than a photo of the Sistine Chapel that takes up the same amount of space.

I really don't understand what you are saying: let's say the memory device was empty- all bits are set to the same number. Then I only need *1* number (well... 2 numbers, one for the number of bits) to completely specify the state of memory. Clearly, that's a low information state. How many numbers do I need to represent a 3 TB string of '0' with a single '1' located somewhere? Three numbers- more information is needed to specify the state. And so on...
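Here is a rough sketch of that counting argument, using a toy run-length description of my own (not a real compressor):

```python
import random

def run_length_description(bits):
    """Describe a bit string as (first value, run lengths) -- a toy measure
    of how many numbers are needed to specify the state exactly."""
    runs = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append(count)
            count = 1
    runs.append(count)
    return [bits[0]] + runs

n = 1_000_000
all_zeros = [0] * n
one_flipped = [0] * (n // 2) + [1] + [0] * (n - n // 2 - 1)
random_bits = [random.randint(0, 1) for _ in range(n)]

print(len(run_length_description(all_zeros)))    # 2 numbers
print(len(run_length_description(one_flipped)))  # 4 numbers with this scheme
print(len(run_length_description(random_bits)))  # ~500,000 numbers
```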

It's like in order to have a perfectly detailed map of a city, the map must be as large as the city. Making the map smaller means less information can be encoded.

This really isn't a trick question. Information is a form of energy, just like heat and pressure.
 
  • #18
Andy Resnick said:
Information is a form of energy, just like heat and pressure.

Entropy is a form of energy. 'Information' is an abstraction of a physical state, which as such is necessarily subject to entropy.

Information is an abstract concept - not a physical thing. Information has entropy - as an abstract combinatorial property. The physical entropy is a property of whatever physical system is being used to represent the information. I don't see what would be gained by calling 'information' a form of energy. It's narrower than entropy, and confusing.

Also, depending on the storage medium, there's no reason to assume the two states '0' and '1' are equal in energy, so one can't really assume that the internal energy is determined by entropy alone.
 
  • #19
alxm said:
I don't see what would be gained by calling 'information' a form of energy.

Information theory has provided key insights into a number of systems (in addition to large portions of computer science and digital signal processing) including chemistry:

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC38575/pdf/pnas01521-0164.pdf

protein structure:

http://www.bioinf.cs.ipm.ir/IPM_mem...tion_of_Protein_Surface_Accessibility2001.pdf

and neuroscience:

http://web.mit.edu/annakot/OldFiles/MacData/afs.course.lockers/other/hst.722/www/Topics/Quantitative/BorstTheunissen99.pdf

alxm said:
Also, depending on the storage medium, there's no reason to assume the two states '0' and '1' are equal in energy, so one can't really assume that the internal energy is determined by entropy alone.

Of course- if the energy content of a '1' or '0' is different (say based on a number of electrical charges in a capacitor, or selection of energy level, or something else), then that must be taken into account as well. But we can also encode the information in a symmetric way, such that the information will persist even without external power supplied:

http://en.wikipedia.org/wiki/Magnetoresistive_Random_Access_Memory

The fact that the encoded information does not thermalize- that the data maintains its integrity over time without external power supplied, even though the device is kept at >0 K- is important to understand, and it also demonstrates the utility of the thermodynamics of information.
 
  • #20
The minimum energy needed to write 0101010101... is greater than that needed to write a random arrangement such as 10100101010110101011101001100101, yet the ordered string contains "less information" and, by your logic, less energy.

Therefore you are arguing that conservation of energy is violated. Failed logic.
 
  • #21
Information is by no means an abstract concept. If it were, we could send information faster than c and violate causality, cause paradoxes, win Nobel Prizes, the works.

Information is basically what makes x different from y, it's the state of a system. So an electron has less information than an atom of hydrogen. Hydrogen has less information than carbon, and so on. Matter and energy are information.

You cannot make this argument by considering ones and zeros. They are just representations of the magnetic states of portions of the drive. They are abstract symbolic constructs. They are irrelevant. You need to consider the physical state of the drive itself.

[edit: I approached the problem from the wrong direction. I changed my original post a lot:redface:]

An electrical current is needed to create a magnetic field. We all know that energy is conserved. Since the energy can be reclaimed, we need to consider the energy it took to write the data in the first place. This directly affects weight, since weight depends on mass, and mass can be converted to energy.

If we start with a drive that has had its contents deleted, then this drive will certainly weigh less than a drive that has been written fully. It takes very very little energy to delete information.

If we consider the drive being empty as it was shipped from the factory, then we'd need to know how much energy was used in the initial configuration of the magnets.
 
  • #22
Andy Resnick said:
Information theory has provided key insights into a number of systems (in addition to large portions of computer science and digital signal processing) including chemistry

I didn't say otherwise. Treating physical states in the abstract (as 'information') can be useful. I said I don't see that anything is gained by calling information a form of energy, since the energy in question is entropy. And the papers you linked to which deal with energy (and not all of them do) call it entropy.

If anything, you just helped make my case on how this is confusing.

Of course- if the energy content of a '1' or '0' is different (say based on a number of electrical charges in a capacitor, or selection of energy level, or something else), then that must be taken into account as well. But we can also encode the information in a symmetric way, such that the information will persist even without external power supplied

Are you implying that having different energy states for '1' and '0' would require an external supply of power? Not in reality. All you need is a system that equilibrates sufficiently slowly. If I were to encode data using, say, graphite and diamond for '0' and '1', it'd persist for billions of years at STP, if not longer.
 
  • #23
adaptation said:
Information is by no means an abstract concept. If it were, we could send information faster than c and violate causality, cause paradoxes, win Nobel Prizes, the works.

Information is an abstract concept. So are numbers. Etc.
When they talk about 'information can't be transferred faster than c', they're using 'information' as an abstraction of a physical state. It's the physical state at A that cannot influence the physical state at B faster than the speed of light.

Information is basically what makes x different from y, it's the state of a system.

That's like saying "a [natural] number is what determines how many of something you have". You're saying that an abstract generalization of properties is those properties themselves. Information about something is not the thing in-itself. By that sort of reasoning (which was common among the ancient Greeks), a void cannot exist because it's "nothing". Read up on semiotics.

You cannot make this argument by considering ones and zeros. They are just representations of the magnetic states of portions of the drive. They are abstract symbolic constructs. They are irrelevant. You need to consider the physical state of the drive itself.

Which is what I originally said.
 
  • #24
Information is not an abstraction. The representation of information is. If I took a bunch of water, carbohydrates, iron, calcium, etc. and tossed it into a container, it would not make a person. The configuration of the materials, the identity that their assemblage makes, is information. We, you and I, are composed of all the same stuff, and yet we are different. The fact that we are different is not abstract. This is information.

All matter/energy is information. These are not my ideas. It's discussed by Benjamin Schumacher of Kenyon College in his course (http://www.teach12.com/ttcx/coursedesclong2.aspx?cid=1299).

alxm, you are correct. In information theory (en.wikipedia.org/wiki/Information_theory), information can be measured as entropy rather than as energy. Energy is information, but not the other way around. I would be very interested to read a link that shows otherwise. I am inclined to believe that information is energy, but I haven't been convinced yet.
 
  • #25
Andy Resnick said:
I don't understand why you consider that entropy, which has units of J/K, or the entropy of a bit, which is kT ln(2) and has units of Joules, is not energy (or energy/degree). Has introductory physics somehow become irrelevant?

I have never had a physics class. Can you point me to a source that says information is equivalent to energy? As I said before, I like the idea, but I have no reason to believe it.

"en.wikipedia.org/wiki/Entropy_(information_theory)"[/URL]

There are many different contextual uses of the word entropy. It seems like you are using this definition out of context. We are talking about information entropy. I have not come across the definition you are using as it applies to information, but again, this may be due to my lack of education.
 
  • #27
russ_watters said:
As far as the laws of thermodynamics are concerned, a hard drive that is "full" of 0's contains exactly as much information as one that is all random atmospheric noise and one that contains the Library of Congress.

alxm said:
I don't see what would be gained by calling 'information' a form of energy.

adaptation said:
information can be measured as entropy rather than as energy. Energy is information, but not the other way around.

I think what most bothers me about these (selected) responses is the complete and continuing lack of evidence used to justify these outrageous claims. These statements are each contradicted by the entire body of thermodynamics and statistical mechanics. I have provided reference after reference in an attempt to demonstrate that the information encoded in a signal is equivalent to the entropy of the signal, and, since entropy is a form of energy, that information is therefore equivalent to mass. Perhaps these references are considered suspect; nobody has objected to them.

Surely, nobody would object to me stating (3/2)kT = (1/2)mv^2 for the average kinetic energy of a gas molecule: that's the basis of most of the ideal gas laws. And nobody would object to me assigning a thermal energy kT an equivalent mass m' = kT/c^2, as that's just mass-energy equivalence. Arguing against information theory is equivalent to arguing against statistical mechanics- and I wager most people here think statistical mechanics is valid.
 
  • #28
In my searches the best I could find to support your claim was en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory. (Scroll down to find it.)

[quote="Andy Resnick, post: 2823637"]Your belief is not required- this is not a religion forum.

Here's a few sources:

[url]http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html[/url]

[url]http://ee.stanford.edu/~gray/it.pdf[/url][/QUOTE]

I appreciate the links. Both are worth the read, although I can't claim to have finished the book just yet...

This is from http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html:

The *thermodynamic entropy*, S, for this case is:
S/k = log(Ω_p), where Ω_p = number of equally probable microstates and k = Boltzmann's constant.
...Boltzmann's form of this equation is S = k ln(Ω_p), where Boltzmann's constant has SI units of J K^-1.

...

The *information theory entropy* (Shannon entropy), H, for this case is defined as:
H = log(Ω_p), where Ω_p = number of equally probable microstates.
...The entropy H quantifies the uncertainty about what message is to be received.

The emphasis was added by me to differentiate between information and thermodynamic entropy.

The idea of the article and the book is that information theory can be used to describe physical systems. There is no indication that thermodynamic entropy can describe information. It does not work both ways.

From the same source:
S and H (thermodynamic and Shannon entropy) are equivalent, in that S is directly proportional to H, and this is because the same conditions hold for both systems. S is reserved for thermodynamics, but H can be applied to any statistical system.
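That proportionality is easy to put into numbers (a sketch; the bit count and the temperature used for the energy scale are my assumptions):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

def thermo_entropy_from_bits(H_bits):
    """Thermodynamic entropy S (J/K) for H bits of Shannon entropy,
    using S = k ln(2) * H, i.e. both count the same microstates."""
    return k * math.log(2) * H_bits

H = 8 * 3e12                              # ~3 TB of maximally uncertain bits (assumed)
S = thermo_entropy_from_bits(H)
print(f"S = {S:.2e} J/K")                 # ~2.3e-10 J/K
print(f"T*S at 300 K = {300 * S:.2e} J")  # ~6.9e-8 J at room temperature
```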

The book will take me a considerable amount of time to get through. There's a lot of material and a lot of the math is beyond me. Maybe you could refer me to a page number that supports your claim?

I would like to see something that does more than *link* one form of entropy to another. I'd like to see something that clearly states that information is energy, or that information entropy is energy. Until then, my "religion" will have to sustain me. :biggrin:

If I've misunderstood something, don't keep me in the dark.

[edit: The below was added to respond to Andy Resnick's post above mine. He's too quick!]

I posted links in my responses. If there is some error in my sources or my interpretation of them, please let me know.

I have addressed your sources in my post above.
 
  • #29
adaptation said:
The idea of the article and the book is that information theory can be used to describe physical systems. There is no indication that thermodynamic entropy can describe information. It does not work both ways.

Try reading the whole page:

"S and H (thermodynamic and Shannon entropy) are equivalent, in that S is directly proportional to H, and this is because the same conditions hold for both systems. S is reserved for thermodynamics, but H can can be applied to any statistical system. As Shannon and Weaver wrote:

“...the quantity which uniquely meets the natural requirements that one sets up for ‘information’ ... turns out to be exactly that which is known in thermodynamics as entropy.”

The entropy S is a state function of a thermodynamic system, but it can't be directly measured like pressure and temperature. There is no entropy-meter; entropy must be inferred by varying the state of a system near equilibrium and observing how other thermodynamic variables (pressure, temperature, etc.) respond. This is one reason why the statistical mechanics interpretation of entropy is so important:

"[The] ability to make macroscopic predictions based on microscopic properties is the main asset of statistical mechanics over thermodynamics. Both theories are governed by the second law of thermodynamics through the medium of entropy. However, entropy in thermodynamics can only be known empirically, whereas in statistical mechanics, it is a function of the distribution of the system on its microstates." (from statistical mechanics)

It might seem like this statistical interpretation of matter can cause matter to be "influenced" by our knowledge, or lack of knowledge, of its microstates. What does information or knowledge about microstates have to do with how a steam engine works! But this train of thought is a result of a misperception of microscopic states in nature. Which microstate a particle system is in is irreducibly (inherently) uncertain, in the same sense that the position and momentum of individual particles are uncertain (Heisenberg's uncertainty principle). All we know about a steam engine is the possible microstates for any given macrostate, and we know nothing of the particular microstate.

The fact that entropy almost always increases or stays the same (the second law of thermodynamics) is a statistical statement about the uncertainty of a particle system's microstate."
 
  • #30
Can someone explain to me how entropy is entering?

Entropy is the log of the number of microstates for a given macrostate. The macrostate of the drive is specified by its contents - not the microstate.
 
  • #31
Information theory defines the entropy in terms of a randomly fluctuating variable:

http://en.wikipedia.org/wiki/Shannon_entropy

Landauer (1961, IBM J. Res. Develop.) pointed out the relationship between acquiring, processing, and deleting information and free energy. When a bit is erased (at temperature T), at least kT ln(2) of energy is dissipated into the environment.

Reading a memory state is a sequence of measurements (which may be reversible, but may not be) and the source of the signal is considered to be transmitting entropy at a certain rate.

Some people get confused, trying to relate the *change* in entropy associated with reading a bit- which decreases the receiver's uncertainty- with the *absolute* entropy of the message itself. The absolute entropy is given by Kolmogorov (1965, Prob. Inform. Transmission) and relates to the *minimum* number of bits required to specify the memory state. Thus, the entropy transmission rate is *not* just the data transfer rate, but also reflects the transmission rate of the information (algorithmic information).

But the bottom line is that different messages have different entropies, and thus different energies, and thus different masses. At this point it is helpful to calculate: at room temperature, writing a random string of 3*10^12 bits requires at least (3*10^12)*kT ln(2) ≈ 8.6*10^-9 Joules (a 3 TB drive actually holds about 2.4*10^13 bits, which only raises this by a factor of eight). When I read the message, my free energy is increased by that amount. If I erase the memory, I must dissipate that amount of free energy. And since I can read the message in a closed box, the transfer of energy is between me and the memory.

Dividing by c^2, 8.6*10^-9 J corresponds to roughly 10^-25 kg. Good luck trying to measure that.
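For concreteness, a quick check of those numbers (a sketch; the temperature and the 3*10^12 bit count are carried over from above):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
c = 2.99792458e8   # speed of light, m/s
T = 300.0          # room temperature, K

n_bits = 3e12                          # the post's figure (roughly, terabits)
energy = n_bits * k * T * math.log(2)  # Landauer-type bound for n_bits
mass = energy / c**2                   # equivalent mass via E = m c^2

print(f"energy: {energy:.2e} J")   # ~8.6e-09 J
print(f"mass:   {mass:.2e} kg")    # ~9.6e-26 kg
```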
 
  • #32
Andy Resnick said:
Try reading the whole page:

I read the page several times. Everything you posted supports what I have been saying: information entropy can describe physical systems; thermodynamic entropy cannot describe information. The equivalence only works in one direction.

Your source:
S is reserved for thermodynamics, but H can be applied to any statistical system.

If this is contradicted later or earlier in the page, please highlight the specific passage. If it is not, please do not imply that I used (or understood) your source out of context.

You have still not posted anything that clearly states that information is energy. Like I said, I want to believe that information is energy, but I've no proof. I am on your side here, but you're not working with me.

If you can't find a source to support what you claim, you can describe an experiment that starts with abstract information (not matter, not thermal energy, not EM energy, not sound energy, not nuclear energy, not potential energy, etc.) and extracts work from it. If this experiment were successful, it would certainly prove your case.

Vanadium 50 said:
Can someone explain to me how entropy is entering?

Entropy is the log of the number of microstates for a given macrostate. The macrostate of the drive is specified by its contents - not the microstate.

Information entropy entered into the discussion because entropy can be a measure of information in information theory. This was introduced in an attempt to prove that information is energy.
 
  • #33
Andy, you are missing an important point. In thought experiments of information entropy, such as Szilard's engine, the information is ABOUT another given microstate. The information on a hard drive can be "useless" or it can contain a movie, documents, etc. There is no work that can be done with that type of information. However, if you wrote down the arrangements of the magnets of the hard drive on a piece of paper, then yes, you have information about the state of the hard drive.

However the hard drive doesn't store information about itself. I don't know if I'm explaining this very well, but you see the point?
 
  • #34
adaptation said:
If you can't find a source to support what you claim

Hmmm... Let's try this- what do you think I am claiming?
 
  • #35
Curl said:
Andy, you are missing an important point. In thought experiments of information entropy, such as Szilard's engine, the information is ABOUT another given microstate. The information on a hard drive can be "useless" or it can contain a movie, documents, etc. There is no work that can be done with that type of information. However, if you wrote down the arrangements of the magnets of the hard drive on a piece of paper, then yes, you have information about the state of the hard drive.

However the hard drive doesn't store information about itself. I don't know if I'm explaining this very well, but you see the point?

I think you are confusing the state of a memory device with the device itself. It doesn't matter in what format the information is stored, or how the information is represented: binary, words, pictures, video...

Information theory is a theory about the *transfer* of information. How information flows from one system to another, or how it flows within a system. If you hand me a memory stick, unless I can read the information stored within it, it doesn't matter if you've handed me a photo of bigfoot, the business plan for Google, the recipe for coca-cola, etc.

Again, it's possible to confuse the 'Shannon entropy'- which relates to the transfer of information from sender to receiver- with Kolmogorov's 'algorithmic information', which quantifies the (thermodynamic) value of the actual information.

Does that help?
 
