Weight difference between an empty and a full memory stick

In summary, a weight difference between an empty memory stick and the same memory stick with data can be argued for. This is due to the energy stored in the information on the memory stick, as well as the energy associated with entropy. The amount of information stored can also affect the mass of the device. However, the exact change in mass may be difficult to measure accurately.
  • #71
I'm not sure what point you're trying to make here.
Andy Resnick said:
It's important to remember that the existence of state variables (or even the existence of a thermodynamic state) comes before the definition of reversible, cyclic processes, not the other way around. Also, it's important to realize that the flow of heat can have material effects other than a change in temperature.

So, yes- the actual, measured change in entropy accumulated after a process has occurred depends on the specific process that takes a system from state 1 to state 2. However, the lower bound (the equality in the equation) of the change in entropy is invariant to coordinate changes, satisfies the principle of material indifference (the response of a material is independent of the observer), and also satisfies the principle of equipresence (all observers agree on the set of independent variables).
If I understand correctly, you seem to be essentially saying that participating in a process that involves flow of heat can have entropy effects on the material. Putting aside whether that is the case or not - how would that support any point of yours?

The Clausius-Duhem inequality, like the 2nd law, only works in one direction: up, towards irreversibility, towards the increase of entropy. But your claim was that physical entropy decreases (and mass increases) with "information". It should therefore be obvious that you can't invoke the Clausius-Duhem inequality to justify the alleged effect.

And even if we ignored that the direction is wrong, this wouldn't make anything depend on the data there is, but at best on what the memory has been through. But that has nothing to do with the subject of the thread.
 
  • #72
Andy, what do you propose happens to the mass of a system when a bit is changed from a 1 to a 0 and then back to a 1? You have been arguing that there is a lower bound to the change of entropy/energy when a system changes states. Is it not implied that the lower bound must be greater than zero?

A system in the exact same state must have the same mass, no? So no mass can be added to a system in an irreversible way simply by flipping bits. There can't be anything inherent to the process of flipping a bit that adds mass to the system. If both states 0 and 1 are at the same energy level, as is the case with objects that are simply moved horizontally from one place to another, then the content of the drive, the sequence of the 1s and/or 0s, would be totally irrelevant to the drive's mass.

I don't see where entropy even enters the discussion unless entropy is supposed to allow us to ignore conservation of energy. Ignoring losses to inefficiency, which are not stored within the system, no net work is done when moving an object horizontally or flipping bits between two equivalently energetic states.

Am I missing something here?
 
  • #73
kote said:
A system in the exact same state must have the same mass, no? So no mass can be added to a system in an irreversible way simply by flipping bits. There can't be anything inherent to the process of flipping a bit that adds mass to the system. If both states 0 and 1 are at the same energy level, as is the case with objects that are simply moved horizontally from one place to another, then the content of the drive, the sequence of the 1s and/or 0s, would be totally irrelevant to the drive's mass.

Objects moved horizontally from one place to another are operating under the force of gravity, a conservative force. It just means there's no work done against gravity. There are plenty of other forces involved in the real world, so we do actually do work when we move an object from one place to another (we have to overcome its inertia twice: once to start it in motion and once to stop it).

In a real hard drive, you have complicated solid-state physics going on. I have no idea how they store 1's and 0's in a hard drive, but in order to make them distinguishable from each other, I assume they'd have to occupy different energy states.

When I built simple logic circuits, we used a voltage of 5 V to represent the 1 and some millivolts to represent the 0. Even if you assumed that we always had the "same" power through the relationship P = IV (the current goes up to compensate for the low voltage, so the power ideally stays the same), you'd have to realize that the dynamic processes of switching from high voltage/low current to high current/low voltage are not equivalent once you consider the added problems of thermodynamics and entropy.

So you can't just consider the final state anyway, you also have to consider the dynamic process that allowed the states to change to where they are in the final state.
 
  • #74
Pythagorean said:
So you can't just consider the final state anyway, you also have to consider the dynamic process that allowed the states to change to where they are in the final state.

The current mass of a system depends only on its current state, not on the processes that led to its current state. If the current state is the same as a previous state, then the mass is the same as it was in the previous state.

Also, the moving horizontally thing is hypothetical. What's more important is that when you flip a bit you can flip it back and end up in the same state you started in. Of course work is done in the real world to flip a bit and flip it back, but that work is all lost as heat to the environment and doesn't change the state of the system besides heating it temporarily.
 
  • #75
kote said:
The current mass of a system depends only on its current state, not on the processes that led to its current state. If the current state is the same as a previous state, then the mass is the same as it was in the previous state.

Also, the moving horizontally thing is hypothetical. What's more important is that when you flip a bit you can flip it back and end up in the same state you started in. Of course work is done in the real world to flip a bit and flip it back, but that work is all lost as heat to the environment and doesn't change the state of the system besides heating it temporarily.

Actually that work is lost in addition to work lost through heat from solid state collisions in the conducting material. We're talking about a system of bits, not a single bit (but even with a single bit, the work to go from 0 to 1 is not necessarily the same as to go from 1 to 0. I see no reason at all to assume only conservative forces are involved).

Anyway, the system is more ordered (has less entropy) for the empty disk drive. The system is less ordered (has more entropy) for the full disk drive. Everything else being constant, the difference in entropy implies a difference in energy (Gibbs).

A difference in energy means a difference in mass (E=mc^2)
 
  • #76
One128 said:
I'm not sure what point you're trying to make here.

<snip>

This thread has developed into a long and winding road, so sure- a recap is in order.

In my first post (#9), I answered the OP in the following way:

"since the empty (which I interpreted to mean devoid of information) memory device has less entropy than the full memory device- one bit of information is associated with kT ln(2) units of energy- the energies are different and so the mass is different"

This caused several objections to be raised, mostly along the lines of "Wait! Information content is observer-dependent, and besides, the 'information entropy' is different than the 'thermal entropy' and so the 'information entropy' does not correspond to a difference in energy."

Every post I have made since then has been addressing the various objections: that the information content of a signal is different from the *encoding* of information in a signal (a subtle point), that 'information entropy' is no different from 'thermal entropy', and that the information content is not observer-dependent. Sometimes my explanations are clearer than at other times.

My line of reasoning follows that of rational continuum mechanics and rational thermodynamics, subjects that are sometimes unfamiliar. The advantage is that the arguments and conclusions are material- and process-independent.
 
  • #77
Hi Andy, Thanks for the recap. I’d like to try a recap for myself and perhaps find out if I’m missing a point. Note that one axiom I think we’ve all taken for granted here is that this “memory card” is a hypothetical physical system. No one yet seems to have made the point of narrowing this down to an actual memory card that might be used on a conventional computer. I’m fine with that actually, though it would be nice to understand how a conventional memory card works.

Another axiom I think we’ve all been assuming is that ‘weight’ in this context is a summation of mass plus energy. I’ll keep using this axiom unless it’s challenged.

1. I’m sure we all agree that to decrease entropy, energy must be added. When that happens, the total mass plus energy for this system increases. To answer the OP in this case is to say that to decrease entropy, weight will increase for this closed system. (I don't know yet whether this has anything to do with information or not.)

2. What I’m not sure about is that given we isolate this memory card from the environment and it undergoes a decay with no energy input, and assuming the physical entropy increases, the memory card might decrease in equivalent mass. In this case, mass plus energy is conserved, so I’m not sure one can claim that a simple increase in entropy of a closed system will necessarily lead to a decrease in that system’s total weight or mass. That problem needs to be addressed separately.

3. Another point I’ve seen suggests that if energy is added to a closed and isolated system, then it doesn’t matter if entropy increases or decreases for that system. The end result is that ‘weight’ must increase. This is obviously problematic if one wants to suggest that weight decreases when energy is added and entropy decreases. Given ‘weight’ being mass plus energy, the addition of energy requires an increase in weight regardless of whether entropy increases or decreases unless #2 above can somehow be proven.

4. Yet another point suggests that information entropy may or may not correspond to thermodynamic entropy. I suspect folks going into physics aren’t very familiar with “information” entropy. I don’t have any idea what information entropy is but I suspect it has nothing to do with this. I haven’t seen anyone yet quote a paper to defend this correlation. There have been many quotes of the literature, but I don’t see a single one that really brings this argument out and properly defends it one way or the other.

5. I’d like to add one more stick to the fire. Assuming we are considering this “memory stick” to be a hypothetical system as opposed to a real memory card, I’d like to go back to the true, thermodynamic definition of entropy as was discussed earlier. Thermodynamics uses the concept of control volumes. This concept is a philosophical one really, as it has considerable unwritten assumptions. Those assumptions include that a control volume is unaffected by anything external to the control surface. This follows from locality: nothing can influence the goings-on inside a control volume without some causal influence passing the control surface. We can break up any given closed or open physical system into control volumes and show that the entropy within any given control volume is independent of what’s happening external to the control surface. Given this is true, the entropy of a switch in one of two possible positions is independent of anything external to the control surface. Since this is true, the entropy of a memory card is a simple summation of the entropy of the system's individual control volumes. And if the system's total entropy is a simple summation of the entropy of its individual control volumes, then any information we claim the system has is observer relative (or observer dependent).

There might be another argument that’s been posted, but I’m starting to fade… :smile:
 
  • #78
Q_Goest,

Your points are well-taken. Memory sticks use something called 'flash RAM', similar to an EEPROM. I looked up how they work, but I don't really understand it other than (1) power is not required to maintain the memory state, (2) an empty state is '1', and (3) the difference between '1' and '0' is the level of current. The only other relevant piece of information is that they can withstand a limited number of erase-write cycles, but possibly an infinite number of read cycles. That is a significant fact, and gives clues about the relevant thermodynamics.

A minor point- mass and energy are *equivalent*. 'Weight' is not mass plus energy, it's mass/energy plus gravity.

For point (2), it's not obvious what will happen. Clearly, writing or reading involves interaction with another object, but then the memory stick can be unplugged and stored in a drawer for years. The question is what is the fidelity of the encoded data over that time. I suspect that the contents of the unplugged memory stick can be pretty well considered isolated from the environment, but again, if the data becomes degraded, that indicates the system is not in perfect isolation.

For point (4), I'd recommend reading Shannon's original paper (1948) as it is quite readable. It's crucial to understand how 'information' is quantified. White noise has a lot of information, while a pure tone does not. A randomly-generated string of ASCII has maximal information, but readable text does not. That is counterintuitive, but nonetheless, that's the definition.

Points (3, 5) require a bit of thought to answer. You are right that a control volume is a fundamental tool of thermodynamics, but it's no more mysterious than the control volumes used to prove Gauss's law. In order to change the state of a bit, something must flow through the boundary of the control surface.

'Entropy' is, unfortunately, as ill-defined as 'heat' and 'energy'. What *is* energy? I have no idea. I do know that the total energy of an isolated system is conserved, but the energy can be partitioned into a 'work' part (PV, electromagnetic, chemical, information, etc.) and a 'heat' part. I don't know what 'heat' is either, but I know that it can act on a body to either change the temperature (the specific heat) or effect some other change (latent heat- which is not limited to a simple change of phase), or some combination. What 'heat' cannot do (by itself) is perform work. 'Heat' is related to 'entropy' while 'work' is not. If I do work on a system to organize it, to structure it, or to somehow select out a specific state, then I have decreased the entropy of that system (at the expense of increasing it somewhere else).

An example I like to use in class is the folding of a protein: in order for a protein to be functional, it must assume a specific folded configuration. Proteins (like other living systems) operate at constant pressure and temperature- this is why they are good model systems for illustrating certain concepts. What is the difference in energy between a folded and unfolded protein? What about the entropy? How is this energy encoded?

Instead of thinking of the internal energy of a system as the potential energy, or the mass energy, it can be thought of as 'configuration energy': the configuration of a system has energy associated with it. Folded and unfolded proteins have different configurations, so they have different energies. Unfolded proteins are much more mobile (structurally), and so they have a higher entropy (larger number of equivalent states).

So it is with the state of a memory device. Since EEPROMS have a limited number of erase-write cycles, the entropy of the memory must be increasing each time a bit is written- I do not know how this entropy is manifested (changes in resistance, noisier signals, some other effect), but the overall entropy of the device must increase. Reading the data is interesting, and I'm not sure how to understand that process yet.
 
  • #79
Are you a physical chemist Andy? (Or a chemical physicist?)
 
  • #80
In the original Shannon paper (http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf), the "bit" is the unit used to measure information.
The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.
Information is measured in bits. 8 gigabits of information is 8 gigabits of information.

Since a flash drive has a static number of bits, it is always storing the same amount of information. The amount of information, being constant, is irrelevant to any changes in the mass of the drive.

The rest of the article talks about compressing predictable patterns down to a minimum number of bits, and how many bits are required to represent different patterns. All of that has to do with random sources and partially predictable results, none of which seem to apply here.
 
  • #81
Pythagorean said:
Anyway, the system is more ordered (has less entropy) for the empty disk drive. The system is less ordered (has more entropy) for the full disk drive.

To get back to the original question then; how exactly do you define empty and full in this context?
If I "filled" a disk drive with one large graphic file would it be full?
If I then "deleted" that file, would it be empty?
If the drive was empty before I stored the file, in what sense is it "empty" after file deletion.
Do we have two types of "empty"?
 
  • #82
Stonebridge said:
To get back to the original question then; how exactly do you define empty and full in this context?
If I "filled" a disk drive with one large graphic file would it be full?
If I then "deleted" that file, would it be empty?
If the drive was empty before I stored the file, in what sense is it "empty" after file deletion.
Do we have two types of "empty"?

In practice, flash drives don't overwrite old data until they have to. They find an empty spot on the drive and put new data there. Once everything is full, they go back and wipe parts of the drive as needed before writing new data. When something is deleted, its bits are left as they are and some small portion of metadata changes to say "this junk here isn't a file anymore." When you delete a 10 GB file you may only flip 3 bits in the drive.

Even after you delete a file, that portion of the drive won't be used again until the factory-fresh portions are all used up. Because of this, no portion of the drive stays unused indefinitely, and the drive never returns to having regions that are all 1s or all 0s.

So, in practice, deleting a file has very little impact on the overall sequence of bits in the drive, which, after an initial break-in period, remains in an apparently chaotic/random state over the entire drive regardless of the actions of the user. Unless this is specifically played with by the user, after a break-in period, flash drives will always have near 50/50 1s and 0s in an apparently random order.

They are designed to work this way because of the limited lifetime mentioned by Andy. The drive lasts longest when the data writing is spread evenly over the entire drive and when bits are only flipped as needed.
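To make that concrete, here is a toy sketch of the allocation policy I'm describing (purely hypothetical and simplified; real flash controllers do much more):

```python
# Toy wear-leveling model: writes go to the least-worn free block;
# deletion only touches metadata and leaves the old bits in place.
class ToyFlash:
    def __init__(self, n_blocks):
        self.erase_counts = [0] * n_blocks   # wear accumulated by each physical block
        self.live = {}                       # file id -> physical block holding its data

    def write(self, file_id, data):
        free = [b for b in range(len(self.erase_counts)) if b not in self.live.values()]
        if not free:
            raise RuntimeError("toy model: no free blocks left")
        block = min(free, key=lambda b: self.erase_counts[b])   # least-worn free block
        self.erase_counts[block] += 1        # the erase before the write wears the block
        self.live[file_id] = block           # the data itself is ignored in this toy

    def delete(self, file_id):
        self.live.pop(file_id, None)         # flip a little metadata, nothing else
```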
 
  • #83
Pythagorean said:
Are you a physical chemist Andy? (Or a chemical physicist?)

Heh... I stopped labeling myself a long time ago. My dissertation was straight-up physics; fluid mechanics and optics. I guess people call me a biophysicist now... I just tell people I'm a scientist.
 
  • #84
kote said:
<snip>

The rest of the article talks about compressing predictable patterns down to a minimum number of bits, and how many bits are required to represent different patterns. All of that has to do with random sources and partially predictable results, none of which seem to apply here.

Setting aside the choice of base (perhaps base 255 would be better for ASCII, for example, or base 10 for decimal numbers), the paragraph above is exactly why image compression (or data compression in general) is analyzed in terms of 'entropy', 'lossless', etc. Again, the information content of a signal is different than how that signal is encoded.
 
  • #85
Stonebridge said:
To get back to the original question then; how exactly do you define empty and full in this context?
If I "filled" a disk drive with one large graphic file would it be full?
If I then "deleted" that file, would it be empty?
If the drive was empty before I stored the file, in what sense is it "empty" after file deletion.
Do we have two types of "empty"?

I believe we've laid out the assumption that an empty disk is either all 1's or all 0's. IIRC, this is how you format it "government style". (There may actually be a pattern, like 10101010101, but that's still a very ordered state.)

I'm not sure if USB sticks are like hard drives, where, after a lot of use, even if it's "empty", it will still have remnants of the old files, because it never really deletes them unless you format it "government style". All it really does is flag that that part of the drive can be written over. There's software that can recover files that have been "deleted" this way but never actually erased. In this case, it could be more difficult to find out which state is more ordered, because the drive isn't really empty, it's just set to overwrite files you don't want anymore. Not sure if flash memory does this or not, but it might be something to consider.

So here, we're assuming that it's truly empty as in all 0's or all 1's or a very simple, repeated pattern of 1's and 0's. This is a very low information state. A file will have to store its contents as 1's and 0's in a much more complicated pattern to be able to actually store all the information (like words or a map of where pixels go), which will result in a not-so-trivial pattern of 1's and 0's. The disorder of the system will have increased.

Andy Resnick said:
Heh... I stopped labeling myself a long time ago. My dissertation was straight-up physics; fluid mechanics and optics. I guess people call me a biophysicist now... I just tell people I'm a scientist.

I got my undergrad in physics and am going interdisciplinary myself. One of my potential interdisciplinary advisers is a biophysicist. I have no idea what I will do for a dissertation yet though.
 
  • #86
Whether a thumb drive is written with actual data or random data makes no difference: same energy.

Same energy, same mass.
 
  • #87
kote said:
Since a flash drive has a static number of bits, it is always storing the same amount of information. The amount of information, being constant, is irrelevant to any changes in the mass of the drive.
And thanks to Pythagoras for his reply.

So with regards to the original question, there is no difference between full and empty in terms of the amount of information stored. In fact, full and empty have no meaning here.
To go back to a question I posed earlier. Say we consider a byte made up of 8 bits.

Pythagoras, you claim that 00000000 or 11111111 would be empty? A more random configuration of bits would contain more information?

But if I say that the byte is just storing a number and that

00000000 = 0
11111111 = 255
11010010 = 210 (arguably more random than the other two)

then each state holds exactly the same amount of information. A number between 0 and 255. So the memory location is always "full".

Now add a few million more identical memory locations and call it a memory stick or a hard drive. The drive can never be "empty". Full and empty have no meaning.
So how does this impact on the original question? "Is a full memory stick "heavier" than an empty one?"
Are we saying that any mass/energy difference is not a result of any stored information?
After all, I can, on a whim, define the information content of that byte any way I want.
 
  • #88
Stonebridge said:
And thanks to Pythagoras for his reply.

So with regards to the original question, there is no difference between full and empty in terms of the amount of information stored. In fact, full and empty have no meaning here.
To go back to a question I posed earlier. Say we consider a byte made up of 8 bits.

Pythagoras, you claim that 00000000 or 11111111 would be empty? A more random configuration of bits would contain more information?

But if I say that the byte is just storing a number and that

00000000 = 0
11111111 = 255
11010010 = 210 (arguably more random than the other two)

then each state holds exactly the same amount of information. A number between 0 and 255. So the memory location is always "full".

Now add a few million more identical memory locations and call it a memory stick or a hard drive. The drive can never be "empty". Full and empty have no meaning.
So how does this impact on the original question? "Is a full memory stick "heavier" than an empty one?"
Are we saying that any mass/energy difference is not a result of any stored information?
After all, I can, on a whim, define the information content of that byte any way I want.

But you're using a highly interpretive definition of information, useful to humans. This is not the definition of information we're using.

1) We are assuming that the state of the bits corresponds to the physical state of the system (you have to make physical changes in the hardware to represent the bits). We'd have to have a very specific kind of expert to answer that question (one who knows how USB sticks work physically). I don't think it's very far out there though. In my experience with simple gate logic circuits, it is definitely true: the 1 and the 0 correspond to different states (a higher voltage with a 1, a lower voltage with a 0). But I don't know about the micro-circuitry of the USB stick.

2) Information, in this context, pertains to the variety of states in the system. We're talking about the physical system that represents the bits, operating under assumption 1) above. If the system has all of its particles in the same exact state, it is a high-order, low-entropy system. If its particles occupy many different states, it is a low-order system with high entropy. This is physics.

3) The Gibbs free energy is where my intuition breaks down. I've been assuming the only form of it I'm familiar with: G = H - TS, where T is temperature and S is entropy. So you can at least see that entropy and energy are related. However, I don't know if this simple relationship works for dynamic situations, and furthermore, I don't know if H and T are really constant or if they somehow shift to make the energy ultimately the same.

my confidence:
1) fairly confident
2) fairly confident
3) no idea

By the way, I'm not saying the mass changes for sure. We've been answering the question "how would it work". We'd need a tech expert for 1), and a thermo expert for 2) and 3).
 
  • #89
Q_Goest said:
<snip> I haven’t seen anyone yet quote a paper to defend this correlation. There have been many quotes of the literature, but I don’t see a single one that really brings this argument out and properly defends it one way or the other.

<snip>

I dug out a book from the back corner, by John Pierce, "An Introduction to Information Theory". It's a Dover book, so it has all the usual good qualities. I can't think of a better book to start learning this material.
 
  • #90
Pythagorean said:
2) Information, in this context, pertains to the variety of states in the system. We're talking about the physical system that represents the bits, operating under assumption 1) above. If the system has all of its particles in the same exact state, it is a high-order, low-entropy system. If its particles occupy many different states, it is a low-order system with high entropy. This is physics.

More accurately, entropy (in the thermodynamic sense) does not quantify how many states the particles do occupy, but how many states they can occupy, in a given macrostate. In a system where 0's and 1's are encoded symmetrically (same energy level etc.), particles can occupy the same number of states whether any single bit is 0 or 1 - the particular set of possible states changes when a bit value changes, but its size remains the same. In this context, the bit configuration is a part of the macrostate specification - it is fixed and known - and thus does not contribute to the entropy of the system.

In a somewhat different context, the exact bit configuration could be a part of the entropy, but what would matter is how many bits are known down to (any) value and how many are "undefined" (more accurately, how many different bit configurations there can be, under given constraints) - the particular pattern that the known 0's and 1's form would again be irrelevant with regard to entropy.

Even the latter context could be made thermodynamic - you could theoretically construct some weird machine that would operate on uncertainty of individual bit levels - but such a machine, while strictly conforming to basic thermodynamic laws, would be "Maxwell-demonian" - the relationship between entropy and energy would not be as straightforward as in classical thermodynamics (which makes some assumptions that wouldn't be satisfied here) and consequently, the total energy would still be the same, regardless of bit configuration. (If everything else was the same and construction details didn't entail inherent energy difference between configurations.)
 
  • #91
I think you are focusing on the wrong thing. I encode some information on a memory stick and give it to you. Instead of focusing on a string of '1' and '0' numbers, let's pretend the memory stick encodes base 27: all lowercase letters and a space.

Two preliminaries: I have many more choices of 30-'bit' messages to send than I do 1-'bit' messages. In fact, there are only 2 1-bit messages possible: 'I' and 'a'. There are considerably more 30-bit long strings I can send. So you, the receiver, ascribe more entropy to the 30-bit long message than the 1-bit message. Also, there are differences in uncertainty if you are reading along and encounter (for example) a 'q' rather than a 'c': 'q' is almost always followed by 'u', while 'c' can be followed by many more letters.

Now, before you object that the information in the code is 'observer dependent' because I chose English text, the above argument can be brought back to 1's and 0's by me sending you a (binary) message which is the sequence of coin flips, 1 = heads and 0 = tails. There are many more possible results of 30 coin flips than 1 coin flip, although you lose the notion of 'predictability'.

The entropy is defined the exact same way it usually is in statistical mechanics: S = -Sum(p log p).

I can encode messages all kinds of ways; it is possible to use entropy to determine the most efficient way of coding by calculating Sum(p log p):

http://en.wikipedia.org/wiki/Huffman_coding
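To make the Sum(p log p) calculation concrete, here is a minimal sketch in Python (the example string is arbitrary):

```python
import math
from collections import Counter

def shannon_entropy(message):
    """Entropy in bits per symbol: -Sum(p log2 p) over the observed symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# English-like text comes in below the equiprobable maximum for the
# 27-symbol alphabet (a-z plus space), which is log2(27).
print(shannon_entropy("the quick brown fox jumps over the lazy dog"))  # roughly 4.4 bits/symbol
print(math.log2(27))                                                   # about 4.75 bits/symbol
```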
 
  • #92
Pythagorean said:
<snip>

3) The Gibbs free energy is where my intuition breaks down. I've been assuming the only form of it I'm familiar with: G = H - TS, where T is temperature and S is entropy. So you can at least see that entropy and energy are related. However, I don't know if this simple relationship works for dynamic situations, and furthermore, I don't know if H and T are really constant or if they somehow shift to make the energy ultimately the same.

<snip>

The Gibbs free energy is generally used for systems at constant pressure and temperature, but the real utility of the free energy is that it can be used for open systems (which includes chemical reactions). If dG < 0, then the process can occur *spontaneously*, in slight contrast to dS > 0 (irreversible). The free energy change of a process dG is a measure of the driving forces present on a process.

Dill and Bromberg's book "Molecular Driving Forces" is pretty good, but I like Nicholls and Ferguson's "Bioenergetics" better. Bioenergetics is more focused on cellular respiration, but is more fun to read (IMO).
 
  • #93
One128 said:
More accurately, entropy (in the thermodynamic sense) does not quantify how many states the particles do occupy, but how many states they can occupy, in a given macrostate.

But isn't that what we're doing to the stick by applying voltages? Changing the number of available particle states. We're controlling the shape of the gradient of the energy potential that the particles are in; that's how we affect the particles. The particles do their thing after that, falling into particular states with particular probabilities given the new potential. Since we can predict the states, given a certain potential, we can utilize this information to make a physical register that holds the information of 1 or 0.

From an engineering standpoint, there's also no reason to believe that two different systems of particles can't make the same 0, since there are so many particles involved. In layman's terms, it could sum up to "a voltage greater than 5 means 1, and a voltage less than 5 means 0", but physically, we're talking about a huge number of different systems of particles that will pass for a 1 (or 0).

One128 said:
(If everything else was the same and construction details didn't entail inherent energy difference between configurations.)

Well, this may be bordering on philosophy, but I believe all information is necessarily being represented by a physical system. I can't think of any information that's not.
 
  • #94
Andy Resnick said:
Now, before you object that the information in the code is 'observer dependent' because I chose English text, the above argument can be brought back to 1's and 0's by me sending you a (binary) message which is the sequence of coin flips, 1 = heads and 0 = tails. There are many more possible results of 30 coin flips than 1 coin flip, although you lose the notion of 'predictability'.

The entropy is defined the exact same way it usually is in statistical mechanics: S = -Sum(p log p).

Entropy is the measure of uncertainty, and as such it is ultimately defined the same way in information theory as in statistical mechanics: as a measure of uncertainty. I believe I may have mentioned that a few times. The crucial thing is: the uncertainty about what, given what. That makes all the difference. Entropy depends on what you know and what you allow to change.

What is the entropy of a binary message which you know nothing about, not even its length, not even an upper limit of its length? It's infinite. What is the entropy of a message which you know to be 30 bits long, but nothing else about it? It's 30 bits. What is the entropy of a 30 bit message that you know to contain an odd number of ones? It's 29 bits. What is the entropy of a 30 bit message that you know to have the same number of ones and zeroes? About 27.2 bits. What is the entropy of a 30 bit message that you know encodes an English word in some particular scheme? You could work that out, probably something between 10 and 20 bits. What is the entropy of a 30 bit message that you know someone produced by tossing a coin? It's 30 bits. What is the entropy of a 30 bit message the contents of which you know? It's zero.

All of the messages mentioned in the last paragraph can in fact be the very same message: 100100010100001100110111101110. Entropy of this message depends entirely of what you know and don't know about it. That's what entropy measures - your uncertainty, given what you know.
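(If you want to check those figures, a quick sketch in Python:)

```python
import math

print(math.log2(2**30))              # 30.0  -> any 30-bit message, nothing else known
print(math.log2(2**30 // 2))         # 29.0  -> constrained to an odd number of ones
print(math.log2(math.comb(30, 15)))  # ~27.2 -> constrained to 15 ones and 15 zeroes
```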
 
  • #95
I don't understand what you are saying. For example, if I tell you I am sending you the results of 30 coin tosses, how can you, *a priori*, tell me bit #10 is a '1'? For that matter, how can you tell me the message has an odd number of '1's?

The missing part is the idea of *measurement*. The entropy changes once you perform a measurement.
 
  • #96
Andy Resnick said:
I don't understand what you are saying. For example, if I tell you I am sending you the results of 30 coin tosses, how can you, *a priori*, tell me bit #10 is a '1'? For that matter, how can you tell me the message has an odd number of '1's?

If I know you're sending me the result of 30 coin tosses, and nothing else, then the entropy of such a message - until I receive it - is 30 bits, and I can't tell the value of any bit. After I receive it, the entropy of the same message is zero, and I can tell the value of every bit. If I don't know beforehand that your message has an odd number of ones, then I can't tell you until I receive the message. If I know it beforehand, I can tell you beforehand.

Entropy depends on what you know. If you assume that you know nothing except the length of the message, then every 30-bit message will have an entropy of 30 bits. That doesn't change the fact that if you know more (for example if you know details about the process producing the message that makes some outcomes less likely - for example, if you know the process only produces texts in English), then the entropy of an identical message can be different.

Andy Resnick said:
The missing part is the idea of *measurement*. The entropy changes once you perform a measurement.

Entropy is a measure of your uncertainty. If something reduces that uncertainty, it reduces entropy. Measurement is a vague term; if by "measuring" you mean reading bits of the message, then sure, that eliminates uncertainty about the message and reduces entropy. If by "measuring" you mean determining statistical properties of the message source, then again, that can reduce uncertainty and entropy. - If you measure something that doesn't reduce your uncertainty, then it doesn't reduce entropy.

Anything that affects probability distribution affects entropy, and it's not limited to "measurement". For example, if you have gas filling a container, and the same amount of gas filling half a container (held in place by some barrier), the entropy of the latter is less. Not because you measured the location of individual particles, but because you restricted their possibilities. Although you still don't know their exact location, you know something about them - that they are not in that part of the container. That reduces your uncertainty and entropy. - Similarly, if the contents of a binary message is somehow restricted, the entropy of the message will be less assuming that restriction.
 
  • #97
I am not disagreeing with a single word you said here- when you receive the message (or read the memory stick), the entropy (of you + the memory stick, since the two of you are isolated from the rest of the universe) changes.
 
  • #98
Is there an established physical basis for information theory?

Anytime we have two physical systems (of particles) interacting that we can frame thermodynamically, could we use all the techniques of information theory to discover more about their system?

Could we, in a reductionist way, reduce all applications of information theory to physical processes? If we did, would the two theories produce the same entropy?
 
  • #101
Hi Andy,
Andy Resnick said:
Parenthetically, I am opposed to the reductionist approach-
Do you deny that the entropy of any physical system can be determined from the simple summation of the entropy of the individual parts? I believe that's what one has to defend in order to suggest that entropy can vary depending on the sequence of the physical states of a system (ie: the information entropy). But that's not true; the entropy of a physical system is a simple summation of the entropy of its parts. For example, if the physical state of a 1 has the entropy value of 1 J/K and the physical state of a 0 has the entropy value of 0.1 J/K, then regardless of how the 1's and 0's in a system are arranged, if two systems have the same number of 0's and 1's and the two systems have the same, uniform temperature, the two systems also have the same entropy. For example, there is no difference in the entropy of the following two systems:
1111100000
1010010011
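(With those illustrative figures, both sequences come to 5 * 1 J/K + 5 * 0.1 J/K = 5.5 J/K.)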

It may be difficult or impossible to pin down entropy values for the physical states of switches, so this may not seem obvious at first. However, the storage of information doesn't depend on the use of switches or flash drives. We could equally store information in the form of a pressurized gas. We could have small containers with pressure switches attached to them, and all we'd need to do is pressurize or depressurize the containers so the pressure switch could change state. The entropy of the gas in any given container is dependent only on the gas in that container, and not on the entropy of the gas in any other container. Similarly, I'm going to assume the entropy of any pressure switch correlates to one of two positions. Any remaining wiring also has the same entropy state when there is no current flowing through it (ie: when the system is "sitting on a table").

Note also that I'm assuming for this particular thought experiment, that the system has come to thermal equilibrium with its environment and is of uniform temperature. If one contests that the temperature must change when the information is read, I would agree*. When this system is 'read', we pass electric current through the wires and the wires must heat up. And if we change the state of any of the pressurized gas containers, there is also a change in the temperature of those containers. But once the system comes back to equilibrium with the environment (assuming an environment with an infinite thermal mass) the total entropy is still a simple summation of the entropy of the individual parts.

Also, if information entropy correlates to physical entropy, I'd still like to know what that correlation is. I've gone through the Shannon entropy paper and don't see that. Seems to me that information entropy is a useful analogy, but it doesn't correspond in any way to real entropy. I'd expect there to be a mathematical correlation such as between dynamic viscosity and kinematic viscosity (ie: Kv = Dv / rho) but I don't see any such correlation.


*I think we could expand on the concept of how energy is required to 'read' information, but that's probably out of scope for now at least.
 
  • #102
Q_Goest said:
Hi Andy,

<snip>
For example, if the physical state of a 1 has the entropy value of 1 J/K and the physical state of a 0 has the entropy value of 0.1 J/K, then regardless of how the 1's and 0's in a system are arranged, if two systems have the same number of 0's and 1's and the two systems have the same, uniform temperature, the two systems also have the same entropy. For example, there is no difference in the entropy of the following two systems:
1111100000
1010010011

<snip>

Also, if information entropy correlates to physical entropy, I'd still like to know what that correlation is. I've gone through the Shannon entropy paper and don't see that. Seems to me that information entropy is a useful analogy, but it doesn't correspond in any way to real entropy. I'd expect there to be a mathematical correlation such as between dynamic viscosity and kinematic viscosity (ie: Kv = Dv / rho) but I don't see any such correlation.

*I think we could expand on the concept of how energy is required to 'read' information, but that's probably out of scope for now at least.

I wonder if the conceptual error arises because you 'cheated'- you completely specified both bit sequences. Communication and information inherently require the idea of 'choice' (for the sender) and 'uncertainty' (for the recipient).

Again, the energy associated with one bit is kT ln(2). As another angle, think about communication over a noisy channel: Johnson noise. What is the noise temperature of a hot resistor? What does that mean for uncertainty? How does the noise temperature impact how fast a bit stream can be transmitted? How does noise temperature relate to 'thermodynamic' temperature?
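(For scale, at T = 300 K that figure works out to kT ln(2) = 1.38e-23 J/K * 300 K * 0.693, roughly 2.9e-21 J per bit, or about 3e-38 kg per bit via E = mc^2.)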
 
  • #103
Landauer and Bennett come to mind. Zeroing the memory costs you free energy.
See:
http://en.wikipedia.org/wiki/Landauer's_principle
http://en.wikipedia.org/wiki/Entropy_in_thermodynamics_and_information_theory
Szilard's engine, Landauer's principle

So if you take an idealized USB stick with an initial random state and zero it (or write some predefined sequence), that would cost you a minimum of Nbits * kT ln 2. Now, that free energy doesn't just disappear; the new total energy of the stick is going to be higher (the GR stress-energy tensor would change). Energy gravitates, so you can use mass-energy equivalence (E=mc^2) and calculate the increase in the equivalent mass of the USB stick.
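A rough order-of-magnitude sketch of that estimate in Python (assuming an 8 GB stick at room temperature; the numbers are purely illustrative):

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # assumed room temperature, K
c = 2.998e8               # speed of light, m/s
n_bits = 8e9 * 8          # assumed 8 GB stick ~ 6.4e10 bits

energy = n_bits * k_B * T * math.log(2)   # Landauer minimum to zero the whole stick, J
mass = energy / c**2                      # equivalent mass via E = mc^2, kg

print(f"{energy:.2e} J")   # ~1.8e-10 J
print(f"{mass:.2e} kg")    # ~2e-27 kg, roughly the mass of one proton
```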

Conclusions, for an idealized USB stick:
a) you have a USB stick with unknown to you content, you fill it with the data (or zeros) - 'mass' will increase.
b) you have a USB stick with known to you content, you change it to other known content - 'mass' will not change.
c) you have a USB stick with known to you content, you let it deteriorate - 'mass' will decrease.
 
  • #104
dmtr said:
Conclusions, for an idealized USB stick:
a) you have a USB stick with unknown to you content, you fill it with the data (or zeros) - 'mass' will increase.

First, any increase in energy/mass would be due to increased temperature; after the heat dissipates and the stick returns to environment temperature, you'll find its energy/mass unchanged. - That there isn't a lasting increase in energy is obvious if you consider that an idealized stick with a single bit can only ever be in two macrostates (keeping all variables but memory contents constant), so when you pass it down a row of writers, none knowing what the previous one wrote, its mass can't increase more than once in a row.

Second, it's not required that the temperature of the stick increases at all. Energy must be spent to erase information; theoretically, with proper construction, you could fill the stick with data without erasing anything in the stick - for example, you could use reversible operations to exchange your new contents with the one on the stick, and erase the newly acquired previous stick contents in your own memory later as it suits you, increasing your own temperature rather than that of the stick.

Third, change in entropy only implies change in energy under certain assumptions (namely the fundamental assumption of statistical mechanics) that are not satisfied in constructs like Szilard's engine or the Maxwell's demon. Consequently, in these scenarios, there is a (subjective) change in entropy without a change in internal energy. (Yes, it's counterintuitive because it seems to go against the second law, and no, second law is not broken when these scenarios are properly analyzed.) The situation with the stick would be the same - any subjective change in entropy merely due to subjective change in Shannon's entropy of the data contents would not imply a change in the stick's energy/mass.
 
  • #105
One128 said:
First, any increase in energy/mass would be due to increased temperature; after the heat dissipates and the stick returns to environment temperature, you'll find its energy/mass unchanged. - That there isn't a lasting increase in energy is obvious if you consider that an idealized stick with a single bit can only ever be in two macrostates (keeping all variables but memory contents constant), so when you pass it down a row of writers, none knowing what the previous one wrote, its mass can't increase more than once in a row.

Second, it's not required that the temperature of the stick increases at all. Energy must be spent to erase information; theoretically, with proper construction, you could fill the stick with data without erasing anything in the stick - for example, you could use reversible operations to exchange your new contents with the one on the stick, and erase the newly acquired previous stick contents in your own memory later as it suits you, increasing your own temperature rather than that of the stick.

Third, change in entropy only implies change in energy under certain assumptions (namely the fundamental assumption of statistical mechanics) that are not satisfied in constructs like Szilard's engine or the Maxwell's demon. Consequently, in these scenarios, there is a (subjective) change in entropy without a change in internal energy. (Yes, it's counterintuitive because it seems to go against the second law, and no, second law is not broken when these scenarios are properly analyzed.) The situation with the stick would be the same - any subjective change in entropy merely due to subjective change in Shannon's entropy of the data contents would not imply a change in the stick's energy/mass.

Yes. I agree. But I was talking about idealized memory which can be in 3 states - undefined/zero/one. For that type of memory theoretically there is a change in internal energy upon the transition between the initial undefined state and zero or one.

As to the real USB stick, as far as I remember, for FLASH memory we simply trap some electrons on top of the transistor gate. To estimate the change of mass, we can consider the change in the number (and hence the total mass) of these electrons. But this is rather boring.
 
