How Does Erasing Computer Memory Affect Entropy?

  • Thread starter jlmac2001
  • Start date
  • Tags
    Entropy
In summary, when a computer erases or overwrites a gigabyte of memory while keeping no record of the information, the process must create a certain minimum amount of entropy. If that entropy is dumped into an environment at room temperature, a corresponding minimum amount of heat must accompany it. Although the heat turns out not to be significant, the principle behind it is still worth considering.
  • #1
jlmac2001
I don't get what a computer has to do with entropy. Can someone explain this question?

A bit of computer memory is some physical object that can be in two different states, often interpreted as 0 and 1. A byte is eight bits, a kilobyte is 1024 (= 2^10) bytes, a megabyte is 1024 kilobytes, and a gigabyte is 1024 megabytes.

A) Suppose that your computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. Explain why this process must create a certain minimum amount of entropy and calculate how much.


B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
 
  • #2
Someone please reply!

I really need help with this. It's due tomorrow. Help please!
 
  • #3
Essentially, all of the energy a computer uses to complete its tasks is eventually released into the environment as heat, and that heat carries entropy with it.
 
  • #4
It would help if you were to write out the specific definition of "entropy" you are using. (I know several different ones that apply to different situations.)
 
  • #5
A good book that uses the computer to explain entropy is The Theory of Everything by Stephen Hawking.

Your library might have it, so it's worth a shot.

It's fairly short and informative, so it's worth reading it all. If you're lazy, the relevant part is somewhere in the last half.
 
  • #6
Hmm...this indeed appears to be quite an interesting question. (I am currently studying Thermal Physics!)

I think here we are meant to use the thermodynamic definition of entropy. More specifically, I think we are dealing with the equation S = k*ln(W), where S is entropy, W is the multiplicity (the number of microstates corresponding to a given macrostate), and k is Boltzmann's constant.

The trick here, I think, comes in calculating W. A gigabyte contains 8*(2^10)*(2^10)*(2^10) = 2^33 bits, and each bit has two possible states, so the memory as a whole has W = 2^(2^33) possible states. However, I am also not quite sure how to explain why there has to be a minimum amount.
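Here is a quick Python sketch of that bit count and the corresponding minimum entropy, assuming each erased bit contributes at least k*ln(2) (which is what the minimum works out to be):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# 1 GiB = 2^10 * 2^10 * 2^10 bytes, with 8 bits per byte (the problem's definitions)
n_bits = 8 * 2**10 * 2**10 * 2**10   # = 2**33 = 8,589,934,592 bits

# With W = 2**n_bits equally likely memory states, S = k*ln(W) = k * n_bits * ln(2)
S_min = k_B * n_bits * math.log(2)
print(f"{n_bits} bits -> minimum entropy created: {S_min:.3e} J/K")  # ~8.2e-14 J/K
```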

Regards,
The Keck
 
  • #7
The point is that information cannot be destroyed, according to the deterministic, time-symmetric laws of physics. Hence, when you instruct the computer to delete a gigabyte of information from its memory, each bit of that information must actually be transferred into some other form (typically heat, i.e. random thermal motion).

Similarly, an irreversible binary logic gate with two inputs and only one output must create at least a bit's worth of entropy (k*ln 2) for each operation, so you can find papers that calculate the amount of heat a particular CPU would produce if it had its maximum theoretical energy efficiency.
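For a sense of scale, here is a minimal sketch (Python; the 300 K room temperature is an assumed value) of the heat cost per erased bit that this argument implies:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # room temperature, K (an assumed value)

# Landauer's principle: erasing one bit dissipates at least k*T*ln(2) of heat.
q_bit = k_B * T * math.log(2)
print(f"minimum heat per erased bit at {T:.0f} K: {q_bit:.3e} J")  # ~2.9e-21 J
```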
 
Last edited:
  • #8
So essentially what you are saying is that jlmac2001's questions A & B cannot be answered using the formula I stated in my previous post [S = k*ln(W)], but will have to involve other, more complicated ones?

If that is so, it seems to me that we are reading too deeply into the question. I think this question can be solved without having to discuss things like the time-symmetric laws of physics.

Thanks for the quick reply!

Regards,
The Keck
 
  • #9
Keck, note that I actually did not say that any more complicated equations must be used; I merely explained in physical terms why that particular one is relevant, since nobody else (except perhaps JR) had done so. You, on the other hand, advised jlmac to "plug and chug", rushing to put numbers into an equation you had found without knowing "..how to explain why.." it works. That demonstrates only an ability to pattern-match the symbols in the question against the symbols in textbook equation lists.
 
Last edited:
  • #10
cesiumfrog,

Sorry if I sounded like a computer, just churning numbers without thinking. The reason is that my friends and I had already discussed (or at least tried to understand) the physical aspects of the question.

What I believe is happening is that when the memory is erased, you reduce the 'disorder' in the data (the local entropy), but increase the total entropy. Reducing the local entropy requires energy, and this gives rise to an increase in total entropy. (Can you argue that the process is irreversible, and so entropy has to increase?)

However, I am not sure if the heat created is significant.
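Putting the pieces together suggests it is not. A minimal sketch (assuming T = 300 K and the 2^33-bit count from earlier):

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
T = 300.0            # room temperature, K (assumed)
n_bits = 2**33       # bits in one gigabyte (binary prefixes)

# Q = T * dS, where dS = n_bits * k * ln(2) is the Landauer minimum.
Q_min = T * n_bits * k_B * math.log(2)
print(f"minimum heat dumped: {Q_min:.3e} J")  # ~2.5e-11 J
```

That is around twenty-five picojoules, many orders of magnitude below what any real computer dissipates, so the thermodynamic minimum seems utterly negligible in practice.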

Thanks and sorry again!

Regards,
The Keck
 

FAQ: How Does Erasing Computer Memory Affect Entropy?

How does the concept of entropy relate to thermodynamics?

Entropy is a measure of the disorder or randomness of a system, and it is closely related to the second law of thermodynamics. This law states that the total entropy of an isolated system never decreases over time, meaning that the system tends toward a state of maximum disorder.

What is the relationship between entropy and energy?

Entropy and energy are closely related concepts. In thermodynamics, entropy indicates how much of a system's energy is unavailable to do work. As entropy increases, the energy available to do useful work decreases, and vice versa.

How does entropy affect chemical reactions?

In chemical reactions, entropy plays a crucial role in determining the direction of the reaction. In general, spontaneous reactions proceed in the direction that increases the total entropy of the system and its surroundings, which is why reactions often favor products that are more disordered than the reactants.

What is the connection between entropy and information theory?

Entropy also has a connection to information theory, which is the study of how information is processed and transmitted. In this context, entropy is used to measure the uncertainty or randomness of a system. The higher the entropy, the more uncertain or unpredictable the system is.
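As a concrete illustration, here is a minimal Python sketch of the Shannon entropy (in bits) of a discrete probability distribution; the function name is just for this example:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit (maximal uncertainty)
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits (more predictable)
```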

How does the concept of entropy apply to the universe as a whole?

The concept of entropy also has implications for the universe as a whole. The second law of thermodynamics suggests that the universe is constantly moving towards a state of maximum disorder, or maximum entropy. This has led to theories about the ultimate fate of the universe, such as the heat death hypothesis.
