- #1
jlmac2001
I don't get what a computer has to do with entropy. Can someone explain this question?
A bit of computer memory is some physical object that can be in two different states, often interpreted as 0 or 1. A byte is eight bits, a kilobyte is 1024 (=2^10) bytes, a megabyte is 1024 kilobytes and a gigabyte is 1024 megabytes.
A) Suppose that your computer erases or overwrites one gigabyte of memory, keeping no record of the information that was stored. Explain why this process must create a certain minimum amount of entropy and calculate how much.
B) If this entropy is dumped into an environment at room temperature, how much heat must come along with it? Is this amount of heat significant?
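For the kind of arithmetic the problem is pointing at, here is a minimal sketch, assuming Landauer's argument: erasing N bits with no record cuts the number of possible memory states by a factor of 2^N, so by the second law at least ΔS = N k ln 2 of entropy must show up in the surroundings, and dumping it into a reservoir at temperature T carries heat Q = T ΔS. The 300 K room temperature is an assumed value.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, K

# One gigabyte = 1024^3 bytes = 2^33 bits, using the definitions in the problem.
N_bits = 8 * 1024**3

# Part A: minimum entropy created by erasing N_bits with no record kept.
delta_S = N_bits * k_B * math.log(2)

# Part B: heat that must accompany this entropy into a room-temperature environment.
Q = T * delta_S

print(f"bits erased:     {N_bits:.3e}")
print(f"minimum entropy: {delta_S:.3e} J/K")
print(f"minimum heat:    {Q:.3e} J")
```

Under these assumptions the heat comes out on the order of 10^-11 J, which is the point of the "is this significant?" question: it is tiny compared with what a real computer dissipates for other reasons.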