Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as such systems always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, to the macroscopically observable behavior, in form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
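Boltzmann's logarithmic law, ##S = k_B \ln W##, can be made concrete with a toy model. A minimal sketch (the 100-spin system and its macrostates are illustrative assumptions, not taken from the text above):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (an SI defining constant)

# Toy model: N two-state spins. The multiplicity W of a macrostate with
# n spins "up" is the binomial coefficient C(N, n); Boltzmann: S = k_B * ln(W).
N = 100
W_ordered = math.comb(N, 0)       # all spins down: exactly 1 arrangement
W_mixed = math.comb(N, N // 2)    # half up, half down: maximal multiplicity

S_ordered = k_B * math.log(W_ordered)
S_mixed = k_B * math.log(W_mixed)
print(S_ordered)             # 0.0 -- a unique microstate carries zero entropy
print(S_mixed > S_ordered)   # True -- more arrangements, higher entropy
```

The macrostate compatible with the most microscopic arrangements has the highest entropy, which is why equilibrium corresponds to maximal ##W##.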
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. On John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
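Shannon's measure of missing information is ##H = -\sum_i p_i \log_2 p_i##. A small self-contained sketch (the example distributions are illustrative choices):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))        # fair coin: 1.0 bit
print(shannon_entropy([0.25] * 4))        # four equally likely symbols: 2.0 bits
print(shannon_entropy([0.9, 0.1]) < 1.0)  # True: a biased coin is more predictable
```

A uniform distribution maximizes ##H##, mirroring the thermodynamic statement that entropy is largest when all microstates are equally likely.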
In a free expansion, I know that we cannot use the equation ##dS = dQ/T## ... (1). Instead we use ##dS > dQ/T## ... (2).
The question is: why can we use ##\Delta S = nc_V\ln(T_f/T_i) + nR\ln(V_f/V_i)##, which is derived from equation (1), to calculate the entropy change? Shouldn't it be an inequality too?
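Because entropy is a state function, ##\Delta S## can be evaluated along any reversible path between the same endpoints, even though the actual free expansion is irreversible. A numeric sketch for an assumed example (1 mol of ideal gas doubling its volume, values not taken from the thread):

```python
import math

R = 8.314462618  # gas constant, J/(K*mol)

# Free expansion of n mol of ideal gas: T is unchanged (no work, no heat),
# so only the volume term of dS = n*Cv*ln(Tf/Ti) + n*R*ln(Vf/Vi) survives.
n = 1.0
dS_gas = n * R * math.log(2.0)   # Vf/Vi = 2 (assumed example ratio)
dQ_over_T = 0.0                  # actual path: no heat exchanged at all
print(round(dS_gas, 3))          # 5.763 J/K, strictly greater than dQ/T = 0
```

The inequality (2) is satisfied: the computed ##\Delta S > 0## while ##\int dQ/T = 0## along the real irreversible path.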
Is entropy consistent from all reference frames? For an observer at the surface of a black hole, a finite amount of time would pass, but the observer would observe an unbounded amount of time passing for the outside universe, hence from his/her reference frame, information, entropy of the...
Hello,
I am trying to figure out where my reasoning falls apart in this thought experiment:
To determine if a process "A" is reversible (or at the very least internally reversible), I try to picture a reversible process "B" that involves only heat transfer and links the same two endpoints that...
Howdy,
Say you've got two highly reflective mirrors forming a cavity. Some broadband light goes in, but only narrowband light comes out. Entropy is definitely decreased as far as the photons are concerned. Where does it go?
This has been bugging me. I have a partial solution I was hoping you...
In the Brazil nut effect / granular convection, the large grains move upward and the smaller ones go downward. This sorting is supposed to reduce the multiplicity of the system. But according to the second law of thermodynamics, the entropy and multiplicity of the system should increase.
I am looking...
Carlo Rovelli described in "The Order of Time" that
"Living beings are made up of similarly intertwined processes. Photosynthesis deposits low entropy from the sun into plants. Animals feed on low entropy by eating. (If all we needed was energy rather than entropy, we would head for the heat of...
Homework Statement
Strap in, this one's kind of long. (This problem is from 'Six Ideas That Shaped Physics, Unit T' by Thomas A Moore, 2nd edition. Problem T6R2.)
Imagine that aliens deliver into your hands two identical objects made of substances whose multiplicities increase linearly...
A quantum system goes from an uncertain to a certain state upon measurement. This indicates a decrease of entropy; is there a corresponding increase of entropy elsewhere (environment/observer)? Is there any work done on the system in the act of measurement?
It is sometimes said that a system is "unlikely" to return to the "pattern" that it came from. For instance: if we have a vat with blue gas molecules and white gas molecules separated by a slit, and we remove the slit, the blue and white molecules will mingle, unlikely to return to their separated...
There is something that is unclear to me, and because entropy bounds and their violations were discussed in the other thread, I thought it is a good opportunity to learn something. The problem is essentially a matter of impression. The statements go roughly in the following way: for a system...
Why is there a tendency to use the concept of entropy to explain everything from relativity to quantum mechanics? Why do people think this concept is so satisfactory?
Jacob Bekenstein asserts that the entropy of a black hole is proportional to its area rather than its volume. Wow.
After watching Leonard Susskind's video 'The World as a Hologram', it seems to me that he's implying that we are all black hole stuff. Perhaps we (our galaxies and their black...
I have read a bit of the book Cycles of Time (Penrose), and I wondered whether an increase in entropy in one part of the Universe leads to a decrease in entropy in other parts, and whether the Universe's expansion is an attempt by the Universe to keep entropy at the same level.
And eventually you...
Entropy is always increasing, say the thermodynamicists, and the increase will ultimately cause a "heat death" of the universe.
But gravity seems to contradict this. Gravity, by clumping matter together, always engenders a decrease in entropy. Indeed, some cosmologists propose an eventual...
For example, if two particles close to each other require n bits of info to describe them, why does it take n bits to describe them when they are far apart? Shouldn't the information content be the same for the macrosystem?
The resolution for Maxwell's demon paradox is that the demon has limited memory and the demon will eventually run out of information storage space and must begin to erase the information it has previously gathered. Erasing information is a thermodynamically irreversible process that increases...
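The erasure cost referred to above is the Landauer bound. A minimal numeric sketch (the 300 K ambient temperature is an assumed example value, not from the post):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed ambient temperature, K

# Landauer's principle: erasing one bit dissipates at least k_B*T*ln(2)
# of heat, i.e. it raises the environment's entropy by at least k_B*ln(2).
dS_per_bit = k_B * math.log(2)
E_min_per_bit = T * dS_per_bit
print(E_min_per_bit)  # on the order of 3e-21 J per erased bit at 300 K
```

This is why the demon's bookkeeping cannot beat the second law: the entropy generated by erasure at least offsets whatever the demon gains by sorting molecules.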
Hi again Physics Forums! Last time I was here, I was an undergrad student. Now I have almost finished a PhD in quantum information and become a high school teacher.
I've never properly learned thermodynamics. I'm now trying to learn it at the same level at which I understand the other topics in classical...
Is there an expression similar to the Sackur-Tetrode equation that describes the statistical entropy of fermions or bosons, maybe for the electron gas in a metal or the photon gas in a cavity?
Homework Statement
I'm attempting to calculate the translational entropy for N2 and I get a value of 207.8 J/(K·mol). The tabulated value is given as 150.4 and I am stumped as to the discrepancy.
T = 298.15 K and P = 0.99 atm and V = 24.8 L
R = 8.314 J/(K·mol)
Homework Equations
Strans =...
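For reference, evaluating the Sackur–Tetrode equation at the stated conditions reproduces the tabulated 150.4, which suggests a slip somewhere in the 207.8 evaluation. A minimal check (the molar mass of N2, 28.0134 g/mol, is an assumption not stated in the thread):

```python
import math

# Physical constants (SI)
k = 1.380649e-23       # Boltzmann constant, J/K
h = 6.62607015e-34     # Planck constant, J*s
NA = 6.02214076e23     # Avogadro constant, 1/mol
R = k * NA             # gas constant, J/(K*mol)

# Conditions from the problem statement
T = 298.15             # K
P = 0.99 * 101325.0    # Pa
m = 28.0134e-3 / NA    # mass of one N2 molecule, kg (assumed molar mass)

# Sackur-Tetrode: S/n = R * [ln((2*pi*m*k*T/h**2)**1.5 * V/N) + 5/2],
# with V/N = k*T/P for an ideal gas.
arg = (2 * math.pi * m * k * T / h**2) ** 1.5 * (k * T / P)
S_molar = R * (math.log(arg) + 2.5)
print(round(S_molar, 1))  # 150.4 J/(K*mol), matching the tabulated value
```

Common sources of a ~207 result are using the total volume instead of the volume per molecule, or a units mix-up in the thermal wavelength term.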
I have read that the early universe had a very low entropy. I don't understand why. A giant ball of plasma at billions of degrees K with particles moving in all directions. It seems like the definition of total disorder. Why is the entropy considered low?
I find my self quite confused about some aspects of the concept of entropy. I will try to explain my confusion using a sequence of examples.
All of the references I cite are from Wikipedia.
Ex 1. Does the following isolated system have a calculable value for its entropy?
Assume a spherical...
Hello, I would like some help with a problem.
Homework Statement
Initially, at ##t=0##, the cylindrical capacitor of capacitance ##c=\frac{\epsilon s}{d}## (##d## the distance between the two electrodes and ##s## their surface; ##\epsilon = \epsilon(T)## is the dielectric permittivity) is discharged and we close the...
Making use of the partition function, it is straight forward to show that the entropy of a single quantum harmonic oscillator is:
$$\sigma_{1} = \frac{\hbar\omega/\tau}{\exp(\hbar\omega/\tau) - 1} - \log[1 - \exp(-\hbar\omega/\tau)]$$
However, if we look at the partition function for a single...
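The two limits of this expression are easy to check numerically. A sketch with ##x = \hbar\omega/\tau## (the sample values of ##x## are arbitrary choices):

```python
import math

def sigma_1(x):
    """Entropy of a single quantum harmonic oscillator, x = hbar*omega/tau."""
    return x / math.expm1(x) - math.log1p(-math.exp(-x))

# Low temperature (x large): only the ground state is populated, sigma -> 0.
print(sigma_1(10.0) < 1e-3)                               # True
# High temperature (x small): sigma ~ 1 - ln(x), growing without bound.
print(abs(sigma_1(0.01) - (1 - math.log(0.01))) < 0.01)   # True
```

Using `expm1` and `log1p` avoids catastrophic cancellation at small ##x##, where the naive `exp(x) - 1` and `log(1 - exp(-x))` lose precision.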
Homework Statement
I'm asked to compute the molar entropy of oxygen gas @ 298.15 K & 1 bar given:
molecular mass of 5.312×10−26 kg, Θvib = 2256 K, Θrot = 2.07 K, σ = 2, and ge1 = 3. I'm currently stuck on the vibrational entropy calculation.
Homework Equations
S = NkT ∂/∂T {ln q} + Nk...
I have my first question. It's about entropy in the Carnot cycle and I'll try to be direct.
The equal sign in the Carnot cycle efficiency equation is related to the fact that the total entropy doesn't change at the end of the whole cycle (being related to the fact that the heat exchanges occur...
Homework Statement
Box divided by a partition into two equal compartments containing ideal gas.
Each compartment has volume V, temperature T and pressure P.
1. What is the entropy of the system with the partition in place?
2. What is the entropy of the system when the partition is removed?
Homework Equations
The...
Homework Statement
We put 1 kg of iron at temperature 100 °C into a container with 1 kg of ice at temperature 0 °C. What is the state of the system after reaching equilibrium? Calculate the change of entropy.
The latent heat of fusion of ice (c_L) is 330 kJ/kg, the specific heat of iron (c_I)...
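A sketch of the standard approach. The thread's value of c_I is cut off, so a typical specific heat of iron, ~450 J/(kg·K), is assumed here purely for illustration:

```python
import math

# Assumed value: specific heat of iron c_I ~ 450 J/(kg*K) (the thread's own
# number is truncated); latent heat of fusion L = 330 kJ/kg as given.
m_fe, c_fe, T_fe = 1.0, 450.0, 373.15   # iron: kg, J/(kg*K), K (100 C)
m_ice, L, T_ice = 1.0, 330e3, 273.15    # ice: kg, J/kg, K (0 C)

Q = m_fe * c_fe * (T_fe - T_ice)   # heat released if iron cools to 0 C
m_melt = Q / L                     # mass of ice this heat can melt
assert m_melt < m_ice              # not all ice melts -> equilibrium at 0 C

dS_fe = m_fe * c_fe * math.log(T_ice / T_fe)   # iron cooling (negative)
dS_ice = Q / T_ice                             # melting at constant 273.15 K
print(round(m_melt, 3), round(dS_fe + dS_ice, 1))  # kg melted, total dS in J/K
```

Because the available heat melts only part of the ice, the equilibrium state is an ice-water mixture at 0 °C, and the total entropy change comes out positive, as the second law requires.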
Homework Statement
Calculate the change in molar entropy of steam heated from 100 to 120 °C at constant volume in units J/K/mol (assume ideal gas behaviour).
Homework Equations
##\Delta S = n C_v \ln(T_f/T_i)##
T: absolute temperature
The Attempt at a Solution
100 C = 373.15 K
120 C = 393.15 K
dS = nCvln...
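A worked sketch of this calculation. Since the problem says to assume ideal gas behaviour, ##C_{v,m} \approx 3R## is used here (translations plus rotations of a nonlinear triatomic, vibrations frozen out); the problem may intend a tabulated ##C_v## instead:

```python
import math

R = 8.314462618  # gas constant, J/(K*mol)

# Constant-volume heating of an ideal gas: dS = Cv,m * ln(Tf/Ti) per mole.
# Assumption: Cv,m ~ 3R for nonlinear triatomic steam with vibrations frozen.
Ti, Tf = 373.15, 393.15   # 100 C and 120 C in K
Cv_m = 3 * R
dS_molar = Cv_m * math.log(Tf / Ti)
print(round(dS_molar, 2))  # molar entropy change, J/(K*mol)
```

The result is of order 1 J/(K·mol); with a tabulated ##C_{v,m}## for steam the number would shift proportionally.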
Hello,
Suppose two bodies start with ##T_{cold}=T_c## and ##T_{hot}=T_h##, are brought into contact with one another, and after some time both have the same temperature. What would be the entropy change of the entire system?
Also another quick question, I've looked at some...
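For the case of equal, temperature-independent heat capacities, the total entropy change has a closed form. A sketch (the water heat capacity and temperatures are illustrative assumptions):

```python
import math

def delta_S_equilibration(C, T_c, T_h):
    """Total entropy change when two bodies of equal heat capacity C
    (assumed temperature-independent) equilibrate at T_f = (T_c + T_h)/2."""
    T_f = 0.5 * (T_c + T_h)
    return C * math.log(T_f / T_c) + C * math.log(T_f / T_h)

# Example: two 1 kg lumps of water (C ~ 4184 J/K) at 280 K and 360 K
dS = delta_S_equilibration(4184.0, 280.0, 360.0)
print(dS > 0)  # True: T_f^2 = ((T_c + T_h)/2)^2 >= T_c * T_h by AM-GM
```

The sign is guaranteed: ##\Delta S = C\ln\!\big(T_f^2/(T_c T_h)\big)##, and the arithmetic mean exceeds the geometric mean, so ##\Delta S > 0## whenever ##T_c \neq T_h##.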
In deriving the Carnot Efficiency, the assumption is made that theoretically most efficient engine will generate no net entropy, meaning that the entropy that enters the system during heat absorption must equal the entropy that leaves the engine during heat rejection. Why is the case? Why would...
Suppose you have an experiment that measures the property of an atom as a whole, maybe you can put it through a double-slit or measure its spin, whatever. Presumably that will collapse the wavefunction that you used to describe the atom in that experiment. Would this entail that in the process...
Hello everyone !
I try to find the expression of the time derivative of the entropy for the CMB (photon gas) but I am stuck with the calculations.
We are in the matter-dominated era at the present time (##R_0=1##). No radiation and no vacuum energy, only the curvature. The different equations are...
Homework Statement
The change in entropy is zero for:
A. reversible adiabatic processes
B. reversible isothermal processes
C. reversible processes during which no work is done
D. reversible isobaric processes
E. all adiabatic processes
Homework Equations
## dS = \frac{dQ}{T} ##
The Attempt...
I'm looking for a book to help me understand a project I'm working on measuring the magnetocaloric effect. I'd like to understand a bit more about the link between magnetism and entropy. I'm a third year bachelor student so I've studied no quantum mechanics (yet), but I'm not against doing so if...
My somewhat ropey understanding of entropy is that it is a measure of order/disorder and that in a closed system entropy always increases. Was discussing it with my teenage daughter. Whilst trying to convey my limited understanding it struck me that if the universe is contracting (we had also...
Homework Statement
The unitary time evolution of the density operator is given by
$$\rho(t)=\textrm{exp}(-\frac{i}{\hbar}Ht)\,\rho_0 \,\textrm{exp}(\frac{i}{\hbar}Ht)$$
General definition of entropy is
$$S=-k_B\,\mathrm{Tr}\,\{\rho(t) \ln \rho(t)\}$$
Proof: $$\frac{dS}{dt}=0$$
Homework Equations
I am not...
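One way to see the result numerically: unitary conjugation preserves the eigenvalues of ##\rho##, and ##S## depends only on those eigenvalues. A sketch (the qubit state and Hamiltonian below are arbitrary example choices, with ##\hbar = 1##):

```python
import numpy as np

def vn_entropy(rho):
    """Von Neumann entropy S = -Tr(rho ln rho), computed from eigenvalues."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]          # drop zero eigenvalues (0*ln 0 -> 0)
    return float(-np.sum(lam * np.log(lam)))

# Arbitrary example: a mixed qubit state and a Hermitian Hamiltonian
rho0 = np.diag([0.7, 0.3]).astype(complex)
H = np.array([[1.0, 0.5], [0.5, -1.0]])

# U(t) = exp(-iHt), built from the eigendecomposition of H
t = 2.0
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

rho_t = U @ rho0 @ U.conj().T       # rho(t) = U rho0 U^dagger

# The spectrum of rho is invariant under unitary conjugation, so S is constant
print(np.isclose(vn_entropy(rho0), vn_entropy(rho_t)))  # True
```

The analytic proof follows the same idea: write ##S## as a function of the eigenvalues of ##\rho(t)## and show those eigenvalues do not depend on ##t##, or use cyclicity of the trace on ##\dot S##.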
If the universe keeps expanding and eventually ends in a "big freeze" or heat death, does this contradict the third law of thermodynamics?
The third law of thermodynamics states that a crystal at absolute zero has zero entropy. Since the entropy of the universe can never decrease, as the age...
Why can entropy sometimes remain constant with an increase of temperature, and vice versa? Entropy implies transfer of heat, and heat must increase with temperature. I am unable to understand this intuitively.
Why is the entropy lost by hot water less than the entropy gained by the cold water? From the perspective of energy, why is it better to take water and heat it to a temperature than it is to mix hot water and cold water to get that particular temperature?
If I have an object at a different temperature than the thermal/heat reservoir (whatever it's called), a heat flow will take place. If I write the entropy balance for the thermal reservoir it will be:
##\frac {dS} {dt} = \frac {\dot Q} T + \dot S_{gen}##
Now I remember something my professor...
Most textbooks include an example of entropy of mixing that involves removing a partition between two (in principle) distinguishable gases, and compare this to the case where the two gases are indistinguishable. What I’ve not yet been able to figure out is what the consequences of this...
My question is regarding a few descriptions of Entropy. I'm actually unsure if my understanding of each version of entropy is correct, so I'm looking for a two birds in one stone answer of fixing my misunderstanding of each and then hopefully linking them together.
1) A measure of the tendency...
Definition 1 The von Neumann entropy of a density matrix is given by $$S(\rho) := - \mathrm{Tr}[\rho \ln \rho] = H[\lambda (\rho)]$$ where ##H[\lambda (\rho)]## is the Shannon entropy of the set of probabilities ##\lambda (\rho)## (which are the eigenvalues of the density operator ##\rho##).
Definition 2 If...
It is assumed that entropy increases in the universe. However, the fluid and acceleration equations are derived assuming that.
##T\,dS = dE + p\,dV##, where ##dQ = T\,dS##.
But ##dQ## is usually set equal to zero to derive these equations. Hence, since ##T## is non-zero, ##dS## should be zero and so there would be no...
Every year since the 90's I come back to some of my pet topics in physics, like statistical physics.
This time it was the reading of a Wikipedia article on entropy that surprised me.
The derivation of the second law from the Gibbs entropy was unknown to me.
I didn't know how heat, how change of...
Homework Statement
A 2.45 kg aluminium pan at 155 °C is plunged into 3.58 kg of water. If the entropy change of the system is 162 J/K, what is the initial temperature of the water?
Homework Equations
##Q = mc\Delta T##; ##\Delta S = mc\ln(T_2/T_1)##; ##Q_{water} + Q_{aluminium} = 0##
c_water = 4184 J/(kg·K), c_aluminium = 900...
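A numeric sketch of one way to attack this: fix the energy balance, express the total entropy change as a function of the unknown water temperature, and bisect. The specific heats below are the thread's stated values (aluminium taken as 900 J/(kg·K), the truncated figure):

```python
import math

# Data from the problem (c_aluminium = 900 J/(kg*K) assumed from the thread)
m_al, c_al, T_al = 2.45, 900.0, 428.15   # aluminium: kg, J/(kg*K), K (155 C)
m_w, c_w = 3.58, 4184.0                  # water: kg, J/(kg*K)
dS_target = 162.0                        # J/K

C_al, C_w = m_al * c_al, m_w * c_w       # total heat capacities, J/K

def total_dS(T_w):
    """Entropy change of pan + water for initial water temp T_w (no boiling)."""
    T_f = (C_al * T_al + C_w * T_w) / (C_al + C_w)   # energy balance
    return C_al * math.log(T_f / T_al) + C_w * math.log(T_f / T_w)

# total_dS decreases as T_w rises toward T_al, so bisect for the target
lo, hi = 274.0, 428.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if total_dS(mid) > dS_target:
        lo = mid
    else:
        hi = mid
T_w = 0.5 * (lo + hi)
print(round(T_w, 1))  # initial water temperature, K
```

The pan's entropy decrease (it cools) is more than offset by the water's increase, and the imbalance pins down how cold the water must have started.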
Homework Statement
Find the ∆S per mol between liquid water at -5 °C and ice at -5 °C at 1020 hPa.
Data:
∆C_{P,m} (fusion) = 37.3 J K⁻¹ mol⁻¹
∆_{fus}H = 6.01 kJ mol⁻¹
The answer is 21.3 J K⁻¹ mol⁻¹.
Homework Equations
Usually I solve these problems in steps when they are at P = 1 atm, but since it's at P = 1020...