Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
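As a minimal illustration of the shared formalism, the sketch below (plain Python; the example probabilities are assumed) evaluates the Shannon entropy of a discrete distribution and shows that a uniform distribution over W equally likely outcomes reduces to log W, the counting that underlies Boltzmann's S = k ln W.

[CODE=python]
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum p_i * log(p_i) of a discrete distribution."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Assumed example: a biased four-symbol source
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits

# A uniform distribution over W outcomes gives H = log2(W),
# the information-theoretic analogue of S = k_B ln(W)
W = 1024
print(shannon_entropy([1.0 / W] * W))              # 10.0 bits
[/CODE]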
Any state analytic in energy (which includes most physical states since they have bounded energy) contains non-local correlations described by the Reeh-Schlieder theorem in AQFT. It is further shown that decreasing the distance between wedges will increase the entanglement as measured by a...
Homework Statement
a) A stone at 400 K with heat capacity ##c_p## is placed in a very large lake at 300 K. The stone cools rapidly to 300 K. Calculate the entropy change, due to this process, of the stone and lake.
b) An insulated cool-box of a Carnot refrigerator at temperature T loses heat...
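For part (a), a standard approach is to integrate dS = c_p dT/T for the stone and treat the lake as a reservoir at 300 K that absorbs Q = c_p(400 − 300). A minimal numerical sketch (Python; the value of c_p is an assumed placeholder since the problem keeps it symbolic):

[CODE=python]
import math

c_p = 1000.0                     # J/K, assumed placeholder; the problem keeps c_p symbolic
T_stone, T_lake = 400.0, 300.0   # K

dS_stone = c_p * math.log(T_lake / T_stone)     # stone cools from 400 K to 300 K: negative
dS_lake = c_p * (T_stone - T_lake) / T_lake     # lake absorbs Q at constant 300 K: positive

print(dS_stone, dS_lake, dS_stone + dS_lake)    # the total comes out positive
[/CODE]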
Homework Statement
For a box containing 1m^{3} of nitrogen at S.T.P., estimate the number of microstates which make up the equilibrium macrostate.
Homework Equations
S = Nk_{B}\left(\ln\frac{V}{N} + \frac{5}{2} + \frac{3}{2}\ln\frac{2\pi m k_{B}T}{h^{2}}\right)
where the entropy of a volume, V ...
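A rough numerical estimate from the quoted formula, as a hedged sketch: it assumes S.T.P. means 273.15 K and 1 atm, computes N from the ideal-gas law, and uses only the translational (Sackur–Tetrode) entropy, ignoring rotational contributions of N2.

[CODE=python]
import math

k_B = 1.380649e-23        # J/K
h = 6.62607015e-34        # J*s
T = 273.15                # K (S.T.P. assumed as 0 °C, 1 atm)
P = 101325.0              # Pa
V = 1.0                   # m^3
m = 28 * 1.66054e-27      # kg, mass of one N2 molecule
N = P * V / (k_B * T)     # number of molecules from the ideal-gas law

# Sackur-Tetrode entropy (translational part only)
S = N * k_B * (math.log(V / N) + 2.5 + 1.5 * math.log(2 * math.pi * m * k_B * T / h**2))

# Omega = exp(S / k_B) is far too large to evaluate directly, so report ln(Omega)
ln_Omega = S / k_B
print(f"S ~ {S:.0f} J/K, ln(Omega) ~ {ln_Omega:.2e}")
[/CODE]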
Homework Statement
5. The nuclei of atoms in a certain crystalline solid have spin one. Each nucleus can be in any one of three quantum states labeled by the quantum number m, where m = −1, 0, 1. This quantum number measures the projection of the nuclear spin along a crystal axis of the solid. Due...
Please see attached picture.
I need verification of my answers. Unfortunately, I found these problems in an old book with no answers. I would really appreciate it.
(a) Ok. For this one, I am really not sure. PLEASE help.
I get a very complicated formula.
(1+x)(x^4)/(1-x)^2=Kp.
Now, this...
I know how to get Von Neumann entropy from a density matrix. I want to get a real number from measurements that give real numbers as outcomes. (there are complex numbers in a density matrix).
So suppose Charlie sends 1000 pairs of particles in the same state to Bob and Alice. They agree to...
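One way to get a real number out of real-valued measurement outcomes is to reconstruct the density matrix from the measurement statistics (state tomography) and then evaluate S(ρ) = −Tr(ρ log ρ) through its eigenvalues, which are real even though ρ has complex entries. A minimal numpy sketch; the single-qubit ρ below is just an assumed example, not the state Charlie sends:

[CODE=python]
import numpy as np

def von_neumann_entropy(rho, base=2):
    """S(rho) = -Tr(rho log rho), computed from the real eigenvalues of a Hermitian rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]      # drop numerical zeros
    return float(-np.sum(eigvals * np.log(eigvals)) / np.log(base))

# Assumed example: a partially mixed single-qubit state with complex off-diagonal terms
rho = np.array([[0.7, 0.2 - 0.1j],
                [0.2 + 0.1j, 0.3]])
print(von_neumann_entropy(rho))             # about 0.72 bits, a real number
[/CODE]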
I was following along in my Thermodynamic textbook and began playing with some definitions. In the following formulation, I somehow managed to prove (obviously incorrectly) that dq = TdS for even irreversible processes. I was hoping someone could point out where in the proof I'm going wrong...
Homework Statement
A 3.5 kg block of copper at 100 °C (373 K) is put in 0.8 kg of water at 0 °C (273 K).
The equilibrium temperature is 30 °C (303 K).
Calculate the change of entropy for the system of copper and water.
Homework Equations
ΔS=\frac{Q}{T}...
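With tabulated specific heats (assumed here: copper ≈ 385 J kg⁻¹ K⁻¹, water ≈ 4186 J kg⁻¹ K⁻¹), each body's entropy change follows from ΔS = m c ln(T_final/T_initial); a short sketch:

[CODE=python]
import math

c_copper, c_water = 385.0, 4186.0               # J/(kg K), assumed standard values
m_copper, m_water = 3.5, 0.8                    # kg
T_copper, T_water, T_eq = 373.0, 273.0, 303.0   # K

dS_copper = m_copper * c_copper * math.log(T_eq / T_copper)   # copper cools: negative
dS_water = m_water * c_water * math.log(T_eq / T_water)       # water warms: positive

print(dS_copper, dS_water, dS_copper + dS_water)   # the total should come out positive
[/CODE]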
OK, so entropy cannot be destroyed, right? So let's say you have a reaction that decreases entropy (ΔS < 0) but is also exothermic (ΔH < 0), and the enthalpy term overpowers the entropy decrease so the reaction is spontaneous (i.e. ΔG = ΔH − TΔS < 0). If that happens, where does the entropy go?
In the Gibbs free energy equation, does the standard change in entropy equal q(sys)/T(sys)?
Or in math terms:
T(surr) * q(sys)/T(sys) = T(surr) * dS(standard)
Thus
dS(standard) = q(sys)/T(sys)
(surr) = surroundings
(sys) = systems
(standard) = at standard conditions
Homework Statement
Is the boiling of an egg accompanied by an increase in entropy?
The Attempt at a Solution
I guess entropy decreases because as the egg boils, the stuff inside it gets hardened and changes into a solid mass. So the disorderliness decreases and the entropy should decrease. But...
Homework Statement
The temperature at the surface of the Sun is approximately 5700 K, and the temperature at the surface of the Earth is approximately 290 K. What entropy change occurs when 1000 J of energy is transferred by radiation from the Sun to the Earth?
Homework Equations...
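Treating the Sun's surface and the Earth as reservoirs at fixed temperatures, the net change is ΔS = −Q/T_Sun + Q/T_Earth; a quick numerical check:

[CODE=python]
Q = 1000.0                          # J transferred by radiation
T_sun, T_earth = 5700.0, 290.0      # K

dS_sun = -Q / T_sun                 # the Sun loses the energy
dS_earth = Q / T_earth              # the Earth absorbs it
print(dS_sun + dS_earth)            # about +3.27 J/K
[/CODE]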
Homework Statement
As a model of a paramagnet, consider a system of N fixed particles with spin 1/2 in a magnetic field H along the z axis. Each particle has an energy e = μH (spin up) or e = −μH (spin down).
Using S = k ln(Ω), show that
S = k\left[\frac{N - E/e}{2}\ln\!\left(\frac{2N}{N - E/e}\right) + \frac{N + E/e}{2}\ln\!\left(\frac{2N}{N + E/e}\right)\right]...
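One sanity check on the target expression is to compare it numerically with the exact count S = k ln Ω, where Ω = N!/(n₊! n₋!) and n± = (N ± E/e)/2. A small sketch (the values of N and n₊ are assumed; Stirling-level agreement is expected for large N):

[CODE=python]
import math

k = 1.380649e-23          # J/K
N = 10**6                 # number of spins (assumed for the check)
n_up = 600_000            # spins with energy +e (assumed)
n_down = N - n_up
E_over_e = n_up - n_down  # E/e = n_up - n_down

# Exact: S = k ln[N! / (n_up! n_down!)] via log-gamma
S_exact = k * (math.lgamma(N + 1) - math.lgamma(n_up + 1) - math.lgamma(n_down + 1))

# Target formula from the problem statement
S_formula = k * ((N - E_over_e) / 2 * math.log(2 * N / (N - E_over_e))
                 + (N + E_over_e) / 2 * math.log(2 * N / (N + E_over_e)))

print(S_exact, S_formula)   # agree to high relative accuracy
[/CODE]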
"Dumping" Entropy
Hello everyone,
I am reading Daniel Schroeder's Thermal Physics book. One phrase he uses that I find particular confusing is that a system has to "dump" entropy. As one example, in chapter 5 he briefly discusses how a fuel cell functions, stating, "In the process of...
From Statistical and Thermal Physics (Reif, international edition 1985),
page 160, Eq. (5.4.4):
S(T, V; \nu) = \nu\left[\int_{T_0}^{T}\frac{c_V(T')}{T'}\,dT' + R\ln V - R\ln\nu + \text{constant}\right]
(c_V is the molar specific heat)
This is the entropy of the system at temperature T, volume V, and mole number ν <--...
Hi..
Consider a rod that is insulated on its lateral surface. This rod is brought into contact with a source at temperature T1 and a sink at temperature T2, so a temperature gradient is set up in the rod. After steady state is reached, the temperature at some distance X from the source end is given as...
Hi to everybody out there,
Entropy, as I understand it so far, is associated with heat, which is basically energy in transit (for a reversible process it is indicative of the unavailable part of the energy). Having said that, what is meant by the entropy change of a system as heat flows into or out of it (for instance, consider...
Homework Statement
Using Boltzmann's principle (S = k ln W), show that with respect to changes in V and T:
dS = kN\frac{dV}{V} + C\frac{dT}{T}
where W = T^{C/k}V^{N}.

The Attempt at a Solution

S = k\ln\left(T^{C/k}V^{N}\right) = k\ln T^{C/k} + k\ln V^{N}
S = C\ln T + kN\ln V
Now I know that the...
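A quick symbolic check of the requested differential, as a sketch with sympy (C, k, and N treated as positive constants):

[CODE=python]
import sympy as sp

T, V, C, k, N = sp.symbols("T V C k N", positive=True)

S = k * sp.log(T**(C / k) * V**N)      # S = k ln W with W = T^(C/k) V^N
dS_dT = sp.simplify(sp.diff(S, T))     # -> C / T
dS_dV = sp.simplify(sp.diff(S, V))     # -> k N / V
print(dS_dT, dS_dV)                    # so dS = (C/T) dT + (k N / V) dV
[/CODE]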
Hello,
I'm studying for my exam for tomorrow and we solved an exercise in class , but a question was not answered and I don't know how to solve it.
Homework Statement
1 kg of water at 0 °C is brought into contact with a large heat reservoir at 100 °C. When the water...
One mole of an ideal monatomic gas initially at 298 K expands from 1.0 L to 10.0 L. Assume the expansion is irreversible, adiabatic, and no work is done. Calculate ΔS of the gas and ΔS of the surroundings.
I know that dS = dq/T, but q = 0 in adiabatic processes, right? So does dS...
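For a free (irreversible, adiabatic, no-work) expansion of an ideal gas the temperature does not change, and ΔS must be evaluated along a reversible path between the same end states, e.g. a reversible isothermal expansion; the surroundings exchange nothing, so their entropy change is zero. A quick numerical sketch:

[CODE=python]
import math

R = 8.314            # J/(mol K)
n = 1.0              # mol
V1, V2 = 1.0, 10.0   # L (only the ratio matters)

dS_gas = n * R * math.log(V2 / V1)   # along a reversible isothermal path: ~ +19.1 J/K
dS_surr = 0.0                        # no heat or work exchanged with the surroundings
print(dS_gas, dS_surr)
[/CODE]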
Hello,
Do living beings use food to decrease the entropy of their bodies?
If so, could anyone explain the process of how we do it? (or at least name a couple of keywords that I can search)
Thank you very much.
Homework Statement
Hey guys,
So I have this equation for the entropy of a classical harmonic oscillator:
\frac{S}{k}=N[\frac{Tf'(T)}{f(T)}-\log z]-\log (1-zf(T))
where z=e^{\frac{\mu}{kT}} is the fugacity, and f(T)=\frac{kT}{\hbar \omega}.
I have to show that, "in the limit of...
Homework Statement
Hey guys,
Here's the question. For a distinguishable set of particles, given that the single particle partition function is Z_{1}=f(T) and the N-particle partition function is related to the single particle partition function by Z_{N}=(Z_{1})^{N} find the following...
There is a container of water (state 1) which is being stirred. There is a temperature rise (state 2) due to the stirring. It is required to find the change in entropy of the system if the process is reversible. Since there is no heat transfer, there would be no change in entropy due to...
It is well-known that with known marginal probabilities a_{i} and b_{j}, the joint probability distribution maximizing the entropy
H(P)=-\sum_{i=1}^{m}\sum_{j=1}^{n}p_{ij}\log{}p_{ij}
is p_{ij}=a_{i}b_{j}
For m=3 and n=3, a=(0.2,0.3,0.5), b=(0.1,0.6,0.3), for example,
\begin{equation}...
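A quick numerical illustration for the quoted marginals (numpy): it forms the product distribution, verifies the marginals, and shows that its entropy equals H(a) + H(b).

[CODE=python]
import numpy as np

a = np.array([0.2, 0.3, 0.5])
b = np.array([0.1, 0.6, 0.3])

P = np.outer(a, b)                    # p_ij = a_i * b_j
H_joint = -np.sum(P * np.log(P))      # joint entropy in nats

print(np.allclose(P.sum(axis=1), a), np.allclose(P.sum(axis=0), b))   # marginals recovered
print(H_joint, -np.sum(a * np.log(a)) - np.sum(b * np.log(b)))        # H(P) = H(a) + H(b)
[/CODE]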
Homework Statement
Ten kmol per hour of air is throttled from upstream conditions of 25°C
and 10 bar to a downstream pressure of 1.2 bar. Assume air to be an ideal gas with Cp= (7/2)R.
(a) What is the downstream temperature?
(b) What is the entropy change of the air in J mol⁻¹ K⁻¹?
(c) What...
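For an ideal gas with constant Cp, throttling is isenthalpic, so the downstream temperature equals the upstream temperature and the molar entropy change comes entirely from the pressure drop, ΔS = −R ln(P2/P1); a quick check:

[CODE=python]
import math

R = 8.314                  # J/(mol K)
P1, P2 = 10.0, 1.2         # bar
T1 = 25.0 + 273.15         # K

T2 = T1                              # ideal gas + isenthalpic => temperature unchanged
dS = -R * math.log(P2 / P1)          # ~ +17.6 J/(mol K)
print(T2, dS)
[/CODE]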
The 2nd law of thermodynamics states that entropy increases with time, and entropy is just a measure of how hard it is to distinguish a state from another state (information-theoretical view) or how hard it is to find order within a system (thermodynamic view). There are many ways to view entropy...
To preface my question, I know it is related to the Gibbs paradox, but I've read the wikipedia page on it and am still confused about how to resolve the question in the particular form I state below.
Suppose a completely isolated ideal gas consisting of identical particles is confined to a...
I've always been slightly confused by the Second Law of Thermo. For example, with Maxwell's Demon, where a demon controls the partition between two gas chambers to select all the fast moving particles into one chamber, the Second Law is not violated because the demon's actions and thought...
Homework Statement
A compressor processes 1.5 kg/min of air at ambient conditions (1 bar and 20 °C). The compressed air leaves at 10 bar and 90 °C. It is estimated that the heat losses through the walls of the compressor are 25 kJ/min. Calculate:
a) The power of the compressor
b) The...
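For part (a), a steady-flow energy balance on the compressor (neglecting kinetic and potential terms and treating air as an ideal gas with an assumed cp ≈ 1005 J kg⁻¹ K⁻¹) gives W_in = m_dot · cp · (T2 − T1) + Q_loss; a rough sketch:

[CODE=python]
cp = 1005.0                  # J/(kg K), assumed value for air
m_dot = 1.5 / 60.0           # kg/s
T1, T2 = 293.15, 363.15      # K (20 °C in, 90 °C out)
Q_loss = 25e3 / 60.0         # W (25 kJ/min lost through the walls)

W_in = m_dot * cp * (T2 - T1) + Q_loss   # steady-flow energy balance
print(W_in)                              # roughly 2.2 kW
[/CODE]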
1. A photon that emerges when an electron jumps one orbital down -- will have a fixed energy
...i.e. the difference between the (potential) energies of the orbitals.
However a "free/unbound" photon can have any energy level.
Is that correct?
2. What is the lowest level of energy a...
Hi,
Under standard conditions, why does water have higher entropy than helium? Isn't helium a gas? I understand that water has more atoms, but it seems more ordered and is a liquid. I'm not sure how a qualitative analysis could lead to the conclusive result that water is higher in entropy...
Homework Statement
One end of a metal rod is in contact with a thermal reservoir at 695K, and the other end is in contact with a thermal reservoir at 113K. The rod and reservoirs make up an isolated system. 7190J are conducted from one end of the rod to the other uniformly (no change in...
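Since the rod reaches a steady state, its own entropy does not change, and the total change comes from the two reservoirs: ΔS = −Q/T_hot + Q/T_cold. A quick check:

[CODE=python]
Q = 7190.0                     # J conducted through the rod
T_hot, T_cold = 695.0, 113.0   # K

dS_total = -Q / T_hot + Q / T_cold
print(dS_total)                # about +53.3 J/K
[/CODE]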
Homework Statement
Calculate the variation of entropy in the following processes:
a) Heating of 18 kg of water from 15 to 40 °C at ambient pressure.
b) Compression of 9 kg of water from ambient pressure to 7 atm at a temperature of 15 °C.
Homework Equations
ΔS=Cp*ln(T_final/T_initial)...
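For part (a), the quoted relation with an assumed cp for liquid water (≈ 4186 J kg⁻¹ K⁻¹) gives the answer directly; part (b) involves the pressure dependence of S, which is very small for a nearly incompressible liquid. A sketch for (a):

[CODE=python]
import math

cp = 4186.0               # J/(kg K), assumed value for liquid water
m = 18.0                  # kg
T1, T2 = 288.15, 313.15   # K (15 °C and 40 °C)

dS = m * cp * math.log(T2 / T1)   # ~ +6.3 kJ/K
print(dS)
[/CODE]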
For a thermodynamic system there exists a function called entropy S(U,N,V) etc.
We then define for instance temperature as:
1/T = ∂S/∂U
−μ/T = ∂S/∂N
etc.
When taking these partial derivatives, it is understood that we only take the derivative of S with respect to its explicit dependence on U, N, etc., right? Because...
If we have a qubit in a mixed state, let's say ρ = ½|+⟩⟨+| + ½|−⟩⟨−|, and we measure it, is the result then a pure state |+⟩ or |−⟩? If this is the case, then the entropy of the system decreases.
Now the question another way round is :
Suppose we measure a quantum system without gaining...
Hello, I am trying to understand a short literature article (doi: 10.1021/ja01635a030). I am not sure how much liberty I have to reproduce its contents here, and I can't explain it here because I don't understand it -- which is why I have this question.
I believe it is proposing that a...
I'm interested in the ultimate fate of the universe. And it seems that the most prevalent theory is the Big Freeze.
From what I can gather the BF is caused by dark energy making the universe expand to the point that stars can no longer form, resulting in cold dark space filling the universe...
Hello Community,
I have a question that I'm struggling to get clarification on and I would greatly appreciate your thoughts.
Big Bang theories describe an extremely low thermodynamic entropy (S) state of origin (very ordered).
Question: Is the Big Bang considered to be a high or low Shannon...
Homework Statement
Considering entropy as a function of temperature and volume and using the Maxwell relation:
$$ \left(\frac{\partial S}{\partial V}\right)_T = \left(\frac{\partial p}{\partial T}\right)_V$$
Show that the entropy of a vessel is given by:
$$ S= R...
Hello everyone,
I've been reviewing some concepts on Thermodynamics and, even though I feel like I am gaining a level of comprehension about the subject that I could not have achieved before as an undergraduate, I am also running into some situations in which some thermodynamic concepts seem to...
Hi everyone!
I have a little problem for an upcoming exams, and I think I need just small hints to solve it.
My problem is that I have to write about ten/fifteen pages about SUPERDENSE CODING and QUANTUM CRYPTOGRAPHY, and my professor has taken for granted that these are strongly linked to...
Quantum entropy and ...??
Homework Statement
My problem is that I have to write about ten pages about SUPERDENSE CODING and QUANTUM CRYPTOGRAPHY, and my professor has taken for granted that these are strongly linked to quantum entropy. He never told us why! Indeed he talked about that as...
Is entropy a measure of "disorder"?
In textbooks I never saw a definition of entropy given in terms of a "measure of disorder", and am wondering where this idea comes from? Clausius defined it as an "equivalent measure of change", but I do not see the relation with a concept of "order" or...
Hi All,
I am not sure if this is the right section to post this question but it does involve probability..so please redirect me if necessary.
I am currently looking at the Robinson et al. (2013) paper on rank vector entropy in MEG (doi: 10.3389/fncom.2012.00101). Due to my lack of...
Okay, I am considering a cycle where the working fluid is an ideal gas with heat capacities Cv and Cp. The cycle consists of: an isochoric increase in pressure, an adiabatic expansion back to the initial pressure, and an isobaric compression back to the initial conditions.
Questions:
-
q1) I am asked to...
Normally due to H+ being the reference state in solution, all 'standard molar' state variables and 'standard value of formation' state variables are 0 for it. But H2(g) has a standard enthalpy of formation = 0 and a standard molar entropy of 115 J mol⁻¹ K⁻¹. Then shouldn't ΔG°(298) for the reaction...
This article insinuates that the physics community had forgotten Gibbs entropy long ago and has used Boltzmann entropy since. Isn't this nonsense? For me it was always clear that Boltzmann entropy is problematic...