Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
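In symbols, Clausius's definition identifies the entropy change of a system with the reversibly exchanged heat divided by the absolute temperature at which the exchange occurs:
[tex]dS = \frac{\delta Q_{\text{rev}}}{T}[/tex]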
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
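Boltzmann's logarithmic law relates the entropy of a macrostate to the number W of microstates compatible with it, with the Boltzmann constant k as the proportionality factor:
[tex]S = k \ln W[/tex]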
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this measure of missing information entropy, in analogy to its use in statistical mechanics, and gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
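Shannon's measure assigns to a source with symbol probabilities p_i the average missing information per symbol:
[tex]H = -\sum_i p_i \log_2 p_i \quad \text{(bits per symbol)}[/tex]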
Homework Statement
In a previous problem I had to find the entropy of a black hole where I ended with this:
[tex]S_{BH}=\frac{8 \pi^2 G M^2 k}{h c}[/tex]
Now I am to find the temperature, given that the energy of a black hole is Mc^2.
Homework Equations
[tex]T=\left(\frac{\partial S}{\partial U}\right)^{-1}[/tex]
The...
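A sketch of the likely route, assuming the entropy quoted above and U = Mc^2 (so M = U/c^2):
[tex]S_{BH} = \frac{8\pi^2 G k}{h c^5}\,U^2, \qquad \frac{1}{T} = \frac{\partial S}{\partial U} = \frac{16\pi^2 G k}{h c^5}\,U \quad\Rightarrow\quad T = \frac{h c^3}{16\pi^2 G k M},[/tex]
which reproduces the Hawking temperature T = ħc³/(8πGMk).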
Homework Statement
The entropy of an ideal paramagnet is given by S=S_{0}+CE^{2}, where E is the energy (which can be positive or negative) and C is a positive constant. Determine the equation for E as a function of T and sketch your result.
Homework Equations
[tex]
\frac{1}{T}=\frac{\delta...
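The relevant equation is presumably 1/T = ∂S/∂E; applying it to S = S₀ + CE² gives a sketch of the answer:
[tex]\frac{1}{T} = \frac{\partial S}{\partial E} = 2CE \quad\Rightarrow\quad E(T) = \frac{1}{2CT},[/tex]
a hyperbola in T, with E > 0 for T > 0 and E < 0 on the negative-temperature branch.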
I don't want to argue about whether the notion of "wave function collapse" is a good way of understanding quantum mechanics, or not. For the purposes of this discussion, let's just adopt uncritically the naive approach to quantum mechanics, that:
Between measurements, the system evolves...
A formula for entropy:
dS=dQ/T
This formula says that the infinitesimal entropy change equals the heat added divided by the temperature at which it is added.
dQ=mcdT
This is another formula that states how the temperature changes as heat is added.
from this can we say that entropy is just specific heat capacity * Mass...
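Combining the two relations shows why entropy is not simply mass times specific heat: the temperature in the denominator makes the integral logarithmic. For heating at constant specific heat c from T₁ to T₂:
[tex]\Delta S = \int_{T_1}^{T_2} \frac{mc\,dT}{T} = mc\ln\frac{T_2}{T_1},[/tex]
so mc only sets the scale; ΔS also depends on the two temperatures.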
So, until now I know:
(∂V/∂S)_p = (∂T/∂p)_S = αT/(c_p ρ) (enthalpy)
(∂p/∂T)_V = (∂S/∂V)_T = α/κ (Helmholtz)
(∂S/∂p)_T = -(∂V/∂T)_p = -Vα (Gibbs)
α = expansion coefficient
κ = isothermal compressibility coefficient
c_p = heat capacity at constant pressure
I want to deduce ∂T/∂V at constant entropy, i.e. (∂T/∂V)_S. BUT...
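One possible route, using the triple-product (cyclic) rule together with (∂S/∂T)_V = C_V/T:
[tex]\left(\frac{\partial T}{\partial V}\right)_S = -\frac{(\partial S/\partial V)_T}{(\partial S/\partial T)_V} = -\frac{\alpha/\kappa}{C_V/T} = -\frac{\alpha T}{\kappa C_V}.[/tex]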
I'm having a hard time reconciling two ideas in physics.
One is that systems tend towards the maximum amount of disorder, or "entropy always increases". And, when two systems are brought together they are likely to be found at the energy E* which maximizes the number of states of the...
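The two ideas are the same statement in different words. A sketch: for two systems sharing a fixed total energy, the number of joint microstates is Ω(E) = Ω₁(E) Ω₂(E_tot − E), and maximizing ln Ω over E gives
[tex]\frac{\partial \ln\Omega_1}{\partial E_1} = \frac{\partial \ln\Omega_2}{\partial E_2} \quad\Leftrightarrow\quad T_1 = T_2,[/tex]
so "entropy increases" and "the combined system is found at the E* maximizing the number of states" are two phrasings of the same equilibrium condition.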
Hello
I may well be all wrong about this but I am trying to understand from a microscopic point of view why Entropy is a concave function of internal energy. I found this in the following .pdf:
http://physics.technion.ac.il/ckfinder/userfiles/files/avron/thermodynamics_potentials.pdf
I...
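One compact way to see it, assuming a positive heat capacity C_V:
[tex]\frac{\partial^2 S}{\partial U^2} = \frac{\partial}{\partial U}\frac{1}{T} = -\frac{1}{T^2}\frac{\partial T}{\partial U} = -\frac{1}{T^2 C_V} < 0,[/tex]
so concavity of S(U) is equivalent to thermal stability; microscopically it reflects the fact that letting two copies of the system redistribute energy can only increase the number of accessible microstates.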
Hello, dbmorpher here.
I was looking at some articles on entropy and wondered: could this be used to calculate time?
Entropy is known as the arrow of time; figurative, I know, but could it actually be used as a measure?
If there was a device that calculated the entropy in a set system, couldn't it...
Take a simple example, air: why does it naturally form a uniform mixture? Brownian motion wouldn't differentiate between the different gases that make up the air.
N.B. My set question is actually almost identical to another posted here https://www.physicsforums.com/showthread.php?t=368672 but my question is not answered there.
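The standard counting answer, assuming ideal-gas behavior: a mixed configuration has overwhelmingly more microstates than a de-mixed one. For mole fractions x_i, the entropy of mixing is
[tex]\Delta S_{\text{mix}} = -nR\sum_i x_i \ln x_i > 0,[/tex]
so the uniform mixture is not "preferred" by any force; it is simply the macrostate realized by the vast majority of microscopic arrangements.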
Homework Statement
The temperature inside a refrigerator is 275K. The room containing it has a
temperature of 295K. In...
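The problem statement is cut off here; if, as is typical for this setup, it asks for the best possible coefficient of performance, the reversible (Carnot) limit would be
[tex]\text{COP}_{\max} = \frac{T_c}{T_h - T_c} = \frac{275}{295 - 275} = 13.75.[/tex]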
Trying an example from a textbook but I don't understand Legendre transform at all. "construct Legendre transforms of the entropy that are natural functions of (1/T, V, n) and (1/T, V, μ/T)". I don't really understand where to start. An example prior to this exercise just states: A = E - TS =...
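A sketch of where to start, assuming the usual entropy representation dS = (1/T)dU + (p/T)dV − (μ/T)dN: Legendre-transform S with respect to the variable conjugate to the one you want, exactly parallel to A = E − TS in the energy representation. The two requested functions are the Massieu functions
[tex]\Phi = S - \frac{U}{T}, \qquad d\Phi = -U\,d\!\left(\frac{1}{T}\right) + \frac{p}{T}\,dV - \frac{\mu}{T}\,dN,[/tex]
[tex]\Psi = S - \frac{U}{T} + \frac{\mu N}{T}, \qquad d\Psi = -U\,d\!\left(\frac{1}{T}\right) + \frac{p}{T}\,dV + N\,d\!\left(\frac{\mu}{T}\right),[/tex]
which are natural functions of (1/T, V, n) and (1/T, V, μ/T), respectively.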
I just have this confusion which is completely eating me up. They say entropy of a system is a state property. Then they say that for a completely isolated system, entropy either increases or remains constant (its change is zero), depending on whether the process is irreversible or reversible.
So, let's say for an...
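The two statements are compatible because they concern different things. Writing the entropy balance for any process between states 1 and 2:
[tex]S_2 - S_1 = \int_1^2 \frac{\delta Q}{T} + \sigma, \qquad \sigma \ge 0.[/tex]
S is a state property, so S₂ − S₁ is fixed by the end states; the generated entropy σ depends on the path and vanishes only for reversible processes. For an isolated system δQ = 0, so S₂ − S₁ = σ ≥ 0.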
Hello
I was wondering, with regards to the Tds equation Tds = de + pdv:
1. All of my textbooks state that integrating this equation, although derived for a reversible process, will give the entropy change regardless of the process or whether or not the process is reversible. However, I...
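The usual resolution: every quantity in Tds = de + pdv is a state property, so
[tex]\Delta s = \int_1^2 \frac{de + p\,dv}{T}[/tex]
evaluated along any convenient reversible path between the actual end states gives the same answer. The irreversible process itself merely determines which end states those are.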
Homework Statement
We have a piston in a cylinder containing Helium. The initial states are:
P1=150kPa
T1= 20°C
V1=0.5m3
Following a polytropic process, the final states are:
P2=400kPa
T2=140°C
V2 is unknown.
We're also given R = 2.0769 kPa·m³/(kg·K)
And Cp = 5.1926 kJ/(kg·K)
As...
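The post is cut off, but with the data given, the standard steps (assuming ideal-gas helium and a polytropic process PV^n = const; the variable names below are mine) can be sketched numerically:

```python
import math

# Given data (helium, assumed ideal gas)
P1, T1, V1 = 150.0, 20.0 + 273.15, 0.5      # kPa, K, m^3
P2, T2 = 400.0, 140.0 + 273.15              # kPa, K
R = 2.0769                                   # kPa*m^3/(kg*K)
cp = 5.1926                                  # kJ/(kg*K)

m = P1 * V1 / (R * T1)                       # mass from the ideal-gas law
V2 = m * R * T2 / P2                         # unknown final volume
n = math.log(P2 / P1) / math.log(V1 / V2)    # polytropic index from P1*V1^n = P2*V2^n
ds = cp * math.log(T2 / T1) - R * math.log(P2 / P1)  # ideal-gas entropy change, kJ/(kg*K)

print(f"m = {m:.4f} kg, V2 = {V2:.4f} m^3, n = {n:.3f}, ds = {ds:.4f} kJ/(kg K)")
```

With these numbers the sketch gives m ≈ 0.123 kg, V2 ≈ 0.264 m³, n ≈ 1.54, and Δs ≈ −0.255 kJ/(kg·K).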
We know that if we brought a body at temperature T' in contact with a heat reservoir at temperature T (where T' < T) entropy of the universe increases.
What if we brought a body at temperature T' in contact with a reservoir at temperature T ( where T' > T)?
It is supposed also that the entropy...
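It works in both directions. Taking the body's heat capacity C as constant, the total entropy change is
[tex]\Delta S_{\text{univ}} = C\ln\frac{T}{T'} + C\,\frac{T' - T}{T} \ge 0,[/tex]
which is nonnegative whether T' < T or T' > T (a consequence of ln x ≥ 1 − 1/x with x = T/T'). Heat flowing from a hot body into a cold reservoir still increases the entropy of the universe.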
Entropy -- can entropy only be calculated at constant temperature?
Is it possible to calculate the entropy change when dq = +100 J and the temperature changes from 271 K to 273 K?
The ice melts once it reaches 273 K; this is not an isolated system.
Or can entropy only be calculated at constant temperature?
Thank you
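Constant temperature is not required; it only makes the integral trivial. In general ΔS = ∫δQ/T. If the 100 J is absorbed with an (assumed) constant heat capacity over 271 K to 273 K, then C ≈ 100 J / 2 K = 50 J/K and
[tex]\Delta S = \int_{271}^{273} \frac{C\,dT}{T} = 50\ln\frac{273}{271} \approx 0.37\ \text{J/K};[/tex]
the melting itself then happens at the constant temperature 273 K, where ΔS = Q_latent/273.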
There have been some discussions here as to what type of processes create entropy rather than just move it around. It is established that a gradient of temperature can create entropy. However, the issue moved to partial pressure, and then even away from that.
The previous discussion...
"change" in entropy
While reading a textbook on introductory thermodynamics, I came across the following:
"When a system is in equilibrium, the entropy is maximum and the change in entropy ΔS is zero "
And also
"We can say that for a spontaneous process, entropy increases till it reaches a...
Hi there, I have to solve this problem:
Use the following data to estimate the molarity of a saturated aqueous solution of ##Sr(IO_3)_2##
So, I think I should use the Van't Hoff equation in some way, but I don't know how.
I also have:
##\Delta_r G=\Delta G^o+RT\ln K##
##K## is the...
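A sketch of the likely intended route, assuming the data are standard formation free energies for Sr(IO₃)₂ and its ions: at equilibrium Δ_rG = 0, so
[tex]\Delta G^{o} = -RT\ln K_{sp}, \qquad K_{sp} = [\text{Sr}^{2+}][\text{IO}_3^-]^2 = (s)(2s)^2 = 4s^3,[/tex]
and the molarity of the saturated solution is s = (K_sp/4)^{1/3}.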
Homework Statement
1 kg of water with specific heat (C) of 4180 J/(kg·K) is given at 0°C. It is taken to 100°C by two methods:
(i) by bringing it in contact with a reservoir at 100°C.
(ii) by bringing it in contact with a reservoir at 50°C , and then with another reservoir at...
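The point of the comparison, sketched with the numbers given (the water's own entropy change is the same either way, since S is a state function):
[tex]\Delta S_{\text{water}} = mc\ln\frac{373}{273} \approx 1305\ \text{J/K}.[/tex]
Method (i): ΔS_res = −4180 × 100/373 ≈ −1121 J/K, so ΔS_univ ≈ +184 J/K. Method (ii): ΔS_res = −4180 × 50/323 − 4180 × 50/373 ≈ −1207 J/K, so ΔS_univ ≈ +97 J/K. More intermediate reservoirs bring the process closer to reversible.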
From theory, we know that Boltzmann entropy for a given distribution, defined through a set of occupancy numbers {ni}, of the macrostate M, is given by:
S=k log(Ω{ni})
where omega is the number of microstates for the previously given set of occupancy number, {ni} . Assuming that the system...
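For completeness, the count of microstates for a fixed set of occupancy numbers {n_i} (distinguishable particles) is the multinomial coefficient, and Stirling's approximation turns the Boltzmann entropy into the familiar probabilistic form:
[tex]\Omega\{n_i\} = \frac{N!}{\prod_i n_i!} \quad\Rightarrow\quad S \approx -kN\sum_i p_i \ln p_i, \qquad p_i = \frac{n_i}{N}.[/tex]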
Entropy is a measure of energy available for work?
"Entropy is a measure of the energy available for work." Can someone explain this to me? Give some examples that show in what sense it is true. It has to come with a lot of caveats, provisos, etc., because it is simply not true on its face.
I...
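The precise statement behind the slogan is about lost work. For a system exchanging heat with surroundings at T₀, the work actually extractable falls short of the reversible ideal by
[tex]W_{\text{lost}} = T_0\,\Delta S_{\text{univ}} \ge 0[/tex]
(the Gouy-Stodola theorem), so entropy generation measures energy made unavailable for work relative to a reversible process, not energy destroyed.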
Homework Statement
I was asked to prove that (dP/dS)_T (subscript T, i.e., at constant temperature) equals κPV ("kappa"PV, or isothermal compressibility × pressure × volume).
By using the Maxwell relation (dS/dP)_T = -(dV/dT)_P I got an answer of -1/(alpha*volume) but cannot find out how to...
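For what it's worth, the Maxwell-relation route does give
[tex]\left(\frac{\partial P}{\partial S}\right)_T = \frac{1}{(\partial S/\partial P)_T} = \frac{1}{-(\partial V/\partial T)_P} = -\frac{1}{\alpha V},[/tex]
which has dimensions of temperature per volume, whereas κPV has dimensions of volume; the target expression as quoted may contain a typo worth re-checking against the original problem.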
How do I go from the entropy of a system, S(U,V,N), to its internal energy, U(S,V,N)?
For instance, for an ideal classical gas, we have
S=(3/2)N*R*ln(U/N) + N*R*ln(V/N) + N*R*c
where R is the Boltzmann constant, N is the particle number, V is the volume and "c" is a constant.
How can I...
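Algebraically you just solve for U. With the form above:
[tex]\frac{S}{NR} - \ln\frac{V}{N} - c = \frac{3}{2}\ln\frac{U}{N} \quad\Rightarrow\quad U(S,V,N) = N\left(\frac{N}{V}\right)^{2/3}\exp\!\left[\frac{2S}{3NR} - \frac{2c}{3}\right].[/tex]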
Hi PF,
I would like to implement different random number generators using AVR microcontroller (both PRNG and TRNG). So I would like to get suggestions about different sources of entropy for TRNG and algorithms for PRNG. Also wanted to test the randomness.
And what is chaos theory...
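As a sketch of one common pairing (prototyped in Python here; the same logic ports to C on an AVR): a xorshift PRNG plus von Neumann debiasing of a noisy hardware bit, such as the ADC LSB of a floating pin or timer jitter. The noise source below is a stand-in, not a specific AVR API:

```python
import random  # stands in for a hardware noise source on a PC

def xorshift32(state: int) -> int:
    """One step of the classic xorshift32 PRNG (Marsaglia)."""
    state ^= (state << 13) & 0xFFFFFFFF
    state ^= state >> 17
    state ^= (state << 5) & 0xFFFFFFFF
    return state & 0xFFFFFFFF

def noisy_bit() -> int:
    # Placeholder: on an AVR this could be the LSB of an ADC read
    # of an open pin, or jitter between watchdog and timer interrupts.
    return random.getrandbits(1)

def von_neumann_bit() -> int:
    """Debias a noisy bit stream: 01 -> 0, 10 -> 1, discard 00/11."""
    while True:
        a, b = noisy_bit(), noisy_bit()
        if a != b:
            return a

# PRNG usage: seed (ideally from the TRNG), then iterate
state = 0xDEADBEEF
for _ in range(5):
    state = xorshift32(state)
    print(state)

# TRNG usage: collect debiased bits into a byte
byte = 0
for _ in range(8):
    byte = (byte << 1) | von_neumann_bit()
print(f"debiased byte: {byte:#04x}")
```

For testing randomness, a simple first check is estimating the entropy of the output bit stream; standard suites like dieharder or NIST SP 800-22 go further.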
For a system that is completely isolated from its surroundings, basic thermodynamics requires that the quasi-static heat flux dQ and the entropy change dS be related by:
dQ = TdS
and since the system is isolated...
Given a system of multiple free electrons. Say 2 of the electrons accidentally collide and become joined (opposite spin) by the weak force. So, the positions of those 2 electrons are now correlated.
Was the total entropy of the system reduced by those 2 electrons joining?
Thank you...
A friend asks me this. If considering the equation: ∫\frac{dQ}{T}, then it is technically feasible to work out some forms of expressions with measurable physical quantities like temperature and specific heat, therefore it is possible to work out a precise value for entropy change. But is there a...
Homework Statement
Hello everyone. My problem is as follows: In a spontaneous process where two bodies at different temperatures T_{1} and T_{2}, where T_{1}>T_{2}, are put together until they reach thermal equilibrium. The number of atoms or molecules of the first is N_{1} and N_{2} for the...
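The post is truncated, but for the standard version of this problem (two bodies with the same constant per-particle heat capacity c, no reservoir), a sketch:
[tex]T_f = \frac{N_1 T_1 + N_2 T_2}{N_1 + N_2}, \qquad \Delta S = N_1 c\ln\frac{T_f}{T_1} + N_2 c\ln\frac{T_f}{T_2} > 0.[/tex]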
I am asking this question because I am trying to understand what entropy is and I just can't seem to get it clear.
Now I think the answer is 0.5.
The pressure of the can is 30.
the Atmospheric pressure is around 15.
You divide the pressure in the can by the atmospheric pressure and you...
Homework Statement
I have the equation
[tex]Z = \frac{1}{N!\,h^{3N}}\int \prod_i d^3q_i\, d^3p_i\, e^{-\beta H(q,p)}[/tex]
How can I get the entropy from this equation assuming a classical gas of N identical, noninteracting atoms inside a volume V in equilibrium at T where it has an internal degree of freedom with energies 0 and ε
What...
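A sketch of the standard route: factor Z into translational and internal parts and differentiate the free energy. For the two-level internal degree of freedom with energies 0 and ε,
[tex]Z = \frac{V^N}{N!\,\lambda^{3N}}\left(1 + e^{-\beta\varepsilon}\right)^N, \qquad S = \frac{\partial}{\partial T}\left(kT\ln Z\right) = k\ln Z + \frac{\langle E\rangle}{T},[/tex]
where λ = h/√(2πmkT) is the thermal de Broglie wavelength; the internal factor adds Nk[ln(1 + e^{−βε}) + βε e^{−βε}/(1 + e^{−βε})] to the Sackur-Tetrode result.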
Homework Statement
The problem that I was given as homework is the question "how do we reduce residual entropy experimentally?"
Homework Equations
the residual entropy occurs when the calculated molar entropy is greater than measured value, thus S bar calc - S bar exp = residual...
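A concrete example may help frame the answer: in CO each molecule can freeze in with two nearly degenerate orientations (CO vs OC), giving
[tex]S_{\text{residual}} \approx R\ln 2 \approx 5.76\ \text{J K}^{-1}\text{mol}^{-1};[/tex]
experimentally one reduces this by cooling slowly (annealing) or applying a field that biases the orientations, so the sample reaches its ordered ground state before the motion freezes out.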
Entropy is the measure of order and disorder. But who decides what is order and what is disorder? Isn't it a relative or subjective thing? How can it be defined in general, or can it be defined only for thermodynamic systems?
If 12007 kJ of heat is lost to the surroundings with an ambient temperature of 25 degrees centigrade during a cooling process, and the ambient temperature of the surroundings is unaffected by the heat addition, what is the entropy change of the surroundings?
If Δs=∫δQ/T, then Δs=ΔQ/T=12007...
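Since the surroundings act as a reservoir at a constant 25°C = 298.15 K, the integral collapses, as the post suggests:
[tex]\Delta S_{\text{surr}} = \frac{Q}{T_0} = \frac{12007\ \text{kJ}}{298.15\ \text{K}} \approx 40.3\ \text{kJ/K}.[/tex]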
Common extensive quantities such as mass, charge, and volume can be defined for general systems. I can imagine that we can measure and define them without any problem for any kind of complex system as well. However, I do not know the general definition of the entropy, only the thermodynamic...
Does entropy in a closed system always increase, or can it remain constant (in equilibrium)?
I have a friend arguing it is ALWAYS increasing.
His latest argument was, "if no energy enters or leaves an isolated system, the availability of the remaining energy decreases."
I'm wondering if there's an expression/correction for finding the entropy of a density using histograms of different bin sizes. I know that as the number of bins increases, entropy will also increase (given a sufficient number of data points), but is there an expression relating the two? All I...
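There is such a relation: for a smooth density and bin width Δ, the discrete (plug-in) histogram entropy approximates the differential entropy shifted by −log Δ, i.e. h ≈ H_disc + log Δ. A sketch of the check (function names are mine):

```python
import numpy as np

def histogram_entropy(samples: np.ndarray, n_bins: int) -> tuple[float, float]:
    """Discrete entropy of a histogram (nats) and the bin-width-corrected
    estimate of the differential entropy: h ~ H_disc + log(bin_width)."""
    counts, edges = np.histogram(samples, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    h_disc = -np.sum(p * np.log(p))
    width = edges[1] - edges[0]
    return h_disc, h_disc + np.log(width)

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)  # true differential entropy: 0.5*log(2*pi*e) ~ 1.4189
for bins in (16, 64, 256):
    h_disc, h_diff = histogram_entropy(x, bins)
    print(f"{bins:4d} bins: H_disc = {h_disc:.3f}, corrected h = {h_diff:.3f}")
```

H_disc keeps growing with the bin count, while the corrected estimate stays near the true value until the bins outrun the data.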
I'm a little confused about entropy and its increase along the arrow of time. My perception of entropy is that it is a measure of order in a system and that high entropy represents disorder, the final result given sufficient time.
From what I have read, the universe began with low and...
Why is the standard entropy of aqueous ions negative? I thought it could be no less than 0, which represents a perfect crystal at 0 K?
Is it negative just so that calculations can be performed properly? Or is it because ions in solution actually have less entropy than a perfect crystal?
Reading popular books (written by Hawking, Penrose, Greene, Linde, Guth and certainly many more) one finds numerous statements like
entropy was low after the big bang ... Weyl-curvature hypothesis ... entropy increases with time ... black holes violate unitarity and therefore entropy or phase...
Greetings,
I want to ask you something, to check whether I understood this subject well.
Let's say we have a first-order binary source.
H(a) = -P_{aa}*log(P_{aa}) - P_{ab}*log(P_{ab}) bits/symbol.
From what I understand, this is the average information of a symbol generated after an "a", like aa or ab.
Is it right?
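Yes: H(a) as written is the conditional entropy of the next symbol given that the previous one was "a". The entropy rate of the whole first-order (Markov) source then averages these over the stationary distribution π:
[tex]H = \pi_a H(a) + \pi_b H(b) \quad \text{bits/symbol}.[/tex]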
Assuming a mini-universe with the same laws as our current one.
A gas within that universe reaches a state of maximum entropy. Would it remain in that state of maximum entropy once it is reached? Maybe the question does not make much sense. In that case, forgive my ignorance.
edit: the...
Entropy of molten lead "freezing"
Lead melts at 327.5 °C. The latent heat of melting of lead is 24.1 J/g, and the heat capacity of solid lead is 0.14 J/(g·°C). You take 100 grams of molten lead at a temperature of 327.5 °C and pour it on the sidewalk. The lead freezes and then comes into thermal...
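The post is truncated, but presumably the lead then equilibrates with the sidewalk at ambient temperature; taking 25 °C ≈ 298 K for that (my assumption), a sketch of the lead's entropy change:
[tex]\Delta S_{\text{Pb}} = -\frac{mL}{T_m} + mc\ln\frac{T_{\text{amb}}}{T_m} = -\frac{100 \times 24.1}{600.65} + 100 \times 0.14 \times \ln\frac{298}{600.65} \approx -13.8\ \text{J/K},[/tex]
while the sidewalk gains entropy at its own lower temperature, so the total for the universe comes out positive.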
Homework Statement
I am having trouble working out the change in Entropy. The question is as follows:
A mass of 1 kg of air is initially at 4.8 bar and 150 degC and it is enclosed in a rigid container. The air is heated until the temperature is 200 degC.
Calculate the change in entropy...
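Since the container is rigid, the volume is constant and only c_v matters; taking c_v ≈ 0.718 kJ/(kg·K) for air (a standard value, not given in the post):
[tex]\Delta S = m\,c_v \ln\frac{T_2}{T_1} = 1 \times 0.718 \times \ln\frac{473.15}{423.15} \approx 0.080\ \text{kJ/K}.[/tex]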
Homework Statement
One body of constant-pressure heat capacity C_P at temperature T_i is placed in contact with a thermal reservoir at a higher temperature T_f. Pressure is kept constant until the body reaches equilibrium with the reservoir.
a) Show that the variation in the entropy of the...
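The requested result is presumably the pair of entropy changes and their nonnegative sum; a sketch:
[tex]\Delta S_{\text{body}} = C_P\ln\frac{T_f}{T_i}, \qquad \Delta S_{\text{res}} = -C_P\,\frac{T_f - T_i}{T_f}, \qquad \Delta S_{\text{univ}} = C_P\left[\ln\frac{T_f}{T_i} - 1 + \frac{T_i}{T_f}\right] \ge 0,[/tex]
the last step following from ln x ≥ 1 − 1/x with x = T_f/T_i.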
Hi everyone, I've hit a bit of a snag with part c of this problem (can't figure out how to invert a function T(ν)), so I'm starting to question whether I have the previous parts correct.
Homework Statement
Consider a system of N identical but distinguishable particles, each of which has a...
entropy = joules/kelvin
Suppose 1 liter of ideal gas is allowed to freely expand into a 2 liter volume in an isolated system.
The energy in the system would remain the same,
the temperature in the system would remain the same,
therefore if entropy = joules/kelvin, the entropy would remain the same...
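The units argument fails because entropy is not the ratio of the system's energy to its temperature; it is a state function that can change even when U and T do not. For the free expansion described:
[tex]\Delta S = nR\ln\frac{V_2}{V_1} = nR\ln 2 > 0,[/tex]
computed along a reversible isothermal path between the same end states, even though no heat flows in the actual irreversible expansion.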