Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. In 1862, referring to microscopic constitution and structure, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants of the modern International System of Units (SI).
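In its modern form, Boltzmann's logarithmic law is written
$$S = k_B \ln W,$$
where ##W## is the number of microstates compatible with the macrostate and ##k_B \approx 1.380649 \times 10^{-23}\,\text{J/K}## is the Boltzmann constant (an exact defining constant in the 2019 SI).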
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has since been proposed as a universal definition of the concept of entropy.
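As a small illustration of Shannon's measure (a sketch of my own, not from any of the threads below; the function name and example distributions are arbitrary):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47
```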
For this,
I don't understand how we can apply the change-in-entropy equation for each solid, since ##\frac{dT}{dt}## for each solid will be non-zero until the solids reach thermal equilibrium. My textbook says that the ##\Delta S## for a system undergoing a reversible process at constant...
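For reference, the usual resolution (a sketch, assuming a solid of mass ##m## and constant specific heat ##c##) is that entropy is a state function, so one integrates along an imagined reversible path even though the actual process is irreversible:
$$\Delta S = \int_{T_i}^{T_f} \frac{\delta Q_{rev}}{T} = \int_{T_i}^{T_f} \frac{mc\,dT}{T} = mc\ln\frac{T_f}{T_i},$$
applied to each solid separately with its own ##T_i## and the common final temperature ##T_f##.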
For this,
Why do they write the change in entropy equation as ##\Delta S = \frac{Q}{T}##? Would it not be better to write it as ##\Delta S = \frac{\Delta Q}{T}##, since it is clear that we are only concerned with the transfer of heat in our system while it remains at constant temperature as all...
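For what it's worth, the standard justification: heat is path-dependent, so one writes ##\delta Q## (or plain ##Q## for a finite amount) rather than ##\Delta Q##, and for a reversible isothermal process the constant ##T## comes out of the integral:
$$\Delta S = \int \frac{\delta Q}{T} = \frac{1}{T}\int \delta Q = \frac{Q}{T}.$$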
In the book "Cycles of Time" by Roger Penrose, there is a part of the explanation of entropy that I don't understand.
There are ##10^{24}## balls, half of which are red and the other half blue.
The model is to arrange the balls in a cube with ##10^{8}## balls on each edge.
It also divides the cube into...
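The counting Penrose has in mind can be made concrete with a scaled-down sketch (my own illustration; ##10^{6}## balls instead of ##10^{24}## to keep the numbers printable):

```python
import math

def log_multiplicity(n_total, n_red):
    """Natural log of C(n_total, n_red): the number of distinct
    ways to choose which positions hold the red balls."""
    return (math.lgamma(n_total + 1)
            - math.lgamma(n_red + 1)
            - math.lgamma(n_total - n_red + 1))

# Scaled-down toy version of Penrose's example: 10**6 balls, half red.
n = 10**6
print(log_multiplicity(n, n // 2))  # ~693140, close to n*ln(2) ≈ 693147
```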
In classical statistical physics, entropy can be defined either as Boltzmann entropy or Gibbs entropy. In quantum statistical physics we have von Neumann entropy, which is a quantum analog of Gibbs entropy. Is there a quantum analog of Boltzmann entropy?
I came across the following statement from the book Physics for Engineering and Science (Schaum's Outline Series).
I cannot seem to find a satisfactory answer to the questions.
Is the statement in the above screenshot about entropy change the statement of the Second Law of Thermodynamics, or is...
Entropy question.
Take a finite number of identical atoms in a specific volume of space at a moment of time.
Run two thought experiments on this system, under the following scenarios (both time-independent):
1: expand the volume of space of the system instantaneously by a factor of 10. The fixed number of atoms...
Unfortunately, I have problems with the following task
For task 1, I proceeded as follows. Since the four bases have the same probability, this is ##P=\frac{1}{4}##. I then simply used this probability in the formula for the Shannon entropy...
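For a quick sanity check (assuming base-2 logarithms, as is conventional for sequence data): with four equiprobable bases,
$$H = -\sum_{i=1}^{4} p_i \log_2 p_i = -4\cdot\frac{1}{4}\log_2\frac{1}{4} = 2\ \text{bits per base}.$$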
Two systems A & B (with orthonormal basis ##\{|a\rangle\}## and ##\{|b\rangle\}##) are uncorrelated, so the combined density operator ##\rho_{AB} = \rho_A \otimes \rho_B##. Assume the combined system is in a pure state ##\rho_{AB} = |\psi \rangle \langle \psi |## where ##|\psi \rangle =...
Hi,
Unfortunately I am not getting anywhere with task three; I don't know exactly what to show.
Shall I now show that from ##S(T,V,N)##, using a Legendre transformation, I then get ##S(E,V,N)## and thus obtain the Sackur-Tetrode equation?
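For reference, the target expression in one common form (assuming a monatomic ideal gas of ##N## particles of mass ##m## with total energy ##E##) is
$$S(E,V,N) = N k_B\left[\ln\!\left(\frac{V}{N}\left(\frac{4\pi m E}{3 N h^2}\right)^{3/2}\right) + \frac{5}{2}\right];$$
one common route is simply to eliminate ##T## using ##E = \frac{3}{2}N k_B T## rather than performing a full Legendre transformation.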
Hi,
Unfortunately, I have problems with task 4.
In task 3 I got the following:
$$ T_f = T_i e^{\Delta S_i / c_i} $$
Then I proceeded as follows:
$$ \Delta S = \Delta S_1 + \Delta S_2 $$
$$ \Delta S = c_1\ln\!\left(\frac{T_i e^{\Delta S_i / c_i}}{T_1}\right) + c_2\ln\!\left(\frac{T_f}{T_2}\right) $$
$$ \Delta S...
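For comparison (a sketch, assuming two bodies with constant heat capacities ##c_1, c_2## that exchange heat only with each other): energy conservation fixes the final temperature, and the total entropy change is
$$T_f = \frac{c_1 T_1 + c_2 T_2}{c_1 + c_2}, \qquad \Delta S = c_1\ln\frac{T_f}{T_1} + c_2\ln\frac{T_f}{T_2} \ge 0.$$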
For now it is only about task 1.
If the task states that:
You can approximate that their dynamics in water resembles that of an ideal gas.
Does it then mean that I can treat glucose as an ideal gas and then simply calculate the entropy for the ideal gas?
For a freely expanding ideal gas (irreversible transformation), the change in entropy is the same as in a reversible transformation with the same initial and final states. I don't quite understand why this is true, since Clausius' theorem only has this corollary when the two transformations are...
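Concretely (standard result, stated here for orientation): because ##S## is a state function, one evaluates the free expansion of ##n## moles from ##V_i## to ##V_f## along a reversible isothermal path connecting the same end states, giving
$$\Delta S = \int \frac{\delta Q_{rev}}{T} = nR\ln\frac{V_f}{V_i} > 0,$$
even though the actual free expansion absorbs no heat at all.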
Summary: doesn't this decrease entropy?
Cellulose is known for its hydrophilic quality, which can be explained from the polarity of its hydroxyl groups.
We all know water can overcome the force of gravity by rising through a piece of paper placed in the water.
Correct me if I'm wrong but this is a...
If you were to condense all the energy in the universe into a point, wouldn't the temperature be very high, yet the entropy be very low? Also if you were to spread out all of the energy in the universe, wouldn't the temperature be near zero and the entropy be very high? And this makes entropy...
Summary: Trying to understand the relationship between gravity, thermodynamics and entropy, thank you.
Gravity can take a diffuse cloud of gas filling a given volume of space at equilibrium density and temperature, and turn it into a burning star surrounded by empty space. Does this mean that...
Christoph Schiller, "From maximum force to physics in 9 lines -- and implications for quantum gravity" arXiv:2208.01038 (July 31, 2022).
This paper asserts that nine propositions can be used to derive the Standard Model and GR and can point the way to quantum gravity, although he cheats a bit...
Quantum gates must be reversible.
The usual justification for this is that in QM the time evolution of a system is a unitary operator which, by linear algebra, is reversible (invertible).
But I am trying to get a better intuition of this, so I came up with the following explanation:
In order to...
Now, it's been said that the majority of the entropy in the universe resides within the cumulative entropy of black holes inside the universe. How do they know that?
Now, I'm not so interested in how they determine the black hole's entropy, I know there's a relatively simple formula for that...
The Boltzmann entropy definition is given by $$ S = k_B \ln W $$ where ##W## is the weight of the configuration which has the maximum number of microstates.
This equation is used everywhere in statistical thermodynamics and I saw it in the derivation of Gibbs entropy. However, I can't find the...
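One connection worth noting (a standard step, sketched here): the Gibbs entropy
$$S = -k_B \sum_i p_i \ln p_i$$
reduces to the Boltzmann form when all ##W## accessible microstates are equally probable, since ##p_i = 1/W## gives ##S = k_B \ln W##.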
There are two definitions of entropy: one about the degree of randomness and one about energy that is not available to do work. What is the relationship between them?
If the Universe could somehow reach a state of infinite entropy (or at least a state of extremely high entropy), would all fundamental symmetries of the physical laws (gauge symmetries, Lorentz symmetry, CPT symmetry, symmetries linked to conservation principles...etc) fail to hold or be...
I was reading about the thermodynamics postulates when I came across the differential fundamental equation:
I understand that the second element is just pressure and the last element is chemical energy, but the problem is I don't understand what the use of entropy is and how it contributes to a...
Hey to all,...
It is now generally believed that information is preserved in black-hole evaporation.
This means that the predictions of quantum mechanics are correct whereas Hawking's original argument that relied on general relativity must be corrected.
However, views differ as to how...
Is there any approach in any books out there, where we consider that in universe exists only one field, let it be called the Unified Field (UF), in which all of the known fields (gravitational, EM field, quark field, gluon field, lepton field, Higgs Field, e.t.c.) are just components (pretty...
Hi Pfs,
There are different kinds of entropies.
I discovered free entropy.
https://arxiv.org/pdf/math/0304341.pdf
The second law says that the total entropy cannot decrease as time goes by.
Is it always the same "time" for the different entropies?
The author, Voiculescu, wrote articles...
I am a software trainer and know about as much Physics as I've been able to pick up from the BBC's occasional documentaries.
I have never even taken a physics course in high school or college so I am an absolute lay-person here with what’s probably a very lay-person question…
I was watching...
I think the solution is:
$$dU=\delta W_{prop}$$
$$dU=TdS-PdV$$
$$dV=0$$
then, $$TdS=\delta W_{prop}$$ and so $$dS=dU/T$$
and, by the way, is it correct to say that, if the transformation between the initial and the final state were to happen in a reversible way, then the heat transfer could be...
I have a rather general question about the definition of entropy used in most textbooks:
##S = k \ln \Omega##, where ##\Omega## is the number of available microstates.
Boltzmann wrote ##W## rather than ##\Omega##, and I believe this stood for probability (Wahrscheinlichkeit).
Obviously this is not a number between 0 and 1, so...
One of the claimed successes of string theory is its ability to derive the correct Hawking-Bekenstein equations to calculate the quantum entropy of a black hole without any free parameters, specifically extremal black hole entropy using supersymmetry and maximal charge.
I was wondering if...
I have a question about the Thermodynamic Identity.
The Thermodynamic Identity is given by
$$dU = T\,dS - P\,dV + \mu\,dN.$$
We assume that the volume V and that the number of particles N is constant.
Thus the Thermodynamic Identity becomes
$$dU = T\,dS.$$
Assume that we add heat to the system (we see that...
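Spelling out the step the thread is heading toward (a sketch): with ##V## and ##N## fixed, the added heat is the only contribution to ##dU##, so ##\delta Q = dU = T\,dS##, i.e. ##dS = \delta Q / T## for this reversible heating.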
Hello
Sorry for my English...
We slowly (in a quasi-reversible way) bring an electric charge close to a glass of salt water.
Some ions rearrange themselves in the glass.
What can we say about the entropy of this transformation?
Bernadette
PS: My reflection comes from reading an old physics book...
Let's say you have a very dirty small room and a giant clean library (lots of organized books), and let's say these occupy the same number of microstates. The entropy according to this equation is the same for the library and the room. But one is more ordered than the other one. How does it...
In a certain thermodynamics textbook, the specific work done by an isentropic compressor/pump in an ideal Rankine cycle is given by the following:
##W_{pump} = h_2 - h_1##
##W_{pump} = v(P_2 - P_1)##, where ##v = v_1##
When I carry out these two calculations between any two states, I get vastly different answers...
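A likely resolution (sketch, assuming the two states are not both in the compressed-liquid region): for a reversible (isentropic) steady-flow device, ##dh = T\,ds + v\,dP = v\,dP##, so
$$w_{pump} = h_2 - h_1 = \int_1^2 v\,dP \approx v_1(P_2 - P_1)$$
only when ##v## is nearly constant, i.e. for a liquid. Applied between arbitrary states (e.g. involving vapor, where ##v## changes strongly with ##P##), the two expressions will indeed disagree badly.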
As we know, a dipole can only be aligned either parallel or anti-parallel to an applied magnetic field ##\vec{H}##. If we use the quantum mechanical description, then parallel magnetic dipoles will have energy ##\mu H## and anti-parallel magnetic dipoles energy ##-\mu H##...
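For orientation (standard counting, sketched under the stated two-level assumption): for ##N## independent dipoles of which ##n## are parallel, the multiplicity and entropy are
$$W = \binom{N}{n}, \qquad S = k_B \ln\binom{N}{n}.$$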
So I am midway through my Thermodynamics course in college and still feel a bit unsure about the 2nd law and entropy.
I've learned that the 2nd law is stated 3 different ways, and by contrapositive proofs we can determine they are all equivalent. What we end up getting for the 2nd law is...
Hello everybody, would somebody please put me on the right track to answering this question?
'Consider water undergoing a heat transfer at a constant pressure of 10 MPa, changing from liquid to steam. Find the entropy of the system (##s_{fg}##) as well as the heat transfer per unit mass in this...
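The relations needed (standard for a constant-pressure, constant-temperature phase change; numerical values come from steam tables, not reproduced here):
$$q = h_{fg}, \qquad s_{fg} = \frac{h_{fg}}{T_{sat}},$$
with ##T_{sat}## the saturation temperature at 10 MPa expressed in kelvin.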
In textbooks, the Bekenstein-Hawking entropy of a black hole is given as the area of the horizon divided by 4 times the Planck length squared. But the corresponding base of the logarithm and exponential is not written out explicitly. Rather, one can often see drawings where such an elementary area...
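Written out explicitly (natural-logarithm convention, consistent with ##S = k_B \ln W##):
$$S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = k_B\,\frac{A}{4\,\ell_P^2}, \qquad \ell_P = \sqrt{\frac{\hbar G}{c^3}};$$
converting to bits introduces a factor of ##\ln 2##, i.e. ##N_{bits} = A/(4\,\ell_P^2 \ln 2)##.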
If the universe was very hot right after the Big Bang, how come the entropy of the universe was lower at that point than it is now? Isn't heat a reason for higher entropy?
Summary:: An elementary example calculation involving entropy in a textbook seems wrong
I was reading an elementary introduction to entropy and the second law of thermodynamics. The book gave the example of a gas in a chamber suddenly allowed to expand into an additional portion of the...
Per the Heisenberg uncertainty principle, a particle does not have a precisely defined location. Does such uncertainty contribute to the transfer of thermal energy (i.e. entropy)? Is uncertainty the primary means for the transfer of thermal energy at the quantum level?
Homework Statement:: Why is the entropy of a closed system constant in a reversible process, and not related by ##\Delta S = \int_{i}^{f}\frac{dQ}{T}## (See below for the question in more details)
Relevant Equations:: ##\Delta S = \int_{i}^{f}\frac{dQ}{T}##
I am reading chapter 24 of Physics...
Wasn’t sure whether I should post this here since it’s a more qualitative question, or under the Thermodynamics thread because that’s a more specific topic.
For all practical purposes, the laws of thermodynamics are inviolable, and statistical mechanics puts them on an even firmer theoretical...
Hi,
starting from this thread, Question about entropy change in a reservoir, consider the spontaneous irreversible process of heat transfer from a source ##A## at temperature ##T_h## to another source ##B## at temperature ##T_c## (##T_h > T_c##). The thermodynamic 'system' is defined from sources...
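For a finite quantity of heat ##Q## passed between the two sources, the bookkeeping is (sketch, treating each source as an ideal reservoir at fixed temperature):
$$\Delta S_{total} = -\frac{Q}{T_h} + \frac{Q}{T_c} = Q\,\frac{T_h - T_c}{T_h T_c} > 0.$$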
I always got a bit confused when listening to podcasts about arrow of time and entropy in the universe. So I was reading more about information theory. I learned today that for physical systems entropy is not defined. All it means is how much uncertainty an observer has when making a prediction...
Through an intriguing fictitious dialog between Sadi Carnot and Robert Stirling, Prof. Israel Urieli of Ohio University shows that it is not required to invoke entropy, the second law of thermodynamics, and the Carnot cycle with the [ideal] adiabatic processes in order to find out the...
In chemical reactions generally ΔG < 0, but if we were to consider a reversible path between pure reactants and products at 1 bar pressure, shouldn't ΔG = 0 for every reaction? And if it is due to non-PV work, I don't see any non-PV work being done in reactions happening in a closed...
I know the math behind these, and I'm happy to use more precise language if needed, I just wanted to get some input on this sweeping generalization that entropy is the conversion of potential to kinetic energy.
A brief summary of two important branches of entropy:
1) thermodynamics - the total...
I have been reading about protein thermodynamics and found different types and models for entropy calculation before and after protein folding. I understand Vibrational, conformational, configurational entropy are some of the most studied "types" of protein folding entropy.
My question is...