Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
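In modern notation, Boltzmann's logarithmic law reads
$$S = k_B \ln W,$$
where ##W## is the number of microstates compatible with the macroscopic state and ##k_B## is the Boltzmann constant.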
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
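In Shannon's formulation, the entropy of a discrete source emitting symbols with probabilities ##p_i## is
$$H = -\sum_i p_i \log_2 p_i$$
bits per symbol, which has the same functional form as the Gibbs entropy of statistical mechanics up to the factor ##k_B##.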
Does entropy increase when two protons collide at moderate velocity? Is the momentum of one fully transferred to the other? Is the vector coming in more certain than the vector going out after the event? I guess the answer might invoke the uncertainty principle but is there some certainty with...
Before we prove this, consider a thought experiment.
We have the following setup
We break the left partition so that the gases on the left mix.
What happens next is that due to a chemical potential difference, gas flows from the right compartment to the mixture.
Note that
- the partial...
I've been working on this problem for the past 3 days. I have other papers with different ways of tackling the problem. However, I just cannot get to the answer (change in entropy: ##\Delta S = 2N\log(2)##).
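For reference, a minimal numeric sketch of the target result, assuming two distinct ideal gases with ##N## molecules each that both double in volume on mixing (entropy in units of ##k_B##):

```python
import math

# Entropy of mixing, in units of k_B, for two distinct ideal gases with
# N molecules each at equal temperature and pressure: each gas doubles
# its volume, contributing N*ln(2), so Delta_S = 2*N*ln(2).
def mixing_entropy(N):
    return 2 * N * math.log(2)

print(mixing_entropy(1))  # ~1.386, i.e. 2*ln(2), per pair of molecules
```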
We define ##dA=dU+P_0dV-T_0dS \leq 0##. In my notes it says if you fix pressure and entropy, ##dA=dH##. I don't get this, because at constant P and S, I get ##dA=dU+P_0dV##. It seems that somehow, ##P_0=P##. Is this correct, or am I missing something?
Second question about this:
If ##T_0=T##...
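For the first question, a sketch of the standard argument (assuming the system pressure equals the constant environment pressure, ##P=P_0##): at constant ##S## the ##-T_0dS## term drops, so
$$dA = dU + P_0dV = dU + PdV = dH - VdP = dH$$
at constant ##P##.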
I have an issue with (b). What I did was simply integrate ##dS##. It's a perfect gas, so $$\left(\frac{\partial E}{\partial T}\right)_V=NC_V$$ and $$\left(\frac{\partial E}{\partial V}\right)_T=0$$ Next I used the relation ##PV=NkT## to get ##\frac{P}{T}=\frac{Nk}{V}##, and after...
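Putting those pieces together, the standard result of this integration is (a sketch, assuming constant ##C_V##):
$$dS = \frac{NC_V}{T}dT + \frac{Nk}{V}dV \implies \Delta S = NC_V\ln\frac{T_2}{T_1} + Nk\ln\frac{V_2}{V_1}.$$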
I tried two different methods when solving this question; they give different answers, and I have no idea why.
Method 1:
Using the Maxwell relation of $$\left(\frac{\partial S}{\partial V}\right)_T=\left(\frac{\partial P}{\partial T}\right)_V=\frac{R}{V-b}$$ then integrating it, I get $$\Delta S = \int_i^f\frac{R}{V-b}dV$$...
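Carrying that integration through gives the standard van der Waals result (a sketch, assuming constant ##b##):
$$\Delta S = \int_{V_i}^{V_f}\frac{R}{V-b}\,dV = R\ln\frac{V_f-b}{V_i-b}.$$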
Answer for (d) is 0, answer for (e) is not.
Firstly, I don't get why (e) is not zero. It says "the same expansion", so that expansion is reversible. Reversible process -> ##\Delta S = 0##?
Secondly, part (e) seems to be the exact same as (d) so I'm not sure why it's different!
Thanks in advance
Okay, I agree with this logic. However, if we consider a reversible section first, then an irreversible section, I get the following:
$$\frac{dQ_{rev}}{T} \leq \frac{dQ}{T} $$ which is the opposite to equation (14.8). Why is this? Is it "somehow" not viable to think of a reversible section than...
I wrote a homage to Asimov's story "The Last Question".
I tried to use modern insights on the topics that were touched upon in the original story.
Those are, amongst others, Penrose's CCC, Carroll's suggested relation between entropy and time, and several others.
I wonder if I...
Hello,
My name is Mason C. Turner and I work in the cybersecurity field. My background includes both military communications as well as private sector experience.
According to the laws of physics, to the best of my understanding information and energy are directly interchangeable in a...
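A standard reference point here (not stated in the excerpt) is Landauer's bound, which links information to energy: erasing one bit of information dissipates at least
$$E_{\min} = k_B T \ln 2$$
of energy as heat at temperature ##T##.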
Consider the problem of calculating the entropy change when we mix two ideal gases.
Here is the setup
The initial state consists of two ideal gases separated by a partition.
We remove the partition and the gases diffuse into each other at constant temperature and pressure.
This is an...
Since the energy variation is zero:
$$
\Delta U = \Delta U_{1} + \Delta U_{2} = 0
$$
The energy for a monatomic ideal gas is ## u = CRT##, and the energy for a Van der Waals gas is
$$
u = CRT - \frac{a}{v},
$$
obtained through
$$
\frac{1}{T} = \frac{CR}{u + \frac{a}{v}}.
$$
Summing the...
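For reference, the van der Waals energy quoted above also follows from the standard identity (a step the excerpt leaves implicit):
$$\left(\frac{\partial u}{\partial v}\right)_T = T\left(\frac{\partial P}{\partial T}\right)_v - P = \frac{a}{v^2} \implies u = CRT - \frac{a}{v},$$
using ##P = \frac{RT}{v-b} - \frac{a}{v^2}## and the ideal-gas limit to fix the integration constant.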
Then
$$q_{irrev}=0\tag{1}$$
Take the system from state 2 back to state 1 using a reversible process B.
My first question is: why can the system not be isolated for this reversible process to be possible?
Assume we have a non-isolated system in process B.
Process A and process B together...
Here is a way to solve the problem.
Since ##dq_1=-dq_2## then
$$\int_{T_1}^T C_PdT=-\int_{T_2}^T C_PdT\tag{1}$$
$$\implies T=\frac{T_1+T_2}{2}\tag{2}$$
$$dq_1=C_PdT\tag{3}$$
$$dS_1=\frac{dq_1}{T}=\frac{C_P}{T}dT\tag{4}$$
$$\Delta...
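A quick numeric check of this bookkeeping (a sketch with hypothetical values: ##C_P = 75.3\,\mathrm{J/K}##, roughly one mole of liquid water, ##T_1 = 280\,\mathrm{K}##, ##T_2 = 360\,\mathrm{K}##):

```python
import math

# Two samples exchange heat at constant pressure until they meet at
# T = (T1 + T2)/2; total entropy change is C_P*ln(T/T1) + C_P*ln(T/T2).
C_P, T1, T2 = 75.3, 280.0, 360.0   # hypothetical values
T = (T1 + T2) / 2
dS_total = C_P * math.log(T / T1) + C_P * math.log(T / T2)
print(dS_total)  # ~1.19 J/K, positive as the second law requires
```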
When I was taught about temperature in high school, I was told that substances that are hot have molecules that move fast, while substances that are cold have molecules that move slowly. I was also told that everything moves towards greater disorder or entropy. This is apparently because there...
If a process is irreversible, on the other hand, then
$$\oint \frac{\delta q}{T}\leq 0=\oint dS\tag{1}$$
Apparently, from this equation we can conclude that
$$dS \geq \frac{\delta q}{T}\tag{2}$$
How do we mathematically justify the step from (1) to (2)?
Next, consider an isolated system...
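For what it's worth, the standard route from (1) to (2) is a sketch along these lines: join the irreversible path ##1\to 2## to a reversible return path ##2\to 1## to form a cycle, so that
$$\int_1^2\frac{\delta q}{T} + \int_2^1\frac{\delta q_{rev}}{T} \leq 0 \implies \int_1^2\frac{\delta q}{T} \leq S_2 - S_1,$$
and taking the path ##1\to 2## infinitesimally short gives ##\delta q/T \leq dS##.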
My understanding of Boltzmann's H-theorem is that if a set of a large number of colliding balls is not in thermodynamic equilibrium (i.e. the probability distribution function W doesn't obey the Maxwell distribution), its entropy will grow (without supplying heat) until the equilibrium...
After re-reading the book, I did figure out what I was supposed to do. Take both waters through a series of reservoirs to bring them down to their final temperature while allowing for a quasi-static process. Thus, $$\Delta S = m_1c \int_{T_1}^{T*} \frac{dT}{T} + m_2c \int_{T_2}^{T*}...
I am using the symbol ##\delta## in ##\delta q_{rev}## and ##\delta w## to denote an inexact differential.
$$\delta q_{rev}=C_VdT+\frac{nRT}{V}dV$$
We can turn this inexact differential into an exact differential by multiplying by the integrating factor ##\frac{1}{T}##.
$$\frac{\delta...
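Carrying that multiplication through (a sketch, assuming constant ##C_V##):
$$\frac{\delta q_{rev}}{T} = \frac{C_V}{T}dT + \frac{nR}{V}dV = dS,$$
which is exact because ##\partial(C_V/T)/\partial V = 0 = \partial(nR/V)/\partial T##, so the mixed partial derivatives agree.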
My doubts are about the second question above, i.e. the irreversible expansion.
For the first question, we have
a)
$$dS=\frac{dq_{rev}}{T}=\frac{nR}{V}dV$$
$$\implies \Delta S=nR\ln{\frac{V_2}{V_1}}=2.88\mathrm{\frac{J}{K}}$$
b)
$$q_{rev}=T\Delta S=298.15\text{K}\cdot...
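Completing the arithmetic that the excerpt cuts off is a one-liner (a sketch using the values quoted above):

```python
# q_rev = T * Delta_S for the reversible isothermal expansion in (b)
T, Delta_S = 298.15, 2.88   # K, J/K (values quoted in the excerpt)
print(T * Delta_S)          # ~858.7 J
```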
Let's consider the book to be our system.
The book spontaneously absorbs heat from the surroundings and somehow converts this to gravitational potential energy.
Assuming gravitational potential energy is zero at the table top, the potential energy at ##3.2\text{cm}## above the table is...
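For the arithmetic, a sketch with a hypothetical book mass (the excerpt's value is cut off):

```python
# Gravitational potential energy gained by the book, U = m*g*h.
m = 1.0      # kg, hypothetical book mass (not given in the excerpt)
g = 9.81     # m/s^2
h = 0.032    # m, i.e. 3.2 cm above the table top
print(m * g * h)  # ~0.31 J
```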
Here is how I did this problem
Let's call the two samples sample 1 and sample 2.
The change in entropy for sample 1 is
$$\Delta S_1=\int dS_1=\int_{U_1}^{U_1+\Delta U}\frac{1}{T_1}dU\tag{1}$$
$$=\frac{1}{T_1}\Delta U\tag{2}$$
Similarly, ##\Delta S_2=-\frac{1}{T_2}\Delta U##.
Note that I...
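Adding the two contributions gives the usual second-law check (a sketch, with heat ##\Delta U > 0## flowing into sample 1 from the hotter sample 2):
$$\Delta S = \Delta S_1 + \Delta S_2 = \left(\frac{1}{T_1}-\frac{1}{T_2}\right)\Delta U \geq 0 \quad \text{when } T_2 \geq T_1.$$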
Hi, as in a previous thread, I would like to better understand Feynman's analysis of the Brownian ratchet as described here:
https://www.feynmanlectures.caltech.edu/I_46.html
https://en.wikipedia.org/wiki/Brownian_ratchet
Consider the case in which the two boxes (i.e. heat baths) are at the same...
Hi, suppose we have a resistor at a given temperature T connected through a diode to a battery cell.
The voltage across the resistor due to thermal noise should charge the cell, converting thermal energy into chemical energy without limits.
Does the above process violate the second law of...
I'd like to check if my reasoning is right here and that the numerical factors in the final result are correct. The disks occupy an effective area ##A = (L_{\mathrm{box}}-2r)^2##, where ##L_{\mathrm{box}}## is the side of the box, excluding the region of width ##r## at the boundary. The area available to the ##n##th disk is then ##A_n = A - 4\pi...
So I had to find the change in entropy of the system in a reversible isothermal process.
$$T\Delta S_{sys.}=Q\implies \Delta S_{sys.}=nR\ln\left(\frac{V_2}{V_1}\right)$$
This was good because for isothermal process ##\Delta U=0\implies Q=W##
Then I read this
Throughout an entire reversible process, the...
We know that there is no conservation law for entropy. It is quite the contrary: if we have a closed system without exchange of heat, the entropy cannot decrease; it will approach its maximum. If we have not a closed system but only a stream of entropy into a system, the entropy will increase...
I've never had any physics class before, so please bear with me on my lack of understanding.
I've been thinking about gravity and its relation to entropy lately and was wondering if my thinking is correct.
Entropy seems to be an opposing force to gravity, where gravity creates gradients...
I'm studying how to compute excess entropy in molecular dynamics (MD). I've found that one needs to compute the two-body correlation function (neglecting higher-order terms); the details can be found, for example, in this article.
So the definition of correlation function (CF for short) is
##C(t...
Entropy reduction and quantum phenomena can occur microscopically, but a macroscopic entropy reduction by chance is absolutely impossible. If a macroscopic object's wave function collapses due to measurement, does that mean that the macroscopic object will never be able to cause quantum phenomena? Even in...
Q: Why is the entropy change of this steady-flow open system not equal to zero?
My idea is as represented by the following equation.
$$
\frac{dS_{sys}}{dt}=0,\,\,\,\,dt\ne 0
$$
$$
\therefore dS_{sys}=0\quad\quad\therefore \Delta S_{sys}=\Delta S_{air}=0
$$
$$
\therefore...
As far as I know, entropy could be reversed via the Poincaré recurrence theorem if the universe had a finite horizon given by some amount of vacuum energy causing an accelerating expansion.
However, I found a lecture by Leonard Susskind where he describes a way through which the vacuum could decay into...
In the far future there will most likely be a point where a maximal state of entropy is reached in the universe, and after the last black hole evaporates there could be no more structures, and no more work could be done.
According to the Poincaré recurrence theorem for a closed universe...
If we have a kilogram of something that is 100 million degrees Celsius and can controllably use this heat somehow, we can sustain life, grow crops, and drive steam engines, and with these we could build a whole city like New York; we can create a lot of mass with very low entropy, things that are very...
My studies relate to construction engineering and environmental improvements, and I have a passion for combinatorics and the exact sciences. I always keep up with the novel things that pop up in science-related media. I don't like it when people turn science findings into politics.
I'm the...
Is entropy real? It seems like it's not real because it depends on how you group microstates together into a macrostate, and the way you group them can be arbitrary. For example (at 13:04 of the video below), there are 91,520 microstates in the macrostate “9 in left; 1 in right” but 627,264...
In a discussion about the (change in the) Helmholtz potential being interpretable as the maximum available amount of work for a system in contact with a thermal reservoir (i.e. the free energy), Callen seems to insist this fact is true only for reversible processes. Why should this be? I...
In his classic textbook, Callen remarks that
I have labelled the claims (1) and (2). I am not sure about either. For the first, I have tried to proceed as follows (all equations are from Callen's second edition and all 0 subscripts are with respect to some reference state of an ideal gas):
I...
In Chapter 5 of his famous textbook on thermodynamics, Callen argues for the "equivalence" of the maximum entropy (Max-Ent) principle and the minimum energy (Min-En) principles. I quote from Callen first:
As far as I know (though Callen never makes this explicit in what, I think, represents...
I am continuing to try to understand maximum work reversible processes (and a subset thereof -- Carnot cycles) better. I am here curious about the following system.
My question is about how I can know/prove that there exists a way to take the gas (the primary subsystem) reversibly with respect...
This question was, effectively, asked here (please refer to that question for additional context); however, I don't think the given answer is correct (or at least complete) despite my having added a bounty and having had a productive discussion with the answerer there. In particular, I don't...
Hello everyone,
I am seeking some clarification regarding a question related to thermodynamics and statistical mechanics. My understanding is that when we combine two identical boxes with the same ideal gas by removing the wall between them, the resulting system's entropy stays the same...
The Bekenstein bound places an upper limit on the amount of entropy that a given volume of space may contain.
This limit was described by Jacob Bekenstein who tied it quite closely to the Black Hole Event Horizon.
Put simply, black holes hold the maximum entropy allowed for their volume. If you...
What does entropy mean in the following sentence? Does it mean the same as the term "information content" before it? Is entropy a more technical term than information content?
He remembered taking a class in information theory as a third-year student in college. The professor had put up two...
The starting point is the identity
$$\left(\frac{\partial u}{\partial T}\right)_n = T\left(\frac{\partial s}{\partial T}\right)_n.$$
I then try to proceed as follows:
Integrating both with respect to ##T## after dividing through by ##T##, we find
$$ \int_0^T \left(\frac{\partial s}{\partial...
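In standard form this integration gives (a sketch, assuming the third-law convention ##s(0)=0## and writing ##c_n \equiv (\partial u/\partial T)_n## for the heat capacity):
$$s(T,n) = \int_0^T \frac{c_n(T')}{T'}\,dT'.$$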
Dear everyone, I wish to discuss in this thread a classical/semi-classical interpretation of the origin of Bekenstein-Hawking entropy and the related resolution of Hawking's information loss puzzle, which were published in Nucl.Phys.B977 (2022) 115722 and Nucl.Phys.B990 (2023) 116171 after...
Hi everyone!
It's about the following task:
Calculate the molar entropy of H2O(g) at 25°C and 1 bar.
θ_rot = 40.1 K, 20.9 K, 13.4 K
θ_vib = 5360 K, 5160 K, 2290 K
g_0,el = 1
Note for the translational part: ln(x!) ≈ x ln x − x
Can you explain to me how to calculate this?
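For the translational part, a minimal sketch using the Sackur-Tetrode equation (assuming ideal-gas H2O at 298.15 K and 1 bar; the rotational and vibrational contributions from the given θ values would be added to this):

```python
import math

# Sackur-Tetrode molar translational entropy of an ideal gas.
k = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34  # Planck constant, J*s
NA = 6.02214076e23  # Avogadro constant, 1/mol
R = k * NA          # molar gas constant, J/(mol*K)

def s_trans(M, T, p):
    """Molar translational entropy; M in kg/mol, T in K, p in Pa."""
    m = M / NA                                    # mass of one molecule
    lam = h / math.sqrt(2 * math.pi * m * k * T)  # thermal de Broglie wavelength
    V = R * T / p                                 # molar volume
    return R * (math.log(V / (NA * lam**3)) + 2.5)

print(s_trans(18.015e-3, 298.15, 1e5))  # ~145 J/(mol*K) for H2O(g)
```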
I would like to calculate the entropy or the enthalpies (vapor, specific, and internal energy) using the SRK [Soave-Redlich-Kwong] equation, the Wilson approximation, and (if necessary) the Antoine equation and the Clausius-Clapeyron equation, for a mixture of 0.199 mol/l nitrogen and 0.811 mol/l carbon...
Hello,
Is someone able to explain why these two are wrong? I am not sure how to figure out the direction of the enthalpy change, as the reaction is not changing the state of matter, nor is it changing temperature.
(Please solve without calculating anything)
Thank you
The FAQ by @bcrowell cites an explanation by physics netizen John Baez as to how entropy rises when a star loses heat and contracts. However, the linked explanation falls short of describing the key role that gravity must be playing. The FAQ by @bcrowell discusses why a low-entropy state of...