Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as the system always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this entity of missing information, in an analogous manner to its use in statistical mechanics, entropy, and gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
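The parallel is easy to make concrete. A minimal sketch (the eight-outcome example and all names here are my own): for ##W## equally likely outcomes the Shannon entropy reduces to ##\log_2 W##, mirroring Boltzmann's ##S = k_B \ln W## for ##W## equally likely microstates.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i log2 p_i, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For W equally likely messages, H = log2 W, mirroring the
# Boltzmann form S = k_B ln W for W equally likely microstates.
W = 8
print(shannon_entropy([1 / W] * W))  # 3.0 bits = log2(8)
```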
Hello,
I am a 15-year-old who has done research on the question of why the universe is expanding at an accelerating rate, and how it will come to an end. After years of thought (since I was 11) I have come up with my hypothesis that time itself is accelerating and slowing down, and has been...
So for a collection of particles each with mass m, the pressure beneath them, ##p(z)## should be higher than the pressure above them ##p(z + \Delta z)##.
This is a change in force per unit area (a force per unit volume, I suppose) times a volume, to equate with the gravitational force
$$ \frac...
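Presumably the next line is the usual hydrostatic balance; a sketch of that step (##n## the number density, ##A## the cross-sectional area, ##\rho = nm## — symbols beyond the post are my guesses):

$$\left[p(z) - p(z+\Delta z)\right] A = \rho g\, A\, \Delta z \quad\Longrightarrow\quad \frac{dp}{dz} = -\rho g.$$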
Can we make sense of the formula for entropy the way we do for density (like "quantity of mass per unit volume")? What's the sense of ##Q/T##? Couldn't it be something else?
Of course it probably is a 'me-problem', but I haven't studied Thermodynamics deeply yet and was wondering what Entropy...
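One concrete reading (my own illustration, not from the post): at a phase change the temperature stays fixed, so ##Q/T## has a direct meaning. Melting ##m## kilograms of ice at ##T = 273\,\mathrm{K}## with latent heat ##L## gives

$$\Delta S = \frac{Q_{rev}}{T} = \frac{mL}{T},$$

i.e., entropy counts reversibly exchanged heat per unit of absolute temperature, loosely analogous to density's mass per unit volume.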
Hello, everyone :).
I am trying to solve this common problem. But when I get to the interpolation for state 2, the values do not make sense.
I have 25 psia and 75 F, but in the superheated water table there are no values at 25 psia (only 20 psia and 40 psia). And the temperature values...
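For what it's worth, the usual fix is plain linear interpolation between the 20 psia and 40 psia entries; a minimal sketch (the property values below are placeholders, not real steam-table numbers):

```python
def lerp(x, x0, x1, y0, y1):
    """Linearly interpolate y at x between table points (x0, y0) and (x1, y1)."""
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Placeholder example: some property y tabulated at 20 psia and 40 psia,
# interpolated to 25 psia (the y values are illustrative only).
y_25 = lerp(25.0, 20.0, 40.0, y0=1.234, y1=1.456)
print(y_25)
```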
So just by using the definition of the partition function...
$$ Z = \sum_i e^{ \frac {-E_i} {k_BT} } = e^{ \frac {-0} {k_BT} } + e^{ \frac {-\epsilon} {k_BT} } = 1 + e^{ \frac {-\epsilon} {k_BT} } $$
And then, a result we obtained in class by using the Boltzmann H factor to solve for ##S##...
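For comparison, a quick numerical evaluation of the entropy this ##Z## implies, via the standard route ##S = k_B(\ln Z + \beta \bar{E})## (which may differ from the class's H-factor derivation; the constants below are placeholders of my own choosing):

```python
import math

kB = 1.380649e-23  # J/K
eps = 1.0e-21      # J, assumed level spacing
T = 300.0          # K, assumed temperature

beta = 1.0 / (kB * T)
Z = 1.0 + math.exp(-beta * eps)          # partition function, levels 0 and eps
E_avg = eps * math.exp(-beta * eps) / Z  # mean energy
S = kB * (math.log(Z) + beta * E_avg)    # S = kB ln Z + E_avg / T
print(S)
```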
I am trying to understand how visually salient features could be discovered by unsupervised learning. I do not want to assume that we already have edge detectors or convolutional neural networks, rather I am trying to imagine how these could be discovered by observing the world.
Imagine that a...
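One minimal mechanism in that spirit (my own sketch, not from the post): Hebbian learning with weight normalization, i.e., Oja's rule, which converges to the principal component of the input statistics; applied to natural image patches this tends to produce oriented, edge-like filters without any built-in detectors. Here with synthetic 2-D data so it runs standalone:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": 2-D inputs correlated along one direction,
# standing in for raw sensory data with no built-in edge detectors.
direction = np.array([0.8, 0.6])
x = rng.normal(size=(5000, 2)) * 0.1 + rng.normal(size=(5000, 1)) * direction

w = rng.normal(size=2)   # random initial weight vector
eta = 0.01               # learning rate
for xi in x:
    y = w @ xi
    w += eta * y * (xi - y * w)   # Oja's rule: Hebbian term with decay

print(w / np.linalg.norm(w))      # converges to +/- the principal direction
```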
Summary:: Proving that entropy change in mixing of gas is positive definite
>
>An ideal gas is separated by a piston in such a way that the entropy of one part is ##S_1## and that of the other part is ##S_2##. Given that ##S_1>S_2##, if the piston is removed then the total entropy of the...
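As an aside (my own, since the preview is cut off): if the two parts held distinct ideal gases at the same ##T## and ##p##, positivity would follow immediately from the entropy of mixing,

$$\Delta S_{mix} = -nR\,(x_1 \ln x_1 + x_2 \ln x_2) > 0,$$

since ##0 < x_i < 1##; for identical gases the argument instead rests on the concavity (superadditivity) of the entropy function.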
a) ##P_f=\frac{nRT_f}{V_f}=\frac{nR\frac{T_i}{2}}{2V_0}=\frac{1}{4}\frac{nRT_i}{V_0}=\frac{1}{4}P_i##
b) ##Q=\Delta U=nC_V \Delta T=n\frac{5}{2}R(-\frac{T_i}{2})=-\frac{5}{4}nRT_i=-\frac{5}{4}P_i V_0## (##L=0## since the gas expands into a vacuum). Now, (a) and (b) are both correct but not (c), for...
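A quick numerical sanity check of (a) and (b), with placeholder values of my own (diatomic ideal gas, ##C_V = \tfrac{5}{2}R##):

```python
# Sanity check with arbitrary numbers (n, Ti, V0 are placeholders).
R = 8.314
n, Ti, V0 = 2.0, 400.0, 0.01

Pi = n * R * Ti / V0
Pf = n * R * (Ti / 2) / (2 * V0)    # final state: T_i/2, 2 V_0
print(Pf / Pi)                      # 0.25, confirming (a)

Q = n * (5 / 2) * R * (-Ti / 2)     # Q = dU, since work is zero in free expansion
print(Q / (Pi * V0))                # -1.25, confirming (b)
```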
By using the given relationship ##S=a/T## (1) along with the equation ##\int \frac{\delta Q_{rev}}{T}=\int dS## (2), I found out that my answer for the value of ##Q## is ##mc\,a\ln(T_2/T_1)## upon equating (1) & (2).
But the solution is instead given as ##Q=a\ln(T_1/T_2)##.
I would be grateful if someone would point out...
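For what it's worth, one reconstruction that lands on the book's answer: with ##\delta Q_{rev} = T\,dS## and ##dS = -\frac{a}{T^2}\,dT## from (1),

$$Q = \int T\,dS = \int_{T_1}^{T_2} T\left(-\frac{a}{T^2}\right)dT = -a\ln\frac{T_2}{T_1} = a\ln\frac{T_1}{T_2},$$

with no ##mc## factor, because the heat follows from ##T\,dS## alone rather than from a heat capacity.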
Which is a higher entropy state: solar system as it is today with planets going around the sun at fixed distances in an orderly fashion OR all the planets and sun bumping into one another and forming a single body?
If the entropy of the combined single body is higher, why doesn't the solar...
Upon seeing the question in my assignment I knew that in an isobaric process the work done by the gas is ##W=P\Delta V##, so if the volume is increased to ##4## times the original, considering the original volume as ##V##, we can say after expansion the volume is ##4V##. Then ##W=P(4V-V)=3PV## and the ##Q## would...
I'm no expert but, as I understand it, in an open system, one that can take in energy and matter from outside of itself, the overall entropy level can be prevented from increasing (and can actually decrease) under the right conditions. I have three questions:
1. Can the kinds of quantum...
In many cases, the concentrations of defects or charges are large enough to use SA (Stirling's approximation), since the numbers involved are on the order of Avogadro's number.
The derivation of the well-known formula for a defect concentration follows.
If ##n_v## is expected to be lower than 1, then it would be impossible to use SA...
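A quick numerical look at where SA is and isn't safe (a sketch; the exact ##\ln N!## comes from `math.lgamma`):

```python
import math

def stirling(N):
    """Stirling's approximation: ln N! ~ N ln N - N."""
    return N * math.log(N) - N

for N in (10, 1e3, 1e6, 6e23):
    exact = math.lgamma(N + 1)     # exact ln N!
    approx = stirling(N)
    print(N, exact, approx, (exact - approx) / exact)  # relative error shrinks with N
```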
Hello, just wanted to ask regarding the Otto cycle: if we were to find the entropy during the isentropic compression phase, I was already able to derive the temperatures and pressures at states 1 and 2, and I also have the compression ratio of 10. How do I derive the entropy (s1 and s2)...
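If a cold-air-standard analysis is intended, note the compression leg is isentropic by construction, so ##s_2 = s_1##; a numerical check of ##\Delta s = c_v\ln(T_2/T_1) + R\ln(v_2/v_1)## with assumed air properties (the values are my own placeholders):

```python
import math

R, gamma = 287.0, 1.4          # air, J/(kg K), assumed ideal with constant gamma
cv = R / (gamma - 1)           # ~717.5 J/(kg K)
r = 10.0                       # compression ratio v1/v2
T1 = 300.0                     # K, assumed intake temperature

T2 = T1 * r ** (gamma - 1)     # isentropic compression relation
ds = cv * math.log(T2 / T1) + R * math.log(1 / r)
print(ds)                      # ~0: s2 = s1 by construction
```

An absolute value of ##s_1## itself would need a reference state, e.g., from air tables.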
On page 50 of "From eternity to here", Sean Carroll writes that the protostellar cloud had a lower entropy than the solar system it produced. That strikes me as odd. A solar system looks more arranged than a dust cloud. When talking about entropy, someone always mentions the milk in the coffee...
I have heard from a knowledgeable physics professor that time exists independently and is not a consequence of the arrow of time. Could somebody explain this?
Hi everyone,
I have a fundamental question to the first part of Swendsen's Intro to StatMech and Thermodynamics (book).
Suppose we have two isolated systems of volumes ##V_1## and ##V_2##. We distribute ##N## ideal gas particles across the two systems with total energy ##E##.
Suppose we bring...
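A small numerical version of that setup (my own sketch, counting only the spatial distribution of the particles and ignoring the energy for the moment): the multiplicity is binomial and peaks where the densities match.

```python
import math

V1, V2 = 1.0, 2.0
N = 30                               # small N so we can enumerate exactly

def log_prob(N1):
    """ln P(N1): binomial distribution of particles over the two volumes."""
    p1 = V1 / (V1 + V2)
    return (math.lgamma(N + 1) - math.lgamma(N1 + 1) - math.lgamma(N - N1 + 1)
            + N1 * math.log(p1) + (N - N1) * math.log(1 - p1))

best = max(range(N + 1), key=log_prob)
print(best, N * V1 / (V1 + V2))      # most probable N1 sits at N V1 / (V1 + V2)
```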
I have a simple question sort of about exact differentials and deciding which variables matter and when.
I know we can write entropy ##S## as ##S(P,T)## and ##S(V,T)## to derive different relations between heat capacities ##C_V## and ##C_P##. I was wondering if it is technically correct to...
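For reference, the two expansions in question are presumably

$$dS = \left(\frac{\partial S}{\partial T}\right)_P dT + \left(\frac{\partial S}{\partial P}\right)_T dP = \frac{C_P}{T}\,dT - \left(\frac{\partial V}{\partial T}\right)_P dP,$$
$$dS = \left(\frac{\partial S}{\partial T}\right)_V dT + \left(\frac{\partial S}{\partial V}\right)_T dV = \frac{C_V}{T}\,dT + \left(\frac{\partial p}{\partial T}\right)_V dV,$$

using the Maxwell relations for the second terms.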
Hello! Could you please recommend me some good books about entropy for physics enthusiasts (someone who doesn't know physics but wants to learn about this)? Thank you!
Hi All,
I would like to know how one can connect the two definitions of entropy
##\Delta S = \int_{T_i}^{T_f} \frac{dQ}{T} ## and ##\Delta S = k_B \ln (\frac{W_f}{W_i})##,
particularly, I am interested in how the logarithm emerges. Does it have to do with some linear dependence of the heat on...
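One standard consistency check (my own illustration, not from the thread): isothermal expansion of ##n## moles of ideal gas from ##V_i## to ##V_f##, where the spatial multiplicity scales as ##W \propto V^N##:

$$\Delta S = k_B\ln\frac{W_f}{W_i} = k_B\ln\left(\frac{V_f}{V_i}\right)^{N} = Nk_B\ln\frac{V_f}{V_i} = nR\ln\frac{V_f}{V_i},$$

matching ##\int \delta Q_{rev}/T## along a reversible isotherm between the same states. The logarithm appears because independent multiplicities multiply while entropies add.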
Hey guys! This is a problem from Callen's Thermodynamics textbook and I'm stuck with it.
My goal was to get an expression for the entropy ##S## which depends on ##T##, so I can move into the ##T-S## plane to do my calculations:
I started by expressing the fundamental equation as a function of...
In classical statistics, we derived the partition function of an ideal gas. Then using the MB statistics and the definition of the partition function, we wrote:
$$S = k_B\ln Z_N + \beta k_B E,$$ where ##Z_N## is the N-particle partition function. Here ##Z_N=Z^N##.
This led to the Gibbs paradox...
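For context, the standard resolution (sketched here, not from the thread): with indistinguishable particles one takes ##Z_N = Z^N/N!##, and Stirling's approximation then gives

$$S = k_B\ln\frac{Z^N}{N!} + \beta k_B E \approx N k_B\left(\ln\frac{Z}{N} + 1\right) + \frac{E}{T},$$

which is properly extensive and removes the paradox.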
One of the most fundamental equations in chemical thermodynamics states: $$ \Delta_rH_m^⦵ = \Delta_rG_m^⦵ + T \Delta_rS_m^⦵ $$
If we look at this equation in the context of the net chemical reaction in an electrolytic or galvanic cell, it is usually interpreted as follows: the enthalpy of reaction denotes...
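For orientation, the usual cell-level identifications (standard electrochemistry, stated here as a reminder rather than taken from the thread):

$$\Delta_rG_m^⦵ = -nFE^⦵, \qquad T\Delta_rS_m^⦵ = nFT\left(\frac{\partial E^⦵}{\partial T}\right)_p,$$

so the Gibbs energy is the maximum electrical work, and ##T\Delta_rS## the heat reversibly exchanged with the surroundings.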
Hey guys! I'm currently struggling with a specific thermodynamics problem.
I'm given the entropy of a system (where ##A## is a constant with fitting physical units): $$S(U,V,N)=A(UVN)^{1/3}$$I'm asked to calculate the specific heat capacity at constant pressure ##C_p## and at constant volume...
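One possible opening move (my own sketch, with Callen's conventions assumed): invert the entropic equation of state ##1/T = (\partial S/\partial U)_{V,N}##:

$$\frac{1}{T} = \frac{A}{3}\,\frac{(VN)^{1/3}}{U^{2/3}} \;\Longrightarrow\; U = \left(\frac{AT}{3}\right)^{3/2}(VN)^{1/2}, \qquad \left(\frac{\partial U}{\partial T}\right)_{V,N} = \frac{3U}{2T},$$

which gives ##C_V##; for ##C_p## one also needs the pressure, from ##p/T = (\partial S/\partial V)_{U,N}##.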
So what I did was find the change in Q per min.
Mass melted per min * latent heat of fusion = Q per min = 11.5 kg/min * 3.4*10^5 J/kg = 3910000 J/min
Now the equilibrium temperature is 100 degrees Celsius, or 373.15 K.
If I do 3910000 J/min / 373.15 K I get 10478 J/(K*min).
This...
In industry, coal and other fuels are typically represented by
- their C, H, O, N, S elemental composition for the combustible part
- the composition of the ashes (SiO2, Al2O3, Fe2O3, ...)
- the Lower Heating Value (LHV), which is the heat that can be extracted from the combustion products
With...
Context
Boltzmann first defined his entropy as ##S = k \log W##. This seems to be pretty consistently taught. However, the exact definitions of ##S## and ##W## seem to vary slightly.
Some say S is the entropy of a macrostate, while others describe it as the entropy for the system. Where the definition of...
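One toy way to pin down the macrostate reading (my own example): take ##N## two-state spins; a macrostate is "##n## up", ##W## counts its microstates, and ##S = k\ln W## peaks at the most scrambled macrostate:

```python
import math

N = 100                          # two-state "particles"
for n in (0, 25, 50, 75, 100):   # macrostates: number of up-spins
    W = math.comb(N, n)          # microstates compatible with this macrostate
    print(n, math.log(W))        # S/k = ln W, largest at n = N/2
```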
Hi all,
First post here. I'm a casual physics enthusiast, but I've been reading and thinking a lot about this topic lately.
The thing I'm most interested in is the fact that black hole formation involves the simultaneous limits of two things: time dilation and the information bound. I find it...
It looks very easy at first glance. However, ##S## itself appears as a variable in the given expression. I have no clue how to relate the partial derivatives to the entropy and the number of particles.
Is the purpose of the 0th, 1st & 2nd Laws of Thermodynamics simply to legitimate the thermodynamic properties of Temperature, Internal Energy & Entropy, respectively?
It seems that all these laws really do is establish that these properties are valid thermodynamic state properties and the...
In our thermodynamics book, it states that a process that is internally reversible and adiabatic has to be isentropic, but that an isentropic process doesn't have to be reversible and adiabatic. I don't really understand this. I always thought isentropic and reversible mean the same thing...
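One way to see the asymmetry (a standard entropy balance, not from the book itself):

$$dS = \frac{\delta Q}{T} + \delta S_{gen}, \qquad \delta S_{gen} \ge 0.$$

Reversible and adiabatic forces both terms to zero, hence ##dS = 0##; but ##dS = 0## can also arise from an irreversible process that rejects just enough heat (##\delta Q/T = -\delta S_{gen}##), so isentropic does not imply reversible and adiabatic.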
I've calculated the change in the entropy of material after it comes in contact with the reservoir:
$$\Delta S_1 = C \int_{T_i+t\Delta T}^{T_i+(t+1)\Delta T} \frac{dT}{T} = C \ln{\frac{T_i+(t+1)\Delta T}{T_i+t\Delta T}}$$
Now I would like to calculate the change in the entropy of the...
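A sketch of the presumable next line (assuming the usual setup, where the reservoir sits at ##T_{res} = T_i + (t+1)\Delta T## and supplies ##Q = C\Delta T##): the reservoir gives up ##Q## at fixed temperature, so

$$\Delta S_{res} = \frac{-Q}{T_{res}} = -\,\frac{C\,\Delta T}{T_i + (t+1)\Delta T}.$$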
We know that
$$dU=\delta Q + \delta W$$
$$dU = TdS - pdV$$
So from this:
$$dS = \frac{1}{T}dU + \frac{1}{T}pdV \ (*)$$
For an ideal gas (with ##n## in moles, for consistency with ##p=\frac{nRT}{V}## below):
$$dU = \frac{3}{2}nR\,dT$$
Plugging that into (*) and also using ##p=\frac{nRT}{V}## we get:
$$S = \frac{3}{2}nR \int^{T_2}_{T_1} \frac{1}{T}dT +...
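Carrying the integration through (assuming the second term comes from the ##\frac{p}{T}dV## piece) presumably gives the standard result

$$\Delta S = \frac{3}{2}nR\ln\frac{T_2}{T_1} + nR\ln\frac{V_2}{V_1},$$

whose two terms track the thermal and configurational contributions separately.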
I have been able to get everything except entropy. I know it's not zero. I know I have to find a reversible path to calculate it, but keep coming up with strange values so I don't think I'm doing it correctly.
Can I do ##C_p\frac{dT}{T} + C_v\frac{dT}{T} = ds##? I am having trouble calculating my ##P_2## (I know my final...
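For reference (the standard constant-specific-heat ideal-gas relations, not specific to this problem):

$$ds = c_v\frac{dT}{T} + R\frac{dv}{v}, \qquad ds = c_p\frac{dT}{T} - R\frac{dp}{p},$$

so the sum of the two ##dT/T## terms is not itself ##ds##; and once any two properties of the final state are fixed, ##P_2## follows from the ideal-gas law.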
I have used the Lagrange multiplier method. So I have set up the equation with the constraint that ##\sum_{x} p(x) = 1##
So I have:
##L(p,\lambda) = - \sum_{x} p(x)\log_{2}p(x) - \lambda\left(\sum_{x} p(x) - 1\right)##
I am now supposed to take the partial derivatives with respect...
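As a sanity check on where this is headed (the standard result): differentiating with respect to each ##p(x)##,

$$\frac{\partial L}{\partial p(x)} = -\log_2 p(x) - \frac{1}{\ln 2} - \lambda = 0 \;\Longrightarrow\; p(x) = 2^{-\lambda - 1/\ln 2},$$

independent of ##x##, so the normalization constraint forces the uniform distribution ##p(x) = 1/|X|## (##|X|## the number of outcomes), the entropy maximizer.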
I understand entropy as a movement from order to less order and that the universe's entropy is increasing.
So I was wondering, life takes molecules and organizes them into organisms, so isn't life a reversal of entropy?
When trying to describe why the entropy goes up for an irreversible process, such as a gas expanding into a vacuum, it seems fairly easy at a high level: the valve between the two chambers opens, the free expansion occurs, the pressure drops in proportion to the volume change, and the temp remains...
I'm reading Brian Greene's latest book 'Until the End of Time'. (I'll pause here while you finish groaning at yet another layperson reading popular physics books.) In it, he's describing entropy in a way I've never heard before, and it clarifies something that's always stuck in my craw about...
Hi, I'm new to PF and not really sure which forum may be the most appropriate for finding people with an interest in probability and entropy. But the title of this forum looks promising. If you share an interest in this topic, I would be delighted to hear from you.
Thinking of the common language notion of "entropy" as "uncertainty", how can running a simulation based on a probability model implement entropy increasing? After all, the simulation picks definite outcomes to happen, so (intuitively) there is less uncertainty about the future as definite...
Question
---
So I've done a calculation which seems to suggest that if I combine the system of a measuring apparatus with, say, an experimenter who "reacts" to the outcome of the measurement versus one who does not, then the change in entropy in both situations is bounded by:
$$ \Delta S_R...
I thought that it would be something like (using a similar count to the Einstein solid) ##S = k\ln\left(\frac{(q+N-1)!}{q!\,(N-1)!}\right)##,
where ##q## is ##h\nu## and ##\nu## is the frequency.
But the results are not similar, so I am a little stuck.
I am struggling to understand Callen's explanation of stability. I understand that ##S(U)## must be concave, because otherwise we can show that the temperature increases as the internal energy decreases (##dT/dU<0##), but I cannot understand equation (8.1), which...
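A compact way to see what concavity buys (a standard manipulation, offered as context rather than Callen's own eq. (8.1)):

$$\frac{\partial^2 S}{\partial U^2} = \frac{\partial}{\partial U}\frac{1}{T} = -\frac{1}{T^2}\left(\frac{\partial T}{\partial U}\right)_{V,N} = -\frac{1}{T^2 C_V} \le 0,$$

so concavity of ##S(U)## is equivalent to ##C_V \ge 0##; a convex region would mean temperature rising as energy is removed.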
Not sure whether this is the right category, but bear with me. I've seen graphics where the increase of information over evolutionary time is projected, like the one in the link below from Carl Sagan's book
https://www.researchgate.net/figure/Page-from-the-book-of-Carl-Sagan-21_fig2_322153131
Now I...
The following paper appeared earlier this year on arxiv, entitled "Islands in Schwarzschild Black Holes":
https://arxiv.org/pdf/2004.05863.pdf
First, a bit of background: this paper appears to be part of a larger research effort aimed at resolving the black hole information paradox by showing...