Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
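In symbols, Clausius's definition and the second law read (a standard textbook formulation, stated here for reference):
$$ dS = \frac{\delta Q_{\text{rev}}}{T}, \qquad \Delta S_{\text{isolated}} \ge 0, $$
where ##\delta Q_{\text{rev}}## is an infinitesimal amount of heat exchanged reversibly and ##T## is the absolute temperature at which the exchange takes place.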
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
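Boltzmann's logarithmic law is usually written as
$$ S = k_B \ln W, $$
where ##W## is the number of microscopic arrangements (microstates) compatible with the macroscopic state, and ##k_B = 1.380649 \times 10^{-23}\ \text{J/K}## is the Boltzmann constant fixed by the SI.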
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has since been proposed as a universal definition of the concept of entropy.
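Shannon's measure ##H = -\sum_i p_i \log_2 p_i## can be computed directly from symbol frequencies. Below is a minimal Python sketch (the function name and sample strings are illustrative only, not from any particular library):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy H = -sum p_i * log2(p_i), in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("aaaaaaaa"))    # 0.0 bits: a certain outcome carries no information
print(shannon_entropy("abababab"))    # 1.0 bit: two equally likely symbols, like a fair coin
print(shannon_entropy("abcdefgh"))    # 3.0 bits: eight equally likely symbols
```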
Dear admins and moderators,
I am sure this subject has come up many times before, and this could well be a stupid question. If so, could you direct me to the relevant thread(s)?
Setting aside its itinerant electron (the hydrogen atom), the proton is THE building block of the Universe.
"...Despite...
Hi.
Processes involving a friction force whose direction somehow depends on the direction of the velocity, such as ##\vec{F}=-\mu\cdot\vec{v}##, aren't symmetric with respect to time reversal. If you play the motion backwards, this force would be accelerating rather than decelerating.
On the other hand, friction dissipates...
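To make the asymmetry explicit, here is a sketch of the time-reversal argument. With ##m\,\dot{\vec{v}} = -\mu\vec{v}## and the substitution ##t' = -t##, the velocity flips sign, ##\vec{v}\,' = -\vec{v}##, while the acceleration does not, so the reversed motion obeys
$$ m\,\frac{d\vec{v}\,'}{dt'} = +\mu\,\vec{v}\,', $$
i.e. the force now points along the velocity and the speed grows exponentially instead of decaying.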
What is the entropy change of the system in the Gibbs Free Energy Equation?
The general expression for entropy change is ΔS = q_rev/T
The only exchange between the system and the surroundings is ΔH done reversibly, with no PV work and no matter transfer, therefore
q(syst) = ΔH(syst)
therefore surely...
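For reference, a minimal sketch of the standard argument (assuming constant ##T## and ##P## and only reversible heat exchange, as stated above): with ##q_{\text{surr}} = -\Delta H_{\text{syst}}##,
$$ \Delta S_{\text{univ}} = \Delta S_{\text{syst}} - \frac{\Delta H_{\text{syst}}}{T} = -\frac{\Delta H - T\Delta S}{T} = -\frac{\Delta G}{T}, $$
so ##\Delta G \le 0## is equivalent to ##\Delta S_{\text{univ}} \ge 0##.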
The text says:
"Steel bullet of 25kg with a Temperature of 400 Celsius, is being dropped on the bottom of an oil liquid of 100kg at a temperature of 100 Celsius. The system is isolated. Calculate
a) The change of entropy of the bullet,
b) the change of entropy of the oil,
c) the total change of...
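A sketch of the usual approach (symbolic, since the specific heats ##c_b## and ##c_o## are not quoted in the excerpt): the isolated system reaches a common final temperature
$$ T_f = \frac{m_b c_b T_b + m_o c_o T_o}{m_b c_b + m_o c_o}, $$
and, treating each specific heat as constant, the entropy changes are
$$ \Delta S_b = m_b c_b \ln\frac{T_f}{T_b}, \qquad \Delta S_{oil} = m_o c_o \ln\frac{T_f}{T_o}, $$
with all temperatures in kelvin; the total is the sum of the two and must come out positive.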
What about if we allow for a temperature and volume change in a solid or a liquid?
Would the entropy change still depend only on the temperature change, or also on the volume change?
For a solid I would think that the volume change doesn't matter since it doesn't change the "amount of disorder"...
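For reference, the general relation that answers this (a standard identity, with ##\alpha## the thermal expansion coefficient and ##\kappa_T## the isothermal compressibility):
$$ dS = \frac{C_V}{T}\,dT + \left(\frac{\partial P}{\partial T}\right)_V dV = \frac{C_V}{T}\,dT + \frac{\alpha}{\kappa_T}\,dV, $$
so the volume change does contribute in general; it is only negligible when the volume change itself is small, as it often is for solids.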
Problem Statement: 1 kg of water at 273 K is brought into contact with a heat reservoir at 373 K. When the water has reached 373 K, what is the entropy change of the water, of the heat reservoir, and of the universe?
Relevant Equations: dS = nCp(dT/T) - nR(dP/P)
dS = nCv(dT/T) + nR(dV/V)
I am...
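A sketch of the standard solution, assuming constant pressure, ##c_P \approx 4186\ \text{J/(kg·K)}## for water (a value assumed here, not given in the statement), and a reservoir that exchanges heat at fixed 373 K:
$$ \Delta S_{\text{water}} = m c_P \ln\frac{373}{273} \approx +1307\ \text{J/K}, \qquad \Delta S_{\text{res}} = -\frac{m c_P (373-273)}{373} \approx -1122\ \text{J/K}, $$
so ##\Delta S_{\text{univ}} \approx +185\ \text{J/K} > 0##, as expected for an irreversible heating.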
In a recent study (https://phys.org/news/2018-08-flaw-emergent-gravity.html) an important flaw has been discovered in Emergent/Entropic Gravity: it has been found that holographic screens cannot behave according to thermodynamics...
But then, doesn't this also invalidate...
The multiplicity of states for a particle in a box is proportional to the product of the volume of the box and the surface area of momentum space.
$$ \Omega = V_{\text{volume}}\,V_{\text{momentum}} $$
The surface area in momentum space is given by the equation:
$$p_{x}^{2} + p_{y}^{2} + p_{z}^{2} =...
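A sketch of where this usually leads (assuming a single particle of mass ##m## and energy ##E##): the accessible momenta lie on the sphere ##p_x^2 + p_y^2 + p_z^2 = 2mE##, whose surface area is ##4\pi p^2## with ##p = \sqrt{2mE}##, so
$$ \Omega \propto V \cdot 4\pi (2mE), $$
and the entropy ##S = k_B \ln\Omega## then grows logarithmically with both ##V## and ##E##.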
Although I've read many papers that propose a relation between action and entropy, I've been told that there is no generally accepted relation in physics.
But how/why are these concepts unrelated?
What about Nobel laureate Frank Wilczek? He proposes that entropy and action are closely related...
So I get into these discussions on other ... less scientific ... fora, and then run into trouble and have to come here for correct answers.
I state these as assumptions but they are really questions. Please correct.
Entropy is usually applied in a thermodynamics context, but it can be applied...
If we reversed the second law of thermodynamics, so that entropy decreased, and also managed to reverse the motion of all matter in the universe, so that in 24 hours everything would return to the state it was in at the moment of reversal, would time be said to be flowing backwards? Or would time still be flowing...
Hi,
consider an adiabatic irreversible process carrying a thermodynamic system from initial state A to final state B: this process is accompanied by a positive change in system entropy (call it ##S_g##). Then consider a reversible process between the same initial and final system states. Such...
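To fill in the step the excerpt is heading toward (a standard argument): for the adiabatic irreversible process, ##\delta Q = 0##, so the entire change ##S_B - S_A = S_g > 0## is generated internally. A reversible process between the same states must reproduce that same state-function difference by actual heat exchange,
$$ S_B - S_A = \int_A^B \frac{\delta Q_{\text{rev}}}{T} = S_g > 0, $$
which is why the reversible path cannot itself be adiabatic.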
I tried the following:
$$ dS_{\text{total}} = \left|\frac{dQ}{T_c}\right| - \left|\frac{dQ}{T_h}\right| $$
where ##T_h## is the temperature of the hot water and ##T_c## is the temperature of the cold water. The specific heat of water wasn't provided in the assignment, so I used the value c = 4190 J/(kg·K).
$$ dS_{\text{total}} =...
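For two finite bodies of water, a minimal sketch of the integrated result (assuming masses ##m_h## and ##m_c##, constant ##c##, and a common final temperature ##T_f##):
$$ \Delta S_{\text{total}} = m_c c \ln\frac{T_f}{T_c} + m_h c \ln\frac{T_f}{T_h}, $$
where the first term is positive, the second is negative, and the sum comes out positive.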
Help!
Hi, I need
In the second law of thermodynamics, we have the entropy ##S##.
Well, I need help with this:
We have dS ∝ dQ
Then we have dS = λ·dQ
where λ = λ(T, ...)
I have to demonstrate that:
λ = 1/T, where T = temperature.
Thanks for the advice and help!
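A sketch of the classical argument (following Clausius; details vary by textbook): for a reversible Carnot cycle, ##Q_h/T_h = Q_c/T_c##, and since any reversible cycle can be decomposed into infinitesimal Carnot cycles,
$$ \oint \frac{\delta Q_{\text{rev}}}{T} = 0. $$
Hence ##\delta Q_{\text{rev}}/T## is an exact differential, so the integrating factor is ##\lambda = 1/T## (up to a multiplicative constant absorbed into the definition of the temperature scale).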
I saw another post about dS = dQ/T, but the subject of the question was different - not related to the entropy of the universe.
This is what I understand from this formula:
As the temperature goes down, the entropy goes up. Isn't this the opposite of (contradictory to) what entropy (disorder) is about...
My understanding is that to define the entropy of a system, you have to do the following:
Define the boundaries of your system.
Define a set of "microstates" of the system.
Define a partition of microstates of the system where each element of the partition is measurable and known as a...
Hi, I am also having a problem with the assumption entropy makes about the movement of molecules in 3D space. Does it assume that gas molecules have an equal chance of going in any direction? If so, then how is that possible outside a free-fall lab, since the bias of gravity would always make all the...
For ##\{p_i\}_{i=1}^{n}## being the probability distribution, I want to show that a Huffman reduction cannot be decreasing, and I reached a point where I need to show that
##q + H(p_1,\dots,p_{n-s},q) \ge H(p_1,\dots,p_n)##, where
##q = p_{n-s+1} + \dots + p_n## and ##s## is chosen such that ##2 \le s \le m## (##m \ge n##) and ##s \equiv n \pmod{m-1}##
where...
This is my first post and I need to preface my question by saying I have no physics background, so I'm genuinely asking for help in understanding.
A thought occurred to me about the continuing expansion and acceleration of the universe and I'm asking for your help in understanding where my...
Hey guys, so I am reading this book, and on pages 89-90 the author says:
"Increasing temperature corresponds to a decreasing slope on the entropy vs. energy graph." Then a sample graph is provided, and both in that graph and in the numerical analysis given on page 87 the slope is observed to be an...
I don't really understand why the entropy S of the universe must always increase. I know that only reversible processes have constant entropy, but why do real processes always increase the entropy of the universe?
Sorry for the bad English; I am not from the USA or UK.
Homework Statement
A gas sample containing 3.00 moles of helium gas undergoes a state change from 30 °C and 25.0 L to 45 °C and 15.0 L. What is the entropy change for the gas (ideal gas)? For He, Cp = 20.8 J/(K·mol)
Homework Equations
ΔS = nCv*ln(Tf/Ti) + nR*ln(Vf/Vi) =...
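A sketch of the numbers (assuming ideal-gas helium, so ##C_v = C_p - R \approx 12.5\ \text{J/(K·mol)}##):
$$ \Delta S = 3(12.5)\ln\frac{318}{303} + 3(8.314)\ln\frac{15.0}{25.0} \approx +1.8 - 12.7 \approx -10.9\ \text{J/K}. $$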
Hi everyone, I have a few questions I'd like to ask regarding what I have read/heard about these two definitions of entropy. I also believe that I have some misconceptions about entropy and as such I'll write out what I know while asking the questions in the hope someone can correct me. Thanks...
If a reservoir is in thermal contact with a system, why is its entropy change simply Q/T? Shouldn't this equation be valid only for reversible processes? Why is it reversible?
Homework Statement
What is the contribution of the conduction electrons to the molar entropy of a metal with a given electronic coefficient of specific heat? I can't figure out how to approach this; which relation or theory might lead to it?
And how is this answer relevant to the point of molar...
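A sketch of the standard route (assuming the "electronic coefficient" is the Sommerfeld coefficient ##\gamma## in ##C_{el} = \gamma T##, an assumption based on the truncated statement): since ##dS = C\,dT/T##,
$$ S_{el}(T) = \int_0^T \frac{\gamma T'}{T'}\,dT' = \gamma T, $$
so the electronic contribution to the molar entropy is numerically equal to the electronic heat capacity at that temperature.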
There are two aspects of uncertainty:
(a) how far the distribution is from the situation where all possibilities have equal probability;
(b) how spread out the values are.
In discussions about (Shannon) entropy and information, the first aspect is emphasized, whereas in discussions about the standard...
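To make the contrast concrete, here is a small Python sketch (the distributions and names below are made up for illustration): a distribution can score high on aspect (a) while low on aspect (b), and vice versa.

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p (aspect (a))."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def std_dev(values, p):
    """Standard deviation of outcomes `values` under probabilities p (aspect (b))."""
    mean = sum(v * x for v, x in zip(values, p))
    return math.sqrt(sum(x * (v - mean) ** 2 for v, x in zip(values, p)))

values = [0, 1, 2, 3]
equiprobable_narrow = [0.0, 0.5, 0.5, 0.0]  # equal probabilities, values close together
skewed_wide = [0.9, 0.0, 0.0, 0.1]          # unequal probabilities, values far apart

for name, p in [("equiprobable narrow", equiprobable_narrow),
                ("skewed wide", skewed_wide)]:
    print(f"{name}: H = {shannon_entropy(p):.3f} bits, sigma = {std_dev(values, p):.3f}")
# The first prints H = 1.000 bits, sigma = 0.500; the second H = 0.469 bits, sigma = 0.900.
```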
I never really understood the concept of entropy through classical thermodynamics. Here are a few questions.
1. The change in entropy dS in an isolated system is always ≥ 0, but how does this imply that the system tends to a state with maximum entropy? How do we know that a maximum exists?
2. Why is...
Homework Statement
Relevant equations
The Attempt at a Solution
How does a reversible process in the universe imply that the entropy doesn't increase? I understand that the change of entropy in a closed reversible cycle is 0 for the system, but I don't get why a non-closed reversible process...
Sir Roger Penrose in his book Cycles of Time on page 19 states the result of a calculation of probability of mixing red and blue balls as an illustration of entropy as state counting and the Second Law. He assumes an equal number of each. There is a cube of 10^8 balls on an edge subdivided into...
Many people say that there are similarities and common features between quantum entanglement and superposition, on the one hand, and entropy on the other. I would like to know more about this.
Hello,
The state ##| W \rangle = \frac{1}{\sqrt{3}} ( | 001 \rangle + | 010 \rangle + | 100 \rangle )## is entangled.
The Schmidt decomposition is:
What would the Schmidt decomposition be for ##| W \rangle##?
I am also interested in writing the reduced density matrix, but I need the basis...
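For the 1|(2,3) bipartition, a sketch of the standard result for the W state: grouping the terms as
$$ |W\rangle = \sqrt{\tfrac{2}{3}}\,|0\rangle \otimes \frac{|01\rangle + |10\rangle}{\sqrt{2}} + \sqrt{\tfrac{1}{3}}\,|1\rangle \otimes |00\rangle, $$
the Schmidt coefficients are ##\sqrt{2/3}## and ##\sqrt{1/3}##, and the reduced density matrix of the first qubit is ##\rho_1 = \tfrac{2}{3}|0\rangle\langle 0| + \tfrac{1}{3}|1\rangle\langle 1|##.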
1. Homework Statement
If a rigid adiabatic container has a fan inside that provides 15000 J of work to an ideal gas inside the container,
would the change in entropy be the same as if 15000 J of heat were provided to the same rigid container (removing the insulation)? 2. Relevant equations...
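A sketch of why the answer is yes (assuming the same initial state in both cases): at constant volume the gas does no work, so in both scenarios ##\Delta U = 15000\ \text{J}##; for an ideal gas, ##U## depends only on ##T##, so the final temperature is the same, and since entropy is a state function,
$$ \Delta S = n C_v \ln\frac{T_f}{T_i} $$
is identical whether the energy enters as fan work or as heat.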
My background is that I'm an applied mathematician and engineer, self-taught in GR and QFT. It's an old idea, in some dozen or so SciFi books. But I'm looking for a mathematical framework for handling it. The second law of thermodynamics, that entropy always increases in a closed system, can be...
Imagine there is a radiation concentrator (Winston cone) surrounded with extremely many layers of foil for radiation insulation, except at the smaller opening. Every part of the setup is initially in thermal equilibrium with the surroundings. The amount of thermal radiation flowing through the...
Hi,
Could you please help me to clarify a few points to understand entropy intuitively?
Entropy is defined as:
Please have a look at the attachment, "entropy111".
Source of attachment: http://faculty.chem.queensu.ca/people/faculty/mombourquette/chem221/4_secondthirdlaws/SecondLaw.asp
The...
Homework Statement
When the air outside is very cold and dry, your climate control system must humidify the cabaret air so that the singers don't lose their voices. The climate control lets pure water evaporate into the dry air and raises the moisture content of that air. As this evaporation...
Homework Statement
During the fall, the outside air's temperature is comfortable but its humidity is too high for direct use inside the cabaret. The air feels clammy and damp. So your climate control system chills the outdoor air to extract some of its moisture and then reheats that air back up...
From a heuristic standpoint it makes sense that when a system goes from being periodic to chaotic, the occupied volume of the phase space increases (while not violating Liouville's theorem). Since the entropy grows with the occupied volume of phase space (as the logarithm of the number of states), shouldn't entropy always...
When selecting rare entropy sources for a TRNG, if one can see similarities through an applied hidden Markov model, will it still be good entropy?
(The structure is the same, even though the type of source input is different.)
If we have two sequences s1 and s2, both of N coin tosses, is the entropy of getting two sequences that are exactly the same then lower than that of sequences which differ in x individual tosses? Is the entropy of getting sequences s1 and s2 that differ in N/2 tosses the highest...
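One way to make this quantitative (a sketch, counting multiplicities rather than "entropy" directly): for a fixed s1, the number of sequences s2 that differ from it in exactly ##k## positions is ##\binom{N}{k}##, which is maximal at ##k = N/2##. Differing in 0 positions has multiplicity 1, so it is indeed the least likely outcome, and ##k = N/2## the most likely.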
Sometimes I go back and think about this stuff, and I always find something I don't understand very well.
Consider an irreversible isothermal expansion of an ideal gas from state ##A## to state ##B## and suppose I know the amount of heat given to the system to perform the expansion - I'll...
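For reference, a sketch of how the state-function property resolves this (isothermal ideal gas, so ##\Delta U = 0##): the system's entropy change is fixed by the end states,
$$ \Delta S_{\text{sys}} = nR\ln\frac{V_B}{V_A}, $$
regardless of the actual heat ##Q## exchanged; the surroundings change by ##\Delta S_{\text{surr}} = -Q/T##, and because an irreversible expansion does less work (hence absorbs less heat) than the reversible one, the total comes out positive.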
It is said that entropy causes an arrow of time. But what about the irreversibility of a measurement, such as that of electron spin? When a certain spin value is measured, the previous value gets lost. So does that also require an arrow of time?
Hello;
If a system receives thermal energy Q, can it keep its entropy constant (that is, with the same value as before it received the energy) without wasting the energy received?
I just read a book by physicist Carlo Rovelli on the subject of "Time", and he says that entropy is the only source of irreversibility in the basic equations of physics, and he believes time and entropy are related (if I understand him correctly). So this started me thinking about entropy...
<edit: moved to homework. No template in the post.>
An ice maker inputs liquid water at 25 degrees C and outputs ice at -5 degrees C. Assume there is 1 kg of water and the volume does not change.
Cp liquid 4.18 kJ/kg-K
Cp solid 2.11 kJ/kg-K
∆H fusion 334 kJ/kg
I need to...
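A sketch of the water's entropy change with the quoted data, for the 1 kg of water (three steps: cool the liquid 298 K → 273 K, freeze at 273 K, cool the solid 273 K → 268 K):
$$ \Delta S = 4.18\ln\frac{273}{298} - \frac{334}{273} + 2.11\ln\frac{268}{273} \approx -0.366 - 1.223 - 0.039 \approx -1.63\ \text{kJ/K}. $$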
Homework Statement
Derive an expression for the change of temperature of a solid material that is compressed adiabatically and reversibly, in terms of physical quantities.
(The second part of this problem is: The pressure on a block of iron is increased by 1000 atm adiabatically and...
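For reference, a sketch of the standard derivation: writing ##dS = (C_p/T)\,dT - (\partial V/\partial T)_P\,dP## and setting ##dS = 0## for a reversible adiabatic process gives
$$ \left(\frac{\partial T}{\partial P}\right)_S = \frac{T V \alpha}{C_p}, $$
with ##\alpha## the volume thermal expansion coefficient; integrating this (treating ##TV\alpha/C_p## as roughly constant over 1000 atm) answers the second part.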
Homework Statement
In a monatomic crystalline solid each atom can occupy either a regular lattice site or an interstitial site. The energy of an atom at an interstitial site exceeds the energy of an atom at a lattice site by an amount ε. Assume that the number of interstitial sites equals the...
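This is the classic Frenkel-defect counting problem; a sketch of where it goes (assuming, as the truncated statement suggests, that the number of interstitial sites equals the number ##N## of lattice sites): with ##n## atoms displaced, the multiplicity is ##\Omega = \binom{N}{n}^2## (choice of vacated lattice sites times choice of occupied interstitials), so minimizing ##F = n\varepsilon - 2kT\ln\binom{N}{n}## with Stirling's approximation gives, for ##n \ll N##,
$$ \frac{n}{N} \approx e^{-\varepsilon/2kT}. $$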