Entropy is a scientific concept, as well as a measurable physical property that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolph Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
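For reference, the logarithmic law mentioned above is the Boltzmann entropy formula,
$$S = k_B \ln W,$$
where W is the number of microstates compatible with the macroscopic state and k_B = 1.380649 × 10^-23 J/K is the Boltzmann constant, fixed exactly in the 2019 SI redefinition.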
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in an analogous manner to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
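Shannon's measure of the missing information of a source with symbol probabilities p_i has the same logarithmic form,
$$H = -\sum_i p_i \log_2 p_i \quad \text{(in bits)},$$
formally identical to the Gibbs entropy of statistical mechanics up to the base of the logarithm and the factor k_B.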
I've been reading some speculative articles about the possible quantum arrow of time which emerges through collapse and irreversibility.
My question is: does collapse in the Copenhagen interpretation (or perhaps in an objective-collapse model) allow the spontaneous decrease of entropy where an...
Homework Statement
5 moles of liquid argon undergoes vaporization at its normal boiling point (87.5 K) and the resulting argon gas is subsequently heated to 150 K under constant volume conditions. Calculate the change of entropy for this process. The standard enthalpy change of vaporization, ∆...
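As a rough check, here is a minimal Python sketch of the usual two-step calculation (reversible vaporization at the boiling point, then constant-volume heating of the gas). The enthalpy of vaporization is cut off in the post, so the value below is only a placeholder and should be replaced by the number given in the problem.

```python
import math

# Placeholder inputs -- DHvap is an ASSUMED stand-in because the problem's
# enthalpy of vaporization is truncated in the post.
n = 5.0                 # mol of argon
Tb = 87.5               # K, normal boiling point
T2 = 150.0              # K, final temperature
R = 8.314               # J/(mol K)
DHvap = 6.43e3          # J/mol, placeholder value only
Cv = 1.5 * R            # J/(mol K), monatomic ideal gas at constant volume

dS_vap = n * DHvap / Tb                 # reversible phase change at Tb
dS_heat = n * Cv * math.log(T2 / Tb)    # constant-volume heating of the gas
print(dS_vap + dS_heat, "J/K")
```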
1- What's the relationship between entropy and information?
2- Does the statement "entropy always increases" imply that information is lost?
3- If it does, how is it lost?
Consider an expanding universe of infinite extent containing only a single particle. Does the entropy of this universe increase over time due to expansion? If it makes any difference in being a sensible question, consider an expanding universe with N particles where N is a known, finite number.
Calculate the change in entropy for the system, the surroundings and the Universe if 2 moles of ethane are completely combusted at 298 K. Standard entropies of C2H6(g), O2(g), H2O(l) and CO2(g) are 229.6, 205.1, 69.9, 213.7 J mol-1 K-1, respectively. Standard enthalpy changes of formation of...
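For the system part only, here is a minimal Python sketch using the standard entropies quoted above and the balanced equation 2 C2H6 + 7 O2 → 4 CO2 + 6 H2O(l); the surroundings term needs the enthalpy data that are truncated in the post.

```python
# Standard molar entropies at 298 K, J/(mol K), from the problem statement
S = {"C2H6": 229.6, "O2": 205.1, "H2O_l": 69.9, "CO2": 213.7}

# 2 C2H6 + 7 O2 -> 4 CO2 + 6 H2O(l), i.e. 2 mol of ethane combusted
dS_sys = (4 * S["CO2"] + 6 * S["H2O_l"]) - (2 * S["C2H6"] + 7 * S["O2"])
print(dS_sys, "J/K")   # about -620.7 J/K; dS_surr = -dH/T would be added next
```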
Hi.
I just read an article where the following cooling method is described. Apparently it's very common, but I don't know what it's called:
A gas under pressure is released into a vacuum through a small hole. The average particle speed in this beam of gas is the same as before, but the...
Hello. I have a question about the entropy of a thermodynamic system.
1) If we have, let's say, a gas that is enclosed by thermally insulating walls (so no heat goes in or out), is the entropy of that gas conserved? I thought that since dS = dQ/T and Q = 0, the entropy should be conserved.
2) So, does the...
Consider the dependence of entropy and of temperature on the reduced Planck's constant (taken from page 23 of Thomas Hartman's lecture notes (http://www.hartmanhep.net/topics2015/) on Quantum Gravity):
$$S \propto \hbar, \qquad \qquad T \propto \hbar.$$
I do not quite see how entropy can depend...
Homework Statement
There is a tank filled with air at a given volume, temperature, and pressure. The tank sits in a room at a given temperature and pressure.
That is:
For the tank: P = 1 MPa, T = 700 K, V = 1 m^3
Outside: T = 295 K, P = 100 kPa
Homework Equations
\psi_2 - \psi...
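The equations are cut off, and the ψ in the post may denote a different availability function, so treat this as an assumption-laden sketch: for a closed tank, problems like this are usually worked with the non-flow exergy Φ = (U − U0) + P0(V − V0) − T0(S − S0), with the room as the dead state and air treated as an ideal gas with constant specific heats.

```python
import math

# Given state of the air in the tank and the surroundings (dead state)
P, T, V = 1.0e6, 700.0, 1.0        # Pa, K, m^3
P0, T0 = 100.0e3, 295.0            # Pa, K

# Assumed ideal-gas air properties (constant specific heats)
R = 287.0                          # J/(kg K)
cv, cp = 717.0, 1004.0             # J/(kg K)

m = P * V / (R * T)                # mass of air in the tank
u_minus_u0 = cv * (T - T0)
s_minus_s0 = cp * math.log(T / T0) - R * math.log(P / P0)
v, v0 = R * T / P, R * T0 / P0     # specific volumes

# Closed-system exergy: Phi = m[(u - u0) + P0(v - v0) - T0(s - s0)]
phi = m * (u_minus_u0 + P0 * (v - v0) - T0 * s_minus_s0)
print(phi / 1e3, "kJ")
```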
Hi. This is the problem I'm trying to solve:
A system may be in two quantum states with energies '0' and 'e'. The states' degeneracies are g1 and g2, respectively. Find the entropy S as a function of the energy E in the limit where the number of particles N is very large. Analyse this...
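A sketch of the standard microcanonical route, assuming n = E/ε of the N particles sit in the upper level (ε being the energy called 'e' in the post): the multiplicity and entropy are
$$\Omega(E) = \binom{N}{n}\, g_1^{\,N-n}\, g_2^{\,n}, \qquad S = k_B \ln \Omega \approx k_B\left[N\ln N - n\ln n - (N-n)\ln(N-n) + (N-n)\ln g_1 + n\ln g_2\right],$$
using Stirling's approximation for large N.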
MENTOR NOTE: NO TEMPLATE BECAUSE SUBMITTED TO WRONG FORUM.
3.1) A quantity of 0.10 mol of an ideal gas A initially at 22.2 degrees C is expanded from 0.200 dm3 to 2.42 dm3 . Calculate the values of work (w), heat (q), internal energy change (delta U), entropy change of the system (deltaSsys)...
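A minimal Python sketch for the case where the expansion is isothermal and reversible; the truncated post does not say which kind of expansion is intended, so treat that as an assumption.

```python
import math

n = 0.10                      # mol
T = 22.2 + 273.15             # K
R = 8.314                     # J/(mol K)
V1, V2 = 0.200, 2.42          # dm^3 (only the ratio matters)

w = -n * R * T * math.log(V2 / V1)   # work done on the gas (IUPAC convention)
dU = 0.0                             # isothermal ideal gas: internal energy unchanged
q = dU - w                           # first law: dU = q + w
dS_sys = n * R * math.log(V2 / V1)   # entropy change of the gas
print(w, q, dU, dS_sys)
```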
Homework Statement
A hot rock ejected from a volcano's lava fountain cools from 1100 °C to 40.0 °C and its entropy decreases by 950 J/K. How much heat transfer occurs from the rock? (Source: OpenStax "College Physics for AP Students", Chapter 15.6)
Homework Equations
I used the equation ΔSh +...
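One common textbook shortcut, which I believe is what the OpenStax chapter intends, approximates the rock's entropy change using its average absolute temperature, so the heat given off is roughly
$$Q \approx \bar{T}\,\lvert\Delta S\rvert = \frac{1373\ \mathrm{K} + 313\ \mathrm{K}}{2}\times 950\ \mathrm{J/K} \approx 8.0\times 10^{5}\ \mathrm{J}.$$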
The second law of thermodynamics predicts that the universe ends its life in a state where thermal equilibrium exists throughout (maximum entropy), with essentially all usable energy dissipated. My question is: if, according to the first law of thermodynamics, which describes the...
Homework Statement
For some reason it is not letting me add the image here, so here is the link to the question:
http://imgur.com/a/3DLWM
The part I'm stuck on is the last part. Basically, the question is to obtain the following equation for the entropy of vaporisation using the Redlich-Kwong...
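The linked image carries the exact target expression, but for reference the Redlich-Kwong equation of state that such derivations start from is
$$P = \frac{RT}{V_m - b} - \frac{a}{\sqrt{T}\,V_m\,(V_m + b)},$$
and the entropy of vaporisation is then obtained from the entropy departure function evaluated at the liquid and vapour volumes.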
https://en.wikipedia.org/wiki/Future_of_an_expanding_universe "Over an infinite time there could be a spontaneous entropy decrease, by a Poincaré recurrence or through thermal fluctuations (see also fluctuation theorem)"
Homework Statement
5 kg of water at 60 °C is put in contact with 1 kg of ice at 0 °C, and the two are thermally isolated from everything else. The latent heat of fusion of ice is 3.3×10^5 J/kg.
What is the change of entropy of the universe when 100J of energy are transferred from the water to the ice...
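For the 100 J part specifically, here is a minimal Python sketch treating both bodies as reservoirs at approximately constant temperature during such a small transfer.

```python
Q = 100.0          # J transferred from the water to the ice
T_water = 333.15   # K (60 degrees C)
T_ice = 273.15     # K (0 degrees C)

dS_water = -Q / T_water          # water loses heat
dS_ice = +Q / T_ice              # ice gains heat while melting at 0 C
dS_universe = dS_water + dS_ice  # about +0.066 J/K
print(dS_universe, "J/K")
```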
Homework Statement
A well-insulated tank of volume 6 m^3 is divided into two equal volumes. The left cell is initially filled with air at 100 °C and 2 bar, and the right cell is initially empty. A valve connecting the two cells will be opened so that gas slowly passes from cell 1 to cell 2. The...
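The post is cut off, but if the air is treated as an ideal gas, the temperature is unchanged in this free expansion and the entropy change is nR ln 2. A minimal Python sketch with the stated numbers:

```python
import math

P1 = 2.0e5           # Pa (2 bar)
T1 = 100.0 + 273.15  # K
V1 = 3.0             # m^3 (left half of the 6 m^3 tank)
R = 8.314            # J/(mol K)

n = P1 * V1 / (R * T1)       # moles of air, ideal-gas assumption
dS = n * R * math.log(2.0)   # volume doubles at unchanged temperature
print(n, "mol,", dS, "J/K")  # roughly 193 mol and 1.1 kJ/K
```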
Homework Statement
It is problem 21 in the attached file.
Homework Equations
The Attempt at a Solution
The answer seems to be C. I thought it was D. Can someone explain it to me please?
Sorry if this is a stupid question; I don't fully understand entropy. Snowflakes are highly structured, yet they form from water vapor, which has very little structure. I must be misunderstanding entropy; my interpretation of it is that an isolated system must evolve into a more chaotic, less structured...
In holographic entanglement entropy notes like here, they let alpha go to one in (2.41) and get (2.42). But (2.41) goes towards infinity when doing that! Can someone explain how alpha --> 1 will make (2.41) into (2.42)? Thank you!
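This is the usual Rényi → von Neumann step: the prefactor 1/(1−α) does blow up, but the logarithm vanishes at the same rate, so the limit is finite. Assuming (2.41) is the Rényi entropy (I have not checked the notes' exact numbering),
$$S_\alpha = \frac{1}{1-\alpha}\ln \operatorname{Tr}\rho^\alpha \;\xrightarrow{\;\alpha\to 1\;}\; -\operatorname{Tr}\rho\ln\rho,$$
which follows from L'Hôpital's rule (or expanding $\operatorname{Tr}\rho^\alpha$ around $\alpha = 1$), since $\operatorname{Tr}\rho = 1$ makes the logarithm vanish at $\alpha = 1$.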
Hello.
The entropy S is a state variable or state function, as the integral of dS = dQ/T is path-independent, provided that the path is a reversible process path. However, this path-independence of the integral breaks down when the path includes an irreversible process. So, I guess we can only...
Hello. I recently discovered Gerard 't Hooft's (what a complicated name to type, isn't it?*apostrophe*apostrophe*apostrophe) equation for the entropy of a simple black hole (what is meant by "simple" I have no idea). It is:
Where "S" is the entropy of a simple black hole
A is the area of the...
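The expression being described is presumably the Bekenstein-Hawking entropy of a Schwarzschild ("simple", i.e. non-rotating and uncharged) black hole,
$$S = \frac{k_B c^3 A}{4 G \hbar},$$
where A is the horizon area; in Planck units it is often quoted simply as S = A/4.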
The colloquial statistical-mechanics explanation of entropy, as if it were caused by probability, is dissatisfying to me, in part because it allows highly organized (i.e. with a real potential for work) arrangements to appear as 'random fluctuations', though with very low probability. But as far as...
Suppose all the matter in the universe is eventually headed towards disintegration into its most basic form. I'm not sure what that is, but for this thought experiment, let's say it's single protons.
What would happen if all those protons formed a single mass? Would that be a singularity exhibiting...
So say I smash a glass plate on a chess board much larger than the plate. Simplistically, say entropy is the number of ways of rearranging the glass pieces across the squares of the board. Over time, it's likely that entropy increases since the glass would spread out, meaning each configuration...
So when my dad first explained the fundamental idea of thermodynamics to me, that entropy never decreases, he pointed out the odd fact that according to basically all other laws of physics, any motion or reaction could be run backwards and be just as valid as it is run forwards. It would break...
Hello! I have this GRE question:
In process 1, a monoatomic ideal gas is heated from temperature T to temperature 2T reversibly and at constant temperature. In process 2, a monoatomic ideal gas freely expands from V to 2V. Which is the correct relationship between the change in entropy ##\Delta...
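The phrase "at constant temperature" in process 1 cannot be literal, since the gas is heated from T to 2T; assuming "constant volume" is meant, the standard comparison (using a reversible isothermal path to evaluate the free expansion, since entropy is a state function) is
$$\Delta S_1 = \tfrac{3}{2}\, n R \ln 2, \qquad \Delta S_2 = n R \ln 2, \qquad \text{so } \Delta S_1 > \Delta S_2 > 0.$$
If "constant pressure" was intended instead, ΔS1 = (5/2) nR ln 2 and the ordering is unchanged.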
I was watching the Feynman online lectures and he talked about the arrow of time, and entropy, etc.
I have some questions.
1- Can we say non-conservative forces are time-irreversible, but conservative forces are time-reversible?
2- So, from (1), if a non-conservative force acts on a system that...
Homework Statement
A flue gas is cooled from 1100 C to 150 C and the heat is used to generate saturated steam at 100 C in a boiler. The flue gas has a heat capacity given by CP/R = 3.83 + 0.000551 T, where T is in K. Water enters the boiler at 100 C and is vaporized at this temperature. Its...
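A minimal Python sketch for the entropy change of the flue gas alone, per mole, using the given CP/R = 3.83 + 0.000551 T; the steam-side numbers in the truncated text are needed for the rest of the problem.

```python
import math

R = 8.314                 # J/(mol K)
T1 = 1100.0 + 273.15      # K
T2 = 150.0 + 273.15       # K

# dS = integral of Cp dT / T, with Cp/R = 3.83 + 0.000551*T
dS_per_mol = R * (3.83 * math.log(T2 / T1) + 0.000551 * (T2 - T1))
print(dS_per_mol, "J/(mol K)")   # negative, since the gas is cooled
```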
Hunt & Ott 2015, Defining Chaos
NB: For a more introductory version, phys.org ran a piece on this article two summers ago
This paper was published as a review of the concept of chaos in the journal Chaos for the 25th anniversary of that journal. The abstract is extended with a clearer...
Is there any entropic gain when the surface of a liquid is minimised? For example, molecules "enjoy" maximum entropy when they are in the interior. Is this valid?
Dear All Gravitinos,
I write this post here to discuss a new conjecture on the resolution of the Schwarzschild singularity and the physical interpretation of the microstates of black holes (arXiv: 1606.06178, published in Nucl. Phys. B2017,02,005...
I am only aware that the formula has to do with entropy/thermodynamics. I could really use the help on how it applies in physics and what the formula is really about.
Hello,
I am currently trying to get my head around the concept of entropy. One way to understand it is that it can be related to the number of available energy levels in a system.
From what I read, the number of available energy levels in a system:
1) increases with an increase in the system...
I was reading some articles related to entropy and I came to learn that
the term "entropy" shows up both in thermodynamics and in information theory.
Now my question is:
What’s the relationship between entropy in the information-theory sense and the thermodynamics sense?
I need some clear and...
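For what it's worth, when the same probability distribution p_i underlies both descriptions, the two entropies differ only by units,
$$S_{\text{thermo}} = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H_{\text{bits}}, \qquad H_{\text{bits}} = -\sum_i p_i \log_2 p_i,$$
and Landauer's principle expresses the same conversion factor physically: erasing one bit at temperature T dissipates at least k_B T ln 2 of heat.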
Homework Statement
As you know it's finals time and I desperately need help with one physics task (it's not my main subject, but I still have to pass it -.-). Here it is: Using the data below, calculate the temperature-dependent entropy and the entropy change for the reaction: CH4 + 2 O2 = CO2 + 2 H2O. CH4...
Hi folks, I have a question. I will first write down some old truths and then ask what is unclear to me.
Now, we know that Boltzmann's constant relates the average kinetic energy of one molecule of an ideal gas to its temperature T.
The German scientist Clausius defined the entropy change of some...
How is entropy defined (if it is) for phenomena taking place on a cosmological scale?
Entropy in thermodynamics is defined for equilibrium conditions. Do we assume cosmological phenomena are approximated by equilibrium conditions?
Hi.
If an ideal gas of ##N## particles is allowed to expand isothermally to double its initial volume, the entropy increase is
$$\Delta S=N\cdot k_B \cdot \log\left(\frac{V_f}{V_i}\right)=N\cdot k_B \cdot \log\left(\frac{2V}{V}\right)=N\cdot k_B \cdot \log\left(2\right)\enspace .$$
This can...
Is increasing the entropy of a low-entropy system easier than increasing the entropy of a high-entropy system?
Or is it vice versa?
Let's say it requires x amount of energy to increase the entropy of a low-entropy system; will increasing the entropy of an already high-entropy system require 2x amount of energy, or...
Gravity tends to make ordered structures out of free particles. Does this mean that gravity is decreasing the entropy of these particles, or is there some compensating mechanism that lets the total entropy increase (for example the emission of gravitational waves, though I doubt that's enough to compensate)?
Consider three identical boxes of volume V. The first two boxes will contain particles of two different species, with particle numbers 'N' and 'n'.
The first box contains 'N' identical non-interacting particles in a volume V. The second box contains 'n' non-interacting particles. The third box is the result of mixing...
Those treatments of entropy in continuum mechanics that I've viewed on the web introduce entropy abruptly, as if it were a fundamental property of matter. For example the current Wikipedia article on continuum mechanics ( https://en.wikipedia.org/wiki/Continuum_mechanics ) says:
Are other...
Hi I've been wondering about Boltzmann's equation
S = k ln W
Where W is the number of different distinguishable microscopic states of a system.
What I don't get is this: if it's the position and velocity of a particle that describe a microstate, doesn't that mean W would be infinite...
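The usual resolution, as a sketch: classical microstates are counted per phase-space cell of volume h for each degree of freedom (with an N! for identical particles), so W becomes a finite count,
$$W \approx \frac{1}{N!\,h^{3N}} \int_{\text{accessible region}} d^{3N}q\, d^{3N}p ,$$
and choosing a different cell size only shifts S = k ln W by an additive constant, so entropy differences are unaffected.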
Homework Statement
By applying the first law to a quasi static process, show that the entropy can be expressed as
S = (16σ/3c) VT^3
Homework Equations
U = 4(σ/c) VT^4
PV = (1/3) U
The Attempt at a Solution
I know I should be using
dS = dQ/T but it's unclear to me how to use this unless I...
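One way to use dS = dQ/T here, as a sketch: at constant V there is no work, so dQ = dU and
$$dS = \frac{dU}{T}\bigg|_V = \frac{16\sigma}{c} V T^{2}\, dT \quad\Rightarrow\quad S = \frac{16\sigma}{3c} V T^{3},$$
taking S → 0 as T → 0. The relation PV = U/3 is what guarantees dS is an exact differential when V is allowed to vary as well.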
Homework Statement
Four distinguishable particles move freely in a room divided into octants (there are no actual partitions). Let the basic states be given by specifying the octant in which each particle is located.
1. How many basic states are there?
2. The door to this room is opened...
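For part 1, since each of the four distinguishable particles can independently occupy any of the 8 octants, the number of basic states is
$$\Omega = 8^{4} = 4096.$$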
Bell's theorem debunks theories concerning local hidden variables.
Many people interpret that as the complete absence of local hidden variables.
Hidden variable theories were espoused by some physicists who argued that the state of a physical system, as formulated by quantum mechanics, does not...