Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in analogy with its use in statistical mechanics, and gave birth to the field of information theory. This description has since been proposed as a universal definition of the concept of entropy.
Homework Statement
Supercooled water is water that remains liquid below the freezing point.
a) A sample of 131 g of supercooled liquid water freezes to solid ice at a temperature of -8.00 °C. Using the following,
Cp,ice = 38.09 J/(mol·K)
Cp,liquid = 74.539 J/(mol·K)
fusH° (at T=0 °...
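Since the post is cut off before the fusion enthalpy value, here is a sketch of the standard approach: because freezing at -8 °C is irreversible, compute ΔS along a reversible path (warm the liquid to 0 °C, freeze there, cool the ice back down). The value of ΔfusH = 6010 J/mol is an assumed placeholder, not the one truncated from the post.

```python
import math

# Entropy change for 131 g of supercooled water freezing at -8.00 °C,
# via a reversible path:
#   (1) warm the liquid from 265.15 K to 273.15 K
#   (2) freeze reversibly at 273.15 K
#   (3) cool the ice from 273.15 K back to 265.15 K
M = 18.015            # g/mol, molar mass of water
n = 131.0 / M         # mol
Cp_liq = 74.539       # J/(mol K), from the problem
Cp_ice = 38.09        # J/(mol K), from the problem
dH_fus = 6010.0       # J/mol at 273.15 K -- assumed value, truncated in the post
T1, Tf = 265.15, 273.15

dS1 = n * Cp_liq * math.log(Tf / T1)   # warm the liquid
dS2 = -n * dH_fus / Tf                 # freeze (exothermic, entropy decreases)
dS3 = n * Cp_ice * math.log(T1 / Tf)   # cool the ice
dS_total = dS1 + dS2 + dS3
print(round(dS_total, 1), "J/K")
```

The system entropy drops, which is allowed because the surroundings absorb the released heat and gain more entropy than the water loses.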
Homework Statement
This is an issue I'm having with understanding a section of maths rather than a coursework question. I have the density function on the full phase space, ρ(p,x);
ρ(p,x) = \frac {1}{\Omega(E)} \delta (\epsilon(p,x) - E)
where \epsilon(p,x) is the...
1. Assume that one mole of H2O(g) condenses to H2O(l) at 1.00 atm and 95 °C. Calculate q, w, ΔH, ΔS of the system, and ΔS of the surroundings. BUT I AM NOT ASKING HOW TO CALCULATE THESE VALUES, SEE LAST SENTENCE OF POST.
Homework Equations
q = nCΔT
ΔH = n(Cp)ΔT
W = -PΔV
ΔS = q_rev / T
Cp...
Hello there,
I have a question on the dissipation and entropy.
Let us consider a Newtonian damper with viscosity coefficient η, pulled at a fixed rate e', immersed in an infinite bath at temperature T.
The mechanical work input in time dt is then dW = ηe'^2 dt, and is all dissipated...
I asked a question on this forum a few days ago about the entropy change of the surroundings, and am grateful for the insight provided. However, something faulty in my conceptualization is preventing me from solving this problem.
Let's say you have a set of processes shown in the following...
Hello,
I am taking a biochemistry course right now, and I am so confused by this 'hydrophobic effect' and how it relates to entropy.
Hydrophobic effect: The exclusion of hydrophobic groups or molecules by water. (I get this part!) This appears to depend on the increase in entropy of the solvent...
Homework Statement
You mix 1 liter of water at 20°C with 1 liter of water at 80°C. Set up an equation to find the change in entropy for one of the volumes of liquid in terms of initial temperature (T1) and the temperature after the two volumes of water mixed (T2)
Homework Equations...
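The equation asked for can be sketched numerically: with equal masses, the final temperature is the average, and each litre contributes ΔS = m·c·ln(T2/T1). The specific heat value is an assumed standard one, not given in the post.

```python
import math

# Entropy change when 1 L of water at 20 C is mixed with 1 L at 80 C.
# Equal masses => final temperature is the average, T2 = 50 C.
# For each volume: dS = m * c * ln(T2 / T1), with c taken as constant.
c = 4186.0                 # J/(kg K), specific heat of water (assumed)
m = 1.0                    # kg per litre
T_cold, T_hot = 293.15, 353.15
T2 = (T_cold + T_hot) / 2  # 323.15 K

dS_cold = m * c * math.log(T2 / T_cold)   # > 0, this water heats up
dS_hot = m * c * math.log(T2 / T_hot)     # < 0, this water cools
dS_total = dS_cold + dS_hot
print(round(dS_total, 1), "J/K")
```

The net change is positive, as it must be for an irreversible mixing process.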
Hello, I was just reading how the second law of thermodynamics is the only principle of physics that is irreversible, and that fact appears to be correlated with how we view time and how time can only move in one direction, the same as the flow of entropy. So, are entropy and time...
According to my textbook, during an irreversible process, the entropy change of the surroundings is given by \frac{q}{T} where q is the heat transferred to the surroundings during the process. Why are we allowed to use this equation, considering that this equation only holds for reversible...
The spectrum of the Cosmic Microwave Background radiation - the flash of the Big Bang - aligns almost precisely with the shape of the black-body radiation curve. This means that the CMB radiation came from a state that was in thermal equilibrium.
Since thermal equilibrium is a state of maximum...
For a reversible process, I imagine it is correct to say that
dS = \frac{dq}{T} where all quantities refer to system quantities (not the surrounding).
However, for an adiabatic process, dq = 0 .
Thus, should it be the case that for an adiabatic reversible process,
dS =...
Homework Statement
A container is divided into two parts by a thermally conducting wall. There are N atoms of a monatomic ideal gas on the left side, 2N on the right. The gas on the left is initially at absolute temperature 200K, the gas on the right at 500K.
a. After thermal equilibrium...
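Part (a) of problems like this is usually the final temperature: with a fixed conducting wall and equal heat capacity per atom (Cv = 3/2 kB for a monatomic gas), it is the particle-weighted average, and the entropy change follows at constant volume. A sketch of that reasoning:

```python
import math

# Thermally conducting wall, fixed volumes: N atoms at 200 K, 2N at 500 K.
# Same Cv per atom, so the final temperature is the particle-weighted average.
T1, T2 = 200.0, 500.0
Tf = (1 * T1 + 2 * T2) / 3          # = 400.0 K

# Entropy change at constant volume, in units of N*kB:
# dS/(N*kB) = (3/2) ln(Tf/T1)  for the left side,
#           + 2*(3/2) ln(Tf/T2) for the right side.
dS = 1.5 * math.log(Tf / T1) + 2 * 1.5 * math.log(Tf / T2)
print(Tf, round(dS, 3))
```

The total is positive, as the second law requires for irreversible heat flow across a finite temperature difference.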
Hello,
This is a question I've been working on out of blundell and blundell,
http://imageshack.us/a/img560/3342/entwopy.jpg
The red box is my answers to the question which I am pretty sure are ok.
I am having trouble with the very last part of the question.
By the logic of the...
We were shown in class how to get those entropies.
For reversible isothermal - ΔT=0 thus ΔE=0 thus Q = -W.
ΔS(sys) = Qrev/T = nR ln(V2/V1)
And ΔS(surr) = -nR ln(V2/V1), because the surroundings exchange the opposite heat.
For irreversible isothermal in vacuum - ΔT=0 thus ΔE=0.
No work is done by...
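The point of this comparison can be sketched numerically: the system's entropy change is the same for both paths (entropy is a state function), but the surroundings' change differs. Illustrative values (1 mol, volume doubling) are assumed:

```python
import math

# Isothermal doubling of volume, n = 1 mol (illustrative values).
# dS_sys = n R ln(V2/V1) is path-independent; the surroundings are not.
R = 8.314
n, V1, V2 = 1.0, 1.0, 2.0
dS_sys = n * R * math.log(V2 / V1)

# Reversible expansion: heat Qrev = T * dS_sys leaves the surroundings.
dS_surr_rev = -dS_sys       # total entropy change = 0
# Free expansion into vacuum: no heat, no work, surroundings unchanged.
dS_surr_free = 0.0          # total entropy change = dS_sys > 0
print(round(dS_sys, 3),
      round(dS_sys + dS_surr_rev, 3),
      round(dS_sys + dS_surr_free, 3))
```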
We physicists must be careful to ensure that theories begin with correct principles. One basic principle is that all quantities must be capable of being observed or measured. If a theory uses a quantity that cannot be observed, then it is not a physics theory, but a hypothesis or a...
Hey,
The entropy of a black hole is S = kB(4πGM²)/(ħc)
S = Q/T
T = Q/S
T = Q(ħc)/(4πGM²kB)
The temperature derived from Hawking radiation is:
T = c³ħ/(8πGMkB)
Which implies Q = (1/2)Mc²
Is this true?
I have found online that the heat should equal...
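The algebra can be checked numerically: with the Bekenstein-Hawking entropy and the Hawking temperature, the product T·S does come out to exactly (1/2)Mc². (The total rest energy is Mc²; the factor of 2 appears because T depends on M, so T·S differs from the integral of T dS.)

```python
import math

# Check that T * S = (1/2) M c^2 for a Schwarzschild black hole, using
# S = kB * 4*pi*G*M^2 / (hbar*c) and T = hbar*c^3 / (8*pi*G*M*kB).
G = 6.674e-11       # m^3 kg^-1 s^-2
hbar = 1.055e-34    # J s
c = 2.998e8         # m/s
kB = 1.381e-23      # J/K
M = 2.0e30          # kg, roughly one solar mass (illustrative)

S = kB * 4 * math.pi * G * M**2 / (hbar * c)
T = hbar * c**3 / (8 * math.pi * G * M * kB)
ratio = (T * S) / (0.5 * M * c**2)
print(ratio)        # all the constants cancel symbolically
```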
Homework Statement
Using the expression for the entropy change of an ideal gas when the volume and temperature change, and TV^(γ-1) = constant, show explicitly that the change in entropy is zero for a quasi-static adiabatic expansion from state V1, T1 to state V2, T2.

Homework Equations
TV^(γ-1)...
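The requested cancellation can be verified numerically: substituting T2 = T1(V1/V2)^(γ-1) into ΔS = nCv ln(T2/T1) + nR ln(V2/V1), with Cv = R/(γ-1), makes the two terms cancel for any final volume (the value 3.7 below is arbitrary):

```python
import math

# Check dS = 0 for a quasi-static adiabatic expansion of an ideal gas:
# dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1), with T*V**(g-1) = constant.
R = 8.314
g = 5.0 / 3.0                    # monatomic gamma
Cv = R / (g - 1)                 # Cv = 3/2 R
n = 1.0
V1, T1 = 1.0, 1000.0
V2 = 3.7                         # arbitrary final volume
T2 = T1 * (V1 / V2) ** (g - 1)   # adiabat condition

dS = n * Cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)
print(abs(dS) < 1e-9)
```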
Homework Statement
Every second at Niagara Falls, some 5.0 × 10³ m³ of water falls a distance of 50.0 m. What is the increase in entropy per second due to the falling water? Assume that the mass of the surroundings is so great that its temperature and that of the water stay nearly constant at...
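The standard approach: the potential energy is dissipated as heat at the ambient temperature, so ΔS/s = mgh/T. The temperature in the post is cut off, so T = 293 K below is a placeholder assumption:

```python
# Entropy produced per second by falling water dissipating its potential
# energy as heat at ambient temperature T (the post's value for T is cut
# off; 293 K is an assumed placeholder).
rho = 1000.0        # kg/m^3, density of water
V = 5.0e3           # m^3 of water per second
g = 9.8             # m/s^2
h = 50.0            # m
T = 293.0           # K, assumed

Q_per_s = rho * V * g * h     # J/s of dissipated potential energy
dS_per_s = Q_per_s / T
print(f"{dS_per_s:.2e} J/(K s)")
```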
Since the entropy increase in a system is a function of time, it would seem that for different observers, rates of entropy change would differ. I am struggling a little here with putting this into a coherent question, as I am a layman, but the gist is: both the notion of a closed system and...
Assume we have an ideal gas of N particles inside a thermally isolated cylinder of volume V, and that the cylinder is equipped with a piston that can trap the air on one side. (Assume the piston occupies no volume in the cylinder.) Initially, the piston is fully withdrawn and the gas occupies a...
I have a question about the increasing behavior of entropy.
Suppose we heat a metal (take iron, for instance) until it radiates energy and then put it in space (no medium). So the metal radiates energy as electromagnetic waves, which decreases the entropy of the metal (due to the decrease in internal...
Hi
As I understand it, there are several attempts to explain the low-entropy configuration of the universe at the Big Bang.
It seems to me the choices on the table that I am aware of are a multiverse as in the Carroll/Chen model, a cyclic universe as in the CCC model, or some hybrid of the two...
Homework Statement
Okay, so I am having difficulties with understanding the concepts around entropy, take this question:
What is the total entropy change for 7 kg of ice melting from -5 °C to 5 °C in a room at 5 °C?
Homework Equations
dS=dQ/T
Q =mΔH
ΔS = m*c*ln(Tfinal/Tinitial)
c_ice=2
c_water=4...
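The total entropy change is the system (warm the ice, melt it, warm the meltwater) plus the room, which supplies all the heat at constant 5 °C. A sketch, taking the post's c values as kJ/(kg·K) and assuming L = 334 kJ/kg for the latent heat (truncated in the post):

```python
import math

# Total entropy change (system + room) for 7 kg of ice, -5 C -> water at
# 5 C, in a room held at 5 C. c_ice = 2 and c_water = 4 are the post's
# constants, taken as kJ/(kg K); L = 334 kJ/kg is an assumed value.
m = 7.0
c_ice, c_water, L = 2.0, 4.0, 334.0
T1, Tm, T2 = 268.15, 273.15, 278.15      # K

# System: warm the ice, melt at Tm, warm the meltwater.
dS_sys = (m * c_ice * math.log(Tm / T1)
          + m * L / Tm
          + m * c_water * math.log(T2 / Tm))
# Room: gives up all that heat at constant T2.
Q = m * c_ice * (Tm - T1) + m * L + m * c_water * (T2 - Tm)
dS_room = -Q / T2
dS_total = dS_sys + dS_room
print(round(dS_total, 3), "kJ/K")
```

The total comes out small but positive: heat flowing across a finite temperature difference is irreversible.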
Hello guys,
I just registered on this forum, so I hope you can help me with some differential entropy stuff.
So I have made the attached pdf file (it's a bit messy, sorry) that demonstrates a toy version of the problem I'm trying to solve.
Thing is I'm quite new to information theory, so...
I am not sure if my work is correct so I need your help to confirm it.
Question:
Find the change in entropy when:
n mol of ideal gas (ɣ=7/5) is reversibly compressed at constant temperature until the pressure is three times the initial pressure.
My answer:
ΔS = n Cv ln(T2/T1) + nR ln(V2/V1)...
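That starting equation can be checked: at constant temperature the Cv term vanishes, and Boyle's law gives V2/V1 = P1/P2 = 1/3, so ΔS = -nR ln 3 per mole. A quick numerical check:

```python
import math

# Reversible isothermal compression of an ideal gas to 3x the initial
# pressure: dS = n*Cv*ln(T2/T1) + n*R*ln(V2/V1); with T2 = T1 the first
# term is zero, and V2/V1 = P1/P2 = 1/3, so dS = -n*R*ln(3).
R = 8.314
n = 1.0
dS = n * R * math.log(1.0 / 3.0)
print(round(dS, 3), "J/K")    # negative: the gas is compressed
```

Note γ never enters, since the temperature does not change.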
Homework Statement
"In the following a<b<c are finite positive constants. One mole of an ideal monoatomic gas, initially at volume Vi and temperature 1000K, expands to a final volume cVi in 3 reversible steps: (1) isothermal expansion from Vi to aVi (2) adiabatic expansion from aVi to bVi (3)...
Hi All,
I tried to find the entropy of liquid nitrogen in various data books, using nitrogen's CAS number and MSDS, but I was only able to get the entropy of nitrogen down to 100 K, not below that.
Does anyone know what's the entropy of liquid nitrogen or that where can I find it?
Thanks!
In ordinary QM and QFT entropy is defined using a density operator for a generalized state:
S = -\text{tr}\left(\rho\,\ln\rho\right)
Because for the gravitational field we know neither the fundamental degrees of freedom nor the Hilbert space states, a definition like
\rho =...
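The quoted definition can be illustrated concretely: S = -tr(ρ ln ρ) equals -Σᵢ pᵢ ln pᵢ over the eigenvalues of ρ, and is basis-independent. A toy sketch for a qubit (the states chosen are illustrative):

```python
import math

# Von Neumann entropy S = -tr(rho ln rho), evaluated via the spectrum of
# the density matrix: S = -sum_i p_i ln p_i over eigenvalues p_i.
def von_neumann_entropy(eigvals):
    """Entropy from the eigenvalues of rho (they must sum to 1)."""
    return -sum(p * math.log(p) for p in eigvals if p > 0)

pure = [1.0, 0.0]     # pure state: entropy 0
mixed = [0.5, 0.5]    # maximally mixed qubit: entropy ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```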
Hi,
I am a little bit confused about the definition of entropy that says dS = dQ_rev/T.
What does this dQ_rev mean? Is this definition wrong if we are talking about irreversible processes?
My idea was that when you have an irreversible process like the isobaric expansion of a gas, then...
Homework Statement
From Hill's "Introduction to Statistical Thermodynamics", question 3-4 reads:
(note "STR" denotes the case of most probable distribution and should be read as C*)
Homework Equations
The most probable distribution for a system of independent indistinguishable...
The Bekenstein-Hawking entropy is expected to be, and has been shown to be in some cases, derived from counting states.
However, entropy is not defined for continuous probability densities, and so I have heard it said that relative entropy (of which the mutual information is a form) is more...
I have gotten to the thermodynamics portion of my physics class, and right now I'm reading about entropy. It is taking me forever to read the chapter though because I keep pondering so many questions.
- How can life be possible with the concept of entropy? Cells must organize themselves in...
I have been having a discussion with a friend and I think we have reached the limit of our knowledge, so hopefully someone on here can help!
Basically the question is if a photon is emitted it will travel forever providing it never hits anything. So let's say we are close to the edge of the...
Negative Changes in Entropy??
Perhaps this is not the right forum location, but I would like to ask some of the more experienced physicists here about the notion of negative changes in entropy in the universe. According to the text I am currently reviewing by Hill, the probability that a...
Hi everybody, I have a problem with this equation:
it represents the motion of a gas in a duct with friction (Fanno flow). I need to plot this function in a T-s plot, do you have any ideas? the Mach number changes along the curve...
R=constant
S2-S1=R*(M2-1)*log(T2/T1)
If, in theory, there was a very efficient and powerful refrigerator:
Is it possible that a plant/solar panel could turn that heat into usable energy faster than what the fridge uses up?
Homework Statement
Hello,
I'm studying for my final exam on statistical physics, and I found an exercise of which I think it is really easy but I'm unsure of how to do it! So now I wonder if I actually don't understand what I'm doing at all!
The question is as follows:
Calculate for...
Given that heat capacity is a ratio of change of energy over change of temperature, while entropy is a change of energy over absolute temperature...
I was wondering if there is any basis for the idea that energy will tend to flow from media having low heat capacity to media having high heat...
Roughly speaking, I want to know how badly Shannon can fail in the one-shot setting.
The standard ideas of asymptotic information theory give a precise meaning to the entropy of a given probability distribution in terms of best achievable compression with vanishing error in the limit of many...
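That asymptotic quantity is easy to compute for a given distribution; the one-shot question is how far achievable single-use rates can deviate from it. A minimal sketch of the baseline H(p), in bits:

```python
import math

# Shannon entropy H(p) = -sum_i p_i log2(p_i): the asymptotic compression
# rate in bits per symbol, in the limit of many i.i.d. copies. One-shot
# achievable rates can differ from this, which is the question's point.
def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1 bit
print(shannon_entropy([0.9, 0.1]))    # biased coin: well under 1 bit
```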
How can you accurately measure this? I can't see how you can give randomness a number. I've seen in some places that S = ln(the number of possible arrangements). Is this true in all cases? But how can you measure the number of possible arrangements? It seems impossible to calculate the entropy for a...
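For toy systems the arrangements genuinely can be counted. The full formula is S = kB ln Ω; a sketch for a hypothetical system of N two-state "atoms", where Ω is a binomial coefficient:

```python
import math

# Boltzmann's formula S = kB * ln(Omega). For N two-state atoms with n in
# the "up" state, Omega = C(N, n), so the entropy is directly countable.
kB = 1.381e-23   # J/K

def entropy(N, n):
    omega = math.comb(N, n)       # number of microstates
    return kB * math.log(omega)

# The 50/50 macrostate has by far the most arrangements:
print(entropy(100, 50) > entropy(100, 10))
```

For macroscopic systems Ω is not counted directly; it is inferred from measured heats via dS = dQ_rev/T, which agrees with the statistical definition.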
I'm trying to understand entropy, because I have no clue what it is. I've seen questions like this posted in other places on the web, but there seems to be disagreement over the answer. Some people say it is the randomness and chaos of a system, and other people say it is how close it is to...
Just a quick question of something I found in my textbook but can't get how they produced it.
C_p =(∂Q/∂T)_p
that is the definition of heat capacity at a constant pressure p. Q is heat and T is temperature. This equation is fine and I know how to derive it. Now it is the next line which...
I would like to make a stand regarding the topic of entropy reversal. Entropy CAN in fact be reduced in a closed system, and this happens spontaneously according to the fluctuation theorem. Its been published in a well known scientific journal over a decade ago, and beforehand has been...
I measured the heat capacity of a sample under various magnetic fields. When I calculated the magnetic entropy as a function of temperature, I found the magnetic entropy measured at 0.5 Tesla is higher than that measured at 0 T. As far as I know, the magnetic field will try to align the...
This isn't a homework question per se, but more of a coursework question. Specifically, I'm a bit at a loss as to how to go about learning a particular section of the coursework for a physical chemistry (i.e. thermochemistry) class I am taking. The section in question is that pertaining to...
Hey there. We are struggling with a problem from an old exam about statistical mechanics. I hope you can help us or give any clues. Here is the problem.
Homework Statement
A chamber is divided by a wall into two sections of equal volume. One section of the chamber is initially filled by an...