Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
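In symbols, the logarithmic law referred to here is usually written
$$S = k_B \ln W,$$
where ##W## counts the microscopic arrangements compatible with the macroscopic state and ##k_B## is the Boltzmann constant.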
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this measure of missing information entropy, in analogy with its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
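For a discrete source with symbol probabilities ##p_i##, Shannon's measure is usually written
$$H = -\sum_i p_i \log_2 p_i,$$
which has the same mathematical form as the statistical-mechanical entropy, up to the choice of constant and logarithm base.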
Homework Statement
10 moles of an ideal gas with Cv = 20.8 J/(mol·K) at T0 = 300 K and P0 = 0.3 MPa occupy the left half of an insulated vessel. At time t = 0 a 1 kW electrical heating element is turned on. After 30 s, the partition dividing the vessel ruptures and the heating element is turned off...
Homework Statement
Determine ΔSsys when 3.0 mol of an ideal gas at 25°C and 1 atm is heated to 125°C and expanded to 5 atm. Rationalize the sign of ΔSsys.
Homework Equations
State function: $$dS = \frac{dU}{T} + \frac{P\,dV}{T}$$
State function for the entropy of an ideal gas: $$dS = \frac{nC_{V,m}\,dT}{T} + \frac{nR\,dV}{V}$$
Ideal gas...
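Not part of the original thread, but a numerical sketch of how those two relations are applied, written in Python like the script further down this page. The heat capacity is not stated in the excerpt, so a monatomic gas (C_V,m = 1.5 R) is assumed purely for illustration:

from math import log

R = 8.314                      # J/(mol K)
n = 3.0                        # mol
T1, T2 = 298.15, 398.15        # 25 C and 125 C in kelvin
P1, P2 = 1.0, 5.0              # atm; only the ratio enters

Cp_m = 1.5 * R + R             # assumed monatomic gas: C_V,m = 1.5 R, so C_p,m = C_V,m + R

# Equivalent (T, P) form of the ideal-gas entropy relation quoted above:
# dS = n*C_p,m*dT/T - n*R*dP/P
dS_sys = n * Cp_m * log(T2 / T1) - n * R * log(P2 / P1)
print(f"dS_sys = {dS_sys:.1f} J/K")   # negative: the pressure increase outweighs the heating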
Do I understand correctly (in general terms) or wildly incorrectly if I imagine that the constant of expansion and the second law of thermodynamics are very closely connected, or even that the constant of expansion is potentially the source of the second law?
So I am going to be making some keys, and I want to make them more random... So I made this:
#I know that the code is pretty crude, but it works.
#P.S. it is an infinite loop, run at your own risk
import random
from random import randint
import string
import time
x = 1
n = randint(5, 90)...
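For what it's worth, if the goal is keys that are "more random", Python's standard-library secrets module draws on the operating system's entropy source and is generally preferred over random for key material. A minimal sketch (the function name and alphabet are arbitrary choices, not from the original post):

import secrets
import string

ALPHABET = string.ascii_letters + string.digits

def make_key(length=32):
    # Each character is drawn from a cryptographically secure generator.
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

print(make_key())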
Homework Statement
A weather balloon is filled with helium gas and released from the ground. It goes up 18 km and reaches a diameter of 15 m. Determine if the following values are greater than zero, less than zero, or equal to zero: ΔV, ΔP, ΔT, ΔU, ΔH, ΔSsys, ΔSsurr, ΔStot
Homework Equations
ΔU =...
Consider a gas as your system, confined in the usual frictionless piston-cylinder. The piston is massless, external pressure is constant, Pext. Let the system be at initial state T1 and P1 = Pext. We want to compare the following two processes: in the first process, we reversibly heat the gas...
When the ideal gas entropy is derived, we consider ##N## atoms in a box of volume ##V = L_x L_y L_z##. Then we make the assumption that ##L_x, L_y, L_z \gg## the de Broglie wavelength of the atoms. I am not sure why we need to make this assumption. Thanks!
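For reference, the end result of that derivation is the Sackur-Tetrode expression, in which the thermal de Broglie wavelength ##\lambda## appears explicitly:
$$S = N k_B \left[\ln\!\left(\frac{V}{N\lambda^3}\right) + \frac{5}{2}\right], \qquad \lambda = \frac{h}{\sqrt{2\pi m k_B T}},$$
so the classical counting of states is only sensible when the box dimensions (and the interparticle spacing) are much larger than ##\lambda##.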
Homework Statement
A two-component gaseous system has a fundamental equation of the form
$$S=AU^{1/3} V^{1/3} N^{1/3} + \frac{BN_1N_2}{N}$$ where $$N=N_1+N_2$$
and A and B are positive constants. A closed cylinder of total volume ##2V_0## is separated into two equal subvolumes by a rigid diathermal...
How exactly did Hawking compute that black hole entropy is one quarter of the horizon area in Planck units, and how did he conclude the holographic principle, where the information of a volume is encoded on the area of the black hole? And if there were no holographic principle, how big should the entropy of the black hole be with...
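For context, the Bekenstein-Hawking result being referred to is
$$S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4\,\ell_P^2},$$
i.e. one quarter of the horizon area ##A## measured in units of the Planck area ##\ell_P^2##.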
I was studying the 2nd law of thermodynamics. In that context I found Clausius's inequality, which says the closed integral of dQ/T is < 0 for an irreversible cycle. Entropy was then defined from the reversible process, and from that view they said that for an irreversible process dS > dQ/T. Now when I saw some...
Hi there,
I was wondering if you could help me; I think I may have some concepts wrong or incomplete.
Homework Statement
We have an adiabatic cylinder of volume ##V_1## filled with a gas at pressure ##p_1## and temperature ##T_1## in thermal equilibrium, closed with a piston. All of a sudden...
When they say disorder, do they mean the range of velocities of the particles increasing as entropy increases? So there are larger differences between the low-KE particles and the high-KE particles?
Hi,
I am working on investigating an idea I proposed regarding a ramjet that operates in subsonic flow (of a fixed speed) with a convergent intake that utilizes the pressure immediately behind a standing shock wave for compression.
I have posted a link to my initial report here and I now need...
Homework Statement
(Excerpted from a longer, multipart problem but essentially)
Show that for an ideal gas,
$$ \left(\frac{\partial p}{\partial T}\right)_\mu = \frac{S}{V}. $$
Homework Equations
• The ideal gas law, of course
$$ pV = Nk_{\rm B}T $$
• Pressure, temperature, and chemical potential...
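One possible route (a sketch, since the excerpt's own list of equations is cut off) is the Gibbs-Duhem relation:
$$S\,dT - V\,dp + N\,d\mu = 0 \;\;\Rightarrow\;\; \left(\frac{\partial p}{\partial T}\right)_{\mu} = \frac{S}{V}.$$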
Homework Statement
I've seen this problem appear in more than one textbook almost without any changes. It goes like this:
Assume the entropy ##S## depends on the volume ##\bar{\Omega}## inside the energy shell: ##S(\bar{\Omega})=f(\bar{\Omega})##. Show that from the additivity of ##S## and the...
Homework Statement
Let ##X## and ##Y## be two independent integer-valued random variables. Let ##X## be uniformly distributed over ##\left\{1,2,...,8\right\}##, and let ##\text{Pr}\left\{Y=k\right\} =2^{-k},~k=1,2,3,...##
(a) Find ##H(X)##.
(b) Find ##H(Y)##.
(c) Find ##H(X+Y,X-Y)##.
Homework...
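A quick numerical check of parts (a) and (b), summing the series directly rather than evaluating it analytically (a sketch, not the intended solution):

from math import log2

# (a) X uniform on {1, ..., 8}: H(X) = log2(8) = 3 bits
H_X = log2(8)

# (b) Pr{Y = k} = 2**(-k): H(Y) = sum_k k * 2**(-k) = 2 bits
H_Y = sum(k * 2 ** (-k) for k in range(1, 60))   # the tail beyond k = 60 is negligible

print(H_X, H_Y)
# (c) follows because (X, Y) -> (X+Y, X-Y) is one-to-one, so H(X+Y, X-Y) = H(X) + H(Y).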
How should I prove this?
From John Preskill's quantum computation & quantum information lecture notes (chapter 5)
If a pure state is drawn randomly from the ensemble ##\{|\varphi_x\rangle, p_x\}##, so that the density matrix is ##\rho = \sum_x p_x |\varphi_x\rangle\langle\varphi_x|##,
then ##H(X) \geq S(\rho)##,
where ##H## stands for the Shannon entropy of the probability distribution ##\{p_x\}##...
Hey,
I am going to write my bachelor's thesis on Entropy in a Quantum Mechanical Framework. My professor told me he would refer me to some literature; however, it will take him some time.
I would like to get started myself. Could you please refer me to papers, books, articles? I have already read...
Homework Statement
For a certain reaction, ΔG = 13580 + 16.1 T log10(T) - 72.59 T. Find ΔS and ΔH for the reaction at 298.15 K.
Homework Equations
ΔG = ΔH - TΔS
$$\left[\frac{\partial (\Delta G)}{\partial T} \right]_P = - \Delta S$$
The Attempt at a Solution
For the sake of this thread's length I...
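A sketch of the standard route, keeping whatever energy units the textbook's coefficients are given in:
$$\Delta S = -\left[\frac{\partial (\Delta G)}{\partial T}\right]_P = -\left(16.1\,\log_{10}T + \frac{16.1}{\ln 10} - 72.59\right), \qquad \Delta H = \Delta G + T\,\Delta S,$$
both evaluated at T = 298.15 K.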
Problem statement:
A sample of 8.02 × 10⁻¹ moles of nitrogen gas (γ = 1.40) occupies a volume of 2.00 × 10⁻² m³ at a pressure of 1.00 × 10⁵ Pa and a temperature of 300 K. It is isothermally compressed to half its original volume. It behaves like an ideal gas. Find the change in entropy of the gas...
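Since the compression is isothermal, the temperature term drops out of the ideal-gas entropy relation, giving (as a sketch):
$$\Delta S_{\text{gas}} = nR\ln\frac{V_2}{V_1} = (0.802\ \text{mol})\,(8.314\ \text{J mol}^{-1}\,\text{K}^{-1})\ln\tfrac{1}{2} \approx -4.6\ \text{J/K}.$$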
Homework Statement
A solid metallic cube of heat capacity S is at a temperature of 300 K. It is brought into contact with a reservoir at 600 K. If the heat transfer takes place only between the reservoir and the cube, the entropy change of the universe after reaching thermal equilibrium is
A...
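One common way to set this up, writing ##C## for the cube's heat capacity (called ##S## in the problem statement) to avoid a clash with entropy:
$$\Delta S_{\text{univ}} = \int_{300}^{600}\frac{C\,dT}{T} - \frac{C\,(600-300)}{600} = C\left(\ln 2 - \tfrac{1}{2}\right) \approx 0.19\,C.$$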
Homework Statement
Two equal bodies with temperatures T1 and T2.
T1 > T2. The specific heat c does not depend on temperature.
What is the maximum work that can be extracted from this system?
I have to get an equation for the maximum work which includes c and the temperatures.
Homework Equations
This seems to...
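A sketch of the standard argument: extract the work reversibly so that the total entropy change vanishes, which fixes the common final temperature and hence the maximum work:
$$c\ln\frac{T_f}{T_1} + c\ln\frac{T_f}{T_2} = 0 \;\Rightarrow\; T_f = \sqrt{T_1 T_2}, \qquad W_{\max} = c\left(T_1 + T_2 - 2\sqrt{T_1 T_2}\right).$$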
Isentropic means a process where entropy remains constant. Now the formula for entropy is
ΔS = ΔQ/T
and in an isentropic process ΔS = 0... so that means ΔQ = 0, right?
But if ΔQ = 0, that is an adiabatic process.
So are isentropic and adiabatic processes...
Hi, I have a few questions about entropy.
Entropy is the measure of disorder in the universe, and the entropy of the universe always increases. So is it correct to say that entropy is the unusable energy causing the disorder, and that because of this the amount of usable energy decreases every time the...
Hi everyone,
I have a conceptual question about entropy. I understand perfectly why mathematically the units of entropy are energy per temperature (SI: J / K). However, I would like to better understand the significance of these units.
For example, the SI units for speed/velocity are m / s...
As we know, S = Q/T. And
entropy is also defined in terms of the number of microstates of a system. So does that prove that the lower the temperature, the more microstates are available?
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.
thank you
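For what it's worth, the entropy of an M-symbol source is maximized when all symbols are equally likely, so here
$$H_{\max} = \log_2 16 = 4\ \text{bits per symbol}.$$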
I know that entropy is a measure of disorder. But entropy is also a function of the state of a system, and has a value determined by the state variables of the system.
Does that mean entropy describes the equilibrium state of a system? Please explain in layman's terms, in this context, what is...
Homework Statement
Hi all,
There is a question from the course book:
Homework Equations
S=k_B ln W
The Attempt at a Solution
My solution:
So first of all, for each molecule, there are 2 motions: translational and rotational.
For rotational I get:
$$W_1 = \Omega\left(\theta\right)$$
For...
Hi everybody,
I have some trouble defining the enthalpy and entropy of a half reaction. If we consider the two following reactions:
Anode : H2 = 2H+ + 2e-
Cathode : 2H+ + 2e- + 1/2O2 = H2O
So the overall reaction is:
H2 + 1/2 O2 = H2O
We know the overall reaction's entropy can...
The directionality of time seems to be linked to the process of increasing disorder.
Is the 'passage' of time similarly linked?
If so, it would seem that the passage of time would generally slow down as the universe cools. And perhaps time should pass more slowly in cooler regions of the...
I am unable to grasp why entropy is inversely proportional to temperature. My book says that "Heat added to a system at a lower temperature causes a higher entropy increase than heat added to the same system at a higher temperature." What is meant by this statement?
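A quick numerical illustration of that statement (numbers chosen only for illustration): reversibly transferring ##\delta Q = 100\ \text{J}## gives
$$\Delta S = \frac{100\ \text{J}}{200\ \text{K}} = 0.5\ \text{J/K} \quad\text{at 200 K}, \qquad \Delta S = \frac{100\ \text{J}}{400\ \text{K}} = 0.25\ \text{J/K} \quad\text{at 400 K},$$
so the same amount of heat produces a larger entropy change at the lower temperature.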
Homework Statement
A mass m of water cools down from 50 °C to 10 °C (the temperature of the surrounding environment). Calculate the entropy increase of the system (the water). The water has specific heat capacity c.
Homework Equations
dS=dQrev/T
S is entropy, Q is heat added to the system, T...
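A sketch of the usual setup, with the temperatures converted to kelvin:
$$\Delta S_{\text{water}} = \int_{T_i}^{T_f}\frac{mc\,dT}{T} = mc\ln\frac{T_f}{T_i} = mc\ln\frac{283\ \text{K}}{323\ \text{K}} < 0,$$
so the water's entropy actually decreases; the compensating increase belongs to the surroundings.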
Homework Statement
Homework Equations
The Attempt at a Solution
I have already submitted these questions and have gotten them wrong. I'm just wanting to see how to properly do it. Thanks.
Homework Statement
50.0 g of water (the system) at 30 °C is frozen to ice at a final temperature of -10 °C in a freezer. Assuming that the volume of the water remains the same during the process, calculate the change in entropy of the system and the change in entropy of the thermal universe when the system...
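A sketch of the three-step calculation for the system, using typical property values (c_water ≈ 4.18 J/(g·K), L_fusion ≈ 334 J/g, c_ice ≈ 2.09 J/(g·K)) since the excerpt does not list them; the surroundings term additionally needs the freezer temperature, which is not shown here:

from math import log

m = 50.0                            # g
c_w, c_i, L_f = 4.18, 2.09, 334.0   # J/(g K), J/(g K), J/g -- assumed typical values

T1, Tm, T2 = 303.15, 273.15, 263.15   # 30 C, 0 C, -10 C in kelvin

dS_cool   = m * c_w * log(Tm / T1)    # cool liquid water from 30 C to 0 C
dS_freeze = -m * L_f / Tm             # freeze at 0 C (heat leaves the system)
dS_ice    = m * c_i * log(T2 / Tm)    # cool the ice from 0 C to -10 C

dS_sys = dS_cool + dS_freeze + dS_ice
print(f"dS_sys = {dS_sys:.1f} J/K")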
I saw an old interview with Roger Penrose where at one point he was talking about the degree of organization the universe exhibited at its initial state. He said the second law of thermodynamics tells us that as time passes the universe is becoming more disorderly, which means if we were to go back...
How do you calculate the entropy change of an ideal gas with n = 1 mol, Cv,m = 1.5R, Ti = 300 K, and P = 3 bar that expands against Pext = 1 bar until the final volume is twice the initial volume, at Tf = 200 K?
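Since entropy is a state function, the irreversible path (expansion against a constant external pressure) does not enter; only the end states do. A short numerical sketch with the values quoted in the question:

from math import log

R = 8.314                 # J/(mol K)
n, Cv_m = 1.0, 1.5 * R
T1, T2 = 300.0, 200.0     # K
V_ratio = 2.0             # V2 / V1

dS = n * Cv_m * log(T2 / T1) + n * R * log(V_ratio)
print(f"dS = {dS:.2f} J/K")   # roughly +0.7 J/K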
Hi,
I am reading Shannon's paper on the theory of communication and I am having trouble with a concept.
Shannon writes:
The output of a finite state transducer driven by a finite state statistical source is a finite state statistical source, with entropy (per unit time) less than or equal to that of...
Whilst I understand that entropy is a measure of the number of specific ways in which a system may be arranged, the units for entropy don't make sense to me intuitively. Why joules per kelvin? In what way does that show how "disordered" a system is? When I hear joules per kelvin, I think of...
A monatomic gas passes from state 1 (pressure p1, volume V1) to state 2 (p2, V2).
Derive an expression for the change in entropy of a monatomic ideal gas.
The required final equation is: ΔS = Cv ln(T2/T1) + nRln(V2/V1)
In my attempt, I am retrieving ΔS = Cv ln(T2/T1) + R ln(V2/V1),
i.e., the...
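One way to see where the missing ##n## usually comes from is to carry it explicitly from the combined first and second law:
$$dS = \frac{dU}{T} + \frac{p\,dV}{T} = \frac{nC_{V,m}\,dT}{T} + \frac{nR\,dV}{V} \;\Rightarrow\; \Delta S = nC_{V,m}\ln\frac{T_2}{T_1} + nR\ln\frac{V_2}{V_1},$$
which matches the required result if the ##C_V## there denotes the total heat capacity ##nC_{V,m}##.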
I am sure this has a simple answer, but I don't seem to get it at the moment. I am going through a derivation of the Boltzmann distribution by maximising the entropy with the constraints that the probabilities sum to 1 and the average energy is some constant value. My question is...
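For reference, a compact version of that maximization with Lagrange multipliers ##\alpha## and ##\beta##:
$$\mathcal{L} = -\sum_i p_i\ln p_i - \alpha\Big(\sum_i p_i - 1\Big) - \beta\Big(\sum_i p_i E_i - \langle E\rangle\Big), \qquad \frac{\partial\mathcal{L}}{\partial p_i} = 0 \;\Rightarrow\; p_i = \frac{e^{-\beta E_i}}{Z}.$$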
Homework Statement
A six-sided die is loaded such that a 6 occurs twice as often as a 1. What is the total probability of rolling a 6 if the Shannon entropy is a maximum?
Homework Equations
Shannon Entropy:
$$S=-\sum_i p_i \ln{p_i}$$
where ##p_i## is the probability that we roll ##i##.
The Attempt...
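At the maximum, the four unconstrained faces (2-5) must share their probability equally, so the entropy can be maximized over the single variable p(1) subject to p(6) = 2 p(1). A numerical sketch of that reduction (SciPy is used only for the one-dimensional search; this is a check, not the intended analytic solution):

import numpy as np
from scipy.optimize import minimize_scalar

def neg_entropy(p1):
    # Loaded-die entropy with p(6) = 2*p(1) and faces 2-5 sharing the remainder equally.
    p6 = 2.0 * p1
    p_rest = (1.0 - p1 - p6) / 4.0
    p = np.array([p1, p_rest, p_rest, p_rest, p_rest, p6])
    return float(np.sum(p * np.log(p)))

res = minimize_scalar(neg_entropy, bounds=(1e-9, 1.0 / 3.0 - 1e-9), method="bounded")
print(f"p(6) = {2 * res.x:.4f}")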
Air:
V1 = 0.1 m³
P1 = 1 MPa
T1 = 20 °C
After isothermal expansion, P2 = 0.1 MPa.
I had to find T2, M, V2, L, Q and found all of those (T2 = 20 °C; M = 1.1927 kg; V2 = 1 m³; L = Q = 230258 J), but I need s (entropy) for creating an illustration in a T-s diagram.
I can't find how it's possible to calculate s1 and s2, is it possible...
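If the expansion is treated as quasi-static and isothermal, the entropy change follows directly from the heat already computed (a sketch, taking T = 293 K):
$$\Delta S = \frac{Q}{T} = \frac{230\,258\ \text{J}}{293\ \text{K}} \approx 786\ \text{J/K},$$
while absolute values of ##s_1## and ##s_2## additionally require choosing a reference state.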
Hi all, I hope you can help me with the statistical origins of the second law. I cannot find anything that mathematically proves that order from disorder is impossible, only improbable.
This leads me to think that a system (Kelvin engine) that allows order to be created from disorder (work from...
Homework Statement
1) A quantity of propane is contained in a piston-cylinder assembly. This working fluid undergoes a process starting from an initial state of 2.0 MPa and 60 °C. At the end of the process the pressure of the propane is 1.0 MPa and the specific entropy is 0.089 kJ/(kg·K) higher...
Homework Statement
The mass flow rate through a steam turbine operating under steady conditions is 100 kg/s. Steam enters the turbine at 12 MPa and 400 °C. A mixture of vapour and liquid water exits the turbine at 10 kPa. At the exit state 93% of the mass of the water is in vapour form. The...