Entropy Definition and 1000 Threads

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolph Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, aside from the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of isolated systems left to spontaneous evolution cannot decrease with time, as they always arrive at a state of thermodynamic equilibrium, where the entropy is highest.
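For reference, Clausius's quotient and the second-law statement above are commonly written as $$ dS = \frac{\delta Q_{\text{rev}}}{T}, \qquad \Delta S_{\text{isolated}} \geq 0, $$ where ##\delta Q_{\text{rev}}## is an infinitesimal amount of heat exchanged reversibly at absolute temperature ##T##.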
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, to the macroscopically observable behavior, in form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
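The simple logarithmic law referred to above is commonly written as $$ S = k_{\text{B}} \ln W, $$ where ##W## counts the microscopic arrangements compatible with the macroscopic state and ##k_{\text{B}}## is the Boltzmann constant.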
In 1948, Bell Labs scientist Claude Shannon developed similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this entity of missing information in analogous manner to its use in statistical mechanics as entropy, and gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
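Shannon's measure of missing information for a discrete probability distribution ##p_1, \dots, p_n## is commonly written (in bits) as $$ H = -\sum_{i=1}^{n} p_i \log_2 p_i, $$ which matches the statistical-mechanical expression up to the choice of constant and logarithm base.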

View More On Wikipedia.org
  1. M

    Proving Entropy statement is equivalent to Clausius statement

    For this, I don't understand how we can apply the change in entropy equation for each solid since the ##\frac{dT}{dt}## for each solid will be non-zero until the solids reach thermal equilibrium. My textbook says that the ##\Delta S## for a system undergoing a reversible process at constant...
  2. M

    The Entropy Change of Melting Ice: Why is the Equation Written as ΔS = Q/T?

    For this, why do they write the change in entropy equation as ##\Delta S = \frac{Q}{T}##? Would it not be better to write it as ##\Delta S = \frac{\Delta Q}{T}##, since it is clear that we are only concerned about the transfer of heat in our system while it remains at constant temperature as all...
  3. matsu

    I What is the calculation for value B in Penrose's entropy model?

    In the book "Cycles of Time" by Roger Penrose, there is a part of the explanation of entropy that I don't understand. There are 10^24 balls, half of which are red and the other half blue. The model is to arrange the balls in a cube with 10^8 balls on each edge. It also divides the cube into...
  4. Demystifier

    A Quantum analog of Boltzmann entropy?

    In classical statistical physics, entropy can be defined either as Boltzmann entropy or Gibbs entropy. In quantum statistical physics we have von Neumann entropy, which is a quantum analog of Gibbs entropy. Is there a quantum analog of Boltzmann entropy?
  5. V

    B Entropy change for reversible and irreversible processes

    I came across the following statement from the book Physics for Engineering and Science (Schaum's Outline Series). I cannot seem to find a satisfactory answer to the questions. Is the statement in the above screenshot about entropy change the statement of the Second Law of Thermodynamics, or is...
  6. MatthewKM

    I Two Entropy scenarios on a system

    Entropy question. Take a finite number of identical atoms in a specific volume of space at a moment in time. Run two thought experiments on this system (both time independent). Scenario 1: expand the volume of space of the system instantaneously by a factor of 10. The fixed number of atoms...
  7. G

    Calculating Shannon Entropy of DNA Sequences

    Unfortunately, I have problems with the following task. For task 1, I proceeded as follows: since the four bases have the same probability, this is ##P=\frac{1}{4}##. I then simply used this probability in the formula for the Shannon entropy... (a worked sketch for this kind of calculation appears after this list)
  8. ergospherical

    A Proving Subadditivity of Entropy for Uncorrelated Systems in Pure States

    Two systems A & B (with orthonormal basis ##\{|a\rangle\}## and ##\{|b\rangle\}##) are uncorrelated, so the combined density operator ##\rho_{AB} = \rho_A \otimes \rho_B##. Assume the combined system is in a pure state ##\rho_{AB} = |\psi \rangle \langle \psi |## where ##|\psi \rangle =...
  9. G

    Apply the Legendre Transformation to the Entropy S as a function of E

    Hi, unfortunately I am not getting anywhere with task three; I don't know exactly what to show. Shall I now show that from ##S(T,V,N)##, using a Legendre transformation, I then get ##S(E,V,N)## and thus obtain the Sackur-Tetrode equation?
  10. L

    I am stuck on a calculation -- Entropy change for a compound system

    Hi, unfortunately I have problems with task 4. In task 3 I got the following $$ T_f=T_ie^{\Delta S_i - c_i} $$ Then I proceeded as follows $$ \Delta S = \Delta S_1 + \Delta S_2 $$ $$ \Delta S =c_1\ln\left(\frac{T_ie^{\Delta S_i - c_i}}{T_1}\right)+c_2\ln\left(\frac{T_f}{T_2}\right)$$ $$ \Delta S... (a sketch of this kind of two-body entropy calculation appears after this list)
  11. Jimyoung

    Why is the Entropy of the Universe (total entropy) a path function?

    I understand that S (Ssys) is a state function, but I can't understand why Ssurr and Suniv (or Stot) are path functions.
  12. L

    Approximating Glucose as an Ideal Gas: Can We Calculate Entropy?

    For now it is only about task 1. If the task states that "You can approximate that their dynamics in water resembles that of an ideal gas," does it then mean that I can take glucose as an ideal gas and then simply calculate the entropy for the ideal gas?
  13. Ahmed1029

    A Free expansion of an ideal gas and changes in entropy

    For a freely expanding ideal gas (irreversible transformation), the change in entropy is the same as in a reversible transformation with the same initial and final states. I don't quite understand why this is true, since Clausius' theorem only has this corollary when the two transformations are...
  14. casparov

    I Physics of paper absorbing Water -- Doesn't this decrease Entropy?

    Summary: doesn't this decrease entropy? Cellulose is known for its hydrophilic quality, which can be explained by the polarity of its hydroxyl groups. We all know water can overcome the force of gravity by rising through a piece of paper placed in the water. Correct me if I'm wrong, but this is a...
  15. J

    I Trying to better understand temperature and entropy

    If you were to condense all the energy in the universe into a point, wouldn't the temperature be very high, yet the entropy be very low? Also if you were to spread out all of the energy in the universe, wouldn't the temperature be near zero and the entropy be very high? And this makes entropy...
  16. G

    B Does gravity defy the 2nd Law?

    Summary: Trying to understand the relationship between gravity, thermodynamics and entropy, thank you. Gravity can take a diffuse cloud of gas filling a given volume of space at equilibrium density and temperature, and turn it into a burning star surrounded by empty space. Does this mean that...
  17. ohwilleke

    I Can Core Theory Be Derived From Nine Lines?

    Christoph Schiller, "From maximum force to physics in 9 lines -- and implications for quantum gravity" arXiv:2208.01038 (July 31, 2022). This paper asserts that nine propositions can be used to derive the Standard Model and GR and can point the way to quantum gravity, although he cheats a bit...
  18. A

    I Quantum computation and entropy

    Quantum gates must be reversible. The usual justification for this is that in QM the time evolution of a system is a unitary operator which, by linear algebra, is reversible (invertible). But I am trying to get a better intuition of this, so I came up with the following explanation: In order to...
  19. bbbl67

    I Amount of black hole entropy inside the Universe?

    Now, it's been said that the majority of the entropy in the universe resides within the cumulative entropy of black holes inside the universe. How do they know that? Now, I'm not so interested in how they determine the black hole's entropy, I know there's a relatively simple formula for that...
  20. Dario56

    I Boltzmann Entropy Formula – Derivation

    The Boltzmann entropy definition is given by: $$ S = k_B \ln W $$ where ##W## is the weight of the configuration which has the maximum number of microstates. This equation is used everywhere in statistical thermodynamics and I saw it in the derivation of the Gibbs entropy. However, I can't find the...
  21. James Brown

    Problem understanding entropy (two different definitions?)

    There are two definitions of entropy. One is about the degree of randomness and one is about energy that is not available to do work. What is the relationship between them?
  22. S

    I Would infinite entropy break all symmetries?

    If the Universe could somehow reach a state of infinite entropy (or at least a state of extremely high entropy), would all fundamental symmetries of the physical laws (gauge symmetries, Lorentz symmetry, CPT symmetry, symmetries linked to conservation principles...etc) fail to hold or be...
  23. mohamed_a

    I Problem regarding understanding entropy

    I was reading about the thermodynamics postulates when I came across the differential fundamental equation: I understand that the second element is just pressure and the last element is chemical energy, but the problem is I don't understand what the use of entropy is and how it contributes to a...
  24. S

    A Why do we need a quantum correction for black hole entropy?

    Hey to all,... It is now generally believed that information is preserved in black-hole evaporation. This means that the predictions of quantum mechanics are correct whereas Hawking's original argument that relied on general relativity must be corrected. However, views differ as to how...
  25. Delta2

    I Is a Unified Field Theory the Key to Understanding the Universe?

    Is there any approach in any books out there, where we consider that in the universe there exists only one field, let it be called the Unified Field (UF), in which all of the known fields (gravitational, EM field, quark field, gluon field, lepton field, Higgs field, etc.) are just components (pretty...
  26. L

    Change of entropy in the Universe in a thermodynamic cycle

    (a) We first find that: ##T_A=\frac{P_A V_A}{nR}=\frac{1\cdot 10^5 \cdot 4}{40\cdot 8.314}K\approx 1202.7904 K##, ##\frac{T_B}{T_A}=\frac{\frac{P_B V_B}{nR}}{\frac{P_A V_A}{nR}}=\frac{P_B V_B}{P_A V_A}=\frac{P_A \frac{V_A}{2}}{P_A V_A}=\frac{1}{2}##, ##\frac{T_C}{T_B}=\frac{P_C...
  27. H

    A Is there a generalized second law of thermodynamics?

    Hi PFs, there are different kinds of entropies. I discovered the free entropy. https://arxiv.org/pdf/math/0304341.pdf The second law says that the total entropy cannot decrease as time goes by. Is it always the same "time" for the different entropies? The author, Voiculescu, wrote articles...
  28. C

    B Is Entropy the Key to Understanding the Big Bang and the Fate of the Universe?

    I am a software trainer and know about as much Physics as I've been able to pick up from the BBC's occasional documentaries. I have never even taken a physics course in high school or college, so I am an absolute lay-person here with what's probably a very lay-person question… I was watching...
  29. Simobartz

    Entropy due to this irreversible process

    I think the solution is: $$dU=\delta W_{prop}$$ $$dU=TdS-PdV$$ $$dV=0$$ then $$TdS=\delta W_{prop}$$ and so $$dS=dU/T$$ By the way, is it correct to say that, if the transformation between the initial and the final state happened in a reversible way, then the heat transfer could be...
  30. Philip Koeck

    A Definition of entropy for indistinguishable and distinguishable particles

    I have a rather general question about the definition of entropy used in most textbooks: S = k ln Ω, where Ω is the number of available microstates. Boltzmann wrote W rather than Ω, and I believe this stood for probability (Wahrscheinlichkeit). Obviously this is not a number between 0 and 1, so...
  31. K

    I String theory calculation of Extremal black hole entropy problem

    One of the claimed successes of string theory is its ability to derive the correct Bekenstein-Hawking equations to calculate the quantum entropy of a black hole without any free parameters, specifically extremal black hole entropy using supersymmetry and maximal charge. I was wondering if...
  32. S

    I Understand the Thermodynamic Identity: Is This Correct?

    I have a question about the Thermodynamic Identity. The Thermodynamic Identity is given by ##dU = TdS - PdV + \mu dN##. We assume that the volume ##V## and the number of particles ##N## are constant. Thus the Thermodynamic Identity becomes ##dU = TdS##. Assume that we add heat to the system (we see that...
  33. Bernadette

    B Exploring Entropy with Svante Arrhenius' Salt Water Experiment

    Hello, sorry for my English... We slowly bring an electric charge close to a glass of salt water (in a quasi-reversible way). Some ions rearrange themselves in the glass. What can we say about the entropy of this transformation? Bernadette PS: My reflection comes from reading an old physics book...
  34. J

    Entropy: Does Disorder Really Measure Order?

    Let's say you have a very dirty small room and a giant clean library (lots of organized books), and let's say these occupy the same number of microstates. The entropy according to this equation is the same for the library and the room. But one is more ordered than the other. How does it...
  35. R

    Enthalpy and Isentropic compression/expansion

    In a certain thermodynamics textbook, the specific work done by an isentropic compressor/pump in an ideal Rankine cycle is given by the following: Wpump = h2 - h1 and Wpump = v(P2 - P1), where v = v1. When I carry out these two calculations between any two states, I get vastly different answers...
  36. C

    Entropy of spin-1/2 Paramagnetic gas

    As we know, a dipole can only be arranged either parallel or anti-parallel with respect to the applied magnetic field ##\vec{H}##. If we are to use a quantum mechanical description, then parallel magnetic dipoles will have energy ##\mu H## and anti-parallel magnetic dipoles have energy ##-\mu H##...
  37. guyvsdcsniper

    I Relating Entropy and the 2nd law

    So I am midway through my thermodynamics course in college and still feel a bit unsure about the 2nd law and entropy. I've learned that the 2nd law is stated 3 different ways, and by contrapositive proofs we can determine they are all equivalent. What we end up getting for the 2nd law is...
  38. Nick Goodson

    Figuring Out Heat Transfer & Entropy of Steam at 10 MPa

    Hello everybody, would somebody please put me on the right track to answering this question? 'Consider water undergoes a heat transfer at constant pressure of 10 MPa and changes from liquid to steam. Find the entropy of the system (sfg) as well as the heat transfer per unit mass in this...
  39. G

    B Black Hole Entropy: Base of Logarithm Explored

    In textbooks, the Bekenstein-Hawking entropy of a black hole is given as the area of the horizon divided by 4 times the Planck length squared. But the corresponding base of the logarithm and exponential is not written out explicitly. Rather, one often can see drawings where such elementary area...
  40. S

    I Is Entropy in the Universe Lower Now Than After the Big Bang Due to Heat?

    If the universe was very hot right after the Big Bang how come the entropy of the universe was lower at that point than now? Isn't heat a reason for higher entropy?
  41. B

    I Elementary Ideas About Entropy -- Is this textbook example correct?

    Summary:: An elementary example calculation involving entropy in a textbook seems wrong I was reading an elementary introduction to entropy and the second law of thermodynamics. The book gave the example of a gas in a chamber suddenly allowed to expand into an additional portion of the...
  42. O

    I Entropic effects of the Uncertainty Principle?

    Per the Heisenberg uncertainty principle, a particle does not have a precisely defined location. Does such uncertainty contribute to the transfer of thermal energy (i.e. entropy)? Is uncertainty the primary means for the transfer of thermal energy at the quantum level?
  43. Afo

    Why does the entropy of a closed system remain constant in a reversible process?

    Homework Statement:: Why is the entropy of a closed system constant in a reversible process, and not related by ##\Delta S = \int_{i}^{f}\frac{dQ}{T}## (See below for the question in more details) Relevant Equations:: ##\Delta S = \int_{i}^{f}\frac{dQ}{T}## I am reading chapter 24 of Physics...
  44. W

    Coping mechanisms for thermodynamics?

    Wasn’t sure whether I should post this here since it’s a more qualitative question, or under the Thermodynamics thread because that’s a more specific topic. For all practical purposes, the laws of thermodynamics are inviolable, and statistical mechanics puts them on an even firmer theoretical...
  45. cianfa72

    I Entropy change due to heat transfer between sources

    Hi, starting from this thread Question about entropy change in a reservoir, consider the spontaneous irreversible process of heat transfer from a source ##A## at temperature ##T_h## to another source ##B## at temperature ##T_c## (##T_h > T_c##). The thermodynamic 'system' is defined from sources...
  46. L

    B Is Entropy Truly Undefined in Physical Systems?

    I always got a bit confused when listening to podcasts about arrow of time and entropy in the universe. So I was reading more about information theory. I learned today that for physical systems entropy is not defined. All it means is how much uncertainty an observer has when making a prediction...
  47. K

    I Theoretical maximum efficiency of a heat engine without Carnot

    Through an intriguing fictitious dialog between Sadi Carnot and Robert Stirling, Prof. Israel Urieli of Ohio University shows that it is not required to invoke entropy, the second law of thermodynamics, and the Carnot cycle with the [ideal] adiabatic processes in order to find out the...
  48. Yash Agrawal

    Thermodynamics of chemical reactions

    In chemical reactions generally ΔG < 0, but if we were to consider a reversible path between pure reactants and products at 1 bar pressure, shouldn't ΔG = 0 for every reaction? And if it is due to non-PV work, I don't see any non-PV work being done in reactions happening in a closed...
  49. Tertius

    I Is Entropy the inexorable conversion of potential to kinetic energy?

    I know the math behind these, and I'm happy to use more precise language if needed, I just wanted to get some input on this sweeping generalization that entropy is the conversion of potential to kinetic energy. A brief summary of two important branches of entropy: 1) thermodynamics - the total...
  50. EFech

    How Do Configurational and Conformational Entropy Differ in Protein Folding?

    I have been reading about protein thermodynamics and found different types and models for entropy calculation before and after protein folding. I understand vibrational, conformational, and configurational entropy are some of the most studied "types" of protein-folding entropy. My question is...
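Two hedged, minimal Python sketches related to the threads above. The first is for thread 7 (Shannon entropy of DNA sequences): it simply evaluates ##H = -\sum_i p_i \log_2 p_i## over the observed base frequencies of a sequence; the example sequence and the function name are illustrative, not taken from the original task.

```python
import math
from collections import Counter

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical symbol distribution."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Equiprobable bases (P = 1/4 each) give 2 bits per base:
print(shannon_entropy("ACGT" * 25))  # -> 2.0
```

The second relates to threads 1 and 10 (entropy change when two bodies are brought into thermal contact). It assumes constant heat capacities ##C_1, C_2## and no heat loss, so that ##T_f = (C_1 T_1 + C_2 T_2)/(C_1 + C_2)## and ##\Delta S = C_1 \ln(T_f/T_1) + C_2 \ln(T_f/T_2) \ge 0##; the numerical values are placeholders only.

```python
import math

def contact_entropy_change(c1, t1, c2, t2):
    """Final temperature and total entropy change for two bodies in thermal contact.

    c1, c2: heat capacities in J/K; t1, t2: initial temperatures in K.
    """
    tf = (c1 * t1 + c2 * t2) / (c1 + c2)                   # energy conservation
    ds = c1 * math.log(tf / t1) + c2 * math.log(tf / t2)   # sum of the two entropy changes
    return tf, ds

tf, ds = contact_entropy_change(c1=450.0, t1=350.0, c2=450.0, t2=300.0)
print(f"T_f = {tf:.1f} K, dS = {ds:+.3f} J/K")             # dS > 0, as the second law requires
```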