Entropy Definition and 1000 Threads

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 with the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862, Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as it always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements or states of individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, to the macroscopically observable behavior, in form of a simple logarithmic law, with a proportionality constant, the Boltzmann constant, that has become one of the defining universal constants for the modern International System of Units (SI).
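Boltzmann's logarithmic law, ##S = k_B \ln \Omega##, is easy to sketch numerically. Below is a minimal illustration in Python, counting the microstates of a toy system of 100 two-state particles; the function and variable names are mine, not from the text above:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(omega):
    """S = k_B ln(Omega) for a macrostate with Omega microstates."""
    return K_B * math.log(omega)

# Toy system: N two-state particles; the macrostate "n particles up"
# is compatible with Omega = C(N, n) microscopic arrangements.
N, n = 100, 50
omega = math.comb(N, n)
S = boltzmann_entropy(omega)
print(omega, S)
```

For 50 of 100 particles "up" there are about ##10^{29}## arrangements, yet the entropy is only of order ##10^{-21}## J/K, since ##k_B## is so small.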
In 1948, Bell Labs scientist Claude Shannon developed similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. Upon John von Neumann's suggestion, Shannon named this entity of missing information in analogous manner to its use in statistical mechanics as entropy, and gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
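Shannon's entropy is the same logarithmic counting applied to message probabilities: ##H = -\sum_i p_i \log_2 p_i##, measured in bits. A minimal sketch in Python (the function name is mine):

```python
import math

def shannon_entropy(probs):
    """H = -sum p log2(p), in bits; terms with p = 0 contribute nothing."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit of uncertainty; a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([1.0]))        # 0.0
print(shannon_entropy([0.25] * 4))   # 2.0
```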

View More On Wikipedia.org
  1. Amaterasu21

    I How Does Relativity of Simultaneity Clash w/Thermodynamics?

    In special relativity, observers can disagree on the order of events - if Alice thinks events A, B and C are simultaneous, Bob can think A happened before B which happened before C, and Carlos thinks C happened before B which happened before A - provided A, B and C are not causally connected, of...
  2. C

    Entropy question: Does a substance at 0K have no entropy?

    Is it 0 K for a substance to have no entropy? Sorry, just had to get that off my chest.
  3. H

    A Exploring Fringe Visibility from Entropy of Two-Level Particles

    I consider a pair of two-level particles, each of which can be up or down. This pair is described in the tensor product space by the unit vector ##(\cos\theta\,(du + ud) + \sin\theta\,(dd + uu))/\sqrt{2}##. I take its density matrix, trace over one of the two particles, and find the density matrix of each one...
  4. Adam564

    Does a Well-Defined Entropy Exist for Non-Ideal Gases A and B?

    The conclusion of the attempt I list below is that entropies do exist for both, but I am not sure. $$dU=TdS-pdV$$ $$dS=\frac{dU}{T}+\frac{p}{T}dV$$ Therefore, gas A: $$S=\frac{{\Delta}U}{T}+\alpha_A\left(\frac{-N}{{\Delta}V}\right)$$ Gas B...
  5. T

    Help with this thermodynamics and entropy question please

    So I've answered the first question and got a final temperature of 42.06 Celsius. Now for the second one, I don't know why I am getting it wrong. I'm doing 0.215*ln(315.06/291.46) + 1*ln(315.06/319.91), but it says I am wrong. What about my process is faulty?
  6. R

    Gibbs "Paradox" and the Entropy of mixing

    (not a paradox nowadays, but it was an issue for years) https://en.m.wikipedia.org/wiki/Gibbs_paradox It's not a question about a formula. I don't understand the physical motivation for calling Gibbs mixing a "paradox", i.e. the discontinuity point. What bothers physicists enough to ask for a continuous...
  7. iVenky

    I Maxwell's Demon and the Uncertainty Principle

    Maxwell's demon measures the position and velocity of a particle. How can it do that without violating the uncertainty principle? Does that mean the uncertainty principle is unavoidable, since otherwise we would violate the second law of thermodynamics, as in the case of Maxwell's demon?
  8. LCSphysicist

    What Does the First Entropy Equation for a Permeable Membrane Indicate?

    Actually I am trying to see what the first equation for the entropy means; maybe N1 refers to part 1 of the system (the left part, I suppose), or to the molecules of type 1? I am not sure about the equations I will write below; they are probably wrong, anyway. ∂S/∂U1 = 1/T1 = 3NR/2U1 Okay, it will give...
  9. LCSphysicist

    Easy entropy problem, what is the permitted entropy?

    There are five physically possible entropies and five entropies which can't be real; find them all. I could find only four impossible ones: B, H and J: S(λU,λV,λN) ≠ λS(U,V,N); D: ∂S/∂U < 0. What is the other one?
  10. LCSphysicist

    How Does a Semi-Permeable Membrane Affect Entropy in a Two-Chamber Gas System?

    We need to find the system's entropy variation. I don't think I understand very well what is happening in this process; can someone help me get started?
  11. M

    B Low entropy early Universe and the heat death musings

    I came upon a realization recently. The early universe is always described to have begun in a state of extremely low entropy and it's been increasing ever since. But the same amount of stuff exists now as it did back then. Only thing that's changed is how big the universe is now vs then. So...
  12. LCSphysicist

    Sketch a qualitatively accurate graph of the entropy of a substance

    Sketch a qualitatively accurate graph of the entropy of a substance as a function of temperature, at fixed pressure. Indicate where the substance is solid, liquid, and gas. Explain each feature of the graph briefly. What do you think?: dU = -P*dV + T*dS (1) V = C*T => dV = C*dT Nfk*dT/2 +...
  13. LCSphysicist

    The Effect of Increasing the Size of a Solid on Its Entropy

    Does increasing the size of a solid body increase its entropy? I was thinking about it using the Einstein model of a solid: S = k*lnΩ, Ω = (q+n-1)!/((q)!(n-1)!). I am not sure how this question should be answered; I think if we talk about rigid bodies the question doesn't even make sense, but about...
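The counting in this post can be checked directly. Below is a small Python sketch of the Einstein-solid multiplicity, assuming the formula quoted in the post; it suggests that doubling the solid at fixed energy per oscillator does increase ##\ln \Omega## (the function name is mine):

```python
import math

def einstein_multiplicity(q, n):
    """Omega = (q + n - 1)! / (q! (n - 1)!): ways to distribute q energy
    quanta among n oscillators in the Einstein model of a solid."""
    return math.comb(q + n - 1, q)

# Double the solid at fixed energy per oscillator: S/k = ln(Omega) grows,
# roughly doubling, because ln(Omega) is (approximately) extensive here.
s_small = math.log(einstein_multiplicity(50, 50))
s_big = math.log(einstein_multiplicity(100, 100))
print(s_small, s_big)
```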
  14. W

    A Feature Selection, Information Gain and Information Entropy

    Sorry for being fuzzy here, I started reading a small paper and I am a bit confused. These are some loose notes without sources or refs. Say we start with a collection F of features we want to trim into a smaller set F' of features through information gain and entropy (where we are using the...
  15. K

    Why is entropy of an object inversely related to its temperature?

    I read in a book "Quantum Space" by Jim Baggot, page 290, that the entropy of an object is inversely proportional to its temperature. (He was describing the temperature of a black hole. Does this statement only apply to black holes?) No doubt he is correct, but wouldn't an increase of energy...
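For black holes specifically, the inverse relation the book describes can be made quantitative: the Hawking temperature scales as ##1/M## while the Bekenstein-Hawking entropy scales as ##M^2##, so hotter black holes have less entropy. A numeric sketch in Python using the standard SI constants (the function names and the sample masses are mine):

```python
import math

# Physical constants (SI)
HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.380649e-23       # Boltzmann constant, J/K
M_SUN = 1.98892e30       # solar mass, kg

def hawking_temperature(m):
    """T_H = hbar c^3 / (8 pi G M k_B): falls as the mass grows."""
    return HBAR * C**3 / (8 * math.pi * G * m * K_B)

def bh_entropy(m):
    """S = 4 pi G k_B M^2 / (hbar c): grows as the mass squared."""
    return 4 * math.pi * G * K_B * m**2 / (HBAR * C)

# A ten-times-heavier black hole is ten times colder but has 100x the entropy.
for m in (M_SUN, 10 * M_SUN):
    print(hawking_temperature(m), bh_entropy(m))
```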
  16. blazh femur

    Is randomness real or the inability to perceive hyper complex order?

    How did you find PF?: random Brownian motion Is randomness real or is it simply defined as such due to our inability to perceive hyper complex order? Randomness is a troublesome word. I'd feel better if I knew it was an objective phenomenon and not merely a placeholder description of...
  17. J

    Exploring Entropy: A Systems Engineering Perspective

    As a systems engineer I have thought a lot about entropy in trying to get a better intuitive sense for what it is, at a more macro level than it is usually discussed. I have some ideas and am looking for a forum to present and explore them with others. I wish to discuss more from an...
  18. micklat

    I Explaining atoms and bonding using entropy

    I am a biology undergraduate interested in abiogenesis. The entropic explanation for the origin of life is that life is allowed to exist because it increases universal entropy. I am curious about how far we can take this theory down. How can you explain the emergence of atoms and atomic...
  19. T

    Calculate the bond-dissociation energies and entropy of a molecule

    If we know the molecular structure of a complex (organic) chemical, can we calculate the dissociation energy for each and every bond somehow? Also, can we calculate the standard entropy of the same molecule? This information would be needed to calculate the Gibbs free energy for reactions of a...
  20. A

    Is Entropy decreased for Free expansion of a Waals gas?

    Before this problem there was another one: "What is the change in temperature of a van der Waals gas in free expansion?" I got it: C_V dT = -aN^2/V^2 dV, so T = T0 - aN^2/2VC_V. So I knew that the temperature is decreased by free expansion in an adiabatic process. Then I...
  21. A

    Von Neumann Entropy time derivative(evolution)

    I'm not sure about my proof, so please check my steps. I used log as the natural log (ln). Specifically, I'm not sure about "d/dt = dρ/dt d/dρ = i/ħ [ρ, H] d/dρ" in the second line. Also, can a matrix be differentiated with respect to another matrix? (d/dρ (ρ lnρ))
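One sanity check on such a proof: under purely unitary evolution the von Neumann entropy should not change at all, since the eigenvalues of ##\rho## are preserved. A small numeric sketch in Python with NumPy (the state and Hamiltonian below are arbitrary choices of mine, with ##\hbar = 1##):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]              # 0 ln 0 -> 0 by convention
    return float(-np.sum(evals * np.log(evals)))

# A mixed qubit state and an arbitrary Hermitian Hamiltonian.
rho = np.diag([0.7, 0.3]).astype(complex)
H = np.array([[1.0, 0.5], [0.5, -1.0]], dtype=complex)

# Build U = exp(-i H t) from the eigendecomposition of H, then evolve rho.
w, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w * 0.4)) @ V.conj().T
rho_t = U @ rho @ U.conj().T

print(von_neumann_entropy(rho), von_neumann_entropy(rho_t))  # equal
```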
  22. M

    How to calculate the entropy for two sources

    Hello everyone, I'm studying thermodynamics and I would like to better understand the meaning of entropy and how to calculate it. I know that if A and B are two possible states of a system, the equation which defines the variation of entropy from A to B is...
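For a reversible path between the two states, the defining relation is ##\Delta S = \int_A^B \frac{dQ_{rev}}{T}##. As a concrete sketch: heating water at constant pressure gives ##dQ_{rev} = m c\,dT##, so the integral can be done numerically and compared with the closed form ##m c \ln(T_B/T_A)## (Python; constant heat capacity assumed, names mine):

```python
import math

def delta_S(m, c, T_a, T_b, steps=100000):
    """Numerically integrate dS = dQ_rev / T = m c dT / T from T_a to T_b."""
    dT = (T_b - T_a) / steps
    s = 0.0
    for i in range(steps):
        T = T_a + (i + 0.5) * dT       # midpoint rule
        s += m * c * dT / T
    return s

m, c = 1.0, 4186.0                     # 1 kg of water, J/(kg K)
T_a, T_b = 293.15, 353.15              # 20 C -> 80 C
numeric = delta_S(m, c, T_a, T_b)
exact = m * c * math.log(T_b / T_a)    # closed form for constant c
print(numeric, exact)
```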
  23. J

    Exploring the Relationship between Entropy and Competition in Systems

    Is there any particular reason why entropy could not be a measure of or proxy for the level of competition within a system?
  24. G

    How to think about entropy microstates/macrostates for a gas in a box

    I'm trying to relate an analogy from Brian Greene about entropy microstates/macrostates to the real world. In the analogy, you have 100 coins that you flip. The microstate is which particular coins landed heads up. The macrostate is the total number of coins that are heads up. So a low entropy...
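The coin analogy is easy to quantify: the multiplicity of the macrostate "k heads out of N coins" is the binomial coefficient ##\binom{N}{k}##, and its logarithm plays the role of entropy, peaking at the 50/50 macrostate. A minimal Python sketch (names mine):

```python
import math

# 100 coins: a macrostate is "k heads"; its multiplicity is C(100, k).
N = 100
multiplicities = {k: math.comb(N, k) for k in (0, 10, 50)}

# Entropy of each macrostate in units of k_B: S = ln(Omega).
# "All tails" (k = 0) is the unique low-entropy state; k = 50 is the peak.
for k, omega in multiplicities.items():
    print(k, omega, math.log(omega))
```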
  25. Haorong Wu

    I Exploring Entropy in Intro to Statistical Physics by Huang

    Hi, I am currently reading Introduction to statistical physics by Huang. In the section of entropy, it reads But what if I choose ##R-P## as a closed cycle? Then in a similar process, I should have ##\int_{R} \frac {dQ} {T} \leq \int_{P} \frac {dQ} {T}## and ##S \left ( B \right ) - S \left (...
  26. P

    Entropy and the Helmholtz Free Energy of a Mass-Piston System

    Attempt at a Solution: Heat Absorbed By The System By the first law of thermodynamics, dU = dQ + dW The system is of fixed volume and therefore mechanically isolated. dW = 0 Therefore dQ = dU The change of energy of the system equals the change of energy of the gas plus the change of energy...
  27. forkosh

    I Von Neumann entropy for "similar" pvm observables

    The von Neumann entropy for an observable can be written ##s=-\sum\lambda\log\lambda##, where the ##\lambda##'s are its eigenvalues. So suppose you have two different pvm observables, say ##A## and ##B##, that both represent the same resolution of the identity, but simply have different...
  28. S

    A QFT topics for entanglement entropy

    I want to know what are the QFT topics that I need to understand in order to proceed in reading papers on entanglement entropy such as, Entanglement Entropy and Quantum Field Theory Entanglement entropy in free quantum field theory Entanglement entropy: holography and renormalization group An...
  29. L

    Entropy change for two masses of water mixed adiabatically

    The entropy change for a reversible adiabatic process is zero, as entropy remains constant. Is this a reversible process? Assuming T1 > T2: hot (h) water has mass M and temperature T1; cold (c) water has mass nM and temperature T2; let the final temperature be Tf. If δQ = 0, as the process is adiabatic, then |Qh| = |Qc|, so Qh = -Qc...
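Assuming constant heat capacity, the standard result for this setup is ##\Delta S_{univ} = \sum_i m_i c \ln(T_f/T_i) > 0##, which is why adiabatic mixing is irreversible even though ##\delta Q = 0## for the combined system. A numeric sketch in Python (the masses and temperatures are arbitrary examples of mine):

```python
import math

def mix_entropy(m_hot, T_hot, m_cold, T_cold, c=4186.0):
    """Adiabatic mixing of two water masses: final temperature from the
    energy balance, entropy change of each mass from dS = m c dT / T."""
    T_f = (m_hot * T_hot + m_cold * T_cold) / (m_hot + m_cold)
    dS_hot = m_hot * c * math.log(T_f / T_hot)     # negative: it cools
    dS_cold = m_cold * c * math.log(T_f / T_cold)  # positive: it warms
    return T_f, dS_hot + dS_cold

T_f, dS = mix_entropy(1.0, 350.0, 2.0, 290.0)
print(T_f, dS)   # total entropy change is positive: irreversible
```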
  30. Pouyan

    Change in Entropy for an isolated system

    ΔU_A + ΔU_B = 0 (is this because the system is isolated, am I right?) ΔU_A = C_A (T_final - T_A), ΔU_B = C_B (T_final - T_B). And because of a very slow process: dS = C dT/T, so T_final = (C_A T_A + C_B T_B)/(C_A + C_B) and ΔS_final = C_A ln(T_f/T_A) + C_B ln(T_f/T_B). My QUESTION is: when we say no heat exchange...
  31. K

    The Increasing Disorder: Exploring the Limits of Entropy in the Universe

    In a shrinking universe heat will increase, but also volume available to place particles will decrease. What happens to entropy when the volume gets very small and the temperature is very high?
  32. P

    Constant Pressure Specific Heat in terms of Entropy and Enthalpy

    If ##N## is constant (per the partial derivatives definitions/ the subscripts after the derivatives) then ##G## is constant ##H - TS = constant## Taking the derivative of both sides with respect to ##T## while holding ##N,P## constant we get the following with the use of the product rule...
  33. T

    What is the change in entropy for a colloid settling out of solution?

    If it occurs spontaneously then it must increase entropy, but the number of possible microstates is reduced, so what else is occurring to increase entropy?
  34. Saptarshi Sarkar

    Meaning of thermodynamic probability

    I was studying statistical mechanics when I came to know about the Boltzmann's entropy relation, ##S = k_B\ln Ω##. The book mentions ##Ω## as the 'thermodynamic probability'. But, even after reading, I can't understand what it means. I know that in a set of ##Ω_0## different accessible states...
  35. Philip Koeck

    Show that entropy is a state function

    In a (reversible) Carnot cycle the entropy increase of the system during isothermal expansion at temperature TH is the same as its decrease during isothermal compression at TC. We can conclude that the entropy change of the system is zero after a complete Carnot cycle. The mentioned textbook now...
  36. Y

    Entropy change when melting ice then refreezing the water

    ##dm\,L_f = Q##, ##\Delta T = \frac{T(v_l - v_s)\Delta P}{L}##, ##\frac{dm\,L_f}{T_0} = dS_2##, ##\frac{dm\,L_f}{T_1} = dS_1##
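A minimal numeric sketch of the entropy bookkeeping for melting (Python; the latent heat is the standard value for water, and the reservoir temperature is an arbitrary choice of mine). Melting at ##T_0## with heat drawn from a hotter reservoir at ##T_1## produces net entropy ##dm\,L_f\,(1/T_0 - 1/T_1) > 0##:

```python
L_F = 3.34e5    # latent heat of fusion of water, J/kg
T_ICE = 273.15  # melting point of ice, K

def melt_entropy(m, T_source):
    """Melt mass m of ice at T_ICE with heat drawn from a reservoir at
    T_source; return (dS_ice, dS_source, dS_total)."""
    Q = m * L_F
    dS_ice = Q / T_ICE          # ice gains entropy at its own temperature
    dS_source = -Q / T_source   # reservoir loses entropy at T_source
    return dS_ice, dS_source, dS_ice + dS_source

# 100 g of ice melted by a 300 K reservoir: the total is positive.
print(melt_entropy(0.1, 300.0))
```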
  37. Rahulx084

    Thermodynamics -- Temperature of a Heat Source?

    In a heat engine we define a heat source from which heat is transferred to the system, and we say that the heat source has a temperature ##T_h##. When we define a Carnot heat engine, the first process we have is an isothermal expansion, and we say heat has to come into the system through this process, and here...
  38. F

    What is the Entropy at Zero Kelvin?

    Planck states that all perfect crystalline systems have the same entropy in the limit as T approaches zero, so we can set the entropy equal to zero. Can we demonstrate that, or is it only a presumption?
  39. V

    Relation between the arrow of time and entropy

    What is the relation of the arrow of time and entropy according to thermodynamics?
  40. olgerm

    How to calculate entropy from positions and velocities of gas molecules

    How do we calculate entropy from the positions and velocities of gas molecules? Let's say we have 2 different gases. The entropy should be bigger after mixing them than before, when they are separated. But how do we calculate the exact entropies knowing only the positions and velocities of the gas molecules?
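Positions and velocities alone only yield an entropy after coarse-graining into microstates; for ideal gases at the same T and P, though, the mixing contribution has the closed form ##\Delta S_{mix} = -R\sum_i n_i \ln x_i##, which vanishes when the "two" gases are actually one. A sketch of that formula in Python (function name mine):

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def mixing_entropy(n1, n2):
    """Ideal-gas entropy of mixing at equal T and P:
    dS = -R (n1 ln x1 + n2 ln x2), with x_i the mole fractions."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    terms = [ni * math.log(xi) for ni, xi in ((n1, x1), (n2, x2)) if xi > 0]
    return -R * sum(terms)

print(mixing_entropy(1.0, 1.0))  # maximal per mole at a 50/50 split
print(mixing_entropy(2.0, 0.0))  # a single gas: no mixing entropy
```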
  41. WhiteWolf98

    Finding Isentropic Enthalpy, knowing Isentropic Entropy

    A short background: My question focuses solely on the part of the refrigeration cycle to do with the compressor, where the cycle begins. The first state is before the refrigerant enters the compressor, and the second state is after the refrigerant leaves the compressor. My goal is to obtain...
  42. Physics345

    Entropy and the Laws of Thermodynamics

    Hello everyone, I have to write a paper about entropy and how it relates to the laws of thermodynamics, energy, and work. I have taken a deductive approach starting from the zero-th law to the second law of thermodynamics as follows. Entropy is the disorder of a system (Class Video, 2019)...
  43. S

    Can Macroscopically Distinguishable Objects Have the Same Entropy?

    I'm kinda confused on the concept of entropy of everyday, low entropy states like macroscopic objects. It is said that the entropy is a measure of disorder, or distinguishability between macroscopic states. Can two objects which are macroscopically distinguishable/look different have the same...
  44. Hawzhin Blanca

    A Many worlds, observer and Entropy

    According to the Everett interpretation, or many-worlds interpretation, of quantum mechanics, each time an observer makes a decision the world splits into two parallel universes. Let's say an observer at some point in spacetime performs the Schrödinger's cat experiment; in one branch of the universe the cat...
  45. Y

    Differentials of Entropy for Air and Water at Different Temperatures

    for a) ##\Delta S=\mp \int_{T_i}^{T_0}\frac{C(T)}{T}dT## and ##\Delta S_{th}=\int_{T_i}^{T_0}\frac{dQ}{T_0}##, so ##S_{univ}=\Delta S_{th}+\Delta S##. What is ##dQ## equal to? I don't know how to answer question b). Thank you for your help.
  46. J

    Thermodynamic equilibrium with fixed energy/entropy

    If you take a system with fixed entropy S0 and let it evolve, it reaches equilibrium. Let Ueq be the energy of the system at equilibrium. Now take the same system with fixed energy U = Ueq (S is no longer fixed); how do you know that the equilibrium reached is the same as before, that is, with...
  47. H

    Entropy in a non inertial reference frame

    I know that the entropy of a system is the same in different inertial frames. Is this still the case for non inertial frames? For example, is the entropy of a body as seen from a rotating reference frame the same as the entropy seen from a fixed frame?
  48. T

    Entropy increase of solid vs liquid

    A hypothetical question. Heat Q is transferred from water to a metallic solid. Both have the same heat capacity and the same initial temperature. Now, since molecules in a liquid are more randomly oriented than in a solid, will the entropy decrease of the liquid be more than the entropy increase of...
  49. F

    Why do we need to supply energy to a biological body to keep entropy from increasing?

    Why is it that when we supply energy to a biological body, the body can keep its entropy from increasing? We know that, by definition, temperature equals the partial derivative of internal energy with respect to entropy. So, with temperature constant, if the internal energy increases (by supplying energy to the body) then...