Entropy Definition and 1000 Threads

Entropy is a scientific concept, as well as a measurable physical property, that is most commonly associated with a state of disorder, randomness, or uncertainty. The term and the concept are used in diverse fields, from classical thermodynamics, where it was first recognized, to the microscopic description of nature in statistical physics, and to the principles of information theory. It has found far-ranging applications in chemistry and physics, in biological systems and their relation to life, in cosmology, economics, sociology, weather science, climate change, and information systems, including the transmission of information in telecommunication.

The thermodynamic concept was referred to by Scottish scientist and engineer Macquorn Rankine in 1850 under the names thermodynamic function and heat-potential. In 1865, German physicist Rudolf Clausius, one of the leading founders of the field of thermodynamics, defined it as the quotient of an infinitesimal amount of heat to the instantaneous temperature. He initially described it as transformation-content, in German Verwandlungsinhalt, and later coined the term entropy from a Greek word for transformation. Referring to microscopic constitution and structure, in 1862 Clausius interpreted the concept as meaning disgregation.

A consequence of entropy is that certain processes are irreversible or impossible, beyond the requirement of not violating the conservation of energy, the latter being expressed in the first law of thermodynamics. Entropy is central to the second law of thermodynamics, which states that the entropy of an isolated system left to spontaneous evolution cannot decrease with time, as such a system always arrives at a state of thermodynamic equilibrium, where the entropy is highest.
Austrian physicist Ludwig Boltzmann explained entropy as the measure of the number of possible microscopic arrangements, or states, of the individual atoms and molecules of a system that comply with the macroscopic condition of the system. He thereby introduced the concept of statistical disorder and probability distributions into a new field of thermodynamics, called statistical mechanics, and found the link between the microscopic interactions, which fluctuate about an average configuration, and the macroscopically observable behavior, in the form of a simple logarithmic law with a proportionality constant, the Boltzmann constant, which has become one of the defining universal constants for the modern International System of Units (SI).
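The "simple logarithmic law" referred to above is Boltzmann's entropy formula, relating the entropy S of a macrostate to the number W of microstates compatible with it (a standard statement added here for reference; it is not spelled out in the excerpt):

$$ S = k_B \ln W, \qquad k_B = 1.380649 \times 10^{-23}\ \mathrm{J\,K^{-1}} \ \text{(exact in the 2019 SI)} $$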
In 1948, Bell Labs scientist Claude Shannon applied similar statistical concepts of measuring microscopic uncertainty and multiplicity to the problem of random losses of information in telecommunication signals. At John von Neumann's suggestion, Shannon named this quantity of missing information entropy, in an analogous manner to its use in statistical mechanics, and thereby gave birth to the field of information theory. This description has been proposed as a universal definition of the concept of entropy.
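As a concrete illustration of Shannon's measure, here is a minimal Python sketch (the function name and the example distributions are illustrative, not taken from the text) that computes H = -Σ p_i log2 p_i for a discrete distribution:

    import math

    def shannon_entropy(probs):
        """Shannon entropy, in bits, of a discrete probability distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries 1 bit of uncertainty per toss; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # ~0.469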

View More On Wikipedia.org
  1. A

    Use of Entropy for a Control Volume in Energy Balance

    Hi all, I'm having some trouble figuring out why entropy is used instead of enthalpy for an open system. From what I understand, an open system uses entropy to calculate internal energy. Since the control volume is constant (i.e. Δv = 0), wouldn't using h = u + PΔv effectively be h = u? So...
  2. S

    Entropy vs. enthalpy in chemical reactions.

    Hello, I am learning about using the free energy change ΔG to determine whether a chemical reaction will occur spontaneously: ΔG = ΔH - TΔS. Now, the enthalpy change can drive a reaction even when it leads to a decrease in entropy (multiple reactants => a single product). My Question...
  3. K

    A Does Minimal Black Hole Entropy Suggest a Fundamental Spacetime Structure?

    If we plug the Planck mass into the Bekenstein-Hawking formula for the BH entropy, we'll get S = A/4l_P^2 = 4πGM^2/cħ = 4π ≈ 12.56 nat for the minimal Schwarzschild black hole. If we assume that each entropy unit is a compact area on the horizon, can we consider the minimal BH a dodecahedron...
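    A quick numerical cross-check of the 4π figure quoted above, as an illustrative Python sketch (the constant values are standard SI/CODATA numbers, not taken from the post):

        import math

        G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
        hbar = 1.054571817e-34  # reduced Planck constant, J s
        c    = 2.99792458e8     # speed of light, m/s

        M_planck = math.sqrt(hbar * c / G)                # Planck mass, ~2.18e-8 kg
        S = 4 * math.pi * G * M_planck**2 / (hbar * c)    # S/k_B in nats
        print(S, 4 * math.pi)                             # both ~12.566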
  4. E

    Required heat, entropy change of object dropped in water

    Homework Statement: A 1 liter container is filled with argon to a pressure of 10^5 Pa at 303 K. It is dropped into a pool at a temperature of 323 K. How much heat is needed to heat the gas to 323 K? What is the entropy change of the gas and of the universe? Ignore the entropy change of the container. Homework Equations...
  5. J co

    Which Entropy Functions Are Not Extensive?

    Homework Statement: Which of the following are not extensive functions: S_1 = (N/V)[S_0 + C_v ln(T) + R ln(V)], S_2 = N[S_0 + C_v ln(T) + R ln(V/N)], S_3 = N[S_0 + C_v ln(T) + R ln(V/N)]. Homework Equations: I'm not really sure how to approach this problem. The definition that I...
  6. S

    Entropy change in a reversible isothermal process

    Why does ∆S = 0 for a reversible process, but for a reversible isothermal process, ∆S is given by nR ln(Vf/Vi) (or other variations of that equation)?
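    A short clarifying note (standard textbook reasoning, not taken from the thread): it is the total entropy of system plus surroundings that is unchanged in a reversible process, while the gas itself can still gain entropy. For a reversible isothermal expansion of an ideal gas,

    $$ \Delta S_\text{gas} = nR\ln\frac{V_f}{V_i}, \qquad \Delta S_\text{surr} = -\,nR\ln\frac{V_f}{V_i}, \qquad \Delta S_\text{total} = 0. $$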
  7. Philip Koeck

    Why can one calculate entropy change for thermal conduction?

    A hot object in thermal contact with a cold one will finally reach a temperature in between. Why can the entropy change of each object be calculated as if the process was reversible? Is there a reversible process with the same final and initial state and what would that be?
  8. N

    Enthalpy at the throat of the nozzle

    Hi, This is my first post in this forum. I have a total enthalpy ht = 3000 kJ/kg with velocity at inlet v1 = 20 m/s, speed of sound a1 = 562.5 m/s and outlet pressure P2 = 1 bar. With the formula ht = h* + (a*^2)/2, how do I calculate the actual value at the throat, h*? NOTE: (by trial and error) I can...
  9. R

    Entropy of a Gas Under Pressure: Does Temperature Trump Pressure?

    Can anyone please answer this question? I have read that, for a gas, increased temperature increases entropy and increased pressure decreases entropy, and vice versa: decreased temperature decreases entropy and decreased pressure increases entropy. Can anyone please tell me, for a gas under pressure...
  10. R

    Gibbs free energy -- mathematical expression

    I am not able to understand the mathematical expression for the "change in Gibbs free energy". For a chemical reaction occurring at constant temperature and constant pressure, (ΔS)total = (ΔS)system + (ΔS)surroundings. Considering that the reaction is exothermic, let ΔH be the heat supplied by the system to...
  11. S

    Time, Spacetime & The Arrow of Entropy

    Physicists refer to "spacetime", lumping together the dimensions X, Y, Z, and T as if they were all of the same kind. This reductionism is the product of mathematical rigor. But in our daily lives, we don't experience T in the same way we experience X, Y, and Z. I can arbitrarily set the...
  12. O

    Irreversible adiabatic process - is the entropy change zero?

    Homework Statement: A well-insulated container consists of two equal volumes separated by a partition - one half holds an ideal gas while the other is a vacuum. The partition is removed, and the gas expands. What is the entropy change per mole? Homework Equations: dS = dQ_rev/T, S/R = (Cp/R) dT/T -...
  13. entropy1

    Why is entropy not reversible?

    Is there an easy way to explain, in layman's terms, why entropy in an open system is not reversible?
  14. gloppypop

    Power Consumption and Entropy Generation

    Homework Statement: 2-3-15 Homework Equations: P = power. W = work. U = internal energy. S = entropy. t = time. Q = heat. T = temperature. F = force. d = distance. P = ΔW/Δt = ΔU/Δt, ΔS = ΔQ/T, dm/dt = ρ⋅dV/dt, W = F ⋅ d, ΔU = Q - W. Where m is mass, V is volume, and ρ is the...
  15. C

    Conceptual explanation for ΔS=Q/T

    So I understand how to solve a problem using the equation ΔS = Q/T, but is there a conceptual explanation for why the equation works? I'm not looking for a proof; I simply want to understand why this equation should be intuitive. Thanks for your time!
  16. A

    Can the entropy be reduced in maximization algorithms?

    In maximization algorithms like those used in artificial intelligence, the posterior probability distribution is more likely to favour one or a few outcomes than the prior probability distribution. For example, in a robot's learning of localization, the posterior probability given certain sensor...
  17. zawy

    Is the Moon responsible for reducing Earth's entropy and making life possible?

    Spontaneous negative entropy reactions can occur when the internal energy decrease is greater than the negative dS*T. dG=dU-dST is spontaneous if dG is negative. The moon is receiving at least 1E8 J/s from the loss of rotational energy from the Earth's water and air. A lot more rotational...
  18. D

    Confusion about T in the definition of entropy

    In the derivation of the Clausius inequality, T is the temperature of the reservoir at that point in the cycle, but in the definition of entropy it becomes the temperature of the system. This seems to work for a Carnot cycle, where the two are the same, but for other processes, such as an object...
  19. T S Bailey

    Is Shannon Entropy Dependent on Perspective?

    If you have multiple possible states of a system then the Shannon entropy depends upon whether the outcomes have equal probability. A predictable outcome isn't very informative after all. But this seems to rely on the predictive ability of the system making the observation/measurement. This...
  20. B

    Calculating microstates and entropy

    Homework Statement: Two identical brass bars in a chamber with a perfect (thermally insulating) vacuum are at respective temperatures T_hot > T_cold. They are brought into contact so that they touch, make perfect diathermal contact, and equilibrate towards a common temperature. We want to...
  21. mfig

    I Why do we see the claim that an isentropic process is adiabatic and reversible?

    Why do we always see the claim that an isentropic process for a system is adiabatic and reversible? The change in entropy for a process is the sum of the entropy transfer accompanying heat and the entropy production. The entropy production term is always at least zero, and the transfer term...
  22. A

    Why define entropy with heat instead of work?

    From what I understand, in the Carnot cycle summing q_i/T_i for each step results in zero, thus indicating a new state function, entropy = q_rev/T. But since dE = 0 = q + w, then q = -w, and looking at the equations derived from the cycle, summing w_i/T_i for each step should also result in zero. So why...
  23. T

    Can Entropy Decrease in Endothermic Reactions?

    I recently saw somewhere (I forget where) that entropy decreases in endothermic reactions. On other sites, however, they say that entropy can only increase and not decrease. Can someone tell me which is right? Thanks
  24. Feeble Wonk

    B Can entropy be measured in a singularity of infinite density?

    Please have pity for the idiot in the room. I've tried to look into this concept through a few papers I've found on-line, but the mathematics involved is too far over my head. I'm trying to wrap my head around the general concept of entropy as it applies to a singularity. In a singularity of...
  25. Coffee_

    Finding the entropy from the heat capacity

    Let's say that we have some canonical ensemble where I know that the heat capacity is given by ##C_{V}=\alpha(N,V) T^{n}## Since ##C_{V}=T\frac{\partial S(T,V)}{\partial T}## I know that ##S(V,T)=\frac{1}{n} \alpha(N,V) T^{n} + f(N,V) ## Where the function ##f(N,V)## has to do with the fact...
  26. A

    Entropy Generation Homework: 100kPa-500kPa, 300K-500K, 600K

    Homework Statement: Air at 100 kPa, 300 K is to be delivered to a pipeline at 500 kPa, 500 K. The scheme involves reversible adiabatic compression of the air and then internally reversible isobaric heating. Assume that heat is exchanged with a reservoir at 600 K. Determine the work and...
  27. O

    How Do You Calculate Work and Entropy Generation for Reversible Processes?

    Homework Statement: Atmospheric air at 300 K and 100 kPa is to be delivered to a line at 600 K, 200 kPa. Two possible schemes for doing this are suggested. The first scheme involves reversible adiabatic compression and then internally reversible isobaric heating. The second scheme involves...
  28. Ryaners

    Calculating entropy change associated with change in temperature

    Hi folks, This is a question about how to calculate entropy change when there is a temperature change involved. I got the correct answer to this, but I don't actually understand why it's correct! Any help is much appreciated. Homework Statement: One mole of liquid bromine is heated from 30...
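    For heating at (approximately) constant heat capacity, the usual route is to integrate dS = dQ_rev/T = n Cp dT/T. A minimal Python sketch of that calculation (the function and the numbers fed to it are illustrative; the bromine data in the excerpt above are truncated):

        import math

        def delta_S_heating(n, cp, T1, T2):
            """Entropy change for heating n moles with constant molar heat capacity cp, in J/(mol K)."""
            return n * cp * math.log(T2 / T1)

        # Example with made-up values: 1 mol, cp = 75 J/(mol K), heated from 300 K to 330 K
        print(delta_S_heating(1.0, 75.0, 300.0, 330.0))   # ~7.15 J/K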
  29. D

    Shannon entropy of logic gate output.

    Homework Statement: A particular logic gate takes two binary inputs A and B and has two binary outputs A' and B'. I won't reproduce the truth table; suffice to say every combination of A and B is given. The output is produced by A' = NOT A and B' = NOT B. The input has...
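    Since a pair of NOT gates is a deterministic, one-to-one map, it only relabels outcomes and so leaves the Shannon entropy unchanged. A Python sketch of that point (the input distribution is assumed uniform purely for illustration; the actual distribution is truncated in the excerpt):

        import math
        from collections import Counter

        def H(probs):
            """Shannon entropy in bits."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # Assumed, illustrative input distribution over (A, B)
        inputs = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

        # Gate: A' = NOT A, B' = NOT B
        outputs = Counter()
        for (a, b), p in inputs.items():
            outputs[(1 - a, 1 - b)] += p

        print(H(inputs.values()), H(outputs.values()))   # both 2.0 bits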
  30. D

    Copenhagen interpretation and entropy confusion?

    I don't see how the Copenhagen interpretation and the second law of thermodynamics can be compatible. In the Copenhagen interpretation, upon losing coherence the system chooses a single definite state and all other possible states are eradicated. This seems to be losing entropy to me, as it's...
  31. rjbeery

    "Randomness Through Entropy" Paradox

    In Information Theory, entropy is defined as the unpredictability of information content and, as such, the entropy of the output from so-called pseudo random number generators (PRNG) is often measured as a test of their "randomness". An interesting paradox arises with this definition... Start...
  32. Kiarash

    Why is the logarithm of the number of all possible states of

    Temperature of a system is defined as $$\left( \frac{\partial \ln(\Omega)}{\partial E} \right)_{N, X_i} = \frac{1}{kT}$$ where Ω is the number of all accessible states (ways) for the system. Ω can only take discrete values. What does this mean from a mathematical perspective? Many people say we...
  33. M

    Why Is radiation the highest form of entropy?

    Hello everyone, I heard in a seminar that radiation is the highest form of entropy. Why? I was also wondering whether all matter would somehow transform into radiation in the end.
  34. K

    Did the Pressure of a Monatomic Gas Change During Isentropic Heating?

    Homework Statement: A sample containing 3.65 mol of a monatomic ideal gas is heated from 289 K to 458 K, and the entropy remains constant. If the initial volume of the sample was 0.0980 m^3, by what factor did the pressure increase or decrease during this process? Homework Equations / The Attempt at...
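    A sketch of one standard route (for an ideal gas, ΔS = 0 gives P2/P1 = (T2/T1)^(Cp/R), with Cp = 5R/2 for a monatomic gas); the temperatures are those quoted in the excerpt, and the result is illustrative:

        # Isentropic change of a monatomic ideal gas: pressure ratio from the temperature ratio
        T1, T2 = 289.0, 458.0          # K
        gamma = 5.0 / 3.0              # monatomic ideal gas
        factor = (T2 / T1) ** (gamma / (gamma - 1.0))   # = (T2/T1)**2.5
        print(factor)                  # ~3.2, i.e. the pressure increases by roughly that factor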
  35. A

    Entropy Difference of an Unknown Gas (not an ideal gas)

    Homework Statement: Temperature, pressure and volume measurements performed on 1 kg of a simple compressible substance in three stable equilibrium states yield the following results. State 1 (T1 = 400 °C, V1 = 0.10 m^3, P1 = 3 MPa); State 2 (T2 = 400 °C, V2 = 0.08 m^3, P2 = 3.5 MPa); State 3 (T3 = 500 °C, V3 =...
  36. M

    Standard entropy of liquid lead at 500C?

    Homework Statement: The standard entropy of lead at 25 °C is S(298 K) = 64.80 J/(K·mol). The heat capacity of solid lead is Cp(s) = 22.13 + 0.01172 T + 0.96×10^5 T^-2. The heat capacity of liquid lead is Cp(l) = 32.51 - 0.00301 T. The melting point is 327.4 °C and the heat of fusion is 4770 J/mol. Calculate the standard...
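    A sketch of how the quoted data could be combined, S(773 K) = S(298 K) + solid heating + fusion + liquid heating; the heat capacities are assumed to be in J/(K·mol), and the printed number is only as good as those inputs:

        import math

        T0, Tm, T1 = 298.15, 327.4 + 273.15, 500.0 + 273.15   # K
        S0, Hfus = 64.80, 4770.0                               # J/(K mol), J/mol

        def dS_solid(Ta, Tb):
            # integral of Cp(s)/T dT with Cp(s) = 22.13 + 0.01172*T + 0.96e5*T**-2
            return (22.13 * math.log(Tb / Ta)
                    + 0.01172 * (Tb - Ta)
                    - 0.5 * 0.96e5 * (Tb**-2 - Ta**-2))

        def dS_liquid(Ta, Tb):
            # integral of Cp(l)/T dT with Cp(l) = 32.51 - 0.00301*T
            return 32.51 * math.log(Tb / Ta) - 0.00301 * (Tb - Ta)

        S = S0 + dS_solid(T0, Tm) + Hfus / Tm + dS_liquid(Tm, T1)
        print(S)   # roughly 100 J/(K mol) with these inputs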
  37. K

    What is the change in entropy ΔS of the gas?

    Homework Statement: Two moles of an ideal gas undergo a reversible isothermal expansion from 3.37×10^-2 m^3 to 4.29×10^-2 m^3 at a temperature of 29.6 °C. What is the change in entropy ΔS of the gas? Homework Equations: pV = nRT. The Attempt at a Solution: W = ∫ from V1 to V2 of p dV; I don't know how to use this...
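    A sketch of the standard route (for an isothermal ideal-gas expansion, ΔS = nR ln(V2/V1)); the numbers are those quoted in the excerpt:

        import math

        n, R = 2.0, 8.314          # mol, J/(mol K)
        V1, V2 = 3.37e-2, 4.29e-2  # m^3
        dS = n * R * math.log(V2 / V1)
        print(dS)                  # ~4.0 J/K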
  38. J

    Entropy of Liquid water Calculation.

    Homework Statement: Calculate ΔS° when 0.5 mole of liquid water at 0 °C is mixed with 0.5 mole of liquid water at 100 °C. Assume Cp = 18 cal/(deg·mol) over the whole range of temperatures. Homework Equations: ΔS = ∫ Cp/T dT
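    A sketch under the usual assumptions (equal amounts and constant Cp, so the final temperature is the mean, 50 °C; the rest is two applications of ΔS = n Cp ln(Tf/Ti)):

        import math

        Cp = 18.0                      # cal/(deg mol), as given
        T1, T2 = 273.15, 373.15        # K
        Tf = 0.5 * (T1 + T2)           # 323.15 K for equal amounts and constant Cp
        dS = 0.5 * Cp * math.log(Tf / T1) + 0.5 * Cp * math.log(Tf / T2)
        print(dS)                      # ~0.22 cal/K (positive, as expected for an irreversible mixing)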
  39. lecturer

    Entropy & Enthalpy Analogy: Understand Easily

    I want an analogy for entropy & enthalpy... For example, a quasi-static process can be related to "an old man with a health problem who walks almost statically", so students can easily understand it. Can you please suggest any analogy for ENTROPY & ENTHALPY?
  40. F

    What happens to entropy when doubling the volume?

    Homework Statement: A container of volume 2V is divided into two compartments of equal volume by an impenetrable wall. One of the compartments is filled with an ideal gas with N particles. The gas is in equilibrium and has a temperature T. How does the total energy, the entropy, the temperature...
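    If the partition is removed and the gas freely expands into the empty half (the scenario suggested by the title), the standard results are ΔU = 0 and ΔT = 0 for an ideal gas, while the entropy grows (a textbook result quoted for reference, not taken from the truncated excerpt):

    $$ \Delta S = N k_B \ln\frac{2V}{V} = N k_B \ln 2 $$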
  41. S

    Does Entropy Change Consider Irreversibility in Solids and Liquids?

    Usually, the entropy change of a solid or liquid is formulated as Cp (or Cv) ln(T2/T1), or as ∫ dQ/T + Sgen. Considering the former and the latter expressions, do their entropy changes include the irreversibility due to Sgen? And next question: a reservoir has no change in temperature and volume...
  42. T

    Internal energy + entropy for molecule

    Hello, Internal energy can be defined theoretically for one molecule (e.g. U = ½ k_B T), but entropy is defined for a system, thus for many molecules. Then we define temperature as ∂U/∂S, but here U can be defined for one molecule, so can S also be defined for one molecule? How...
  43. T

    Why Entropy is Always on the Rise: Exploring the Question

    The most common illustration of entropy is the box with the partition and the two gases on either side. It took energy to separate the gases on either side of the partition, but if we remove the partition, no additional work needs to be done in order to get them to mix. We must expend more...
  44. Islam Hassan

    Gravity and Entropy: Exploring the Relationship

    If gravity leads, for example to a gas cloud gradually coalescing into a proto-star, does this proto-star not have less entropy than the gas cloud that engendered it? If yes, then in what sense is overall entropy increased, in what way can we say that entropy globally has been increased via the...
  45. R

    Entropy change mixing oil and water question

    My question is: when oil and water from separate containers are poured into the same container and given time to equilibrate, is the entropy change positive, negative, or zero? I've been told that there is zero change in entropy because the water and oil do not mix at all. However, I suspect...
  46. jdawg

    Ideal Gas Entropy Equation Conceptual Question

    Homework Statement: I'm having a little trouble knowing when to use the ideal gas equations for entropy vs. just the ones like this: (T2/T1) = (p2/p1)^((k-1)/k). I've noticed a pattern in the solutions for my homework (where you're finding the isentropic efficiency of turbines and compressors); they...
  47. vetgirl1990

    Entropy change of calorimetric process

    Homework Statement: A 20 kg sample of mercury is completely solidified and liberates 231.6 kJ of energy. What is the original temperature of the mercury? (The melting point of mercury is 234 K, the heat of fusion of mercury is 11.3 kJ/kg, and the specific heat of mercury is 140 J/kg·K.) What is...
  48. T

    Phase changes, change of entropy is temperature independent

    Good morning everyone! My chemistry final is approaching and I have a few difficulties with the thermodynamics chapter. One of the things that is bothering me is the calculation of the temperature of a phase change. We know that if a mole of ice at 273 K melts, the entropy change would be...
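    For reference, the entropy of a phase change at the equilibrium transition temperature is ΔS = ΔH_fus/T; with the commonly tabulated ΔH_fus ≈ 6.01 kJ/mol for ice (a standard value, not given in the excerpt):

    $$ \Delta S_\text{fus} \approx \frac{6010\ \mathrm{J/mol}}{273.15\ \mathrm{K}} \approx 22\ \mathrm{J\,K^{-1}\,mol^{-1}} $$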
  49. S

    Where does entropy generation come from?

    Let's think about two thermal reservoirs. They are internally reversible. This means they don't have entropy generation (Sgen), right? Each reservoir's entropy change is different; one is negative (-) and the other is positive (+). But the total entropy change in the isolated system has a positive value (entropy...
  50. A

    Calculating entropy generation of a process

    Let's say we have T1. From a reservoir at 4000 K, Q is added, and that makes the temperature T2. How do we calculate the entropy generation in a process like that? Isn't it Sgen = S2 - S1 - (Q/T)? But which temperature should we use in the (Q/T) term: T1, T2 or 4000 K?