Boltzmann distribution

In statistical mechanics and mathematics, a Boltzmann distribution (also called Gibbs distribution) is a probability distribution or probability measure that gives the probability that a system will be in a certain state as a function of that state's energy and the temperature of the system. The distribution is expressed in the form:

p_i \propto e^{-\varepsilon_i / (kT)}
where p_i is the probability of the system being in state i, ε_i is the energy of that state, and the constant kT of the distribution is the product of the Boltzmann constant k and the thermodynamic temperature T. The symbol ∝ denotes proportionality (see § The distribution for the proportionality constant).
The term system here has a very wide meaning; it can range from a single atom to a macroscopic system such as a natural gas storage tank. Because of this, the Boltzmann distribution can be used to solve a very wide variety of problems. The distribution shows that states with lower energy will always have a higher probability of being occupied.
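As a minimal numerical sketch (not from the article), the probabilities can be obtained by forming the unnormalized weights e^{-ε_i/(kT)} for each state and dividing by their sum. The energy levels and temperature below are made-up values chosen only to show the shape of the calculation.

```python
import numpy as np

# Illustrative values only: three hypothetical energy levels (J) at room temperature.
k_B = 1.380649e-23                             # Boltzmann constant, J/K
T = 300.0                                      # thermodynamic temperature, K
energies = np.array([0.0, 2.0e-21, 4.0e-21])   # state energies eps_i, J

# Unnormalized Boltzmann weights e^{-eps_i / (k_B T)} ...
weights = np.exp(-energies / (k_B * T))
# ... normalized so the probabilities sum to 1; the normalization supplies the
# proportionality constant mentioned above (the partition function).
probabilities = weights / weights.sum()

print(probabilities)                           # lower-energy states come out more probable
```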
The ratio of probabilities of two states is known as the Boltzmann factor and characteristically only depends on the states' energy difference:

\frac{p_i}{p_j} = e^{(\varepsilon_j - \varepsilon_i) / (kT)}
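A short numerical sketch of the Boltzmann factor (the energy gap and temperature below are illustrative assumptions, not values from the article): for two states separated by one unit of kT, the lower-energy state is about e ≈ 2.72 times more probable.

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # assumed temperature, K

def boltzmann_factor(eps_i, eps_j, temperature=T):
    """Ratio p_i / p_j; it depends only on the energy difference eps_j - eps_i."""
    return math.exp((eps_j - eps_i) / (k_B * temperature))

# States separated by one k_B*T: the lower-energy state i is ~2.72 times more likely.
print(boltzmann_factor(0.0, k_B * T))   # ≈ 2.718
```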
The Boltzmann distribution is named after Ludwig Boltzmann, who first formulated it in 1868 during his studies of the statistical mechanics of gases in thermal equilibrium. Boltzmann's statistical work is borne out in his paper "On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations Regarding the Conditions for Thermal Equilibrium".
The distribution was later investigated extensively, in its modern generic form, by Josiah Willard Gibbs in 1902. The generalized Boltzmann distribution is a sufficient and necessary condition for the equivalence between the statistical mechanics definition of entropy (the Gibbs entropy formula S = -k_B \sum_i p_i \log p_i) and the thermodynamic definition of entropy (dS = \delta Q_{\text{rev}} / T, together with the fundamental thermodynamic relation).
The Boltzmann distribution should not be confused with the Maxwell–Boltzmann distribution. The former gives the probability that a system will be in a certain state as a function of that state's energy; in contrast, the latter is used to describe particle speeds in idealized gases.
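As a sketch of that equivalence (with made-up energy levels and temperature, chosen only for illustration), one can check numerically that the Gibbs entropy of a Boltzmann distribution equals (⟨E⟩ + kT ln Z)/T, the form obtained by substituting p_i = e^{-ε_i/(kT)}/Z into the sum.

```python
import numpy as np

# Illustrative check: Gibbs entropy of a Boltzmann distribution.
# Energy levels and temperature are assumed values, not from the article.
k_B = 1.380649e-23                                       # J/K
T = 300.0                                                # K
energies = np.array([0.0, 1.0e-21, 3.0e-21, 6.0e-21])    # J

weights = np.exp(-energies / (k_B * T))
Z = weights.sum()                                        # partition function
p = weights / Z                                          # Boltzmann probabilities

# Gibbs entropy  S = -k_B * sum_i p_i * log(p_i)
S_gibbs = -k_B * np.sum(p * np.log(p))

# Substituting p_i = e^{-eps_i/(kT)} / Z gives S = (<E> + k_B T ln Z) / T.
mean_E = np.sum(p * energies)
S_check = (mean_E + k_B * T * np.log(Z)) / T

print(np.isclose(S_gibbs, S_check))                      # True
```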
