Boltzmann’s Entropy Formula
- Ludwig Boltzmann

This foundational formula connects the macroscopic thermodynamic quantity of entropy (S) with the number of possible microscopic arrangements, or microstates (W), corresponding to the system’s macroscopic state. The equation, \(S = k_B \ln W\), reveals that entropy is a measure of statistical disorder or randomness. The constant \(k_B\) is the Boltzmann constant, linking energy at the particle level to temperature.
Boltzmann’s entropy formula provides a statistical definition for the thermodynamic concept of entropy, which Rudolf Clausius had previously defined in terms of reversible heat transfer (\(dS = \frac{\delta Q_{\mathrm{rev}}}{T}\)). Boltzmann’s breakthrough was to link this macroscopic quantity to the statistical properties of the system’s constituent particles. A ‘macrostate’ is defined by macroscopic variables such as pressure, volume, and temperature. A ‘microstate’ is a specific configuration of the positions and momenta of all individual particles. The key insight is that a single macrostate can be realized by an enormous number of different microstates; the quantity W, sometimes called the statistical weight or thermodynamic probability, is exactly this number.
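To make the counting concrete, here is a minimal sketch assuming a toy system of N two-state particles ("spins"), an illustration chosen for this example rather than taken from the article: the macrostate "k particles in the up state" is realized by \(W = \binom{N}{k}\) microstates, and the entropy follows directly from \(S = k_B \ln W\).

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, exact SI value (J/K)

def boltzmann_entropy(num_microstates: int) -> float:
    """S = k_B ln W for a macrostate realized by W microstates."""
    return K_B * math.log(num_microstates)

# Toy model (illustrative assumption): N two-state particles ("spins").
# The macrostate is the number k of up-spins; W(k) = C(N, k) counts the
# microstates that realize it.
N = 100
for k in (0, 25, 50):
    W = math.comb(N, k)
    print(f"k = {k:3d}  W = {W:.3e}  S = {boltzmann_entropy(W):.3e} J/K")
```

Note that W, and hence S, peaks at k = N/2: the macrostate realized by the largest number of microstates, which anticipates the equilibrium argument below.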
The formula implies that the equilibrium state of an isolated system, which is the state of maximum entropy according to the Second Law of Thermodynamics, is simply the most probable macrostate—the one with the largest number of corresponding microstates (largest W). The logarithmic relationship is crucial because it ensures that entropy is an extensive property. If you combine two independent systems, their total entropy is the sum of their individual entropies (\(S_{tot} = S_1 + S_2\)), while the total number of microstates is the product (\(W_{tot} = W_1 W_2\)). The logarithm turns this product into a sum: \(k_B \ln(W_1 W_2) = k_B \ln W_1 + k_B \ln W_2\). This formula is famously engraved on Boltzmann’s tombstone in Vienna.
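A quick numerical check of this additivity, using arbitrary illustrative microstate counts (the values of W1 and W2 below are assumptions made for the sketch):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant (J/K)

# Two independent subsystems (arbitrary illustrative microstate counts):
W1, W2 = 1.0e20, 4.0e15

S1 = K_B * math.log(W1)
S2 = K_B * math.log(W2)
S_combined = K_B * math.log(W1 * W2)  # W_tot = W1 * W2 for independent systems

# ln(W1 * W2) = ln W1 + ln W2, so the entropies add:
assert math.isclose(S_combined, S1 + S2)
print(f"S1 + S2    = {S1 + S2:.6e} J/K")
print(f"S(W1 * W2) = {S_combined:.6e} J/K")
```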
Type: Disruption
Precursors
- Rudolf Clausius’s formulation of the second law of thermodynamics and the classical definition of entropy
- James Clerk Maxwell’s work on the statistical distribution of molecular speeds in a gas
- Development of probability theory by mathematicians like Pierre-Simon Laplace
- The kinetic theory of gases
Applications
- Information theory (Shannon entropy; see the sketch after this list)
- Black hole thermodynamics (Bekenstein-Hawking entropy)
- Materials science for predicting phase stability
- Computational chemistry for calculating reaction entropies
- Glass transition physics
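The connection to Shannon entropy is direct: for W equally likely microstates, Shannon’s \(H = -\sum_i p_i \ln p_i\) (in nats) reduces to \(\ln W\), which is Boltzmann’s expression up to the factor \(k_B\). A minimal sketch of that reduction:

```python
import math

def shannon_entropy(probs) -> float:
    """H = -sum(p * ln p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# W equally likely microstates: p_i = 1/W for every i.
W = 1024
uniform = [1.0 / W] * W

# H(uniform) equals ln W, i.e. Boltzmann's ln W without the k_B prefactor.
assert math.isclose(shannon_entropy(uniform), math.log(W))
print(shannon_entropy(uniform), math.log(W))
```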
Historical Context
Boltzmann’s Entropy Formula (1877)