This foundational formula connects the macroscopic thermodynamic quantity of entropy (S) with the number of possible microscopic arrangements, or microstates (W), corresponding to the system’s macroscopic state. The equation, [latex]S = k_B \ln W[/latex], reveals that entropy is a measure of statistical disorder or randomness. The constant [latex]k_B[/latex] is the Boltzmann constant, linking energy at the particle level with temperature.
Boltzmann’s Entropy Formula
- Ludwig Boltzmann
Boltzmann’s entropy formula provides a statistical definition for the thermodynamic concept of entropy, which was previously defined by Rudolf Clausius in terms of heat transfer ([latex]dS = \frac{\delta Q}{T}[/latex]). Boltzmann’s breakthrough was to link this macroscopic quantity to the statistical properties of the system’s constituent particles. A ‘macrostate’ is defined by macroscopic variables like pressure, volume, and temperature. A ‘microstate’ is a specific configuration of the positions and momenta of all individual particles. The key insight is that a single macrostate can be realized by an enormous number of different microstates. The quantity W, sometimes called the statistical weight or thermodynamic probability, is this number.
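The microstate counting described above can be illustrated with a toy model (my own sketch, not from the article): N distinguishable coins, where a macrostate is "k heads" and a microstate is a specific head/tail assignment to each coin, so W is the binomial coefficient.

```python
import math

# Toy model: N distinguishable coins.
# Macrostate = number of heads k; microstate = a specific
# heads/tails assignment. W(k) = C(N, k) microstates per macrostate.
N = 20
W = {k: math.comb(N, k) for k in range(N + 1)}

# The most probable macrostate is the one with the largest W:
# the balanced state k = N/2, exactly as the text's "key insight" suggests.
k_max = max(W, key=W.get)
print(k_max, W[k_max])  # 10 184756
```

Even for only 20 coins, the balanced macrostate has 184,756 microstates versus a single microstate for "all heads", which is why equilibrium overwhelmingly favors high-W macrostates.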
The formula implies that the equilibrium state of an isolated system, which is the state of maximum entropy according to the Second Law of Thermodynamics, is simply the most probable macrostate—the one with the largest number of corresponding microstates (largest W). The logarithmic relationship is crucial because it ensures that entropy is an extensive property. If you combine two independent systems, their total entropy is the sum of their individual entropies ([latex]S_{tot} = S_1 + S_2[/latex]), while the total number of microstates is the product ([latex]W_{tot} = W_1 W_2[/latex]). The logarithm turns this product into a sum: [latex]k_B \ln(W_1 W_2) = k_B \ln W_1 + k_B \ln W_2[/latex]. This formula is famously engraved on Boltzmann’s tombstone in Vienna.
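The additivity argument above can be checked numerically. A minimal sketch (the microstate counts W1, W2 are arbitrary illustrative values):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def entropy(W):
    # Boltzmann's formula: S = k_B ln W
    return k_B * math.log(W)

# Two independent subsystems: microstate counts multiply (W_tot = W1 * W2),
# so entropies add, because ln(W1 * W2) = ln W1 + ln W2.
W1, W2 = 1.0e20, 3.0e15
S_total = entropy(W1 * W2)
print(math.isclose(S_total, entropy(W1) + entropy(W2)))  # True
```

This is precisely why the logarithm is needed: any other monotonic function of W would fail to make entropy extensive.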
Precursors
- Rudolf Clausius’s formulation of the second law of thermodynamics and the classical definition of entropy
- James Clerk Maxwell’s work on the statistical distribution of molecular speeds in a gas
- Development of probability theory by mathematicians like Pierre-Simon Laplace
- The kinetic theory of gases
Applications
- Information theory (Shannon entropy)
- Black hole thermodynamics (Bekenstein-Hawking entropy)
- Materials science for predicting phase stability
- Computational chemistry for calculating reaction entropies
- Glass transition physics