This foundational formula connects the macroscopic thermodynamic quantity of entropy (S) with the number of possible microscopic arrangements, or microstates (W), corresponding to the system’s macroscopic state. The equation, [latex]S = k_B \ln W[/latex], reveals that entropy is a measure of statistical disorder or randomness. The constant [latex]k_B[/latex] is the Boltzmann constant, linking energy at the particle level with temperature.
Boltzmann’s Entropy Formula
- Ludwig Boltzmann
Boltzmann’s entropy formula provides a statistical definition for the thermodynamic concept of entropy, which was previously defined by Rudolf Clausius in terms of reversible heat transfer ([latex]dS = \frac{\delta Q_{rev}}{T}[/latex]). Boltzmann’s breakthrough was to link this macroscopic quantity to the statistical properties of the system’s constituent particles. A ‘macrostate’ is defined by macroscopic variables like pressure, volume, and temperature. A ‘microstate’ is a specific configuration of the positions and momenta of all individual particles. The key insight is that a single macrostate can be realized by an enormous number of different microstates. The quantity W, sometimes called the statistical weight or thermodynamic probability, is this number.
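To make the macrostate/microstate distinction concrete, here is a minimal Python sketch (an illustration under assumed toy parameters, not part of Boltzmann’s original treatment): for a hypothetical system of N two-state spins, each macrostate is labelled by the number of ‘up’ spins, its statistical weight W is the corresponding binomial coefficient, and [latex]S = k_B \ln W[/latex] follows directly.

```python
from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(W: int) -> float:
    """Entropy of a macrostate realized by W microstates: S = k_B * ln(W)."""
    return K_B * log(W)

N = 20  # number of two-state spins in the toy system (arbitrary choice)

# Each macrostate is labelled by n_up, the number of spins pointing up.
# Its statistical weight W is the number of distinct spin configurations
# (microstates) with exactly that many up spins.
for n_up in range(N + 1):
    W = comb(N, n_up)          # microstates realizing this macrostate
    S = boltzmann_entropy(W)   # Boltzmann entropy of the macrostate
    print(f"n_up = {n_up:2d}   W = {W:7d}   S = {S:.3e} J/K")
```

Running the loop shows that the half-up, half-down macrostate has by far the largest W, and therefore the largest entropy, which is exactly the “most probable macrostate” argument developed in the next paragraph.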
The formula implies that the equilibrium state of an isolated system, which is the state of maximum entropy according to the Second Law of Thermodynamics, is simply the most probable macrostate—the one with the largest number of corresponding microstates (largest W). The logarithmic relationship is crucial because it ensures that entropy is an extensive property. If you combine two independent systems, their total entropy is the sum of their individual entropies ([latex]S_{tot} = S_1 + S_2[/latex]), while the total number of microstates is the product ([latex]W_{tot} = W_1 W_2[/latex]). The logarithm turns this product into a sum: [latex]k_B \ln(W_1 W_2) = k_B \ln W_1 + k_B \ln W_2[/latex]. This formula is famously engraved on Boltzmann’s tombstone in Vienna.
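The additivity argument can be checked numerically. The following sketch (illustrative only; the two toy systems and their microstate counts are arbitrary assumptions) combines two independent systems and verifies that their microstate counts multiply while the corresponding entropies add, because the logarithm converts the product into a sum.

```python
from math import comb, log, isclose

K_B = 1.380649e-23  # Boltzmann constant in J/K

# Microstate counts for two independent toy systems (arbitrary illustrative values).
W1 = comb(10, 4)   # e.g. 10 spins with 4 up
W2 = comb(15, 7)   # e.g. 15 spins with 7 up

S1 = K_B * log(W1)
S2 = K_B * log(W2)

# For independent systems, every microstate of system 1 can pair with every
# microstate of system 2, so the counts multiply: W_tot = W1 * W2.
W_tot = W1 * W2
S_tot = K_B * log(W_tot)

# The logarithm turns the product into a sum, so entropy is additive (extensive).
assert isclose(S_tot, S1 + S2, rel_tol=1e-12)
print(f"S1 + S2 = {S1 + S2:.6e} J/K")
print(f"S_tot   = {S_tot:.6e} J/K")
```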
Type
Disruption
Use
Precursors
- Rudolf Clausius’s formulation of the second law of thermodynamics and the classical definition of entropy
- James Clerk Maxwell’s work on the statistical distribution of molecular speeds in a gas
- Development of probability theory by mathematicians like Pierre-Simon Laplace
- The kinetic theory of gases
Applications
- Information theory (Shannon entropy)
- Black hole thermodynamics (Bekenstein-Hawking entropy)
- Materials science for predicting phase stability
- Computational chemistry for calculating reaction entropies
- Glass transition physics