Thermodynamics
Four laws — one of them so obvious it's the zeroth, one of them called "the most profound law of physics" by the people who mean it — that tell you what engines can do, why ice melts, why your laptop needs a fan, and why the universe is slowly running out of usable energy even though total energy is conserved.
1. Why thermodynamics runs every engine on the planet
Thermodynamics is the physics of heat, work, and the invisible constraint called entropy that quietly vetoes half the perpetual-motion machines ever proposed. It was worked out in the nineteenth century by engineers trying to make steam engines burn less coal, and the rules they found turned out to be general enough to describe refrigerators, chemical reactions, neutron stars, black holes, and — more recently — the power budget of a GPU cluster.
Here is the scale of the thing. An NVIDIA H100 GPU dissipates about 700 watts under load. A rack of 8 of them is 5.6 kilowatts of heat you have to move out of a closet. A 50,000-GPU training cluster dumps 35 megawatts continuously. That heat does not magically vanish; it must be pumped into water or air and then dumped to the atmosphere, and the laws that govern how well that pumping can be done were written down by Sadi Carnot and Rudolf Clausius before anyone had ever seen a transistor. If you work on AI infrastructure, energy, chemistry, climate, or biology, you are doing thermodynamics whether or not the word appears in your job title.
Energy is conserved (first law). But the usefulness of that energy decreases every time something real happens (second law). The gap between "energy exists" and "energy can be turned into work" is entropy, and it grows. Every engine, refrigerator, chemical reaction, and living cell is a local fight against that trend, powered by a larger entropy increase somewhere else.
Three reasons to care if you are not a mechanical engineer:
- Infrastructure. Power plants, HVAC, refrigeration, jet engines, rocket nozzles, and data-center cooling are direct applications. Carnot's bound on efficiency is the reason you cannot make a gasoline engine better than about 40 percent without exotic tricks.
- Chemistry and biology. Chemical equilibria, reaction rates, protein folding, and membrane transport are governed by Gibbs and Helmholtz free energies — direct descendants of the second law.
- Information. Shannon entropy in information theory is Boltzmann entropy renamed. Landauer's principle ties the minimum energy to erase a bit to $k_B T \ln 2$. Thermodynamics puts a hard floor under the efficiency of every computation ever performed.
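To make Landauer's floor concrete, here is a back-of-envelope comparison in Python. The per-operation energy figure for a modern GPU is an assumption for scale (700 W at an assumed $10^{15}$ FLOP/s), not a measured number:

```python
import numpy as np

kB = 1.380649e-23   # J/K
T = 300.0           # K, room temperature

# Landauer limit: minimum energy to erase one bit at temperature T
E_landauer = kB * T * np.log(2)
print(f"Landauer limit at 300 K: {E_landauer:.3e} J/bit")   # ~2.87e-21 J

# Rough comparison (assumed figures, for scale only): a 700 W GPU
# doing ~1e15 FLOP/s spends about 7e-13 J per floating-point op.
E_per_flop = 700.0 / 1e15
print(f"Assumed energy per FLOP: {E_per_flop:.3e} J")
print(f"Ratio to Landauer floor: {E_per_flop / E_landauer:.1e}")  # ~1e8
```

Under these assumptions, real hardware sits roughly eight orders of magnitude above the thermodynamic floor, which is why nobody designs chips against the Landauer limit yet.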
2. Vocabulary cheat sheet
The symbols you will see repeatedly. Glance now; each is defined properly in the sections below.
| Symbol | Read as | Means |
|---|---|---|
| $T$ | "T" | Temperature. Measured in kelvins ($\text{K}$). Not the same as "heat" — a spark is hot but carries almost no energy. |
| $U$ | "U" | Internal energy of a system — the total kinetic plus potential energy of all its microscopic parts. SI unit: joule. |
| $Q$ | "Q" | Heat — energy transferred because of a temperature difference. Joules. $Q > 0$ when heat flows into the system. |
| $W$ | "W" | Work — energy transferred by a force moving through a distance. Joules. Sign conventions vary; we use $W > 0$ for work done by the system on its surroundings. |
| $S$ | "S" | Entropy. A measure of how many microscopic arrangements are compatible with the macroscopic state. Units: joules per kelvin. |
| $P$, $V$ | "P, V" | Pressure (pascals) and volume (cubic meters). The two handles you grab a gas by. |
| $n$, $N$ | "n, N" | Amount of substance: $n$ is moles, $N$ is the number of molecules. $N = n N_A$ with $N_A \approx 6.022 \times 10^{23}$. |
| $R$, $k_B$ | "R, k-B" | Gas constant $R \approx 8.314\,\text{J/(mol·K)}$ and Boltzmann constant $k_B \approx 1.381 \times 10^{-23}\,\text{J/K}$. Related by $R = N_A k_B$. |
| $F$, $G$ | "F, G" | Helmholtz and Gibbs free energies — the "available work" at fixed $T$ and fixed $(T, P)$ respectively. |
| $Z$ | "Z" | Partition function. The sum over states that ties statistical mechanics to thermodynamics. |
3. Temperature, heat, and work — three words that are not synonyms
Confusion about thermodynamics almost always starts with the words temperature, heat, and work being used interchangeably in English. In physics they mean three different things, and getting them straight is half the battle.
Temperature is a property of a system in equilibrium. Microscopically it is (up to a factor) the average kinetic energy per degree of freedom. Macroscopically it is what a thermometer measures. The key fact, so obvious it is called the zeroth law, is that if system A is in thermal equilibrium with B, and B is in equilibrium with C, then A is in equilibrium with C. That is what licenses the existence of a temperature scale at all.
Heat is not a substance. It is an amount of energy transferred from one body to another because they were at different temperatures. You cannot "have" heat sitting in a box. You can have internal energy sitting in a box, and some of it may leave as heat if you open the lid to something colder. The distinction matters because "heat" and "work" are the two channels through which energy crosses a system boundary, and the first law depends on knowing which is which.
Work is energy transferred by any other mechanism: a piston pushed, a shaft turned, a charge moved through a potential difference. Mechanical work on a gas in a cylinder is the familiar $P\,dV$:
Work done by a gas during expansion

$$W = \int_{V_1}^{V_2} P\,dV$$
- $W$
- Work done by the gas on the surroundings, in joules. Positive when the gas expands and pushes on whatever contains it.
- $P$
- Pressure of the gas during the process, in pascals. It may vary as volume changes, so it lives inside the integral.
- $dV$
- An infinitesimal change in the gas's volume, in cubic meters.
- $\int_{V_1}^{V_2}$
- Integral from the starting volume $V_1$ to the ending volume $V_2$. Graphically: the area under the curve on a $P$-vs-$V$ diagram.
Analogy. Think of pushing a syringe. The force is pressure times area, the distance moved is small, and the tiny bit of work is pressure times the tiny change in volume. Total work is the sum (integral) of all those tiny bits.
For a gas expanding at constant pressure the integral collapses to $W = P\Delta V$. For an isothermal expansion of an ideal gas, you need the ideal gas law (coming in section 6) and the result is $W = nRT \ln(V_2/V_1)$, which we will derive in context.
4. The first law — energy bookkeeping, nothing more
The first law of thermodynamics says that energy cannot be created or destroyed; any change in a system's internal energy comes from heat added or work done.
First law of thermodynamics

$$\Delta U = Q - W$$
- $\Delta U$
- Change in the internal energy of the system, in joules. Positive means the system gained energy.
- $Q$
- Heat added to the system, in joules. Positive means heat flowed in.
- $W$
- Work done by the system on the surroundings, in joules. Positive means the system pushed outward.
Analogy. A bank account. Heat is a deposit. Work-by-the-system is a withdrawal. Internal energy is the balance. You cannot spend what you did not deposit, and no interest is ever paid.
That is it. The first law is a conservation statement, nothing more. What makes it useful is that it lets you track the energy budget of any process, no matter how complicated, as long as you are careful about which flows are heat and which are work.
A few named processes fall out of the first law by holding one variable fixed:
- Isothermal (constant $T$). For an ideal gas, $U$ depends only on $T$, so $\Delta U = 0$, and therefore $Q = W$: every bit of heat added is converted directly to work done by the gas.
- Adiabatic (no heat exchanged, $Q = 0$). Then $\Delta U = -W$: the gas cools when it expands because the work comes out of its internal energy. This is why a can of compressed air gets cold when you discharge it.
- Isobaric (constant $P$). Work is $P\Delta V$, heat is $nC_P \Delta T$, and internal-energy change is $nC_V \Delta T$, where $C_P$ and $C_V$ are molar heat capacities.
- Isochoric (constant $V$). No work, so $\Delta U = Q$ and all added heat goes into internal energy.
The key relation between the two heat capacities of an ideal gas is:
Mayer's relation

$$C_P = C_V + R$$
- $C_P$
- Molar heat capacity at constant pressure: how much heat per mole per kelvin it takes to warm the gas when pressure is held fixed.
- $C_V$
- Molar heat capacity at constant volume: same, but volume held fixed instead.
- $R$
- Universal gas constant, $\approx 8.314\,\text{J/(mol·K)}$. The extra $R$ accounts for the work done pushing back the atmosphere when the gas is allowed to expand.
Analogy. Heating a gas at constant pressure costs more because part of your heat budget goes into pushing the piston outward, not just into warming the gas. That extra cost is exactly $R$ per kelvin per mole.
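A quick numeric check of Mayer's relation: heat one mole of a monatomic ideal gas by 10 K both ways (the monatomic $C_V = \frac{3}{2}R$ is the standard textbook value):

```python
# Heating 1 mole of a monatomic ideal gas by 10 K, two ways
R = 8.314            # J/(mol K)
n = 1.0              # mol
dT = 10.0            # K
Cv = 1.5 * R         # monatomic: three quadratic degrees of freedom
Cp = Cv + R          # Mayer's relation

Q_isochoric = n * Cv * dT   # constant volume: all heat -> internal energy
Q_isobaric = n * Cp * dT    # constant pressure: extra heat pays for P*dV work
print(f"Q at constant V: {Q_isochoric:.1f} J")   # 124.7 J
print(f"Q at constant P: {Q_isobaric:.1f} J")    # 207.9 J
print(f"difference:      {Q_isobaric - Q_isochoric:.1f} J = n*R*dT")  # 83.1 J
```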
5. The second law — the rule that kills perpetual motion
The first law forbids getting energy out of nothing. The second law forbids something subtler: getting work out of energy that is already there, unless some of it is thrown away as waste heat. Clausius's original 1850 statement:

> Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time.
Kelvin gave an equivalent version focused on engines:

> No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work.
These two statements look different and sound like folk wisdom. Clausius showed they are equivalent and extracted from them a new state function, entropy, defined for a reversible process by:
Clausius's definition of entropy

$$dS = \frac{\delta Q_{\text{rev}}}{T}$$
- $dS$
- Infinitesimal change in the system's entropy, in joules per kelvin.
- $\delta Q_{\text{rev}}$
- Infinitesimal amount of heat added reversibly. "Reversible" is an idealization: the process is slow enough that the system is always in equilibrium. The delta is written $\delta$ instead of $d$ because $Q$ is not a state function.
- $T$
- Temperature of the system at the moment the heat was added, in kelvins.
Why this matters. Despite the awkward definition, $S$ turns out to depend only on the current state of the system, not on how it got there. That is what makes it a state function like $U$, $P$, $V$, and $T$. Section 8 below will give a much more intuitive definition in terms of counting microstates.
The second law then states, in its cleanest form:
The second law as an inequality

$$\Delta S_{\text{universe}} \ge 0$$
- $\Delta S_{\text{universe}}$
- Total entropy change of the system plus its surroundings, in joules per kelvin.
- $\ge 0$
- Entropy never decreases. It stays constant for ideal reversible processes and strictly increases for any real (irreversible) process.
Analogy. Imagine dropping food coloring into a glass of water. It diffuses out until the glass is uniformly tinted, and you never see the reverse — coloring spontaneously regathering into a blob. No law of conservation is violated by the reverse; the atoms could in principle all swim the other way. But the number of configurations that look "spread out" vastly outnumbers the ones that look "concentrated," so the system evolves toward what there is more of. That is the whole second law.
A consequence: when heat $Q$ flows from a hot reservoir at $T_H$ to a cold one at $T_C$, the hot side loses entropy $Q/T_H$ and the cold side gains $Q/T_C$. Since $T_H > T_C$, the cold gain is larger, and total entropy grows. Reversing that flow would shrink total entropy — forbidden.
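That bookkeeping is easy to verify numerically. A minimal sketch with assumed reservoir temperatures:

```python
# Entropy bookkeeping for 1000 J flowing hot -> cold (assumed values)
Q = 1000.0     # J transferred
T_H = 500.0    # K, hot reservoir
T_C = 300.0    # K, cold reservoir

dS_hot = -Q / T_H    # hot side loses entropy
dS_cold = Q / T_C    # cold side gains more than the hot side lost
dS_total = dS_hot + dS_cold
print(f"dS_hot   = {dS_hot:+.3f} J/K")    # -2.000
print(f"dS_cold  = {dS_cold:+.3f} J/K")   # +3.333
print(f"dS_total = {dS_total:+.3f} J/K")  # +1.333 > 0: the allowed direction
```

Running the same numbers with the flow reversed just flips every sign, giving $\Delta S_{\text{total}} < 0$, the forbidden direction.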
6. The ideal gas law and $PV$ diagrams
The simplest system you can do real calculations with is the ideal gas: a dilute collection of non-interacting particles. The equation of state that ties its four variables together is:
The ideal gas law

$$PV = nRT$$
- $P$
- Pressure of the gas, in pascals.
- $V$
- Volume, in cubic meters.
- $n$
- Amount of gas, in moles. One mole is $6.022\times 10^{23}$ molecules.
- $R$
- Universal gas constant, $8.314\,\text{J/(mol·K)}$.
- $T$
- Absolute temperature, in kelvins. Note: must be absolute — not Celsius, not Fahrenheit.
Analogy. Imagine a box of bouncing billiard balls. Pressure is how hard they collectively slam the walls; temperature is how fast they are moving on average; volume is how much room they have. If you crank the temperature, the balls move faster and slam harder — so $P$ grows if $V$ is fixed. If you shrink the box, they hit the walls more often — so $P$ grows if $T$ is fixed. The equation is that intuition made precise.
The microscopic derivation, which we do carefully in section 8, shows that temperature in an ideal gas is literally a measure of average kinetic energy: $\frac{1}{2} m \langle v^2 \rangle = \frac{3}{2} k_B T$.
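As a sanity check, here is the ideal gas law and the kinetic-energy relation applied to one liter of air at room conditions (ordinary textbook values, nothing exotic assumed):

```python
import numpy as np

R = 8.314           # J/(mol K)
kB = 1.380649e-23   # J/K
P = 101325.0        # Pa, 1 atm
T = 300.0           # K
V = 1.0e-3          # m^3, one liter

n = P * V / (R * T)     # moles,     from PV = nRT
N = P * V / (kB * T)    # molecules, from PV = N kB T
print(f"{n:.4f} mol = {N:.3e} molecules")   # ~0.0406 mol, ~2.45e22

# Average translational kinetic energy per molecule: (3/2) kB T
print(f"<KE> = {1.5 * kB * T:.3e} J per molecule")   # ~6.2e-21 J
```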
A $PV$ diagram plots pressure on the vertical axis against volume on the horizontal axis. Each point is a state. A curve is a process. The area under a curve is $\int P\,dV$, the work done by the gas. A closed loop is a cycle, and the area it encloses is the net work done per cycle — the basis of every heat engine on the planet.
An isothermal process at temperature $T$ traces out a hyperbola $P = nRT/V$. An adiabatic process for an ideal gas traces out a steeper curve $PV^\gamma = \text{const}$, where $\gamma = C_P/C_V$ is the heat-capacity ratio ($\gamma \approx 1.4$ for air at room temperature). A simple isothermal expansion from $V_1$ to $V_2$ gives work:
Isothermal work for an ideal gas

$$W_{\text{iso}} = nRT \ln\frac{V_2}{V_1}$$
- $W_{\text{iso}}$
- Work done by the gas during the isothermal expansion, in joules.
- $n$
- Moles of gas.
- $R$
- Universal gas constant.
- $T$
- Constant temperature at which the expansion happens.
- $V_1$, $V_2$
- Initial and final volumes.
- $\ln$
- Natural logarithm. Comes from integrating $1/V$.
Why this matters. Since $\Delta U = 0$ for an isothermal process on an ideal gas, the first law gives $Q = W$. So the same formula also tells you how much heat had to flow in to keep $T$ constant during the expansion.
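The "area under the curve" picture can be checked directly: integrate $P(V) = nRT/V$ numerically and compare to the closed form. A minimal sketch with assumed start and end volumes:

```python
import numpy as np

R, n, T = 8.314, 1.0, 300.0   # one mole at room temperature
V1, V2 = 1.0e-3, 3.0e-3       # m^3, assumed start and end volumes

# Closed form: W = n R T ln(V2/V1)
W_exact = n * R * T * np.log(V2 / V1)

# Trapezoid-rule area under the isotherm P(V) = nRT/V
V = np.linspace(V1, V2, 100001)
P = n * R * T / V
W_numeric = np.sum(0.5 * (P[1:] + P[:-1]) * np.diff(V))

print(f"W (closed form): {W_exact:.2f} J")    # ~2740 J
print(f"W (trapezoid):   {W_numeric:.2f} J")  # agrees to several digits
```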
7. Heat engines and the Carnot bound
A heat engine is any device that takes heat $Q_H$ from a hot reservoir at $T_H$, converts some of it to work $W$, and dumps the rest $Q_C$ into a cold reservoir at $T_C$. Its efficiency is the fraction of input heat that turns into useful work:
Heat engine efficiency

$$\eta = \frac{W}{Q_H} = \frac{Q_H - Q_C}{Q_H} = 1 - \frac{Q_C}{Q_H}$$
- $\eta$
- Efficiency — a dimensionless number between $0$ and $1$. Often expressed as a percentage.
- $W$
- Net work output per cycle, in joules.
- $Q_H$
- Heat absorbed from the hot reservoir per cycle, in joules.
- $Q_C$
- Heat dumped to the cold reservoir per cycle, in joules. By the first law (the engine returns to its starting state), $W = Q_H - Q_C$.
Analogy. A water wheel. The water drops from a high reservoir to a low one, and part of the fall is captured as useful work (turning the wheel) while the rest continues downstream. You cannot capture it all — the water has to keep moving. An engine is the same: some heat must keep flowing to the cold side or the whole thing stops.
In 1824, twenty-five years before the first law was written down, Sadi Carnot asked a sharp question: What is the best efficiency possible for a given $T_H$ and $T_C$? His answer, now called Carnot's theorem, is that the maximum is achieved by a reversible cycle of two isothermal steps (at $T_H$ and $T_C$) and two adiabatic steps connecting them:
Carnot efficiency

$$\eta_{\text{Carnot}} = 1 - \frac{T_C}{T_H}$$
- $\eta_{\text{Carnot}}$
- The maximum possible efficiency of any heat engine operating between reservoirs at $T_H$ and $T_C$. Dimensionless.
- $T_C$
- Absolute temperature of the cold reservoir, in kelvins. Must be absolute — measuring in Celsius breaks the formula.
- $T_H$
- Absolute temperature of the hot reservoir, in kelvins.
Why this matters. This is one of the hardest limits in physics. No engine — gasoline, steam, turbine, fusion reactor, thermoelectric, nothing — can exceed it. A coal plant burning at $T_H = 800\,\text{K}$ dumping to a cooling tower at $T_C = 300\,\text{K}$ has theoretical maximum efficiency $1 - 300/800 = 62.5\%$. Real plants run at about 40% because real cycles are not reversible and real materials fail at high temperatures.
Worked example — the coal plant. A power plant delivering 1 GW of electricity at 40 percent thermal efficiency must draw $1/0.40 = 2.5$ gigawatts of heat from burning coal, which means it must dump $2.5 - 1.0 = 1.5$ gigawatts of waste heat into a river or cooling tower. At the Carnot limit of 62.5 percent it would still dump $0.6$ gigawatts. You cannot design that waste heat away; it is required by the second law. This is also why combined-cycle plants (a gas turbine feeding a steam turbine) beat 60 percent efficiency: the exhaust of the gas turbine, at around 900 K, is still hot enough to usefully drive a second Carnot-like cycle down to 300 K.
Refrigerators and heat pumps are engines run backward: you do work $W$ to pump heat $Q_C$ out of a cold space into a hot one, moving $Q_H = Q_C + W$ total. Their figure of merit is the coefficient of performance, $\text{COP}_{\text{ref}} = Q_C/W$, with a Carnot-style upper bound of $T_C/(T_H - T_C)$. A home freezer with $T_C = 260\,\text{K}$ and $T_H = 300\,\text{K}$ has a theoretical maximum COP of $260/40 = 6.5$ — each joule of electricity could in principle move $6.5$ joules of heat out. Real freezers hit about half that.
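The arithmetic in the last two paragraphs fits in a few lines; a sketch using the same numbers as above:

```python
# Coal plant energy budget (numbers from the worked example above)
P_electric = 1.0e9                 # W, electrical output
eta_real = 0.40                    # realistic thermal efficiency
eta_carnot = 1.0 - 300.0 / 800.0   # Carnot bound for 800 K / 300 K

P_thermal = P_electric / eta_real
print(f"fuel heat in: {P_thermal / 1e9:.2f} GW")                  # 2.50
print(f"waste heat:   {(P_thermal - P_electric) / 1e9:.2f} GW")   # 1.50
print(f"waste at Carnot limit: "
      f"{(P_electric / eta_carnot - P_electric) / 1e9:.2f} GW")   # 0.60

# Freezer: Carnot bound on the coefficient of performance
T_C, T_H = 260.0, 300.0
print(f"max COP: {T_C / (T_H - T_C):.1f}")                        # 6.5
```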
8. Statistical mechanics — why any of this works
Thermodynamics, as we have presented it, is a phenomenological theory. It tells you what is allowed and what is forbidden, but it does not explain where the rules come from. Statistical mechanics is the bridge from microscopic physics to those rules. It was Boltzmann's contribution, and it is the reason entropy can be defined without waving your hands at "reversibility."
The central idea is simple. A macroscopic state (say: a mole of gas at temperature $T$, volume $V$, pressure $P$) corresponds to an astronomical number of compatible microscopic states — individual positions and velocities of all $10^{23}$ molecules. Call the number of microstates consistent with a given macrostate $\Omega$. Then:
Boltzmann's entropy formula

$$S = k_B \ln \Omega$$
- $S$
- Entropy of the macrostate, in joules per kelvin.
- $k_B$
- Boltzmann's constant, $1.381\times 10^{-23}\,\text{J/K}$. Converts microscopic counting to macroscopic units.
- $\ln$
- Natural logarithm. The log turns a multiplicative count into an additive quantity, so entropies of independent systems just add.
- $\Omega$
- The number of microstates compatible with the given macrostate. Typically an astronomical integer.
Analogy. Shuffle a deck. There is exactly one arrangement with all spades, then all hearts, all diamonds, all clubs, each suit in order — but $52!/(13!)^4 \approx 5.4 \times 10^{28}$ equally likely ways to interleave the four suits, almost all of which look random. You never see the ordered one after shuffling because there are so many more ways to be disordered. The universe runs on the same principle, just with $10^{23}$ particles instead of 52 cards.
This formula is engraved on Boltzmann's tombstone in Vienna. It is the most succinct statement of the microscopic origin of the second law. Entropy increases because the system wanders through state space and state space is overwhelmingly dominated by high-entropy configurations.
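The deck-counting claim in the analogy is easy to verify exactly, since Python integers have arbitrary precision:

```python
from math import factorial, log

# Ways to interleave four suits of 13 cards each in a 52-card deck
omega = factorial(52) // factorial(13) ** 4
print(f"omega = {omega:.3e}")                     # ~5.364e+28

# Boltzmann entropy of the "shuffled" macrostate, in units of kB
print(f"S / kB = ln(omega) = {log(omega):.1f}")   # ~66.2
```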
From here you can derive the Boltzmann distribution. A small system in contact with a large heat reservoir at temperature $T$ has probability of being in a given microstate $i$ (with energy $E_i$) equal to:
Boltzmann distribution

$$p_i = \frac{e^{-E_i/k_B T}}{Z}$$
- $p_i$
- Probability that the system is in microstate $i$. Dimensionless.
- $E_i$
- Energy of microstate $i$, in joules.
- $k_B T$
- The thermal energy scale. At room temperature, $k_B T \approx 4.1 \times 10^{-21}\,\text{J} \approx 0.025\,\text{eV}$. This is the natural unit of "how much energy is lying around in the ambient thermal bath."
- $Z$
- Partition function: $Z = \sum_i e^{-E_i/k_B T}$. The normalization that makes the probabilities sum to one.
Why this matters. The factor $e^{-E/k_B T}$ shows up everywhere — in chemical reaction rates (Arrhenius), semiconductor carrier densities, protein folding, nuclear fusion cross sections, atmospheric density profiles, magnet ordering transitions, and the cosmic microwave background spectrum. It is the most-cited exponential in physics.
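A two-level system makes the distribution concrete. The 0.1 eV gap below is an assumed value, chosen because it is a few times $k_B T$ at room temperature:

```python
import numpy as np

kB = 1.380649e-23   # J/K
eV = 1.602e-19      # J per electron-volt
T = 300.0           # K

# Two states: ground at 0, excited at 0.1 eV (assumed gap, ~4 kB T)
E = np.array([0.0, 0.1 * eV])
weights = np.exp(-E / (kB * T))   # Boltzmann factors
Z = weights.sum()                 # partition function normalizes them
p = weights / Z
print(p)   # ~[0.98, 0.02]: the system is almost always in the ground state
```

Cool the system to 100 K and the excited-state probability collapses to about $10^{-5}$; warm it to 1000 K and it climbs past 20 percent. The exponential does all the work.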
The partition function $Z$ is the magic object. Once you have it you can get everything:
Everything from the partition function

$$U = \langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}, \qquad F = -k_B T \ln Z, \qquad S = -\left(\frac{\partial F}{\partial T}\right)_V$$
- $\langle E \rangle$
- Average internal energy — what thermodynamics calls $U$.
- $\beta$
- Shorthand for $1/(k_B T)$, called inverse temperature. Units of $1/\text{J}$.
- $F$
- Helmholtz free energy — the thermodynamic potential for systems at fixed $T$ and $V$. In joules.
- $\ln Z$
- Natural log of the partition function. A dimensionless quantity that encodes the whole thermodynamics.
- $\left(\partial F/\partial T\right)_V$
- Partial derivative of $F$ with respect to $T$, holding $V$ fixed. Gives entropy up to a sign.
Analogy. Think of $Z$ as the master key. The thermodynamic functions $U$, $F$, $S$ are all rooms you can enter by turning it different ways (differentiating it differently). Statistical mechanics is basically "write down $Z$, then take derivatives until you have the answer."
For the ideal gas, the single-particle partition function gives $\langle E \rangle = \frac{3}{2} k_B T$ per particle — the equipartition theorem in its simplest form. Multiply by $N$ and differentiate: you recover $PV = Nk_B T = nRT$. Thermodynamics was right, and we now know why.
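The "take derivatives of $\ln Z$" recipe can be exercised numerically on a system with discrete levels. The level spacing below is an assumed value; the point is that the finite-difference derivative of $\ln Z$ reproduces the direct ensemble average:

```python
import numpy as np

kB = 1.380649e-23
T = 300.0
beta = 1.0 / (kB * T)
dE = 2.0e-21          # J, assumed uniform level spacing (~0.0125 eV)

E_n = dE * (np.arange(2000) + 0.5)   # harmonic-oscillator-like ladder

def lnZ(b):
    return np.log(np.sum(np.exp(-b * E_n)))

# <E> = -d lnZ / d beta, by central finite difference
db = beta * 1e-6
E_from_lnZ = -(lnZ(beta + db) - lnZ(beta - db)) / (2 * db)

# Direct ensemble average over the Boltzmann distribution
p = np.exp(-beta * E_n)
p /= p.sum()
E_direct = np.sum(p * E_n)

print(f"{E_from_lnZ:.4e} J  (from -d lnZ/d beta)")
print(f"{E_direct:.4e} J  (direct average)")   # the two agree
```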
One more classic application: the Maxwell-Boltzmann speed distribution. The probability density that a molecule in an ideal gas has speed $v$ is:
Maxwell-Boltzmann distribution

$$f(v) = 4\pi \left(\frac{m}{2\pi k_B T}\right)^{3/2} v^2\, e^{-m v^2 / 2 k_B T}$$
- $f(v)$
- Probability density of molecular speeds, in $\text{s/m}$. Integrating $f(v)\,dv$ from $v_1$ to $v_2$ gives the fraction of molecules with speeds in that range.
- $m$
- Mass of one molecule, in kilograms.
- $k_B$
- Boltzmann's constant.
- $T$
- Absolute temperature, in kelvins.
- $v^2 e^{-mv^2/2k_B T}$
- The core of the distribution: the $v^2$ factor is the surface area of a sphere of radius $v$ in velocity space (more directions available at higher speed); the exponential is the Boltzmann weight of the kinetic energy $mv^2/2$.
- $4\pi (m/2\pi k_B T)^{3/2}$
- Normalization so that $\int_0^\infty f(v)\,dv = 1$.
Why this matters. The shape of $f(v)$ explains why reaction rates and atmospheric escape depend exponentially on temperature. Even though the average speed is modest, the high-speed tail contains the molecules energetic enough to climb out of a potential well — to escape Earth's gravity, ionize another molecule, or fuse with a neighbor. Small changes in $T$ shift that tail enormously.
9. Interactive: the Maxwell-Boltzmann distribution
Pull the temperature slider below and watch how the speed distribution of nitrogen molecules (the dominant component of air) changes. Notice two things. First, the peak shifts to the right as $T$ grows — hotter gas has a faster "most probable" speed. Second, the whole curve gets wider and shorter, preserving total area. The high-speed tail, where interesting chemistry and escape physics live, grows much faster than the peak does. A small temperature bump produces a dramatic increase in the fraction of molecules above any fixed "activation" threshold.
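If the interactive widget is not available where you are reading this, a minimal numeric stand-in below computes the fraction of nitrogen molecules above an arbitrary threshold speed at two temperatures (the 1500 m/s threshold is an assumption for illustration):

```python
import numpy as np

kB = 1.380649e-23         # J/K
m = 28.0 * 1.66054e-27    # kg, N2 molecule
v_star = 1500.0           # m/s, arbitrary "activation" threshold

def frac_above(T, v_star, v_max=20000.0, steps=400001):
    """Fraction of molecules with speed >= v_star, by direct integration."""
    v = np.linspace(0.0, v_max, steps)
    f = 4 * np.pi * (m / (2 * np.pi * kB * T)) ** 1.5 \
        * v ** 2 * np.exp(-m * v ** 2 / (2 * kB * T))
    dv = v[1] - v[0]
    return f[v >= v_star].sum() * dv

for T in (300.0, 330.0):
    print(f"T = {T:.0f} K: fraction above {v_star:.0f} m/s = "
          f"{frac_above(T, v_star):.2e}")
# A 10 percent temperature bump roughly triples the tail fraction.
```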
10. Free energy and chemical potential — when $T$ and $P$ are fixed
Most real-world thermodynamics does not happen in closed boxes at fixed energy. It happens in beakers at fixed temperature and pressure, or in cells at fixed temperature and chemical environment. In those settings, the quantity that tells you what a system can actually do is not internal energy but free energy.
The Helmholtz free energy $F$ is relevant at fixed $T$ and $V$:
Helmholtz free energy

$$F = U - TS$$
- $F$
- Helmholtz free energy, in joules. The maximum work a system at fixed $T$ can deliver.
- $U$
- Internal energy.
- $T$
- Absolute temperature of the surrounding bath.
- $S$
- Entropy of the system.
Analogy. Think of internal energy as the cash in your wallet. Entropy times $T$ is the "tax" the environment collects whenever you try to spend it. Free energy is the after-tax spending power. You cannot extract more work than $F$, no matter how clever your engine.
The Gibbs free energy $G$ is relevant at fixed $T$ and $P$ — the natural setting for chemistry, biology, and anything open to the atmosphere:
Gibbs free energy

$$G = H - TS = U + PV - TS$$
- $G$
- Gibbs free energy, in joules. Minimized at equilibrium for a system at fixed $T$ and $P$.
- $H = U + PV$
- Enthalpy — internal energy plus the work $PV$ needed to make room for the system against atmospheric pressure. Chemists use it constantly; it is what a calorimeter measures at constant pressure.
- $TS$
- The entropy contribution, subtracted because the bath gets to collect it back.
Why this matters. A chemical reaction runs spontaneously at fixed $(T,P)$ if and only if $\Delta G < 0$. The equilibrium constant and reaction rates are set by $\Delta G$. ATP hydrolysis in your cells has $\Delta G \approx -30\,\text{kJ/mol}$ under standard conditions; that downhill step supplies the energy for every muscle twitch and every enzymatic step.
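The link between $\Delta G$ and the equilibrium constant, $K = e^{-\Delta G/RT}$, turns that number into a ratio of products to reactants. A sketch using the ATP figure from the text:

```python
import numpy as np

R = 8.314        # J/(mol K)
T = 310.0        # K, body temperature
dG = -30.0e3     # J/mol, ATP hydrolysis (standard-condition value from the text)

K = np.exp(-dG / (R * T))   # equilibrium constant
print(f"K = {K:.2e}")       # ~1.1e5: equilibrium lies far on the product side
```

Every factor of $RT \approx 2.6\,\text{kJ/mol}$ in $\Delta G$ shifts $K$ by a factor of $e$.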
For systems that can exchange particles — chemical reactions, phase transitions, electrochemical cells, the interior of a star — you also need the chemical potential:
Chemical potential

$$\mu_i = \left(\frac{\partial G}{\partial N_i}\right)_{T,\,P,\,N_{j \ne i}}$$
- $\mu_i$
- Chemical potential of species $i$, in joules per particle (or joules per mole, depending on conventions). The free-energy cost of adding one more particle of species $i$.
- $G$
- Gibbs free energy of the whole system.
- $N_i$
- Number of particles of species $i$.
- $\left(\partial G/\partial N_i\right)_{T,P,N_{j\ne i}}$
- Partial derivative of $G$ with respect to $N_i$, holding $T$, $P$, and all other particle numbers fixed.
Analogy. If free energy is wealth, chemical potential is the price per particle at the margin. Particles flow from high-$\mu$ to low-$\mu$ the same way water flows downhill or money flows between markets with different prices — always toward equilibrium, which is reached when $\mu$ is the same everywhere.
Equilibrium conditions fall out immediately. Thermal equilibrium: $T$ equal throughout. Mechanical equilibrium: $P$ equal throughout. Chemical equilibrium: $\mu$ equal throughout for every species that can move. These three conditions, along with the second law, determine every phase diagram, every chemical equilibrium constant, and every osmotic pressure in biology.
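Chemical-potential equality has a famous consequence you can compute in a few lines: the barometric formula. For an isothermal ideal-gas column in gravity, requiring $\mu$ to be the same at every height forces the density to fall off exponentially. The uniform 290 K temperature is an idealizing assumption (the real atmosphere is not isothermal):

```python
import numpy as np

kB = 1.380649e-23           # J/K
m = 28.97 * 1.66054e-27     # kg, mean molecular mass of air
g = 9.81                    # m/s^2
T = 290.0                   # K, assumed uniform temperature

# Equal chemical potential at every height => n(h) = n(0) exp(-m g h / kB T)
for h in (0.0, 1000.0, 5000.0, 8848.0):
    print(f"h = {h:6.0f} m: n/n0 = {np.exp(-m * g * h / (kB * T)):.3f}")
# Prints ~1.000, 0.889, 0.555, 0.352: roughly a third of sea-level
# density at the top of Everest, in line with the real atmosphere.
```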
11. Simulation in code
Two small programs. The first samples the Maxwell-Boltzmann distribution for nitrogen and reports the mean speed and rms speed, checking them against theory. The second runs a toy Carnot cycle on an ideal gas and prints the efficiency.
```python
import numpy as np

# Physical constants
# ------------------
kB = 1.380649e-23    # J/K
amu = 1.66054e-27    # kg
m_N2 = 28.0 * amu    # nitrogen molecule mass
T = 300.0            # kelvin — room temperature
N = 200000           # number of molecules

# Sample each Cartesian velocity component from a zero-mean Gaussian
# of width sqrt(kB T / m). This is the Maxwell-Boltzmann distribution
# factored into three independent 1D Gaussians.
sigma = np.sqrt(kB * T / m_N2)
vx = np.random.normal(0.0, sigma, N)
vy = np.random.normal(0.0, sigma, N)
vz = np.random.normal(0.0, sigma, N)

# Speed is the magnitude of the velocity vector
v = np.sqrt(vx**2 + vy**2 + vz**2)

v_mean_theory = np.sqrt(8 * kB * T / (np.pi * m_N2))
v_rms_theory = np.sqrt(3 * kB * T / m_N2)
v_peak_theory = np.sqrt(2 * kB * T / m_N2)

print(f"mean speed (measured):  {v.mean():8.2f} m/s")
print(f"mean speed (theory):    {v_mean_theory:8.2f} m/s")
print(f"rms speed (measured):   {np.sqrt((v**2).mean()):8.2f} m/s")
print(f"rms speed (theory):     {v_rms_theory:8.2f} m/s")
print(f"most-probable (theory): {v_peak_theory:8.2f} m/s")

# For N2 at 300 K: mean ~ 476 m/s, rms ~ 517 m/s, peak ~ 422 m/s.
# These are faster than the speed of sound in air (~343 m/s): the sound
# speed sqrt(gamma kB T / m) carries a factor of gamma ~ 1.4 where the
# rms speed carries a 3, so it comes out smaller.
```
```python
import numpy as np

# Ideal-gas Carnot cycle
# ----------------------
# 1 mole of ideal gas, gamma = 5/3 (monatomic)
R = 8.314          # J/(mol K)
n = 1.0            # mole
gamma = 5.0 / 3.0
T_H = 800.0        # hot reservoir, K
T_C = 300.0        # cold reservoir, K

# Stage A: isothermal expansion at T_H from V1 to V2
V1 = 1.0e-3        # m^3 (1 liter)
V2 = 3.0e-3        # m^3 (3 liters)
Q_H = n * R * T_H * np.log(V2 / V1)   # heat absorbed
W_A = Q_H                             # U unchanged at constant T

# Stage B: adiabatic expansion from T_H, V2 to T_C, V3
# For an adiabat: T V^(gamma-1) = const
V3 = V2 * (T_H / T_C) ** (1.0 / (gamma - 1.0))
W_B = n * R * (T_H - T_C) / (gamma - 1.0)

# Stage C: isothermal compression at T_C from V3 to V4
# V4 chosen so stage D closes the loop back to (T_H, V1)
V4 = V1 * (T_H / T_C) ** (1.0 / (gamma - 1.0))
Q_C = n * R * T_C * np.log(V4 / V3)   # negative — heat rejected
W_C = Q_C

# Stage D: adiabatic compression from T_C, V4 to T_H, V1
W_D = -W_B                            # adiabats are symmetric in sign

W_net = W_A + W_B + W_C + W_D
eta = W_net / Q_H
eta_carnot = 1.0 - T_C / T_H

print(f"Q_H   = {Q_H:.1f} J  (heat in from hot reservoir)")
print(f"Q_C   = {Q_C:.1f} J  (heat out to cold reservoir, negative)")
print(f"W_net = {W_net:.1f} J  (work out, = |Q_H| - |Q_C|)")
print(f"eta (cycle)  = {eta:.4f}")
print(f"eta (Carnot) = {eta_carnot:.4f}")

# Output:
# eta (cycle)  = 0.6250
# eta (Carnot) = 0.6250
# An ideal, reversible Carnot cycle saturates the bound exactly —
# that's the whole point.
```
12. Cheat sheet and see also
First law: $\Delta U = Q - W$.
Second law: $\Delta S_{\text{universe}} \ge 0$.
Ideal gas: $PV = nRT = N k_B T$.
Isothermal work: $W = nRT\ln(V_2/V_1)$.
Adiabatic: $PV^\gamma = \text{const}$, $TV^{\gamma-1} = \text{const}$.
Carnot efficiency: $\eta = 1 - T_C/T_H$.
Boltzmann entropy: $S = k_B \ln\Omega$.
Boltzmann distribution: $p_i \propto e^{-E_i / k_B T}$.
Equipartition: $\frac{1}{2} k_B T$ per quadratic degree of freedom.
Gibbs free energy: $G = H - TS$. Spontaneous at fixed $(T,P)$ iff $\Delta G < 0$.
See also
Classical Mechanics
The microscopic picture of a gas is billions of classical particles bouncing around. $\frac{1}{2} m v^2$ is the same quantity everywhere.
Electromagnetism
Thermal emission of EM waves is thermodynamics meeting Maxwell. The Stefan-Boltzmann law $P = \sigma T^4$ and the Planck spectrum both come from this crossroads.
Quantum Mechanics
Bose-Einstein and Fermi-Dirac distributions replace Maxwell-Boltzmann when quantum effects matter. Lasers, superconductors, and neutron stars live here.
Math: Probability
Statistical mechanics is probability over phase space. Expected values, marginals, and large-deviation theory all appear in the thermodynamic limit.
Math: Calculus
Thermodynamic potentials are functions; differentiating them gives physical quantities. Partial derivatives and the chain rule are the daily tools.
AI: Foundation Models
The power draw of a training run is pure thermodynamics. Carnot sets the limit on how efficiently a data center can dump its waste heat.
Further reading
Enrico Fermi, Thermodynamics (Dover, 1937). Seventy pages. Still the clearest classical introduction, written by a working experimentalist.
Kerson Huang, Statistical Mechanics (Wiley, 2nd ed. 1987). The standard graduate text. Develops the machinery carefully.
Daniel Schroeder, An Introduction to Thermal Physics (Addison-Wesley, 2000). The undergraduate text most people actually like. Worked examples are excellent.
David Chandler, Introduction to Modern Statistical Mechanics (Oxford, 1987). The book that taught a generation of chemists and biophysicists how to think about free energies.
Rolf Landauer, "Irreversibility and heat generation in the computing process," IBM Journal of Research and Development, 5(3), 1961. The paper that tied computation to thermodynamics.