Frontier Physics
UPDATED Apr 11 2026 20:00
A tasting menu of the arguments physicists have over lunch in 2026. String theory and its cousins, loop quantum gravity, quantum information as a working tool and as a theory of its own, topological phases of matter, holography and the AdS/CFT correspondence, the ER=EPR conjecture linking entanglement to wormholes, quantum vacuum fluctuations and zero-point energy, the dark sector (dark matter candidates and dark energy), quantum thermodynamics, and a handful of questions nobody has cracked. This page does not try to teach you any one of them. It tries to give you enough of a map that you can read a popular article or an arXiv title page and know which neighborhood you are in.
1. What "frontier" means
Every tenured physicist will tell you the same thing: the fun stuff is the stuff we do not yet know how to calculate. The Standard Model is a triumph and a cage. It fits essentially every collider measurement, and it also leaves us with no accepted theory of gravity at short distances, no explanation for dark matter, no story about how the universe survived its matter-antimatter asymmetry, no first-principles derivation of the value of the cosmological constant, and no clean understanding of what happens inside a black hole or at the big bang. The "frontier" is where people try to push past those walls — sometimes with bold new frameworks, sometimes with careful re-examinations of old problems, and sometimes with entirely new experimental tools that make old questions answerable.
This page is deliberately not a textbook on any one of these topics. It is a map. Each section gives you the minimum vocabulary, the main idea, the most common misconception, and a pointer to where to read more. If you want depth, follow the Further reading links; for each of these subjects there is a multi-year apprenticeship ahead of you. If you want breadth — an answer to "what do theoretical physicists actually argue about?" — this page should do it.
The headline arguments:
- Quantum gravity (string theory vs loop quantum gravity vs asymptotic safety vs give up).
- The measurement problem in quantum mechanics (Copenhagen, many-worlds, QBism, pilot-wave, or "shut up and calculate").
- Dark sector searches (WIMPs, axions, sterile neutrinos, primordial black holes).
- Quantum supremacy and the future of quantum computing.
- The structure of the string landscape and whether "the multiverse" is science or not.

Most of the rest of this page is context for these fights.
2. Vocabulary cheat sheet
Words you will see on every frontier-physics preprint. You do not need to master the math behind each; recognizing them is enough.
| Term | Meaning | Where it lives |
|---|---|---|
| String | A one-dimensional extended object whose vibration modes correspond to particles. | String theory. |
| Brane | A higher-dimensional extended object on which open strings can end. Matter can live on branes. | String theory, M-theory. |
| Compactification | Rolling up extra spatial dimensions so small that everyday physics does not notice them. | String theory, Kaluza-Klein. |
| Duality | A surprising mathematical equivalence between two seemingly different theories. | Strings (T-, S-, AdS/CFT), condensed matter. |
| Landscape | The enormous space of possible string-theory vacua, each giving different low-energy physics. | String theory. |
| Spin network | A graph whose edges carry $SU(2)$ representations — basis for quantum states of geometry. | Loop quantum gravity. |
| Qubit | A two-level quantum system. The smallest quantum information unit. | Quantum computing. |
| Entanglement entropy | The von Neumann entropy of the reduced density matrix of a subsystem. | Quantum info, holography. |
| Topological invariant | A quantity that stays constant under smooth deformations. Integer-valued. | Topological matter, TQFT. |
| Anyon | A 2D quasiparticle with neither bosonic nor fermionic exchange statistics. | Topological matter, fractional QH. |
3. String theory at a flyover
String theory's founding observation: if you replace point particles with tiny one-dimensional strings, the mathematical pathologies that make quantum gravity non-renormalizable soften. Different vibrational modes of the string correspond to different particles, and one of the lowest modes of a closed string always has the quantum numbers of a graviton — a massless spin-2 particle that mediates gravity. So string theory automatically contains quantum gravity, which no other framework can currently claim.
That is the good news. The bad news is a long list of complications:
- Extra dimensions. String theory is mathematically consistent only in 10 spacetime dimensions (for superstrings) or 11 (for M-theory). To match the world, you have to "compactify" the extra 6 or 7 dimensions into a tiny Calabi-Yau manifold or similar.
- Supersymmetry. The superstring requires supersymmetry on the worldsheet (2D theory of the string). Whether spacetime supersymmetry survives at low energies is a model-dependent question.
- The landscape. There are ${\sim} 10^{500}$ (or more) plausible choices of compactification, each giving a different low-energy effective field theory. No one knows how to pick the one that matches our universe from first principles.
- No unique low-energy prediction. Because of the landscape, asking "what does string theory predict for the top quark mass?" is like asking "what does object-oriented programming predict your IDE to look like?" — the answer depends on which vacuum you picked.
- Experimental inaccessibility. The natural energy scale of string excitations is the Planck scale, $\sim 10^{19}$ GeV, which is 15 orders of magnitude beyond the LHC.
Compactification dimensionally reduces a 10D theory to a 4D effective theory. The number of light scalar fields you get in 4D is related to the topology of the compact manifold. For a Calabi-Yau threefold, the count is given by its Hodge numbers $h^{1,1}$ and $h^{2,1}$.
Calabi-Yau moduli counting

$$N_{\text{moduli}} = h^{1,1}(X) + h^{2,1}(X) + 1$$
- $X$
- A Calabi-Yau manifold — a complex manifold with vanishing first Chern class, used as the compact factor of 10D spacetime.
- $h^{1,1}(X)$
- A topological invariant counting independent closed 2-forms modulo exact ones. Counts the "Kähler moduli" that determine the sizes of the compact dimensions.
- $h^{2,1}(X)$
- Counts independent closed $(2,1)$-forms. Gives the "complex structure moduli" that deform the shape of the manifold.
- $+1$
- The dilaton — a separate scalar field controlling the string coupling.
Why you care. Each modulus is a massless scalar field in 4D until something gives it a potential. A phenomenologically viable string vacuum needs every modulus stabilized (given a mass). The "moduli problem" — making sure there are no rolling light scalars messing up cosmology — is one of the main obstacles to realistic string model-building.
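A trivial but concrete instance of the counting formula above, using the Hodge numbers of the quintic threefold (the textbook Calabi-Yau example, not discussed above):

```python
def moduli_count(h11: int, h21: int) -> int:
    """Number of 4D moduli from a Calabi-Yau compactification:
    Kahler moduli + complex-structure moduli + the dilaton."""
    return h11 + h21 + 1

# The quintic hypersurface in CP^4 has h^{1,1} = 1 and h^{2,1} = 101,
# so a heterotic compactification on it starts with 103 massless scalars
# that all need to be stabilized.
print(moduli_count(1, 101))
```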
The most often-cited phenomenological target of string-theory model-building is to reproduce the Standard Model (three generations, right gauge group, right Yukawa patterns) from a specific compactification. Hundreds of such "string standard models" have been constructed; none has been independently verified as the correct one, and many are arguably equally good fits. The honest state of play: string theory is a rich mathematical framework that contains gravity and gauge theory, and it has taught us deep things about holography, entanglement, and strongly coupled systems. Whether it is the correct microscopic theory of nature is an open empirical question.
→ Deep Dive: String Theory — Extra Dimensions, Dualities, and AdS/CFT
4. Loop quantum gravity briefly
Loop quantum gravity (LQG) is the other major non-string approach to quantum gravity. Where string theory starts from a bigger object (the string) that hopes to tame gravity as a byproduct, LQG starts from general relativity itself and tries to quantize it directly, using mathematical tools (Wilson loops) borrowed from gauge theory.
The key technical move is to reformulate gravity in "Ashtekar variables" — a connection and a densitized triad — so that it looks like an $SU(2)$ gauge theory. You then quantize by defining an inner product on holonomies of this connection along closed loops. The result is a kinematical Hilbert space whose basis states are "spin networks," graphs whose edges carry $SU(2)$ representations. Area and volume operators have discrete spectra in this framework, with a minimum quantum of area on the order of the Planck area, $\ell_P^2 \sim 10^{-70}\,\text{m}^2$.
LQG's strengths: it is background-independent (it does not assume a fixed spacetime) and it makes a clean, conceptually arresting claim that spacetime itself is discrete at the Planck scale. Its weaknesses: the dynamics (the "Hamiltonian constraint") is very hard to define consistently, it has not reproduced a low-energy classical limit that clearly matches general relativity, and it does not naturally incorporate matter or gauge interactions the way string theory does. There is active work on "spin foams" (a path-integral formulation) and on LQG-inspired cosmological models (loop quantum cosmology) that predict a bounce instead of a big-bang singularity.
The honest short version: LQG is a minority program compared to string theory by headcount, but it is the cleanest example of trying to take "spacetime is really discrete at small scales" seriously as a mathematical structure. It has not yet produced a falsifiable experimental prediction.
5. Quantum information and quantum computing
Quantum information has evolved in the past thirty years from a niche curiosity into a full-fledged subfield with its own vocabulary, conferences, and billion-dollar hardware programs. It is also the place where frontier physics most directly meets engineering.
Qubits, gates, and entanglement
The fundamental unit is a qubit — a two-level quantum system whose state is an arbitrary complex superposition of two basis states $|0\rangle$ and $|1\rangle$:
One qubit state

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$
- $|0\rangle, |1\rangle$
- Orthonormal basis states. In a spin-$\tfrac12$ system they are spin up and spin down; in a photonic system they are horizontal and vertical polarization; in a transmon qubit they are the ground and first-excited state of an LC resonator.
- $\alpha, \beta$
- Complex probability amplitudes. Only the relative phase and the magnitudes have physical meaning; an overall phase is unobservable.
- $|\alpha|^2, |\beta|^2$
- Probabilities of measuring the qubit in $|0\rangle$ or $|1\rangle$ respectively, in the computational basis.
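A minimal numerical sketch of the state above, assuming nothing beyond NumPy:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

alpha, beta = 3/5, 4j/5              # |alpha|^2 + |beta|^2 = 9/25 + 16/25 = 1
psi = alpha*ket0 + beta*ket1

p0, p1 = abs(psi[0])**2, abs(psi[1])**2   # Born-rule probabilities
print(p0, p1)                             # 0.36, 0.64

# A global phase is unobservable: multiplying the whole state by e^{i*chi}
# leaves every measurement probability unchanged.
psi_rotated = np.exp(1j * 0.7) * psi
assert np.allclose([abs(psi_rotated[0])**2, abs(psi_rotated[1])**2], [p0, p1])
```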
Analogy (with a caveat). A qubit is like a coin spinning in the air, not yet landed — it is not "50% heads, 50% tails" but in a genuine superposition. The caveat is that the coin analogy obscures phase: two qubits in superposition can interfere constructively or destructively, and that is where the power of quantum computing comes from, not from "trying all answers at once."
An $n$-qubit state lives in a $2^n$-dimensional complex Hilbert space. Most such states are entangled — meaning the joint state cannot be written as a product of individual qubit states. A two-qubit Bell state,
A maximally entangled two-qubit state

$$|\Phi^+\rangle = \frac{|00\rangle + |11\rangle}{\sqrt{2}}$$
- $|00\rangle, |11\rangle$
- Basis states where both qubits are $0$ or both are $1$.
- $1/\sqrt 2$
- Normalization. Gives each outcome a $1/2$ probability.
The correlation. Measure the first qubit: you get $0$ or $1$ with equal probability. But whichever outcome you get, the second qubit (measured in the same basis) gives exactly the same result. The correlation is stronger than any classical joint distribution permits (CHSH inequality), and it has been tested experimentally to extraordinary precision (the 2022 Nobel Prize went to Aspect, Clauser, and Zeilinger for these tests).
You manipulate qubits with gates: unitary operations on the Hilbert space. Important single-qubit gates include the Pauli $X$ (bit flip), the Hadamard $H$ (equal superposition), and the phase gate. The canonical two-qubit gate is the controlled-NOT (CNOT), which flips the target qubit iff the control is in $|1\rangle$. A minimal universal gate set is $\{H, T, \text{CNOT}\}$ — with those you can approximate any unitary on any number of qubits to arbitrary precision (Solovay-Kitaev theorem).
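Since these gates are small matrices, the Bell-state construction and the CHSH violation from the previous paragraph can both be checked in a few lines (standard gate conventions; the measurement settings below are the usual optimal CHSH choices, an assumption not spelled out on this page):

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)             # Pauli X (bit flip)
Z = np.array([[1, 0], [0, -1]], dtype=complex)            # Pauli Z
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)            # control = first qubit

# |Phi+> = (|00> + |11>)/sqrt(2): Hadamard on qubit 1, then CNOT.
psi = CNOT @ np.kron(H, I2) @ np.array([1, 0, 0, 0], dtype=complex)

def E(A, B):
    """Expectation value <psi| A (x) B |psi> of a joint measurement."""
    return float(np.real(psi.conj() @ np.kron(A, B) @ psi))

# CHSH with settings A0=Z, A1=X and B0=(Z+X)/sqrt(2), B1=(Z-X)/sqrt(2).
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)
S = E(Z, B0) + E(Z, B1) + E(X, B0) - E(X, B1)
print(S)   # 2*sqrt(2) ~ 2.828, above the classical bound of 2
```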
Entanglement as a resource
The entanglement entropy of a bipartite state quantifies how much of its information is shared between the two parts:
Von Neumann entanglement entropy

$$S(\rho_A) = -\,\text{Tr}\left(\rho_A \log \rho_A\right)$$
- $|\psi\rangle$
- A pure state of the combined system $A \cup B$.
- $\rho_A = \text{Tr}_B |\psi\rangle\langle\psi|$
- The reduced density matrix of $A$, obtained by "tracing out" the other subsystem. It summarizes all measurements that only touch $A$.
- $S(\rho_A)$
- Von Neumann entropy of $\rho_A$. Zero if and only if the combined state factors as $|\psi_A\rangle \otimes |\psi_B\rangle$, positive otherwise. Maximal for maximally entangled states.
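The recipe in the definitions above (reshape the amplitudes, trace out $B$, take the entropy of the eigenvalues) is a few lines of NumPy; a sketch, with entropy measured in bits:

```python
import numpy as np

def entanglement_entropy(psi, dim_a, dim_b):
    """Von Neumann entropy (in bits) of subsystem A for a pure state psi
    of a bipartite system with dimensions dim_a x dim_b."""
    M = psi.reshape(dim_a, dim_b)      # amplitudes as a dim_a x dim_b matrix
    rho_a = M @ M.conj().T             # reduced density matrix Tr_B |psi><psi|
    lam = np.linalg.eigvalsh(rho_a)
    lam = lam[lam > 1e-12]             # discard numerical zeros before the log
    return float(-np.sum(lam * np.log2(lam)))

bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)   # (|00>+|11>)/sqrt(2)
product = np.kron([1, 0], [0, 1]).astype(complex)           # |01>, unentangled

print(entanglement_entropy(bell, 2, 2))     # 1 bit: maximally entangled
print(entanglement_entropy(product, 2, 2))  # 0: the state factorizes
```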
Why this matters beyond quantum computing. Entanglement entropy is the central quantity in the black-hole information paradox, the AdS/CFT "Ryu-Takayanagi formula" relating bulk geometric areas to boundary entropies, and modern attempts to define "spacetime from entanglement." It is the cleanest place where quantum information and quantum gravity are now the same subject.
Shor's algorithm and what is actually hard
Peter Shor's 1994 algorithm factors an $n$-bit integer in time polynomial in $n$ on a quantum computer, compared to the best known classical algorithm (the general number field sieve), which runs in sub-exponential time $\exp\!\big(O(n^{1/3} (\log n)^{2/3})\big)$. Since RSA cryptography relies on factoring being classically hard, Shor is the reason governments care about quantum computers. The core idea is to reduce factoring to finding the period of a modular-exponentiation function, and to use the quantum Fourier transform to find that period in one shot.
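The quantum speedup lives entirely in the period-finding step; the surrounding number theory is classical. A sketch of that classical reduction, with the period found by brute force instead of a quantum Fourier transform:

```python
from math import gcd

def order(a, N):
    """Multiplicative order of a mod N: smallest r > 0 with a^r = 1 (mod N)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    """Shor's classical post-processing: from the period r of a^k mod N,
    extract factors via gcd(a^{r/2} +/- 1, N). Returns None on a bad a."""
    g = gcd(a, N)
    if g != 1:
        return g, N // g                 # lucky guess: a shares a factor with N
    r = order(a, N)
    if r % 2 == 1:
        return None                      # odd period: try another a
    x = pow(a, r // 2, N)
    p, q = gcd(x - 1, N), gcd(x + 1, N)
    if p in (1, N) or q in (1, N):
        return None                      # trivial factors: try another a
    return p, q

print(shor_classical(15, 7))   # order of 7 mod 15 is 4, yielding the factors 3 and 5
```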
Concretely, to break RSA-2048 you probably need several thousand perfect logical qubits, which with current error-correction overheads means tens of millions of physical qubits. Google, IBM, IonQ, and others are in the hundreds-to-thousands-of-physical-qubits range as of 2025. Useful cryptographic attacks are still not tomorrow. "Post-quantum cryptography" (classical algorithms resistant to quantum attack) is already being standardized by NIST, because nobody wants to find out too late.
Current quantum computers run a dozen or so shallow algorithms interestingly well — variational chemistry, optimization-heuristic benchmarks, sampling demonstrations — and struggle with deep, error-sensitive circuits. The "quantum advantage" line is blurry and contested, but the trajectory of hardware is steep and the theoretical promise is on firm footing.
6. Interactive: a one-qubit Bloch sphere
A single qubit's pure state lives on the surface of a unit sphere — the Bloch sphere. The north pole is $|0\rangle$, the south pole is $|1\rangle$, and points on the equator are equal superpositions with different relative phases. The polar angle $\theta$ and the azimuthal angle $\varphi$ parametrize the state as $|\psi\rangle = \cos(\theta/2)\,|0\rangle + e^{i\varphi}\sin(\theta/2)\,|1\rangle$: $\theta$ sets the measurement probabilities, $\varphi$ the relative phase.
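A sketch of the Bloch-sphere parametrization, assuming NumPy:

```python
import numpy as np

def bloch_state(theta, phi):
    """Qubit state at polar angle theta, azimuth phi on the Bloch sphere:
    |psi> = cos(theta/2)|0> + e^{i phi} sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

north = bloch_state(0, 0)                  # |0>
south = bloch_state(np.pi, 0)              # |1>
plus = bloch_state(np.pi / 2, 0)           # (|0> + |1>)/sqrt(2)
plus_i = bloch_state(np.pi / 2, np.pi / 2) # (|0> + i|1>)/sqrt(2)

# The two equatorial states have identical measurement probabilities;
# only the relative phase distinguishes them.
print(abs(plus)**2)     # [0.5, 0.5]
print(abs(plus_i)**2)   # [0.5, 0.5]
```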
7. Topological matter
Condensed-matter physics in the last two decades has been dominated by the discovery that many materials have phases distinguished not by local order parameters (like magnetization or superconducting density) but by topological invariants — integers that stay put under smooth changes of parameters. These invariants live in the band structure of the material, and their non-triviality forces conducting states to exist on any boundary of the sample, even though the bulk is insulating.
- Topological insulators. Discovered theoretically in 2005 (Kane and Mele) and experimentally in 2007 (HgTe quantum wells, later Bi$_2$Se$_3$). The bulk is an insulator; the surface or edge hosts a helical metallic state (counter-propagating modes with spin locked to momentum) protected by time-reversal symmetry. The edge state cannot be localized or backscattered by non-magnetic disorder, as long as the symmetry is preserved.
- Quantum Hall effect. The oldest topological phase, discovered by von Klitzing in 1980. In a 2D electron gas in a strong magnetic field, the Hall conductance takes exquisitely quantized values $\sigma_{xy} = n e^2 / h$, where $n$ is an integer (or a fraction for the fractional Hall effect). The integer is the first Chern number of the band, and the quantization is exact enough to be used as a resistance standard.
- Anyons. In 2D, quasiparticle excitations need not be bosons or fermions. Exchanging two of them can multiply the wavefunction by any phase $e^{i\theta}$ (abelian anyons) or by a unitary matrix acting on a degenerate subspace (non-abelian anyons). Non-abelian anyons are the resource behind topological quantum computing.
- Topological quantum computing. If you could braid non-abelian anyons around each other, the resulting unitary operation would depend only on the topology of the braid — not on the precise trajectory. That means quantum gates would be intrinsically robust to local perturbations. Microsoft's "Station Q" program pursues Majorana-based implementations; experimental claims here have been volatile but the idea is theoretically clean.
The unifying mathematical structure is that the ground-state manifold of a topological phase is classified by a topological invariant — an integer (Chern number, $\mathbb{Z}_2$ index, winding number) computed from the Berry connection of the filled bands. The invariant is not a smooth function of Hamiltonian parameters; it jumps when the gap closes, signaling a topological phase transition. This is a genuinely new kind of phase transition, different from the Landau paradigm of spontaneous symmetry breaking.
First Chern number of a filled band

$$C = \frac{1}{2\pi} \int_{\text{BZ}} \mathcal{F}_{xy}\, d^2k$$
- $\mathcal{F}_{xy}$
- The Berry curvature of the band in the two-dimensional Brillouin zone, $\mathcal{F}_{xy} = \partial_{k_x} A_{k_y} - \partial_{k_y} A_{k_x}$, where $A_{k}$ is the Berry connection built from the Bloch states.
- $\text{BZ}$
- The Brillouin zone — the fundamental domain of crystal momenta. For a 2D crystal it is a torus.
- $C \in \mathbb{Z}$
- The Chern number — an integer because the curvature is integrated over a closed 2-manifold (Gauss-Bonnet / Chern-Weil theorem).
Why integers. Think of it as counting how many times the band's wavefunction winds as you walk around the Brillouin zone. Like the number of times a string wraps around a pole, it cannot change by a little bit: either you wrap once or you do not. The Hall conductance inherits this integer from the geometry.
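This winding can be computed numerically. The sketch below uses two standard ingredients not introduced on this page: the Qi-Wu-Zhang two-band model as a concrete Chern insulator, and the Fukui-Hatsugai lattice method, which builds the Berry curvature from gauge-invariant link variables so the answer comes out as an exact integer even on a coarse grid.

```python
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)
SZ = np.array([[1, 0], [0, -1]], dtype=complex)

def qwz_hamiltonian(kx, ky, m):
    """Qi-Wu-Zhang model: the simplest two-band lattice Chern insulator."""
    return np.sin(kx)*SX + np.sin(ky)*SY + (m + np.cos(kx) + np.cos(ky))*SZ

def lower_band(kx, ky, m):
    """Normalized eigenvector of the occupied (lower-energy) band."""
    _, vecs = np.linalg.eigh(qwz_hamiltonian(kx, ky, m))
    return vecs[:, 0]           # eigh returns eigenvalues in ascending order

def chern_number(m, n=24):
    """Fukui-Hatsugai lattice Chern number: multiply link variables
    <u(k)|u(k')> around each plaquette of a k-grid and sum the phases."""
    ks = np.linspace(0, 2*np.pi, n, endpoint=False)
    u = np.array([[lower_band(kx, ky, m) for ky in ks] for kx in ks])
    total = 0.0
    for i in range(n):
        for j in range(n):
            i2, j2 = (i + 1) % n, (j + 1) % n
            plaquette = (np.vdot(u[i, j], u[i2, j]) *
                         np.vdot(u[i2, j], u[i2, j2]) *
                         np.vdot(u[i2, j2], u[i, j2]) *
                         np.vdot(u[i, j2], u[i, j]))
            total += np.angle(plaquette)
    return round(total / (2 * np.pi))

print(chern_number(m=1.0))   # topological phase: |C| = 1
print(chern_number(m=3.0))   # trivial phase: C = 0
```

The link-variable trick is exactly the "wrapping count" of the analogy: each plaquette contributes a small Berry phase, and the total around the torus is forced to be a multiple of $2\pi$.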
8. Holography and the AdS/CFT correspondence
Why is the holographic principle surprising? In every classical and quantum field theory you learned before encountering it, the number of degrees of freedom in a box scales with the volume of the box — twice the volume, twice the bits. The holographic principle says the true number of independent degrees of freedom in quantum gravity is far smaller: it scales like the surface area. The intuition comes from black holes. If you pack more and more information into a ball, eventually the ball collapses into a black hole. The maximum information in the ball without forming a black hole is bounded by the Bekenstein-Hawking entropy $S_{BH} = A / 4G_N$, which is an area, not a volume.
The best-understood realization of holography is Juan Maldacena's 1997 AdS/CFT correspondence. Maldacena showed that Type IIB superstring theory on $\text{AdS}_5 \times S^5$ (a 5-dimensional anti-de Sitter space times a 5-sphere) is exactly dual to $\mathcal{N}=4$ super-Yang-Mills theory — a 4-dimensional conformal field theory — living on the 4-dimensional boundary. Every observable in the bulk gravitational theory maps to an observable in the boundary gauge theory, and vice versa. In the large-$N$ limit (where $N$ is the rank of the gauge group), the bulk theory becomes classical gravity — a dramatic simplification that makes many calculations tractable.
The AdS/CFT dictionary translates between the two sides. The radial coordinate in AdS, measuring distance from the boundary, maps to the energy (RG) scale in the boundary theory: deep in the bulk corresponds to low-energy (IR) physics, near the boundary to high energy (UV). A black hole in the bulk corresponds to a thermal state in the boundary theory. Most strikingly, the area of a minimal surface in the bulk reaching the boundary maps to the entanglement entropy of the corresponding boundary region — the Ryu-Takayanagi formula.
Ryu-Takayanagi formula

$$S(A) = \frac{\text{Area}(\gamma_A)}{4 G_N}$$
- $S(A)$
- The von Neumann entanglement entropy of a boundary region $A$ in the dual CFT.
- $\gamma_A$
- The minimal-area bulk surface anchored to the boundary of $A$ (a codimension-2 surface in the bulk that is homologous to $A$).
- $G_N$
- Newton's constant in the bulk gravitational theory.
Why this is profound. The left side is a purely quantum-information quantity — the entropy of a density matrix in the boundary field theory. The right side is a geometric area in a classical gravitational spacetime. The formula says these two seemingly unrelated quantities are identical. Geometry encodes entanglement structure, and vice versa.
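As a standard worked check (textbook AdS$_3$/CFT$_2$ facts, not derived on this page): for a single interval of length $\ell$ in the vacuum of a 2D CFT, the minimal surface is a bulk geodesic, and the formula reproduces the known CFT entanglement entropy.

```latex
% Single interval of length \ell in a CFT_2 dual to AdS_3 with radius R;
% the geodesic is regulated at the bulk UV cutoff z = \epsilon:
%   \mathrm{Length}(\gamma_A) = 2R \ln(\ell/\epsilon)
% Apply Ryu-Takayanagi with the Brown-Henneaux central charge c = 3R/(2G_N):
S(A) \;=\; \frac{\mathrm{Length}(\gamma_A)}{4 G_N}
     \;=\; \frac{R}{2 G_N}\,\ln\frac{\ell}{\epsilon}
     \;=\; \frac{c}{3}\,\ln\frac{\ell}{\epsilon}
```

The match with the independently derived field-theory result (the Calabrese-Cardy formula) was one of the first quantitative checks of Ryu-Takayanagi.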
A major advance in 2019 resolved a long-standing tension with the black hole information paradox. The "island formula" extends the Ryu-Takayanagi prescription to dynamical spacetimes by including contributions from "islands" — disconnected bulk regions whose entropy must be added to the minimal surface calculation. Including islands reproduces the correct "Page curve" for the entropy of Hawking radiation: entropy rises initially (as radiation is emitted), then turns over and decreases, as expected for a unitary process. The entropy does not grow forever (which would signal information loss) but returns to zero when evaporation is complete. This gives a gravitational derivation of the Page curve from first principles.
AdS/CFT has also produced directly applicable results outside pure quantum gravity. The viscosity-to-entropy ratio of the quark-gluon plasma created at RHIC and the LHC is very close to the AdS/CFT value $\eta/s = 1/4\pi$ (in units with $\hbar = k_B = 1$), conjectured by Kovtun, Son, and Starinets to be a near-universal lower bound. Holographic models of superconductivity (holographic superconductors) provide a solvable toy model for high-$T_c$ systems. These applications of gravitational physics to condensed matter are collectively called AdS/CMT.
9. ER = EPR — entanglement as the fabric of spacetime
In 2013, Juan Maldacena and Leonard Susskind proposed one of the most striking conjectures in modern theoretical physics: every pair of entangled particles is connected by a Planck-scale Einstein-Rosen bridge (wormhole). The slogan is ER = EPR: Einstein-Rosen (1935) wormholes equal Einstein-Podolsky-Rosen (1935) entanglement. If the conjecture is right, the two 1935 Einstein papers, long thought to be unrelated, describe the same underlying phenomenon at different scales.
The conjecture was motivated partly by the firewall paradox, posed in 2012 by Almheiri, Marolf, Polchinski, and Sully (AMPS). The AMPS argument goes as follows: quantum mechanics requires that the Hawking radiation emitted by an old black hole is fully entangled with earlier radiation (so that the overall evolution is unitary — no information loss). But smoothness of the event horizon requires that the outgoing Hawking photon is entangled with its partner mode inside the horizon. Both cannot be true simultaneously, because monogamy of entanglement forbids a single quantum system from being maximally entangled with two different partners at once. The apparent resolution is that the horizon is not smooth — an infalling observer hits a "firewall" of high-energy radiation instead of passing through freely.
ER=EPR offers a different resolution. If the entanglement between the Hawking photon and its interior partner mode IS a wormhole, then the infalling observer does not encounter a firewall — they pass through a wormhole connecting the exterior to the interior. The entanglement is not "shared" in the forbidden sense; it is structural. The wormhole and the entanglement are two descriptions of the same geometric relationship. In this picture, spacetime connectivity is not fundamental — it is a derived property of the entanglement structure of the quantum state. A highly entangled pair of boundary regions corresponds to a short, connected bulk wormhole; a less entangled pair corresponds to a longer or disconnected geometry.
The ER=EPR conjecture is not yet a proven theorem. It is strongly supported by calculations in AdS/CFT — in particular by the observation that the "thermofield double" state (a maximally entangled state of two CFTs) is dual to an eternal AdS black hole with two exterior regions connected by a non-traversable wormhole. The broader program of deriving spacetime geometry from the quantum information structure of its boundary dual is sometimes called "It from Qubit" (after John Wheeler's "It from Bit"). Concretely: the entanglement entropy of boundary subregions (Ryu-Takayanagi), the modular Hamiltonian structure, and the quantum error-correcting properties of the AdS/CFT map all suggest that the bulk geometry is a code protecting boundary information — that spacetime is, in some deep sense, a quantum error-correcting code.
10. Quantum fluctuations and vacuum energy
The quantum harmonic oscillator has a ground-state energy of $\frac{1}{2}\hbar\omega$ even in its lowest possible state — the zero-point energy. This is not a technicality: you cannot extract it classically, but it is real, and every mode of every quantum field contributes it. The quantum vacuum is therefore not empty: it is a seething sea of zero-point fluctuations across all field modes. Sum over all modes and you get an infinite (or, with a UV cutoff, astronomically large) vacuum energy density.
The most direct experimental confirmation of vacuum fluctuations is the Casimir effect. Place two uncharged conducting plates very close together in vacuum. The boundary conditions at the plates exclude vacuum modes whose half-wavelength does not fit a whole number of times into the gap (standing waves require $d = n\lambda/2$). Fewer modes between the plates than outside means less zero-point energy between the plates, and the system reduces its energy by pulling the plates together. The force per unit area is:
Casimir force between parallel plates

$$\frac{F}{A} = -\,\frac{\pi^2 \hbar c}{240\, d^4}$$
- $F/A$
- Force per unit area (pressure). The negative sign indicates attraction.
- $\hbar c$
- The combination of Planck's constant and the speed of light; sets the scale of quantum fluctuations.
- $d$
- Plate separation. The $d^{-4}$ scaling means the force is extremely short-ranged but measurable at submicron separations.
- $\pi^2/240$
- A numerical prefactor from summing zero-point energies over all allowed modes between the plates (related to the Riemann zeta function $\zeta(-3)$).
Measured confirmation. The Casimir force was first accurately measured by Steven Lamoreaux in 1997 and has since been confirmed to better than 1% precision. It is now relevant to nanoscale engineering, where quantum vacuum forces between surfaces can dominate over other interactions.
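A minimal numerical sketch of the magnitude of this pressure (SI constants; idealized perfectly conducting plates at zero temperature):

```python
from math import pi

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s

def casimir_pressure(d):
    """Magnitude (Pa) of the attractive Casimir pressure between ideal
    parallel plates separated by d meters: pi^2 hbar c / (240 d^4)."""
    return pi**2 * HBAR * C / (240 * d**4)

print(casimir_pressure(1e-6))    # ~1.3e-3 Pa at 1 micron: barely measurable
print(casimir_pressure(100e-9))  # ~13 Pa at 100 nm: the d^-4 scaling bites fast
```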
Vacuum fluctuations also underlie particle creation near horizons — the Hawking radiation mechanism. Near a black hole event horizon, virtual particle-antiparticle pairs constantly appear from the vacuum. Normally both partners annihilate. But if the pair appears just outside the horizon, one partner can fall inside while the other escapes outward as a real particle. The escaping particle carries positive energy (Hawking radiation), while the infalling particle carries negative energy, reducing the black hole's mass. The black hole slowly evaporates at a temperature $T_H = \hbar c^3 / (8\pi G M k_B)$ inversely proportional to its mass. The same mechanism, applied to an accelerating observer in flat spacetime, gives the Unruh effect: an accelerating detector sees the Minkowski vacuum as a thermal bath at temperature $T_U = \hbar a / 2\pi c k_B$.
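Both temperature formulas are one-liners to evaluate; a sketch in SI units, with a solar-mass black hole and a 1 g acceleration as illustrative inputs:

```python
from math import pi

HBAR = 1.054571817e-34   # J s
C = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
KB = 1.380649e-23        # J/K
M_SUN = 1.989e30         # kg

def hawking_temperature(M):
    """Hawking temperature (K) of a Schwarzschild black hole of mass M (kg)."""
    return HBAR * C**3 / (8 * pi * G * M * KB)

def unruh_temperature(a):
    """Unruh temperature (K) for an observer with proper acceleration a (m/s^2)."""
    return HBAR * a / (2 * pi * C * KB)

print(hawking_temperature(M_SUN))  # ~6e-8 K: far colder than the CMB, so a
                                   # solar-mass hole absorbs more than it emits
print(unruh_temperature(9.8))      # ~4e-20 K at 1 g: hopeless to detect directly
```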
The cosmological constant problem is the most acute fine-tuning problem in all of physics. The vacuum energy density predicted by QFT — obtained by integrating zero-point energies up to the Planck scale — is roughly $10^{120}$ times larger than the observed dark energy density $\rho_\Lambda \sim 10^{-29}$ g/cm³. Something (unknown) cancels the QFT prediction almost perfectly, leaving only this tiny residue. No symmetry, no mechanism, and no generally accepted anthropic argument fully explains this. It remains one of the deepest unsolved problems at the intersection of quantum field theory and cosmology.
Quantum fluctuations in the very early universe play a pivotal role in cosmology. During inflation, the universe expanded exponentially fast, stretching microscopic quantum vacuum fluctuations of the inflaton field to macroscopic scales. These fluctuations froze as the horizon swept past them, seeding the density perturbations that eventually collapsed into galaxies. The cosmic microwave background (CMB) power spectrum is, in a very direct sense, a map of inflationary vacuum fluctuations. The nearly scale-invariant spectrum predicted by single-field slow-roll inflation fits the Planck 2018 data to remarkable precision, providing strong (though not unique) evidence that quantum fluctuations generated large-scale structure.
11. Dark matter and dark energy — the missing 95%
The atoms described by the Standard Model make up only about 5% of the total energy budget of the universe. The remaining 95% is dark: dark matter (~27%) and dark energy (~68%), both inferred from their gravitational effects but otherwise undetected in any laboratory. This is arguably the most glaring empirical failure of the Standard Model.
Evidence for dark matter is abundant and independent across multiple length scales. Vera Rubin's 1970s measurements of galaxy rotation curves showed that stars in the outer parts of spiral galaxies orbit far too fast for the visible mass to hold them: the rotation curves stay flat rather than falling as $v \propto r^{-1/2}$. Gravitational lensing of galaxy clusters (and especially the Bullet Cluster, where visible gas is separated from the lensing center of mass after a cluster collision) shows that the mass distribution is not the same as the light distribution. The CMB acoustic peak positions and heights require a specific ratio of baryonic to non-baryonic matter to fit. All of these constraints point to the same conclusion: there is roughly five times as much gravitationally active non-baryonic matter as ordinary matter.
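The rotation-curve argument can be made quantitative with one line of Newtonian mechanics. The numbers below are illustrative Milky Way-like values, not measurements quoted on this page:

```python
G = 6.67430e-11    # m^3 kg^-1 s^-2
M_SUN = 1.989e30   # kg
KPC = 3.0857e19    # m

def enclosed_mass(v, r):
    """Mass (kg) inside radius r (m) implied by circular speed v (m/s),
    assuming Newtonian gravity and a roughly spherical mass distribution:
    v^2 = G M(r) / r."""
    return v**2 * r / G

# Roughly v ~ 220 km/s at r ~ 8 kpc (the solar circle):
M = enclosed_mass(220e3, 8 * KPC)
print(M / M_SUN)   # ~9e10 solar masses, several times the visible stellar mass

# A flat curve means M(r) grows linearly with r: doubling the radius doubles
# the enclosed mass, long after the starlight has run out.
```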
| Candidate | Mass range | Detection method |
|---|---|---|
| WIMP | 10 GeV–10 TeV | Direct nuclear recoil, indirect gamma rays, LHC production |
| Axion | μeV–meV | Microwave cavity (ADMX), NMR (CASPEr) |
| Primordial black hole (PBH) | 10⁻¹⁵–100 M☉ | Microlensing (MACHO, EROS, HSC) |
| Sterile neutrino | keV–GeV | X-ray line search (XMM-Newton, Chandra) |
Direct detection experiments search for the faint nuclear recoil a dark matter particle would produce when scattering off a heavy nucleus deep underground. LUX-ZEPLIN (LZ) and XENONnT, both xenon-based detectors with multi-tonne active masses, have set world-leading limits on WIMP-nucleon cross sections below $10^{-47}$ cm². The axion is motivated independently by the strong CP problem (why QCD does not break CP symmetry, despite having a topological term that could). The Peccei-Quinn mechanism introduces a new symmetry whose spontaneous breaking produces the axion; its expected mass lies somewhere in the μeV–meV window. The ADMX experiment searches for axion-photon conversion in a microwave resonant cavity immersed in a strong magnetic field. No confirmed detection in any channel as of 2026.
Dark energy makes up ~68% of the energy density of the universe and is responsible for the accelerating expansion discovered in 1998 by the supernovae surveys of Perlmutter, Schmidt, and Riess (2011 Nobel Prize). The simplest model is Einstein's cosmological constant $\Lambda$ — a uniform energy density of the vacuum — which is consistent with all observations but requires the extreme fine-tuning of the cosmological constant problem. The alternative is quintessence — a slowly rolling scalar field with a time-varying equation of state $w(z) = p/\rho$, where $w = -1$ is the cosmological constant. DESI 2024 results reported ~3σ evidence for $w \neq -1$ and for $w$ evolving with redshift, which would rule out a pure cosmological constant. The tension has not reached 5σ (discovery threshold) but has motivated intense theoretical activity.
Quantum thermodynamics sits at the boundary between statistical mechanics and quantum information theory, and touches on several frontier topics. Landauer's principle states that erasing one bit of classical information dissipates at least $k_B T \ln 2$ of heat into the environment — connecting information erasure to thermodynamic irreversibility. Landauer's principle has been experimentally verified in single-atom systems and places ultimate limits on the energy efficiency of computation. Quantum heat engines running between engineered reservoirs, such as squeezed or coherent baths, can exceed the standard Carnot bound without violating the second law, because those reservoirs supply thermodynamic resources beyond a single temperature; this is a quantum advantage in thermodynamics, distinct from quantum advantage in computation. The thermalization of isolated quantum many-body systems is governed by the eigenstate thermalization hypothesis (ETH): in generic (non-integrable) quantum many-body systems, individual energy eigenstates are indistinguishable from thermal states with respect to local observables, explaining why quantum systems thermalize under unitary evolution. Exceptions — many-body localization (MBL) and quantum many-body scars — are active research areas.
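Landauer's bound is concrete enough to evaluate directly. The comparison figure of ~$10^{-17}$ J per logical operation for current hardware is a rough order-of-magnitude assumption, not a measured benchmark:

```python
import math

k_B = 1.380649e-23   # J/K (exact, SI definition)
T = 300.0            # K, room temperature

E_bit = k_B * T * math.log(2)   # minimum heat to erase one bit
print(f"Landauer limit at 300 K: {E_bit:.3e} J per bit")   # ~2.87e-21 J

# Rough comparison: assume ~1e-17 J per logical bit operation today
print(f"Headroom vs ~1e-17 J/op: factor ~{1e-17 / E_bit:.0f}")
```

Thousands-fold headroom is why Landauer's limit is a point of principle rather than an engineering constraint, for now.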
12. Hard problems that may or may not have answers
A short list of open questions, each of which is both deep and not obviously solvable.
- The measurement problem. Why does a quantum superposition "collapse" to a definite outcome when measured? Decoherence tells you how a system appears classical; it does not tell you why one branch is realized. Many-worlds skips the collapse entirely but saddles you with an ontological multiverse. Pilot-wave theories add hidden variables. QBism says probabilities are subjective bets. Everyone has a favorite; none is experimentally distinguished.
- The cosmological constant problem. Naive QFT predicts a vacuum energy $10^{120}$ times larger than measured. No generally accepted solution exists. Anthropic arguments using the string landscape are the closest thing to a story, and many physicists dislike them.
- Black hole information. Hawking's 1974 calculation predicted black holes evaporate into thermal radiation, which would destroy information and violate unitarity. AdS/CFT says unitarity wins, but the detailed mechanism (Page curve, islands, ER=EPR) is still being worked out. There has been tremendous progress since 2019 via quantum extremal surfaces.
- The strong CP problem. QCD could violate CP symmetry via a topological theta term, but it visibly does not. Why is the theta parameter so close to zero? The most popular explanation is the axion, and axion searches are ramping up.
- Hierarchy problem. Why is the Higgs mass 125 GeV and not Planck-scale? Supersymmetry was invented partly to solve this. The LHC has not found SUSY, so alternative mechanisms (composite Higgs, extra dimensions, twin Higgs, anthropic arguments) are in the running.
- Dark sector nature. Dark matter exists; what is it? Dark energy looks like $\Lambda$; is it really? The DESI 2024–2025 hints of a time-varying equation of state have kept this debate lively.
None of these is the kind of problem that an afternoon of thinking solves. Each has been the subject of active work for decades, and each probably has at least one genuinely new idea ahead of it before it gets pinned down.
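The cosmological-constant mismatch can be reproduced with back-of-envelope numbers: one Planck energy per Planck volume versus the observed dark-energy density. The exact power of ten (anywhere from $10^{120}$ to $10^{123}$ in the literature) depends on the cutoff convention; the estimate below uses the crudest one.

```python
import math

hbar = 1.054571817e-34  # J s
G = 6.67430e-11         # m^3 kg^-1 s^-2
c = 2.99792458e8        # m/s

# Naive vacuum energy: one Planck energy per Planck volume
l_P = math.sqrt(hbar * G / c**3)
E_P = math.sqrt(hbar * c**5 / G)
rho_planck = E_P / l_P**3                        # J/m^3

# Observed: ~68% of the critical density, taking H0 ~ 70 km/s/Mpc
H0 = 70e3 / 3.086e22                             # s^-1
rho_crit = 3 * H0**2 * c**2 / (8 * math.pi * G)  # J/m^3
rho_obs = 0.68 * rho_crit

# With these numbers the ratio comes out near 10^123
print(f"naive / observed ~ 10^{math.log10(rho_planck / rho_obs):.0f}")
```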
13. A tiny quantum simulator
Simulating a quantum circuit on a classical computer is exponentially expensive in the number of qubits — that is half the motivation for quantum hardware in the first place. But for two or three qubits it is trivially cheap, and it is a useful way to build intuition. Here is an implementation that applies a Hadamard gate and a CNOT to produce a Bell state.
```python
import numpy as np

# Two-qubit basis states |00>, |01>, |10>, |11>; qubit 0 is the left
# (most significant) bit, so basis index i = 2*q0 + q1
ket = lambda i, n=4: np.eye(n)[i]

# Single-qubit gates
H = (1 / np.sqrt(2)) * np.array([[1, 1], [1, -1]])
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])

# CNOT with qubit 0 as control, qubit 1 as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, Hadamard the first qubit, then CNOT
psi = ket(0)                 # |00>
psi = np.kron(H, I2) @ psi   # (|0>+|1>)/sqrt(2) tensor |0>
psi = CNOT @ psi             # Bell state (|00>+|11>)/sqrt(2)
print("Final amplitudes:", psi.round(4))
print("Probabilities:", np.abs(psi) ** 2)

# Entanglement check: reduce to qubit 0 and compute von Neumann entropy
rho = np.outer(psi, psi.conj())
rho_reshape = rho.reshape(2, 2, 2, 2)
rho_A = np.einsum("ijkj->ik", rho_reshape)  # partial trace over qubit 1
eigs = np.linalg.eigvalsh(rho_A)
S = -sum(l * np.log2(l) for l in eigs if l > 1e-12)
print(f"Entanglement entropy S(rho_A) = {S:.4f} (expect 1.0)")
```
```python
# Same calculation without NumPy, as explicit lists.
from math import sqrt

# Amplitudes for |00>, |01>, |10>, |11>
psi = [1.0, 0.0, 0.0, 0.0]

def hadamard0(v):
    """Hadamard on qubit 0: mixes |0x> with |1x>."""
    a, b, c, d = v
    s = 1 / sqrt(2)
    return [s * (a + c), s * (b + d), s * (a - c), s * (b - d)]

def cnot_01(v):
    """CNOT, control qubit 0, target qubit 1: swaps |10> and |11>."""
    a, b, c, d = v
    return [a, b, d, c]

psi = hadamard0(psi)
psi = cnot_01(psi)
print("Bell amps:", psi)  # [0.707..., 0.0, 0.0, 0.707...]
```
What the output confirms: starting in $|00\rangle$, the two operations produce $(|00\rangle + |11\rangle)/\sqrt 2$, the canonical Bell state. The entanglement entropy of either qubit's reduced state is exactly $\log_2 2 = 1$ — one "bit" of entanglement. That matches what the theory says maximal two-qubit entanglement should give. Scale this up to $n$ qubits and the state vector has $2^n$ complex components; simulating even 40 qubits needs terabytes of memory, which is why people build the real hardware.
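The memory claim is one line of arithmetic, since each amplitude is one complex double:

```python
n = 40
bytes_needed = (2 ** n) * 16   # complex128 = 16 bytes per amplitude
print(f"{n} qubits: {bytes_needed / 1e12:.1f} TB")  # 17.6 TB
```

Every added qubit doubles the figure: 50 qubits already needs ~18 petabytes.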
14. Cheat sheet
String theory
1D objects whose vibrations are particles. Needs 10D spacetime. Includes gravity automatically.
Landscape
$\gtrsim 10^{500}$ possible string vacua, each giving different low-energy physics.
AdS/CFT
Gravity in $D$+1 dimensions $\leftrightarrow$ gauge theory in $D$ dimensions.
Loop quantum gravity
Quantize GR directly. Spacetime discrete at the Planck scale, with minimum area $\sim \ell_P^2$.
Qubit
$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle$, $|\alpha|^2+|\beta|^2=1$.
Bell state
$(|00\rangle+|11\rangle)/\sqrt 2$
Shor
Factors an integer $N$ in $O((\log N)^3)$ quantum gates; no known classical polynomial-time algorithm.
Topological phase
Integer-valued invariant (Chern, $\mathbb{Z}_2$) in the band structure.
Hall conductance
$\sigma_{xy} = n\, e^2/h$, $n \in \mathbb{Z}$.
See also
Quantum field theory
QFT is the machinery every frontier framework assumes. AdS/CFT is a statement about a particular CFT; topological matter is a statement about low-energy effective QFTs with topological terms.
Quantum mechanics
The measurement problem, entanglement, and qubit algebra all live here. Read QM first if the bra-ket notation or unitarity statements are unfamiliar.
General relativity
GR is the classical theory whose quantization all these frameworks struggle with. Anti-de Sitter space, black holes, and cosmological bounces are all GR constructions.
Cosmology
Inflation, dark energy, and the cosmological constant problem are where quantum gravity most directly meets data. Early-universe physics is the testing ground.
Linear algebra
Hilbert spaces, tensor products, eigenvalue problems, and unitary matrices are the grammar of everything here. Quantum computing is linear algebra on $2^n$-dimensional spaces.
Algorithms
Shor, Grover, and the quantum Fourier transform are algorithmic ideas. Complexity classes like BQP versus NP are the right language to compare "what quantum computers can do."
Further reading
- Michael Nielsen & Isaac Chuang — Quantum Computation and Quantum Information (2000, still in print). The canonical textbook for everything quantum-info.
- Joseph Polchinski — String Theory, Vol I & II (1998). The standard graduate reference. Dense. Start with Zwiebach if you want a gentler first pass.
- Barton Zwiebach — A First Course in String Theory (2nd ed., 2009). Actually readable. Used in many senior undergraduate courses.
- Carlo Rovelli — Quantum Gravity (2004). The authoritative LQG text, by one of the field's founders.
- Shinsei Ryu & Tadashi Takayanagi — "Holographic derivation of entanglement entropy from AdS/CFT," 2006. The paper that started the modern entanglement-entropy-in-gravity program.
- Xiao-Gang Wen — Quantum Field Theory of Many-Body Systems (2004). Great treatment of topological phases, anyons, and TQFT.
- Scott Aaronson — Quantum Computing Since Democritus (2013). A polemical, funny, very pedagogical overview of quantum info and complexity.
- Wikipedia's articles on the string theory landscape and on topological insulators are good starting points if you want to drill in.