Quantum Chemistry
Quantum chemistry is the machinery that turns "what does the Schrödinger equation say about this molecule?" into a practical number — a bond length, a reaction barrier, an excitation energy, a binding affinity. You will meet the equation, the Born-Oppenheimer split, Hartree-Fock mean-field theory, basis sets and how they encode atoms, the correlated methods that refine HF, density functional theory and its zoo of functionals, and finally a walkthrough of what a working computational chemist actually does at the keyboard.
1. Why quantum chemistry matters
A molecule is, at bottom, a small collection of nuclei and electrons glued together by Coulomb's law and the rules of quantum mechanics. Everything else — its shape, its vibrational frequencies, its color, its reactivity, whether it binds a protein, whether it catches fire, whether it conducts electricity — is downstream. If you could solve the Schrödinger equation exactly for any molecule, you would have a complete physical theory of chemistry. For anything with more than one electron, you cannot solve it exactly. Quantum chemistry is the art and science of solving it approximately, and well enough to predict the number you need.
That sounds academic until you realize that computational chemistry now sits on the critical path of major industries. Every pharmaceutical company runs docking and QM/MM calculations on lead compounds before synthesizing them. Every battery company uses DFT to screen electrolytes and electrode materials. Every catalyst group at a national lab or materials startup runs DFT to map transition states and tune selectivity. Semiconductor foundries use it to model defect energies. Even molecular dynamics force fields and machine-learned interatomic potentials increasingly borrow their ground truth from quantum-chemistry reference calculations.
You cannot solve the electronic Schrödinger equation exactly for anything beyond one-electron systems like H$_2^+$. The whole subject is a hierarchy of approximations, each trading cost for accuracy, each with known failure modes. Knowing which method to pick for which problem is 80% of being good at this.
Reasons to care, whether you are a materials researcher, a pharma scientist, a battery engineer, or an ML practitioner:
- Drug design. QM calculations on protein-ligand complexes (QM/MM) give you reaction barriers for enzymes, binding energies for tight inhibitors, and $pK_a$ predictions for ionizable groups.
- Materials and catalysts. DFT is the workhorse for predicting cohesive energies, band gaps, surface reconstructions, and transition-state energies on heterogeneous catalysts.
- Spectroscopy. IR, UV-vis, NMR chemical shifts, and EPR parameters all have quantum-chemical expressions and are routinely computed with accuracy good enough to assign and predict spectra.
- Environmental chemistry. Atmospheric radical reactions, photolysis cross-sections, and aerosol nucleation energies are computed by QC before being used in climate models.
- ML + QC. Neural network potentials (ANI, SchNet, NequIP, Allegro, MACE) are trained on DFT or CCSD(T) reference data. The data you feed them is still the rate-limiting step.
This page assumes you have met atoms and orbitals and the very basics of quantum mechanics (Schrödinger equation, wavefunctions, operators). Everything else we build.
2. Vocabulary cheat sheet
| Symbol / term | Read as | Means |
|---|---|---|
| $\hat H$ | "H-hat" or Hamiltonian | Energy operator. Its eigenvalues are allowed total energies. |
| $\Psi$ | "Psi" | Many-electron wavefunction. $|\Psi|^2$ is a probability density for finding electrons. |
| $\rho(\mathbf r)$ | "rho" or electron density | Probability of finding any electron at position $\mathbf r$. Central object of DFT. |
| BO | "Born-Oppenheimer" | Nuclei are fixed while electrons equilibrate; electronic energy depends on nuclear positions. |
| HF | "Hartree-Fock" | Mean-field approximation: each electron sees the average field of the others. |
| SCF | "self-consistent field" | Iterative solution where the electronic field and orbitals are updated together until they stop changing. |
| Basis set | — | A finite collection of atom-centered functions used to expand the unknown orbitals. |
| MP2, CCSD, CCSD(T) | — | Post-HF methods that include electron correlation; CCSD(T) is the "gold standard." |
| DFT | "density functional theory" | Alternative to wavefunction methods: compute energy from $\rho(\mathbf r)$ via a functional. |
| B3LYP, PBE, M06-2X | — | Widely used DFT functionals. |
| $E_{\text{xc}}$ | "exchange-correlation energy" | The unknown-and-approximated piece of the DFT energy. All the art lives here. |
3. The molecular Schrödinger equation
Write down the Hamiltonian of a molecule with $M$ nuclei and $N$ electrons:
The full molecular Hamiltonian

$$\hat H = -\sum_{A=1}^{M}\frac{\hbar^2}{2M_A}\nabla_A^2 \;-\; \sum_{i=1}^{N}\frac{\hbar^2}{2m_e}\nabla_i^2 \;+\; \sum_{A<B}\frac{Z_A Z_B\,e^2}{4\pi\epsilon_0 R_{AB}} \;-\; \sum_{i=1}^{N}\sum_{A=1}^{M}\frac{Z_A\,e^2}{4\pi\epsilon_0 r_{iA}} \;+\; \sum_{i<j}\frac{e^2}{4\pi\epsilon_0 r_{ij}}$$
- $\hat H$
- Total energy operator. Its eigenvalues are allowed energies of the molecule; its eigenfunctions are the many-body wavefunctions.
- $M$, $N$
- Number of nuclei and number of electrons, respectively.
- $M_A$, $Z_A$
- Mass and charge (atomic number) of nucleus $A$.
- $m_e$, $e$
- Electron mass and elementary charge.
- $\hbar$
- Reduced Planck constant. Quantum of action.
- $\nabla_A^2$, $\nabla_i^2$
- Laplacian with respect to the coordinates of nucleus $A$ (or electron $i$). The kinetic energy operators.
- $R_{AB}$, $r_{iA}$, $r_{ij}$
- Distances: nucleus-nucleus, electron-nucleus, and electron-electron, respectively.
- $\epsilon_0$
- Vacuum permittivity, the constant in Coulomb's law.
Reading it term by term. The first term is nuclear kinetic energy, the second is electronic kinetic energy, and the last three are the three Coulomb interactions (nucleus-nucleus repulsion, electron-nucleus attraction, electron-electron repulsion). Everything quantum-chemical is an attack on this expression.
The time-independent Schrödinger equation for the whole molecule is $\hat H \Psi(\mathbf R, \mathbf r) = E \Psi(\mathbf R, \mathbf r)$, where $\mathbf R$ collects all nuclear coordinates and $\mathbf r$ collects all electron coordinates. This is a partial differential equation in $3(M + N)$ variables. For a single water molecule, that is $3 \cdot (3 + 10) = 39$ spatial dimensions. Direct numerical solution is hopeless beyond the smallest systems.
Chemists in atomic units set $\hbar = m_e = e = 4\pi\epsilon_0 = 1$. Distances come out in bohr ($a_0 \approx 0.529$ Å), energies in hartree (1 E$_h$ $\approx 27.2$ eV $\approx 627.5$ kcal/mol). Everything below is in atomic units unless stated otherwise.
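A few conversion helpers are worth keeping around when reading program output. A standalone sketch (factors rounded from the CODATA values, not taken from any particular package):

```python
# Atomic-unit conversion factors (rounded CODATA values)
HARTREE_TO_EV = 27.211386
HARTREE_TO_KCAL_MOL = 627.5095
BOHR_TO_ANGSTROM = 0.529177

def hartree_to_kcal(e_h: float) -> float:
    """Energy (difference) in hartree -> kcal/mol."""
    return e_h * HARTREE_TO_KCAL_MOL

# 1 mEh is already ~0.6 kcal/mol -- roughly "chemical accuracy" territory
print(f"{hartree_to_kcal(0.001):.4f} kcal/mol")
print(f"{1.4 * BOHR_TO_ANGSTROM:.4f} angstrom")  # 1.4 bohr, the H2 distance used later
```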
4. Born-Oppenheimer approximation
A proton is 1836 times heavier than an electron. On the timescale that electrons move, the nuclei look frozen. Max Born and J. Robert Oppenheimer (1927) exploited this: write the total wavefunction as a product of a nuclear piece and an electronic piece,
The Born-Oppenheimer factorization

$$\Psi(\mathbf R, \mathbf r) \approx \chi(\mathbf R)\,\psi_{\text{el}}(\mathbf r;\mathbf R)$$
- $\chi(\mathbf R)$
- Nuclear wavefunction — depends only on the nuclear positions. Captures vibrations, rotations, and translation.
- $\psi_{\text{el}}(\mathbf r; \mathbf R)$
- Electronic wavefunction, which depends on the electron coordinates $\mathbf r$ but is parametric in $\mathbf R$ — solved at each fixed nuclear geometry.
- $;$
- Semicolon separates the independent variables ($\mathbf r$) from the parameters ($\mathbf R$). Standard notation.
Why the split works. Because electrons are light, they adjust near-instantaneously to wherever the nuclei are. Solve the electronic Schrödinger equation at each fixed geometry and you get a potential energy function $E_{\text{el}}(\mathbf R)$ — the potential energy surface (PES) — on which the nuclei move. The PES is the central concept of all molecular dynamics, reaction rate theory, and vibrational spectroscopy.
Plug into the full Schrödinger equation and (dropping small terms) you get the electronic Schrödinger equation at fixed nuclear geometry $\mathbf R$:
Electronic Schrödinger equation

$$\hat H_{\text{el}}(\mathbf R)\,\psi_{\text{el}}(\mathbf r;\mathbf R) = E_{\text{el}}(\mathbf R)\,\psi_{\text{el}}(\mathbf r;\mathbf R)$$
- $\hat H_{\text{el}}(\mathbf R)$
- Electronic Hamiltonian — same as $\hat H$ minus the nuclear kinetic term, with the nuclear coordinates treated as fixed parameters.
- $\psi_{\text{el}}$
- The electronic wavefunction. Contains everything you want to know about bonding, electron density, and so on.
- $E_{\text{el}}(\mathbf R)$
- Electronic energy at that nuclear geometry. Varying $\mathbf R$ and recomputing gives you the potential energy surface.
When BO breaks. Near conical intersections (where two electronic states cross), the electronic wavefunction changes abruptly as the nuclei move, and the assumption that electrons adjust instantly fails. Photochemistry, singlet fission in solar cells, and the non-adiabatic transitions in retinal isomerization are all Born-Oppenheimer breakdowns you have to handle specially.
From here on, "solving quantum chemistry" almost always means solving the electronic equation at one or more nuclear geometries. The nuclei are a parameter.
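The PES idea fits in a few lines of code. Below, a Morse potential stands in for $E_{\text{el}}(R)$ of a diatomic — the parameters $D_e$, $a$, $R_e$ are illustrative, not fit to any real molecule. "Scanning the PES" means exactly this: evaluate the electronic energy on a grid of fixed nuclear geometries and look at the resulting curve.

```python
import numpy as np

# Morse potential as a stand-in for E_el(R) of a diatomic.
# D_e (well depth), a (stiffness), R_e (equilibrium distance): illustrative values.
D_e, a, R_e = 0.17, 1.0, 1.4        # hartree, 1/bohr, bohr

def E_el(R):
    return D_e * (1.0 - np.exp(-a * (R - R_e)))**2 - D_e

# "Scan the PES": evaluate the electronic energy at each fixed geometry,
# just as one would rerun an electronic-structure code at each R.
R = np.linspace(0.8, 5.0, 421)       # grid spacing 0.01 bohr
E = E_el(R)
R_min = R[np.argmin(E)]
print(f"minimum at R = {R_min:.2f} bohr, E = {E.min():.4f} Eh")
```

In a real calculation the only change is that `E_el` calls a quantum chemistry code instead of a formula.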
5. The Hartree-Fock method
Even with nuclei frozen, the electronic equation couples every electron to every other through $1/r_{ij}$. Hartree-Fock (HF) is the simplest useful approximation: replace the true many-body problem with a set of one-electron problems, where each electron moves in the average field of all the others.
The HF ansatz for the ground-state wavefunction is a single Slater determinant:
The Slater determinant

$$\Psi(\mathbf r_1, \dots, \mathbf r_N) = \frac{1}{\sqrt{N!}}\,\det\begin{pmatrix}\phi_1(\mathbf r_1) & \cdots & \phi_N(\mathbf r_1)\\ \vdots & & \vdots\\ \phi_1(\mathbf r_N) & \cdots & \phi_N(\mathbf r_N)\end{pmatrix}$$
- $\phi_i$
- One-electron spin-orbital — a product of a spatial orbital and a spin function. The unknown we solve for.
- $\det$
- Determinant. Writing $\Psi$ as a determinant automatically makes it antisymmetric under swapping any two electrons, enforcing the Pauli principle.
- $1/\sqrt{N!}$
- Normalization factor so $\int |\Psi|^2 = 1$.
- $\mathbf r_i$
- Position (and spin) of the $i$-th electron.
Why a determinant. Electrons are fermions — swapping any two must change the sign of $\Psi$. A determinant does exactly that automatically, because swapping two columns flips its sign. If two orbitals were identical, two columns would be equal, and the determinant (and the wavefunction) would be zero. That is the Pauli exclusion principle falling out of the math.
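Both properties are easy to verify numerically. A toy two-electron, two-orbital check — the 1D "orbitals" are made up, since only the determinant structure matters:

```python
import numpy as np

def phi1(x):                  # toy 1D "orbitals" -- any two functions work;
    return np.exp(-x**2)      # only the determinant structure matters here
def phi2(x):
    return x * np.exp(-x**2)

def slater2(x1, x2, f, g):
    """Two-electron Slater determinant: rows = electrons, columns = orbitals."""
    M = np.array([[f(x1), g(x1)],
                  [f(x2), g(x2)]])
    return np.linalg.det(M) / np.sqrt(2.0)

a, b = 0.3, 1.1
print(slater2(a, b, phi1, phi2) + slater2(b, a, phi1, phi2))  # ~0: antisymmetry
print(slater2(a, b, phi1, phi1))                              # ~0: Pauli exclusion
```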
Applying the variational principle — minimize $\langle\Psi|\hat H|\Psi\rangle$ over the orbitals — gives the Hartree-Fock equations:
Hartree-Fock equations

$$\hat f\,\phi_i = \varepsilon_i\,\phi_i, \qquad \hat f = \hat h + \sum_{j=1}^{N}\big(\hat J_j - \hat K_j\big)$$
- $\hat f$
- Fock operator — the effective one-electron Hamiltonian felt by each electron in the mean field of the others.
- $\hat h$
- One-electron core Hamiltonian (kinetic energy + electron-nucleus attraction).
- $\hat J_j$
- Coulomb operator — classical electron-electron repulsion averaged over orbital $j$.
- $\hat K_j$
- Exchange operator — a purely quantum-mechanical correction from antisymmetry. Non-local, and the most expensive ingredient.
- $\varepsilon_i$
- Orbital energy of spin-orbital $\phi_i$. For the highest occupied orbital, $-\varepsilon_i$ approximates the ionization energy (Koopmans' theorem).
Mean-field limitation. Because every electron sees only an averaged field, it does not dynamically "dodge" the instantaneous positions of the others. That missing dodging is called electron correlation and accounts for errors of tens of kcal/mol in bond energies. HF is a starting point, not the destination.
The Fock operator depends on the very orbitals you are trying to find. So Hartree-Fock is solved self-consistently: guess orbitals, build $\hat f$, diagonalize it to get new orbitals, repeat until nothing changes. The iteration is called the Self-Consistent Field (SCF) procedure, and it is the inner loop of almost every modern quantum chemistry code.
Hartree-Fock gets you about 99% of the total electronic energy. Unfortunately, chemistry lives in the last 1%: bond dissociation energies, activation barriers, and intermolecular interactions are all differences between large numbers. HF systematically predicts bonds that are too short and too stiff, underestimates bond energies, and misses van der Waals interactions entirely. Worse, it fails qualitatively for systems with near-degenerate frontier orbitals (transition metals, bond breaking). You almost always need something better.
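To see why "the last 1%" is fatal, put rough numbers on it. A back-of-envelope sketch — the water energy below is a rough literature-scale value, not computed here:

```python
HARTREE_TO_KCAL = 627.5095

E_total_water = -76.4                   # rough near-exact total energy of H2O, hartree
last_percent = 0.01 * abs(E_total_water)

# 1% of the total energy, on the scale chemists actually care about
print(f"1% of E_total ~ {last_percent:.2f} Eh ~ "
      f"{last_percent * HARTREE_TO_KCAL:.0f} kcal/mol")
# compare: a typical O-H bond energy is on the order of 100 kcal/mol,
# so the "small" missing piece is several bonds' worth of energy
```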
6. Basis sets — encoding orbitals on a computer
You cannot represent an arbitrary function $\phi_i(\mathbf r)$ on a computer. You expand it in a finite set of pre-chosen functions:
Basis expansion of a molecular orbital

$$\phi_i(\mathbf r) = \sum_{\mu=1}^{K} C_{\mu i}\,\chi_\mu(\mathbf r)$$
- $\phi_i$
- The $i$-th molecular orbital (an unknown to be determined).
- $\chi_\mu$
- A basis function — typically centered on one of the atoms. Often a sum of Gaussians chosen to mimic a Slater orbital.
- $C_{\mu i}$
- Coefficient — how much of $\chi_\mu$ goes into $\phi_i$. These are the unknowns the SCF procedure finds.
- $K$
- Number of basis functions. Typically a few dozen to a few thousand, depending on the system and target accuracy.
Why Gaussians. The true atomic orbitals are Slater-type (exponential decay, $e^{-\zeta r}$), but products of two Slater functions on different centers have no closed-form integral. Boys (1950) showed that a product of two Gaussians centered on different atoms is another Gaussian on a third point — which makes the four-center electron-repulsion integrals analytically tractable. That single observation is what made modern quantum chemistry possible.
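The Gaussian product theorem is easy to check numerically. The exponents and centers below are arbitrary:

```python
import numpy as np

alpha, beta = 0.8, 1.3          # exponents of two s-type Gaussians (arbitrary)
A, B = 0.0, 1.4                 # their centers, e.g. two nuclei (bohr)

x = np.linspace(-3.0, 5.0, 2001)
product = np.exp(-alpha * (x - A)**2) * np.exp(-beta * (x - B)**2)

# Gaussian product theorem: the product is ONE Gaussian on a third center P
p = alpha + beta
P = (alpha * A + beta * B) / p
K = np.exp(-alpha * beta / p * (A - B)**2)
single = K * np.exp(-p * (x - P)**2)

print(np.max(np.abs(product - single)))   # floating-point noise: the two agree
```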
The Pople zoo
Standard chemistry basis sets are built up from primitive Gaussians and labeled with a coded name:
- STO-3G — "Slater-type orbital approximated by 3 Gaussians." Minimal basis: one function per occupied orbital. Tiny, fast, and qualitatively wrong for most things, but great for teaching.
- 3-21G, 6-31G — Split-valence. Core orbitals get one contracted function; valence orbitals get two, for flexibility. In 6-31G each core function is a contraction of 6 primitives, and each valence orbital is split into a 3-primitive plus a 1-primitive function.
- 6-31G* (or 6-31G(d)) — Add polarization functions (d-functions on heavy atoms). Lets the orbitals polarize under bonding.
- 6-31G** (or 6-31G(d,p)) — Add p-functions on hydrogens too. Useful when H's are doing interesting chemistry (e.g. hydrogen bonding).
- 6-31+G* — Add diffuse functions for anions and excited states.
- cc-pVDZ, cc-pVTZ, cc-pVQZ — Correlation-consistent (Dunning). Designed for systematic convergence of correlated calculations. DZ, TZ, QZ mean double/triple/quadruple zeta — each level adds one more function per orbital.
- def2-SVP, def2-TZVP — Karlsruhe basis sets, widely used with DFT.
A minimal basis for water is 7 functions. A 6-31G* basis for water is 19. A cc-pVQZ basis for water is 115. The cost of a Hartree-Fock or DFT calculation scales roughly as $O(K^4)$ in the worst case, so basis-set size is a big deal.
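The $O(K^4)$ comes from the four-index electron-repulsion integrals $(\mu\nu|\lambda\sigma)$. Counting the unique ones (after 8-fold permutational symmetry) for the water basis sets just mentioned:

```python
def n_unique_eri(K: int) -> int:
    """Unique ERIs (mu nu | la si) for K real basis functions, 8-fold symmetry."""
    pairs = K * (K + 1) // 2           # unique (mu, nu) index pairs
    return pairs * (pairs + 1) // 2    # unique pairs of pairs

for name, K in [("minimal", 7), ("6-31G*", 19), ("cc-pVQZ", 115)]:
    print(f"{name:8s} K = {K:3d}   unique ERIs = {n_unique_eri(K):,}")
```

Going from a minimal basis to cc-pVQZ for one water molecule takes you from a few hundred integrals to tens of millions.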
7. Post-Hartree-Fock — adding electron correlation
Electron correlation is the difference between the exact energy and the Hartree-Fock energy. All the effort beyond HF goes into recovering as much of it as possible. The main options, in increasing cost and accuracy:
- MP2 (Møller-Plesset second order). Treat electron correlation as a perturbation on top of HF. Gives ~80-90% of correlation energy at a cost of $O(N^5)$. Decent for weak interactions and reasonable geometries. Poor for systems with small HOMO-LUMO gaps.
- CI (Configuration Interaction). Expand the wavefunction in multiple Slater determinants (HF ground + excited configurations) and let the coefficients vary. Full CI is exact within the basis, but cost scales exponentially. Truncated CI (CIS, CISD) is cheaper but not size-consistent.
- CCSD and CCSD(T). Coupled cluster with single and double excitations (CCSD) or with perturbative triples (CCSD(T)). The latter is the "gold standard" of single-reference quantum chemistry. Cost $O(N^6)$ for CCSD, $O(N^7)$ for CCSD(T).
- CASSCF / CASPT2. Multi-reference methods. Needed when a single determinant cannot even qualitatively describe the system (biradicals, transition-metal complexes, bond breaking). Expensive and requires chemical judgment to choose the active space.
As a gut rule, CCSD(T)/cc-pVQZ on a closed-shell organic molecule hits around 1 kcal/mol accuracy on reaction energies. Reaching that costs orders of magnitude more CPU than DFT, which is why DFT dominates routine work and CCSD(T) is used for benchmarks.
8. Density functional theory
DFT takes a radically different approach: forget the many-electron wavefunction and work with the electron density $\rho(\mathbf r)$ instead. The density is a function of just three spatial variables, no matter how many electrons there are — so you have swapped a $3N$-dimensional problem for a 3-dimensional one. If that sounds too good to be true, it is, and the catch lives in a single ugly term.
Kohn and Sham (1965) operationalized this by mapping the interacting electron system onto a fictitious non-interacting system that has the same density. The orbitals of the non-interacting system obey an HF-looking equation, the Kohn-Sham equation:
Kohn-Sham equations

$$\Big[-\tfrac{1}{2}\nabla^2 + v_{\text{ext}}(\mathbf r) + v_H[\rho](\mathbf r) + v_{\text{xc}}[\rho](\mathbf r)\Big]\,\phi_i^{\text{KS}}(\mathbf r) = \varepsilon_i\,\phi_i^{\text{KS}}(\mathbf r), \qquad \rho(\mathbf r) = \sum_{i\,\in\,\text{occ}} |\phi_i^{\text{KS}}(\mathbf r)|^2$$
- $-\tfrac{1}{2}\nabla^2$
- Kinetic energy operator (in atomic units).
- $v_{\text{ext}}$
- External potential — the Coulomb attraction of the nuclei.
- $v_H[\rho]$
- Hartree potential — the classical Coulomb repulsion of the electron density with itself.
- $v_{\text{xc}}[\rho]$
- Exchange-correlation potential — everything we do not know, lumped into one term.
- $\phi_i^{\text{KS}}$
- Kohn-Sham orbital. Mathematically a one-electron orbital, not the "real" wavefunction, but its density adds up to the true $\rho$.
- $\varepsilon_i$
- Kohn-Sham orbital energy. Not a true physical energy in general; for the exact functional the highest occupied eigenvalue equals minus the ionization energy, but approximate functionals violate this.
Where the art lives. The entire cleverness and all the failure modes of DFT live in the choice of exchange-correlation functional $E_{\text{xc}}[\rho]$ that gives rise to $v_{\text{xc}}$. There is no general recipe for writing it down, and the zoo of functionals below reflects different strategies for approximating the unknown.
The functional zoo: Jacob's ladder
John Perdew organized functionals into five rungs of a ladder, each adding one more ingredient that the functional can see:
- LDA (local density approximation). $E_{\text{xc}}$ depends only on $\rho(\mathbf r)$ at each point. Exact for a uniform electron gas. Underestimates bond lengths and overbinds. Too crude for chemistry but fine for solid-state metals.
- GGA (generalized gradient approximation). Depends on $\rho$ and $|\nabla\rho|$. The classic GGA is PBE (Perdew-Burke-Ernzerhof, 1996) — still the workhorse of materials science. BLYP and PW91 are also common.
- meta-GGA. Adds the kinetic energy density $\tau$ or the Laplacian $\nabla^2\rho$. SCAN, M06-L, TPSS.
- Hybrid functionals. Mix in a fraction of exact HF exchange. B3LYP (Becke 3-parameter, Lee-Yang-Parr, 1994) is by far the most cited functional in chemistry. PBE0, M06-2X, $\omega$B97X-D are modern competitors.
- Double hybrids. Also include a bit of MP2 correlation. Even more accurate, more expensive. B2PLYP, DSD-PBEP86.
Routine organic chemistry: B3LYP/6-31G* or $\omega$B97X-D/def2-TZVP. Non-covalent interactions (docking, host-guest): $\omega$B97X-D or B3LYP-D3 (with dispersion correction). Transition metals: M06, TPSS, or multi-reference methods. Solids and surfaces: PBE or SCAN. If you need reference accuracy: CCSD(T)/CBS and skip DFT altogether.
Two persistent failure modes you should always keep in mind. First, standard DFT has no dispersion (van der Waals), so it does not bind stacked aromatic rings or fold peptides correctly without an empirical correction like Grimme's D3 or D4. Second, self-interaction error — an electron spuriously interacts with itself through the Hartree term — causes delocalization errors, and in extreme cases predicts the wrong ground state (e.g. dissociating H$_2^+$ to two half-electrons). Range-separated hybrids like $\omega$B97X address this partially.
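A Grimme-style dispersion correction is conceptually simple: a damped pairwise $-C_6/R^6$ sum added on top of the DFT energy. A schematic single-pair version — the $C_6$, damping radius, and steepness below are illustrative placeholders, not actual D3/D4 parameters:

```python
import numpy as np

def e_disp_pair(R, C6=30.0, s6=1.0, R0=3.5, steep=16.0):
    """Damped -C6/R^6 for one atom pair (illustrative parameters, atomic units)."""
    # switching function: -> 0 at short range (where DFT already handles overlap),
    # -> 1 at long range (where pure -C6/R^6 takes over)
    f_damp = 1.0 / (1.0 + np.exp(-steep * (R / R0 - 1.0)))
    return -s6 * C6 / R**6 * f_damp

for R in (3.0, 4.0, 6.0, 10.0):
    print(f"R = {R:5.1f} bohr   E_disp = {e_disp_pair(R):+.6f} Eh")
```

The real D3/D4 schemes compute environment-dependent $C_6$ coefficients per atom pair and sum over all pairs; the damped-power-law shape is the same.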
9. Method accuracy vs cost
Each method/basis combination can be placed on a plot of approximate cost (relative to HF/STO-3G = 1, on a log-scale X axis) against approximate mean absolute error (MAE) on reaction energies (kcal/mol, Y axis) for a typical small-molecule benchmark. The numbers are illustrative, not authoritative — real errors depend heavily on the system — but the shape of the trade-off is real: lower and cheaper is better, and the lower-left is the practical sweet spot.
Things to look for:
- HF/STO-3G sits at the origin but with enormous error. Fine for teaching, bad for predictions.
- B3LYP/6-31G* lives near the lower-left — a few kcal/mol of error for modest cost. This is why it has been the default for 25 years.
- CCSD(T)/cc-pVQZ is near 1 kcal/mol error but costs orders of magnitude more. Benchmark territory.
- MP2 is cheaper than CCSD but noticeably worse on dispersion-dominated systems.
10. How a calculation is run in practice
A typical workflow for a medicinal or materials chemist wanting DFT results on a small molecule looks like this:
- Build a starting structure. Either a SMILES string converted to 3D by RDKit, or a hand-drawn structure in Avogadro / GaussView / IQmol. Sanity-check bond lengths and hybridization.
- Pick a method and basis. Start with B3LYP-D3/def2-SVP or $\omega$B97X-D/def2-TZVP for most organic chemistry.
- Geometry optimization. Let the code relax atoms until forces are below a threshold ($\sim 10^{-4}$ Hartree/bohr). Check that the optimized structure is what you expected.
- Frequency calculation. Compute harmonic vibrational frequencies. Confirm a minimum (all real frequencies) or a transition state (exactly one imaginary). This also gives you thermodynamic corrections (ZPE, entropy).
- Single-point energy. Re-compute the energy at a larger basis or a higher method for the final number. A common recipe is "geometry optimization at B3LYP-D3/def2-SVP, single-point at DLPNO-CCSD(T)/def2-TZVP."
- Solvation and corrections. If your property involves solvent, re-run in a PCM or SMD continuum solvent model. If dispersion matters, use D3(BJ) or D4.
- Sanity-check the answer. Does the computed IR spectrum agree with experiment? Does the dipole moment match? Is the reaction energy on the order of what chemistry common sense says?
Popular software (open-source and commercial):
- PySCF (open source). Python-native, modular, perfect for hacking and teaching.
- Psi4 (open source). Well-rounded; good coupled-cluster implementation.
- NWChem (open source), ORCA (free for academics), Q-Chem (commercial). Strong DFT and correlated methods.
- Gaussian (commercial). The industry standard in pharma; polished, reliable, opinionated.
- VASP, Quantum ESPRESSO, CP2K. Plane-wave codes used for solids, surfaces, and AIMD.
- BigDFT. Linear-scaling DFT aimed at biomolecules and other very large systems.
11. Quantum chemistry in code
A high-level PySCF script that runs HF, B3LYP, and CCSD on water, followed by a toy Hartree-Fock SCF for H$_2$ in a minimal STO-3G basis.
# pip install pyscf
from pyscf import gto, scf, dft, cc
# 1. Build a molecule.
mol = gto.M(
atom = """
O 0.000 0.000 0.117
H 0.000 0.757 -0.469
H 0.000 -0.757 -0.469
""",
basis = "6-31g(d,p)",
)
# 2. Hartree-Fock
mf_hf = scf.RHF(mol).run()
print(f"HF total energy: {mf_hf.e_tot:.6f} Eh")
# 3. Same molecule with B3LYP
mf_dft = dft.RKS(mol)
mf_dft.xc = "b3lyp"
mf_dft.run()
print(f"B3LYP energy: {mf_dft.e_tot:.6f} Eh")
# 4. CCSD on top of HF for benchmark accuracy
cc_obj = cc.CCSD(mf_hf).run()
print(f"CCSD correlation: {cc_obj.e_corr:.6f} Eh")
print(f"CCSD total: {cc_obj.e_tot:.6f} Eh")
# 5. Optimize geometry + vibrational frequencies
from pyscf.geomopt.geometric_solver import optimize
mol_opt = optimize(mf_dft)
from pyscf.hessian import thermo
mf2 = dft.RKS(mol_opt); mf2.xc = "b3lyp"; mf2.run()
hessian = mf2.Hessian().kernel()
freq_info = thermo.harmonic_analysis(mf2.mol, hessian)
print("frequencies (cm-1):", freq_info["freq_wavenumber"])
import numpy as np
# Toy RHF for H2 at R = 1.4 bohr in a minimal basis.
# Integrals are hard-coded from standard tables for teaching.
# S: overlap, T: kinetic, V_ne: nuclear attraction (one-electron)
S = np.array([[1.0, 0.6593], [0.6593, 1.0]])
T = np.array([[0.7600, 0.2365], [0.2365, 0.7600]])
V_ne = np.array([[-1.8804, -1.1948], [-1.1948, -1.8804]])
H_core = T + V_ne
# Two-electron integrals (mu nu | la si) in chemists' notation
ERI = np.zeros((2,2,2,2))
# Unique values from standard tables (Szabo & Ostlund), H2/STO-3G at R = 1.4 bohr:
# (11|11) = (22|22) = 0.7746,  (11|22) = 0.5697,
# (12|12) = 0.2970,            (11|12) = (12|22) = 0.4441
for val, (mu, nu, la, si) in [
    (0.7746, (0,0,0,0)), (0.7746, (1,1,1,1)),
    (0.5697, (0,0,1,1)),
    (0.2970, (0,1,0,1)),
    (0.4441, (0,0,0,1)), (0.4441, (0,1,1,1)),
]:
    # fill in the 8-fold permutational symmetry of real two-electron integrals
    for p in {(mu,nu,la,si), (nu,mu,la,si), (mu,nu,si,la), (nu,mu,si,la),
              (la,si,mu,nu), (si,la,mu,nu), (la,si,nu,mu), (si,la,nu,mu)}:
        ERI[p] = val
def symm_orthog(S):
eigval, U = np.linalg.eigh(S)
return U @ np.diag(1/np.sqrt(eigval)) @ U.T
X = symm_orthog(S)
P = np.zeros_like(S) # density matrix
for it in range(30):
# Build Fock from density + 2-electron integrals
G = np.zeros_like(H_core)
for mu in range(2):
for nu in range(2):
for la in range(2):
for si in range(2):
G[mu,nu] += P[la,si] * (ERI[mu,nu,la,si] - 0.5*ERI[mu,la,nu,si])
F = H_core + G
Fp = X.T @ F @ X
eps, Cp = np.linalg.eigh(Fp)
C = X @ Cp
# Only 1 doubly-occupied orbital for H2 (2 electrons)
P_new = 2 * np.outer(C[:,0], C[:,0])
if np.abs(P_new - P).max() < 1e-8: break
P = P_new
E_elec = 0.5 * np.sum(P * (H_core + F))
E_nuc = 1.0 / 1.4
print(f"Converged in {it+1} iterations")
print(f"E_total = {E_elec + E_nuc:.6f} Eh (HF/STO-3G reference: -1.1167 Eh; exact ~-1.17 Eh)")
Three practical notes:
- Real codes never do the ERI loop explicitly. They use density fitting, integral screening, and linear-algebra tricks that dramatically cut the effective scaling and prefactor. A B3LYP single-point on caffeine now takes seconds on a laptop; a generation ago the same calculation was a supercomputer job.
- For systems with more than ~50 heavy atoms, you move to semiempirical methods (GFN-xTB) or neural-network potentials (ANI, SchNet, MACE) for the first pass, and only spend DFT where you really need it.
- Everything in this section is ground-state electronic structure. Excited states, magnetic properties, and molecular dynamics each add another layer of machinery.
12. Cheat sheet
| Concept | One-liner |
|---|---|
| Full Hamiltonian | Kinetic + Coulomb, $3(M+N)$ dims |
| Born-Oppenheimer | Separate nuclear and electronic motion |
| Hartree-Fock | Single Slater determinant, mean-field |
| SCF | Iterate until orbitals stop changing |
| Basis sets | Expand orbitals in atom-centered Gaussians |
| Post-HF | MP2 < CCSD < CCSD(T) in cost and accuracy |
| DFT | Minimize $E[\rho]$ via Kohn-Sham orbitals |
| Jacob's ladder | LDA → GGA → meta-GGA → hybrid → double hybrid |
| Failure modes | No dispersion, self-interaction error |
| Workflow | Build → opt → freq → single point → corrections |
See also
Atomic structure
Orbitals, quantum numbers, and angular momentum — the one-electron building blocks that quantum chemistry combines into molecules.
Chemical bonding
MO theory, VSEPR, and hybridization are the conceptual scaffolding built on top of quantum chemistry, and the language most chemists think in.
Organic chemistry
Arrow-pushing is shorthand for what electrons really do. Quantum chemistry maps activation barriers, selectivities, and mechanisms that would take years to probe experimentally.
Biochemistry
QM/MM lets you put an enzyme active site under a quantum-chemistry microscope while treating the surrounding protein as a classical force field.
Physics: Quantum mechanics
The underlying physics of the many-body Schrödinger equation. Everything on this page is an approximation strategy for a problem quantum mechanics only describes in principle.
Math: Linear algebra
Molecular orbitals are eigenvectors of the Fock or Kohn-Sham matrix. Everything about HF and DFT is ultimately a large Hermitian eigenvalue problem.
Further reading
- Attila Szabo and Neil S. Ostlund — Modern Quantum Chemistry: Introduction to Advanced Electronic Structure Theory (McGraw-Hill, 1989; Dover reprint, 1996). The standard graduate textbook for Hartree-Fock and post-HF. Still unmatched for pedagogical clarity.
- Frank Jensen — Introduction to Computational Chemistry (3rd ed., Wiley, 2017). A comprehensive survey covering wavefunction and density functional methods with practical guidance.
- Trygve Helgaker, Poul Jørgensen, and Jeppe Olsen — Molecular Electronic-Structure Theory (Wiley, 2000). The deep reference for anyone serious about coupled cluster and response theory.
- Robert G. Parr and Weitao Yang — Density Functional Theory of Atoms and Molecules (Oxford, 1989). The classic DFT textbook from the group that introduced most of it to chemistry.
- John Perdew's Jacob's ladder papers — see for example Perdew et al., J. Chem. Phys. 123, 062201 (2005), "Prescription for the design and selection of density functional approximations."
- PySCF documentation — pyscf.org. The easiest modern entry point to hands-on quantum chemistry in Python.