# Blackbody Radiation

Many consider Max Planck's investigation of blackbody radiation at the turn of the twentieth century as the beginning of quantum mechanics and modern physics. After all, given the technology of Planck's era, the blackbody represented perhaps the simplest macroscopic system that displayed an overt deviation from classical physics that could be easily measured.

An **ideal blackbody** absorbs all incident electromagnetic radiation, all of which is subsequently *re-radiated*. At thermal equilibrium, the rate at which a blackbody absorbs energy is equal to the rate at which it radiates energy. Using the principles of statistical physics, it can be shown that the resulting spectral distribution of the radiation of the blackbody depends only on its temperature $T$.

## The Ultraviolet Catastrophe

Thermodynamically, one can model a blackbody essentially as a gas of photons. In general, one can consider each photon to have some energy $E(f)$ as a function of its frequency $f$.$^\text{[1]}$ The classical assumption is that $E$ is independent of $f$ and that, furthermore, $E$ is allowed to vary continuously. Where $k$ denotes the Boltzmann constant, these assumptions lead to the **Rayleigh-Jeans law**

$u(f, T) = \frac{8\pi f^2}{c^3} kT,$

which gives the spectral energy density (energy per unit volume per unit frequency) of a blackbody.

The spectral energy density is given by

$u(f, T) = \bar{E} N(f),$

where $\bar{E}$ is the average energy of the photons and $N(f)$ denotes the density of photons per unit volume per unit frequency $f$. Statistical mechanics gives that the photon energies follow a Boltzmann distribution $e^{-E/(kT)}$, so the average energy is

$\bar{E} = \frac{\int_0^\infty E e^{-E/(kT)} \, dE}{\int_0^\infty e^{-E/(kT)} \, dE} = \frac{(kT)^2}{kT} = kT.$
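As a sanity check on this integral, the following Python sketch estimates the Boltzmann-weighted average numerically with the trapezoidal rule (truncating the integrals where the integrand is negligible) and recovers $\bar{E} = kT$:

```python
import math

def average_energy_classical(kT: float, e_max: float = 50.0, n: int = 100_000) -> float:
    """Estimate <E> = (integral of E e^{-E/kT} dE) / (integral of e^{-E/kT} dE)
    by the trapezoidal rule, truncating both integrals at e_max * kT,
    beyond which the integrand is negligibly small."""
    de = e_max * kT / n
    num = 0.0
    den = 0.0
    for i in range(n + 1):
        E = i * de
        w = math.exp(-E / kT)
        weight = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        num += weight * E * w * de
        den += weight * w * de
    return num / den

# The ratio comes out equal to kT, independent of the value of kT.
print(average_energy_classical(1.0))  # ≈ 1.0
print(average_energy_classical(2.5))  # ≈ 2.5
```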

Meanwhile, it can be shown that the photons have a density of

$N(f) = \frac{8\pi f^2}{c^3}.$

Therefore

$u(f, T) = \frac{8\pi f^2}{c^3} kT.\ _\square$

For small $f$, the Rayleigh-Jeans law agrees reasonably well with experimental values. However, for larger $f$, such as the ultraviolet portion of the spectrum, the Rayleigh-Jeans distribution fails entirely. Worse yet, the form of $u$ is unbounded as $f$ grows large. When integrated over the entire spectrum, the Rayleigh-Jeans distribution would suggest that blackbodies radiate infinite energy, a problem that became known as the "ultraviolet catastrophe."
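A short Python sketch (using rounded SI values for $h$, $c$, and $k$) makes this concrete: at low frequencies the two laws agree, while at high frequencies the Rayleigh-Jeans prediction exceeds the observed (Planck) spectrum by an enormous factor.

```python
import math

# Rounded SI constants
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def rayleigh_jeans(f: float, T: float) -> float:
    """Classical spectral energy density: grows without bound as f^2."""
    return 8 * math.pi * f**2 / c**3 * k * T

def planck(f: float, T: float) -> float:
    """Planck spectral energy density: vanishes as f grows large."""
    x = h * f / (k * T)
    return 8 * math.pi * h * f**3 / c**3 / math.expm1(x)  # expm1 is accurate for small x

T = 5800.0  # K
for f in (1e12, 1e14, 1e16):  # infrared -> near-visible -> extreme UV
    ratio = rayleigh_jeans(f, T) / planck(f, T)
    print(f"f = {f:.0e} Hz: Rayleigh-Jeans / Planck = {ratio:.3g}")
```

At $f = 10^{12}\,\text{Hz}$ the ratio is essentially 1; at $f = 10^{16}\,\text{Hz}$ the classical law overshoots by dozens of orders of magnitude, which is the ultraviolet catastrophe in miniature.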

## Planck's Law

There were many failed approaches to resolving the ultraviolet catastrophe. Planck's key insight was that if the photons were not allowed to take on arbitrary energies but, rather, only discrete (*quantized*) multiples of a basic unit proportional to their frequency, then the correct spectrum is obtained. In particular, Planck considered there to be some constant $h$ such that the energy of a photon with frequency $f$ could be any integer multiple of $hf$, or

$E(f) = nhf$

for nonnegative integers $n$.

Summing over all $n$ gives the average photon energy, and the resulting energy spectrum has the form

$u(f, T) = \frac{8\pi f^2}{c^3} \left(\frac{hf}{e^{hf/(kT)} - 1}\right),$

which, unlike the Rayleigh-Jeans law, vanishes as $f$ grows large.

As with the derivation of the Rayleigh-Jeans law, for average photon energy $\bar{E}$ and photon density $N(f)$, the spectral energy density is

$u(f, T) = \bar{E} N(f),$

where $\bar{E}$ is the average energy of the photons and $N(f)$ denotes the density of photons per frequency $f$. Assuming a discrete distribution of photon energy $E(f) = nhf$, the average energy can be found by computing

$\displaystyle \bar{E} =\frac{\sum_{n=0}^\infty nhf e^{-nhf/(kT)}}{\sum_{n=0}^\infty e^{-nhf/(kT)}} = \frac{hf}{e^{hf/(kT)} - 1}.$

(The sums can be evaluated using standard geometric series methods.)
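The geometric-series result can be checked numerically: truncated partial sums of the two series converge to the closed form. A minimal Python sketch:

```python
import math

def planck_average_energy(hf: float, kT: float, terms: int = 200) -> float:
    """Truncated partial sums of
    <E> = sum(n*hf * exp(-n*hf/kT)) / sum(exp(-n*hf/kT)),  n = 0, 1, 2, ...
    The common ratio r = exp(-hf/kT) < 1, so 200 terms is ample."""
    r = math.exp(-hf / kT)
    num = sum(n * hf * r**n for n in range(terms))
    den = sum(r**n for n in range(terms))
    return num / den

def closed_form(hf: float, kT: float) -> float:
    """hf / (exp(hf/kT) - 1), from summing the geometric series exactly."""
    return hf / math.expm1(hf / kT)

hf, kT = 1.0, 2.0  # arbitrary units; the agreement holds for any values
print(planck_average_energy(hf, kT), closed_form(hf, kT))  # the two agree
```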

As before, the photon density is

$N(f) = \frac{8\pi f^2}{c^3}.$

Therefore,

$u(f, T) = \frac{8\pi f^2}{c^3} \left(\frac{hf}{e^{hf/(kT)} - 1}\right),$

which is the Planck distribution. $_\square$

At the time, Planck viewed the quantization of $E$ as a purely mathematical "trick," but eventually it became clear that the quantization of light had a physical basis, which was fleshed out by Einstein just a few years later in his explanation of the photoelectric effect.

Since the radiation emitted by a blackbody is *isotropic* (the same in all directions), it holds that the *intensity* (power per unit area) of radiation is simply $I = \frac{cu}4$. One can thus express Planck's law in terms of intensity and wavelength as

$I(\lambda, T) = \frac{2\pi}{\lambda^5} \left(\frac{hc^2}{e^{hc/(\lambda kT)} - 1}\right).$
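The wavelength form can be obtained from the frequency form by a change of variables: requiring that corresponding frequency and wavelength bands carry the same energy gives

$I(\lambda, T) = \frac{c}{4}\, u(f, T) \left|\frac{df}{d\lambda}\right|, \qquad f = \frac{c}{\lambda}, \qquad \left|\frac{df}{d\lambda}\right| = \frac{c}{\lambda^2},$

so that substituting into Planck's law yields

$I(\lambda, T) = \frac{c}{4} \cdot \frac{8\pi h (c/\lambda)^3}{c^3} \cdot \frac{1}{e^{hc/(\lambda kT)} - 1} \cdot \frac{c}{\lambda^2} = \frac{2\pi h c^2}{\lambda^5} \cdot \frac{1}{e^{hc/(\lambda kT)} - 1}.$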

It can be shown that the wavelength at which the maximum intensity is achieved is directly proportional to $\frac{1}T$:

$\lambda_{\text{max}} \propto \frac{1}{T},$

a result known as the **Wien displacement law**. The constant of proportionality is the **Wien displacement constant** $b \approx 2900 \, \mu\text{m} \cdot \text{K}$.
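The displacement law can be verified numerically: locating the maximum of the wavelength-form Planck intensity on a fine grid gives $\lambda_\text{max} T \approx 2.90 \times 10^{-3}\,\text{m}\cdot\text{K}$ regardless of temperature. A Python sketch (with rounded SI constants):

```python
import math

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck_intensity(lam: float, T: float) -> float:
    """Planck's law in wavelength form, W per m^2 per m."""
    return (2 * math.pi * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

def peak_wavelength(T: float, n: int = 100_000) -> float:
    """Locate the intensity maximum by scanning a fine grid.
    The peak lies where hc/(lam*k*T) is about 5, so bracketing that
    region between x = 50 and x = 0.5 is safe for any T."""
    lo = h * c / (50 * k * T)
    hi = h * c / (0.5 * k * T)
    lams = [lo + i * (hi - lo) / n for i in range(1, n)]
    return max(lams, key=lambda lam: planck_intensity(lam, T))

for T in (2.7, 300.0, 5800.0):  # CMB, room temperature, the sun
    print(f"T = {T} K -> lambda_max * T = {peak_wavelength(T) * T * 1e6:.3f} um*K")
```

Each line prints roughly the same product, about $2.90\,\mu\text{m}\cdot\text{K}$, which is Wien's constant $b$.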

In general, as a blackbody heats up, its emission spectrum shifts toward higher energies, meaning peak emission at a shorter wavelength or, equivalently, a higher frequency.

## Stellar Radiation

As a first approximation, many objects, such as stars, can be treated as ideal blackbodies, for which the Planck law applies. For instance, it can be experimentally determined that the sun has peak intensity at a wavelength of about $500 \, \text{nm}$, so Wien's displacement law gives that the temperature of the sun is

$T_\text{sun} = \frac{2900 \, \mu\text{m} \cdot \text{K}}{500 \, \text{nm}} = 5800 \, \text{K}.$

(The temperatures of stars are, in fact, determined by measuring their emission spectra.)

The Planck distribution can be integrated over the entire spectrum to show that the intensity of the radiation (power per unit area) $I$ is proportional to the fourth power of the temperature $T$, a result known as the **Stefan-Boltzmann law**:

$I = \sigma T^4,$

where $\sigma$ is the *Stefan-Boltzmann constant*

$\sigma = \frac{2 \pi^5 k^4}{15 c^2 h^3}.$

This allows us to calculate, for example, the total power $P$ emitted by the sun. Since the radiation from the sun is emitted over a spherical surface of radius $r_\text{sun}$, one finds

$P = 4 \pi r_\text{sun}^2 I = 4 \pi r_\text{sun}^2 \sigma T_\text{sun}^4.$

Using the value of $T_\text{sun} = 5800 \, \text{K}$ obtained earlier and $r_\text{sun} = 6.96 \cdot 10^8 \, \text{m}$ yields $P_\text{sun} = 3.8 \cdot 10^{26} \, \text{W}$.
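This calculation is easy to reproduce in Python using the expression for $\sigma$ above and rounded SI constants; small differences from the quoted $3.8 \cdot 10^{26} \, \text{W}$ come from rounding:

```python
import math

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

# Stefan-Boltzmann constant from first principles
sigma = 2 * math.pi**5 * k**4 / (15 * c**2 * h**3)
print(f"sigma = {sigma:.3e} W m^-2 K^-4")   # ≈ 5.68e-8

T_sun = 5800.0   # K, from Wien's displacement law above
r_sun = 6.96e8   # m, radius of the sun

# Total power radiated over the sun's spherical surface
P_sun = 4 * math.pi * r_sun**2 * sigma * T_sun**4
print(f"P_sun = {P_sun:.2e} W")   # ≈ 3.9e26
```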

**Temperature of the universe:** The cosmic microwave background (CMB) originates from hot plasma in the early universe. Suppose we treat the source of the CMB as a blackbody. If the universe expands at a constant rate such that its volume $V$ is given by

$V = H_0 t,$

where $H_0$ is the Hubble constant, then the total energy emitted by the CMB is given by

$E = \sigma T^4 V = \sigma T^4 H_0 t,$

where $T$ is the temperature of the CMB. Because energy is conserved, over time the temperature of the universe will become

$T = \left(\frac{E}{\sigma H_0 t}\right)^{\frac14}.$

In other words, as the universe expands, its temperature will gradually cool down (the current temperature of the CMB is about $2.7 \, \text{K}$).

## Notes

[1] Historically, scientists would not have referred to electromagnetic radiation as photons but as oscillating waves; Einstein was the first to treat light as discrete quanta, just a few years after Planck's work. But essentially the same arguments apply.

**Cite as:** Blackbody Radiation. *Brilliant.org*. Retrieved from https://brilliant.org/wiki/blackbody-radiation/