# Master Theorem

The **master theorem** provides a solution to recurrence relations of the form

\[ T(n) = a T(n/b) + f(n), \]

for constants \( a \geq 1 \), \( b > 1 \) and \( f \) asymptotically positive. Such recurrences occur frequently in the runtime analysis of many commonly encountered algorithms.

## Introduction

Many algorithms have a runtime of the form

\[ T(n) = a T(n/b) + f(n), \]

where \( n \) is the size of the input and \( a \geq 1 \) and \( b > 1 \) are constants with \( f \) asymptotically positive. For instance, one can show that the runtime of the *merge sort* algorithm satisfies

\[ T(n) = 2T(n/2) + n. \]
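A minimal Python sketch of merge sort makes the recurrence concrete: each call spawns two half-size subproblems and then does a linear-time merge, giving \( T(n) = 2T(n/2) + n \).

```python
def merge_sort(a):
    """Sort a list. Each call recurses on two half-size subproblems
    and merges in linear time, so T(n) = 2T(n/2) + n."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves in O(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```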

Similarly, traversing a binary tree takes time

\[ T(n) = 2 T(n/2) + O(1).\]
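An in-order traversal sketched in Python shows where this recurrence comes from: each node contributes \( O(1) \) work plus two recursive calls, and on a balanced tree each subtree holds about half the nodes.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def traverse(node, visit):
    """In-order traversal: O(1) work per node plus two recursive calls.
    On a balanced tree this gives T(n) = 2T(n/2) + O(1)."""
    if node is None:
        return
    traverse(node.left, visit)
    visit(node.value)          # the O(1) work at this node
    traverse(node.right, visit)
```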

By comparing \( n^{\log_b{a}} \) to the asymptotic behavior of \( f(n) \), the **master theorem** provides a solution to many frequently seen recurrences.

## Statement of the master theorem

First consider an algorithm with a recurrence of the form

\[T(n) = a T(n/b).\]

Drawing a recursion tree suggests that such an algorithm ought to have a runtime of \( \Theta\left(n^{\log_b{a}}\right) \). The tree has \( \log_b{n} \) levels, with \( a^i \) nodes at level \( i \), so the total work done must be \(\Theta\left(a^{\log_b{n}}\right) = \Theta\left(n^{\log_b{a}}\right) \).
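The identity \( a^{\log_b{n}} = n^{\log_b{a}} \) used here follows from writing \( a = b^{\log_b{a}} \) and swapping the exponents:

\[ a^{\log_b{n}} = \left(b^{\log_b{a}}\right)^{\log_b{n}} = \left(b^{\log_b{n}}\right)^{\log_b{a}} = n^{\log_b{a}}. \]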

Intuitively, the master theorem argues that if an asymptotically positive function \( f \) is added to the recurrence, so that one instead has

\[ T(n) = a T(n/b) + f(n), \]

it is possible to determine the asymptotic form of \( T \) based on a relative comparison between \( f \) and \( n^{\log_b{a}} \).

**Master theorem.** Given a recurrence of the form

\[ T(n) = a T(n/b) + f(n), \]

for constants \( a \geq 1 \), \( b > 1 \) and \( f \) asymptotically positive, the following statements are true:

**Case 1.** If \( f(n) = O\left(n^{\log_b{a} - \epsilon}\right) \) for some \( \epsilon > 0 \), then \( T(n) = \Theta\left(n^{\log_b{a}}\right) \).

**Case 2.** If \( f(n) = \Theta\left(n^{\log_b{a}}\right) \), then \( T(n) = \Theta\left(n^{\log_b{a}} \log{n} \right) \).

**Case 3.** If \( f(n) = \Omega\left(n^{\log_b{a} + \epsilon}\right) \) for some \( \epsilon > 0 \) (and \( af(n/b) \leq c f(n) \) for some constant \( c < 1 \) and all sufficiently large \( n \)), then \( T(n) = \Theta\left(f(n)\right) \). \(_\square\)

Simply put, if \( f(n) \) is polynomially smaller than \( n^{\log_b{a}} \), then \( n^{\log_b{a}} \) dominates, and the runtime is \( \Theta\left(n^{\log_b{a}}\right) \). If \( f(n) \) is instead polynomially larger than \( n^{\log_b{a}} \), then \( f(n) \) dominates, and the runtime is \( \Theta\left(f(n)\right) \). Finally, if \( f(n) \) and \( n^{\log_b{a}} \) are asymptotically the same, then \( T(n) = \Theta\left(n^{\log_b{a}} \log{n} \right) \).
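For the common situation where \( f(n) = \Theta\left(n^k\right) \) is a polynomial, the three-way comparison can be sketched as a small Python helper (an illustrative function, not a standard library routine; for polynomial \( f \), the regularity condition of case 3 holds automatically):

```python
import math

def solve_master(a, b, k, eps=1e-9):
    """Classify T(n) = a T(n/b) + Theta(n^k) by the master theorem.

    Assumes f(n) = n^k; the tolerance eps absorbs floating-point
    error in computing the critical exponent log_b(a).
    """
    crit = math.log(a, b)            # the critical exponent log_b(a)
    if k < crit - eps:               # case 1: f polynomially smaller
        return f"Theta(n^{crit:g})"
    if abs(k - crit) <= eps:         # case 2: f matches n^(log_b a)
        return f"Theta(n^{k:g} log n)"
    return f"Theta(n^{k:g})"         # case 3: f polynomially larger
```

For example, `solve_master(2, 2, 1)` classifies the merge sort recurrence as case 2.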

Note that the master theorem does not provide a solution for all \( f \). In particular, if \( f \) is smaller or larger than \( n^{\log_b{a}} \) by less than a polynomial factor, then none of the three cases are satisfied. For instance, consider the recurrence

\[ T(n) = 3 T(n/3) + n \log{n}.\]

In this case, \( n^{\log_b{a}} = n \). While \( f \) is asymptotically larger than \( n \), it is larger only by a logarithmic factor; it is not the case that \( f(n) = \Omega\left(n^{\log_b{a} + \epsilon}\right) \) for some \( \epsilon > 0 \). Therefore, the master theorem makes no claim about the solution to this recurrence.

## Examples

As mentioned in the introduction, the mergesort algorithm has runtime

\[ T(n) = 2T(n/2) + n. \]

\( n^{\log_b{a}} = n \) and \( f(n) = n \), so case 2 of the master theorem gives \( T(n) = \Theta\left(n^{\log_b{a}} \log{n} \right) = \Theta\left(n \log{n}\right) \).

Similarly, as mentioned before, traversing a binary tree takes time

\[ T(n) = 2 T(n/2) + O(1).\]

Here, \( n^{\log_b{a}} = n \) and \( f(n) = O(1) \), which is polynomially smaller than \( n \), so case 1 of the master theorem gives \( T(n) = \Theta\left(n^{\log_b{a}}\right) = \Theta\left(n\right) \).

Consider the recurrence

\[ T(n) = 9 T(n/3) + n.\]

In this case, \( n^{\log_b{a}} = n^2 \) and \( f(n) = n \). Since \( f(n) \) is polynomially smaller than \( n^{\log_b{a}} \), case 1 of the master theorem implies that \( T(n) = \Theta\left(n^2\right) \).

Consider the recurrence

\[ T(n) = 27 T(n/3) + n^3. \]

In this case, \( n^{\log_b{a}} = n^3 \) and \( f(n) = n^3 \). Since \( f(n) \) is asymptotically the same as \( n^{\log_b{a}} \), case 2 of the master theorem implies that \( T(n) = \Theta\left(n^3 \log{n} \right) \).

Consider the recurrence

\[ T(n) = 2 T(n/2) + n^2. \]

In this case, \( n^{\log_b{a}} = n \) and \( f(n) = n^2 \). Since \( f(n) \) is asymptotically larger than \( n^{\log_b{a}} \), case 3 of the master theorem asks us to check whether \( af(n/b) \leq c f(n) \) for some \( c < 1 \) and all \( n \) sufficiently large. Here \( 2f(n/2) = 2(n/2)^2 = n^2/2 \), so the condition holds with \( c = 1/2 \), and \( T(n) = \Theta\left(f(n)\right) = \Theta\left( n^2 \right) \).
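As a sanity check, one can tabulate this recurrence exactly in Python (with the illustrative base case \( T(1) = 1 \), which does not affect the asymptotics) and watch \( T(n)/n^2 \) approach a constant:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """Exact values of T(n) = 2 T(n/2) + n^2 with T(1) = 1
    (base case chosen arbitrarily for illustration)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n * n

# T(n)/n^2 settles to a constant as n grows, consistent with Theta(n^2).
for p in (4, 8, 12, 16):
    n = 2 ** p
    print(n, T(n) / n**2)
```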

Consider the recurrence

\[ T(n) = 8 T(n/2) + n^3/\log{n}. \]

In this case, \( n^{\log_b{a}} = n^3 \) and \( f(n) = n^3/\log{n} \). \( f(n) \) is smaller than \( n^{\log_b{a}} \) but by less than a polynomial factor. Therefore, the master theorem makes no claim about the solution to the recurrence.
