An ergodic Markov chain is an aperiodic Markov chain all of whose states are positive recurrent.
An irreducible Markov chain has a stationary distribution if and only if all of its states are positive recurrent, and in that case the stationary distribution is unique. If the chain is moreover ergodic (i.e. also aperiodic), the distribution of the chain converges to this stationary distribution regardless of the initial state.
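As a concrete illustration, the stationary distribution of a small ergodic chain can be found numerically as the left eigenvector of the transition matrix with eigenvalue 1. The 3-state chain below is a hypothetical example, not one from the text:

```python
import numpy as np

# A hypothetical 3-state ergodic (irreducible, aperiodic) chain.
# Row i gives the transition probabilities out of state i.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# A stationary distribution satisfies pi P = pi with sum(pi) = 1,
# i.e. pi is a left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Stationarity check: one more step of the chain leaves pi unchanged.
assert np.allclose(pi @ P, pi)
print(pi)
```

Since the chain is ergodic, this eigenvector is unique up to scaling, and normalizing it to sum to 1 gives the stationary distribution.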
Let X be an ergodic Markov chain with states 1, 2, ..., n and stationary distribution π = (π_1, ..., π_n). If the process begins at state i, the expected number of steps until it returns to state i is 1/π_i.
Many probabilities and expected values for an ergodic Markov chain can be computed by modeling it as an absorbing Markov chain with a single absorbing state. Changing one state of an ergodic Markov chain into an absorbing state immediately produces an absorbing chain: ergodic Markov chains are irreducible, so every state can reach the new absorbing state.
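This technique can be sketched concretely: make one state absorbing, form the submatrix Q of transitions among the remaining states, and read expected absorption times off the fundamental matrix N = (I − Q)⁻¹. The 3-state chain below is a hypothetical example; for it, the computed mean return time agrees with the 1/π_i formula:

```python
import numpy as np

# A hypothetical 3-state ergodic chain (same form as a generic example).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# Turn state 0 into an absorbing state: once entered, it is never left.
A = P.copy()
A[0] = [1.0, 0.0, 0.0]

# Q: transition probabilities among the non-absorbing states {1, 2}.
Q = A[1:, 1:]

# Fundamental matrix N = (I - Q)^{-1}; its row sums give the expected
# number of steps before absorption, starting from each transient state.
N = np.linalg.inv(np.eye(2) - Q)
h = N @ np.ones(2)   # h[j] = expected steps from state j+1 to reach state 0

# Mean return time to state 0 in the original chain: one step out of 0,
# then the expected hitting time of 0 from wherever that step lands.
return_time = 1.0 + P[0, 1] * h[0] + P[0, 2] * h[1]
print(return_time)   # 28/9 for this chain, matching 1/pi_0
```

Here the absorbing-chain calculation recovers exactly the quantity from the return-time theorem, which is the typical use of this modeling trick.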