Ergodic Markov Chains
[Figure: a random walk with reflection at the boundaries, an example of an ergodic Markov chain]
A Markov chain that is aperiodic and positive recurrent is known as ergodic. Ergodic Markov chains are, in some senses, the processes with the "nicest" behavior.
Definition
An ergodic Markov chain is an aperiodic Markov chain, all states of which are positive recurrent.
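For a finite chain, ergodicity can be tested mechanically: a finite Markov chain is ergodic exactly when its transition matrix is regular, i.e. some power of the matrix has all strictly positive entries, and by Wielandt's theorem it suffices to check the power \((n-1)^2 + 1\) for an \(n\)-state chain. The following is a minimal sketch (the example matrices are illustrations, not from the article):

```python
import numpy as np

def is_ergodic(P, tol=1e-12):
    """Test ergodicity of a finite Markov chain with transition matrix P.

    A finite chain is ergodic (irreducible and aperiodic) iff P is regular:
    some power of P is entrywise positive. By Wielandt's theorem it is
    enough to check the power (n-1)^2 + 1.
    """
    n = P.shape[0]
    Pk = np.linalg.matrix_power(P, (n - 1) ** 2 + 1)
    return bool(np.all(Pk > tol))

# A lazy random walk on {0, 1, 2}: self-loops break periodicity, so it is ergodic.
P_lazy = np.array([[0.5, 0.5, 0.0],
                   [0.5, 0.0, 0.5],
                   [0.0, 0.5, 0.5]])

# A two-state chain that deterministically alternates: irreducible but
# periodic with period 2, hence not ergodic.
P_flip = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
```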
Properties
An irreducible Markov chain has a stationary distribution if and only if all of its states are positive recurrent, and in that case the stationary distribution is unique. If the chain is moreover ergodic (that is, also aperiodic), the distribution of the chain converges to this stationary distribution regardless of the starting state.
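Numerically, the stationary distribution can be found by solving the singular linear system \(\pi P = \pi\) together with the normalization \(\sum_i \pi_i = 1\). A sketch, using an example chain of my own choosing (its transition matrix is doubly stochastic, so the stationary distribution is uniform):

```python
import numpy as np

def stationary_distribution(P):
    """Solve pi P = pi with sum(pi) = 1.

    The system (P^T - I) pi = 0 is singular, so one of its equations is
    replaced by the normalization constraint before solving.
    """
    n = P.shape[0]
    A = P.T - np.eye(n)
    A[-1, :] = 1.0            # replace the last equation with sum-to-one
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# Example ergodic chain (doubly stochastic, so pi should be uniform).
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])
pi = stationary_distribution(P)
```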
Let \(X_0, X_1, \ldots\) be an ergodic Markov chain with states \(1, 2, \ldots, n\) and stationary distribution \(\pi = (\pi_1, \ldots, \pi_n)\). If the process begins at state \(i\), the expected number of steps to return to state \(i\) is \(\frac{1}{\pi_i}\).
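The return-time identity can be checked by simulation. The sketch below (the chain and the trial count are my own illustration) estimates the mean return time to state 0 for a three-state chain whose stationary distribution is uniform, so the expected return time should be close to \(1/\pi_0 = 3\):

```python
import numpy as np

rng = np.random.default_rng(0)

# Example ergodic chain with uniform stationary distribution (doubly stochastic).
P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])

def mean_return_time(P, i, trials=5000):
    """Average number of steps for the chain started at state i to revisit i."""
    n = P.shape[0]
    total = 0
    for _ in range(trials):
        state, steps = i, 0
        while True:
            state = rng.choice(n, p=P[state])
            steps += 1
            if state == i:
                break
        total += steps
    return total / trials

estimate = mean_return_time(P, 0)   # should be near 1 / pi_0 = 3
```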
Many probabilities and expected values for ergodic Markov chains can be computed by modeling them as absorbing Markov chains with a single absorbing state. Converting any one state of an ergodic Markov chain into an absorbing state immediately yields an absorbing Markov chain: since ergodic chains are irreducible, every state can still reach the new absorbing state.
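As one worked instance of this trick (the chain is my own example, not from the article): make state 0 absorbing, let \(Q\) be the transition matrix among the remaining transient states, and use the standard fundamental matrix \(N = (I - Q)^{-1}\) of absorbing-chain theory. The row sums of \(N\) give the expected number of steps to reach state 0 from each other state:

```python
import numpy as np

P = np.array([[0.5, 0.5, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.5, 0.5]])

# Make state 0 absorbing; Q is the sub-matrix of transitions among the
# remaining states 1 and 2, which are now transient.
Q = P[1:, 1:]

# Fundamental matrix of the absorbing chain: N = (I - Q)^{-1}.
N = np.linalg.inv(np.eye(Q.shape[0]) - Q)

# Row sums of N = expected number of steps until absorption (i.e. until
# the chain first hits state 0) starting from states 1 and 2.
expected_steps = N.sum(axis=1)
```

First-step analysis confirms the output: the hitting times \(h_1, h_2\) of state 0 satisfy \(h_1 = 1 + \tfrac{1}{2} h_2\) and \(h_2 = 1 + \tfrac{1}{2} h_1 + \tfrac{1}{2} h_2\), giving \(h_1 = 4\) and \(h_2 = 6\).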