
Markov Chains - Stationary Distributions


A stationary distribution of a Markov chain is a probability distribution that remains unchanged in the Markov chain as time progresses. That is, if the initial position of the chain is given by this distribution, then the position of the chain at any future time has the same distribution.

Typically, it is represented as a row vector \(\pi\) whose entries are probabilities summing to \(1.\) If \(\textbf{P}\) is the transition matrix for the chain, then \(\pi\) is a left eigenvector of \(\textbf{P}\) with eigenvalue \(1,\) i.e. \[\pi = \pi \textbf{P}.\]
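As a quick illustration (a sketch, not part of the original problem set, using a made-up two-state chain), a stationary distribution can be found numerically by iterating \(\pi \mapsto \pi \textbf{P}\) until the vector stops changing:

```python
# Hypothetical 2-state chain; rows sum to 1.
# P[i][j] = probability of moving from state i to state j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def stationary(P, iters=1000):
    """Approximate the stationary distribution by power iteration on pi -> pi P."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)
# pi now satisfies pi = pi P (up to floating-point error),
# so taking one more step leaves it unchanged.
```

For this particular matrix the fixed point works out to \(\pi = \left(\frac56, \frac16\right),\) which you can verify directly against \(\pi = \pi \textbf{P}.\)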

Unfortunately, such a stationary distribution may not always exist, and even when it does, the chain's distribution need not converge to it. For example, if a Markov chain is periodic, meaning it can return to certain states only at fixed intervals (for example, always after an even number of steps), then its distribution can oscillate forever rather than settle down. Chains that are irreducible, aperiodic, and positive recurrent are called ergodic; an ergodic chain has a unique stationary distribution, which is also its limiting distribution.
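The oscillation caused by periodicity is easy to see in the simplest period-2 chain, which deterministically swaps its two states (a sketch, not one of the pictured chains):

```python
# A period-2 chain: it deterministically alternates between states 0 and 1.
P = [[0.0, 1.0],
     [1.0, 0.0]]

def step(pi, P):
    """One step of pi -> pi P."""
    n = len(P)
    return [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

pi = [1.0, 0.0]          # start in state 0 with certainty
history = [pi]
for _ in range(4):
    pi = step(pi, P)
    history.append(pi)
# history oscillates [1,0], [0,1], [1,0], ... and never converges,
# even though pi = [0.5, 0.5] satisfies pi = pi P exactly.
```

So this chain does have a stationary distribution, \(\left(\frac12, \frac12\right),\) but a chain started anywhere else never converges to it.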

Which states, if any, are periodic in the Markov chain pictured?

The only states of weather are sun and rain. If it's sunny today, there is an \(80\%\) chance it will be sunny tomorrow, and if it's rainy today, there is a \(60\%\) chance it will be rainy tomorrow. In the long run, on what fraction of days will it be sunny?
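For any two-state chain like this one, the balance equation \(\pi = \pi \textbf{P}\) can be solved in closed form, which gives a quick way to check an answer (the matrix below just encodes the probabilities stated in the problem):

```python
# Weather chain: state 0 = sun, state 1 = rain.
# P[i][j] = probability tomorrow is state j given today is state i.
P = [[0.8, 0.2],   # sunny today -> 80% sunny, 20% rainy tomorrow
     [0.4, 0.6]]   # rainy today -> 40% sunny, 60% rainy tomorrow

# For a 2-state chain, pi = pi P reduces to the single balance equation
#   pi_sun * P(sun->rain) = pi_rain * P(rain->sun),
# whose normalized solution is:
pi_sun = P[1][0] / (P[0][1] + P[1][0])
pi_rain = 1 - pi_sun
```

Plugging the numbers in, `pi_sun` comes out to \(\frac{0.4}{0.2 + 0.4} = \frac23,\) and one can confirm that this vector is unchanged by a step of the chain.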

What is the stationary distribution of the Markov chain pictured below?

In a particular (very large) deck of cards, the suit of each card is dependent upon the suit of the previous card. If the dependencies are given by the Markov chain pictured, determine the stationary distribution of suits.

A player wins a game of tennis by being the first to win at least \(4\) points while leading by \(2\) points. If the game is tied at \(3\) points to \(3\) and the probabilities of winning each point are given by the Markov chain pictured, determine the probability that player A wins.
