Probability

Markov Chains

Basic Markov Chains

[Markov chain diagram]

In the Markov chain pictured, what is the probability that a process beginning at state A will be at state B after 2 moves?

Let k represent the number of days until a person turns 200 years old. What is the biggest problem with this model of life?

Any general stochastic process can be made to satisfy the Markov property by altering the state space (and adding probabilities for any new states). In this way, it can "turn into" a Markov chain. Given a stochastic process with state space S, which of the following methods would create a Markov chain that models the same process?

  1. Alter S to instead contain sequences of states, the kth element of which tells where the Markov chain was at step k.
  2. Change S to contain pairs of states (i, j) instead. That way, the Markov chain can know both the present state and the previous state.
  3. Modify S to contain only one state, but make its transition probabilities contain all the information about the stochastic process.
  4. Don't change S, but make the transition probabilities satisfy the Markov property.
  5. Add a large number of "in-between" states to S, to create a bunch of little steps satisfying the Markov property.
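The state-augmentation idea can be seen concretely. Below is a sketch using a hypothetical second-order process on {0, 1} (all transition probabilities are invented for illustration): the next state depends on the previous *two* states, so it is not Markov on {0, 1}, but it becomes Markov once each state is replaced by a pair (previous, current), as in method 2 above.

```python
import random

# Hypothetical second-order process on states {0, 1}: the next state
# depends on the previous TWO states, so it is not Markov on {0, 1}.
# P_NEXT_ONE[(two steps ago, last step)] = probability the next state is 1.
P_NEXT_ONE = {(0, 0): 0.1, (0, 1): 0.6, (1, 0): 0.4, (1, 1): 0.9}

def step(prev, cur):
    """Advance the second-order process one step."""
    return 1 if random.random() < P_NEXT_ONE[(prev, cur)] else 0

# Augmented chain: each state is the PAIR (previous, current).
# The next pair depends only on the current pair, so this chain IS Markov.
def augmented_step(pair):
    prev, cur = pair
    return (cur, step(prev, cur))

# Simulate a few steps of the augmented (Markov) chain.
pair = (0, 0)
for _ in range(10):
    pair = augmented_step(pair)
```

Method 1 is the fully general version of this trick: carrying the entire history as the state works for any stochastic process, while pairs suffice only when the dependence reaches back exactly two steps.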

The Golden State Warriors are strongly affected by morale, and they win games according to the probabilities in the following table:

| States     | Win Tomorrow | Lose Tomorrow |
|------------|--------------|---------------|
| Won Today  | 4/5          | 1/5           |
| Lost Today | 1/3          | 2/3           |

If they lose the first two games in a "best of 7" (first to 4) series, what is the probability that they will be able to come back and win it?
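One way to check an answer to this problem is a short recursion over series states, sketched below. It assumes the table's probabilities are exact and that the second loss counts as "lost today" when game 3 begins; the function name `comeback` is just for illustration.

```python
from fractions import Fraction
from functools import lru_cache

# Transition probabilities from the table above (assumed exact).
P_WIN_AFTER_WIN = Fraction(4, 5)
P_WIN_AFTER_LOSS = Fraction(1, 3)

@lru_cache(maxsize=None)
def comeback(wins, losses, won_last):
    """Probability the Warriors reach 4 wins before 4 losses,
    given the current series score and yesterday's result."""
    if wins == 4:
        return Fraction(1)   # series already won
    if losses == 4:
        return Fraction(0)   # series already lost
    p = P_WIN_AFTER_WIN if won_last else P_WIN_AFTER_LOSS
    return (p * comeback(wins + 1, losses, True)
            + (1 - p) * comeback(wins, losses + 1, False))

# Down 0-2 after losing game 2:
answer = comeback(0, 2, False)
```

Using `Fraction` keeps the result exact rather than a floating-point approximation.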

In the Markov chain pictured, what is the probability that a process beginning at state A will be back at state A after 3 moves?
