What is the probability that Markov chain works?

Let's play a game with a coin. We will flip this coin repeatedly, until either we get the sequence heads, heads, tails, or we get the sequence tails, heads, heads; in the former case you win, and in the latter I win. For example, if the tosses come up $\text{HHHTTHHTTH}\ldots$, then you won by toss 4 (because of $\text{HHT}$), even though $\text{THH}$ appeared later, by toss 7. Had the second toss been tails, I would have won instead, since $\text{HHT}$ only appeared by toss 8. What is the probability that you win?

This kind of problem appears relatively often (or at least I once solved two successive problems of this kind). Although there is a specialized solution that works for this particular problem (if you get any tails in the first two tosses, you lose; otherwise you win), I will explain a more general method that can be applied to many such problems.

First, we list the possible states of the game. There are nine states, which we can encode as $\text{|*}$, $\text{|H*}$, $\text{|T*}$, $\text{HH*}$, $\text{HT*}$, $\text{TH*}$, $\text{TT*}$, $\text{HHT}$, $\text{THH}$. A pipe (|) indicates the beginning of the game, while an asterisk (*) indicates the time just before the current toss. Thus $\text{|T*}$ indicates that we got tails as the first toss and are now at the second toss, while $\text{HT*}$ indicates that the last two tosses were heads and tails, respectively. After each toss, our state may change; for example, after a toss of heads from $\text{HT*}$, we end up in the state $\text{TH*}$, since now the last two tosses are tails and heads in that order. (The fact that the third-to-last toss was heads no longer matters.) $\text{HHT}$ is a win for you, and $\text{THH}$ is a win for me; once we enter either of these, we won't go anywhere else.
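As an aside, the transition rule ("keep only the last two tosses, unless a winning triple just appeared") is easy to state in code. Here is a minimal Python sketch, where I encode $\text{|*}$ as the empty string, $\text{HT*}$ as `"HT"`, and so on; the encoding and the name `step` are my own, not part of the original note:

```python
WIN_YOU, WIN_ME = "HHT", "THH"  # your winning triple and mine

def step(state, toss):
    """Return the next state after tossing 'H' or 'T'."""
    if state in (WIN_YOU, WIN_ME):
        return state  # the game is already over
    s = state + toss
    if s in (WIN_YOU, WIN_ME):
        return s      # a winning triple just appeared
    return s[-2:]     # otherwise only the last two tosses matter

print(step("HT", "H"))  # TH, as in the example above
```

Note that `step("HH", "T")` returns `"HHT"`, an absorbing state, after which further tosses change nothing.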

Now we can construct a directed graph, where the vertices are states and the edges are possible transitions, each edge weighted by the probability of that transition (here $\frac{1}{2}$, since the coin is fair). In other words, we create a Markov chain, as given in the following diagram made lovingly by Microsoft Paint:

A Markov chain of the problem

Red arrows indicate the state change on getting heads, while orange arrows indicate the state change on getting tails. (Except for the two rightmost, finishing states, where the arrow indicates the state change regardless of the toss; once we end up in one of those, the state never changes again.) For the Markov chain itself, the different colors can be ignored.

Now, we put a number into each state, indicating the probability that you win from that state. The winning states are easy: $\text{HHT}$ has probability $1$ and $\text{THH}$ has probability $0$.

There is an important equality that always holds: $P(\text{state } u) = \sum (\text{weight of arrow leaving } u) \cdot P(\text{state pointed to by the arrow})$. For example, let's apply this to the state $\text{HH*}$. Suppose it has probability $p$. Then, according to the equality, we have $p = \frac{1}{2} \cdot 1 + \frac{1}{2} \cdot p$; the first summand corresponds to following the arrow to $\text{HHT}$ (getting tails), while the second corresponds to following the arrow back to $\text{HH*}$ (getting heads). Solving gives $p = 1$. This means that once we reach $\text{HH*}$, you will always win, no matter what.

Similarly, we can solve for $\text{HT*}$, $\text{TH*}$, and $\text{TT*}$. This is a bit harder, as we obtain a system of linear equations in three variables, but solving it gives all zeroes. Once we reach any of these three states, you're doomed.
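To check the claim that this three-variable system comes out all zeroes, here is a sketch that solves it exactly with Python's `fractions.Fraction` and plain Gaussian elimination. The variable order $(P(\text{HT*}), P(\text{TH*}), P(\text{TT*}))$ and the row encoding are my own choices; each row just rewrites $P(u) = \frac{1}{2} P(\text{heads successor}) + \frac{1}{2} P(\text{tails successor})$ with the absorbing value $P(\text{THH}) = 0$ substituted in:

```python
from fractions import Fraction as F

# System A x = b for x = (P(HT*), P(TH*), P(TT*)).
A = [[F(1), F(-1, 2), F(-1, 2)],   # HT* -> TH* (heads), TT* (tails)
     [F(-1, 2), F(1), F(0)],       # TH* -> THH = 0 (heads), HT* (tails)
     [F(0), F(-1, 2), F(1, 2)]]    # TT* -> TH* (heads), TT* (tails)
b = [F(0), F(0), F(0)]

# Gaussian elimination with back-substitution, exact over the rationals.
n = len(A)
for i in range(n):
    pivot = next(r for r in range(i, n) if A[r][i] != 0)
    A[i], A[pivot] = A[pivot], A[i]
    b[i], b[pivot] = b[pivot], b[i]
    for r in range(i + 1, n):
        factor = A[r][i] / A[i][i]
        for c in range(i, n):
            A[r][c] -= factor * A[i][c]
        b[r] -= factor * b[i]
x = [F(0)] * n
for i in reversed(range(n)):
    x[i] = (b[i] - sum(A[i][c] * x[c] for c in range(i + 1, n))) / A[i][i]

print(x)  # all three probabilities are 0
```

The matrix is invertible (its determinant is $\frac{1}{4}$), so the all-zero solution is the unique one.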

We can then continue to $\text{|H*}$ and $\text{|T*}$. For the former, we can compute $P(\text{|H*}) = \frac{1}{2} P(\text{HH*}) + \frac{1}{2} P(\text{HT*}) = \frac{1}{2} \cdot 1 + \frac{1}{2} \cdot 0 = \frac{1}{2}$. The latter can be computed in the same way to give $P(\text{|T*}) = 0$. Finally, this leads to $P(\text{|*}) = \frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot 0 = \frac{1}{4}$ for the starting state, which means that at the beginning you have only a $\frac{1}{4}$ probability of winning. Yes, I'm a jerk.
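As a sanity check on all of the numbers above, one can iterate the equality $P(u) = \sum (\text{weight}) \cdot P(v)$ over every state until the values converge; for an absorbing chain like this one the error halves each sweep. A self-contained Python sketch, where the state encoding ($\text{|*}$ as the empty string, and so on) and all names are my own:

```python
WIN_YOU, WIN_ME = "HHT", "THH"  # your winning triple and mine

def step(state, toss):
    """Next state after one toss; absorbing states never change."""
    if state in (WIN_YOU, WIN_ME):
        return state
    s = state + toss
    return s if s in (WIN_YOU, WIN_ME) else s[-2:]

# "" is |*, "H" is |H*, "HT" is HT*, and so on.
states = ["", "H", "T", "HH", "HT", "TH", "TT", WIN_YOU, WIN_ME]
P = {s: 0.0 for s in states}
P[WIN_YOU] = 1.0
for _ in range(200):  # far more sweeps than needed for convergence
    P = {s: P[s] if s in (WIN_YOU, WIN_ME)
            else 0.5 * P[step(s, "H")] + 0.5 * P[step(s, "T")]
         for s in states}

print(P[""], P["H"], P["HH"])  # 0.25 0.5 1.0
```

This reproduces every value derived above: $1$ for $\text{HH*}$, $0$ for $\text{HT*}$, $\text{TH*}$, $\text{TT*}$, and $\text{|T*}$, $\frac{1}{2}$ for $\text{|H*}$, and $\frac{1}{4}$ for the start.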

A Markov chain after putting the probabilities of winning from states

This visualization allows us to solve most problems of this sort more cleanly, without getting confused about how states lead to each other and what the equations are. It is not much simpler than writing the equations without the visualization, but it helps make sure you won't miss anything, and it also helps to tell which nodes you can compute earlier. In the worst case, yes, you need to set up a system of linear equations and solve it by hand as usual, but hey, at least you now know that this is the best way available.

Sample problems (read: the two problems I did that made me want to write this note):

Note by Ivan Koswara
4 years, 6 months ago


check this article http://www.sciencedirect.com/science/article/pii/S0167715210002270

collatz lothal - 4 years, 6 months ago


Thanks for the detailed and well-written note! It gave me a head start on Markov chains in probability...

Pawan Kumar - 4 years, 2 months ago

