Law of Iterated Expectation
The Law of Iterated Expectation states that the expected value of a random variable equals the weighted average of its conditional expected values given a second random variable, where the weights are the probabilities of that second variable's outcomes. Intuitively speaking, the law says that the expected outcome of an event can be calculated using casework on the possible outcomes of an event it depends on; for instance, if the probability of rain tomorrow depends on whether it rains today, and all of the following are known:
- The probability of rain today
- The probability of rain tomorrow given that it rained today
- The probability of rain tomorrow given that it did not rain today
the probability of rain tomorrow can be calculated by considering both cases (it rained today/it did not rain today) in turn. To use specific numbers, suppose that
- The probability of rain today is 70%
- If it rains today, it will rain tomorrow with probability 30%
- If it does not rain today, it will rain tomorrow with probability 90%
In this case, the probability of rain tomorrow is
\[0.7 \cdot 0.3 + (1 - 0.7) \cdot 0.9 = 0.21 + 0.27 = 0.48 = 48\%\]
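The same casework translates directly into a few lines of Python; this is just a sketch of the arithmetic above (the variable names are illustrative):

```python
# Casework on whether it rains today (law of total probability).
p_rain_today = 0.7
p_rain_tomorrow_if_rain = 0.3   # given that it rains today
p_rain_tomorrow_if_dry = 0.9    # given that it does not rain today

p_rain_tomorrow = (p_rain_today * p_rain_tomorrow_if_rain
                   + (1 - p_rain_today) * p_rain_tomorrow_if_dry)
print(p_rain_tomorrow)  # approximately 0.48
```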
The Law of Iterated Expectation is useful when the probability distributions of a random variable \(X\) and of the conditional random variable \(Y|X\) are known, and the probability distribution of \(Y\) is desired. This situation occurs extremely often in practice, especially in fields such as economics and poker.
Formal definition
Let \(X,Y\) be random variables. Then
\[\mathbb{E}[X] = \mathbb{E}[\mathbb{E}[X|Y]]\]
where \(X|Y\) is the conditional probability distribution of \(X\) given \(Y\).
If \(Y\) takes one of the outcomes \(A_1, A_2, \ldots, A_n\), the law can be written in the more natural form
\[\mathbb{E}[X] = \sum_{i=1}^{n}\mathbb{E}[X|A_i]P(A_i)\]
This means that the expected value of \(X\) can be calculated from the probability distributions of \(X|Y\) and \(Y\), which is often useful in both theory and practice.
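As a numerical sanity check (not part of the original argument), the identity can be tested with a small Monte Carlo sketch: draw \(Y\) first, then draw \(X\) conditionally on \(Y\), and compare the empirical mean of \(X\) with the weighted sum of conditional means. The two-outcome setup and all numbers below are assumptions chosen purely for illustration.

```python
import random

# Assumed setup: Y is outcome A1 with probability 0.7 and A2 otherwise;
# given A1, X is Bernoulli(0.3); given A2, X is Bernoulli(0.9).
p_A1 = 0.7
p_X_given = {"A1": 0.3, "A2": 0.9}

random.seed(0)
n = 100_000
total = 0
for _ in range(n):
    y = "A1" if random.random() < p_A1 else "A2"    # draw Y
    x = 1 if random.random() < p_X_given[y] else 0  # draw X | Y
    total += x

empirical_mean = total / n
weighted_sum = p_X_given["A1"] * p_A1 + p_X_given["A2"] * (1 - p_A1)
print(empirical_mean, weighted_sum)  # both should be close to 0.48
```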
Example
For example, consider a star basketball player who makes a shot (worth 2 points) 80% of the time when unguarded, but only 40% of the time when guarded. Against the team's current opponent, the player will be guarded 70% of the time. Then, when the player shoots, the Law of Iterated Expectation says that
\[\mathbb{E}[\text{points scored}] = \mathbb{E}[\text{points scored} \,|\, \text{guarded}]\,P(\text{guarded})+\mathbb{E}[\text{points scored} \,|\, \text{unguarded}]\,P(\text{unguarded})\]
\[\mathbb{E}[\text{points scored}] = 2 \cdot 0.4 \cdot 0.7 + 2 \cdot 0.8 \cdot 0.3 = 0.56 + 0.48 = 1.04\]
so the player scores an average of 1.04 points each time he shoots.
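The same computation as a short Python sketch (the variable names are illustrative):

```python
# Expected points per shot, by casework on whether the shooter is guarded.
points_per_make = 2
p_guarded = 0.7
p_make_if_guarded = 0.4
p_make_if_unguarded = 0.8

expected_points = (points_per_make * p_make_if_guarded * p_guarded
                   + points_per_make * p_make_if_unguarded * (1 - p_guarded))
print(expected_points)  # approximately 1.04
```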
This law has practical applications for the defending team: for each defensive scheme, they can calculate how many points their opponents will score on average, so long as they know
- How often each player will be guarded
- How well each player shoots while guarded, and how well they shoot while unguarded
which allows the defence to pick the best scheme possible, as in the sketch below.
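Here is a hypothetical sketch of that comparison. The players, shooting percentages, and schemes are invented for illustration, and each player is assumed (for simplicity) to take one shot per possession; the defence would prefer the scheme with the smaller expected total.

```python
# Hypothetical shooting profiles: (points per make, P(make | guarded), P(make | unguarded)).
players = {
    "star":        (2, 0.40, 0.80),
    "role_player": (2, 0.35, 0.50),
}

# Hypothetical defensive schemes: P(guarded) for each player under that scheme.
schemes = {
    "double_the_star": {"star": 0.90, "role_player": 0.40},
    "man_to_man":      {"star": 0.70, "role_player": 0.70},
}

def expected_points(player, p_guarded):
    pts, p_make_guarded, p_make_open = players[player]
    # Law of iterated expectation: casework on whether the player is guarded.
    return pts * (p_make_guarded * p_guarded + p_make_open * (1 - p_guarded))

for scheme, coverage in schemes.items():
    total = sum(expected_points(p, coverage[p]) for p in players)
    print(scheme, round(total, 2))  # lower is better for the defence
```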
You ask the Oracle to foretell when you will meet your soul mate.
Oracle: You'll have a 52% chance to meet your soul mate tomorrow. Whether you'll see her today will affect whether you'll see her tomorrow.
You: Then what is my chance of meeting her today?
Oracle: I shall not speak Heaven's truth. All I can tell you is that your chance of seeing her tomorrow will be doubled if you see her today, and the chance of not seeing her tomorrow will be tripled if you don't see her today.
What is the probability (in percentage) of seeing your soul mate today?
Bayes' theorem and joint distributions
An important theorem that can simplify such reasoning is the multiplication rule for joint probabilities:
\[P(A \cap B) = P(A) \cdot P(B|A) = P(B) \cdot P(A|B)\]
This theorem makes logical sense: the probability that events \(A\) and \(B\) both occur is the same as the probability that \(A\) occurs, then \(B\) occurs. The probability that \(A\) occurs is simply \(P(A)\), and the probability that \(B\) subsequently occurs is \(P(B|A)\). Equivalently, the order can be reversed, leading to the second equality.
Note that when \(A,B\) are independent, this law becomes the more familiar \(P(A\cap B) = P(A) \cdot P(B)\).
This theorem is important because it allows \(P(A|B)\) to be calculated from \(P(A \cap B)\) and \(P(B)\); conditional probabilities like these are exactly what the law of iterated expectation requires.
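A small enumeration sketch makes both identities concrete. The two-dice setup below is an illustrative assumption, not taken from the text: \(A\) is "the first die shows 6" and \(B\) is "the sum is at least 10".

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely rolls of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
total = Fraction(len(outcomes))

P_A = Fraction(sum(1 for d1, d2 in outcomes if d1 == 6)) / total
P_B = Fraction(sum(1 for d1, d2 in outcomes if d1 + d2 >= 10)) / total
P_A_and_B = Fraction(sum(1 for d1, d2 in outcomes if d1 == 6 and d1 + d2 >= 10)) / total

P_B_given_A = P_A_and_B / P_A
P_A_given_B = P_A_and_B / P_B

# Both factorizations of the joint probability agree.
assert P_A_and_B == P_A * P_B_given_A == P_B * P_A_given_B
print(P_A_given_B)  # 1/2
```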
Horace turns up at school either late or on time. He is then either shouted at or not. The probability that he turns up late is \(0.4.\) If he turns up late, the probability that he is shouted at is \(0.7\). If he turns up on time, the probability that he is still shouted at for no particular reason is \(0.2\).
You hear Horace being shouted at. What is the probability that he was late?
This law can also be rearranged into Bayes' theorem, which states that:
\[P(B|A) = \frac{P(A|B)P(B)}{P(A)}\]
which allows for the same calculations as above.
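A minimal sketch of Bayes' theorem with made-up numbers (none of them come from the problems in this article), where the denominator \(P(A)\) is expanded by the same casework used throughout:

```python
# Assumed numbers for illustration only.
P_B = 0.3              # prior probability of B
P_A_given_B = 0.9      # probability of A if B occurs
P_A_given_not_B = 0.2  # probability of A if B does not occur

# P(A) by casework on B (law of total probability).
P_A = P_A_given_B * P_B + P_A_given_not_B * (1 - P_B)

# Bayes' theorem: P(B | A) = P(A | B) P(B) / P(A).
P_B_given_A = P_A_given_B * P_B / P_A
print(round(P_B_given_A, 3))  # approximately 0.659
```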
On planet Brilliantia, there are two types of creatures: mathematicians and non-mathematicians.
Mathematicians tell the truth \(\frac{6}{7}\) of the time and lie only \(\frac{1}{7}\) of the time, while non-mathematicians tell the truth \(\frac{1}{5}\) of the time and lie \(\frac{4}{5}\) of the time.
It is also known that there is a \(\frac{2}{3}\) chance that a creature from Brilliantia is a mathematician and a \(\frac{1}{3}\) chance that it is a non-mathematician, but there is no way of differentiating between these two types.
You are visiting Brilliantia on a research trip. During your stay, you come across a creature who states that it has found a one line proof for Fermat's Last Theorem. Immediately after that, a second creature shows up and states that the first creature's statement was a true one.
If the probability that the first creature's statement was actually true is \(\frac{a}{b}\), for some coprime positive integers \(a, b\), find the value of \(b - a\).