Imagine I offer you two kinds of games; let's call them Game A and Game B.

In Game A, you bet \($1\) that a coin I'm going to toss turns up heads; the payout is even (you gain \($1\) if you win and lose \($1\) if you lose). But I don't play fair: the probability that the coin comes up heads is just \(0.2\), so your expected gain per play is \(0.2 \cdot ($1) + 0.8 \cdot (-$1) = -$0.6\). Clearly this is a losing game; you won't want to play it.

In Game B, you bet \($4\) that a coin I'm going to toss turns up heads; the payout is even (you gain \($4\) if you win but you lose \($4\) if you lose). Unlike the above, I now have two coins! One coin is that unfair one from above, turning up heads with probability \(0.2\), but another one is unfair to your favor, turning up heads with probability \(0.7\). So how do I select which coin to use? I will look at the previous game you played; if you won it, I'll use the bad coin (probability of heads \(0.2\)), but if you lost it, I'll be generous and use the good coin (probability of heads \(0.7\)). For ease of discussion, suppose that in the first game you get the good coin; it turns out this doesn't matter.

Let's analyze Game B. Suppose in the long run, the probability of winning Game B is \(w\), and the probability of losing it is \(l\). Then we get the following system:

\(\begin{cases} w &= 0.2w + 0.7l \\ l &= 0.8w + 0.3l \\ w+l &= 1 \end{cases}\)

The first equation is simply "P(win this game) = P(won last game) × P(bad coin gives heads) + P(lost last game) × P(good coin gives heads)", and the second equation is similar but with losing chances instead. The third equation follows from total probability: you either win or lose this game, so the probabilities sum to \(1\). Solving the system gives \(w = \frac{7}{15}, l = \frac{8}{15}\). Thus on average you will win only \(7\) of every \(15\) games and lose \(8\). Since the payout for a win equals the cost of a loss, in the long run you will lose \($4\) every \(15\) games, so this is a losing game.
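The system above is small enough to solve exactly in a few lines of Python; here is a sketch (the variable names are mine, not from the note):

```python
from fractions import Fraction

# Long-run (stationary) probabilities for repeated Game B.
# From w = p_bad*w + p_good*l and w + l = 1:
p_bad = Fraction(2, 10)   # P(heads | bad coin), used after a win
p_good = Fraction(7, 10)  # P(heads | good coin), used after a loss

w = p_good / (1 - p_bad + p_good)  # solve w = p_bad*w + p_good*(1 - w)
l = 1 - w
print(w, l)  # 7/15 8/15
```

Using exact fractions avoids any floating-point fuzz in the \(\frac{7}{15}\) and \(\frac{8}{15}\) answers.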

But what happens when you play Game A and Game B alternately, in the sequence \(ABABABAB \ldots\)?

Observe that each Game B result depends only on the Game A played just before it, so we might as well define a new game, Game C, which plays one Game A followed by one Game B; our sequence becomes \(CCCC \ldots\). Let's analyze Game C.

There are essentially four outcomes in Game C:

- You win Game A and win Game B. This happens with probability \(0.2 \cdot 0.2 = 0.04\), and gives payout \($1 + $4 = $5\).
- You win Game A and lose Game B. This happens with probability \(0.2 \cdot 0.8 = 0.16\) and gives payout \($1 - $4 = -$3\).
- You lose Game A and win Game B. This happens with probability \(0.8 \cdot 0.7 = 0.56\) and gives payout \(-$1 + $4 = $3\).
- You lose Game A and lose Game B. This happens with probability \(0.8 \cdot 0.3 = 0.24\) and gives payout \(-$1 - $4 = -$5\).

Computing the expected gain from Game C, we obtain \(0.04 \cdot $5 + 0.16 \cdot (-$3) + 0.56 \cdot $3 + 0.24 \cdot (-$5) = $0.2\). We have a positive expected value! And since every instance of Game C is independent, playing Game C repeatedly will simply magnify the expected gain. Thus the sequence \(CCCC \ldots\) is winning! But this is the sequence \(ABABABAB \ldots\), made up of two losing games! How can we obtain a game that is winning from two losing games?
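To double-check the arithmetic, here is a short sketch that enumerates the four outcomes of Game C (the names are my own):

```python
# Enumerate the four outcomes of Game C (one Game A, then one Game B).
expected = 0.0
for win_a in (True, False):
    p_a = 0.2 if win_a else 0.8        # P(heads) in Game A is 0.2
    gain_a = 1 if win_a else -1        # even $1 payout
    p_heads_b = 0.2 if win_a else 0.7  # bad coin after a win, good coin after a loss
    for win_b in (True, False):
        p_b = p_heads_b if win_b else 1 - p_heads_b
        gain_b = 4 if win_b else -4    # even $4 payout
        expected += p_a * p_b * (gain_a + gain_b)

print(expected)  # ≈ 0.2
```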

This is also known as Parrondo's paradox. By playing two losing games in some sequence, you might be able to make it into a winning game!

Note that the two games must be dependent in some way. In the above, Game B depends on the result of the previous game, and this is the trick: we make sure that Game B depends on Game A instead of another Game B. We computed that Game B is losing, but only when the previous game is another Game B; if it's Game A (or some other game, like a "you always lose" game), then Game B is winning. You can prove that if the games are independent, then playing them in any sequence still loses in expectation.
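A quick Monte Carlo sketch of the alternating sequence (a simulation of my own, not from the note) shows the positive drift directly:

```python
import random

random.seed(0)

def average_gain(rounds):
    """Simulate the sequence ABAB... and return the average gain per AB pair."""
    total = 0
    for _ in range(rounds):
        # Game A: even $1 bet on a coin with P(heads) = 0.2
        win_a = random.random() < 0.2
        total += 1 if win_a else -1
        # Game B: even $4 bet; the coin depends on the game just played (Game A)
        p_heads = 0.2 if win_a else 0.7
        total += 4 if random.random() < p_heads else -4
    return total / rounds

print(average_gain(200_000))  # close to the expected $0.2 per pair
```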


## Comments

I believe that "but another one is unfair to your favor, turning up heads with probability 0.7" should be "fair" instead?

Here's a simpler way to explain the paradox:

There is a light in the room, which could randomly turn on or off every minute. Game A: Check if the light is on. If it is on, you gain $10. If it is off, you lose $20.

Game B: Check if the light is off. If it is off, you gain $30. If it is on, you lose $40.

Each minute, you can choose whether you want to play Game A or B, and you can decide to play Game B (or A) even after you play Game A (or B) and the result is revealed.

What is the maximum expected value of playing this game smartly?

Of course, with this explanation, it becomes much clearer how the dependence can turn a negative expected value into a positive one. This makes the paradox easier to think about.


A fair coin turns up heads exactly \(0.5\) of the time. The one that turns up heads \(0.7\) of the time is unfair (biased), but to the player's favor.

The thing with my example is that you decide the sequence beforehand, while your example decides the sequence on the go. In your game, the bulk of the game relies on smartly choosing the sequence, and thus positive expected value can be attributed to correct selections; in my game, you can't choose the sequence any more and must accept whatever fate gives you from the coin tosses, so positive expected value must come from fate.


Ah yes, you are right; I wasn't reading it correctly when I made the first statement.

My point is that when trying to explain a paradox, it is best to strip away the "fancy details" and boil it down to the essentials. In this case, the convoluted setup makes it harder for someone to follow, and they might end up thinking "hm, this game is intentionally tricky and that's the reason why I was wrong", as opposed to "oh, now I see the reason for the paradox".

I understand what you are saying in terms of the difference of the game. The distinction boils down to how you encode "So how do I select which coin to use?", as explained by "two games must be dependent in some way".

Game A:

Check if light is on. If light is on, you get $10. If light is off, you lose $20.

Game B:

If you won Game A: Check if light is on. If light is on, you win $0, if light is off, you lose $10.

If you lost Game A: Check if light is off. If light is off, you win $30, if light is on, you lose $40.

Now, it is pretty clear that no matter if the light is on or off, if you play the game AB, then you will always get $10.

Here's a follow up question. How can one use this idea to make money in the real world?

Interestingly, there are lots of examples in trading. The idea is that Game A gives you significant information, which is why you are willing to pay (lose money) for it, and then make it back and more in Game B.


Wikipedia gave the following example:

Game A: You lose \($1\) every time you play.

Game B: If your money is even, win \($3\). Otherwise lose \($5\).

That is a simple example too, where if your initial money is odd, playing AB gives you \($2\) each time. I tried to find an example that doesn't depend on how much money you have; it turns out to be hard, or it eludes me for the moment.
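Wikipedia's example is easy to verify in a few lines (the function name is mine):

```python
def play_ab(money, rounds):
    """Wikipedia's games: A always loses $1; B wins $3 if money is even, else loses $5."""
    for _ in range(rounds):
        money -= 1                             # Game A
        money += 3 if money % 2 == 0 else -5   # Game B
    return money

print(play_ab(1, 10))  # starting odd: +$2 per AB round, so 1 + 20 = 21
```

Note that after each AB round the money is odd again, so the +$2 gain repeats forever.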


This is very interesting.

Why don't you post a wiki article on it? Wikis are more permanent than notes.


...because I'm not used to wikis yet. Let's see...


Oh what subject would this be on? Is there a Paradoxes section?


Even if there's none, this fits in some probability section, about playing games of chance and such.


I would love to create a paradoxes section. Simply get started with "Post Something - Wiki", and we will add them to the relevant sections.
