
The St. Petersburg Paradox

This paradox involves a casino game based on coin flips. The game goes as follows: first, you pay a buy-in price set by the casino. You then flip a coin until you get a heads. If your first heads comes on the \(n\)-th flip, you receive \(2^{n-1}\) dollars. So if your first flip is heads you get \(\$1\), if you get \(TH\) you get \(\$2\), and so forth.
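If you want to experiment before answering, here is a minimal Python sketch of a single round (the function name is just an illustrative choice, not part of the original note):

```python
import random

def play_once():
    """Play one round: flip a fair coin until the first heads and
    pay out 2**(n-1) dollars, where n is the flip on which heads appears."""
    n = 1
    while random.random() < 0.5:   # tails: keep flipping
        n += 1
    return 2 ** (n - 1)

print([play_once() for _ in range(10)])   # a few sample payouts
```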

Here are some questions:

\(1.\) Theoretically, why might you or might you not want to play this game? For what buy-in price? Think about (or compute) the expected value of the game.

\(2.\) Why might your answer to the last question be different in reality? (Hint: money is not infinite, and neither is your time.)

Note by Michael Tong
2 years, 8 months ago




(Some of this is adapted from Mathematics Without the Boring Bits, by Richard Elwes)

The probability that the first heads comes on the \(k\)th flip is \(\frac{1}{2^k}\), because you would have to flip tails on each of the first \(k-1\) flips and then a heads: that is \(1\) outcome out of \(2^k\) equally likely possibilities. In that case you win \(\$2^{k-1}\), so the contribution of the \(k\)th flip to your expected winnings is \(\frac{1}{2^k}\times\$2^{k-1}=\$0.50.\) Since \(k\) can be any positive integer, the expected winnings of the game (summed term by term) are \(\$0.50+\$0.50+\$0.50+\$0.50+\cdots=\infty.\) So the game has infinite expected winnings.
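As a quick sanity check on this reasoning, a few lines of Python with exact fractions show that every term contributes \(\$0.50\) and that the partial sums grow without bound:

```python
from fractions import Fraction

# P(first heads on flip k) * payout = (1/2**k) * 2**(k-1) = 1/2 for every k
for k in range(1, 6):
    print(k, Fraction(1, 2**k) * 2**(k - 1))   # 1/2 each time

# The partial sum of the first K terms is K/2, which grows without bound
print(sum(Fraction(1, 2**k) * 2**(k - 1) for k in range(1, 101)))   # 50
```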

Let's suppose there is a buy-in. Make it \(\$2.50.\) After \(8\) games you will have spent \(8\times\$2.50=\$20.\) In that time you would expect to win \(\$1\) four times, \(\$2\) twice, \(\$4\) once, and \(\$8\) or more once. That sums to \(\$4+\$4+\$4+\$8=\$20,\) so you have broken even. With a few more games you can expect to be in profit. No matter how large the buy-in, you will come out on top in the long run. But sometimes, when I say long run, I mean a \(\textit{really long run.}\)
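A rough simulation makes the "long run" point concrete (the buy-ins and game counts below are illustrative choices, not figures from the original):

```python
import random

def play_once():
    # Same single-round simulation as above: payout 2**(n-1) for first heads on flip n
    n = 1
    while random.random() < 0.5:
        n += 1
    return 2 ** (n - 1)

def net_profit(games, buy_in):
    """Total winnings minus total buy-ins after playing `games` rounds."""
    return sum(play_once() for _ in range(games)) - games * buy_in

# At a $2.50 buy-in you usually drift into profit after a modest number of games;
# at an $11 buy-in, a thousand games almost always leaves you deep in the red.
print(net_profit(1_000, 2.50))
print(net_profit(1_000, 11.00))
```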

Suppose the buy-in is \(\$11.\) It would take you about \(2^{20}\) games to come out ahead. At a flip every five seconds, that is months of continuous flipping. In addition, you would need about \(\$11.5\) million to weather the losses until you finally win. So in essence, it isn't practical. You'll win eventually, but you'll run out of money, time, and willpower before you do. So the casino will always win.

It's the same with the lottery. Believe it or not, you can expect to win the lottery. Suppose a ticket on which you pick \(6\) numbers out of \(50\), with a constant \(\$200\) million jackpot. If any of your numbers are wrong, you get nothing. The ticket costs \(\$2.\)

There are \(\binom{50}{6}\approx16\) million ways to pick your numbers, so the probability that your numbers match the winning numbers is about one in sixteen million. Your expected winnings are found by multiplying each possible prize by its probability and adding the results: here, \(\$200{,}000{,}000\times\frac{1}{16{,}000{,}000}+\$0\times\frac{15{,}999{,}999}{16{,}000{,}000}=\$12.50.\) Subtracting the \(\$2\) you spent on the ticket, your expected profit per ticket is \(\$10.50.\) But it is overwhelmingly likely that you will not win within your lifetime, so you almost certainly won't make that money back. You'll end up losing.
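For anyone who wants to check the arithmetic, here is a quick Python version (using the exact binomial count, which gives slightly different figures than the rounded sixteen million above):

```python
from math import comb

combinations = comb(50, 6)            # 15,890,700 possible tickets
jackpot = 200_000_000
ticket_price = 2

expected_winnings = jackpot / combinations
print(combinations)                                  # 15890700
print(round(expected_winnings, 2))                   # 12.59 (about $12.50 with the rounded count)
print(round(expected_winnings - ticket_price, 2))    # 10.59 expected profit per ticket
```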

Now yes, that's not exactly how lotteries work, but it's good enough for a simple example. Even if you use a real lottery, you are still expected to profit, but it probably won't happen in your lifetime. So the lottery will always win. Trevor B. · 2 years, 8 months ago


@Trevor B. The lottery example is a great one. The gap between the theoretical expectation and the practical outcome is even more pronounced there. Michael Tong · 2 years, 8 months ago


You can watch a video on this on Numberphile: Infinity Paradoxes Sharky Kesa · 2 years, 8 months ago


If you are to win on the \(n\)th flip then you must get a sequence of \(n-1\) tails followed by a head. Such a sequence has probability \(\left(\dfrac{1}{2}\right)^n\). So if you get your first head on the \(n\)th flip, your expected win is the probability times the payout, or \(\dfrac{1}{2^n}\times 2^{n-1}=\dfrac{1}{2},\) irrespective of \(n\). So in fact the expected value of any game is \(+\dfrac{1}{2},\) so you should buy in for any price less than \(\$0.50\), or, if you're just playing for fun, equal to \(\$0.50\). Josh Rowley · 2 years, 8 months ago


@Josh Rowley Not quite. You must take the infinite sum of these contributions to get the true expected value. That is, \(\displaystyle \sum_{n = 1}^{\infty} \frac{1}{2} \rightarrow \infty\).

So the question is, why is it counterintuitive that the expected value is infinite? And in real life, is this a good deal, even though the expected value is technically infinite? Michael Tong · 2 years, 8 months ago


@Michael Tong I did that first, but an infinite expected value seemed a bit ridiculous (I guess that's why it's a paradox). Josh Rowley · 2 years, 8 months ago


