Continued from this note.

So far, we've talked about gambling in a setting where you've decided on a threshold for when to stop, in terms of a target gain. (And of course you stop if you are ruined.)

But now, what happens if we don't have to worry about death and taxes? (What a wonderful world!) Can you just keep gambling forever?

You walk into Martin Gale's Betting Room with an initial budget of \(k\).

As usual, you can play the following game any number of times (if you still have what it costs):

- You pay Martin a dollar.
- Martin tosses a fair coin.
- If the coin comes up heads, Martin pays you two dollars. If it comes up tails, you get nothing.

Your plan is to just keep playing indefinitely, if you can.
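Not part of the original note, but if you like to check such things empirically, here is a quick Python sketch of the game. The function names (`play_game`, `play_until`) are my own inventions for illustration; each round is just a fair ±1 step in your wealth.

```python
import random

def play_game(wealth):
    """One round in Martin Gale's Betting Room: pay $1, win $2 on heads.
    Net effect: wealth moves up or down by exactly one dollar."""
    wealth -= 1                   # pay Martin a dollar
    if random.random() < 0.5:     # fair coin: heads
        wealth += 2               # Martin pays two dollars
    return wealth

def play_until(k, target):
    """Play from budget k until ruin (wealth 0) or until the target is reached.
    Returns the final wealth, which is either 0 or target."""
    wealth = k
    while 0 < wealth < target:
        wealth = play_game(wealth)
    return wealth
```

Running `play_until` many times lets you estimate, for instance, how often you double your money before going broke.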

Which of the following is true?

1) You will lose all your money within \(O(k^2)\) games.

2) You will lose all your money within \(O(2^k)\) games.

3) There is an integer \(n = n(k)\) such that it is impossible to increase your wealth to \(n\).

4) For each integer \( n > k\), you have a positive probability of increasing your wealth to \(n\), hence you have a positive probability of being able to play indefinitely.

5) For each integer \( n > k\), you have a positive probability of increasing your wealth to \(n\), but you will inevitably eventually lose all your money.

**Note:** You may assume that Martin's budget is infinite. In fact, this is implicit in the fact that you are even allowed, in principle, to play indefinitely. If Martin had a finite budget \(B\), then the situation would be equivalent to your deciding to play up to the threshold \(k+B\).

SPOILERS AHEAD

.

.

.

.

.

We already know from the previous problems that if you fix any \(n > k\), then the probability that you increase your wealth to \(n\) is \(k/n\). Nevertheless, you cannot keep gambling indefinitely. Here is why.

Imagine the infinite sequence of games you are planning to play, and group them into batches using what is called a "doubling trick", as follows. The first batch is all the games until you either lose all your money or increase it to \(2k\), whichever comes first. As we have already seen, you have probability 1/2 of reaching \(2k\) and probability 1/2 of ruin. If you are still around after the first batch, your new budget is \(2k\), and the second batch is all the games until you either lose all your money or increase it to \(4k\), whichever comes first. Again, you have probability 1/2 of emerging from the second batch unruined, with a new budget of \(4k\). Inductively, the \(j\)th batch starts out with a budget of \(2^{j-1} k\) and continues until it has either ruined you (with probability 1/2) or increased your budget to \(2^j k\).

We have effectively reduced the problem to a sequence of fair coin flips: Heads you're ruined, Tails you get to try again. But the probability of an infinite sequence of Tails is zero, so you inevitably flip Heads eventually and lose all your money.
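The batch structure above can be sketched directly in Python. This is my own illustration, not from the note: each batch is abstracted into a single fair coin flip, and we count how many batches you survive before the inevitable Heads. (The cap on the number of batches is just a safety net; the probability of reaching it is \(2^{-64}\), effectively zero.)

```python
import random

def batches_survived(max_batches=64):
    """Each batch is one fair coin flip: Tails, your wealth doubles and you
    play on; Heads, you are ruined.  Returns the number of batches survived.
    Capped at max_batches, since an infinite run of Tails has probability 0."""
    j = 0
    while j < max_batches and random.random() < 0.5:
        j += 1
    return j
```

The chance of surviving \(j\) batches is \(1/2^j\), so in simulation every run ends in ruin after, on average, about one batch.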

So how long do you expect to play before you go bust?

1) \(O(k^2)\)

2) \(\infty\)

3) \(O(2^k)\)

4) \(O(k^k)\)

You walk into Martin Gale's Betting Room with an initial budget of \(k\).

Again, you can play the following game any number of times (if you have what it costs):

- You pay Martin a dollar.
- Martin tosses a fair coin.
- If the coin comes up heads, Martin pays you two dollars. If it comes up tails, you get nothing.

You decide to keep playing, meaning that the only thing that will cause you to stop is losing all your money.
How many games do you **expect** to play before you stop?

**Note:** As in problem 6 of this series, you may assume Martin's budget is infinite.

SPOILERS AHEAD

.

.

.

.

.

And now this is the confounding part! The expected time until you stop is infinite. How can we see this?

We subdivide the games into batches as before. Let \(X_j\) be the number of games played in the \(j\)th batch. If you do not survive until the \(j\)th batch, then \(X_j = 0\). Let \(X = \sum_{j} X_j\) be the total number of games played. By linearity of expectation, the expected number of games played is the sum of the expectations of the \(X_j\)s.

Recall that the \(j\)th batch of games goes from a budget of \(2^{j-1} k\) until your wealth is doubled or you are ruined. To have made it to this batch at all, you must first have survived the previous \(j-1\) batches, which we showed is equivalent to \(j-1\) favorable coin flips and happens with probability \(1/2^{j-1}\). Thus with probability \(1 - 1/2^{j-1}\), \(X_j = 0\). Conditioned on \(X_j\) being positive, we saw in an earlier problem that if you play from budget \(B\) until you reach either \(2B\) or zero, the expected number of games is \(B^2\); so the expected number of games in the \(j\)th batch, conditioned on reaching it, is \((2^{j-1} k)^2\). Therefore \[ \mathbb{E}(X_j) = \frac{(2^{j-1} k)^2}{2^{j-1}} = 2^{j-1} k^2, \] which grows as a function of \(j\). So the series is clearly divergent, and \(\mathbb{E}(X)\), the expected number of games until you are ruined, is infinite.
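The computation above is simple enough to check symbolically. Here is a small Python sketch (mine, not the note's) of the term \(\mathbb{E}(X_j) = 2^{j-1} k^2\) and its partial sums; the partial sums grow geometrically, which is exactly why the full series diverges.

```python
def expected_games_in_batch(j, k):
    """E(X_j) = (prob. of reaching batch j) * (expected batch length)
             = (1 / 2**(j-1)) * (2**(j-1) * k)**2 = 2**(j-1) * k**2."""
    return 2 ** (j - 1) * k ** 2

def partial_sum(m, k):
    """Expected total games over the first m batches; equals k**2 * (2**m - 1),
    a geometric sum that blows up as m grows."""
    return sum(expected_games_in_batch(j, k) for j in range(1, m + 1))
```

For example, with \(k = 1\) the first five terms already sum to \(31 = 2^5 - 1\), and doubling continues forever: the expectation is infinite.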

Is your head spinning yet?

As a final word, the doubling trick I've used here is probably not the simplest way to tackle the particular problems in this note. It is, however, a powerful tool that applies more generally to other problems in theoretical computer science, some of which I hope to write about soon.

To be continued.
