Suppose you start off with $1 and play a (binary) game that you win with probability $\frac{1}{2}$. If you win, you gain $1; if you lose, you lose $1. Say you win this game; you now have $2. Then split the $2 into two lots of $1 and play the game again, but now twice (once per dollar). This process is repeated ad infinitum; in other words, whenever you have $n$ dollars, you split them into $n$ lots of $1 and play $n$ independent instances of the game. Note that at the end of each step you pool your winnings and start the process over; the process ends when you are left with $0.
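The process above can be sketched with a quick Monte Carlo simulation (the function names and the round cap are my own; the cap only keeps runs finite):

```python
import random

def play_round(n, p_win=0.5):
    """Play one round: each of the n dollars is staked on an independent
    game; a win returns $2, a loss returns $0."""
    return sum(2 for _ in range(n) if random.random() < p_win)

def simulate(start=1, p_win=0.5, max_rounds=50):
    """Run the process until ruin ($0) or the round limit; return the
    final bankroll."""
    money = start
    for _ in range(max_rounds):
        if money == 0:
            break
        money = play_round(money, p_win)
    return money

random.seed(0)
ruined = sum(simulate() == 0 for _ in range(10_000))
print(f"fraction ruined within 50 rounds: {ruined / 10_000:.3f}")
```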

Here are some questions that may be of interest:

How do you calculate the probability of having $x$ dollars at a given point in (discrete) time?

What is the long-term probability of running out of money, or of your wealth growing without bound?

Does this long-term probability change when you vary the probability of winning the game? What if you vary the starting amount?
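For the first question, the exact distribution after each round can be pushed forward one binomial step at a time. A minimal sketch, assuming the process as defined above (the absorbing state at $0$ is handled explicitly):

```python
from collections import defaultdict
from math import comb

def step_distribution(dist, p_win=0.5):
    """One round: each dollar independently becomes $0 (loss) or $2 (win),
    so a bankroll of x dollars becomes 2 * Binomial(x, p_win)."""
    new = defaultdict(float)
    for x, p in dist.items():
        if x == 0:
            new[0] += p  # $0 is absorbing: once broke, always broke
            continue
        for wins in range(x + 1):
            prob = comb(x, wins) * p_win**wins * (1 - p_win)**(x - wins)
            new[2 * wins] += p * prob
    return dict(new)

dist = {1: 1.0}  # start with $1
for _ in range(2):
    dist = step_distribution(dist)
print(sorted(dist.items()))  # [(0, 0.625), (2, 0.25), (4, 0.125)]
```

Note that the expectation $\sum_x x \cdot P(x)$ stays at $1$ after every round, consistent with the fair game.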

Answering these questions may provide a sturdy bridge to the study of branching processes. This problem has eluded me for years, and I have since moved on to greener pastures; although I would like to return to it, I post it here to garner some interest.



## Comments


Some findings: The expected amount of money at any stage is $1. (Not surprising.)

The probability of ending up at $0$ in $n$ steps or fewer is given by https://oeis.org/A187131 divided by $2^{2^{n-1}}$.

The numerators of the cumulative probabilities of ending up at $0$ are https://oeis.org/A167424; see the examples there for the probabilities themselves. Not surprisingly, these fractions tend to $1$: you will eventually run out of money.
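These cumulative probabilities can be reproduced by iterating the offspring generating function $f(z) = \frac{1}{2} + \frac{1}{2}z^2$, since $q_{n+1} = f(q_n)$ with $q_0 = 0$. A sketch using exact rational arithmetic (function name is mine):

```python
from fractions import Fraction

def ruin_probs(rounds, p_win=Fraction(1, 2)):
    """q_n = P(broke within n rounds), via q_{n+1} = (1-p_win) + p_win*q_n**2."""
    q, out = Fraction(0), []
    for _ in range(rounds):
        q = (1 - p_win) + p_win * q * q
        out.append(q)
    return out

print(ruin_probs(4))
# [Fraction(1, 2), Fraction(5, 8), Fraction(89, 128), Fraction(24305, 32768)]
```

The fourth denominator, $2^{15} = 32768$, matches the "all out of 2^15" figure quoted below for the distribution after four rounds.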

There is zero probability of reaching infinite money, but there is a positive probability of ever exceeding any given finite amount.

After $4$ rounds, here is the probability distribution, all out of $2^{15}$.


Define a branching process in which each node has $0$ children w.p. $\frac{1}{2}$ (corresponding to losing that particular game) and $2$ children w.p. $\frac{1}{2}$ (corresponding to winning it). The long-term probability of running out of money equals the extinction probability of this branching process, which is the smallest non-negative root of the fixed-point equation $z = \frac{1}{2} + \frac{1}{2}z^2$, namely $1$. Hence we will run out of money eventually, almost surely.

Keeping the probability of winning $p_w$ the same (i.e., $\frac{1}{2}$), we still run out of money eventually almost surely, for any finite amount of starting money. For general $p_w$, the extinction probability is the smallest non-negative root of the fixed-point equation $z = p_w z^2 + (1 - p_w)$, which is $\alpha = \min\left(1, \frac{1-p_w}{p_w}\right)$.

If we had instead started with $\$N$, the probability of eventually running out of money becomes $\alpha^N$, since each of the $N$ initial dollars heads an independent branching process and all $N$ lines must die out.
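The two closing claims can be checked numerically; a minimal sketch under the assumptions above (the function name is mine, and $p_w > 0$ is assumed):

```python
def ruin_probability(p_win, start=1):
    """P(eventual ruin) starting with `start` dollars.
    alpha is the smallest non-negative root of z = p_win*z^2 + (1 - p_win),
    i.e. min(1, (1 - p_win)/p_win); the `start` initial dollars head
    independent branching processes, so all must die out: alpha**start."""
    alpha = min(1.0, (1 - p_win) / p_win)
    return alpha ** start

print(ruin_probability(0.5))      # 1.0 -- fair game: ruin is certain
print(ruin_probability(0.75))     # 1/3
print(ruin_probability(0.75, 3))  # (1/3)**3
```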
