Payoff 1.

Toss 5 coins. You get $1 for each consecutive pair HT that you get.

Payoff 2.

Toss 5 coins. You get $1 for each consecutive pair HH that you get.

The consecutive pairs are allowed to overlap.

For example, if you tossed HTHHH, under payoff 1 you will get $1, but under payoff 2 you will get $2.
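To make the overlapping-pair rule concrete, here is a short Python sketch (the function name `payoff` is just an illustrative choice) that counts overlapping pairs in a toss string:

```python
def payoff(tosses: str, pair: str) -> int:
    """Count overlapping occurrences of `pair` in the toss string."""
    return sum(1 for i in range(len(tosses) - 1) if tosses[i:i + 2] == pair)

# The example from the problem: HTHHH
print(payoff("HTHHH", "HT"))  # 1 -> $1 under payoff 1
print(payoff("HTHHH", "HH"))  # 2 -> $2 under payoff 2 (HHH counts as two HH pairs)
```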



## Comments

This is a trick question: both payoffs have the same expected value of profit, which is $1.

Consider this:

We will count the number of ways HT can occur. We can have:

HT_ _ _ : 8 ways

_ HT _ _ : 8 ways

_ _ HT _ : 8 ways

_ _ _ HT : 8 ways

Total number of ways: 32

Similarly for HH, the total number of ways in which it can occur is 32.

What this means is that the sum of the payoffs of HT and HH over the 32 possible outcomes is $32 for both. The payoffs may be distributed differently for HT and HH, but their expected value is the same.

Mathematically, the expected value is the same. But there is one more thing left to do: analyze the probability distribution. We can easily see that payoff 2 might be desirable because it tends to give larger profits than payoff 1 for certain events. E.g. HHHHH gives $4 under payoff 2 and $0 under payoff 1; HHHHT gives $3 under payoff 2 and $1 under payoff 1.

But the thing is that these events are not very common. Using the method of reflection and analyzing the 16 possibilities, we see that there are many events for which payoff 2 gives a $0 payoff: there are 13 outcomes for which payoff 2 gives $0, but only 6 where payoff 1 gives $0.
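Those two counts can be verified by brute force over all 32 sequences. A minimal sketch (the helper `payoff` is a hypothetical name for the overlapping-pair counter):

```python
from itertools import product

def payoff(tosses, pair):
    # Count overlapping occurrences of `pair` in the toss string.
    return sum(tosses[i:i + 2] == pair for i in range(len(tosses) - 1))

# All 2^5 = 32 equally likely toss sequences
seqs = ["".join(p) for p in product("HT", repeat=5)]
zero_p1 = sum(payoff(s, "HT") == 0 for s in seqs)
zero_p2 = sum(payoff(s, "HH") == 0 for s in seqs)
print(zero_p1, zero_p2)  # 6 outcomes pay $0 under payoff 1, 13 under payoff 2
```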

So if I'm given this choice, I'd take payoff 1, as then I have a better chance of leaving with at least a dollar in my pocket.

Since this is in quantitative finance, should I talk about risk and stuff? I do not know.


I think that there may be some double-counting here. For example, you've counted HTHTT and HTHTH twice each. There are only \(2^{5} = 32\) possible sequences in total, and there are some, such as TTHHH, that have no HT payouts, so there cannot be \(32\) HT outcomes. I'm getting an average payout of \(\dfrac{28}{32}\) dollars in the first game type and \(\dfrac{31}{32}\) dollars in the second, but I am too tired to double-check. I'll do that tomorrow. :)

Edit: Oh, now I see what you've done; very clever. I must have slipped up in my count somewhere. I'll still wait to confirm in the morning. And yes, there does seem to be an advantage to choosing p1 (even though the expected winnings are the same), in the sense that you are more likely to at least win some money.


Is "more likely to at least win some money" the main consideration that you will use if the expected payoff is equal?

For example, what would you choose between:

Payoff 1: 50% chance of $0, 50% chance of $100

Payoff 2: 99% chance of $1.01, 1% chance of $4900


If Payoff 2 were 99% chance of $0 and 1% chance of $5000, then it gets interesting (at least for me). Even such a slight chance of winning $5000 would be hard to pass up, so I would probably go with Payoff 2, unless I absolutely needed that $100 right away.

If Payoff 2 were 99.9% chance of $0 and 0.1% chance of $50,000 ... hmmm ... that's a lot of dough, so I'd go with Payoff 2. However, if the options were the last two I've listed, then I might choose the $5000 option.

If Payoff 2 involved a 50% chance of losing $100 and a 50% chance of winning $200, then I would go with Payoff 1, since I'm not comfortable with such a good chance of losing a fair bit of money. There is a lot of psychology going on in these choices, and yet the expected winnings are always $50 (which, ironically, is an amount that would never actually be won in any of these scenarios).


So, there are multiple ideas here.

- Is the probability of a positive payoff so important that merely moving it above $0 would greatly influence your preferences?

- At what point does the potential promise of a huge payoff outweigh the certainty of a small payoff?


Yes, the expected payoff for both scenarios is $1. The easiest way to see this is to look at Indicator Variables, which is essentially what Raghav did (though not phrased in that language).

Well, everyone has their own "payoff preference". For example, you stated that you would prefer to "be more likely to leave with at least a dollar in your pocket". As such, this tells me that you are very risk averse, and that you would prefer the certainty of a positive payoff.

If you want to talk about "risk and stuff", what do you need to consider, and why?


I am not formally educated in these topics, so I'm just saying all this from a logical standpoint.

After a bit of thinking, I think I agree with you on the fact that everyone has their own "payoff preference". One can take a risk to win more, or take less risk to win less. In the scenario mentioned here, the aspects of risk are not conspicuous. The amount to be won is not significant, and there is no penalty for losing. Hence, both the payoffs are inherently attractive offers. When we come to real life situations, I think there are many more things that contribute to the risk:

- The probability that you will get a payoff.

- The probability that you stand to lose money.

- The security of the winnings: money once won shouldn't be taken away from you.

- The long-term effects of the decision, and its effect on your ability to make other decisions.

According to me, every decision of ours is based on weighing these risk factors against the probable prize. The balance may tip either way, and so may our decisions.

I don't even know if I'm thinking in the right direction... Help me out here @Calvin Lin


Hopefully, you will come up with a consistent, logical risk-reward structure. For example, if you would choose payoff 1 over 2 and payoff 3 over 4, but would prefer a combined 2 and 4 over a combined 1 and 3, then it would be very easy for someone to sell you things in part, and make you overpay for them.


@Calvin Lin sir, Am I right?


Let \(X_{k}\) be an indicator variable that takes the value 1 if the pair starting at toss \(k\) is \(HT\) (and 0 otherwise), and let \(Y_{k}\) be the analogous indicator for \(HH\). There are \(5 - 2 + 1 = 4\) such indicators (for both \(X\) and \(Y\)) in 5 coin tosses. The expected payoffs are \[\text{E}\left [\sum_{k = 1}^{4} X_{k} \right ], ~ \text{E}\left [\sum_{k = 1}^{4} Y_{k} \right ].\] We can apply linearity of expectation here, which holds even for dependent events (a nice proof is given in the Brilliant wiki), to get \[\sum_{k = 1}^{4}\text{E}\left [ X_{k} \right ] , ~ \sum_{k = 1}^{4}\text{E}\left [ Y_{k} \right ].\] The expected value of each \(X_{k}\) and \(Y_{k}\) is \[\text{E}\left [ X_{k} \right ] = \text{E}\left [ Y_{k} \right ] = 1 \cdot \dfrac{1}{4} + 0 \cdot \dfrac{3}{4} = \dfrac{1}{4}.\] It follows that the expected payoffs in both cases are the same, and they equal \[\sum_{k = 1}^{4}\text{E}\left [ X_{k} \right ] = \sum_{k = 1}^{4}\text{E}\left [ Y_{k} \right ] = 4 \cdot \dfrac{1}{4} = 1.\]
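The conclusion of this derivation can be confirmed by exhaustive enumeration. A sketch using exact fractions (the helper `payoff` counts overlapping pairs; all names here are illustrative):

```python
from itertools import product
from fractions import Fraction

def payoff(tosses, pair):
    # Count overlapping occurrences of `pair` in the toss string.
    return sum(tosses[i:i + 2] == pair for i in range(len(tosses) - 1))

# All 2^5 = 32 equally likely toss sequences
seqs = ["".join(p) for p in product("HT", repeat=5)]
e_p1 = Fraction(sum(payoff(s, "HT") for s in seqs), len(seqs))
e_p2 = Fraction(sum(payoff(s, "HH") for s in seqs), len(seqs))
print(e_p1, e_p2)  # both equal 1
```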

Bonus: What about the variance of the payoff? It can also influence our decision. Is it also the same? It turns out it is not! Let us first consider payoff 1 and the indicators \(X_{k}\):\[\begin{align} \text{Var}[X] &= \text{E}\left [\left(\sum_{k = 1}^{4}X_{k}\right)^{2}\right ] - \text{E}\left [\sum_{k = 1}^{4}X_{k}\right ]^{2} \\ &= \text{E}\left [\left(\sum_{k = 1}^{4}X_{k}^{2}\right) + \left(\sum_{k \neq j}X_{k}X_{j}\right)\right ] - \text{E}\left [\sum_{k = 1}^{4}X_{k}\right ]^{2} \\ &= \text{E}\left [\sum_{k = 1}^{4}X_{k}^{2}\right] + \text{E}\left[ \sum_{k \neq j}X_{k}X_{j}\right ] - \text{E}\left [\sum_{k = 1}^{4}X_{k}\right ]^{2} \\ &= 1 + \text{E}\left[ \sum_{k \neq j}X_{k}X_{j}\right ] - 1 \\ &= \text{E}\left[ \sum_{k \neq j}X_{k}X_{j}\right ].\end{align}\](In the fourth line we used \(X_{k}^{2} = X_{k}\), since each \(X_{k}\) is 0 or 1, so \(\text{E}\left[\sum_{k} X_{k}^{2}\right] = \text{E}\left[\sum_{k} X_{k}\right] = 1\).) The same decomposition holds for payoff 2 and \(Y\). Now, let us calculate \(\text{E}\left[ \sum_{k \neq j}X_{k}X_{j}\right ]\). We split the products \(X_{k}X_{j}\) into two groups:

- \(\left | k-j \right | = 1\): two consecutive indicator variables, which share one common coin. Their product is always 0, since it is impossible to have two \(HT\)s within 3 consecutive coins.

- \(\left | k-j \right | > 1\): two non-consecutive indicator variables, which share no common coin. Their product is non-zero only when both are non-zero, which happens with probability \(\left(\frac{1}{4}\right)^{2} = \dfrac{1}{16}.\)

There are a total of \(4 \cdot 3 = 12\) ordered pairs \(X_{k}X_{j}\), and \(2 \cdot 3 = 6\) of them are consecutive, i.e. have \(\left | k-j \right | = 1\). Hence, we have: \[\text{Var}[X] = \text{E}\left[ \sum_{k \neq j}X_{k}X_{j}\right ] = 0 \cdot 6 + \frac{1}{16} \cdot 6 = 0.375.\]

In the same way, we calculate \[\text{Var}[Y] = \text{E}\left[ \left(\sum_{k \neq j}Y_{k}Y_{j}\right)\right ] = \frac{1}{8} \cdot 6 + \frac{1}{16} \cdot 6 = 1.125.\]

So, the variance of payoff 2 is 3 times greater than the variance of payoff 1. If my reasoning is correct, then payoff 1 is the somewhat safer option and tends to stay near the expected value, while payoff 2 is riskier but offers better chances of earning more than $1. What do you think about this approach, which takes variance into account, @Calvin Lin ?
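These variance figures can likewise be checked by enumerating all 32 outcomes. A sketch with exact arithmetic via `fractions` (helper names are illustrative):

```python
from itertools import product
from fractions import Fraction

def payoff(tosses, pair):
    # Count overlapping occurrences of `pair` in the toss string.
    return sum(tosses[i:i + 2] == pair for i in range(len(tosses) - 1))

seqs = ["".join(p) for p in product("HT", repeat=5)]

def variance(pair):
    # Var = E[V^2] - (E[V])^2 over the 32 equally likely outcomes.
    vals = [payoff(s, pair) for s in seqs]
    mean = Fraction(sum(vals), len(vals))
    return Fraction(sum(v * v for v in vals), len(vals)) - mean ** 2

print(variance("HT"), variance("HH"))  # 3/8 and 9/8, i.e. 0.375 and 1.125
```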


There are \(2^{5} = 32\) outcomes of 5 coin tosses.

For payoff 1, the outcomes containing an HT are: {HTHTT, HTHTH, HTTHT, HTHHT, THTHT, HHTHT, HTHHH, HHHTH, HHHHT, HTTTT, THTTT, TTHTT, TTTHT, HTTTH, HHTTT, HHTTH}.

Expected payoff 1 = (6/32 × $2) + (10/32 × $1) = $11/16 = $22/32.

For payoff 2, the outcomes containing an HH (or HHH, HHHH, HHHHH) are: {HHHHH, THHHH, HHHHT, THHHT, HTHHH, HHHTH, HHHTT, TTHHH, HHTHH, HHTTT, THHTT, TTHHT, TTTHH, THTHH, THHTH}.

Expected payoff 2 = (1/32 × $4) + (2/32 × $3) + (5/32 × $2) + (7/32 × $1) = $27/32.

Payoff 2 is better, I guess LOL @Calvin Lin

Of the 32 possible outcomes, there is one combination, TTTTT, which has no payoff.


Check your calculations. The expected payoff of both scenarios is $1.
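For reference, enumerating all 32 sequences gives the full payoff distributions; a sketch (the counts below confirm that both expectations are $1 while the distributions differ):

```python
from collections import Counter
from itertools import product

def payoff(tosses, pair):
    # Count overlapping occurrences of `pair` in the toss string.
    return sum(tosses[i:i + 2] == pair for i in range(len(tosses) - 1))

seqs = ["".join(p) for p in product("HT", repeat=5)]
dist_p1 = Counter(payoff(s, "HT") for s in seqs)  # payoff 1: HT pairs
dist_p2 = Counter(payoff(s, "HH") for s in seqs)  # payoff 2: HH pairs
print(sorted(dist_p1.items()))  # [(0, 6), (1, 20), (2, 6)]
print(sorted(dist_p2.items()))  # [(0, 13), (1, 10), (2, 6), (3, 2), (4, 1)]
```

Both distributions sum to a total payoff of $32 over the 32 outcomes, hence an expectation of $1 each.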


The example given implies that HHH is counted as two consecutive HHs. This seems questionable, as in standard English two consecutive HHs means HHHH. Is this really what the scenario intended?


Yes, that is what the scenario intended. The consecutive pairs are allowed to overlap. Let me edit that in.


The expected value is the same for both payoffs.


Great observation. Does this make both payoffs the same? If not, what else do you want to consider?


Thus the payoffs are the same, so I can choose either payoff, be it payoff A or payoff B.


Because the expected payoff for both is $1, you are equally happy to choose either?


I think I need to look at the variance of the values in payoff 1 and payoff 2. Then, if the expected value is the same, we will go for the one which is less risky.


I remember reading about an interesting experiment done by a French mathematician regarding payoffs and expected utility from lotteries. Allais Paradox?


That's interesting. In experiment 1 here, taking a risk, (even if it is only 1%), means the possibility of losing a guaranteed million dollars; I would deeply regret gambling and ending up losing that money, but if I took the million and played 1B just to see what happened and found that I could have won 5 million, I would have just said "Oh well" and still been perfectly happy with my million. But if there is no guaranteed money, then having a chance at 5 million at the expense of a 1% greater chance of winning a million seems worth the risk.

Experiment 1 makes me think of the old saying, "A bird in the hand is worth two in the bush." If there are no "birds in hand", however, then the risk evaluation process is quite different.


Let \(I_i\) be an indicator random variable for the event of getting H on the \(i\)th toss. Let \(X\) be a random variable counting how many times HT occurred, and let \(Y\) be the same but for HH. Then \(X = I_1(1-I_2) + \cdots + I_4(1-I_5)\) and \(Y = I_1 I_2 + \cdots + I_4 I_5\). We need to compare \(\text{E}[X]\) and \(\text{E}[Y]\). Given \(\text{E}[I_i] = 0.5\), we can apply linearity of expectation (and, for the products, the independence of the tosses) and get 1 for both expectations.
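This indicator-variable computation can be checked numerically; a sketch with tosses encoded as 1 for H and 0 for T (names are illustrative):

```python
from itertools import product
from fractions import Fraction

# All 2^5 equally likely toss sequences; entry i is 1 if toss i is heads.
seqs = list(product([1, 0], repeat=5))

def mean(f):
    # Exact average of f over the 32 equally likely sequences.
    return Fraction(sum(f(s) for s in seqs), len(seqs))

# X counts HT pairs via I_i * (1 - I_{i+1}); Y counts HH pairs via I_i * I_{i+1}.
X = lambda s: sum(s[i] * (1 - s[i + 1]) for i in range(4))
Y = lambda s: sum(s[i] * s[i + 1] for i in range(4))
print(mean(X), mean(Y))  # both equal 1
```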

So, if their expectations are the same, does that mean that you are indifferent about which payoff to take? If so, why?


Yes, because with both payoffs, one can win the same amount of dollars on average.


Payoff 3: Always get $0

Payoff 4: 50% chance of - $1,000,000,000 and 50% chance of $1,000,000,000


I would go for payoff 2, because for payoff 1 the maximum number of $1 prizes one can get is 2, which equals $2, e.g. for the outcome HTTHT (the maximum possible number of consecutive HT pairs is 2). For payoff 2, the maximum number of $1 prizes is 4, which equals $4, for the outcome HHHHH (the maximum possible number of consecutive HH pairs is 4).


I think this is a combinatorics problem?!


In the sense that arithmetic is a part of algebra, which is a part of calculus? There are numerous cross-disciplinary ideas. Using combinatorics will only get you one perspective of things.

Notice that I am essentially getting at the risk-reward preference, which is a "finance idea" as opposed to a "combinatorics idea". Yes, combinatorics ideas like expected value are involved, but they don't tell the full story.


From a statistical standpoint, I would choose payoff option 1. As the number of H tossed increases, the probability of the consecutively tossing another H decreases. That being said, the HH combination can lead to a higher payoff, because HTHTH only results in $2 while HHHHH results in $4. The issue lies in the decreased probability of continuing to toss H. This is a good example related to risk, much like the decision involved in buying an AAA bond earning 5% vs. a B bond earning 13%.


To be more specific, payoff 1 would be like the AAA bond and payoff 2 like the B bond. It just depends on what kind of risk is right for you.


Not quite the same comparison. Note that the expected payoff in both methods is the same.

Whereas in the bond example that you gave, we are trading expected payoff for certainty.

I also strongly disagree with "As the number of H tossed increases, the probability of the consecutively tossing another H decreases". The coin tosses are independent; this is the common fallacy that "the proportion of realized events must be equal, or close, to the calculated probability".
