Today I have a decision problem for you.

Alice offers Bob participation in a simple coin toss game. It’s called *triple or bust*. Alice starts the game by writing an IOU to Bob for an amount of $ 1.00. She then makes at least six subsequent tosses with a fair coin. On each ‘heads’ Alice triples the IOU amount. On ‘tails’ she sets the IOU to zero. How much should Bob be prepared to pay Alice to participate in this game, knowing that he can repeat it as often as he likes?

Okay, let’s see: on each coin toss there is a 50:50 chance of tripling and of voiding the IOU. So on average, a coin toss increases the IOU to 3/2 times the amount before the toss. That means that after the first coin toss, the expectation value for the IOU is $ 1.50. After two tosses the expectation is 1.50 times $ 1.50, or $ 2.25. This exponential growth continues, and after six tosses the expected IOU has risen to 1.50^6, or $ 11.39. Any coin tosses after the sixth will obviously continue the exponential growth of the expected IOU. In the long run, the game will yield returns closing in on the expectation value. So paying any amount less than $ 11.39 per game will make it advantageous to participate.
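This reasoning is easy to check numerically. Here is a minimal sketch in Python (the function name `play_fixed_game` is mine) that plays the game with a *fixed* number of tosses and compares the average payout against the 1.50^n expectation:

```python
import random

def play_fixed_game(n_tosses: int) -> int:
    """Play one game in which Alice makes exactly n_tosses tosses."""
    iou = 1
    for _ in range(n_tosses):
        if random.random() < 0.5:   # heads: triple the IOU
            iou *= 3
        else:                       # tails: the IOU is voided
            return 0
    return iou

# Exact expectation: each toss multiplies the expected IOU
# by 0.5 * 3 + 0.5 * 0 = 1.5, so after six tosses it is:
print(1.5 ** 6)                     # 11.390625

# Monte Carlo estimate for the six-toss game:
random.seed(1)
games = 200_000
average = sum(play_fixed_game(6) for _ in range(games)) / games
print(average)                      # close to 11.39
```

The simulated average indeed hovers around $ 11.39, confirming the expectation-value arithmetic for a fixed six-toss game.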

Bob has worked out the same logic and decides to offer Alice $ 10.00 per game.

Alice immediately accepts.

Bob pays Alice $ 10.00, Alice writes an IOU of $ 1.00, and starts tossing. Heads shows. Alice changes the IOU into $ 3.00. Again heads. The IOU is now $ 9.00. Then tails appears. “No need for any further coin tosses, okay?” Alice looks at Bob. Bob nods. Alice rips the IOU in pieces.

Bob decides to go for another game. Alice pockets another $ 10.00. Now tails shows on the first toss. Once more an IOU gets shredded.

Bob is in it for the long haul, chasing a very profitable expectation value. He keeps playing.

After 37 games Bob has lost $ 370.00. Bob pays another $ 10.00. This time he is luckier. After 5 heads in a row the IOU reads $ 243. Alice makes a sixth coin toss. Again heads. “Yes!! That’s 729 dollars!” Bob blurts out.

Alice writes down $ 729 on the IOU and prepares for a seventh coin toss.

“Wait a second” Bob intervenes. “Don’t throw another coin, just give me the 729 dollars.”

“I will give you another coin toss for free”, Alice replies. “As agreed upfront, I am entitled to give you additional coin tosses. I am sure you have incorporated this game feature into your decision to offer me $ 10. Haven’t you?”

Bob nods silently and stares at Alice’s hand containing the coin. She makes a seventh coin toss. Again heads. The IOU now reads $ 2187. An eighth toss follows. Tails. Alice rips the IOU in pieces.

Bob shakes his head and quits the game.

What went wrong? We have not made an error in our math, and neither has Bob. Something must be wrong in the logic.

It is correct that the expectation value for this game increases exponentially with the number of coin tosses. And for a fixed number of tosses per game, this expectation value does describe the returns that Bob will make in the long run. It is also true that, having agreed on at least six tosses, in the long run it is disadvantageous for Alice to add a seventh coin toss. And, again in the long run, it is even more disadvantageous for her to add an eighth toss. Yet, giving Alice full liberty to add any number of additional tosses gives her the power to make a killing in this game. Bob is guaranteed to lose every penny he puts into this game.
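Alice’s edge becomes concrete in a short simulation (a sketch, assuming she exercises her right to keep adding tosses until tails shows; the function name is mine):

```python
import random

def play_alice_strategy() -> int:
    """One game in which Alice keeps adding tosses until tails shows.

    Since she may add tosses at will, the loop only exits when the
    IOU is voided -- so the payout is always zero."""
    iou = 1
    while True:
        if random.random() < 0.5:   # heads: triple the IOU
            iou *= 3
        else:                       # tails: IOU voided, Alice stops
            return 0

random.seed(0)
payouts = [play_alice_strategy() for _ in range(100_000)]
print(max(payouts))                 # 0: Bob never collects a penny
```

However large the IOU grows along the way, every single game ends with a shredded IOU.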

What is going on here?

The challenge is to understand the role of the expectation value in this game. Centuries of statistics research are based on applying expectation values as predictors of long-term gains. Putting your brain to sleep by ignoring the expectation value is not going to eliminate the paradox.

It seems that the Hammock Physicist moved here, so I will repost my comment here:

I am not sure whether I really understand the game, but here it goes. This looks like a variant of the St. Petersburg paradox.

https://en.wikipedia.org/wiki/St._Petersburg_paradox

Alice can choose to toss the coin as often as she wants, as long as it is at least 6 times. Bob can win only when Alice stops tossing after a string of heads. But why would Alice stop tossing the coin? The probability that she throws tails somewhere within N tosses approaches 1 exponentially as N grows.

If we assume that Alice can toss the coin 2000 times, the probability that she will NOT get tails is 1/2^2000 ~ 10^-602. Calling this vanishingly small is an understatement.

Bob simply cannot win. The lesson is that you should never take a bet where the other party holds all the cards.

Good to see you’re posting here: for several days now I can’t post any comments at science2.0 (I keep getting an error message). It is these IT problems (a few weeks ago science2.0 made one of my articles disappear), along with the aggressive adverts, that make me explore alternative blogging routes. We’ll see where my hammock lands in the virtual world…

On your specific question: the last sentence in the blog post above is most relevant. It is indeed straightforward to arrive at the conclusion that over 2000 tosses Bob will more than likely see zero returns. But the question is how to reconcile this conclusion with an exponentially exploding expectation value.

I had the same experience on science2.0. I found that I could successfully reply using Safari.

About the “exponentially exploding expectation value”: Bob can only win if Alice stops at some point after a string of heads. So we must put some probability on Alice stopping and admitting defeat. That is tough. Instead, model this probability with a maximum number of tosses N > 6. The expectation value of the payout increases exponentially with N, that is right. However, the probability of a payout decreases exponentially with N.

This means that for large N the expected payout can be gigantic, while the probability of actually winning is vanishingly small. The expected payouts are correct; you are just never going to see a win during the lifetime of the universe, even if you played a game every second.
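To put rough numbers on this (a back-of-the-envelope sketch; the 4.3·10^17 seconds figure is an assumed age of the universe):

```python
import math

N = 2000
log10_expected = N * math.log10(1.5)   # expected payout ~ 10**352
log10_win_prob = -N * math.log10(2)    # win probability ~ 10**-602

# One game per second for the assumed age of the universe
# (~4.3e17 seconds): expected number of wins, as a power of ten.
log10_wins = math.log10(4.3e17) + log10_win_prob
print(round(log10_expected))           # 352
print(round(log10_wins))               # -584
```

An expected payout of about 10^352 dollars, paired with an expected win count of about 10^-584 over the lifetime of the universe: the expectation value is mathematically correct and practically irrelevant.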

@RobvS – and yes, you’re right: this paradox is not really different from the St Petersburg paradox. When going through old blog posts at science2.0 and considering which ones to republish here, I also took a look at the St Petersburg article. It then occurred to me that a simpler game is possible.

Follow-up blog post: https://hammockphysicist.wordpress.com/2017/02/01/triple-or-bust-paradox-part-2/
