What is the Probability of Running Out of Coins in a Flipping Game?

  • Thread starter Office_Shredder
  • Start date
  • Tags
    Challenge
In summary, a man flips coins, starting with one: each heads replaces the flipped coin with two new coins, and each tails simply removes the flipped coin. He has a 50% chance of running out on the very first flip, and even when heads keeps the game going, the probability that he eventually runs out of coins is 1.
  • #1
Office_Shredder
Staff Emeritus
Science Advisor
Gold Member
A man begins flipping coins according to the following rules. He starts with a single coin, which he flips. Every time he flips heads, he removes that coin from the game and adds two new coins to his stack of coins to flip; every time he flips tails, he simply removes the coin from the game.

For example, if he flips heads on his first flip, he now has two coins. He flips both; if he gets two tails, he has run out of coins. On the other hand, if he flipped two heads, he would then have four coins to flip.

Once he begins flipping, what is the probability that he ever runs out of coins?

For a harder challenge check out Challenge 12b.
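Before reading the solutions below, the game is easy to try empirically. Here is a minimal Monte Carlo sketch in Python, assuming a fair coin; the flip cap and trial count are arbitrary choices, not part of the problem:

```python
import random

def runs_out(max_flips=10_000):
    """Play one game: start with 1 coin; each flip removes the flipped coin,
    and heads adds two new coins. Return True if the stack empties."""
    coins = 1
    for _ in range(max_flips):
        if random.random() < 0.5:
            coins += 1      # heads: net +1 coin
        else:
            coins -= 1      # tails: net -1 coin
        if coins == 0:
            return True
    return False            # cap reached; treat the game as still running

random.seed(0)
trials = 10_000
ruined = sum(runs_out() for _ in range(trials))
print(ruined / trials)      # close to 1; the cap truncates rare long games
```

The simulated fraction sits just below 1 because a few very long games are cut off by the cap.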
 
  • #2
So heads means +1, and tail means -1?
 
  • #3
That's how I take it, Borek -- except presumably it's game over once he runs out of coins. If that's the case, the probability is 1 that he runs out of coins eventually.

The person will only run out of coins on odd-numbered flips, so it helps to look at just those. The probability that the person is still in the game (hasn't run out of coins) after flip [itex]2n+1[/itex] is [itex]\binom{2n+1}{n+1}\,2^{-(2n+1)}[/itex]. This goes to zero as n→∞; asymptotically it behaves as [itex]1/\sqrt{n\pi}[/itex].
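That survival formula can be checked exactly against brute-force enumeration of all flip sequences for small [itex]n[/itex]; a quick sketch:

```python
from itertools import product
from math import comb, pi, sqrt

def survival_exact(flips):
    """Fraction of ±1 flip sequences of the given length whose partial sums,
    started from 1 coin, never hit 0 (the player is still in the game)."""
    alive = 0
    for seq in product((1, -1), repeat=flips):
        coins = 1
        for step in seq:
            coins += step
            if coins == 0:
                break          # ran out of coins mid-sequence
        else:
            alive += 1
    return alive / 2 ** flips

for n in range(1, 8):
    formula = comb(2 * n + 1, n + 1) / 2 ** (2 * n + 1)
    assert abs(formula - survival_exact(2 * n + 1)) < 1e-12
    print(n, formula, 1 / sqrt(n * pi))   # exact value vs. its asymptotic form
```

The printed pairs show the exact survival probability shrinking toward zero alongside the [itex]1/\sqrt{n\pi}[/itex] approximation.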
 
  • #4
Here's an alternative solution.

Let P(n) be the probability of losing if you start with n coins. Let's start with 2 and toss both.

[itex]P(2) = \frac14 + \frac12 P(2) + \frac14 P(4)[/itex]

And, in general:

[itex]P(2n) = \frac14 P(2n-2) + \frac12 P(2n) + \frac14 P(2n+2)[/itex]

Hence:

[itex]P(2n+2) = 2P(2n) - P(2n-2)[/itex]

By induction, using [itex]P(0) = 1[/itex] (with no coins, you have already lost):

[itex]P(2n) = nP(2) - (n-1)[/itex]

Hence:

[itex]P(2) = \frac1n P(2n) + \frac{n-1}{n}[/itex]

Since [itex]0 \le P(2n) \le 1[/itex], letting n → ∞ gives:

[itex]P(2) = 1[/itex]

So, the probability of losing eventually if you start with 2 coins is 1. And, therefore, you're bound to lose if you start with 1 coin.
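Another way to see why the relation forces this: if [itex]P(2)[/itex] were any value q < 1, then [itex]P(2n) = nq - (n-1) = 1 - n(1-q)[/itex] eventually turns negative, which is impossible for a probability. A tiny sketch of that contradiction (the value 99/100 is just an arbitrary example, and exact fractions avoid rounding):

```python
from fractions import Fraction

q = Fraction(99, 100)          # hypothetical P(2) < 1 (arbitrary example)
n = 1
while n * q - (n - 1) >= 0:    # the relation P(2n) = n*P(2) - (n - 1)
    n += 1
print(n, n * q - (n - 1))      # 101 -1/100: the "probability" went negative
```

With q = 99/100 the relation gives (100 − n)/100, which first goes negative at n = 101; any q < 1 fails the same way, so P(2) = 1 is forced.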
 
  • #5
Nice inductive solution PeroK!
 
  • #6
In fact, as a corollary, starting from:

[itex]P(2n) = nP(2) - (n-1)[/itex]

and substituting [itex]P(2) = 1[/itex]:

[itex]P(2n) = n - (n-1) = 1[/itex]

So, it doesn't matter how many coins you start with, you're bound to lose eventually! Which, I guess, is intuitive if you know enough about probability.
 
  • #7
PeroK said:
So, it doesn't matter how many coins you start with, you're bound to lose eventually!
Exactly. This is the gambler's ruin problem. The house presumably has an infinite supply of pennies in this challenge.
 
  • #8
Just for completeness, using the brilliant observation by mfb on the harder problem that:

[itex]P(2) = P(1)^2[/itex]

(Losing with two coins is equivalent to losing two independent games with one coin each.)

Then:

[itex]P(1) = \frac12 + \frac12 P(2) = \frac12 + \frac12 P(1)^2[/itex]

[itex]∴ \ P(1)^2 - 2P(1) + 1 = 0 \ ∴ \ (P(1) - 1)^2 = 0 \ ∴ \ P(1) = 1[/itex]
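The completed square in the last line can be sanity-checked numerically; a throwaway sketch verifying that [itex]p - (\frac12 + \frac12 p^2) = -\frac12 (p-1)^2[/itex] at a few sample points:

```python
# Check the factorisation behind the quadratic: for every p,
# p - (1/2 + p**2/2) equals -(p - 1)**2 / 2, so p = 1 is the only root.
for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    assert abs((p - (0.5 + 0.5 * p * p)) + 0.5 * (p - 1) ** 2) < 1e-12
print("p = 1/2 + p^2/2 rearranges to (p - 1)^2 = 0, so P(1) = 1")
```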
 
  • #9
If the coins in play always have P(H) = 1, then the game never ends, so the answer is not unique unless the heads probability of each coin is specified. For a fixed P(H) = p with 0 < p < 1, this is a famous college problem, as others have indicated. Of course, p = 0 and p = 1 are trivial cases.
 
  • #10
I note that if every coin has P(H) = 0.6, the game never ends with probability 1 − (1 − 0.6)/0.6 = 1 − 2/3 = 1/3.
In fact, P(H) ≤ 0.5 guarantees the game ends with probability 1; otherwise, there is a positive chance of a never-ending game.
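For a biased coin, the probability the game ends is the smallest fixed point of q = (1 − p) + pq², and iterating from q = 0 converges to it; a sketch recovering the numbers above (the iteration count is an arbitrary choice):

```python
def extinction_prob(p, iters=200):
    """Iterate q <- (1 - p) + p*q**2 from q = 0; this converges to the
    smallest fixed point, the probability the game eventually ends."""
    q = 0.0
    for _ in range(iters):
        q = (1 - p) + p * q * q
    return q

print(extinction_prob(0.6))   # ~2/3, so the game runs forever with prob ~1/3
print(extinction_prob(0.4))   # ~1: with P(H) <= 0.5 the game surely ends
```

For p = 0.6 the fixed points are 2/3 and 1, and the iteration picks out the smaller one; for p ≤ 0.5 the only fixed point in [0, 1] is 1.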
 
  • #11
Let ##p## be the probability that he eventually runs out of coins. There are two ways for this to happen. He can get tails on the first flip--this happens with probability ##\frac{1}{2}##. Alternatively with probability ##\frac{1}{2}## he can get heads on the first flip. If he gets heads on the first flip he essentially plays the same game twice, and the probability of running out of coins in *both* of these games is ##p^2##. So we have

##p = \frac{1}{2} + \frac{1}{2}p^2##

Rearranging gives ##(p-1)^2 = 0##, so the only solution is ##p = 1##.
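Iterating this fixed-point equation from 0 also creeps toward the root p = 1, though only slowly, because 1 is a double root; a quick sketch:

```python
# Iterate p <- 1/2 + p**2/2 starting from 0; the error shrinks roughly
# like 2/n because p = 1 is a double root of the fixed-point equation.
p = 0.0
for i in range(1, 10_001):
    p = 0.5 + 0.5 * p * p
    if i in (10, 100, 1000, 10_000):
        print(i, p)   # creeps toward 1, never quite reaching it
```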
 

