Is Switching the Best Option in the Door Game Show Dilemma?

In summary, the movie "21" presents the well-known "Monty Hall Problem": a game show host offers a contestant a choice of three doors, two of which hide goats and one a car. The standard solution says the contestant should switch their choice after the host eliminates one of the losing doors, but there is some confusion about how probability applies in this situation. The Monty Hall Problem is well documented and can be explored further through sources such as Wikipedia.
  • #36
As I understand probability, if I can "count" the TOTAL possible ways the game can play out and I can "count" the TOTAL number of ways to WIN (or lose), then if I divide the number of ways to win by the total number of ways the game can be played, I can calculate exactly the theoretical probability of winning.

Making a table of the 24 possible ways the game can play out and then counting the 12 ways in which the game can be won, I would suggest the probability that a random selection wins is 0.50.
 
  • #37
vector03 said:
As I understand probability, if I can "count" the TOTAL possible ways the game can play out and I can "count" the TOTAL number of ways to WIN (or lose), then if I divide the number of ways to win by the total number of ways the game can be played, I can calculate exactly the theoretical probability of winning.

Making a table of the 24 possible ways the game can play out and then counting the 12 ways in which the game can be won, I would suggest the probability that a random selection wins is 0.50.

You are suggesting that if one selects a single door at random from three doors, one of which contains a car, that the theoretical probability of selecting the car is 0.5.

The fact that the host eventually opens one of the doors containing a goat does not alter the fact that there is a 1/3 probability that you will select the door that hides the car.

Surely, you can agree that if the host does not open one of the doors and provide the opportunity for switching, there is a 1/3 probability that you have selected the winning door.

If you accept this, but still assert that there is a .5 probability that you have selected the correct door, you are asserting that if the host opens a door that he knows to contain a goat after you have made your selection, the probability that you initially made the winning selection increases.

If we accept your assertion that after the goat has been revealed, the probability that your door is the winning door increases to .5, then we would also expect that in a very large number of trials, you would likely win more often if the host opens a door after you have made your selection than if he does not, even if you stay with your selection after he reveals one of the goats.

This result is clearly absurd. If you stay with your initial selection, the fact that the host shows one of the goats will not change the frequency with which you make the correct selection.

While the "paradox" may appear initially counter-intuitive, the notion that if the host opens a door that he knows to contain a goat, the probability that you made the correct selection will increase to .5 should seem even more absurd.

We know that if the host never opens the door after you make your selection, and you have no opportunity to switch, your chances of winning are 1/3. How could the event of the host opening a door he knows not to be the winning door after you make your selection possibly increase your chances of winning? Would you really expect to win more often if you make your selection, and choose to stay with that selection after the host opens the door hiding a goat, simply because the host opened the door?

Suppose you roll a die, but do not look, and I look to see what it is. Suppose you guess what number was rolled, and I tell you which number it is not, and ask you if you wish to switch your guess. Suppose you choose not to switch. Will the fact that I told you which number it is not in any way increase the probability that you initially guessed right? Will the probability that you initially guessed right increase from 1/6 to 1/5 simply because I told you something I know after you guessed? Will the event of me telling you something I know after you guess cause you to guess correctly more often if you do not ever change your guess?
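The die analogy, like the door game itself, can be checked with a short simulation. Here is a minimal sketch in Python, assuming the standard rules (the host always opens a goat door other than the player's pick; which goat door he opens when he has two choices does not affect the win rates):

```python
import random

def play(switch: bool) -> bool:
    """Play one Monty Hall round; return True if the player wins the car."""
    car = random.randrange(3)    # door hiding the car
    pick = random.randrange(3)   # player's initial, random choice
    # Host opens a door that is neither the player's pick nor the car.
    opened = next(d for d in range(3) if d != pick and d != car)
    if switch:
        # Switch to the single remaining closed door.
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == car

trials = 100_000
stay = sum(play(switch=False) for _ in range(trials)) / trials
swap = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stay: {stay:.3f}  switch: {swap:.3f}")  # roughly 0.333 and 0.667
```

Staying hovers around 1/3 and switching around 2/3, matching the argument above.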
 
  • #38
I like that explanation. Very clear.
 
  • #39
Indefinite said:
You are suggesting that if one selects a single door at random from three doors, one of which contains a car, that the theoretical probability of selecting the car is 0.5.

I'm not actually suggesting what you have stated above.
Suppose the player makes his door choice and the "host" opens a losing door. The player decides to switch doors as recommended.

Now suppose that a new player walks in and knows nothing about what has transpired between the host and original player. The new player simply sees two doors. The host asks the new player to make a choice of doors as well.

According to previous assertions, the original player has 2/3 chance of winning while the new player has a 1/2 chance of winning. What I'm suggesting is that one player cannot have a 2/3 chance of winning at the same time another player has a 1/2 chance of winning given exactly the same set of circumstances.

This implies that a "new" game ensues after the host opens a door and the chance of winning is simply changed from 1/3 to 1/2.
 
  • #40
That's not correct: you caused the host to open that door, because you made a choice. That's why we need the premise that the host knows what is behind the doors.

The newcomer will have a new game, but not your game, since he did not cause the opened door to open.

Think of it with 1000 doors, or with lottery tickets nationwide. If I show you all the digits of the winning number except the last one, it might happen that all of them match yours, but you don't know about the last one. Because you don't know, you might miss the win by that one digit. Your initial chance to win does not change: you still hold the ticket that you bought, with a very small chance of winning.

Now I say I am going to remove all the dummy tickets and leave you with one other. You are standing there with your ticket and its slim chance. But the one I have left has a big chance, because all the probability of the dummy tickets is stacked on that ticket. Why? Because you might just as well have picked a dummy one, which is very likely to have happened...
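The 1000-door version of this argument is just as easy to simulate; a sketch (the door count is arbitrary, and the host is assumed to remove every losing door except one):

```python
import random

def play(n_doors: int, switch: bool) -> bool:
    """One round with n_doors doors; the host removes all but two doors."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    # The host leaves the pick plus one other door closed. If the player
    # does not hold the car, the surviving other door must be the car.
    if car != pick:
        other = car
    else:
        other = random.choice([d for d in range(n_doors) if d != pick])
    return (other if switch else pick) == car

trials = 20_000
stay = sum(play(1000, False) for _ in range(trials)) / trials
swap = sum(play(1000, True) for _ in range(trials)) / trials
print(f"stay: {stay:.4f}  switch: {swap:.4f}")  # near 0.001 and 0.999
```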
 
  • #41
If I understand, you're suggesting it is possible for two different players, at a single point in time and given an equal set of circumstances, to have two different probabilities of winning?
 
  • #42
No, I am saying that one of them is cheating, since there were three doors for one and afterwards two doors for the other.
 
  • #43
Considering that point in time, what are the probabilities for both the original player and the new player?
 
  • #44
That's the crux of this question!

The newcomer has 50%.

The original player has 1/3, as he started with three doors, but is given a chance to modify the odds by switching. Moreover, the options have been reduced to one. So if I had 1/3, and I have one more option to choose, then that option has to have probability 2/3.

That's why it is neither a multiplication of probabilities nor a reset to 50% after he makes one decision. Hence a "dice has no memory" argument does not apply here, because the event is not repeated; it is altered.
 
  • #45
We're almost there... :-)
The new player has 1/2 chance of winning, it seems we agree.
What is the probability of winning for the original player at the point in time under consideration?
 
  • #46
Do you understand what I wrote there?
 
  • #47
Indefinite said:
If we accept your assertion that after the goat has been revealed, the probability that your door is the winning door increases to .5, then we would also expect that in a very large number of trials, you would likely win more often if the host opens a door after you have made your selection than if he does not, even if you stay with your selection after he reveals one of the goats.

vector03, I suggest that you read this part very carefully.

vector03 said:
This implies that a "new" game ensues after the host opens a door and the chance of winning is simply changed from 1/3 to 1/2.

This isn't true. Just because the host opened a door to reveal a goat doesn't mean that we can ignore the circumstances before it.
 
  • #48
trambolin said:
Do you understand what I wrote there?

Yes. I understand all the explanations presented and I understand the mathematics behind the calculation. Everyone who has attempted an explanation has presented what I would consider a reasonably good one (for whatever that's worth). My big "hang-up" is that I'm not sure I agree, at least yet, that it's possible for one player to have a 1/2 chance of winning while another player, at the same point in time with exactly the same set of conditions, has a 2/3 chance of winning.

Either something is wrong with the theory, or something is wrong with the application that doesn't consider (or account for) the "hypothetical" new player (3rd observer).

I think achieving a 2/3 chance of winning requires and depends on an event that has a 100% chance of occurring: "mechanically" requiring the original player to "switch" every time, which takes away, in my opinion, some of the "randomness". An event which must occur 100% of the time is, in my opinion, not random.

So bottom line, yes... I've understood your explanations and appreciate them, yet I'm just having a hard time "wrapping" my thoughts 100% around them.
 
  • #49
vector03 said:
My big "hang-up" is that I'm not sure I agree, at least yet, that it's possible for one player to have a 1/2 chance of winning while another player, at the same point in time with exactly the same set of conditions, has a 2/3 chance of winning.
When you calculate a probability, your answer depends on how much information you have. The second player is lacking some information, so the probability they calculate is 1/2. The original player has more information and calculates the probability as 2/3. The game show host has even more information and will calculate the probability to be either 1 or 0.

The probability represents how often you would win if you repeated the experiment many times using the information you have and assuming all the information you don't have is random.
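The claim that the two players' numbers coexist can be demonstrated in the same simulated games: score the original player (who always switches) alongside a newcomer who picks one of the two closed doors at random. A sketch:

```python
import random

trials = 100_000
switch_wins = newcomer_wins = 0
for _ in range(trials):
    car = random.randrange(3)
    pick = random.randrange(3)
    # Host opens a goat door the player did not pick.
    opened = next(d for d in range(3) if d != pick and d != car)
    closed = [d for d in range(3) if d != opened]      # the two remaining doors
    switched = next(d for d in closed if d != pick)    # original player switches
    switch_wins += (switched == car)
    newcomer_wins += (random.choice(closed) == car)    # newcomer, no information
print(switch_wins / trials, newcomer_wins / trials)    # about 2/3 and about 1/2
```

Both frequencies hold in the very same games: the numbers differ because the two players are betting on different events with different information.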
 
  • #50
Personally, I would vote for the host's chances as 0, since the host is generally not allowed to play.

I note the qualifications of "long run" or "repeated many times" and respectfully submit that the theory is based on the "long run" assumption. In this particular case, that assumption is not met. This experiment is set up as a one-time chance of winning. If the player had hundreds of chances, then in the long run his chances would approach a limit of 2/3. However, the player only gets one chance, and that invalidates any use of the "repeated many times" assumption. The player only has one chance.

Applying a theory that is based on certain assumptions being met to a problem where those assumptions are not met does not seem consistent.
 
  • #51
I've read all these comments and theories and I understand what's been said, but to me it boils down to this:

When you make your initial choice, you would of course have a 33% chance of winning. But, no matter which door you choose, the host is going to open one of the losing doors, then present you with a new choice. So based on these facts, your chances of winning are 50% from the start.
 
  • #52
dennynuke said:
I've read all these comments and theories and I understand what's been said, but to me it boils down to this:

When you make your initial choice, you would of course have a 33% chance of winning. But, no matter which door you choose, the host is going to open one of the losing doors, then present you with a new choice. So based on these facts, your chances of winning are 50% from the start.

Ok, you almost got it! Not quite, but you're on the right track, sort of.

Consider this. If you intially choose a door, you have a 1/3 chance. Right?
Right.

If you do nothing further with that choice (that is, you KEEP your door) it DOES NOT change the fact that your chance is 1/3, because 1/3 is what you started with. Right? (as long as you keep your initial selection)
Right.

But, if you switch to the remaining door, your odds change to 50/50.
Not if you STAY, ONLY IF YOU SWITCH!
That was the flaw in your logic. You think that by staying your odds somehow magically convert from 1/3 to 1/2 by your doing nothing.
That's NOT true.
But, if you switch, it IS true! And does indeed become 50/50

That's why switching increases your odds.
 
  • #53
Here's another, rather brilliant way of looking at this that finally convinced me (I was a sceptic too!)

You have a standard deck of playing cards and are told that the Ace of Hearts is the winning card.
You choose one "blindly"
What's your chance of having the Ace of hearts? 1/52 Very poor odds. Agreed?!

Now for the "magic":
Take that card, DO NOT LOOK at it, and lock it into a safe.
1/52 odds, agreed?
Nothing can change that at all as long as it is kept, unknown, in the safe. Agreed?
1/52 odds, agreed?

The magician takes the remaining 51 cards and looks through them. You are not allowed to do this.
The magician knows that the winning card is the Ace of hearts. His task is to select 1 card from the remaining 51 and discard the rest.
If the magician finds the ACE of HEARTS, he selects that card, but doesn't tell you.
If the magician does not find the ACE of HEARTS(because you have it), he selects a different card but doesn't tell you.

In either case, either you or the magician has the Ace of Hearts at this point, agreed?
Read that over again if it doesn't make sense. Sip on a beer and agree with this to this point...

OK, now, you OR the magician HAS the Ace of Hearts. This much IS CERTAIN, because the magician would select it if you did not have it, or select a different card if you did.

At this point, what is going on?
Nothing.
Your card in the safe still has a 1/52 chance of being the Ace of Hearts.
If you do NOTHING, and keep your card, it's 1/52. Nothing has changed.

But NOW, the magician poses this question: Keep your card or switch it for mine.
You NOW know that either you OR the magician has the ACE of Hearts.
It most certainly at that point is a 50/50

To keep your card is 1/52
To switch it, the odds are now 1/2.
Those odds WILL NOT GO FROM 1/52 to 1/2 if you keep your card. YOU MUST SWITCH.

You could have the winning card if you do not switch. THIS IS TRUE! Could happen!
But it will be 1/52.
If you switch the odds go to 1/2. You COULD still lose, but the odds are much better for a win.
 
  • #54
pallidin said:
To keep your card is 1/52
To switch it, the odds are now 1/2.
This doesn't make sense. As there's a 1/52 chance you have the winning card, the chances the other card is the winning card are 51/52, so if you switch, the odds of winning are 51/52. (Assuming the magician is telling the truth and has done what you described.)

And, similarly in the original car/goat game, the odds are 1/3 if you don't switch, and 2/3 if you do.

When you consider all possible outcomes, the odds have to add up to 1.
 
  • #55
OK, here is the same situation presented to us in different words. The crux of the matter (and the most important) is that in each case we are presented with a new problem in the end, with a new set of variables.

In the case of the three doors, your first choice is 1 out of three, then you are asked to make your next choice with only two options. It does not matter how you got to this point. The past has nothing to do with the future odds, so your chances are 50/50.

The same is true with the cards. Your first choice was 1/52. Then, no matter how you look at it, once the rest of the cards are eliminated, your new choice is between 2 cards, neither of which you know. There is no other answer than 50/50.

Again, the past has nothing to do with future odds. No matter how you end up at a choice, if that choice is between two unknowns, then your odds are always 50/50.
 
  • #56
dennynuke said:
OK, here is the same situation presented to us in different words. The crux of the matter (and the most important) is that in each case we are presented with a new problem in the end, with a new set of variables.

In the case of the three doors, your first choice is 1 out of three, then you are asked to make your next choice with only two options. It does not matter how you got to this point. The past has nothing to do with the future odds, so your chances are 50/50.

The same is true with the cards. Your first choice was 1/52. Then, no matter how you look at it, once the rest of the cards are eliminated, your new choice is between 2 cards, neither of which you know. There is no other answer than 50/50.

Again, the past has nothing to do with future odds. No matter how you end up at a choice, if that choice is between two unknowns, then your odds are always 50/50.

No, no, no!
Denny, if you leave the card in the safe (don't switch), the 1/52 odds CANNOT be violated no matter what happens to the other 51.
It's only when you DO switch that the odds change dramatically in your favor.
Ok, 2/3, not 1/2, my bad (thanks DrGreg, I forgot).
Anyway, the IMPORTANT point is to grasp that NOT switching does NOT give the same odds as switching.
Bizarre? I know, but it's true, and it makes perfect sense once you think about it enough.
 
  • #57
In the beginning the magician is allowed to select from and eliminate all but one card, thus giving him a 51/52 chance of winning and I have a 1/52 chance. You're saying that once he eliminates 50 of the cards that I still have a 1/52 chance of having the right card.
I submit that it is now a new problem with different variables. My new (and now my only) choice is to keep my card or choose a different card. It doesn't matter if we started with 1,000,000 cards, or what the odds were when we started. That was a different problem. The only choice NOW is between two cards that are unknown to me.

I can't see it any other way. I appreciate your examples, but each boils down to the same choice between two unknowns. 50/50
 
  • #58
Just consider what would happen if you repeated this game 5200 times. You pick a card at random.

In about 100 games you pick the ace of hearts, your opponent picks some other card at random. If you swap, you lose.

In the other 5100 games you pick some other card (about 100 games the 2 of hearts, about 100 games the 3 of hearts, about 100 games the 4 of hearts, etc), and always your opponent picks the ace of hearts. If you swap, you win.

So if you swap, in about 100 games you lose, in about 5100 games you win. Is that a 50:50 chance?
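That counting argument can be run directly. A sketch, using the magician's rule described earlier in the thread (he keeps the ace if you don't have it, otherwise keeps an arbitrary other card):

```python
import random

games = 5200
ACE = 0                 # label the ace of hearts 0; the labels are arbitrary
swap_wins = 0
for _ in range(games):
    yours = random.randrange(52)             # your blind pick, locked in the safe
    if yours == ACE:
        magician = random.randrange(1, 52)   # you hold the ace; he keeps another card
    else:
        magician = ACE                       # he always keeps the ace otherwise
    swap_wins += (magician == ACE)
print(swap_wins, "wins out of", games, "when you always swap")  # around 5100
```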
 
  • #59
dennynuke said:
In the beginning the magician is allowed to select from and eliminate all but one card, thus giving him a 51/52 chance of winning and I have a 1/52 chance. You're saying that once he eliminates 50 of the cards that I still have a 1/52 chance of having the right card.
I submit that it is now a new problem with different variables. My new (and now my only) choice is to keep my card or choose a different card. It doesn't matter if we started with 1,000,000 cards, or what the odds were when we started. That was a different problem. The only choice NOW is between two cards that are unknown to me.

I can't see it any other way. I appreciate your examples, but each boils down to the same choice between two unknowns. 50/50

What you and others fail to understand (including myself at one point) is that those are NOT two entirely separate problems.
They are "entangled", if you will.
That is, the potential outcome of problem #2 is definitely influenced by problem #1.
Hope that makes sense.
 
  • #60
DrGreg said:
Just consider what would happen if you repeated this game 5200 times. You pick a card at random.

In about 100 games you pick the ace of hearts, your opponent picks some other card at random. If you swap, you lose.

In the other 5100 games you pick some other card (about 100 games the 2 of hearts, about 100 games the 3 of hearts, about 100 games the 4 of hearts, etc), and always your opponent picks the ace of hearts. If you swap, you win.

So if you swap, in about 100 games you lose, in about 5100 games you win. Is that a 50:50 chance?

OK, there's a glimmer of understanding starting here... Because the magician is allowed to see all the cards, he has a 51/52 chance of finding the ace so he has skewed the probability in his favor. I only have a 1/52 chance based on my initial random selection. So at this point I'm offered a choice between two cards which are unknown to me, but the magician's card has a high probability of being the ace since he was able to eliminate all but my one card. Is that it?
 
  • #61
dennynuke said:
OK, there's a glimmer of understanding starting here... Because the magician is allowed to see all the cards, he has a 51/52 chance of finding the ace so he has skewed the probability in his favor. I only have a 1/52 chance based on my initial random selection. So at this point I'm offered a choice between two cards which are unknown to me, but the magician's card has a high probability of being the ace since he was able to eliminate all but my one card. Is that it?

Yes. The two cards are not both random. One is always random, but 51 times out of 52 the other is not random at all.
 
  • #62
Good job, denny!
 
  • #63
vector03 said:
Yes. I understand all the explanations presented and I understand the mathematics behind the calculation. Everyone who has attempted an explanation has presented what I would consider a reasonably good one (for whatever that's worth). My big "hang-up" is that I'm not sure I agree, at least yet, that it's possible for one player to have a 1/2 chance of winning while another player, at the same point in time with exactly the same set of conditions, has a 2/3 chance of winning.

Either something is wrong with the theory, or something is wrong with the application that doesn't consider (or account for) the "hypothetical" new player (3rd observer).

I think achieving a 2/3 chance of winning requires and depends on an event that has a 100% chance of occurring: "mechanically" requiring the original player to "switch" every time, which takes away, in my opinion, some of the "randomness". An event which must occur 100% of the time is, in my opinion, not random.

So bottom line, yes... I've understood your explanations and appreciate them, yet I'm just having a hard time "wrapping" my thoughts 100% around them.
While it is true that the second player has a 50% probability of winning, and the first player has a 2/3 probability of winning if he switches doors, this is not a contradiction, because the probability values describe different events. The .5 probability describes the probability that the second player will win if he selects from the two doors at random. The 2/3 probability describes the probability that the first player will win if he switches doors. Note that it does not actually matter who is playing in order for these probability values to hold. The second player also has a 2/3 probability of winning if he selects the door that player 1 can switch to, and the first player has a 1/2 probability of winning if, after being asked if he wishes to switch, he makes his selection at random.

No inconsistencies arise from the fact that the probability values are not equal, because they describe events occurring under separate conditions.

The 2/3 probability of winning applies only to a player who chooses to select the door that was not selected initially by the first player, and that was not opened by the host. The 1/2 probability of winning applies only to the player who selects from the two remaining doors at random after the host has opened the third door. Similarly, the 1/3 probability of winning applies only to the player who chooses to stay with the door that was selected before the losing door was opened.

To summarize, let door 1 represent the door that player 1 initially selects, door 2 represent the door that is opened by the host, and door 3 represent the door that player 1 has an opportunity to switch to.

There is a 1/3 probability that either player will win if he selects door 1.
There is a 2/3 probability that either player will win if he selects door 3.
There is a 1/2 probability that either player will win if he selects a remaining door at random after the host has revealed one of the goats.

Note that each distinct probability value is associated with a distinct condition. So you see that there is internal consistency between them. Does this clarify things?

vector03 said:
Personally, I would vote for the host's chances as 0, since the host is generally not allowed to play.

I note the qualifications of "long run" or "repeated many times" and respectfully submit that the theory is based on the "long run" assumption. In this particular case, that assumption is not met. This experiment is set up as a one-time chance of winning. If the player had hundreds of chances, then in the long run his chances would approach a limit of 2/3. However, the player only gets one chance, and that invalidates any use of the "repeated many times" assumption. The player only has one chance.

Applying a theory that is based on certain assumptions being met to a problem where those assumptions are not met does not seem consistent.

The derivation of the solution to this problem is not predicated on the assumption of repeated trials. I only broached the topic of repeated trials to bring a deeper understanding of the implications of the asserted probability value: the theoretical probability value represents the frequency that a hypothetical experiment would converge to in the limit of arbitrarily many trials, regardless of whether or not such an experiment is actually conducted. My discussion of a large number of trials was only meant as another means of interpreting theoretical probability values.

In a similar sense, I might say that there is a .5 probability of landing heads on a coin flip, and expand on what this means by asserting that if we conduct many trials, we can reliably expect to obtain heads approximately 50% of the time. However, the fact that we do not actually conduct these trials does not change the fact that there is a .5 probability of obtaining heads in a single trial. The notion of many trials simply furnishes us with another perspective for understanding what a theoretical probability value means.

Because we know that the results of an experiment of many trials will tend to converge toward the theoretical probability value for a single trial, we can use our expectations of the results of such an experiment to determine whether or not our theoretical value seems intuitively reasonable.
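The coin-flip illustration can be sketched the same way, to show the frequency of a repeated experiment settling toward the single-trial value:

```python
import random

random.seed(1)  # fixed seed only so the printout is reproducible
for n in (10, 1_000, 100_000):
    freq = sum(random.random() < 0.5 for _ in range(n)) / n
    print(n, freq)  # the observed frequency of heads closes in on 0.5
```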
 
  • #64
dennynuke said:
I've read all these comments and theories and I understand what's been said, but to me it boils down to this:

When you make your initial choice, you would of course have a 33% chance of winning. But, no matter which door you choose, the host is going to open one of the losing doors, then present you with a new choice. So based on these facts, your chances of winning are 50% from the start.

Your chances of winning will only be 50% if you select from the two remaining doors at random.

The solution to the "paradox" does not state that the player has a 2/3 probability of winning from the start of the game. The solution states that the player has a 2/3 probability of winning if he switches to the remaining door. In order to expect to win approximately 2/3 of the time, the player must switch doors every time.

The crucial error in reasoning that continues to be made here is that the 1/2 probability of winning when selecting between two doors only applies when one selects between those 2 doors at random. The 2/3 probability does not apply to the player who selects between the two doors at random, it applies to the player who switches doors.

If it is decided at the start of the game that the player will always switch, we can expect in a large number of trials that he will win approximately 2/3 of the time.

If it is decided that the player will always select between the two doors at random after the host reveals the goat, then his frequency of winning will converge to 1/2 as the number of trials becomes arbitrarily large.

Your confusion arises from the fact that you know that if there are two doors, and one contains the prize, and you select between those doors at random, you will win approximately half of the time. However, the conditions of the 2/3 probability value do not state that the two doors are selected from at random; they state that the player switches.

The 2/3 probability of winning applies only under the conditions in which the player switches doors, not under the conditions in which the player selects between the remaining two doors at random.

Denny stated that it is a 50% probability if it is a choice between two unknowns, and this is true. The problem is that the 2/3 probability is not intended to describe a choice between two unknowns. It describes only the probability of winning if the player switches every time.

I think most of the confusion here derives from the fact that people do not fully understand the conditions that are specified in the problem. The problem does not state that the player has a 2/3 probability of winning if he has the opportunity to select between the two doors after the host opens one with a goat behind it. It states that the player has a 2/3 probability if he switches doors. The player must switch doors in order for this value to describe his probability of winning.
 
  • #65
An attempt to explain in the fewest words possible:

Put yourself in the shoes of the show host, who knows where the car is.

For one third of the experiments the contestant will initially choose the car door, in which case the contestant will switch to a goat upon changing doors.

For two thirds of the experiments the contestant will initially choose a goat door, in which case the show host is forced to open the other goat door that the contestant did not choose, leaving the car door as the single remaining door.

So, in one third of the door changes the show host will see her giving up the car for a goat, and in two thirds he will see her giving up a goat for the car.

So, the probabilities are: 1/3 a goat, 2/3 a car.

This can be made intuitive by realising that the contestant's advantage arises because the show host is forced into revealing the location of the car 2/3 of the time.
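Because the contestant's pick can be fixed at door 0 by symmetry, the show host's forced move lets the three cases be enumerated exhaustively, with no sampling at all. A sketch:

```python
# The contestant picks door 0; the car is equally likely behind door 0, 1, or 2.
switch_wins = 0
for car in range(3):
    pick = 0
    # Host opens a goat door other than the pick. When the pick is a goat,
    # the host is forced; when the pick is the car, his choice doesn't matter.
    opened = next(d for d in range(3) if d != pick and d != car)
    final = next(d for d in range(3) if d != pick and d != opened)
    switch_wins += (final == car)
print(switch_wins, "of the 3 equally likely cases are won by switching")  # 2 of 3
```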
 
  • #66
Not sure if this has already been covered, but I think I can explain this more clearly.

If you are given a choice between 100 boxes, and you are told by a host that one box contains £100, you have a 1/100 chance of picking the money. Every other box is empty, so this translates to a 1% chance of picking the box with the money.

So... let's say that you pick box 37 and do not yet open it. The host then opens all other boxes except one. All the boxes he opens are empty; he knew this, and so you are now left with two boxes. One box MUST contain the money. You are now given the option to switch.

Because the host knows which box has the money, it is always best to switch. Remember, when you picked your box, you had a 1% chance of getting the money, translating to a 99% chance that the money is in the box the host has left you with.
 
  • #67
Switching is NOT the best solution for this question.

Suppose you're on a game show, and you're given the choice of three doors: behind one door is a car; behind the others, goats. You pick a door, say No. 1 (but the door is not opened), and the host, who knows what's behind the doors, opens another door, say No. 3, which has a goat. He then says to you, "Do you want to pick door No. 2?" Is it to your advantage to switch your choice?

By switching, you do not have a 2/3 chance. If the question were asked BEFORE the host shows you a door, "Do you commit to switching after I show you one?", then yes, it would most definitely be to your advantage. As seen before, with the arrangements CGG, GCG, GGC, your initial choice wins only 1/3 of the time, so always switching wins 2/3 of the time. There are three variables here.

BUT the question is asked AFTER the host shows you a door, which eliminates a choice. Is it then in your best interest to change? In the problem, if the contestant chooses Door #1 and the host shows Door #3 with a goat, the only two possible arrangements are CGG and GCG. One must not be confused and include GGC, because it is not possible: the third door MUST hide a goat. There are only two variables here, thus it is a 50/50 chance. Essentially this is no different from the host opening a door initially; he is removing a variable from the equation. It is the same as opening the door you chose (assuming you didn't pick the car) and allowing you to switch to either of the remaining doors.
 
  • #68
Let:
C: the number of the door hiding the car,
S: the number of the door selected by the player,
H: the number of the door opened by the host.

P(C=2 | H=3, S=1) = probability that the car is behind the other door
= P(H=3 | C=2, S=1) * P(C=2 | S=1) / [ Σ_{i=1..3} P(H=3 | C=i, S=1) * P(C=i | S=1) ].

Wiki shows this:
(1 * 1/3) / (1/2 * 1/3 + 1 * 1/3 + 0 * 1/3) = 2/3. INCORRECT!
(1/2 * 1/3) / (1/2 * 1/3 + 1/2 * 1/3 + 0 * 1/3) = 1/2. CORRECT

There is no reason P(H=3|C=2,S=1) should equal 1. C could be behind EITHER door 1 or door 2, so this is obviously a 50/50 chance, or 1/2. Why would they say P(H=3|C=2,S=1) = 1 but P(H=3|C=1,S=1) = 1/2? It makes absolutely no sense. When both are equal to 1/2, the probability that the car is behind the other door is 1/2, just as it should be.
 

  • #69
Keep this in mind: the host CANNOT show you what is behind the door you picked, and the host CANNOT show you what is behind the door with the car (this is really the most important part).

There is no reason P(H=3|C=2,S=1) should equal 1.
P(H=3|C=2,S=1) is the probability that the host will show you door 3 GIVEN that you picked door 1 and the car is behind door 2. The host CANNOT show you the door with the car, and the host CANNOT show you the door you picked. This leaves exactly one door for him to open: door 3.

There is no reason P(H=3|C=2,S=1) should equal 1. C could be in EITHER 1 or 2,
so this is obviously a 50/50 chance or 1/2. Why would they say:
P(H=3|C=2,S=1) = 1 BUT P(H=3|C=1,S=1) = 1/2? Makes absolutely no sense.
This: P(H=3|C=1,S=1) is the probability that the host will show you door 3 GIVEN that you picked door 1 AND the car is behind door 1. In this case, the host can show you door 2 OR door 3, because you picked neither of those doors and there is a goat behind both of them.
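Plugging those conditional probabilities into Bayes' rule makes this concrete (a small sketch; the variable names are mine):

```python
# Player picked door 1 (S=1); host opened door 3 (H=3).
prior = {1: 1/3, 2: 1/3, 3: 1/3}   # P(C=i | S=1): car equally likely behind any door
# P(H=3 | C=i, S=1): the host never opens the picked door or the car door.
likelihood = {
    1: 1/2,  # car behind the pick: host chooses door 2 or 3 at random
    2: 1,    # car behind door 2: host is forced to open door 3
    3: 0,    # car behind door 3: host may never open it
}

evidence = sum(likelihood[i] * prior[i] for i in prior)   # P(H=3 | S=1) = 1/2
posterior_switch = likelihood[2] * prior[2] / evidence    # P(C=2 | H=3, S=1)
print(posterior_switch)  # prints 0.6666666666666666 — switching wins 2/3 of the time
```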
 
  • #70
Appreciate ya

Robert1986 said:
Keep this in mind: the host CANNOT show you what is behind the door you picked, and the host CANNOT show you what is behind the door with the car (this is really the most important part).


P(H=3|C=2,S=1) is the probability that the host will show you door 3 GIVEN that you picked door 1 and the car is behind door 2. The host CANNOT show you the door with the car, and the host CANNOT show you the door you picked. This leaves exactly one door for him to open: door 3.


This: P(H=3|C=1,S=1) is the probability that the host will show you door 3 GIVEN that you picked door 1 AND the car is behind door 1. In this case, the host can show you door 2 OR door 3, because you picked neither of those doors and there is a goat behind both of them.
 
