Monty Hall Problem

You are a game show contestant; Monty is the host. Monty presents you with three boxes, one of which contains a FABULOUS PRIZE!!! If you choose the correct box, you win the prize.

You choose a box. Monty, who knows which box contains the prize, opens one of the unchosen boxes that does not contain the prize. Monty then offers you the opportunity to change your selection to the other unopened box.

In order to maximize your chance of winning the prize, should you change your selection?

See the rec.puzzles FAQ http://www.faqs.org/faqs/puzzles/faq/ for the short description of the correct solution. See MontyHallSolution and the remainder of this page for additional attempts to explain or refute that solution.


This is an example of one of the categories of the InevitableIllusions.

Harder than working out how this works is explaining how it works


Part of the psychology of this, I suspect, is that people don't want to choose one door, switch, and then be thought an idiot for switching away from the right door. Perhaps the thought/feeling is that not switching and losing is somehow unlucky, while switching and losing is foolish.

AnAcceptableWayOfFailing?


The length of this page does nothing to restore my faith in the average coder's abilities in introductory mathematics. It would be funny if it weren't so disturbing.

Perhaps we should talk about ProgrammerMathSkills. (...or lack thereof. ;-)

Wow! How about CategoryInvoluntaryHumor?

How about a quantum mechanical explanation based on SchroedingersCat: One car is simultaneously behind three separate doors. According to its wave equation, there is one third of a car behind each door. You select a particular third of a car by claiming (temporary) ownership of it. Monty then performs a quantum transition that collapses the other two thirds of a car into a smaller space. You are invited to go get two thirds of a car and drive away.


I've read the correct solution to this a dozen times and still don't get it, which has led me to conclude bad things about myself. -- TheFool?

Don't feel bad. It doesn't help that so many writers confound the answer with a lot of big tables and mathematical notation, along with implications that those who don't get it are lazy, undereducated, or irrational. It really is pretty simple, but counterintuitive. You just have to approach it in different ways and wait for the "Ah-ha" to hit you. I personally found MontyHallSimulation to be an effective way of understanding it. -- AnotherFool?

A suggestion: try empiricism. Grab a buddy and play the game. Try each of the two different strategies (stay with your first choice / switch) a few dozen times. Keep track of your results. Once you've got some empirical data in hand, switch roles with your buddy and play another few dozen rounds. Then try to figure out why your results aren't 50/50. (If no buddy's handy, you can use a MontyHallSimulation.)

It amazes me that so many people are willing to defend the position "switching doesn't matter" when they've never played the game. I've found that getting people who don't understand the solution to actually play the game triggers the necessary AhHa.

Does this mean that you have tried the empirical solution yourself? Did you record data? Are you willing to publish? -- a curious AnonymousDonor

Yes, I have, many times. Mostly with a MontyHallSimulation, but sometimes with other people. My goal was to convince them (or myself) that switching is better, so I didn't bother to keep formal records. Here's an idea: why doesn't everyone try it and record their results below? Then you wouldn't have to trust my word.
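
If no buddy or simulator is handy, here is a minimal sketch in C of the same experiment (the box numbering, round count, and variable names are arbitrary choices for illustration, not anyone's official simulator). Each round it deals one game and scores both strategies against it:

 /* Minimal Monty Hall sketch: tally wins for "stay" and "switch". */
 #include <stdio.h>
 #include <stdlib.h>
 #include <time.h>

 int main(void)
 {
     int rounds = 100000, stayWins = 0, switchWins = 0;
     srand((unsigned) time(NULL));
     for (int i = 0; i < rounds; i++) {
         int prize = rand() % 3;        /* box hiding the prize      */
         int pick  = rand() % 3;        /* contestant's first choice */
         int shown;                     /* Monty opens a box that is neither the pick nor the prize */
         do { shown = rand() % 3; } while (shown == pick || shown == prize);
         int other = 3 - pick - shown;  /* the remaining unopened box */
         if (pick  == prize) stayWins++;
         if (other == prize) switchWins++;
     }
     printf("stay:   %d/%d wins (%.1f%%)\n", stayWins, rounds, 100.0 * stayWins / rounds);
     printf("switch: %d/%d wins (%.1f%%)\n", switchWins, rounds, 100.0 * switchWins / rounds);
     return 0;
 }

Expect the stay tally to land near 33% and the switch tally near 67%.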


Of course you always change your selection. To make the puzzle more intuitive, imagine there are a million boxes instead of just three, but still only one prize. You pick a box. Monty opens 999998 boxes, none of which contain the prize. Should you switch to the one box he didn't open, or should you go with your initial, one-in-a-million guess?

[Imagine you picked box 1 (out of a million). Monty tells you the prize is either in your choice, the very first box, or box number 453,824. The specific number makes it abundantly clear that your choice is random and his is very revealing.]


Of course you never change your selection. By revealing to you that the other 999998 boxes did not contain the prize, Monty increases the chance you are correct from one-in-a-million to one-in-two!

It doesn't matter either way. Whether you change your choice or not, the chances are the same that you are correct.

The above view is the classic incorrect reasoning for this puzzle. You already knew there were 999998 empty boxes in that set; knowing which ones they are has no effect on the probability of the prize being in that set of boxes.


Problem: If room R is chosen originally from a set of N rooms and room R' is revealed to you (such that R' is a "goat" room), should you choose one of the other N-2 rooms?

If N < 3, this problem doesn't make sense: with 0 rooms you can't choose anything; with 1 room you always win; with 2 rooms, if Monty shows you the other room then you've already picked the winning room. So, N >= 3.

Given a set of N rooms where it is given that exactly one of N rooms possesses the property "car" (or is set to true, if you wish) and the other N-1 rooms possess the property "goat" (or is set to false, if you wish), the probability of a room possessing the car, Pc(N) = 1/N. Thus, the expectation of choosing the "car" room, Ec(N) = 1 * 1/N = 1/N, and the expectation of choosing a goat room, Eg(N) = (N-1) * 1/N = (N-1)/N.

Thus for three rooms, Ec(3) = 1/3 and Eg(3) = 2/3.

But if R' is revealed to be a "goat" room you now know that the original expectation of one of the rooms you didn't choose housing the car, Eg(N), is now broken up amongst the N-2 rooms, giving you the probability of the other room possessing a car => Pc'(N) = Eg(N) / (N-2) = (N-1)/(N(N-2)).

Thus, for a three room game, Pc'(3) = 2/3.

Note that Pc'(N) > Pc(N) always: since Ec(N) = 1 * Pc(N), we have Pc'(N) = (N-1)/(N-2) * Pc(N), and (N-1)/(N-2) > 1 because N-1 > N-2.

The main idea here revolves around expectation. Nothing can change your original expectation to win or lose if you just played out the game ignoring all additional information. I.e. if Monty reveals a door and you close your eyes and keep playing, it doesn't change the fact that you will win 1/3 of the time. You don't suddenly win 1/2 the time.

Thus, the 2/3 expectation that you will lose must mean that the car is smeared across the other rooms somewhere. If you remove a room, the car has fewer places to be smeared across, and thus the other rooms' likelihoods of having the car increase.

[Pc'(4) = 3/8 = 37.5%. Check it out from the program below:

  Rooms (3-32767): 4
  Rounds (0-2147483647): 100000
  After 100000 rounds:
  Original picks won: 24732 times (24.732000%)
  Swapped picks won: 37603 times (37.603000%)

Pc'(5) = 4/15 = 26.666...%. Check it out:

  Rooms (3-32767): 5
  Rounds (0-2147483647): 100000
  After 100000 rounds:
  Original picks won: 20010 times (20.010000%)
  Swapped picks won: 26713 times (26.713000%) ]

Source code available at http://sunir.org/c2/MontyHallProblem/montyhal.c

-- SunirShah
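
In case that link ever goes stale, here is a rough sketch of the same N-room experiment (not the original montyhal.c - the room count is hard-coded instead of prompted for, and the names are made up). Monty opens exactly one empty, unchosen room, and the swapper moves to a random one of the other N-2 rooms:

 /* Rough sketch of the N-room game described above. */
 #include <stdio.h>
 #include <stdlib.h>
 #include <time.h>

 int main(void)
 {
     int n = 4, rounds = 100000, stayWins = 0, swapWins = 0;
     srand((unsigned) time(NULL));
     for (int i = 0; i < rounds; i++) {
         int car  = rand() % n;   /* room with the car  */
         int pick = rand() % n;   /* your original pick */
         int shown, swap;
         /* Monty reveals one goat room that is not your pick */
         do { shown = rand() % n; } while (shown == pick || shown == car);
         /* the swapper moves to a random one of the remaining n-2 rooms */
         do { swap = rand() % n; } while (swap == pick || swap == shown);
         if (pick == car) stayWins++;
         if (swap == car) swapWins++;
     }
     printf("Original picks won: %d times (%f%%)\n", stayWins, 100.0 * stayWins / rounds);
     printf("Swapped picks won: %d times (%f%%)\n", swapWins, 100.0 * swapWins / rounds);
     return 0;
 }

With n = 4 this should land near 25% and 37.5%, matching the figures quoted above; set n = 3 to recover the classic game.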


Asterisks mark which door has the prize. He will not show the door that has the prize, which is the fact that increases the odds, so this possibility has been left out. Plainly, against my own thoughts, it does work out to a 1-in-2 chance of being right.

                                                         Right   Wrong
    1)  You pick 1. He shows 2. You pick 3.  *1*  2   3              x
    2)  You pick 1. He shows 3. You pick 2.  *1*  2   3              x
    3)  You pick 2. He shows 3. You pick 1.  *1*  2   3      x
    4)  You pick 3. He shows 2. You pick 1.  *1*  2   3      x

    5)  You pick 1. He shows 3. You pick 2.   1  *2*  3      x
    6)  You pick 2. He shows 1. You pick 3.   1  *2*  3              x
    7)  You pick 2. He shows 3. You pick 1.   1  *2*  3              x
    8)  You pick 3. He shows 1. You pick 2.   1  *2*  3      x

    9)  You pick 1. He shows 2. You pick 3.   1   2  *3*     x
   10)  You pick 2. He shows 1. You pick 3.   1   2  *3*     x
   11)  You pick 3. He shows 1. You pick 2.   1   2  *3*             x
   12)  You pick 3. He shows 2. You pick 1.   1   2  *3*             x

Viewing it this way cleared it up for me absolutely - I don't know why it is not expressed elsewhere in table form. -- RodneyRyan?


Sadly, the above table is improperly constructed. There are three options for you to pick, and three places for the prize to be. 3 x 3 = 9. You are treating what he shows as an input, which it is not. The inputs are "your pick" and "location of prize". What is shown is a function of those inputs.

pick = 1 location = 1 You change from good to bad. = lose

pick = 1 location = 2 You change from bad to good. = win

pick = 1 location = 3 You change from bad to good. = win

pick = 2 location = 1 You change from bad to good. = win

pick = 2 location = 2 You change from good to bad. = lose

pick = 2 location = 3 You change from bad to good. = win

pick = 3 location = 1 You change from bad to good. = win

pick = 3 location = 2 You change from bad to good. = win

pick = 3 location = 3 You change from good to bad. = lose

3/9 lose, 6/9 win. = 1/3 lose 2/3 win. Hope this table re-clears things up for you absolutely.

No. I do not understand your point. Can you be more specific? In the table above yours I simply listed all the possible sequences of events and tallied the outcome of each. I do not see where I made an error.

OK. The first table was acceptably constructed, but the cases are listed without their corresponding probabilities, which are not equal. Lines 3, 4, 5, 8, 9, and 10 have probability 1/9. Lines 1 and 2 have probabilities which total 1/9; likewise, lines 6 and 7 have probabilities which total 1/9, and lines 11 and 12 have probabilities which total 1/9. I know this is also stated below, but read it here, think about it, and let it sink in. The penny should soon drop. -- VickiKerr


 Strategy: Stick with First Choice

              First Choice
    Prize    1    2    3
      1      x
      2           x
      3                x
Read it like this: If the Prize is at 1, and my First Choice is 1, and I stick with my First Choice, then I am correct. If the Prize is at 1, and my First Choice is 2, and I stick with my First Choice, then I am wrong.

Add it up and you get 3/9 = 1/3 chance of being right with this strategy.

 Strategy: Switch from First Choice

              First Choice
    Prize    1    2    3
      1           x    x
      2      x         x
      3      x    x
Read it like this: If the Prize is at 1, and my First Choice is 1, and I switch from my First Choice, then I am wrong. If the Prize is at 1, and my First Choice is 2, and I switch from my First Choice (i.e. from 2 to 1, since switching from 2 to 3 is impossible, since Monty already showed me 3), then I am correct.

Add it up and you get 6/9 = 2/3 chance of being right with this strategy.

Even less helpful [What about with the annotations?]

Explaining theory is not as clear as listing every possible sequence of actual events, totalling the outcome and drawing conclusions. That is why I made the You Pick, He Shows, You Pick table above. I still see it as correct, that is, it lists every possible sequence with the appropriate outcome and presents a tally of 1-in-2 chance of being correct. If I have left out or mis-stated sequences, please correct the table, which I reproduce here:

                                                         Right   Wrong
    1)  You pick 1. He shows 2. You pick 3.  *1*  2   3              x
    2)  You pick 1. He shows 3. You pick 2.  *1*  2   3              x
    3)  You pick 2. He shows 3. You pick 1.  *1*  2   3      x
    4)  You pick 3. He shows 2. You pick 1.  *1*  2   3      x

    5)  You pick 1. He shows 3. You pick 2.   1  *2*  3      x
    6)  You pick 2. He shows 1. You pick 3.   1  *2*  3              x
    7)  You pick 2. He shows 3. You pick 1.   1  *2*  3              x
    8)  You pick 3. He shows 1. You pick 2.   1  *2*  3      x

    9)  You pick 1. He shows 2. You pick 3.   1   2  *3*     x
   10)  You pick 2. He shows 1. You pick 3.   1   2  *3*     x
   11)  You pick 3. He shows 1. You pick 2.   1   2  *3*             x
   12)  You pick 3. He shows 2. You pick 1.   1   2  *3*             x


What the table is missing is the associated probabilities of the events:

                                                         Right   Wrong   Probability
    1)  You pick 1. He shows 2. You pick 3.  *1*  2   3              x       .055
    2)  You pick 1. He shows 3. You pick 2.  *1*  2   3              x       .055
    3)  You pick 2. He shows 3. You pick 1.  *1*  2   3      x               .111
    4)  You pick 3. He shows 2. You pick 1.  *1*  2   3      x               .111

    5)  You pick 1. He shows 3. You pick 2.   1  *2*  3      x               .111
    6)  You pick 2. He shows 1. You pick 3.   1  *2*  3              x       .055
    7)  You pick 2. He shows 3. You pick 1.   1  *2*  3              x       .055
    8)  You pick 3. He shows 1. You pick 2.   1  *2*  3      x               .111

    9)  You pick 1. He shows 2. You pick 3.   1   2  *3*     x               .111
   10)  You pick 2. He shows 1. You pick 3.   1   2  *3*     x               .111
   11)  You pick 3. He shows 1. You pick 2.   1   2  *3*             x       .055
   12)  You pick 3. He shows 2. You pick 1.   1   2  *3*             x       .055

If you originally chose the correct location, then Monty has two doors to choose from, hence the 50%. If you picked one of the incorrect locations, Monty's only got 1 door to choose from, hence the 100%.

Again it is unclear. You say "If you originally chose the correct location, then Monty has two doors to choose from, hence the 50%." 50% of what? If you originally chose the correct location you will always be wrong because you will always change your choice to an incorrect location after he picks another door - what is this 50% of ?

Exactly. Your original choice is correct 33.3% of the time, and wrong 66.7%. If your original choice is correct, the "always switch" strategy always fails.

But if your original choice is wrong, "always switch" always works. Monty takes one wrong choice away from you, and "always switch" takes you away from the other wrong choice.

So, unless you choose right the first time (33.3% chance), "always switch" works the rest of the time (66.7% chance). -- RobMandeville

50% of the doors he can show you. Draw a decision tree if this is unclear.

My view is based on each of the 12 sequences listed in the table above as all being equal - that is, the probability value of each statement is .083. If this is not the case, and this is what is causing my misunderstanding, please fill in the actual values in the new labeled Probability. The values must add up to 1. If the values differ please explain why.

The 12 sequences are not all equal. Those where switching is right are .111... Those where staying is right are .0555... (Monty doesn't open both doors when you guess right initially).

Hmm... try this:

  correct door    your first pick    if you stay   if you change   prob   Monty shows
       1                 1              right          wrong        1/9    2 or 3
       1                 2              wrong          right        1/9    3
       1                 3              wrong          right        1/9    2
       2                 1              wrong          right        1/9    3
       2                 2              right          wrong        1/9    1 or 3
       2                 3              wrong          right        1/9    1
       3                 1              wrong          right        1/9    2
       3                 2              wrong          right        1/9    1
       3                 3              right          wrong        1/9    1 or 2

I think this breakdown shows that if you stay you have a 1/3 chance of being right, but you have a 2/3 chance if you use the additional information Monty gives you. -- AndyPierce

Does showing the door Monty shows make this table clearer?

No, and neither of your explanations make sense. They both seem to hinge on the difference in switching or not, but according to the problem itself after Monty opens the door you always switch. Therefore, whether you stay or not, or switch or not, has no relevance to the problem. As I have asked, please label the column of Probability in the table with appropriate values to explain why the chances of any one of the 12 sequences occurring is not the same for all statements.

Okay, I filled in the probabilities in your table. My lame browser did not let me line them up properly; someone feel free to fix that.

Thanks! Now, the differences in probabilities have to do with the chance of your first guess being correct, right?

Yes - specifically, the sum of all the outcomes after you pick your door must equal the probability that you picked your door (conservation at work). But I've never managed to teach an unbeliever using that approach.

Well it worked for me - I see now what the logic is. Thanks a lot for bearing with me! I finally understand it. This is how I would explain it:

If you initially pick the door that has the prize (1 out of 3), you will switch and always be wrong. If you initially pick a door that does not have the prize (2 out of 3), Monty will open a door that also does not have the prize, you will switch and always be right. Therefore your chance of picking the correct door is (2 out of 3).

Thanks again to all for your explanations ! -- RodneyRyan?


BayesTheorem?, anyone?
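
Sure. Say you pick door 1 and Monty opens door 3, and assume (this part is an assumption) that Monty flips a fair coin when both unchosen doors hide goats. Then:

  P(opens 3 | car at 1) = 1/2,   P(opens 3 | car at 2) = 1,   P(opens 3 | car at 3) = 0

  P(car at 2 | opens 3) = P(opens 3 | car at 2) * P(car at 2) / P(opens 3)
                        = (1 * 1/3) / (1/2 * 1/3 + 1 * 1/3 + 0 * 1/3)
                        = (1/3) / (1/2)
                        = 2/3

So, given what Monty has shown under that assumption, switching wins 2/3 of the time.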


I think the reason that people so often get this wrong is that they are hoist with their own petard. The victim, eager to prove he's no JoeSixPack?, thinks as follows: "Any dummy knows the car can't suddenly switch places; switching doors isn't going to make the car jump from one room to the other. That's what I learned in college." And of course, if it were a matter of the car switching rooms spontaneously (possible according to quantum theory but very unlikely ;-)) you would be correct to stop your thinking there. But the victim, eager to prove (s)he's not an ignorant boob, jumps on the notion that the "probabilities can't change," skips the rest of the required reasoning and blurts out the (incorrect) answer.

I had trouble believing the correct answer myself until running a MonteCarlo? simulation. 10000 trials with the always-stay strategy yielded 3322 wins. The same number of trials with the always-switch strategy yielded 6652 wins, which is close enough to the theoretical result for government work.

-- DavidBrantley

I liked the explanation on rec.puzzles, which is the mate of the one in bold above. They say: The odds of you guessing WRONG in your first choice is 2/3. If you stay with your first choice, you stay WRONG with that probability. Ergo, when you switch, you move to only a 1/3 chance of being WRONG... 2/3 of being right. -- AlistairCockburn


Known to bridge players as the LawOfRestrictedChoice?.


Your original choice has a chance of 1/3 to win the prize. This doesn't change no matter what the moderator will do. Consequently, the chance that the prize is hidden behind one of the two remaining doors is 2/3.

So, now, the moderator opens one of the goat doors, but not yours. Your original choice still has a chance of 1/3 to be correct, but the two thirds for the other two doors now only apply to one door. That's why switching doors is smart.

A little ASCII art [:)]

After your original choice:

 +------+  +------+ +------+
 |  ?   |  |  ?   | |  ?   |
 +------+  +------+ +------+
                   your choice

 \-------+--------/  chance to win:
         |                1/3
 chance to win: 2/3

After Monty has opened a goat door:

 +------+  +------+ +------+
 | goat |  |  ?   | |  ?   |
 +------+  +------+ +------+
           chance to  chance to
           win: 2/3   win: 1/3

I'd pick the goat and win 100% of the time!


Label the box you picked as A, the box Monty opens as B, and the other as C. The game is equivalent to the following:

You have two choices: do not switch and get all the prize in A; or switch and get all the prize in (B and C). Switching is twice as good as not switching. Monty tells you that all the prize in (B and C) is concentrated in C alone. So switching to C is equivalent and still gets you all the prize in (B and C).

Still have problems?

Do a simulation and verify that in the long run:

(1) The total number of prizes in B and C equals that in C alone. 
(2) The total number of prizes in B and C is about twice that in A.
And conclude that
(3) The total number of prizes in C is about twice that in A.


This reminds me of the TwoEnvelopeProblem.


I understand the 2/3 vs 1/3 explanation, I believe, and I'll accept that tables and MontyHallSimulations? prove it correct. But I don't understand why the problem isn't equivalent to this problem:

You have one box; Monty has another box. One of those two boxes contains a prize; the other is empty. Should you exchange boxes?

It seems to me that the initial 1-of-3 choice in the original problem is irrelevant. The revelation of one empty box simply changes the available choices to two, without providing any other information. Can someone explain this?

Both before and after the change, Monty's unopened boxes contain the prize with probability 2/3. The "simple change" implies those probabilities are the same after the change, but now "Monty's unopened boxes" becomes "Monty's unopened box". In your alternative scenario, much the same applies, but the initial probabilities are different (both 1/2).


If you think the answer to the MontyHallProblem is that it doesn't make a difference, you are confusing it with NotTheMontyHallProblem. -- AndrewMcGuinness


I think the confusion arises because even though the probability that Monty opens one door is 1/2, the choice is not random. Once, I had a co-worker (with limited skills in programming and math) actually scream at me when I discussed this because HIS PROBABILITY PROFESSOR had told him it didn't matter if you switched or not. So much for higher education in this country. Though I believed the "doesn't matter" scenario till I ran a simulation, saw the 2/3 result for switching, and sat down to work out the conditional probabilities. -- RobertField


I had the good fortune to discuss this at length via email with one of the folks here. Here's another way of looking at it. Suppose that there is a lottery in which there are a million tickets and a million dollar prize. Winners are chosen by drawing a matching ticket from a really big hat. You have one ticket. Monty eliminates all the other tickets but one. Somehow we know that all the tickets he eliminated were not winning tickets. Now would you accept the ticket Monty offers? The answer should be yes. It should be obvious that between your ticket in one hand and all the other tickets in the other hand, that you are more likely to win when you hold all the tickets. -- MattSimpson

Unless Monty's elimination process was random, avoiding the winning ticket merely by chance.


Ok, my attempt to explain the MontyHallProblem:

When you pick a box, you have a 1/3 chance of being a prize. After Monty eliminates an empty box, the other box has a 1/2 chance of being a prize. Therefore, it pays to switch. -- SeanOleary

[No. With 1/3 chance, you pick the box with the prize, and Monty chooses one of the other boxes at random. With 2/3 chance, the prize is in one of the boxes you didn't pick, and Monty must choose the empty box to open and show you, leaving the other box, which must have the prize. So, if you always change boxes, 1/3 of the time you lose, 2/3 of the time you win. So it is favorable to always switch. -- RobertField] (Robert, isn't that what's said lower down in the para beginning "Of course"?)

And how did you know that "the other box has a 1/2 chance" was correct?

Monty has better information. For him, it's always a 50/50 split. He always eliminates an empty box.

But the probabilities have to sum to 1.

Probabilities have to sum to 1 in the same scenario. They're two different scenarios here. My scenario, where each box has a 1/3 chance of winning (3 * 1/3 = 1) and Monty's scenario, where there are only 2 boxes, each with a 1/2 chance of winning (2 * 1/2 = 1). Monty has better odds because he has better information.

But then your logic breaks down, since your choice is based on comparing probabilities arising in different scenarios. Anyway, Monty may be choosing to open either of two empty boxes - in which case, he can give each a 50/50 chance of being selected. However, in the other cases, Monty doesn't have a choice of empty box to reveal, and so a 1/2 chance doesn't arise.

Let's change the game to roulette, in order to show that comparing probabilities is valid. For the moment, let's forget about 0 and 00. I place a bet on the first dozen (one chance in three). I'm blindfolded and Monty spins the wheel. Then Monty says I'll get the same payoff if I change my bet to black or red (one chance in two). Should I switch to get the better odds?

Of course, but that's very different. In the previous scenario, the original choice has probability 1/3 of having the prize, so the probability of the prize being in the remaining box(es) is 2/3. Bringing in 1/2 for comparison purposes is simply incorrect.


I call these discussions that are not on target WinoArguments, after the endless circular discussions by drunks in bars.

Hic! 'sright, but we're learning Sean to do maff proper... Hic!


All right, after being anonymously berated for not knowing my maths, I decided to write my own simulator. Here's my revised take:

I choose one of three boxes and have a 1/3 chance of picking the correct box. That means the other two boxes have a 2/3 chance of being the correct box. Of those two boxes, Monty can always show me an empty box. Therefore the remaining box has a 2/3 chance of being the correct box.

Thanks for your attempts to edumacate [sic] me. The best advice was to try it out myself. (Now would someone else clean up this thread? I don't mind my mistake being preserved for posterity, but comparing me to a wino, tsk tsk. :-) -- SeanOleary


You don't need to switch to get 50/50 odds:

Seems to me that each of the three boxes has a 50% chance to win. This is due to the removal of a wrong box. If Monty discloses an unpicked box, and then proposes a chance to switch, he has just removed one unpicked box from the standing, leaving just the remaining two boxes (yours and the other unpicked box). You have 50% no matter what. Correct or not, the odds are now 50%, and no switch is required to raise your odds from 1/3 to 1/2 (Monty did this by removing one box from the mix).

Pick the correct box, Monty removes a wrong box, and you now have 50/50 odds. (switch and lose: stay and win) Pick a wrong box, and Monty removes the other wrong box, and you now have 50/50 odds. (switch and win: stay and lose) Switch or not, you have the same odds.

Did you read this page carefully? You are incorrect, in one of the usual ways. Granted, this page is a mess of explanations and confusion, but re-factoring it has failed before.

I see now looking at it this way. I reserve one box and Monty reserves one box and we discard ALL the rest (Monty must pick the winning box if I hadn't). Now we have gotten down to two boxes with imbalanced odds: my choice is 1/n while his is (n-1)/n. If we started with 100 boxes, my pick would be 1/100 while his has a 99/100 chance of being correct (his box can only lose if I reserved the winning box). The goal then is to first pick a losing box, then switch after he has narrowed it down. Experiment: try it with a deck of cards. Try to find the ace of spades. Pick one card. Monty eliminates all but one. Now we have 1/52 and 51/52 odds for the two cards. Pick. Chances are it's Monty's card.


Not that I really expect it to help, but I wrote up a different presentation of the problem, which may help guide someone to new understanding: http://www.whiterose.org/cluey/archives/003576.html


I'm kind of shocked. Everyone here has really complex arguments. I think I have an argument that is easier to follow, because all it uses is the LawOfTotalProbability?. I hope it's right. :) Let me demonstrate:

When we start off, we have to choose a door. The probability that we're right is 1/3, and the probability that we're wrong is 2/3. We'll choose door 1.

Now, Monty comes along and says, "Well, if you look behind door #3, you'll see a goat. Do you want to keep your door, or change it?" Now, what just happened? Monty told us something.

Wait - we know that P(Door3) = 0, because Monty told us so. The law of total probability says that the three probabilities have to total 1, and P(Door1) is still 1/3, so P(Door2) + P(Door3) must still be 2/3. With P(Door3) = 0, we must say that P(Door2) = 2/3.

So, P(Door1) is 1/3. P(Door3) is 0. P(Door2) is 2/3. P(Total) is 1.

What do you think? The reason that it's not a 50/50 chance is because these trials are not separate, they're linked. If they were totally separate, and no information was shared (choose one of these three doors for no apparent reason, then come back next week and choose one of two doors for a prize), then it would be a 50/50 chance. There would be no link via the LawOfTotalProbability?. -- DaveFayram

You're quite right. It is shocking. (And your exposition is 100% correct.) However, I prefer AlistairCockburn's explanation, which is the same as yours, but shorter. To paraphrase: the probability you chose the wrong door initially is 2/3, so the probability of being right if you switch is also 2/3. Note also the four-door variant worked out above, where Pc'(4) = 3/8.


I've often wondered if the difficulty people have in accepting the solution would be diminished if it were posed in terms of Monty's choices, not the contestant's.

Consider it this way: after the contestant has made a choice, what is the probability that Monty can leave a door closed that has the prize behind it?

The contestant had a 1/3 chance of getting it right thus leaving Monty with a 2/3 chance of leaving a door closed with a prize behind it. So after Monty has made his choice, there is a 2/3 probability that the door he left closed has the prize behind it.

Is that any easier as an explanation?

-- ChanningWalton


Many smart people have been unable to see the right answer, even when it is explained to them in many ways. (Personally, I find the empirical approach, played as a betting game with REAL MONEY, to be the quickest way to get someone to understand it. A simulation doesn't concentrate your mind as well as losing money does.) The lesson I take from people's inability to understand this is that there are likely to be MANY other substantial mistakes in quantitative judgment that even smart people make all the time. If you can identify these fallacies, you can probably use them to make a lot of money, at least for a while.

-- PeteProkopowicz


I've run it empirically and I accept the results, but I still find the theory unsatisfying - more so because people make up so many convoluted other examples in an attempt to clarify. One of the more common things I have a problem with is the claim that Monty knowing where the prize is makes a difference, which it does not. Only the actions taken and the information revealed make a difference. -- ChrisMellon?

Monty's knowledge of where the prize is does matter. After your initial choice, the door he opens never contains the prize. This can only be accomplished if he knows where the prize is.

No, it doesn't. What matters is that the door opened will not contain the prize, nor will it be the door you picked. *Why* that is the case does not matter. A gust of wind blowing a random door open will reveal exactly the same information, so long as it opens the same door that someone with knowledge would have picked.

Of course it does - Monty ALWAYS opens a door without a prize! A random door opening will miss sometimes.

No. It's a constraint of the simulation that the opened door will always be empty, and will not be the door you picked. Whether that happens by random chance or because Monty knows where the prize is doesn't matter. What matters is that the door opened is not your door, and does not have a prize. Period.

Let's try something - let's play the game with 100 doors. Choose 1, then Monty opens 98 of the doors, revealing them empty - now do you change doors? I sure would... -- NissimHadar

Missing the point... let's play the game with 100 doors. Choose one door. A crazed felon runs into the studio, screaming, then opens one door and dashes through it. That door doesn't have a prize. Monty is freaked out and can't remember which door you picked, so he lets you pick again. You remember the door you picked before, so do you choose that door (which is not the one the felon opened), or one of the other 98 doors?

AhHa, I see. To paraphrase, is your point that the contestant, despite having no knowledge of why Monty chose that particular door to open, still has enough information to choose the better option (switch)? In that sense, the constraint that Monty always chooses a prizeless door to open is unnecessary to understanding the solution to the problem.


Let's simplify this to basics. Which is the better option, to choose Door Number 1 or to choose Door Number 2 and Door Number 3? This is the underlying choice in this game. When you initially choose Door Number 1, there is a 1 in 3 chance you are correct. You already know that at least one of Door Number 2 and Door Number 3 has no prize behind it. Revealing one of them to be empty and then offering the other is equivalent to immediately offering the choice of both.


Come on, guys. This is basic conditional probability. Your first pick is right with probability 1/3 and wrong with probability 2/3; if you swap, you win exactly when your first pick was wrong, and if you stay, you win exactly when it was right.

So, if you swap, the probability of winning is (1/3 * 0) + (2/3 * 1) which is obviously 2/3.

And if you don't, the probability of winning is (1/3 * 1) + (2/3 * 0) which is obviously 1/3.

Therefore, it is always better to swap. Intuitively, you are more likely to have picked a losing box on your first pick.


See MontyHallSimulation


