Monday, 9 October 2017

Monty Hall - Stick or Switch? It Depends How Often You Can Play

The Monty Hall problem is back in the news again, or at least in the weekend edition of the Financial Times, I think because Monty Hall died recently. Here's the problem:
You're on a quiz show with a host, Monty. There are three doors, A, B and C. Behind one door is a car, and behind each of the other two is a goat. You get to nominate a door, and then Monty will open one of the other doors and ask you if you want to change your choice. What you know is that Monty never opens the door with the car behind it. Never. Should you change your choice?
The answer, given by Marilyn vos Savant, is that you should, because in two-thirds of cases switching wins the car. When she gave that answer, the wrath of a zillion statisticians and mathematicians descended on her. Here's her argument: there are three options (in door order A, B, C):

  1. Car Goat Goat
  2. Goat Car Goat
  3. Goat Goat Car 
If you pick A, you lose by switching in option 1 and win by switching in options 2 and 3. Pick B or C and the same pattern holds: switching wins in two of the three options. Take the odds and switch. At least when you have the opportunity to play the game over and over.
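
Her table is easy to check mechanically. Here's a quick Python sketch (the names and layout are mine, not anything from her column or the FT piece) that fixes the pick on door A, lets Monty open a goat door, and counts the options where switching wins:

    # Enumerate vos Savant's three options with the contestant fixed on door A.
    OPTIONS = [
        ('Car', 'Goat', 'Goat'),   # option 1
        ('Goat', 'Car', 'Goat'),   # option 2
        ('Goat', 'Goat', 'Car'),   # option 3
    ]

    switch_wins = 0
    for doors in OPTIONS:
        pick = 0  # door A
        # Monty opens a door that is neither the pick nor the car.
        monty = next(i for i in range(3) if i != pick and doors[i] != 'Car')
        # Switching means taking the one remaining unopened door.
        switched = next(i for i in range(3) if i not in (pick, monty))
        switch_wins += doors[switched] == 'Car'

    print(f'switching wins in {switch_wins} of {len(OPTIONS)} options')  # 2 of 3

In option 1 Monty has a choice of two goat doors; the sketch just opens the first, which doesn't change the count.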

What happens when you can only play once? Choose A and suppose that Monty opens door C to show a goat. Now you know there are only two options:

  1. Car Goat Goat
  2. Goat Car Goat
In this case, the odds are 50-50 for switching. Why? Because you don't have the third option of Goat-Goat-Car, which would have forced Monty to open door B.

Play the game over and over, and switching will win more often. Play once, and it’s a flip of the coin, so you may as well switch, since the odds are the same. There’s a winning strategy for multiple plays, but not for a single play.
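
The repeated-play half of that is easy to simulate. A rough sketch, with a trial count picked out of the air, comparing the two strategies over many games:

    import random

    def win_rate(switch, trials=100_000):
        """Play many games and return the fraction won by the given strategy."""
        wins = 0
        for _ in range(trials):
            doors = ['Car', 'Goat', 'Goat']
            random.shuffle(doors)
            pick = random.randrange(3)
            # Monty opens a door that is neither the pick nor the car.
            monty = next(i for i in range(3) if i != pick and doors[i] != 'Car')
            if switch:
                # Switch to the one remaining unopened door.
                pick = next(i for i in range(3) if i not in (pick, monty))
            wins += doors[pick] == 'Car'
        return wins / trials

    print(f'stick:  {win_rate(switch=False):.3f}')   # roughly 0.333
    print(f'switch: {win_rate(switch=True):.3f}')    # roughly 0.667

Run it and the switcher ends up with the car about two-thirds of the time, which is the many-plays result described above.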

Damn that’s clever.

Statistics is not only hard; it also only applies when you can repeat the experiment.

What about all the other arguments, including one quoted on Wikipedia that says this:
By opening his door, Monty is saying to the contestant 'There are two doors you did not choose, and the probability that the prize is behind one of them is 2/3. I'll help you by using my knowledge of where the prize is to open one of those two doors to show you that it does not hide the prize. You can now take advantage of this additional information. Your choice of door A has a chance of 1 in 3 of being the winner. I have not changed that. But by eliminating door C, I have shown you that the probability that door B hides the prize is 2 in 3.'
Here's the mistake: "the probability that the prize is behind one of them is 2/3" should read "the probability that the prize is behind one or other of them is 2/3". No argument that tries to establish that switching always gives a 2:1 advantage can be right, because when you can only play once the odds are 50-50.

On a one-shot play, sticking is as good as switching.

And in the TV show, you only got one shot.
