silverscreenselect wrote:They've started a weekly contest, posted each Friday at
http://www.fivethirtyeight.com, in which they pose a puzzle that requires math or logic to solve. They post the answer the following Friday. You don't get any prize for winning other than a shout-out on their website if you are the "big" winner, but at least this week's puzzle is challenging.
Two players go on a hot new game show called “Higher Number Wins.” The two go into separate booths, and each presses a button, and a random number between zero and one appears on a screen. (At this point, neither knows the other’s number, but they do know the numbers are chosen from a standard uniform distribution.) They can choose to keep that first number, or to press the button again to discard the first number and get a second random number, which they must keep. Then, they come out of their booths and see the final number for each player on the wall. The lavish grand prize — a case full of gold bullion — is awarded to the player who kept the higher number. Which number is the optimal cutoff for players to discard their first number and choose another? Put another way, within which range should they choose to keep the first number, and within which range should they reject it and try their luck with a second number?
Intuitively, you would think the answer should be .5, because at that point your odds of improving are 50/50. But if your opponent adopts that strategy, he's going to wind up with a final number higher than .5 about 75% of the time (he keeps a first number above .5 half the time, and half of his redraws land above .5 as well), so if you stand on .5, you rate to lose. So your answer should be some number higher than .5, but I'm not sure what it is right now.
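That 75% figure is easy to sanity-check with a quick Monte Carlo run (my own sketch, not from the thread; the helper name final_number is hypothetical), assuming the "keep the first draw if it is at least the cutoff, else redraw" strategy described above:

```python
import random

def final_number(cutoff, rng=random):
    """Play one round with the given cutoff strategy:
    keep the first uniform draw if it is at least `cutoff`, else redraw."""
    first = rng.random()
    return first if first >= cutoff else rng.random()

# Estimate how often a cutoff-0.5 player ends up above 0.5.
trials = 200_000
above = sum(final_number(0.5) > 0.5 for _ in range(trials))
print(above / trials)  # ≈ 0.75, i.e. 1/2 (keep a high first draw) + 1/4 (lucky redraw)
```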
Here's the link to the puzzle:
http://fivethirtyeight.com/features/can ... game-show/
That analysis can't be right as stated. If you and your opponent adopt the same strategy, your odds of winning must be 50/50 by symmetry. The problem with the argument is that if you decide to redraw when you get 0.5, the redraw will decrease your chances of winning half the time.
I don't have time to work through the analysis right now, but I think the right approach is to assume that you choose x, fixed, as your cutoff and then let your opponent choose y, variable, as hers. You want to choose the x that maximizes your minimum (which will occur when your opponent also chooses x). I believe Nash's Theorem still applies in this setting, in which case that choice of x is the best you can do.

You then have 4 possibilities to consider -- you and your opponent each do or do not end up with a first number below the chosen cutoff. In each case, with a little care you can figure out the likelihood that you win. The values of x and y tell you how likely you are to fall into each of the four cases, and your likelihood of winning then falls out as a function of x and y.

You probably have to do the analysis twice, once assuming x < y, and once assuming x > y.

--Bob
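Bob's cutoff-versus-cutoff framing can also be checked numerically before doing the algebra. Below is a rough Monte Carlo sketch (my own, not from the thread; final_number and win_prob are hypothetical helper names): it estimates the win probability for a cutoff-x player against a cutoff-y player, then does a coarse best-response search. The equilibrium cutoff is wherever the best response to y equals y itself; this only approximates it on a grid, it is not the full minimax derivation.

```python
import random

def final_number(cutoff, rng):
    """Keep the first uniform draw if it is at least `cutoff`; otherwise
    discard it and keep a second uniform draw."""
    first = rng.random()
    return first if first >= cutoff else rng.random()

def win_prob(x, y, trials=50_000, seed=1):
    """Estimate P(a cutoff-x player beats a cutoff-y player) by simulation."""
    rng = random.Random(seed)
    wins = sum(final_number(x, rng) > final_number(y, rng)
               for _ in range(trials))
    return wins / trials

# Crude best-response search: against a few fixed opponent cutoffs y,
# find the x on a coarse grid that maximizes the win probability.
grid = [i / 20 for i in range(21)]
for y in (0.5, 0.6, 0.7):
    best = max(grid, key=lambda x: win_prob(x, y))
    print(f"opponent cutoff {y}: best response on grid ~ {best}")
```

As a design note, reusing one seeded random.Random per estimate makes the grid search deterministic, and the symmetry check win_prob(x, x) ≈ 0.5 is a useful sanity test of the simulator.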
"Question with boldness even the existence of a God; because, if there be one, he must more approve of the homage of reason than that of blindfolded fear." Thomas Jefferson