Deal or No Deal Algorithm
Posted December 20th, 2005 @ 11:53am by Erik J. Barzeski
Carey and I watched the first episode of "Deal or No Deal" last night. We watched a dumb woman pass up $171,000 for a 1-in-3 (2-in-6) chance of winning either $300k or $500k. The other four choices were $50k, $7500, $500, and $100 or something like that. The lady, who had never owned a house, ended up with $25,000 (her briefcase had something small; $500 I think).
Anyway, towards the end of the game, the banker's offer (I'm not gonna explain the show - look it up or something eh?) was pretty close to the average of the remaining choices. In fact, when only two cases were left, they were $50,000 and $1,000 (I think), and the banker's offer was the $25k the woman took. At the beginning of the game, with 10 "small" numbers (<$1000) and 10 "big" numbers, on average, remaining, the banker's offer is relatively small compared to the average to urge contestants onward. Last night's first offer was $17,000 - not bad money for five minutes of work.
I'm wondering if "the banker" is simply following some algorithm that weights the number of possible payouts remaining, and I'm wondering when an observer of the show will figure out what that algorithm is. Is there an algorithm? How much do the intangibles - namely, the psychology of greed and gambling - play into it? Is the algorithm (again, if it exists) flexible or is it simply mathematically based to attempt to minimize the payout based on the odds each stage of the game presents?
I've never had any mathematical training in such an area or I'd look into this myself. As such, I can only sit back and hope to read about someone else cracking the algorithm, if there is an algorithm to be broken, that is.
Posted 20 Dec 2005 at 12:27pm #
Wikipedia has some information on the math.
http://en.wikipedia.org/wiki/Deal_or_No_Deal
Posted 22 Dec 2005 at 8:54pm #
Hi,
One of my Artificial Intelligence courses back in University had a section that dealt with this… not to mention our statistics courses in math and psychology. A basic game theory algorithm will include both the value of the prizes as well as the probabilities. I think we also looked at algorithms that take less tangible things like psychology into account.
The banker does use a computer to crunch the numbers, but there does seem to be a "fudge factor" that can help skew the offers in favor of the producers depending on the contestant. That's why they have so many breaks where the host asks the contestant questions to get to know them; it helps the producers determine what the person is likely to do next. I would assume that the computer program they use has sliders to match the contestant, sort of like in computer games where you can customize a character.
Posted 06 Mar 2006 at 4:09am #
I'm not sure how much of a "fudge factor" there is in the game. There is an online version of the game at the NBC website, and obviously there is some algorithm determining the banker's offers (though theoretically there could be some randomization thrown in).
It's definitely the average of the remaining values when you get down to two or three cases left, but I've played the online game a few times and the first round or two is definitely not average, or median, or average minus the largest and smallest values.
My guess is that early on in the game the algorithm is looking not only at what possible amounts remain in play for the player's case, but also at what amounts were just revealed.
Posted 07 Mar 2006 at 12:06am #
This formula works pretty well for the 3 games I've watched so far:
banker's offer = average value * turn number / 10
This works out so that for the first half of the game or so, the banker's offers are just lousy, so they can egg the player on (the longer a player goes, the fewer players there are per show, so the less money they have to pay out).
It is definitely not exact, and I think there are either some factors that I'm not taking into consideration, or the banker may actually have some input to the algorithm based on his perception of the current player's willingness to take risks.
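Here's that guess as a quick Java sketch (the turn numbering is my assumption, nothing the show confirms):
public class OfferGuess {
    // George Jones's guess: offer = (average of remaining amounts) * turn / 10.
    static double guessedOffer(double[] remaining, int turn) {
        double sum = 0;
        for (double amount : remaining) {
            sum += amount;
        }
        return (sum / remaining.length) * turn / 10.0;
    }

    public static void main(String[] args) {
        // With $50k, $75k and $100k left (average $75k) on what I take to be turn 9:
        System.out.println(guessedOffer(new double[] {50000, 75000, 100000}, 9)); // 67500.0
    }
}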
Posted 15 Apr 2006 at 12:38am #
The banker's algorithm takes the average winning value at any given point in the game (the sum of the amounts in the unexposed cases divided by the number of those cases), multiplied by a sliding scale factor. That sliding factor severely pushes below the average winning position at any given state early in the game, then approaches the true average when the contestant is down to four or fewer cases.
In some rough comparisons against a quick PC model I built, it appears that the banker's algorithm is computed as a percentage of the average winning value in this approximate sequence:
11%, 15%, 22%, 37%, 70%, 90%, 100%
If you plot those numbers on an Excel graph, you'll find they make a rather clear tilted S shape, so even if the numbers aren't precise, I'm confident they're close enough to demonstrate that it is a predetermined formula. The extensive profile contestants fill out probably contributes to a tweaked version of that graph that includes an estimate of a given contestant's risk aversion.
The wikipedia discussion on the probability, point-in-time values, and probability distributions is very interesting...
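If those percentages are close, the offer calculation would be a simple lookup. A rough Java sketch (which percentage applies to which round is my guess, not something I've verified):
public class ScaledOffer {
    // David W's estimated percentages of the expected value, by stage of the game.
    static final double[] SCALE = {0.11, 0.15, 0.22, 0.37, 0.70, 0.90, 1.00};

    static double offer(double averageRemaining, int round) {
        int i = Math.min(round, SCALE.length) - 1;
        return averageRemaining * SCALE[i];
    }

    public static void main(String[] args) {
        // With all 26 amounts in play the average is about $131,477:
        System.out.println(offer(131477, 1)); // about $14,462
    }
}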
Posted 08 May 2006 at 7:05pm #
HELP! Can you help solve a fight that I'm having with my brother regarding the way to calculate the probability of getting the million dollar case in DEAL OR NO DEAL? My brother, who's an engineer, claims that the probability of getting the $1 million is 1 in 27 (and that seems correct to me). But he claims that the probability remains the same (1/27) even if all but 4 cases have been opened - that the probability remains 1 in 27 even after the 22 other cases have been opened and none of them have the $1M in them. I believe that the probability adjusts after each case is opened... which means that with four cases remaining there is a 1/4 chance that the $1M is in the contestant's case. I've pointed out the extreme of two remaining cases... I would say that there is a 50% chance that the contestant has the $1M, but he claims that the probability of the $1M being in the contestant's case is still 1/27 and that it makes more sense, based on probability, to switch cases. My view is that that's crap... it's a 50/50 chance at that point.
HELP me solve this family argument! Remember he's a Mech Engineer so you have to talk very slow and explain things in tremendous detail.
Marty
Posted 31 Dec 2011 at 10:45am #
I know it's been years, but curiosity brought me to your post. Your brother is right. Since your choice of case is made in the beginning, revealing cases has no actual effect on your actual chances of winning statistically. Sure, at that point in time with four cases left you can deduce that you will win 25% of the time. However, your choice was not affected. Just because you were lucky enough for non-million cases to be revealed doesn't change the fact that the choice was made with 1/27-ish odds.
Posted 29 May 2015 at 3:47pm #
Hey Marty, I came across this blog doing some research on basic probability theory and would like to take the opportunity to make things a little clearer for you. On the surface, from a straightforward probability perspective, your brother is absolutely correct: nothing changes the initial probability of 1 in 26. However, if the question is one of valuation at the end of a process, then the process itself must be considered.

Some responses have referred to the Monty Hall problem, and they have accurately deduced that should you find yourself in the enviable position of having two cases left, one with $1,000,000 and one with $1, and the process that got you there was a Monty Hall process - where all the cases were removed by someone who has knowledge of their value and who wanted to leave you with this choice - then you must exchange your case for the one remaining, as your case has a 1 in 26 chance of holding the million while the case left by the host has a 25 in 26 chance.

Fortunately, in Deal or No Deal things are more clearly defined by the process that gets you to this choice. When the process begins you choose one case out of 26, giving you a 1 in 26 chance of choosing $1,000,000. In the next step of the process you randomly remove 6 cases out of the remaining 25; the odds of removing the million during this step are 1 in 4.16. After beating those odds you must remove 5 of the remaining 19 cases, and your odds of removing the million here are 1 in 3.8. Your next round has odds of 1 in 3.5, then 1 in 3.63, 1 in 3.5, 1 in 5, 1 in 4, 1 in 3, and finally, in round nine, to get to your case and one remaining on stage you must remove 1 in 2 and once again not remove the million. The chances of you going through this process and not removing the million, but rather leaving it as the sole remaining case on stage, are 1 in 84,353 - hence your original valuation of 1 in 26 is a much better value. Keep the case, don't swap.
Posted 09 May 2006 at 4:06pm #
There are two methods of calculating probability, one with memory and one without. You are both right. 🙂
Posted 11 May 2006 at 3:13pm #
Marty, I feel like I have to post... with 4 remaining cases, the prob is 1/4 EXACTLY. Engineers like limits, so ask your brother: what is the prob of $1M provided all cases except one have been opened - 100%, right? Granted that, there is a disjunct from 2 cases (1/27) to 1 case (100%) and a consequent contradiction. Beyond that, there are obvious intuitive problems with his assumption, plus it contradicts the premise of the show - not that shows are logical.
Posted 18 May 2006 at 3:04pm #
Marty, your brother is correct. See The Monty Hall Problem for reference.
Posted 18 May 2006 at 11:50pm #
OK, here's a weird question... my gf and I have been arguing about this for about an hour:
is it possible for the banker to offer you an amount that is currently on the board?
My answer is yes - as long as this is based on a formula, it is possible that the offer could happen to be an amount that is on the board. However, there is heated argument in the room as to whether this is the case...
Can someone clear this up?
Posted 30 May 2006 at 3:23pm #
I'm a student in a statistics course, and as a final project I am to come up with an algorithm for the game show... not exact, of course, but within a close range of the banker's offer. I doubt I'll come even close... but if some other genius student finds an accurate one, I'll be sure to post it.
Posted 19 Jul 2006 at 10:38pm #
To Marty,
your brother is right. I am an engineer also. =)
The probability of getting the 1 million dollars is 1 out of the total number of suitcases. Once you select your suitcase, it doesn't change as you open the remaining suitcases. The show wants you to have the false belief that the probability increases or decreases as you open the remaining suitcases, but it doesn't, because you already selected your suitcase. The probability would only increase or decrease if you could always change your suitcase after each elimination. Since the game doesn't allow you to change suitcases after each elimination, the probability won't increase. Get my point?
Posted 24 Jul 2006 at 8:06am #
Marty, your brother is absolutely right. That's why the game is stupid when probability comes into play. It never matters how many cases will be opened, since the chosen briefcase had already been chosen out of 27 briefcases.
Though the probability is not on the player's side, the player could take advantage of the offers being made by the banker.
Posted 17 Sep 2006 at 2:10pm #
I'm sorry, but the engineer has some basic problems with logic. If you were to begin each turn with all cases back in play, then yes. But you don't, and this brings the equation back to the present: if you have 2 cases left, $1 and $1,000,000 in play, then at this point in time you have a 50/50 chance. Odds for this purpose are current, not historical. If you were to place a bet in a horse race with 20 horses (assuming they were physically equal, etc.) and then 18 horses were scratched from the race, you would now have a 50/50 chance of your horse coming in first, and your bookmaker's odds would drop below 2/1 as opposed to below 20/1 (to allow for profit). It would be totally irrelevant how many horses "were" in the race!
Posted 24 Sep 2006 at 11:52am #
A big difference between DOND vs. the Monty Hall problem is that Howie doesn't open all but one of the suitcases with prior knowledge of their contents. The contestant picks the suitcases. In the Monty Hall problem, if you have a million curtains and the contestant picks one, Monty can reveal what's behind all but one of the other ones, and the assumption is that he does NOT do so randomly--if the contestant hasn't picked the grand prize, Monty always reveals all the other curtains EXCEPT the one with the grand prize...which is a pretty boring job 999,999 times out of a million, and in that case of COURSE the contestant should switch curtains at the last minute. The problem is set up differently in DOND, and the expected outcome is based on the information you know, which in this case has no interference from the host and his implied contract. That's what I don't like about most descriptions of the Monty Hall problem: they forget to state the crucial piece of information that Monty is NOT as clueless as the contestant--the assumption is that he knows which curtain has the grand prize and NEVER reveals that one.
Posted 30 Sep 2006 at 11:38am #
A complete algorithm or Casio program would be great (even if an approximation has to be made on the banker's offer, which seems to me to be the mean less a certain amount). Any ideas, anyone? Thanks.
Posted 11 Oct 2006 at 1:17pm #
On the question of one chance in twenty-seven: this remains fixed throughout the "event". The "event" is defined by the action of taking the one case AND KEEPING IT. The die is cast, so to speak. This one action defines the complete universe. IF one could ALWAYS and MUST return the case and THEN choose another case EACH round of bargaining, THEN and ONLY then would the odds shrink as cases were eliminated. So the "odds", in this universal context, NEVER change in spite of what the host of the show says. I hope this helps.
Posted 19 Oct 2006 at 12:09am #
AMEN, very true, just hard to wrap your mind around.
Posted 26 Oct 2006 at 5:00am #
Reply to the horse racing analogy...
The game show is more in line with you betting on a horse at 20-1 odds. Then, if I tell you that 10 specific horses will definitely not win the race, but will still race, does that change your horse's odds of winning? No, it doesn't. The odds are still 20-1.
Posted 26 Oct 2006 at 10:43pm #
This problem is not the same as the Monty Hall problem. Suggesting that the probability that your original case contains 1 million is always 1/27 regardless of what is revealed to be in the other cases is just absurd. For example, if you pick a case there is a 1/27 chance that it contains 1 million dollars. What if you then eliminate a case which is revealed to have 1 million dollars? The probability that the case you chose contains the 1 million is now zero. I can only assume that the people who are suggesting that the probability is always 1/27 have never seen the show and don't realize that the cases are opened when they are eliminated and the contents revealed to the player and audience.
Posted 27 Oct 2006 at 11:20am #
Isn't it true that the probability of picking the 1 million was 1/27 - that is true and never changes.
But the odds that you have the 1 million do change as the suitcases are revealed and you receive more information about the 'game'. The odds increase - 1/26, 1/25, etc. - until there are the two final suitcases and you have a 1/2 chance of having the 1 million suitcase.
There are two different things being discussed.
Posted 11 Nov 2006 at 7:25pm #
"But the odds that you have the 1 million does change as the suitcases are revealed and you receive more information about 'game'. The odds increase 1/26, 1/25, etc. until there are the two final suitcases and you have a 1/2 chance of having the 1 million suitcase?"
Yes, but that assumes that the 1 million suitcase hasn't been opened yet. The odds increase as more cases are opened, but they drop instantly to zero if you hit the 1 million case.
In 25 out of 27 cases, the 1 million suitcase will be opened before we have only two cases left.
This means there's a 26/27 chance that you do not have the million case. (25 where the million is found before there are two suitcases left + the one where the million is in the one you didn't choose.)
Posted 14 Nov 2006 at 5:07pm #
Wow, I was 100% convinced the engineer was right, but after really thinking about it, not so much.
Monty Hall doesn't really apply here since THE PLAYER IS THE PERSON CHOOSING THE 25 CASES TO BE OPENED. In Monty Hall, the game show opens a curtain with 100% certainty that they will not open the winner. Here, if the million is on the board and not in your possession, 25/26 times you'll open it before the end of the game. So, there is a 1/26 chance you chose to leave the million dollar case on the board and a 1/27 chance it wasn't there to begin with.
Here is what Deal or No Deal is doing. The player first chooses 1 from 27 (obviously 26/27 it's wrong). Then, the player is just choosing another case from the remaining 26 (this time by the ratings-friendly process of eliminating 25 cases). 25/27 times it's not in either, which is not possible in the Monty Hall example. As unlikely as it is that you chose correctly on your first try, it's almost equally unlikely that you chose correctly through the elimination process of the remaining cases.
This is VERY VERY different from THE HOST removing 25 that he knows 100% are not the million. If that were the case, Monty would apply and you should switch.
Sorry, not sure if it's exactly 50/50, but I think you're closer to being correct than your engineer brother. 26/27... way off.
Posted 03 Dec 2006 at 4:34am #
Holy cow! The odds being played here are simple. I can't comprehend that there is a discussion about this lasting this long. Please folks, watch the game once before replying to this thread. Each time the contestant chooses between deal or no deal, there is a new set of odds (as some have already pointed out). Marty, why don't you ask a stats engineer? (Hey, if I'm a maintenance engineer, do I qualify to answer this question? OK, a bit facetious, I'm sorry.)
My question... what is the Deal or No Deal algorithm? Any ideas?
Posted 04 Dec 2006 at 10:20pm #
There are 26 suitcases, not 27.
The chances of you winning 1,000,000 are 1/26.
As you start playing the game, these odds don't change, but the probability that you did indeed win does change based on the number of cases opened.
So as the game progresses, the likelihood that you have won indeed does change; however, your overall probability of winning does not change.
It appears that people are talking about two different things:
1. The overall probability that you will walk away with 1,000,000 is 1/26, and this doesn't change.
2. The probability that you will walk away with 1,000,000 once you start opening cases changes, but this is only a subset of the overall 1/26 odds.
Posted 05 Dec 2006 at 3:43pm #
You should all read this world-class research paper on European versions of DOND. Not only do they study participant behavior, they also estimate a functional form that explains the banker's offer quite well.
http://papers.ssrn.com/sol3/papers.cfm?abstract_id=636508
By the way, for those worried about the odds of winning $1 million, it turns out to be path dependent. What you really want to know, is what's the probability that the banker is going to offer you $1M at any point in the game. The paper clearly shows that the only way this would ever happen is if you chose the right case at the very beginning AND had the guts to say "No Deal" every time. So, in my estimation, it's a lot less than 1/N. You're not likely to see someone say "No Deal" when the two cases left in play are $1M and $.01 with a $500K offer from the bank (which the algorithm predicts, by the way).
Posted 28 Dec 2006 at 5:17pm #
A Java solution:
package test;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Random;

public class DealOrNoDeal {

    // The 26 amounts on the standard US board, from $0.01 up to $1,000,000.
    static List<Double> amounts = new ArrayList<Double>(Arrays.asList(
            0.01, 1.0, 5.0, 10.0, 25.0, 50.0, 75.0, 100.0, 200.0, 300.0,
            400.0, 500.0, 750.0, 1000.0, 5000.0, 10000.0, 25000.0, 50000.0,
            75000.0, 100000.0, 200000.0, 300000.0, 400000.0, 500000.0,
            750000.0, 1000000.0));

    public static void main(String[] args) throws Exception {
        if (args == null || args.length == 0) {
            simulation();
        } else {
            interactive();
        }
        System.out.println("\nGAME OVER!");
    }

    // Removes a random case each pass, printing the expected value as it goes.
    static void simulation() {
        Random random = new Random();
        while (!amounts.isEmpty()) {
            System.out.println("Expected value: $" + computeMean());
            int index = random.nextInt(amounts.size());
            System.out.println("\nRemoving: $" + amounts.get(index));
            amounts.remove(index);
        }
    }

    // Lets you type in each amount as it is revealed on the show.
    static void interactive() throws Exception {
        BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
        while (!amounts.isEmpty()) {
            System.out.println("Expected value: $" + computeMean());
            System.out.print("\nPlease enter value to remove: $");
            Double in;
            try {
                in = Double.valueOf(br.readLine());
            } catch (NumberFormatException nfe) {
                System.out.println("Input must be a number.");
                continue;
            }
            if (!amounts.remove(in)) {
                System.out.println("Input value not found.");
            }
        }
    }

    // The expected value is simply the average of the unopened amounts.
    static double computeMean() {
        double sum = 0;
        for (double amount : amounts) {
            sum += amount;
        }
        return sum / amounts.size();
    }
}
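Run it with no arguments (java test.DealOrNoDeal) to watch a random simulation, or pass any argument to type in the amounts yourself as they come off the board; either way it prints the expected value of the remaining cases before each removal.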
Posted 29 Jan 2007 at 8:41pm #
I see discussions about the probability of having a million dollars in the case that was originally selected. But after it's selected, the decisions are those of risk/reward.
The "board" at any time has a "value" based on the remaining amounts. The player has to balance the banker's offer against potential changes to the board's value (which impacts the next banker's offer), given the number of cases that have to be opened.
This is easier to calculate towards the end of the game, and you even hear contestants discussing this. But the complexity of trying to do so when opening more than 2 cases makes it very difficult without a computer (which the "banker" has).
Adding to the complexity is that, while there are 26 initial cases, their individual "values" are not evenly distributed. That pulls the mean way off of the median; a fact likely to be lost on the average player/observer.
Anyway, it is a fun way to get the old squash some exercise...
Posted 12 Feb 2007 at 2:05pm #
I understand that everyone is trying to discuss the statistics of this game and whether you gain statistically as the game nears its end.
To clear this up, let's start with this simple explanation: if you do not have the million dollars in the first case you choose, you absolutely cannot win 1 million dollars. The banker will never, ever offer you 1 million dollars; even if the last two numbers on the board are $750,000 and $1,000,000, he will probably offer you something around $875,000.
Knowing this, your chance of winning a million dollars rests completely on whether your case has $1 million in it, which you decide at the very beginning of the game. There are 26 cases at the beginning and you must choose one, so your chance of winning a million dollars is 1 in 26.
Let me repeat this: IF YOU DO NOT SELECT THE $1 MILLION CASE AS YOUR CASE AT THE BEGINNING OF THE GAME, YOU CANNOT WIN $1 MILLION!
Remember that in the game you can never switch cases!
Posted 17 Mar 2007 at 9:14am #
Has anybody implemented the game with a GUI in Java?? ❓
I need the script, please!!
Posted 17 Mar 2007 at 4:16pm #
[quote comment="40159"]I need the script please!![/quote]
Gee, got a homework assignment due soon or something? 🙁
Posted 19 Mar 2007 at 10:40pm #
So far George Jones seems to have the closest formula.
At least based on tonight's episode with 50k, 75k, 100k remaining - offer was 67k. GJ's formula comes up with 67,500.
With 50k and 75k the offer was 62k. GJ's formula gives 62,500.
banker's offer = average value * turn number / 10
Posted 26 Mar 2007 at 5:22pm #
[quote comment="40165"][quote comment="40159"]I need the script please!![/quote]
Gee, got a homework assignment due soon or something? :-([/quote]
Well, I need to see the script. Is that so bad?? It doesn't mean that I have an assignment to fulfil or somethin'...
I want to see how it should be implemented because I am a "feeble" Java programmer. Simple & plain!
In any case, thanks a lot!!
Posted 26 Mar 2007 at 10:03pm #
Just a tidbit while I'm still reading the Post et al. paper. If you picked the million dollar case in the beginning, then the million dollars would always be factored into the bank offer. This would make it hard to proceed, say, if you had knocked out more of the right side of the board than the left and your suitcase looked like it had a greater chance of being low.
Also... the people on this show do not appear intelligent enough to discern this. The show requires only a minimal amount of intelligence - pick a number and then say deal or no deal. The producers (Dutch, originally) are smart to pick people who do not take an analytical approach to the show. I am guessing that people who often play the lottery are those who are screened for the show.
Finally, the ideal game strategy (after seeing 2 episodes) for me would be to be offered 100k+ and then take the money. I am interested in seeing the average winnings of people who were offered 100k+ but went no deal. I have a hunch it is less than 100k.
Posted 28 Apr 2012 at 3:53pm #
No, the Monty Hall solution is biased because the host causes one of the three choices, which he knows is a bad choice, to be revealed; this fact means statistically you should switch. In Deal or No Deal the banker has no idea which case has the top prize, nor does he influence the contestant's choices of elimination. The actual fact in Deal or No Deal is that the remaining cases cause the probability percentage to change: two cases is 50/50. It only remains 1/27 if you do not reveal the contents, thus negating the game in its entirety. The only way your brother is correct is when he refers to the initial odds of choosing the top prize. After that you are referring to the remaining odds of what may be in your case. Would he quote the same odds if you started the game with 27 cases but 23 of them were already open, revealing the contents as not being one million dollars? Also, there is a guy on here who watched 3 episodes and assumes the banker knows which case the million is in - not true.
Posted 27 Mar 2007 at 1:42am #
I realize that you guys have been hacking around on Marty's probability question for a while, but it strikes me that no one is entirely right. I teach stats. Maybe someone else who teaches stats would pitch in also to confirm.
After you pick the case, the PROBABILITY of having picked the $1,000,000 case is either 0 or 1. Either you picked it or you didn't. It's just as if I flipped a coin but kept my hand over the result. The probability isn't .5 that the coin is heads. The probability is either 0 or 1. Once I lift my hand, you'll know which probability is correct.
The issue here is CONFIDENCE. Upon choosing your briefcase, your level of confidence that the case contains $1,000,000 is 1/26. As briefcases are opened, your confidence level will change to reflect the new information you receive. If briefcases other than the $1,000,000 case are opened, your confidence will increase to 1/20 after the first round, 1/15 after the second, and so forth. However, if the $1,000,000 briefcase is revealed by one of the models, your level of confidence will drop to zero. If you get down to two cases without revealing $1,000,000, your level of confidence is 1/2. Open one more and your confidence shoots up to 1 or down to 0.
It's correct to observe that you will not take home $1,000,000 unless you pick that briefcase at the outset. Since that is the maximum payout, the banker can only lose or break even if he offers you the full $1,000,000 before he is certain you have it in your case. However, this doesn't really address Marty's question. S/he's correct in saying that there's no point in switching when you're down to two cases. The level of confidence in either briefcase is 1/2.
As others have said above, DOND is NOT the Monty Hall problem. Because Monty Hall would and could always open a door with a gag prize, the confidence level for the chosen door would remain 1/3, but change to 2/3 for the unchosen and unopened door. In effect, Monty was saying to a person who chose door number 1, "You can be 2/3 confident that the prize is behind door 2 or 3, and it isn't behind door 2."
Also, the issue here isn't the difference between odds and probability. Odds compare winning outcomes to losing outcomes whereas probability is the ratio of winning outcomes to all outcomes. Before you choose a briefcase, your probability of picking the big winner is 1/26 and the odds of picking the big winner are 1:25.
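A tiny Java sketch of those confidence levels, assuming the usual US round sizes (6, 5, 4, 3, 2, then one case at a time) and that the $1,000,000 hasn't turned up:
public class ConfidenceLevels {
    public static void main(String[] args) {
        // Cases opened per round on the US show: 6, 5, 4, 3, 2, then one at a time.
        int[] opened = {6, 5, 4, 3, 2, 1, 1, 1, 1};
        int unopened = 26; // includes the contestant's own briefcase
        System.out.println("At the start: 1/" + unopened);
        for (int round = 0; round < opened.length; round++) {
            unopened -= opened[round];
            // Confidence the chosen case holds $1,000,000, given it hasn't been revealed.
            System.out.println("After round " + (round + 1) + ": 1/" + unopened);
        }
        // Prints 1/20, 1/15, 1/11, 1/8, 1/6, 1/5, 1/4, 1/3, 1/2 -- matching the
        // 1/20 and 1/15 figures above.
    }
}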
Anally yours,
Worth
Posted 27 Mar 2007 at 2:02pm #
Do you think the banker knows what's in each case and makes offers either to persuade the contestant to continue or to persuade them to stop?
Posted 27 Mar 2007 at 2:06pm #
[quote comment="40849"]Do you think the banker knows whats in each case and makes offers to either persuade to continue or persuade to stop?[/quote]
I think it's in the rules that the banker doesn't know. I think I remember hearing that nobody knows what's in which case (except, obviously, whoever puts the dollar values in the cases). Or maybe not even them - they could put the dollar values in the cases without seeing the numbers on the front.
Posted 01 Apr 2007 at 10:24pm #
Java shmava. It's the percent chance of choice (cases left) times the sum of all money left on the board. I.e., 4 cases in play = a 25% chance of choice. If there is $400,000 left on the board, the offer will be approx. $100,000. Check it.
Posted 01 Apr 2007 at 10:45pm #
Let's apply the Krunktonite formula more directly...
If there are 37 posts on this thread and the first 27 are still trying to figure out the odds of choosing the $1M case, what is the average I.Q. of all posters? Script! I need script! Psssst! Down here! Could you change a five for 5 ones? Who does #2 work for? WHO DOES #2 WORK FOR?
Posted 10 Apr 2007 at 5:55pm #
Here is a page with the formula for the on-line game
http://www.davegentile.com/stuff/Deal_or_no_deal.html
Also, regarding Worth's comment - the distinction that should be made here is not between "probability" and "confidence", because confidence has a meaning connected with confidence intervals, etc.
The distinction we want is between classical or frequentist statistics (what Worth calls probability) and Bayesian statistics. In Bayesian statistics, probabilities say something about our state of knowledge, and justified beliefs based on that knowledge. In classical or frequentist statistics, probability is treated as describing something that is objectively uncertain.
In the game, after you pick the case, the classical probability that you have 1,000,000 in your case is either 1 or 0. But the probability in Bayesian statistics is 1/26. This will get better as you open cases that don't contain 1,000,000.
And, as noted above, this is not the Monty Hall paradox situation.
Posted 13 Apr 2007 at 9:50am #
Here's the reason that you should switch in the end, but I'm going to use the "three doors" logic:
The odds to pick a car in the game are 1/3, and after you have the host reveal a door, the odds change to 1/2. The problem is that most people explain why very badly, so I'll clarify. If you have a 1/3 chance to pick a car, you also have a 2/3 chance to pick a goat. So the odds are in favor of you originally picking a goat, correct? That means that once the host reveals one of the doors with a goat, only your door or the single remaining door has the car. Since you originally had a 2/3 chance to pick a goat, you probably did. But now you only have two options, and therefore a 1/2 chance to get the car, so you should switch.
The important factor here is that since he's eliminating a losing door you KNOW that one of the two doors has a car, so if the game were to start over again, the chance is one in two, and since he's throwing one of the doors out, the game, for all intents and purposes, IS indeed starting over again, only with a 50/50 chance of picking the correct door this time, rather than a 2/3 chance of picking a goat.
Posted 13 Apr 2007 at 11:16pm #
Steve,
Are you retarded? That was the worst clarification ever; it was very incorrect.
Signed,
Everybody Who Understands Monty Hall
Posted 18 Apr 2007 at 12:13pm #
You are not correct. When the game comes down to two cases remaining, you have the choice of choosing your case or the last one on stage, so you can change cases at least once. Therefore, if you do not choose the million at first, and given that the million is in at least one of the two remaining, you can switch to the million case if it is not the one you originally chose.
Posted 21 Apr 2007 at 2:11am #
1) As mentioned by others, definitely not Monty Hall since Monty always eliminates non-winning choices.
2) Angel wanted the implementation so badly that she failed to realize that an analytic solution wasn't in hand yet. To me, this is very telling.
3) NThurston gives a good reference to a paper--pages 16 and 17 of its April 2007 draft (Thierry Post, et al.) contain the relevant information. The authors indicate a fairly good least-squares fit of estimated parameters to their equation, though they don't give all the information--you need to jump-start it with initial bank offer information and allow for the banker to vary the offer to help the game continue.
4) George Jones (Haywood_J concurs) has a decent formula. David W appears to have reasonably good numbers as well. Various numbers taken from their progressions are validated fairly closely by Post, et al. However, the values for 'rho' fluctuate, but this is not unlike what the authors say the dealer might do to help lucky/unlucky contestants feel they should continue the game.
5) Jason appears to have the correct logic for calculating the expected value of the remaining values--while I don't program Java, I can understand what he did, and I independently used the same formula the first time I tried predicting DOND offers. While this is not the offered value, the banker appears to use it to calculate the offer; it is used as part of Post's formula (and that of George Jones and David W.).
From the Thierry Post paper(*):
B[r] = b[r] * E
b[r] = b[r-1] + (1 - b[r-1]) * rho^(9 - r)
where
r is the round (1-9)
E is the expected value of the remaining amounts
B[r] is the expected banker's offer (the value you are looking for)
b[r] is the percentage bank offer (you can get this easily at the first round--or any round if you know what you are doing).
The authors give 4 values of rho:
USA: rho = 0.777
Dutch: rho = 0.832
German season 1: rho = 0.815
German season 2: rho = 0.735
Examples:
With all case amounts in play, E = ~$131,477
Eliminating the smallest six case amounts, E = ~$170,916
I've seen initial offers as high as $20K+. Using 20K and 171K, b[1] = 0.117 (very similar to David W's).
Using 0.11 (David W's initial percentage bank offer) and using the USA value of rho, we get the next percentage bank offer of 0.262 (too high). But you could, instead, calculate rho from David W's set of percentage bank offers: 0.15 = 0.11 + (1-0.11)*rho^(9-2) yields rho = 0.642. Using his values of 0.22 and 0.15, you get rho = 0.660; values of 0.37 and 0.22 yield rho = 0.719. Basically the values fluctuate, as per Post, et al.
Best choice, use rho given by the author as they are based on a least squares regression fit to all the data they took from the show episodes.
So let's stick with b[1] = 0.11 and rho = 0.777. Since I already eliminated 0.01, 1, 5, 10, 25 and 50, we have E = 170,900 and B[1] = 18.8K.
Next, eliminate 100k, 5k, 400, 500k, and 10k. This gives E = 186.8k, b[2] = 0.11 + 0.89 * 0.777 ^ 7 = 0.262, and B[2] = 49K.
Next, eliminate 200k, 1M, 200, and 1k. This gives E = 145.8k, b[3] = 0.262 + 0.738 * 0.777 ^ 6 = 0.424, and B[3] = 61k.
Next, eliminate 100, 750, and 750k. This gives E = 106.4k, b[4] = 0.424 + 0.576 * 0.777 ^ 5 = 0.587, and B[4] = 62k.
Next, eliminate 400k, and 75k. This gives E = 62.6k, b[5] = 0.587 + 0.413 * 0.777 ^ 4 = 0.738, and B[5] = 46k.
Next, eliminate 25k. This gives E = 70.2k, b[6] = 0.738 + 0.262 * 0.777 ^ 3 = 0.861, and B[6] = 60k.
Next, eliminate 300k. This gives E = 12.7k, b[7] = 0.861 + 0.139 * 0.777 ^ 2 = 0.945, and B[7] = 12k.
Next, eliminate 300. This gives E = 16.9k, b[8] = 0.945 + 0.055 * 0.777 = 0.988, and B[8] = 16.7k.
Next, eliminate 75. This gives E = 25.25k, b[9] = 1, and B[9] = 25.25k.
(*) Post, van den Assem, Baltussen, Thaler, "Deal or No Deal? Decision making under risk in a large-payoff game show," April 2007 draft, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=636508
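To make the recursion concrete, here is a small Java sketch of the model as applied above (b[1] = 0.11 is an assumed round-1 percentage, and the round-by-round expected values are my estimates from the example, not official figures):
public class PostOfferModel {
    // b[r] = b[r-1] + (1 - b[r-1]) * rho^(9 - r); the offer is B[r] = b[r] * E.
    static double nextPercentage(double previous, double rho, int round) {
        return previous + (1 - previous) * Math.pow(rho, 9 - round);
    }

    public static void main(String[] args) {
        double rho = 0.777; // Post et al.'s estimate for the US show
        double b = 0.11;    // assumed round-1 percentage offer
        // Expected values of the remaining amounts, rounds 1-9, from the example above.
        double[] expected = {170900, 186800, 145800, 106400, 62600, 70200, 12700, 16900, 25250};
        for (int r = 1; r <= 9; r++) {
            System.out.printf("Round %d: b = %.3f, offer = $%.0f%n", r, b, b * expected[r - 1]);
            if (r < 9) {
                b = nextPercentage(b, rho, r + 1); // advance to b[r+1]
            }
        }
    }
}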
Posted 22 Apr 2007 at 11:34pm #
Looking at the fits in the Post paper, and based on watching the show a couple of times, I'm convinced that there is no fixed algorithm. The bank is using an expected value modified by turn number; however, clearly the banker is human and is responsible for creating good television. So in situations where an altered bank offer might create more drama, soul-searching, tears, etc. from an indecisive contestant (or get a popular contestant off the hook after eliminating most of the high values), the math doesn't hold.
Based on that, while the Post model fits the bank's behavior OK, my guess is that the actual algorithm used by the bank is probably very simple (e.g. expected value * turn / 10, as previously cited), and gives a lower value than that predicted by the Post model. This is because I see many reasons for the bank increasing the offer to add to TV drama, and few reasons for decreasing the offer.
While we are on the subject, is there a saddle point? Based on my very instinctive guess, it looks like the rational contestant would take the offer in Round 7.
Posted 23 Apr 2007 at 10:11am #
Does anyone know if the show's formula matches the formula used for the on-line game, found here?
http://www.davegentile.com/stuff/Deal_or_no_deal.html
If not is it close?
Posted 23 Apr 2007 at 9:24pm #
I just like looking at the girls.
😈
Posted 23 Apr 2007 at 9:50pm #
I'll check more thoroughly, but a quick check from the first three rounds of a recent game suggests not. I tried submitting the board after the 1st round so you could check me, but it didn't work.
Round 1 Formula: 81810.49; Offer: 22000
Round 2 Formula: 101788.79; Offer: 55000
Round 3 Formula: 144845.72; Offer: 108000
Posted 25 Apr 2007 at 12:13pm #
Just to give another answer for the guy arguing with his brother.
Your brother's theory that the probability that you have the million dollar box always remains at 1/n has one glaring mistake: what if one of the boxes he asks to open has the million dollars? Then obviously the probability that his box has the million reduces to 0. Thus the probability does change - and if the box he asks to open doesn't contain the million dollars, then the probability increases to 1/(n-1), then 1/(n-2), etc.
Thus the difference between this and Monty's game is that when Monty opens the first door there is a 0% chance that the door has the prize, but when the contestant asks for a box to open there is a 1/(number of unopened boxes) chance that it contains the million dollars (or 0 if the million has already been revealed).
Posted 06 May 2007 at 10:16pm #
I haven't read ALL the comments, so I apologize if I'm repeating another comment. The Let's Make a Deal game is different. You can prove this by placing 25 cards face down (5 rows x 5 columns) with only one ace, and having the contestant guess where the ace is. Have the dealer, who KNOWS where the ace is, turn over all the cards except the ace and the card the contestant picked. With two cards remaining, the contestant should switch cards if given the option, as his odds were 1 in 25 when he picked and still are. To prove this, perform this game 100 times and see how many times the contestant picks the ace. After a handful of attempts, the 25-to-1 odds will sink in.
With DOND, down to 2 cases, one having the million and one not, the odds ARE reduced to 50/50 from the 1 in 27 when the initial case was picked. The reason being that out of a sample of 100 full DOND games, 90-something games will be thrown out of the sample because the $1M was revealed prior to getting down to the last two cases, since the revealing of the cases is random, unlike the turning over of the cards. So if you do actually make it to the situation where only two cases remain, the $1M in one of them, the odds are indeed 50/50. Of course this situation will happen in only a handful of games out of 100. How many times have you seen it happen?
To create a situation in DOND where the odds remain 1 in 27, the show would have to change so that after the contestant picks a case, Howie, knowing where the $1M case is, opens all cases except the $1M case and the contestant's case, and lets the contestant choose. The contestant should always pick the other case, as his case retains a 1 in 27 chance of having the $1M.
Posted 13 May 2007 at 5:12pm #
Having watched the English version of Deal or No Deal at work too, I'd like some help in answering a minor disagreement I had with a colleague.
In this show's situation the contestant had two boxes left, one containing 1p and one containing £75,000 - and the banker did NOT make him an offer. I thought he should have changed the box, as it seemed to me more probable that he would get the larger sum of money - but my colleague was convinced it didn't matter, as it would be a 50/50 chance.
In that situation, what would be the best decision to make - change the box, keep the box, or randomly pick one?
Posted 13 May 2007 at 9:56pm #
Responding to Vicki - it should be a 50/50 chance. But I'm not familiar with the English version. The key question is: "Did the banker's lack of an offer contain any information?" For example, in Monty Hall, Monty always opens a door without the big prize, NOT a random door. So Monty provides information that you did not have when you picked the first door, so you should switch. But if the banker never makes an offer when there are 2 cases, then no information is given, and it does not matter which case you choose.
Posted 04 Jul 2007 at 6:42am #
The easiest way to understand it:
When you pick a case, the odds that it will contain 1 million are 1 in 26.
When you get down to 2 cases... there is your case (which had a 1/26 chance of containing the million), and there is the last remaining case (which also had a 1/26 chance).
Both of them have always had completely equal chances of containing the million dollars throughout the entire game. So if the million is still in play, it must be in one of those cases that have an equal chance... so it's 50/50. No benefit (and no loss) in switching cases at the end.
Once again, this is NOT Monty Hall. There is only one situation where the last remaining case would have a 25/26 chance of having the million, which is: if, after selecting your case, all of the models inspected their cases and were told to open them if they didn't have the million.
Posted 04 Jul 2007 at 11:38am #
I agree with you entirely, E. Honda; you have provided the best explanation to date. Well done!
Posted 22 Aug 2007 at 12:23pm #
I have a question regarding the Monty Hall and DOND:
If you choose your case out of 26, there's a 1/26 chance it contains the $1M.
If you then randomly choose 24 cases which do not contain the $1M and are left with 2 cases, why is this different from the Monty Hall situation? You now know that 24 cases have been eliminated, and none of them contained the $1M. This is exactly the same information that you have in Monty Hall.
In most cases this will not happen, because you will open the $1M before you are left with 2 cases, but given this information, I think this is the Monty Hall situation - you know that 24 cases which did not contain the $1M have been eliminated, and therefore the chance that you picked the $1M at the beginning will be 1/26.
Posted 22 Aug 2007 at 1:22pm #
Reply to TP
The difference between Monty Hall and DOND -
Monty will only open a door where there is no grand prize. Every case you open on DOND may or may not contain "the grand prize". That is the difference. Monty KNOWS where the grand prize is. When he opens a door he tells you something you didn't know when you started - different from what just opening a random door would have told you.
Let's say we have 3 boxes, and 1 contains a million dollars.
P = 1/3 for all 3 boxes. We open one box at random. We gain knowledge. If the box had the million, the probability for the other two boxes drops to P=0. If the box did not have the million, P=1/2 for the other two boxes.
Now, instead suppose we start with three boxes. P=1/3.
We pick box #1. Now somebody who knows where the million is and will never open the million box comes and opens box#2 for us.
In scenario number one, we survived a test of our picking ability. There was a risk involved when we opened the box. Our probability either went up or down. We gained or lost something.
But in scenario 2, there was no risk. We know that the box opened will not contain 1 million. Our starting probability for our box stays at 1/3.
When Monty opens a box, or when we randomly open a box, in both cases we learn something about the contents of the opened box. But randomly opening a box tells us a little bit about our box; Monty opening a box tells us nothing about our box.
In the Monty situation the probability that we have picked incorrectly = 2/3. The probability that we have picked correctly = 1/3.
If we have picked box #1 incorrectly (P=2/3), and Monty opens number 2, then #3 must be the million. Thus box#3 has the million with P=2/3.
If we have picked #1 correctly (P=1/3), then box number 3 will not contain the million (P=1/3).
Thus we should switch in the Monty scenario.
A *successful* random selection of a box that is not ours increases the odds that our pick was correct. Monty opening a box tells us nothing about how good our pick was, since the contents of Monty's box are the same whether our pick was good or bad.
If we have picked well with #1 then
Monty's box #3 has 1 million with P=0
The random box #3 has 1 mil with P=0
If we have not picked well with #1 then
Monty's box #3 has 1 million with P=0
The random box #3 has 1 million with P=.5
Monty's box is the same whether we picked well or not. The random box has a better chance of containing the mil when we've picked poorly than when we've picked well, so opening the random box and not finding 1 mil increases the chances that we've picked well.
Formal math to follow.
Posted 22 Aug 2007 at 1:48pm #
Let H = hypothesis "box #1 contains 1 million"
~H means "not H"
Let E = evidence - the other box, when opened, does not have the million.
~E then means the other box has the mil when opened.
H|E means the probability of our hypothesis, after we get the evidence.
E|H means the probability of the evidence given our hypothesis.
Bayes's formula for updating the probability of our hypothesis is P(H|E) = P(H) *P(E|H)/P(E)
The denominator can be expanded. P(E) = P(E|H)*P(H) + P(E|~H)*P(~H)
In both the random and Monty case P(H) = 1/3. (We have picked correctly)
For Monty P(E|H) = 1. P(E|~H)=1
(Monty will always show an empty case)
For random P(E|H) = 1. P(E|~H)=0.5
(If we've picked wrong there is a risk in the random case)
For Monty P(E)=1 (he always gives this evidence)
P(E|H)*P(H) + P(E|~H)*P(~H) => 1*(1/3) + 1*(2/3) = 1
For random P(E)=2/3
(sometimes the random case will have the mil)
P(E|H)*P(H) + P(E|~H)*P(~H) => 1*(1/3) + (1/2)*(2/3) = 2/3
Bayes for random => P(H|E) = P(H)*P(E|H)/P(E) =>
(1/3)*1/(2/3) = ½
Bayes for Monty => P(H|E) = P(H)*P(E|H)/P(E) =>
(1/3)*1/1 = 1/3
Thus in both cases our probability for #1 starts at 1/3. In the random case it becomes ½. In the Monty case, it is unchanged.
We could also update ~H, the probability that we have picked incorrectly.
Bayes for random => P(~H|E) = P(~H)*P(E|~H)/P(E) =>
(2/3)*(1/2)/(2/3) = ½
Bayes for Monty => P(~H|E) = P(~H)*P(E|~H)/P(E) =>
(2/3)*1/1 = 2/3
Thus in the Monty case, our probability of having picked wrong is unchanged at 2/3.
In the random case, our probability of picking wrong has been updated to ½.
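The same arithmetic in a few lines of Java, for anyone who wants to check it (the three-box, 1/3-prior setup from the example above):
public class BayesCheck {
    // P(H|E) = P(H) * P(E|H) / (P(E|H) * P(H) + P(E|~H) * P(~H))
    static double posterior(double pH, double pEgivenH, double pEgivenNotH) {
        double pE = pEgivenH * pH + pEgivenNotH * (1 - pH);
        return pH * pEgivenH / pE;
    }

    public static void main(String[] args) {
        double prior = 1.0 / 3.0; // P(H): our box #1 holds the million
        // Monty: the opened box is empty whether we picked well or not.
        System.out.println("Monty:  P(H|E) = " + posterior(prior, 1.0, 1.0)); // 1/3
        // Random: an empty reveal is guaranteed only if we picked well.
        System.out.println("Random: P(H|E) = " + posterior(prior, 1.0, 0.5)); // 1/2
    }
}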
Dave
Posted 22 Aug 2007 at 4:37pm #
TP: It's different because when Monty picks the cases, he guarantees that the last remaining case has a 25/26 chance of containing the million. So it's your 1/26 case, vs the 25/26 case.
But if you pick randomly, it means nothing. The last remaining case will also have a 1/26 chance of containing the million. So it's your 1/26 case vs another 1/26 case... making the odds 50/50.
Another thing to consider: If you picked your case and made it all the way to the end without opening the million, there's a good chance it was because the million was inside your case to begin with. (Unlike with Monty, where you get to the final 2 every single time.)
Posted 22 Oct 2007 at 12:10am #
Dave hit the nail on the head. It's tough to understand at first, but after some classes in probability it makes much more sense.
When you choose a door in Monty Hall's problem, you aren't adjusting the odds at all because the host is removing a door, not you. In DoND, YOU are eliminating the cases one by one. If you selected your case and then Howie removed 24 cases not holding the $1,000,000, then it would be exactly like the Monty Hall problem: you would have two cases left and your original case would have a 1/26 chance of containing the $1,000,000. BUT HE DOESN'T!!! If the $1,000,000 case somehow survives until the last two, there will be a 50/50 chance your case holds the $1,000,000.
Those who believe that it's a case of the Monty Hall problem need to learn some more about probability. Engineers obviously don't need any training in this area. Sorry if I sound rude, but honestly!!
Posted 24 Oct 2007 at 4:43pm #
If you end up in a situation with your case and one other case at the end of the game AND the $1,000,000 has not yet been revealed, then the probability that your case contains $1,000,000 is 1/26 (which it was from the start of the game) and the probability that the remaining case contains $1,000,000 is 25/26 (the two probabilities must add to 1.0).
It is irrelevant who selects the cases during the game or what order they were selected or anything else. It could be the host, the contestant or a computer generated random number. The question is what is the probability that each of the two remaining cases contains the $1,000,000 when the game gets down to the two case scenario AND the $1,000,000 case has not been opened. This would be a very rare situation because in most games the $1,000,000 would have been revealed before getting down to the last two cases.
Every time you select a non $1,000,000 case from the pool of 25 cases out there, the probability that the $1,000,000 is in one of the other remaining cases increases EVEN IF IT IS NOT IN ONE OF THE CASES OUT THERE.
The probability that your case contains $1,000,000 remains at 1/26 and does not change throughout the course of the game as long as the $1,000,000 has not been revealed. Once it is revealed the probability that your case contains $1,000,000 becomes 0.
If presented with this highly unlikely situation I would definitely switch cases at the end of the game for the one remaining case. I may make a mistake but 96% of the time (on average) I would win the $1,000,000.
Posted 29 Oct 2007 at 10:48pm #
Gary: It is absolutely relevant how the 24 non-winning cases were opened. You are ignoring the fact that you're far more likely to reach the final two if you have the $1,000,000 in your own case.
There are two ways to get down to the final two cases and still have the million in play.
Way #1: You chose a losing case (odds: 25/26). You then opened 24 more cases, and the one case you left unopened was the million dollar case (odds: 1/25). Total odds of this happening: 25/26 x 1/25 = 1 in 26.
Way #2: You chose the million dollar case (odds: 1/26). You then opened 24 more cases, but there's no way you could have opened the million since it was already in your case (odds: 24/24). Total odds of this happening: 1/26 x 24/24 = 1 in 26.
If you play DoND and get down to the final two, there is a 24/26 chance the million will not be in play, a 1/26 chance you will have it, and a 1/26 chance you should switch. Since 1/26 = 1/26, both of the final two cases have an equal chance of winning.
In DoND, all of the cases have the same odds. This is completely different from Monty Hall, in which the game is rigged by someone who knows where the prizes are.
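Here's a quick brute-force check of those numbers in Java (the trial count is arbitrary; opening 24 random cases is simulated by picking, uniformly, which one of the other 25 survives):
import java.util.Random;

public class FinalTwoCheck {
    public static void main(String[] args) {
        Random rng = new Random();
        int trials = 1000000, reachedFinalTwo = 0, millionInOwnCase = 0;
        for (int t = 0; t < trials; t++) {
            int million = rng.nextInt(26); // which case holds the $1,000,000
            int pick = rng.nextInt(26);    // the contestant's case
            // Pick which one of the other 25 cases stays unopened on stage.
            int leftOnStage = rng.nextInt(25);
            if (leftOnStage >= pick) leftOnStage++; // skip the contestant's case
            if (million == pick || million == leftOnStage) {
                reachedFinalTwo++; // million survived to the final two
                if (million == pick) millionInOwnCase++;
            }
        }
        System.out.printf("P(million alive at final two) = %.4f (expect %.4f = 2/26)%n",
                (double) reachedFinalTwo / trials, 2.0 / 26);
        System.out.printf("P(own case holds it | alive)  = %.4f (expect 0.5)%n",
                (double) millionInOwnCase / reachedFinalTwo);
    }
}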
Posted 30 Oct 2007 at 12:54am #
The last two cases have an equal chance of containing the $1,000,000, PERIOD. Read Dave's proof if you disagree.
Posted 30 Oct 2007 at 1:04am #
Keep in mind that when you "choose" the case you're really just setting it aside until the end. It's not really special at all; it just happens to be one of the remaining two cases. What if Howie didn't ask you to pick a case at the beginning, but in your mind you said "I'm going to leave #1 and #10 until the end"? It's the same situation. 50/50 chance, so flip a coin and hope you get lucky.
Posted 30 Oct 2007 at 9:53am #
Hey Honda,
Very thoughtful reply but consider the following.
The only situation we are considering is: there are two cases left and the $1,000,000 has not shown up. It is a very unlikely event that the $1,000,000 has not shown up until that point, but the situation (condition) we have been presented with is that it hasn't.
Now, with only two cases left we need to determine two probabilities: 1) the probability that the selected case has the big prize (and I think we are all in agreement that that probability is 1/26) and 2) the conditional probability that the remaining case has it. These two probabilities must sum to 1.0 because there are no other options - we know the prize is in one of the two cases because the sample space at this point in the game has been reduced to two sample points only.
Posted 30 Oct 2007 at 10:58am #
Gary,
Read my post from 22 Aug 2007 at 1:48pm for the math, and also the post before that one.
The probability that the million is in your case starts at 1 in 26. But by the time there are two cases left, the probability that your case has the 1 million is 50%. The probability updates according to Bayes' formula.
Monty Hall is different. If DOND were like Monty Hall then you would be correct, and at the end of the game your original case would still have 1 chance in 26, and the remaining case would have a 25 in 26 chance.
The situation is different depending on whether the cases are opened at random, as in DOND, or are opened by a host who will never open the winner, as in Monty Hall.
In DOND, every time a case is opened without the million, your original pick has survived a test and gains in probability. But in Monty Hall, your original pick has not survived any test at all, and its probability remains the same.
Find a friend and try it both ways 50 times. In one scenario you'll win 50% of the time by switching, and in the other scenario you'll win almost all the time by switching.
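If you can't find a patient friend, here is a quick Java simulation of that experiment (three boxes, always switching; the trial count is arbitrary):
import java.util.Random;

public class SwitchExperiment {
    public static void main(String[] args) {
        Random rng = new Random();
        int trials = 1000000, montyWins = 0, randomWins = 0, randomValid = 0;
        for (int t = 0; t < trials; t++) {
            int prize = rng.nextInt(3); // which of three boxes holds the million
            int pick = rng.nextInt(3);  // our initial pick

            // Monty's way: he opens an empty box we didn't pick, and we switch.
            // Switching wins exactly when our first pick was wrong.
            if (pick != prize) montyWins++;

            // Random way: one of the two other boxes is opened blindly.
            int other1 = (pick + 1) % 3;
            int other2 = (pick + 2) % 3;
            int opened = rng.nextBoolean() ? other1 : other2;
            if (opened != prize) { // the game only counts if the million survives
                randomValid++;
                int remaining = (opened == other1) ? other2 : other1;
                if (remaining == prize) randomWins++;
            }
        }
        System.out.printf("Monty, always switch:  win %.3f (expect ~0.667)%n",
                (double) montyWins / trials);
        System.out.printf("Random, always switch: win %.3f (expect ~0.500)%n",
                (double) randomWins / randomValid);
    }
}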
Posted 30 Oct 2007 at 11:08am #
Gary writes: "Every time you select a non $1,000,000 case from the pool of 25 cases out there, the probability that the $1,000,000 is in one of the other remaining cases increases EVEN IF IT IS NOT IN ONE OF THE CASES OUT THERE."
Yes. The probability that the million is in one of the other cases increases. It increases in ALL the other cases, including YOURS.
The difference between DOND and Monty is not just WHO picks the cases, it is HOW they are picked, and what INFORMATION you get. DOND is a random pick. Monty knows where the big prize is and will never open that one. This is a different process. If Monty opened the door at random, then it would be the same as DOND.
Posted 30 Oct 2007 at 11:36am #
Dave,
In a recent comment you wrote:
"In DOND, every time a case is opeed without a million, your original pick has survived a test, and gains in probability."
I do not agree with this statement. What has changed is the probability that the million is contained in each of the remaining cases (however many are left from the pool of 25). This probability has increased slightly because the denominator of that pool was decreased by 1.
But, I would agree with your statement if the rules of the game were modified slightly. Suppose you re-entered the selected case back into the pool of unopened cases (i.e., with replacement) after every case is opened and then selected another case at random from the pool to be the new selected case. Now, the probability of the million being in your case will increase with every opened case (as long as the opened case does not contain the million). If these were the rules of the game and you end up with two cases at the end of the game, I would agree that the probability is 0.5 that the big prize is in your case. But this is not how DOND is played.
Posted 30 Oct 2007 at 1:34pm #
Gary,
You wrote:
"I do not agree with this statement. What has changed is the probability that the million is contained in each of the remaining cases (however many are left from the pool of 25). This probability has increased slightly because the denominator of that pool was decreased by 1."
Your selected case is still part of the pool of unopened cases. The fact that you have "put it aside" changes nothing. It is still an unopened case, and no information about it has been gained.
Think about this instead. - Suppose you physically pick case #1, but you mentally "put aside" case #2; you say to yourself "this is really my final case". We get down to three cases left. Why would the mentally picked case gain probability, and not the physically picked case?
My suggestion would be to try to work through the "math" post. If you have questions on the math post, I can answer them.
Just FYI - I work as a statistician, and I'm currently in the process of working on a paper for publication in the area of Bayesian statistics. This is a relatively simple application of Bayesianism. I can pretty much guarantee I'm giving you the correct answer - better than 99% probability.
:o)
Dave
http://www.davegentile.com/philosophy/about.html
Posted 30 Oct 2007 at 1:56pm #
Gary,
Just trying to "get in your head" to see what you might be thinking here. Your example of "putting the case back" leads me to believe that you might be confusing "physical probability" with "inductive probability".
This first slide from a 500-level philosophy course (discussing Bayesian methods) explains the difference.
http://patrick.maher1.net/517/lectures/lecture1.pdf
If I flip a coin, and then hide the result, what is the probability that it is heads? The physical probability is either 1 or 0, but we don't know which. The inductive probability is 1/2.
I've previously described DOND in terms of inductive probability. In terms of physical probability, the probability that your first selected case has the million is either 1 or 0, but we don't know which. We would then model the other 25 as two possible separate processes. In process #1 they all have zero probability and stay that way; in process #2 they all start with 1 chance in 25, and when there is only one case left in the pool it has probability = 1 - but we don't know which model is true. So for physical probability, at the end, we have a case in our hand with probability = 0 or 1, and a case in the pool with probability = 1 or 0, but we don't know which model of the situation is correct.
The only way to get your 25/26 chance for DOND is to muddle physical and inductive probability together, and as a result improperly use both.
Dave
Posted 30 Oct 2007 at 3:21pm #
All:
What a lot of big, fat brains exist out there! I applied the Monty Hall Dilemma when arguing this in the past, but after reading the many postings, I'm just not so sure any longer.
I agree with those who claim the odds associated with your original case choice do not change, but I get confused when the statisticians start bandying about probability concepts.
But for those of you who believe that it is a 50/50 scenario when you are down to the final two cases, let me change the game a bit and propose the following:
Let's suppose that 25 out of the 26 cases contained $1M and one case contained $1. If you came down to the final two cases and the $1 case had not yet been exposed (thus guaranteeing you a $1M winning case), would you really be willing to switch your case for the one held by the model? According to the 50/50 folks, switching would make no difference. Yet my gut (and feeble mind) would still be telling me that my chance of having picked a $1M case was still way in my favor and I would not want to switch.
If you chose to switch in this scenario, it seems that it would be more about you feeling lucky/unlucky and luck doesn't seem to sit well with statisticians.
Posted 30 Oct 2007 at 4:31pm #
Kurt,
Yes, in your altered game, the inductive probability is 50/50 at the end. When you picked a case at the beginning, there was a 25/26 chance of it being a $1 million case. Then after a case was opened, your case fell to 24/25, then to 23/24, and in the end to 1/2. I would not switch, just because of the psychological effects of not wanting to be wishy-washy, but if someone offered me $1000 to switch I certainly would switch.
Again, the fact that it is a 50/50 chance at the end could be verified by actual experimentation. Of course, since the experiment would have to involve random picks, you'd only get a valid test in 2 of 26 attempts, since most times the $1 would appear before the last 2 cases, and ruin your test.
Now if it was Monty Hall in reverse, then for sure I would keep my original case. It would be a Monty Hall situation if the host knew where the $1 was and would only open cases that did not have it.
Again, going through the math post with care may be the best option.
BTW - the proof of Bayes's theorem is very simple, all you need is the product rule for probabilities. The product rule is:
P(A intersection B) = P(B) * P(A|B)
Here P(A|B) means "Probability of A given B"
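Written out, that one-line proof just equates the two ways of factoring the joint probability:
\[
P(A \cap B) = P(B)\,P(A \mid B) = P(A)\,P(B \mid A)
\quad\Longrightarrow\quad
P(A \mid B) = \frac{P(A)\,P(B \mid A)}{P(B)}
\]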
Dave
Posted 30 Oct 2007 at 5:14pm #
I think some people are forgetting the most important point. In the Monty Hall problem the host is manipulating the game based on inside information. This is not the case in DoND. Every selection the contestant makes is a random guess (assuming the game show isn't fixed).
I just did a presentation on this topic in my seminar course for my applied math major. The Monty Hall Paradox is all about the fact that Monty Hall always eliminates a door with a goat behind it. This isn't random selection. I think some of you are forgetting that.
Posted 30 Oct 2007 at 5:29pm #
From the "math post" - only enough to show how the situations are different.
======
Let H = hypothesis "door #1 contains 1 million"
~H means "not H"
Let E = evidence - some other door when opened does not have the million.
E|H means the probability of the evidence given our hypothesis.
E|~H means the probability of E given that our hypothesis is wrong.
For Monty P(E|H) = 1. P(E|~H)=1
(Monty will always show a bad door)
For random P(E|H) = 1. P(E|~H)=0.5
(If we've picked wrong there is a risk in the random scenario)
==========
That is to say -
Suppose we pick door #1 as our "case".
What is the probability that door #2, selected at random, contains the grand prize? Answer - 1/3rd.
Now on the other hand if Monty opens door #2, because he knows there is no grand prize there, what is the probability of a grand prize behind #2? Answer - probability = 0. Monty never shows the grand prize.
That difference shows up in the other probabilities in the problem.
Dave
Posted 30 Oct 2007 at 5:37pm #
Edit of immediately above post -
From the "math post" - only enough to show how the situations are different.
======
Let H = hypothesis "door #1 contains 1 million"
~H means "not H"
Let E = evidence - some other door when opened does not have the million.
E|H means the probability of the evidence given our hypothesis.
E|~H means the probability of E given that our hypothesis is wrong.
For Monty P(E|H) = 1. P(E|~H)=1
(Monty will always show a bad door)
For random P(E|H) = 1. P(E|~H)=0.5
(If we've picked wrong there is a risk in the random scenario)
==========
That is to say -
Suppose we pick door #1 as our "case".
What is the probability that door #2, selected at random, contains the grand prize? Answer - 1/3rd. (Or if you prefer physical probability (and to match the math above), it is either 1/2 or 0, but we don't know which.)
Now on the other hand if Monty opens door #2, because he knows there is no grand prize there, what is the probability of a grand prize behind #2? Answer - probability = 0 (and probability of a goat = 1, as in math above). Monty never shows the grand prize.
That difference shows up in the other probabilities in the problem.
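Plugging the numbers above into Bayes' theorem with the prior P(H) = 1/3 makes the difference explicit:
\[
\text{Monty: } P(H \mid E) = \frac{\tfrac{1}{3}\cdot 1}{\tfrac{1}{3}\cdot 1 + \tfrac{2}{3}\cdot 1} = \frac{1}{3}
\qquad
\text{Random: } P(H \mid E) = \frac{\tfrac{1}{3}\cdot 1}{\tfrac{1}{3}\cdot 1 + \tfrac{2}{3}\cdot \tfrac{1}{2}} = \frac{1}{2}
\]
With Monty, your door stays at 1/3 (so switch - the other door carries the remaining 2/3). With a random opening, both survivors move to 1/2.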
Dave
Posted 30 Oct 2007 at 6:12pm #
Dave,
I'm glad you mentioned experimentation because I've developed an easy thing for YOU to do to prove my point that the odds are NOT 50/50. (Note: caps do not indicate shouting; only emphasis).
To get others up to speed that don't care to read further up this chain, I posited that simply being left with two cases at the end did NOT indicate a 50/50 chance of winning/losing. I also changed the DOND game a bit by saying that there existed 25 $1M cases and only one $1 case.
Dave, since you are firmly entrenched in the 50/50 camp, I'd like you to do the following:
1. Take a deck of cards (or blank index cards) and remove 26 cards; make sure that of the 26 cards, the ace of spades is one of them (or mark one index card with an 'X'). In my version of DOND, 25 of those cards represent $1M cases; the ace of spades (or marked index card) represents $1.
2. Shuffle the cards and draw one of them. Place it face down in front of you WITHOUT looking at it. Place the remaining cards to the side.
3. Now pretend that you've been playing DOND and only two cards remain; one of which is the ace of spades. You don't have to actually play the game here to prove that the 50/50 proposition is wrong. Just pretend that you managed to randomly 'dance around' opening the $1 case and it remains out there; either in front of you or with the model.
4. Since you believe that you have a 50/50 chance at this point, you've made it clear that switching makes no difference. So, no matter what card is in front of you, you're going to switch from it. (Forget the treacle about being wishy-washy in switching). Also, let's say that every time you switch from a winning card (any card except the ace of spades), you owe me $20. Every time you switch away from the ace of spades, I owe you $20. Your position would be that after a small random sampling, both you and I would be even in money exchanged. I posit that you will owe me far more than I owe you.
Now I know that you're going to want to insist that cards must be randomly turned over one-by-one for this to work, but know that that is simply not true. Just pretend that every time you start this game, in your random turning of cards (opening of cases), you were very 'unlucky' each time and managed to leave the $1 case unexposed (it could be your card, too).
Before posting a response, sit with the cards, and let me know the number of times you 'won' (had the ace of spades in front of you) and the number of times you 'lost' (had any other card in front of you). Remember that losing in this case means you switched away from a winning card (any card but the ace of spades).
Corey,
Remember that the Monty Hall Dilemma was all about 'is it better to switch doors or stay with your original choice'. Marilyn Vos Savant first presented this in her Parade magazine column several years back and she started a virtual firestorm by stating that you increased your odds of winning by switching from your original choice. Many mathematicians, statisticians, college professors and others took her to task by saying that once Monty had revealed one bad door the odds now switched to 50/50 and switching made no difference. Very clearly and concisely, Marilyn showed that the odds were that you had picked a 'bad' door in the beginning and when given a chance to switch out of that bad choice, you best take it. Many of those 'intelligentsia' had to write her apologies for being mistaken.
Posted 10 Nov 2010 at 9:30pm #
Actually wrong on both counts. There is a gross methodological error whenever someone tries to automatically turn a theoretical probability calculation into a decision strategy. Theoretical probability is valid only in the limit of large numbers, and unless you are measuring something that has an underlying, physical cause, you are not guaranteed any convergence of sampling at all with a finite number of samples. Also, your sample is of size 1 - you won't be able to repeat the game and average the results.
When sampling measures something with an underlying cause, then you do expect rapid convergence, and then it is valid to derive a rule based on stats. When there's no underlying cause, then you are deluding yourself and believing in supernatural forces. That's how gamblers lose their money - by imagining.
Since the assumption is a fair game, there is no underlying cause. So the sole thing that decides your odds in reality is the actual information content you have at the time of making the choice, and whether you make a random or a forced/deterministic choice. That is the point at which the otherwise very different games of DOND and Monty Hall have a common denominator.
You never have more than a 50/50 actual chance (which is why the banker's offer is always < the average), but even TO REALIZE that chance YOU HAVE TO MAKE A RANDOM CHOICE in the last step. If you make a forced choice (always keeping or always switching) then your odds remain the ones that existed at the time you made the random choice. Random choice, with a fair coin, is the only way to realize new, but still incomplete, information with an even chance of success.
So Marilyn Vos Savant was actually cheating by presenting the problem as if viewed by a God (over an infinite number of samples) and then using the calculation for an individual case as if there were an underlying cause. We are talking about the fundamental fallacy of Bayesian "inference" applied to finite, short, and ultimately singular samples without an underlying cause.
So, in the general case you have a 1/13 chance not to expose the 1 mil chip during the game, and then you have a 1/2 chance to actually nail it if you make the second random choice. If you make a forced choice then you remain at 1/26. In probability theory these two things are the same, but that's completely irrelevant for your choice, since if you got the rare luck of 2 chips left with 1 mil still in the game, at that point you, and only you, in that singular game, have a 50% chance.
Back to actual DOND - you actually do have more info here: the banker's offers. With the assumption that the game is fair (that the banker is not allowed to mislead you), the sequence of banker's offers is a function of your 1st choice. If your 1st choice was very high, that will shift the balance in the banker's calc in a way that will make the offers smoother, just as a very high last value will shift offers towards the average sooner as you approach the end.
I suppose they don't allow a notepad or a programable calculator in the game :-))
Posted 30 Oct 2007 at 8:22pm #
Thanks for the info Kurt, but I've been researching it for a few weeks now haha. Here is a good article I found
http://query.nytimes.com/gst/fullpage.html?res=9D0CEFDD1E3FF932A15754C0A967958260
In your case above, if I read it correctly, there would still be a 50/50 chance between the two cases. Don't think of it as choosing a case in the beginning but just deciding not to flip that one over until the end. Say you choose two cards not to flip over but you don't distinguish between them. Then you flip over the rest of the cards and the $1 isn't there. So what's the probability now? It's the same: 50/50
Posted 30 Oct 2007 at 10:01pm #
Kurt,
I have a couple of issues with the "pretend" scenario.
1) Pretending would ruin the value of a real experiment.
2) Information is critical in inductive probability. You would need to be more specific about what we need to pretend.
3) If I read it correctly, we agree on the result. If I pick a card, and put it aside, it will be the Ace of Spades 1 time in 26. Since we agree (I think), there is no need to do the experiment.
But let's stick with cards. To make it more manageable, let's go to 10 cards. Suppose we have all of the non-picture Spades, A-10, for a total of 10 cards. And suppose we do the game for 100 trials. I don't think we really need to, since I think we agree on the expected results.
On average -
Scenario A - 10 of 100 trials will result in me correctly picking the Ace of Spades.
Scenario B - 80 of 100 trials will result in a spoiled experiment, since the Ace will come up before I get down to the last 2 possibilities.
Scenario C - 10 of 100 trials will result in getting down to two cards, but I will have picked wrong, and the last remaining card will be the Ace of spades.
Thus, the probability of me picking correctly in the first place is 1/10th, a fact upon which we agree.
However, look at the ratio of the probability of scenario A to scenario C. The probability ratio here is 1 to 1. Scenario A and scenario C are equally probable. Thus if they are the only possibilities remaining, there is a 50% probability of each.
Scenario B consists of all invalid trials, so those 80 trials are eliminated from consideration. In the 20 remaining trials, I will win 10 times - 1/2 of the cases, or 50%. Switching is therefore irrelevant.
Dave
Posted 30 Oct 2007 at 10:28pm #
Corey,
I read the newspaper article. Interesting. Her original column was clearly poorly worded. The procedure the host follows is critical. What sort of probability is being talked about should also be specified.
All,
Note the date of the column. 1991. That is well before Bayesian probability became a hot topic in business schools and in philosophy. Bayesianism goes back to the late 1700s, but through most of the 20th century it was ignored in statistics. The computer age and information theory have brought it back in style. In the 21st century, you can now get text books that teach both Bayesian and classical statistics.
Posted 30 Oct 2007 at 10:47pm #
Kurt wrote:
"Remember that the Monty Hall Dilemma was all about 'is it better to switch doors or stay with your original choice'. Marilyn Vos Savant first presented this in her Parade magazine column several years back and she started a virtual firestorm by stating that you increased your odds of winning by switching from your original choice. Many mathematicians, statisticians, college professors and others took her to task by saying that once Monty had revealed one bad door the odds now switched to 50/50 and switching made no difference. Very clearly and concisely, Marilyn showed that the odds were that you had picked a 'bad' door in the beginning and when given a change to switch out of that bad choice, you best take it. Many of those 'intelligentsia' had to write her apologies for being mistaken."
Yes, when the problem is clarified and Monty's behavior is specified, then Marilyn was clearly correct, and her pre-"Bayesian age" critics were wrong. But this isn't the Monty Hall scenario. (see my post above).
Another point I could mention in my background - in 2001 I took a business class devoted to the topic. The teacher kept a running game with a prize at the end of the semester. All that was involved was your ability to answer these sorts of questions correctly. One other student and I got it from the start. Half the class got better with time. And at the end of the semester, half of the class still had not picked up on it.
Marilyn's critics were not specialists in this form of probability, and that was before Bayesianism was as widely taught as it now is. And again, I've got a paper out for review in this area, so I'd put myself in the specialist category. And while we Bayesians might disagree with each other on various philosophical points, we'd all agree with Marilyn about the fully specified Monty Hall problem, and we'd all agree that DoND is not Monty Hall.
Posted 30 Oct 2007 at 11:55pm #
All,
For completeness, I should also post the Monty Hall version of the 10 card game. (see above posts)
100 trials are run of a 10 card game.
Scenario A - In 10 of 100 trials, I correctly pick the Ace of Spades. The host flips up 8 cards, none of them the Ace of Spades, and offers a switch.
Scenario B - In 90 of 100 cases, I do not pick correctly. The host flips up 8 cards which he knows are not the Ace of Spades, so that the Ace of Spades is the one remaining card (other than the one I picked).
I'll now win 90 out of 100 times by switching. Again this is unlike DoND (see above).
Posted 31 Oct 2007 at 8:56am #
Gary: I'm not sure why you don't believe that your own selected case has increased odds as cases are eliminated, but you do believe that every other case does.
Suppose you played DoND, and opened Case 26 which did not contain the million. There are now 25 cases (including your chosen case) in the game. According to you, your chosen case is now inferior to the other cases, since all the other cases have increased their odds to contain the million.
My question to you: Why is your case special? Why does your particular case stay with 1/26 odds, while every other case improves to 1/25?
Situation #2: Suppose I am playing DoND. Before playing, I tell you that I am going to keep going all the way until only Case 1 and Case 2 are still in play. I assume you would agree with me that each one would have an equal chance to contain the million (assuming it's still in play).
But then I tell you "Case 1 will be my chosen case, and Case 2 will be the last remaining one on the board." Does Case 2 suddenly increase to a 25/26 chance to win?
If I tell you "I lied. I'm actually going to choose Case 2, and leave Case 1 on the board" do the odds suddenly swing back the other way?
(the answer: of course not, they are equal the entire time.)
Posted 31 Oct 2007 at 9:12am #
Kurt: Your ace of spades game doesn't provide 50/50 odds, because you are turning it into Monty Hall. You are using knowledge of where the prizes are to ensure that the player reaches the final 2 with the ace of spades still in play.
"Now I know that you're going to want to insist that cards must be randomly turned over one-by-one for this to work, but know that that is simply not true."
It IS true, and is the entire reason DoND is different from Monty Hall. It is extremely important that you are playing the cases at random, and that you have a 24/26 chance of not having the million in play by the time you reach the final 2 cases.
IMPORTANT FACT: If your chosen case does not contain the million, only FOUR PERCENT of the time will you reach the final 2 with the million still in play!
If you get a free ride into the finals, it screws up the math. Because you're getting that free pass, you're making it to the finals with a losing case 25 times as often as you should be.
But if played normally, you have only a 1/26 chance to get to the finals with a losing case. Which is also the same as your 1/26 to have picked the million dollar case for your own.
Posted 31 Oct 2007 at 9:19am #
By the way, Dave's explanation of "ace of spades" 10-card game does a better job than me at clearly showing why the chances are equal, and at showing why it DOES matter that the discarded cases are opened randomly.
Posted 31 Oct 2007 at 12:40pm #
E Honda,
Nice to have a reply from yet another out there! I believe that you and I are in the same camp; we agree that the odds of the person having the $1M case at the end of the game are the same as at the beginning (1/26). The issue I was jousting with Dave over is if at the end of the game, with only two cases remaining, one case having $1 and the other $1M, what are the CONTESTANT's odds; 1/26 or 50/50? I'm excluding all the shows where the $1M shows up early and simply saying that the contestant was lucky enough not to have exposed the $1M case. This allows me to quickly get to the 1/26 or 50/50 argument.
Dave,
I appreciate that you moved away from your formulas and put your arguments more in layman's terms. Very few of us out here (I think) understood the formulas and almost none of us were going to take the time to run them. Unfortunately, I think that you are still failing to see the forest for all the Bayesian trees you've been planting. 🙂
I'm going to set up a new scenario for all of you out there that still believe that when only two cases remain, the CONTESTANT'S odds are now 50/50. If this doesn't make the light bulb come on for you, there's probably little else I can do to make it light up.
I'm going to switch gears in this scenario and make it a bit more like the DoND game, changing only a couple of items for clarity. First, we're going to say that out of the 26 cases, only one contains $1M, the remaining 25 contain $1. The game starts off normally; our contestant (Dave) gets to choose one case and put it by his side. Howie then turns to the camera and says, "Folks, we've gotten complaints that the show is moving a bit too slowly, so we've changed the rules a bit. When I say 'Go!', everyone on this stage is going to open their case AT THE SAME TIME!" So the game starts for you, Dave. Howie says, "Go!" and you and all the models open your cases at the same time. Oh, boo. You have a dollar in your case so you lose. Seeing the heartbreak on your face, Howie says, "Don't worry, Dave. We're going to give you 25 more times to play this game so you can win a million dollars!!"
I think that most of us out there would agree that if Dave played the game 25 more times in that same format, he would win $1M just once (a 1-out-of-26 chance).
So now I have to ask all of you out there in '50/50 land': Why is it that, when the cases are opened one-by-one, the CONTESTANT'S odds increase each time the $1M case is NOT exposed? Why is it that you believe that the "slooowww pace of the show" somehow magically confers better odds on the CONTESTANT'S case? Remember, that's all I've changed here; the pace at which the cases are being opened. If you agree with my 'fast open' approach that Dave can only win the $1M 1/26 of the time, why do you believe that slooowwwly opening the cases transports him to 50/50 odds at the end of the game?
Before any and all scramble to write back responses related to the 'randomness of case opening' and references to how 'this is NOT Monty Hall', please do your best to tell me (in the simplest terms possible, for my sake) why the pace at which the cases are opened confers better/worse odds on the CONTESTANT? It's a very simple question.
Posted 28 Apr 2012 at 4:18pm #
OK, the main problem is that the speed at which the cases are opened has nothing to do with it. The odds that you chose the million-dollar case were 1/26 at the start, but the odds that you hold that case once all the other cases are removed are not constant - only your overall odds of winning the million are. If I had X-ray vision and could see into 22 out of 26 cases when I chose my initial case, would my odds be the same?
So why don't the odds of me winning change as I see the cases being revealed as not having the million dollars as the game progresses?
Posted 31 Oct 2007 at 1:22pm #
Kurt,
In your fast paced game, we agree I will win 1/26 times. That is also true in the slow paced game. I will win only 1 in 26 times, when we play from the beginning of the game.
The question is how many times will I lose on the second-to-last remaining case? In a Monty scenario I will lose 25/26 times on the second-to-last case.
But what we're telling you about the DoND game is that I will only lose on the second-to-last case 1 in 26 times.
In Monty the ratio of "win/lose/don't count" is 1/25/0 (for a total of 26). In DoND on the last case the ratio of "win/lose/don't count" is 1/1/24 (for a total of 26). There are 24 times that we never get that far. Yes, we agree I win in 1 of 26 total games of DoND. But I will win 1 of the 2 games that make it all the way to the end, or 50% of those games.
I don't know if repeating can be of much help, but this presentation below is the clearest I've made to date, IMO.
==========
1. The non-Monty Hall version - Suppose we have all of the non-picture Spades, A-10, for a total of 10 cards. I win if I get the Ace of Spades. And suppose we do a "Deal or no deal" style game for 100 trials.
On average -
Scenario A - 10 of 100 trials will result in me correctly picking the Ace of Spades.
Scenario B - 80 of 100 trials will result in a spoiled experiment, since the Ace will come up before I get down to the last 2 possibilities.
Scenario C - 10 of 100 trials will result in getting down to two cards, but I will have picked wrong, and the last remaining card will be the Ace of spades.
Thus, the probability of me picking correctly in the first place is 1/10th.
However, look at the ratio of the probability of scenario A to scenario C. The probability ratio here is 1 to 1. Scenario A and scenario C are equally probable. Thus if they are the only possibilities remaining, there is a 50% probability of each.
Scenario B is composed of all invalid trials, so those 80 trials are eliminated from consideration. In the 20 remaining trials, I will win 10 times, 1/2 of the cases, or 50%. Switching is therefore irrelevant.
2. The Monty Hall version of the 10 card game - 100 trials are run of a 10 card game.
Scenario A - In 10 of 100 trials, I correctly pick the Ace of Spades. The host flips up 8 cards, none of them the Ace of Spades and offers a switch.
Scenario B - In 90 of 100 cases, I do not pick correctly. The host flips up 8 cards which he knows are not the Ace of Spades, so that the Ace of Spades is the one remaining card (other than the one I picked).
I'll now win 90 out of 100 times by switching, whereas in the non-Monty version I only won 10 out of 20 times by switching.
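If counting by hand is too tedious, here is a minimal Python sketch of the non-Monty version (the trial count is an arbitrary choice of mine); it should reproduce the A/B/C ratios above:
======
import random
from collections import Counter

def trial(n=10):
    # Classify one random 10-card game as scenario A, B, or C.
    ace = random.randrange(n)          # where the Ace of Spades is
    mine = random.randrange(n)         # the card I set aside
    others = [c for c in range(n) if c != mine]
    random.shuffle(others)             # the order the rest get flipped
    if mine == ace:
        return "A"                     # I picked the Ace
    if others[-1] == ace:
        return "C"                     # the Ace is the last card standing
    return "B"                         # the Ace came up early - spoiled

print(Counter(trial() for _ in range(100000)))
# Roughly A: 10%, B: 80%, C: 10% - so A and C split the valid games 50/50.
======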
Posted 31 Oct 2007 at 1:29pm #
Kurt,
Do you disagree with any of this from the above post?
========
Scenario A - 10 of 100 trials will result in me correctly picking the Ace of Spades.
Scenario B - 80 of 100 trials will result in a spoiled experiment, since the Ace will come up before I get down to the last 2 possibilities.
Scenario C - 10 of 100 trials will result in getting down to two cards, but I will have picked wrong, and the last remaining card will be the Ace of spades.
Thus, the probability of me picking correctly in the first place is 1/10th.
======
You are correct that 10 is 10% of 100. But we are also correct that 10 is 50% of 20.
Posted 31 Oct 2007 at 1:37pm #
A little bit more on the slow vs. fast game. -
In the slow game on the first case my probability either goes up from 1/26 to 1/25 or it goes down to zero. Opening that case changes my probability, either up or down.
In the fast game, opening the 25 cases makes my probability either go up very quickly all the way to 1, or it makes it fall all the way to zero. Again, my probability adjusts. It just adjusts more quickly in the fast game.
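Here is that same adjustment written as a little Bayes-update loop - a minimal Python sketch of the slow game, under the assumption of purely random reveals:
======
# p = inductive probability that MY case holds the million;
# n = number of unopened cases, mine included.
p, n = 1 / 26, 26
while n > 2:
    # Evidence E: a randomly opened case (not mine) shows no million.
    p_e_if_mine = 1.0                  # if I hold it, E is certain
    p_e_if_not = (n - 2) / (n - 1)     # otherwise E must dodge the million
    p = p * p_e_if_mine / (p * p_e_if_mine + (1 - p) * p_e_if_not)
    n -= 1
    print(n, p)    # prints 1/25, 1/24, ... and finally 1/2 at n = 2
======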
Posted 31 Oct 2007 at 1:59pm #
In this simulation there are 26 games, each one corresponding to a number at the left of Table 1. There are 26 cases at the beginning of each game with a different letter of the alphabet representing a different case. Lower case letters represent cases without the million. There is one uppercase letter in each game, it represents the case with the million. The contestant picks one case at random, and puts it aside. In the simulation in Table 1 the letter to the right of all the others on each row represents the selected case.
I think we all agree that at the beginning of the game there is a 1/26 chance that the case with the million will be selected by the contestant. To represent this chance only one contestant in the 26 games has selected the case with the million (an upper case letter in game 8 below).
Each contestant removes cases from the remaining 25 cases at random so that only one of them remains unopened at the end of the game. The only condition is that the case with the million will remain unopened as one of the two cases at the end of the game.
Table 1: The situation at the beginning of the games
1 abcdefghijklmnoPqrstuvwyz x
2 abDefghijklmnopqrstuvwxyz c
3 abcdeghijklmnopqrsTuvwxyz f
4 abcdefghiJklnopqrstuvwxyz m
5 abcdefghijklMnpqrstuvwxyz o
6 abdefghijklmnopqRstuvwxyz c
7 abcdeFghijlmnopqrstuvwxyz k
8 abcdefghijklmnopqrstuvxyz W*
9 abcdefGhijklmnopqrstuvwxz y
10 abcEfghijklmnopqrstuvwxyz d
11 abcdefghiklmnOpqrstuvwxyz j
12 abcdefghijKlmnopqrsuvwxyz t
13 Acdefghijklmnopqrstuvwxyz b
14 abCdefghijklmnopqrtuvwxYz s
15 abcdefghijlmnopqrstUvwxyz k
16 abcdefghijklmnoprstuVwxyz q
17 bcdefghijKlmnopqrstuvwxyz a
18 aBcdefhijklmnopqrstuvwxyz g
19 abcdefghijkLmnopqrstuvxyz w
20 abdefghIjklmnopqrstuvwxyz c
21 abCdefghijklmnopqrstuvwxz y
22 abcdefghijklmnopQrstuvwxy z
23 abcdeghijklmnopqrstuvWxyz f
24 acdefghijklmnopqrsTuvwxyz b
25 abcdefghijkLmnopqstuvwxyz r
26 abcdefghijlmnopqRstuvwxyz k
Since the case with the million is not opened, at the end of each of the 26 games only one of the 25 cases remains unopened. For every game other than game 8, only one possible case can remain and it is the case with the million in it. For game 8 where the contestant initially chose the case with the million in it there are many options but it will by necessity be a non-million case.
You would be foolish not to switch at the end of the game if given the opportunity.
Posted 31 Oct 2007 at 2:17pm #
Kurt,
A different "fast" game -
Suppose we quickly open 24 cases, leaving only mine and one other.
I will win 1 in 26 times.
I will lose instantly 24 in 26 times.
I will lose on the last case 1 in 26 times.
You are correct that I will win only 1 in 26 times, but we are also correct that I will win 50% of the time given that I don't lose instantly. That is - you are correct that I win one in 26. We are correct that I win 1 out of 2.
Posted 31 Oct 2007 at 2:24pm #
Gary,
This sentence -
"The only condition is that the case with the million will remain unopened as one of the two cases at the end of the game."
makes it into the "Monty" version of the game. So in your game it would indeed be foolish not to switch.
Out of the 26 games, we agree that I will win only 1.
But how many times do we get down to 2 cases? In your version we get there 26 of 26 times. On DoND we only get there 2 of 26 times. BIG difference.
In your version I will win one of the 26 times we get to two cases. In our version (the DoND version), I will win 1 of the 2 times we get to that late stage of the game.
Posted 31 Oct 2007 at 2:29pm #
All, "probability" is in general not an absolute thing. The question is usually "Probability relative to what?"
My odds of rolling 12 in craps are 1 in 36 - RELATIVE to all other possible outcomes. But my odds of rolling a 12 before a 2 are 50%. 12 and 2 have equal relative probabilities.
My odds of winning on DoND are 1 in 26 RELATIVE to my chances of losing at any point in the game.
My odds of winning on DoND are 50% - RELATIVE to my odds of losing on the second-to-last case.
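In symbols, the craps claim is just the ratio of the two relevant probabilities:
\[
P(12 \text{ before } 2) = \frac{P(12)}{P(12) + P(2)} = \frac{1/36}{1/36 + 1/36} = \frac{1}{2}
\]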
Posted 31 Oct 2007 at 2:53pm #
Dave,
If we get down to the situation where we have two cases left (the selected case and one model's case) and the million has not been revealed then the following two statements hold true:
The probability that the contestant's case holds the million is 1/26, not 1/2.
The probability that the other case contains the million is 25/26, not 1/2.
Are you saying that you agree with these statements?
Posted 31 Oct 2007 at 3:10pm #
Dave made a great point.
Why do you other guys think you can increase the probability of the other cases but not the case you chose? The remaining cases are all the same. I've said this before but I'll say it again, CHOOSING THE FIRST CASE DOESN'T MAKE IT SPECIAL.
When you figure the probability of a remaining case you're finding it RELATIVE to all the others. Suppose the contestants on DoND wrote down the order of the cases they will open before opening them. Then when they get down to the two remaining cases with $1,000,000 still on the board, according to your logic, both cases would have a 1/26 chance of winning. What happens to the 24/26?
Posted 31 Oct 2007 at 3:30pm #
Corey:
Actually choosing the first case does make it 'special'. Choosing it removes it from the pool of cases remaining. That's what makes it 'special'. If you go back a few messages I talked about this issue.
However, if the chosen case were put back into the pool of unopened cases (something akin to the concept of replacement) each time you open a case, and you then randomly selected another case as the new chosen case, that changes things. You would end up with a 50% chance at the end of the game that each of the two remaining cases held the million (given that the million had not been revealed up to that point). But the moment you remove it from the pool, it retains its original probability (1/26) of having a million in it.
Posted 31 Oct 2007 at 3:39pm #
Corey:
Here's my response to your second paragraph.
We are not saying that at all. We are saying that the chosen case has a probability of 1/26 of containing the million and the other remaining unopened case has a probability of 25/26.
So, if given the chance at that point in the game make sure you switch cases!
Posted 31 Oct 2007 at 4:25pm #
Gary wrote: "If we get down to the situation where we have two cases left (the selected case and one model's case) and the million has not been revealed then the following two statements hold true:
The probability that the contestant's case holds the million is 1/26, not 1/2.
The probability that the other case contains the million is 25/26, not 1/2.
Are you saying that you agree with these statements?"
======
Dave replies:
Starting from the beginning of the game -
The probability that the contestant's case holds the million is 1/26.
The probability that the second to last case contains the million is 1/26.
The probability that we won't get to the last two cases is 24/26.
Please let me know if we agree on the above.
Posted 31 Oct 2007 at 4:33pm #
Gary wrote "But, the moment you remove it from the pool it retains it's original probability (1/26) of having a million in it."
That's incorrect. If you mean inductive probability, then its probabiltiy will adjust. If you mean physical probability as your talk about replacement leeads me to believe, then the removed case now has probability = 0 or 1, but we don't know which. But if you leave it fixed at 1/26 throughout the whole game you just plain get the wrong answer. (Unless you are playing the Monty Hall version, in which case that is correct).
The 10 card game is probably the best example so far. Do you agree with the probabilities of scenarios A, B and C?
Posted 31 Oct 2007 at 4:41pm #
Dave wrote exactly what I was about to respond with. Let me stress that as you eliminate cases ALL of the remaining cases increase their odds.
You guys are forgetting WHY the Monty Hall Paradox exists: HUMAN INTERFERENCE. The situation that I assume we're talking about is where the host only opens a door with a goat behind it. There is NO chance of him opening the door with a car, thus the odds remain the same. In DoND you have to take into account the events leading up to the final two cases, unlike Monty Hall.
Would you rather:
Choose a case in DoND and have the host remove 24 cases NOT containing the $1,000,000 then decide
or
Choose a case in DoND and eliminate cases individually until you're down to two cases.
In the first case you are guaranteed to have a choice between 2 and you should switch because the odds are 25/26 for the other case.
However, in the second case you have to make it through the other 24 dodging the $1,000,000.
Posted 31 Oct 2007 at 4:42pm #
Gary,
Let's look at your simulation.
1 abcdefghijklmnoPqrstuvwyz x
2 abDefghijklmnopqrstuvwxyz c
3 abcdeghijklmnopqrsTuvwxyz f
4 abcdefghiJklnopqrstuvwxyz m
5 abcdefghijklMnpqrstuvwxyz o
6 abdefghijklmnopqRstuvwxyz c
7 abcdeFghijlmnopqrstuvwxyz k
8 abcdefghijklmnopqrstuvxyz W*
9 abcdefGhijklmnopqrstuvwxz y
10 abcEfghijklmnopqrstuvwxyz d
11 abcdefghiklmnOpqrstuvwxyz j
12 abcdefghijKlmnopqrsuvwxyz t
13 Acdefghijklmnopqrstuvwxyz b
14 abCdefghijklmnopqrtuvwxYz s
15 abcdefghijlmnopqrstUvwxyz k
16 abcdefghijklmnoprstuVwxyz q
17 bcdefghijKlmnopqrstuvwxyz a
18 aBcdefhijklmnopqrstuvwxyz g
19 abcdefghijkLmnopqrstuvxyz w
20 abdefghIjklmnopqrstuvwxyz c
21 abCdefghijklmnopqrstuvwxz y
22 abcdefghijklmnopQrstuvwxy z
23 abcdeghijklmnopqrstuvWxyz f
24 acdefghijklmnopqrsTuvwxyz b
25 abcdefghijkLmnopqstuvwxyz r
26 abcdefghijlmnopqRstuvwxyz k
I win in scenario #8. Let's say we open the other cases starting at the back of the alphabet. Then the only times we will get down to 2 remaining cases are scenario #8 and scenario #13 (when "A" is the winning case).
The percentage of total games that I win? 1/26. That is my starting probability.
But I win 1 of the 2 games that go all the way down to the wire. That is - I win 50% of them. So based on your simulation, I would win 1 time by not switching (scenario #8), and I would win one time by switching (scenario #13) so I win the same number of times either way, in your simulation.
Dave
Posted 31 Oct 2007 at 8:58pm #
Dave,
I see by your response above that you're not considering only the specific scenario when two cases are unopened and the million has not been revealed. And that's the only scenario I'm considering, as I thought that's what this discussion was all about. So I think we're talking apples and oranges, and that's why we can't agree. All along I admitted that such an outcome would be a relatively rare event when playing the real game, as most of the time the game will not end up with two cases with the million in one of them. But it will happen sometimes in the real game, and when it does, I was thinking about the probabilities of each case holding the million then.
I think you are considering all possible outcomes up to that point in time, and not just that specific outcome I'm considering. I reach this conclusion because in only two situations in your interpretation of the above simulation was the million left in play. I don't think we will ever be able to agree if we don't first agree about what we are talking about!
Posted 31 Oct 2007 at 10:08pm #
Gary writes: "I don't think we will ever be able to agree if we don't first agree about what we are talking about!"
And on that we agree!
:o)
I think Corey, Honda and I have generally agreed that in the various scenarios you and Kurt have presented, the probability is indeed 1 in 26, AS YOU PRESENT IT. But also, in general, when our side has presented a scenario, your side seems to re-write it - which changes the problem.
Discussion of the actual problem in the next post...
Posted 31 Oct 2007 at 10:20pm #
Gary: "All along I admitted that such an outcome would be a relatively rare event when playing the real game, as most of the time the game will not end up with two cases with the million in one of them."
Dave: Yes 24 in 26 times we will not get down to the last two cases.
Gary: "But it will happen sometimes in the real game, and when it does, I was thinking about the probabilities of each case holding the million then."
Dave: Yes. So am I.
Gary: I think you are considering all possible outcomes up to that point in time, and not just that specific outcome I'm considering.
Dave: The first half is true, I don't know about the second part, because I'm considering the path that got us to the end, AND the probability at the end, when we get there.
Gary: I reach this conclusion because in only two situations in your interpretation of the above simulation was the million left in play.
Dave: Yes. Only in 2 situations is the million left in play. In 24 of 26 situations we don't get to the end. In 1 of 26 we get to the end and I win, and in 1 of 26 we get to the end, and I lose.
So now the BIG question. IF IF IF we do manage to get all the way to the end of the game, will I benefit by switching? NO. If I don't switch I will win one of those 2 games, and if I do switch I will win one of those 2 games. Switching provides no benefit in this game, formulated exactly this way.
Posted 31 Oct 2007 at 10:43pm #
All,
Another simple formulation -
24/26 times we do not get to the end of the game with $1mil remaining. Thus -
It is a rare event that I pick the $1 million on DoND. (1/26)
It is an EQUALLY rare event that the second-to-last case has the million. (1/26)
The RELATIVE probability of those two rare events, the "odds ratio" is 1 to 1.
When we are down to the last two cases, the only possibilities remaining are two equally improbable rare events. Since there are two possibilities left, and they are equally probable, the probability of each is 50%.
Starting from the beginning -
I will win 1 of 26 games by "sticking to my guns".
I will win 1 in 26 games, by switching at the last moment.
In 24 of 26 games, I will never get the opportunity to switch at the last moment.
I will win the same number of games by switching at the last moment as I win by not switching. (1 each)
But again, if we make this game into a "Monty Hall" game, the situation is very different, and I should switch.
Posted 31 Oct 2007 at 11:06pm #
Gary, I had one more thought. It sounds like you are probably familiar with the product rule for probabilities. That is - if A and B are independent events, then the probability of "A and B" is the probability of A times the probability of B. (In general, for dependent events, it reads P(A and B) = P(A) * P(B|A), which is the form we need below.)
The probability of a 6 on 1 die = 1/6. The probability of 2 sixes = 1/36. The probability of 9 heads in a row = 1/512. The probability of heads on the 10th flip = 1/2. Thus the probability of 10 heads = 1/512 * 1/2 = 1/1024.
The probability that I win on DoND should then be the probability that I make it to the last 2 cases times the probability that I survive that last hurdle. Both those things have to happen for me to win, so we need that product rule.
We agree, I think, that there are 2 chances in 26 that I will make it to those last two cases.
You think my chances of winning once two cases are left is 1 in 26, and I say 1 in 2.
If you were right then my overall chance of winning DoND according to the product rule would be 2/26 * 1/26 = 1/338. (Probability of getting to the last two cases times the probability of winning once we get there) (should, but does not, equal total probability of winning the game).
I would argue my chances of winning are 2/26 * 1/2 = 1/26. (probability of getting to the end of the game, times probability of winning once we get to the end, equals total probability of winning on DoND starting from the beginning).
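In display form, the two candidate calculations side by side:
\[
P(\text{win}) = \frac{2}{26} \times \frac{1}{2} = \frac{1}{26}
\qquad \text{vs.} \qquad
\frac{2}{26} \times \frac{1}{26} = \frac{1}{338}
\]
Only the first agrees with the 1-in-26 chance we both accept from the start of the game.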
Posted 01 Nov 2007 at 3:13pm #
Hey Dave,
If you were on DoND and it went down to the last two cases and the million hadn't shown up, it wouldn't matter to you which case you ended up with (50:50).
I would switch.
According to your logic you would win half of the time and so would I so it really doesn't matter what action we take. Both actions lead to the same expectation of winnings.
Neither of us is arguing that it is better to stay with the originally selected case. So, if there is even the slightest possibility that my logic makes sense then it would be better to switch.
It's been fun.
Posted 01 Nov 2007 at 3:35pm #
Kurt: I was not agreeing with you.
We all agree that the contestant's odds are 1/26 at the start of the game. But if you make it down to the final two (by luck, and not by cheating) the odds are 1/2.
If you believe that you are 25/26 to win by switching then please respond to this:
At the beginning of the game, you are 25/26 to pick a losing case. Do you believe you will make it to the final 2 without opening the million dollar case every single time?
Posted 01 Nov 2007 at 3:46pm #
[quote comment="44092"]
If we get down to the situation where we have two cases left (the selected case and one model's case) and the million has not been revealed then the following two statements hold true:
The probability that the contestant's case holds the million is 1/26, not 1/2.
The probability that the other case contains the million is 25/26, not 1/2.[/quote]
That is simply not true. Look at it this way:
At the beginning of the game, the models have a 25/26 chance to be holding the million dollar case. But their chances go down (and the contestant's go up) as they lose cases.
Suppose there's a lottery going on, with only 26 total tickets in play. A random number from 1 through 26 will be chosen to determine the winner. I give you 25 of the 26 tickets, and I give your friend only 1 ticket.
But it's a windy day, and a sudden gust of wind blows 24 of your tickets out of your hand, and they fall into a sewer opening and you can't get them back.
You now have 1 ticket, and your friend has 1 ticket. Do you think you're more likely to win, just because you used to have more tickets?
Posted 01 Nov 2007 at 4:32pm #
What if you knew that none of the tickets that went down the sewer was the winner? That is the situation I am talking about:
- two cases left and the million not revealed
Posted 01 Nov 2007 at 4:57pm #
E Honda,
Gary's last post to me indicates that he seems to agree on the main point now.
But I'd stick with my case given the 50/50. I'd be kicking myself too hard if I had the million and let it go.
Dave
Posted 01 Nov 2007 at 8:58pm #
Dave and Honda,
Thanks for being patient and sticking with me. I'm in a humble kind of mood right now because I now realize I was wrong. Quite a good moment of realization though, so thanks for helping me get it.
Posted 02 Nov 2007 at 3:06am #
Gary,
Did your original view come from a statistics class? It sounded like it might have.
There seem to be a lot of conceptual difficulties in this area. I was just reading Jaynes's "Probability Theory: The Logic of Science" and he talks about Laplace's rule of succession. Laplace developed this around 1800, and for two centuries, if the result was mentioned in a text, it was generally to mock it, and to say never to use it. In the 21st century we're learning it has a fundamental role in probability theory. But for 2 centuries virtually nobody got it.
Maybe this bodes well for my other debate. My paper just came back with basically the same objection it had the first time, which I thought I had answered, so now I need to try to convince a referee he's missing something.
:o/
The question there involves meta-probabilities. Those are "probabilities of probabilities". It's like asking what is the probability that the "real" probability of X is between 20% and 21%? That adds another layer of complexity since you've got two different sorts of probability to keep track of. I'm arguing the rules for one sort are just slightly different than the rules for the other sort. The referee thinks that won't work, but I'd argue he's confused the two levels. We'll see...
Dave
Posted 02 Nov 2007 at 4:44pm #
Dave,
No, no such luck, I can't blame a course or anyone else. I enjoy dabbling in these types of problems as a hobby, it's fun and it keeps the mind a little sharper than it would be otherwise.
Good luck with your submission, it sounds like a complicated topic. On the face of it meta-probability appears to be related to the confidence interval.
Posted 02 Nov 2007 at 5:46pm #
On meta-probabilities -
Related to confidence anyway. Jaynes's example is "previous life on Mars" vs. an "inspected fair coin". Suppose we think the probability of each is 50%. But even given 10 heads in a row, we are not going to change our assessment of the fair coin very much. We still think heads is 50% probable on the next toss. But a little bit of new information about Mars could change our probability assessment there very quickly. The meta-probability associated with the 50% probability of heads on the coin is very high; we're very sure it is 50%. The meta-probability associated with the 50% probability of previous life on Mars is low. Our estimate is 50%, but that could change easily.
Dave
Posted 03 Dec 2007 at 12:39am #
In reply to the 1/26 vs. 1/2 argument,
Picture a scenario where a ball is some distance away from a wall. If the ball moves directly toward the wall, we can say its chances of hitting the wall are 100%, assuming it travels the correct distance. If the ball moves away from the wall, we can say its chances of hitting the wall are 0%, even if it travels the same distance as in the first trial. One step further, we can say if the chances of the ball hitting the wall are 100%, then the chances of it not hitting are 0%, and if the chances of hitting are 0%, then the chances of not hitting are 100%.
Let's move into DoND terms. If ten people each have a case in their hands, one with a million and nine with less than a million, and one case is revealed to have less than a million, then that case's chances of having the million reduce to 0% "at this moment." As with the ball experiment, if the chances of one situation go down, the chances of the alternatives must go up, since the chances must always sum to 100%. Thus, if the chances of one case having a million reduce, then the chances of the other cases having a million must increase. If we eliminate 8 of the 10 cases lower than a million, then all 8 of those had their chances of being a million reduce to 0%, which caused the chances of the last two to increase. Neither is 100%, and neither is 0%, however, but their probabilities changed nonetheless.
The biggest misconception in this scenario is the time perspective, or the input of trials. With respect to the beginning, each of the cases had an equal chance of being the million. As a trial is completed, the number of cases decreases, which changes the variables of the experiment, thus making it a completely new trial when the process is repeated. This trial has fewer cases and makes the chances higher, with respect to the previous test, for the other cases to be a million.
Finally we can look at it from the high"er"/high"est" point idea. Ten cases all have different and unknown values in them, for instance. If one, two, or even eight cases are removed, knowing none are the high"est", the chances of the next case being high"er" than the other nine cases remains constant. After all, the next case could be less than the tenth case, but still be greater than the other eight removed, and therefore satisfy the high"er" statement. However, the chances of that case being high"est" are varied as the number of cases decreases that could be high"est". After two cases are removed, one is eliminated from being the high"est". After three, two cases are eliminated, and so forth. The number of cases makes no impact on whether the next case is high"er" or not than the others, but does impact whether the next case is high"est".
Posted 10 Dec 2007 at 9:58pm #
I teach 8th grade advanced algebra. This discussion has been awesome for my class and me to follow and discuss during the probability and statistics unit...which I drew out longer just to keep the 14-year-olds involved in thinking about this. I had to "interpret" a lot of the advanced algorithmic stuff, but the gist of the discussion led us to try many simulations suggested in the posts. So thanks!!!!!! But the remaining question is...How does the banker determine his offer? We are still hashing that one out. And as I watch the show even as I type, the contestant is deciding whether to switch cases [the 1 million is out of play, sorry]. The guy ended up with $1. Har har har.
Posted 11 Dec 2007 at 10:59am #
The old formula for the on-line game can be found here -
http://www.davegentile.com/stuff/Deal_or_no_deal.html
They have changed the formula, however.
All I've determined is that for the first 5 rounds, the offer is between 20% and 35% of the expected value. It is (almost) always a whole-number percentage multiplier (20%, 21%, 22%, etc.).
After that, when you get into the single suitcase rounds, the range of possible multipliers rises with each case. It seems to be about 60%-75% by the time you get to two cases.
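Purely as a toy illustration - the ramp below is my own guess interpolated from those observations, not the show's actual formula:
======
def banker_offer(remaining, round_number, total_rounds=9):
    # Hypothetical offer: mean of the remaining amounts, scaled by a
    # multiplier ramping 20%-35% over the first 5 rounds, then up to 75%.
    ev = sum(remaining) / len(remaining)
    if round_number <= 5:
        multiplier = 0.20 + 0.15 * (round_number - 1) / 4
    else:
        multiplier = 0.35 + 0.40 * (round_number - 5) / (total_rounds - 5)
    return round(ev * multiplier)

print(banker_offer([0.01, 5, 400, 10000, 750000], 5))   # mid-game lowball
print(banker_offer([1000, 1000000], 9))                 # final offer near the mean
======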
Posted 11 Dec 2007 at 11:22am #
O.K. if we've done that probability "paradox" to death, here is a new one -
Suppose I have two envelopes. We are told one envelope contains twice the $ amount of the other. Which should I choose? The symmetry of the situation says that it should not matter. There is a 50/50 chance of picking the better envelope.
Now open one. Suppose we find $1000. Now we know the other envelope must have either $500 or $2000, each with 50% probability. The expected value of the other envelope is then $2500/2 = $1250. So we should switch, if given a chance. But this means that no matter what we find in the first envelope, we will always decide we should switch. How is this possible when the odds were 50/50 to start with?
This paradox shows that probability depends on the information you have available. (At least that is what a Bayesian interpretation of this problem says). When we don't know the contents of either envelope, they are equal. But the problem is rigged in such a way, that whichever envelope we know the contents of will leave us in an information situation where we should prefer the other envelope.
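One way to see the resolution is to simulate it. In this Python sketch the pair of amounts is fixed in advance - the assumption the expected-value step quietly glosses over - and "always switch" earns exactly the same as "always keep":

    import random

    # Fix a pair (x, 2x) in advance, hand out one envelope at random,
    # and compare the "always keep" and "always switch" policies.
    x, trials = 1000, 100_000
    keep = switch = 0
    for _ in range(trials):
        pair = [x, 2 * x]
        random.shuffle(pair)
        keep += pair[0]       # value of the envelope we were handed
        switch += pair[1]     # value of the other envelope
    print(keep / trials, switch / trials)   # both ~1500: no gain from switching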
You might not want to share this one with 8th graders. They might decide the whole thing is bogus.
On the other hand, dealing with this sort of thing is important if you are a graduate student studying markets. Because in a market, in many cases, or at least in the simplest models, the probability of halving your money is equal to the probability of doubling your money. Knowing nothing but that one fact about the market, your expected holdings tomorrow are larger than today's (at least a little bit).
A complication in a market situation is that there is a zero value, and units of money are only divisible into so many pieces. Thus - there is always some possibility of losing everything. This offsets, to some extent, your expected gain. That is - if you stay in until tomorrow, you will probably make money, but if you stay in forever, you will go broke at some point.
Posted 19 Dec 2007 at 12:48am #
Ok, so I'm no mathematician, and the analogies are getting a bit old at this point. But I can offer something which nobody seems to have tried in some time. I wrote a program to simulate a bunch of trials, using an approach that more closely mirrors what you see on the real show. I'm not looking at offers -- only games where you've made it to the very last round and need to decide whether to keep your case or swap it for the one remaining on stage. Call it the Mythbusters approach -- forget the theory, just experiment & observe!
My program started like this:
1. Pick a case at random
2. Randomly eliminate all but one other case
3. Compare the initial case to the final remaining one
I ran this a million times, looking to see whether it's better to stick with your original case or switch. There may or may not be a million dollars involved, but will you get a higher payout by staying or switching?
Not surprisingly, it makes no difference. The chance of your initial pick being higher than the final remaining case came out to 50.14%. HOWEVER, there's absolutely no consistency to the amounts you're comparing -- it could be $25 and $300. If those are the only two cases you have left, nobody's really going to care.
So, I decided to limit my program to scenarios that often occur on the show, and where you really do care about the outcome. I used the same procedure as above, BUT required the following condition to be true:
- After eliminating all but one case, one remaining amount must be >= $200,000, and the other must be <= $50,000
If that condition is not met, throw out the trial (this game is uninteresting) and play again.
So now you're in the very last round, the two amounts on the board are $50,000 (or lower) and $200,000 (or higher), and you need to decide whether to stay or switch. What do you do?
Incidentally, this happened in 33.28% of all trials -- meaning I had to run just over 3 million trials to get 1 million where this happened. So if you blindly refuse all offers and make it to the last round, there's about a 1 in 3 chance that you'll be choosing between one amount that's $50,000 or lower and another that's $200,000 or higher. But anyway...
Now that you're in that situation, should you stay or switch? Once again, it makes absolutely no difference! In 50.09% of the matching trials, staying with the initial case led to the higher payout.
So now let's limit it to the "most" interesting scenario. You're down to two cases, and one of them has the million dollars. This happened in only 7.7% of all trials -- about 2 out of 26, exactly as you'd expect. (Either the initial case OR the final remaining case must contain the million.) Now what do you do?
Surprise, surprise: yet again, staying with the initial case yielded the million in 50.11% of the trials.
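For anyone who wants to replicate this, here is a minimal Python sketch of the procedure - my own reconstruction, not the original program, using the standard US board values:

    import random

    AMOUNTS = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
               1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
               300000, 400000, 500000, 750000, 1000000]

    stay_wins = total = 0
    for _ in range(1_000_000):
        cases = AMOUNTS[:]
        random.shuffle(cases)            # steps 1 and 2: a shuffle leaves a
        held, last = cases[0], cases[1]  # random held case and final survivor
        # Uncomment to keep only the "interesting" games described above:
        # if not (max(held, last) >= 200_000 and min(held, last) <= 50_000):
        #     continue
        total += 1
        stay_wins += held > last         # step 3: compare the two cases
    print(stay_wins / total)             # ~0.50 with or without the filter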
Personally, I'm not really interested in a formal proof of this. I will say that the simulators on the Monty Hall site convinced me that in that game, you should always switch; in this game, my own simulator has convinced me that it doesn't make the least bit of difference.
Posted 14 Jan 2008 at 10:12pm #
[quote comment="17846"]I'm sorry but the engineer has some basic problems with logic. If you were to begin each turn with all cases back in play then yes. But you don't and this brings the equation back to the present which for example if you have 2 cases left $1 and $1,000,000 in play then at this point in time you have a 50/50% chance, odds for this purpose are current not historical, if you were to place a bet in a horse race out of 20 horses assuming they were physically equal etc. Then 18 horses were scratched from the race then you now have a 50/50 chance of your horse coming in first and your bookmakers odds would drop below 2/1 as opposed to below 20/1(to allow for profit). It would be totally irrelevant how many horses "were" in the race![/quote]
NO! It's a 50/50 chance that one of the suitcases has the $1 million... NOT a 50/50 chance that you picked the $1,000,000, because when you chose the case, it was out of 27, not out of 2. It doesn't change the INITIAL 1/27 chance of picking the $1,000,000. Just because you open one case at a time doesn't mean your chances are greater every time... it's always going to be a 1/27 chance that you picked the $1,000,000.
Posted 16 Jan 2008 at 7:18pm #
Jason,
Who cares about the initial probability? Yes, it is true that when you begin there is a 1/27 chance of picking the $1,000,000, but that becomes irrelevant as you continue playing the game. It's like the horse race example above. If 18 horses scratch, then it doesn't matter what the odds WERE; it matters what they are NOW. If you had a decision between the two remaining cases you should ignore the initial probability and focus only on the current one, which would be 50/50.
As you play the game you eliminate cases and thus rule out certain outcomes which MUST change the current probability that your case contains the $1,000,000.
You shouldn't think of it as choosing from 2 cases at the end; instead think of it as choosing the order in which you open all 27 cases. What's the probability the $1,000,000 case is in the last position? 1/27. What's the probability that it's in the second-to-last position? Also 1/27. The two values are equal, so when you eliminate all other possibilities you have equal chances for both cases, or 50/50.
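If you want to test that ordering argument, a few lines of Python (27 cases, matching the numbers above) will do it:

    import random

    # Shuffle 27 cases; index 0 stands in for the $1,000,000 case.
    # The million is equally likely to land in any position, so the
    # last and second-to-last slots each hold it about 1/27 of the time.
    trials = 270_000
    last = second_last = 0
    for _ in range(trials):
        order = list(range(27))
        random.shuffle(order)
        last += order[-1] == 0
        second_last += order[-2] == 0
    print(last / trials, second_last / trials)   # both ~1/27, or ~0.037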
Posted 23 Jan 2008 at 4:13pm #
Another engineer ringing in:
The best explanation of why the probability of winning one million dollars doesn't change is the phrase "the die has been cast."
If you were to roll a fair die and were asked what number will show, the odds are 1/6 that you will be correct. Suppose you throw that die and, instead of seeing it yourself, the casino worker begins to tell you all of the numbers that did not show. This does not affect your probability of winning, because you already threw the die. The "event" is over.
In DOND you make a choice at the beginning. The odds of your suitcase holding any particular value are 1/(the total number of cases at the time you are allowed to select one). After you have made your choice, the "event" is over.
What does change are the odds of the banker's offer being better or worse than what you may have. But recognize that here you are asked to choose, which constitutes another "event".
Posted 24 Jan 2008 at 5:11pm #
Woody,
I can't quite tell which side you are on. On DonD when there are two cases left, one of which has the $1,000,000 what is the probability that you have picked the $1,000,000, at this point in time?
Posted 27 Jan 2008 at 7:24pm #
"Carey and I watched the first episode of "Deal or No Deal" last night. We watched a dumb woman pass up $171,000 for a 1-in-3 (2-in-6) chance of winning either $300k or $500k. The other four choices were $50k, $7500, $500, and $100 or something like that. The lady, who had never owned a house, ended up with $25,000 (her briefcase had something small; $500 I think)."
You didn't say how far the woman pushed it, but if she was offered $171,000 with the board showing $100, $500, $7500, $50,000, $300,000 and $500,000 then turning down the deal was NOT dumb at all.
Six amounts on the board, and only 2 of them were more than her offer. This means if she reveals one of those 2, the offer plummets. But if she reveals one of the other 4, the offer will rise.
4 versus 2? Hmm... she is TWICE as likely to make the offer go up as down on the next pick. I'd take another pick, too.
Posted 27 Jan 2008 at 7:49pm #
[quote comment="45559"]4 versus 2? Hmm... she is TWICE as likely to make the offer go up as down on the next pick. I'd take another pick, too.[/quote]
While true, let me ask you this: if I gave you six briefcases and one had $1M, but the other five had $1, and I offered you $180k to stop or I was going to open two briefcases, would you do it?
In other words, the amounts matter too. It's not like she had a big clump all around $171k. Losing even one of the high ones would have been devastating (and it turns out it was).
Additionally, $171k to someone who's never owned a house reminds me of the phrase "bird in the hand beats two in the bush."
Posted 28 Jan 2008 at 12:14am #
[quote comment="45559"]
You didn't say how far the woman pushed it, but if she was offered $171,000 with the board showing $100, $500, $7500, $50,000, $300,000 and $500,000 then turning down the deal was NOT dumb at all.
Six amounts on the board, and only 2 of them were more than her offer. This means if she reveals one of those 2, the offer plummets. But if she reveals one of the other 4, the offer will rise.
4 versus 2? Hmm... she is TWICE as likely to make the offer go up as down on the next pick. I'd take another pick, too.[/quote]
In this situation, the cases add up to $858,100, with an average value of $143,017. Turning down an offer of $171,000 is dumb, since it means you're giving up $171k for something worth only $143k on average.
Additionally, with 6 cases remaining (and 2 large amounts), here are the possible case choices if you have to pick two:
500k 300k
500k 50k
500k 7500
500k 500
500k 100
300k 50k
300k 7500
300k 500
300k 100
50k 7500
50k 500
50k 100
7500 500
7500 100
500 100
So there's actually a 60% chance (9 out of 15) of opening a large case and losing money.
Posted 28 Jan 2008 at 5:10pm #
[quote comment="45510"]Woody,
I can't quite tell which side you are on. On DonD when there are two cases left, one of which has the $1,000,000 what is the probability that you have picked the $1,000,000, at this point in time?[/quote]
Dave, the chances that your case holds the $1,000,000 are 1/26.
This is because the "event" took place when there were 26 to choose from.
If you were allowed to pick again between the two cases at this particular point in time, it would constitute a new event and your chances would be 50/50.
In last week's game (Britney Lewzader) there were 2 cases left. One held $400 and the other had $1 million. Howie told her she had a 50/50 chance of winning a million dollars. That is incorrect! By deciding on the banker's offer (yet another event) she had a 50/50 chance of a "favorable outcome". There are 4 possibilities and 2 of them are in her favor. We only get to recalculate probability here because a choice is being made from what remains. It turns out the outcome of this event was in her favor.
Her choice was prudent because she took a 100% chance of winning $400k over a 1/26 chance of winning $1 million.
Had she not accepted the banker's offer, her chances of winning $1 million would have remained 1/26, because that decision was made from 26 choices. That die has already been cast.
A lot of folks have trouble accepting this because it is easy to get distracted by the option to accept or decline the banker's offer between rounds. The amount of his offer is dependent upon a different set of probabilities, because the contestant chooses cases to eliminate. Each of these "events" had a different probability of increasing the banker's offer depending upon which cases are available and how many you get to choose.
Posted 28 Jan 2008 at 5:35pm #
[quote comment="44846"]O.K. if we've done that probability "paradox" to death, here is a new one -
Suppose I have two envelops. We are told one envelope contains twice the $ amount as the other. Which should I choose? The symmetry of the situation says that it should not matter. There is a 50/50 chance of picking the better envelope.
Now open one. Suppose we find $1000. Now we know the other envelope must have either $500 or $2000, each with 50% probability. The expected value of the other envelope is then $2500/2 = $1250. So we should switch, if given a chance. But this means that no matter what we find in the first envelope, we will always decide we should switch. How is this possible when the odds were 50/50 to start with?
[/quote]
Good question Dave! 🙂
Choose between two envelopes (EVENT #1).
Chances of having the big envelope are 1:2
Open the envelope to reveal its value (not an event)
Choose between your envelope and the open envelope (EVENT #2).
Your chances of having the larger value are 1:2.
The odds don't "remain" 50/50. In your example they just so happen to be 50/50 again for the second event. The amount revealed by opening the envelope is irrelevant. What matters in the second event is that you are again being asked to choose between two options.
Posted 28 Jan 2008 at 5:58pm #
Dave said on December 11, 2007:
O.K. if we've done that probability "paradox" to death, here is a new one -
========
Woody,
We agree the probability is 50/50 all the time here. But you ignored the "expected value" calculation, which is what makes this problem interesting. The interesting question, after opening envelope #1, is this: "Will your average payoff be larger if you choose envelope #2?"
Suppose envelope #1 has $1000. Now there is a 50/50 chance that envelope #2 has either $500 or $2000. The average payoff in envelope #2 is (2000+500)/2 = $1250. That's greater than the $1000 in your envelope, thus you should switch. You have a 50% chance of being worse off by $500 and a 50% chance of being better off by $1000.
Posted 28 Jan 2008 at 6:23pm #
Woody: the chances that your case holds the $1,000,000 are 1/26.
This is because the "event" took place when there were 26 to choose from.
If you were allowed to pick again between the two cases at this particular point in time, it would constitute a new event and your chances would be 50/50.
Dave: Nope. As far as when probabilities change - that might depend on what sort of probability we are talking about. In Bayesian probability, we update with each new piece of information, and on DonD, every case we open is another bit of information.
Woody: Had she not accepted the banker's offer, her chances of winning $1 million remain 1/26.
Dave: See the discussion surrounding the computer simulation, above. In 26 games played to the end:
On average 24 games will not have $1,000,000 left, when we reach 2 cases.
On average in 1 game, the million will be in the penultimate case.
On average, in 1 game, there will be a $1,000,000 winner.
There is 1 winner in every 26 games, played from the beginning.
However, of the 2 games that reach the last two cases, there will be 1 winner. Thus there will be on average 1 winner in every two such games. And at least to this Bayesian, that is a 50% probability.
It sounds like more confusion with the Monty Hall game. There, when there are three doors and Monty opens one, your odds really DO stay 1/3. This has nothing to do with the counting of "events," however. The difference between the games is that Monty opens a door that MUST NOT contain the grand prize, but on DonD, every case you open MAY contain the grand prize. Thus the information provided by opening a door is different from the information provided by opening a case.
Three Monty games where we do not switch:
In 2 of the three games, the contestant will not initially pick the winning door. Of the two remaining, Monty will open the remaining losing door. The contestant will not switch, and will lose.
In 1 of the three games, the contestant will initially pick the correct door. Monty will offer a switch. This is declined. We have a winner.
1 in 3 games is won by not switching
Three Monty games where we do switch:
In 2 of the three games, the contestant will not initially pick the winning door. Of the two remaining, Monty will open the remaining losing door. The contestant will switch, and will win.
In 1 of the three games, the contestant will initially pick the correct door. Monty will offer a switch. This is accepted. We have a loser.
2 in 3 games are won by switching in the Monty game.
But again, on DonD, if you have 2 cases left you will win 1 time in 2 if you switch, and 1 time in 2 if you don't switch. In 24 out of 26 games, you never get down to the last two cases with the million in play, and in the 2 games that do reach this point, there is 1 winner and 1 loser, on average. Repeating the above –
On average 24 out of 26 games will not have $1,000,000 left, when we reach 2 cases.
On average in 1 game out of 26, the million will be in the penultimate case.
On average, in 1 game out of 26, there will be a $1,000,000 winner.
There is 1 winner in every 26 games, played from the beginning.
However, of the 2 games that reach the last two cases, there will be 1 winner. Thus there will be, on average, 1 winner in every two such games. And at least to this Bayesian, that is a 50% probability.
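That bookkeeping is easy to reproduce. Here is a minimal Python sketch of the same count (my reconstruction, not the simulation mentioned above):

    import random

    # One game: shuffle 26 cases; cases[0] is held, cases[1] is the last
    # case left on the board. "1" marks the $1,000,000.
    games, reached, won = 100_000, 0, 0
    for _ in range(games):
        cases = [1] + [0] * 25
        random.shuffle(cases)
        held, last = cases[0], cases[1]
        if held or last:            # the million survived to the final two
            reached += 1
            won += held             # ...and we were holding it
    print(reached / games)          # ~2/26
    print(won / games)              # ~1/26
    print(won / reached)            # ~0.5: half of the surviving games win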
Posted 28 Jan 2008 at 6:34pm #
[quote comment="45602"]Dave said on December 11, 2007:
O.K. if we've done that probability "paradox" to death, here is a new one -
========
Woody,
We agree the probability is 50/50 all the time here. But you ignored the "expected value" calculation, which is what makes this problem interesting. The interesting question, after opening envelope #1, is this: "Will your average payoff be larger if you choose envelope #2?"
Suppose envelope #1 has $1000. Now there is a 50/50 chance that envelope #2 has either $500 or $2000. The average payoff in envelope #2 is (2000+500)/2 = $1250. That's greater than the $1000 in your envelope, thus you should switch. You have a 50% chance of being worse off by $500 and a 50% chance of being better off by $1000.[/quote]
Another good question:
Deciding whether to switch or not is a matter of personal strategy. If you say "I will always switch based on an expected value of...", that is a strategy. Your strategy could be such that $1000 is a suitable outcome for you. It would then be unwise to throw it away.
The expected value is defined as "the mathematical expectation of a random variable".
In your example where each outcome is equally possible the expected value is the average value of all possible outcomes. Your calculation is correct.
How you treat the expected value of an event depends upon your strategy in the situation you are considering. It could be a game show, a military campaign, a business decision, or a stock market choice. Expected value is one of many parameters used to describe what mathematicians call a "sample space".
Posted 28 Jan 2008 at 6:51pm #
Dave:
Bayes theorem applies to conditional events.
In DOND the choices being made are independent.
We are all in agreement that at the beginning of the game your chances of winning $1 million are 1/26.
By the time we get to two remaining cases, one of which holds $1 million, we are asked to accept a deal or keep our case.
We must first agree on what "winning" is.
If winning is making the best choice at this point in the game it is obvious the odds are 50/50. There are four things that can happen in our situation.
1) accept the offer and our case is smaller (favorable)
2) accept the offer and our case has a million (unfavorable)
3) decline the offer and our case is smaller (unfavorable)
4) decline the offer and our case has a million (favorable)
Four possible outcomes, two of which are favorable. The odds of "winning" here are 50/50.
If we define "winning" as getting the million then we have to decline the banker's offer. We must stick with our original choice because the banker's offer isn't a million. The odds our original choice is one million dollars are still 1/26.
Posted 28 Jan 2008 at 8:02pm #
Woody,
I've posted a few times to give an alternate view and try to help Dave convince the people who are still confused, but I will try again. If you have new information you MUST update your odds. Every time you eliminate a case, you are eliminating possible outcomes and thus learning more about your situation.
I'm going to use the Monty Hall Problem to help illustrate the idea. If you aren't familiar with this problem it would definitely help to be.
First situation:
-Contestant chooses a case from 26
-Host knows if the contestant picked the correct case and where the $1,000,000 case is
-Host then eliminates 24 cases, leaving the contestant's choice and one other case of his choosing.
-The host purposely chooses which cases to eliminate so that the million dollar case is one of the two left.
This situation results in a 1/26 chance of the original choice containing the million, so obviously it would be wise to switch.
Second situation is the normal DoND situation. The contestant eliminates cases one by one, eliminating cases by pure chance as opposed to host persuasion.
Would you rather play the first game or the second? The probability the $1,000,000 case makes it to the final two in the second game is 1/13 (2/26), and you're saying you have the same chance of winning as in the first situation? Hopefully this demonstration shows you that the logic is flawed. You can't ignore all the cases you eliminated during the game, because you have new information.
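A minimal Python sketch of those two situations side by side makes the gap obvious (my own reconstruction; case 0 stands in for the million):

    import random

    def dond():
        # Random eliminations: the game only "counts" if the prize
        # (case 0) survives to the final two.
        cases = list(range(26))
        random.shuffle(cases)
        held, survivor = cases[0], cases[1]
        if held == 0 or survivor == 0:
            return held == 0
        return None                     # prize was opened mid-game

    def monty_style():
        # The host knowingly removes 24 losers, so the prize always
        # reaches the final two; staying wins only the initial 1/26.
        return random.randrange(26) == 0

    d = [r for r in (dond() for _ in range(300_000)) if r is not None]
    print(sum(d) / len(d))                       # ~0.50
    m = [monty_style() for _ in range(300_000)]
    print(sum(m) / len(m))                       # ~1/26, or ~0.038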
Posted 28 Jan 2008 at 10:24pm #
Woody wrote: Deciding whether to switch or not is a matter of personal strategy.
Dave: True. Economists would define a personal "utility function", and that would be a factor in a full "decision theory" analysis. However, I've just assumed that we are risk neutral. If that is the case then we should always switch from envelope #1 to envelope #2, whatever amount is revealed.
Posted 28 Jan 2008 at 10:42pm #
Woody wrote: Bayes theorem applies to conditional events.
In DOND the choices being made are independent.
Dave: You can use Bayes theorem as a classical statistician, but that's not what I meant. There are two major philosophical schools of probability theory. One is "frequentist" or "classical" the other is "Bayesian".
One quick definition - A Bayesian says you can have the "probability of a hypothesis". In classical stats, you can't have that - a hypothesis is something you condition on.
Here is a practical example - I flip a coin, and hide it behind my back. What is the probability that it is heads? A classical statistician might answer that before the flip, the odds were 50/50. After the flip the probability is either zero or one, but we don't know which. In classical stats you can only have the probability of a random variable, not the probability of a hypothesis. I fall in the Bayesian camp. My hypothesis is that the coin is heads. I claim there is a 50/50 probability that this hypothesis is correct.
So, we may just be having semantic or philosophical issues, rather than any fundamental disagreement. Perhaps sharing our backgrounds might be relevant. After a Masters in physics, and some practical statistics work in market research, I did most of a PhD in Management Science. I actually ended up with an M.S. in Finance, in a program that's the rough equivalent of a financial engineering degree. I'm currently employed as a statistician, and working on publishing a paper in the philosophy of science area on the topic of Bayesian probability. Most of the related material I read these days is from the philosophy of science field, but I'm familiar with the topics from other perspectives.
So anyway, I'm starting to suspect we may just have a philosophical/semantic issue here. However, I'm not totally sure of that. Thus, the next post...
Posted 28 Jan 2008 at 10:52pm #
Woody writes: We must first agree on what "winning" is.
If winning is making the best choice at this point in the game it is obvious the odds are 50/50. There are four things that can happen in our situation.
1) accept the offer and our case is smaller (favorable)
2) accept the offer and our case has a million (unfavorable)
3) decline the offer and our case is smaller (unfavorable)
4) decline the offer and our case has a million (favorable)
Four possible outcomes, two of which are favorable. The odds of "winning" here are 50/50.
If we define "winning" as getting the million then we have to decline the banker's offer. We must stick with our original choice because the banker's offer isn't a million. The odds our original choice is one million dollars are still 1/26.
Dave: I want to clarify whether this is a real disagreement or a philosophical/semantic one. So here is a question. Let's leave out the term "probability" and just focus on fractions, or the frequency of events. We are in a situation where there are two cases left. One contains the $1,000,000. No choices are offered at this point. What fraction of the time will it be true that we will move from this current situation to a situation where we are $1,000,000 winners? Or, in other words: suppose we play 26,000 total games, and 2,000 result in us getting to the final two cases with $1,000,000 remaining in play. No choice is offered. Is it correct to say that we will be $1,000,000 winners in one half of those cases? That is, is it correct to say that we will win the $1,000,000 in about 1,000 of the 2,000 trials?
Posted 29 Jan 2008 at 12:21pm #
[quote comment="45614"]...
So anyway, I'm starting to suspect we may just have a philosophical/semantic issue here. However, I'm not totally sure of that. ... [/quote]
I am a BSEE and have been working for 10 yrs in embedded software development. Most of my work experience is in the area of signal processing. In this thread I have drawn on my experience with decision-making algorithms.
What I understand of the Bayesian model is that it uses the "effect" to describe the "cause". A fundamental example being this:
I have blindly pulled n number of balls from a bucket.
I do not know how many balls were in the bucket nor do I know how many are of any particular color.
I know that half of the balls I have drawn are black and half are white.
Bayesian statistics says that I have a 50/50 chance of selecting a white ball if I pick again.
As I select more balls, the probability can change based on the evidence of the picks that have already been made.
So Dave, it is possible I may not understand the Bayesian model completely.
---
The reason I do not feel Bayesian logic applies here is that in DOND the sample space is completely defined. We know exactly what our sample space is from the beginning.
Wouldn't the Bayesian model agree with the classical model, as it did in your coin flip example?
---
Dave: ... Suppose we play 26,000 total games, and 2,000 result in us getting to the final two cases with $1,000,000 remaining in play. No choice is offered. Is it correct to say that we will be $1,000,000 winners in one half of those cases? That is, is it correct to say that we will win the $1,000,000 in about 1,000 of the 2,000 trials?
Woody: I don't think so, Dave.
The only way to win the $1,000,000 is to have picked it already at the beginning. That probability remains 1/26.
I do recognize that if the game had arrived at this situation 10 times and 5 of those times resulted in a million dollar winner, Bayesian statistics would say the probability is 50/50 that the next person in that situation would win the million, while classical statistics would still say 1/26.
So yes, if there were in fact 2000 actual games played in the situation you described, and there had actually been 1000 winners of the $1,000,000, then I do understand that the Bayesian model says the chances of the next guy winning are 50/50. But can we arbitrarily declare the probability to be 50/50 without previous results to support the hypothesis?
Dave, I admit I am no expert in the Bayesian model.
I first visited this thread to see if anyone had noticed a predictable trend in the banker's offer. But it seems they simply fudge plus or minus a bit from the mean value of what remains on the board.
Your posts have encouraged me to look further into the Bayesian idea though. I appreciate that.
So if I am correct in my classical model and you are correct in your Bayesian approach then it seems we may be in agreement.
Posted 29 Jan 2008 at 12:32pm #
[quote comment="45610"]... Would you rather play the first game or the second? The probability the $1,000,000 case makes it to the final two in the second game is 1/13 (2/26) and you're saying you have the same chance of winning as the first situation? Hopefully you can tell that your logic is flawed by this demonstration. You can't ignore all the cases you elimated during the game because you have new information.[/quote]
Corey,
I am familiar with the Monty Hall problem and it is quite different from the DOND situation.
With Monty you pick 1 of 3. Monty deliberately selects one of the booby prizes. If you were to pick again you would now have a 50/50 chance of winning. This is because you are picking again from a different sample space.
The chances of the $1,000,000 making it to the final round are not 1/13. They are the chances of you picking it plus the chances of you never turning it over if you don't pick it first. I have not calculated it but it is definitely not 1/13.
In DOND we only get to select one suitcase. We do it at the beginning from a pool of 26 cases. We never get the chance to trade it in for another case when there are fewer cases remaining. We do get this opportunity in Monty Hall, and therein lies the difference.
Posted 29 Jan 2008 at 12:50pm #
[quote comment="45627"] ...
The chances of the $1,000,000 making it to the final round are not 1/13. They are the chances of you picking it plus the chances of you never turning it over if you don't pick it first. I have not calculated it but it is definitely not 1/13.
... .[/quote]
I have made an error regarding this point. Corey, you are correct that the chances of the $1,000,000 suitcase (or any case) being in the final two are 1/13.
Posted 29 Jan 2008 at 1:00pm #
[quote comment="45628"]
I have made an error regarding this point. Corey, you are correct that the chances of the $1,000,000 suitcase (or any case) being in the final two are 1/13.[/quote]
I must correct myself again 🙁
This is only true if I picked the two cases at the very beginning.
For any case to be in the final round I either have to have picked it in the beginning or never picked it at all. I will try again without being so hasty, but I am certain the result will not be 1/13.
Posted 29 Jan 2008 at 1:05pm #
[quote comment="45629]This is only true if I picked the two cases at the very beginning.
[/quote]
I am also incorrect here.
😉
Posted 29 Jan 2008 at 2:19pm #
Woody: So if I am correct in my classical model and you are correct in your Bayesian approach then it seems we may be in agreement.
Dave: I think you are 95% correct from the perspective of the Frequentist camp. And of course, I think I'm correct from a Bayesian point of view.
Let me try to do the analysis from a frequentist point of view. (Less familiar territory for me). Here the only probability we can have is the probability of a random variable. Conceptually probability = the relative frequency of some event in the infinite limit. Thus when we say that we have a 1/26 chance of picking the $1,000,000 we mean that as we approach an infinite number of trials, the actual percentage of times we succeed will be 1/26th.
So far so good.
Now suppose we get down to two cases. This happens 1 out of 13 times on average. If we ask the classical statistician "What is the probability that our case contains $1,000,000?" he'll tell us that we have asked an improperly phrased question. He'll tell us we can't have "the probability that case #26 contains $1,000,000" because, technically, the contents of our sampled case are not a random variable. It has a definite physically defined value, but we just don't know it yet. So to the classical statistician there is no answer to the question "What is the probability my case contains $1,000,000?" (Other than "0 or 1, but I don't know which") because we are asking for a probability associated with a hypothesis, and the classical statistician says we can't have such a thing.
But I think we could use Bayes's formula classically as follows -
P(M|N) = P(M)*P(N|M) / P(N)
The | means "conditional".
P(M) is the probability of selecting $1,000,000 in a 1 case sample. Classically this is still the random variable associated with our initial 26 cases, so P(M) = 1/26.
P(N) is the probability of not finding the $1,000,000 in a 24 case sample of the 26 cases = 1 - 24/26 = 2/26 = 1/13.
P(N|M) - this is the probability of not finding it in the 24 GIVEN that it really is in our final case. P(N|M) = 1. If it is in our final case, the conditional probability of finding it in those other 24 was 0, so the probability of NOT finding it in those other 24 = 1.
Then our formula says: P(M|N) = P(M)*P(N|M) / P(N)
and we have P(M|N) = (1/26) * (1) / (1/13) = 1/2.
And the probability of sampling the $1,000,000 in the final case, conditional on not having found it in the previous 24 = 50%.
More to come…
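A quick way to check that arithmetic is with exact fractions in Python:

    from fractions import Fraction

    # The numbers above, computed exactly.
    P_M = Fraction(1, 26)        # million is in our held case
    P_N = Fraction(2, 26)        # million is NOT among the 24 opened cases
    P_N_given_M = Fraction(1)    # if we hold it, the other 24 can't show it
    print(P_M * P_N_given_M / P_N)   # 1/2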
Posted 29 Jan 2008 at 2:48pm #
Well done Dave. That is a very fine explanation. I understand now where you are coming from.
Forgive those three brainfart posts. That's from me trying to do too many things at once. 😆
Posted 29 Jan 2008 at 2:52pm #
Woody: I have blindly pulled n number of balls from a bucket.
I do not know how many balls were in the bucket nor do I know how many are of any particular color.
I know that half of the balls I have drawn are black and half are white.
Bayesian statistics says that I have a 50/50 chance of selecting a white ball if I pick again.
As I select more balls, the probability can change based on the evidence of picks that have already been made.
Dave: Yes, that correctly describes the situation from a Bayesian point of view.
In my view classical statistics is completely correct, but incomplete, because it limits what we can assign a probability to.
Sometimes Bayesian probability is thought of as representing our "degree of belief". I'm not fond of that, because it is associated with "subjective Bayesianism", in which the probability can vary from person to person, and I favor an "objective Bayesian" approach. But "degree of belief" does capture the idea that what we are concerned with is an "in my mind" or "epistemological" probability, whereas in the classical approach probability is "out there" and assumed to be associated with something physical. They can agree, of course. If we are about to roll a die, the physical probability is 1/6th, and my justified epistemic state is also a 1/6th degree of belief. But if the die is cast and hidden, then the approaches disagree. Physically, the probability is now 1 or 0. Epistemically, I'm in the same place I was before, since I have learned nothing. Thus *information* is key in a Bayesian approach.
And rather than call the probability "degree of belief" (as in subjective Bayesianism), I prefer "logical inductive probability" (an objective Bayesian idea). The idea here is that there is a system of rules, analogous to deductive logic, which defines the probability. Deductive logical operations produce values of 0 or 1, depending on the information input. "Logical inductive probability" produces a value between 0 and 1, again dependent on the information input.
Let IP = inductive probability. Say before the die is cast, IP = 1/6th. After it is cast and hidden, I have received no information, so IP = 1/6th, as it did before.
In drawing marbles from the unknown distribution, in your example, each marble drawn provides information and thus changes my probability. A difficult question for the Bayesian approach concerns the "prior probability" - that is, "What is the probability when we have no information at all?" My paper proposes my answer to this question, slightly different than previous attempts at an answer.
Posted 29 Jan 2008 at 2:53pm #
Woody: Well done Dave. That is a very fine explanation.
Dave: Thanks. 😀
Posted 29 Jan 2008 at 6:34pm #
[quote comment="45627"][quote comment="45610"]... Would you rather play the first game or the second? The probability the $1,000,000 case makes it to the final two in the second game is 1/13 (2/26) and you're saying you have the same chance of winning as the first situation? Hopefully you can tell that your logic is flawed by this demonstration. You can't ignore all the cases you elimated during the game because you have new information.[/quote]
Corey,
I am familiar with the Monte Hall problem and it is quite different from the DOND situation.
With Monty you pick 1 of 3. Monty deliberately selects one of the booby prizes. If you were to pick again you now have a 50/50 chance of winning. This is because you a picking again from a different sample space.[/quote]
Woody,
I'm not sure how familiar you are with Monty Hall but at no point is there a 50/50 chance of winning.
Posted 30 Jan 2008 at 1:19pm #
[quote comment]
Woody,
I'm not sure how familiar you are with Monty Hall but at no point is there a 50/50 chance of winning.[/quote]
I am very familiar with the Monty Hall problem and absolutely certain of the probabilities stated above.
At the point where you are given a second choice, here are the possible outcomes if you switch doors.
1) you had the car and pick a goat (unfavorable)
2) you had the goat and pick a car (favorable)
That is a 50/50 chance of winning. I am very sure of this.
This exact problem was used as a demonstration in a week-long seminar I attended led by faculty members of the University of AZ.
It was in this same seminar that I was introduced to Bayesian statistics.
Posted 30 Jan 2008 at 1:24pm #
[quote comment="45654"][quote comment]
Woody,
I'm not sure how familiar you are with Monty Hall but at no point is there a 50/50 chance of winning.[/quote]
I am very familiar with the Monty Hall problem and absolutely certain of the probabilities stated above.
At the point where you are given a second choice, here are the possible outcomes if you switch doors.
1) you had the car and pick a goat (unfavorable)
2) you had the goat and pick a car (favorable)
That is a 50/50 chance of winning. I am very sure of this.
This exact problem was used as a demonstration in a week-long seminar I attended led by faculty members of the University of AZ.
It was in this same seminar that I was introduced to Bayesian statistics.[/quote]
Reviewing my notes, I am wrong. Corey, I'm not sure why I keep messing up any time I respond to your posts 😛
Switching doors at the end gives the contestant a 2/3 chance of winning the prize.
I stand corrected.
http://en.wikipedia.org/wiki/Monty_Hall_problem
Posted 19 Feb 2008 at 4:47am #
Look, Marty's question is WAY too simple to justify all these posts on it. Both Marty and the engineer are right, because they are answering DIFFERENT questions. They are both also wrong because they refuse to acknowledge this simple fact...
The engineer is answering this question: What are the odds that the contestant picked the case with $1,000,000 in it?
Answer is 1 in 26, and does not change at any time during the game.
Marty is answering this question: What is the chance that the contestant has chosen the $1,000,000 case given the knowledge that his or her case is one of four that MUST contain $1,000,000?
Answer is 1 in 4, and is 100% correct in response to the current question.
So both are right, because they are each answering different questions, and that is as complex as this issue gets.
BTW, from a FUNCTIONAL standpoint, as in how you go about getting by in the REAL world, Marty is asking the most useful question, and therefore getting the most useful answer.
The engineer is correct, but if he actually tried to live his life answering all questions like this he would probably be dead in a week (imagine someone who crosses streets based on overall odds, not on the actual situation, like whether a car is coming, or whether he has the walk signal.)
Posted 19 Feb 2008 at 12:06pm #
Arentol,
Your comments put you solidly in the Bayesian school of probability. (Where I am as well)
Dave
Posted 29 Feb 2008 at 3:51pm #
Hello,
I stumbled on this thread because, much like Marty, I got into an argument about the probabilities behind this wonderful game we call Deal or No Deal. During their "million dollar mission" they decided to add million dollar cases to the mix.
According to some, it seems that this would not change the odds once you get down to having, for example, a $200 case and a $1 million case at the end - it's 50/50.
What is disturbing me is the idea that if there are 13 $1 million cases, it seems much more likely that I chose one of the $1 million cases than the $200 case. If you get down to the "one or the other" situation, does the probability migrate back to 50/50, or can I stick to that logic and say that the chances are I chose a million?
Posted 01 Mar 2008 at 7:53am #
In the "13 $1 million cases" game, if you get down to two cases, one with a million and one with a small amount, the odds are exactly 50/50.
Yes, you are 13 times more likely to have chosen a million dollar case in the beginning. But... if you chose a losing case, you're also 13 times more likely to have a million dollar case remaining on the board when you reach the final two cases.
On a side note, it's been interesting/funny/sad seeing the reactions of my friends and family when we've had one of the multiple-case games on TV. I guess they've been trained to have the "oh wow that's terrible, what bad luck" reaction whenever someone opens the million dollar case. So when the contestant opens two or three of them per round, they're amazed at how bad the contestant's luck is... even though it's perfectly normal to open that many, because there are just so many on the board.
Side note #2: I can't believe no one has won the million yet. With so many multiple-case games played, the odds were extremely good that someone would end up with two winning cases as their final two.
With 13 million-dollar cases, there's only a 76% (that is, 1 - 13/26 * 12/25) chance that the final two cases are not both millions. IIRC there were 3 games with 13 cases. So to do some (approximate) math...
.76 * .76 * .76 * .797 * .831 * .862 * .89 * .914 * .935 * .954 * .97 * .982 * .991 * .997 = .171
That's only a 17.1% chance of no one winning the game during that million dollar mission. NBC got lucky.
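For what it's worth, that product falls out of a couple of lines of Python, assuming the mission ran games with 13, 13, 13, then 12 down through 2 million-dollar cases (which is what the factors above imply):

    from math import prod

    # P(the final two cases are not BOTH millions) in a game with k
    # million cases out of 26 is 1 - k*(k-1)/(26*25). Chain the games.
    ks = [13, 13, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2]
    print(prod(1 - k * (k - 1) / (26 * 25) for k in ks))   # ~0.171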
Posted 03 Mar 2008 at 10:24pm #
Engineers are good at applying math, but those of us pure to math are better at the theory behind it.
The truth behind Deal or No Deal matches what you would intuitively expect: you have picked a case and played the game straight through, and at the end there is a case with $.01 and one with $1,000,000. You in essence have a 50/50 chance of having $1,000,000 in your case.
The Monty Hall example does not apply here, and this is why. In a statistical theory class I took in college we studied the Monty Hall problem, wherein you initially have 3 doors to choose from. After choosing a door, one of the bad prizes is revealed. If you switch, you are 2/3 likely to win the one good prize, and only 1/3 likely to win if you stay with your initial choice.
This does not intuitively make sense, because if you start with 3 doors and choose one, your chance of winning the good prize is 1/3. After one of the bad prizes is revealed, most people assume that they have a 50/50 chance of winning the good prize. This, however, is not true. In the Deal or No Deal example, though, the 50/50 rule is true. The difference is that in Monty Hall the producers backstage intentionally reveal a non-randomly chosen door with a bad prize, while the producers at Deal or No Deal have distributed the cases at random and have no way of introducing the non-random element that gives rise to the Monty Hall effect in the first place.
Posted 03 Mar 2008 at 11:23pm #
C.R.Wentworth: Engineers are good at applying math, but those of us pure to math are better at the theory behind it.
Dave: Although apparently those in pure math do not excel at portraying themselves as modest.
😉
But, in any case, I agree with the main point of the post.
Posted 02 Apr 2008 at 12:49am #
Let's say there are two cases left, and the two remaining amounts are $1 million and $1. Some of you think the odds of "your" case containing $1 million are 1/26 because that's what they were at the beginning of the game. What makes the $1 million any different than the $1? Weren't the odds of picking the $1 case also 1/26 at the beginning? And by your logic, they would still be 1/26. So that's a 2/26 chance that "your" case holds either $1 or $1 million. Where is the other 24/26?
Posted 08 Apr 2008 at 3:11am #
The online version makes no sense. I just played it and it came down to two cases, the $750,000 and the $1,000,000 (I was pretty lucky!), and the offer was $545,000 (and some change). Why would it make an offer lower than either of the two remaining cases? The online version has a screwed-up formula!
Posted 08 Apr 2008 at 10:48am #
Yeah, I don't like the new on-line version. The old one was better. I have the details here - http://www.davegentile.com/stuff/Deal_or_no_deal.html
Posted 30 Apr 2008 at 12:54am #
So forgetting the million dollars, what is the best strategy?
For example, this is how I would play:
Keep going until you lose your safety net OR until the offer breaks 100K
Posted 01 May 2008 at 7:19am #
The total of all the cases is $3,418,416.01, with the average case being worth $131,477.54. This is more money than the average contestant wins. So overall, the best strategy is to keep going until you open every case. (Sometimes there's an exception: once in a while the banker will make a profitable offer - $270,000 when you're down to $1,000 and $500,000, for instance.)
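Those figures are easy to verify, assuming the standard US board:

    AMOUNTS = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
               1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
               300000, 400000, 500000, 750000, 1000000]
    print(sum(AMOUNTS))                         # 3418416.01
    print(sum(AMOUNTS) / len(AMOUNTS))          # ~131477.54
    print(sum(a >= 100000 for a in AMOUNTS))    # 7 cases pay $100k or more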
But if you use this strategy, you will win $100,000 or more only 7/26 of the time.
There really is no "best" strategy. It all depends on how much the money really means to you, and how risk tolerant you are (are you really prepared to take a risk that means you might win only $10?)
For a billionaire, the best strategy is to open every case. He can afford to, the money means nothing to him. For a person with a low income and debts to pay off, it could be best for him to take an early offer of $75,000. That money could be worth over a decade of savings for him, he can't afford to risk it.
Overall, one of the best ways for the average person to play is to keep going until there are about 6 or 7 cases left, and then take the deal. This strategy averages about $100,000 in winnings, and has only a very small chance of going home with anything less than $50,000.
The thing that really throws your plans off is the banker. His offers seem so random at times. Sometimes he'll give a very fair offer, sometimes he'll offer only half of what the average case is worth. I think it might have something to do with whether you're on a winning streak or a losing streak, but it's really hard to tell.
Posted 02 May 2008 at 2:59pm #
I agree there is no one best strategy. It depends on your risk tolerance and/or utility function, which has a lot to do with your net worth. It also depends on what formula the banker uses. We don't know that for the TV version.
For the old on-line version there was a good answer to this question, however. In most situations, the optimal stopping point was 3 cases for a wide range of risk tolerances. The reason for this is that as a percentage of the expected value, the offers kept getting better until you got down to 3 cases; the 2-case offer was worse. There was no point in going to 2 cases unless you were committed to going all the way. Also, if you contemplated stopping at 5 cases, you would have to take into account the fact that the offers would, on average, go up the next two times. Stopping at 5 meant giving up your potential offer on 3 cases.
When I would play, the only time I felt I could not pull the trigger was when there were all small amounts and one of either $1,000,000 or $750,000 remaining. If there were no amounts that large, or two remaining, I would go to 3 cases. In the TV version my sense is that without a safety net I would not go beyond 4 cases, and even that would be tough in some situations.
Dave
Posted 02 May 2008 at 3:20pm #
I guess how "fair" the offer is woldn't really mater to me. I'd rather get a "crapy" 100K offer than a wonderful 10K offer. Who cares if it's above or below the mean if you're getting a lot of money anyways. If you have a 100K crappy offer where the average is 200k, I'm not going to feel very good about picking a couple high cases and my next offer being 10k even though the average is 7k.
Since I have no statistician math skills, I was thinking about writing a computer program to simulate a million games with various contestant "rules" to follow to see what is the best average outcome.
The game definitely tends to follow a pattern of people making more money for a while, until they start losing it.
However, if your first 5 picks are all very large, you're probably screwed off the bat.
Posted 02 May 2008 at 5:17pm #
Honda's point about "fair" is that if you get a less than fair offer then, in the long run, over many games, you would be better off not accepting it and instead playing to the end. If the offer is more than fair, then playing to the end is just plain stupid. A fair offer = the expectation value of the remaining cases, i.e. what you would earn from them in the long run, over many games.
There are a couple of problems with setting up the computer program -
1) We don't have an exact model for the banker’s behavior.
2) By what criteria do we judge the result? The highest average payout is almost always achieved by playing to the last case, for example. A better criterion might be to pick a strategy that maximizes the median player's returns. My guess would be that happens somewhere around 4 cases.
Dave
Posted 02 May 2008 at 9:19pm #
To me, the path of least risk is the way to go. If there were only one large amount on the board I'd take the offer, even if there were several small amounts left. Even though statistically it may be smart to choose another case or two, I don't want to risk being in the minority who chooses the last large case and ends up with 1K.
I don't care how bad the offer is; I would not risk 60 thousand dollars on a 1 in 10 chance of losing almost all of it for the likely benefit of doubling my money.
Posted 02 May 2008 at 9:22pm #
I guess my point is, it is only smart to play the odds when you get more than one chance.
Playing for the statistical amount may give the most players the most payout, but when it's me playing, I don't care about how much money DOND loses... I care about how much I stand to lose.
Posted 03 May 2008 at 3:34am #
I don't remember enough programming to write a simulation, but here's one way to do it: just find out, with a certain number of cases left, what the odds are of having an average of $100,000 or more.
I just tried doing the math by hand, and trust me on this, it gets WAYYY too complicated once you get past 3 cases remaining. Off to look up some programming info...
lol, I just spent a silly amount of time relearning how to use Visual Basic. I'm kind of a programming scrub, so it took me forever to get things right; I kinda viewed it as a challenge after a while.
11 cases remaining: 3000 attempts, 1000 times the average was under 100k (33.3%).
8 cases: 3000 attempts, 1117 times under 100k (37.2%).
6 cases: 3000 attempts, 1318 times under 100k (43.9%).
5 cases: 3000 attempts, 1337 times under 100k (44.6%).
4 cases: 3000 attempts, 1397 times under 100k (46.6%).
3 cases: 3000 attempts, 1523 times under 100k (50.8%).
2 cases: 3000 attempts, 1753 times under 100k (58.4%).
I was a little surprised that the chance of having less than a 100k average is so high. I also see more reason to take the deal earlier in the game. Sure, as you continue, the banker may raise his offer from 75% of the average case value to 80%, but you're also increasing your chances of "failure" almost as quickly.
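For anyone who'd rather not relearn Visual Basic, here is a rough Python equivalent of that experiment - my reconstruction of what the program appears to do, with a random subset of the board standing in for the unopened cases:

    import random

    AMOUNTS = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
               1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
               300000, 400000, 500000, 750000, 1000000]

    def pct_under(cases_left, trials=100_000, target=100_000):
        # Fraction of random boards whose average falls below the target.
        under = 0
        for _ in range(trials):
            remaining = random.sample(AMOUNTS, cases_left)
            under += sum(remaining) / cases_left < target
        return under / trials

    for n in (11, 8, 6, 5, 4, 3, 2):
        print(n, pct_under(n))      # tracks the percentages listed above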
I was just thinking though... while most players hope to stay over that 100k case average, I wouldn't really consider it a failure if your average was 94k (and you took an offer of maybe 82k). That's still a lot of money. I'm going to run the numbers one more time, this time looking for the chances of a variety of outcomes. I'm going to categorize them as Disaster (under a 30k average), Disappointment (30k-80k), Typical (80k-180k), Big Win (over 180k).
11 cases: Disaster 3.2%, Disappointment 18.6%, Typical 57.5%, Big Win 20.7%.
8 cases: 9%, 21.1%, 44.4%, 25.5%.
6 cases: 16.5%, 18.8%, 36%, 28.7%.
5 cases: 21.2%, 15.3%, 32.7%, 30.8%.
4 cases: 29.6%, 13.6%, 24.1%, 32.7%.
3 cases: 35.7%, 13.7%, 22.4%, 28.2%.
2 cases: 46.5%, 11.7%, 12.3%, 29.5%.
Interesting how the Big Win percentage hits a peak at 5 cases, but really doesn't change much at all. Meanwhile the Disaster chances are skyrocketing. Look at the difference between 8 and 6 cases, your chances of disaster almost double.
Looking at those numbers, I'd say the time to stop is at 8 cases, or possibly even 11. But this ignores the biggest factor in the game... the banker. You have to get down to 8 cases to see even a semi-reasonable offer, and he ups it considerably at 6 and 5.
So in conclusion... my first hunch was right. 6 cases is really the sweet spot, and 8 cases isn't bad if you want to play it safe. Additionally, I'm a huge nerd who needs to find better things to do. wheeeeeeee
Posted 03 May 2008 at 3:41am #
[quote comment="47474"]
The game definitely tends to follow a pattern of people making more money for a while, until they start losing it.[/quote]
This happens somewhat often, but not all of the time. And it definitely sticks in your memory more when someone builds their way up to 200k and then loses it all. But there are also plenty of games where the player gets off to a terrible start, and then plows his way through the field of low amounts to work his way back up to a decent bank offer.
Posted 03 May 2008 at 11:16am #
Honda,
It's good to stay fresh with at least one programming language. If I were more current in VBA I could do it in Excel. As it is I'd have to use SAS, which I may do sometime this week.
But without taking into account the banker's progressively better offers, the ideal stopping point is 20 cases (the first offer). There is no promise of your expected value going up, and your risk does go up, so you should stop right away if that were the game.
Dave
Posted 03 May 2008 at 4:50pm #
What about simulating a little bit of decision making too, where the contestant keeps playing until he is down to 1 large number? So instead of finding the average for choosing down to 6 cases, what's the average for stopping when you only have 1 (or maybe 2) cases at 300k and above?
Hopefully you don't wipe them all out in the first two sets of picks.
Posted 03 May 2008 at 11:47pm #
I would guess that the average payout would go down significantly; however, the disaster scenario would also go way down.
Posted 04 May 2008 at 3:12am #
I just gave it a try, but for some reason the random number generator isn't cooperating. Everything works perfectly if I tell it "keep going until there are 8 cases left". But if I tell it "keep going until there's only one huge amount left", it opens the cases in the same order every single time. I really don't understand it.
Here's another way to do it though - figure out, on average, how many cases will be remaining at the time you get down to one large amount. Unless I'm mistaken, it should be 26/5, or 5.2.
Then keep in mind that your single large amount could be anything 300k or above. On average, that amount will be worth $590,000.
Adding in another few thousand for whatever your remaining small cases might be worth, this strategy will average you a win of $115,000. Not bad at all, only a slight decrease from the "regular" average of $131,000.
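A quick check of that arithmetic (the five $300k-and-up amounts come from the standard U.S. board; the 5.2 figure is the estimate above):

    big = [300000, 400000, 500000, 750000, 1000000]  # the amounts $300k and up
    avg_big = sum(big) / len(big)  # $590,000: the average "single large amount"
    cases_left = 26 / 5            # 5.2 cases remaining, per the estimate above
    print(avg_big / cases_left)    # ~113461: the win before counting the small cases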
I just wish I could get my program to work, so I could find out just how many bad outcomes the strategy helps to avoid. My guess would be "not that many". You might avoid most of the disasters, but there could be quite a few disappointments.
Also, keep in mind that this strategy can add a lot of risk. If you get down to the million, the 750, and the penny and you have an offer of $550,000... well you still have more than 1 large amount in play, so it keeps on going.
Posted 04 May 2008 at 11:26pm #
That last scenario would require more "logic".
I would say if losing any one case can result in the next best offer being $50,000 less than your current offer, settle.
It comes down to how much you are willing to lose on a roll of the dice.
Would I risk $60K I didn't have for a chance to strike it big, with a 1-in-6 chance I'll lose it? Not me.
Posted 05 May 2008 at 9:53am #
Edmond,
Let's take your strategy of "stop as soon as you have only one case left on the board that's $300k or greater". To know if this is a good strategy, you need to know the banker's formula for generating offers (which is how this whole thread started). The offer will vary based on whether this happens early or late in the game. For lack of anything better, I'll use the random windows listed at the bottom of http://www.davegentile.com/stuff/Deal_or_no_deal.html.
I modified my own program to implement your strategy. As you say, it's a bit dangerous, because if you enter an early round with only two huge amounts left, you could actually knock them both out in the same round and end in disaster. Interestingly, there's about a one in 60,000 chance that your first five picks will be the five highest amounts. Not likely, but possible!
Anyway, let's see how we did. I ran 100,000 games.
On average, using this strategy, you'll take the offer after 4 rounds (18 cases picked). The mean of all offers is a mediocre $65k, and the median offer is only about $30k.
Of course, the maximum win is still $1m (that's if you picked the $1m and got to the end with only $1m and one other huge number on the board). But the worst offer I got was a dismal $7. (That was after round 5, with the remaining cases: 10c, $1, $5, $25, $50, $100. Ouch.)
Anyway, I decided to vary the "huge amount" limit and see how things changed. In the table below, the strategy is "stop as soon as possible after removing the second-to-last amount on the board of amount X or greater":
$50k: play 7 rounds, mean offer $88k, median offer $32k
$75k: play 6 rounds, mean offer $83k, median offer $32k
$100k: play 6 rounds, mean offer $78k, median offer $31k
$200k: play 5 rounds, mean offer $71k, median offer $30k
$300k: play 4 rounds, mean offer $64k, median offer $29k
$400k: play 4 rounds, mean offer $57k, median offer $29k
$500k: play 3 rounds, mean offer $49k, median offer $30k
From the results, IMHO this "play until you eliminate the safety net" strategy isn't a very good one. 🙂
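For anyone who wants to tinker, here is roughly how a simulation like this can be structured in Python. The U.S. board and round schedule are assumed, and since the real banker formula is unknown (the whole point of this thread), offer() below is only a placeholder ramp; the formulas from the davegentile.com page would slot in there instead:

    import random
    from statistics import mean, median

    AMOUNTS = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750, 1000,
               5000, 10000, 25000, 50000, 75000, 100000, 200000, 300000,
               400000, 500000, 750000, 1000000]
    ROUNDS = [6, 5, 4, 3, 2, 1, 1, 1, 1]  # cases opened per round on the U.S. show

    def offer(in_play, round_no):
        # Placeholder banker: a fraction of the mean that ramps up by round.
        return mean(in_play) * min(0.15 + 0.10 * round_no, 0.95)

    def play(limit, rng=random):
        """Take the first offer made once <= 1 amount >= limit is still in play."""
        stage = AMOUNTS[:]
        held = stage.pop(rng.randrange(len(stage)))  # your case stays unopened
        for round_no, picks in enumerate(ROUNDS, start=1):
            for _ in range(picks):
                stage.pop(rng.randrange(len(stage)))
            in_play = stage + [held]                 # amounts not yet revealed
            if sum(a >= limit for a in in_play) <= 1:
                return offer(in_play, round_no)
        return held                                  # never stopped: keep your case

    results = [play(300000) for _ in range(100000)]
    print(round(mean(results)), round(median(results)))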
Posted 05 May 2008 at 11:16am #
Ron, the second set of formulas on that page doesn't give very good payouts. Try the first set of formulas. They are closer to the real game, I believe. In fact, I think at one point in the past the real game may have been exactly that, but I'm not sure. Plus you'll be in good company - I know for sure of one PhD thesis using that set of numbers, and one master's-level finance class.
I might give it a try myself, if I have time soon.
Dave
Posted 05 May 2008 at 11:20am #
Ron, the problem with looking at median offers is that it only makes sense (to me) if you are going to be able to play the game several times. You may be better off (after several games) playing one strategy, but when you're the poor SOB on the crappy end of the curve, that is not much consolation, since there are no do-overs or second chances.
Even if a disaster is not likely, I'm going to stop as soon as possible, regardless of the banker's offer. I would be happy with 50K. I'd much rather miss out on the 300K offer than risk walking out with 500.
Posted 05 May 2008 at 11:24am #
I guess what I would like to see is a 3D chart that shows the chance of a disaster vs. average payout vs. rules of play.
One 3D chart per rule of play (stopping at 2 large cases or stopping at 1 large case).
Obviously I would use the technique that minimizes risk (with an acceptable payout) rather than the one that maximizes the potential offer.
Posted 05 May 2008 at 11:53am #
Here's another way to think about what kind of information I would want to know before I played:
What strategy gives me the highest AVERAGE payout with no more than a 1/50 chance of leaving with less than 50K?
Posted 05 May 2008 at 12:18pm #
Dave, ok I re-ran with the first set of numbers on that page. Here's what you get. Once again, the strategy is "stop as soon as possible after removing the second-to-last amount on the board of amount X or greater":
$50k: play 7 rounds, mean offer $106k, median offer $46k
$75k: play 6 rounds, mean offer $100k, median offer $46k
$100k: play 6 rounds, mean offer $96k, median offer $46k
$200k: play 5 rounds, mean offer $91k, median offer $46k
$300k: play 5 rounds, mean offer $84k, median offer $43k
$400k: play 4 rounds, mean offer $73k, median offer $41k
$500k: play 3 rounds, mean offer $61k, median offer $36k
Posted 05 May 2008 at 5:49pm #
Ron, my strategy was:
Play until you get to 4 cases. Then, if the situation was either $1,000,000 plus all other things under $100,000, or $750,000 with all other things under $100,000, I stopped. Otherwise I opened one more case and stopped with 3 left. I would only go beyond that if all 3 remaining cases were $100,000+, in which case I would go all the way - but this last bit is unimportant, since it never happens.
Dave
Posted 14 May 2008 at 10:47pm #
You take the number of cases left and put one over that number to give you a percentage.
Example: 5 cases left, so 1/5 = 20%.
Take that percentage and subtract 3% from it.
Example: 20% - 3% = 17%.
Take that percentage and multiply it by the total amount of money on the board.
Example: the amounts left are $10,000, $200,000, $300,000, $400,000, and $1,000,000.
Total = $1,910,000 x 0.17 = banker's offer.
In this case the offer should be $324,700.
Try it and see if it works, and get back to the forum to let me know!
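Here is that rule transcribed directly into Python (whether it matches the real show is exactly what this thread is trying to figure out; the function name is just illustrative):

    def rule_of_thumb_offer(amounts_left):
        """The rule above: (1/n - 0.03) times the total money still on the board."""
        n = len(amounts_left)
        return (1 / n - 0.03) * sum(amounts_left)

    print(rule_of_thumb_offer([10000, 200000, 300000, 400000, 1000000]))  # 324700.0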
Posted 09 Nov 2008 at 4:14am #
Brothers, (if you're still on here)
I believe you two are talking apples and oranges and bananas. Deal Or No Deal is really three (3) probability games rolled into one.
In the first game you have a 1/26 chance of "saving" out the 1 million dollar case. End of probability and end of first game. You just don't know what's inside yet because you have two more totally different games to play. This first game is what your engineer brother is focused on.
The second is a dynamic "revealing" game starting with a 1/26 chance of the 1 million case being selected as the next "loser" case. Or, another way of saying it, you have a 1/26 chance of holding the 1 million case. And yet another way of saying it, you have a 1/26 chance of keeping the 1 million dollar case in the game. Over time you could end the game with a 1/4 chance or even a 1/2 chance of the million being in your saved case, if the million is still in play. This is the game you are focused on.
The third game is between you and the banker for the million. Of course the 1 million dollar case is physically part of only the first game or only the second game, as either a "saved" case or a "revealed" case, and will have a 50:50 or 1/2 chance (after the first case is chosen until the million is revealed) of being a "loser" or "winning" case.
Fruit basket aside, this is more like an exciting shell game than a horse race.
Posted 13 Nov 2008 at 4:48pm #
The first and second 'game' make sense. The third 'game' does not. No matter what sort of probability you are talking about, it is not true that there is a 50/50 chance your case is a winner (at least not before you get down to two cases). Just because we can enumerate two possibilities does not mean those two possibilities have equal probability.
Posted 04 Dec 2008 at 5:15pm #
I just watched someone go home with $26,000 after having two cases left: $10 and $100,000 -- how could that deal have been so low?? I have watched other games where the offer is about half, or a little less, of the highest remaining amount when there are only two left. Maybe it's because it was the daytime game?
Posted 01 Jan 2009 at 11:38pm #
I suggest everyone read about the Monty Hall problem before making assumptions.
Posted 02 Jan 2009 at 12:01pm #
Has anyone compared offers made to different contestants? I'm wondering how much of a difference there is in the offers made to different contestants based on personality, gender, or race. It seems to have been established that there isn't a mathematical formula that matches the actual show all the time, but has anyone analyzed this?
Posted 03 Jan 2009 at 3:04am #
rp: Interesting idea. We all hope the world is free of that, but it's not, so that would be an interesting study. Personally, I think personality and financial need definitely seem to affect the game sometimes, but not always. Let us know if you find something.
I'm not sure the rest of you guys are totally grasping what's going on here.
To Justin: While the Monty Hall Paradox is very interesting, it does not relate to Deal or No Deal. There's been a lot of debate on this page, but you can trust me on this one.
Matt: That's very interesting. It has long been established that there doesn't seem to be a set function to determine the offers, but that offer is definitely puzzling. There must be some external factors they consider besides which cases are remaining.
Rick: I agree with Dave. I'm not sure I follow your logic on the third game. The game is really simpler than most people make it (the offers make it seem more complex). You have a 1/26 chance of choosing the million dollar case in the beginning. You also have a 1/26 chance that the million is still being held by a lovely lady after opening 24 cases.
Posted 03 Jan 2009 at 3:39am #
Who cares what the odds of getting 1 million dollars are? Only an idiot would go to the last case.
The game is about figuring out which strategy averages you the most take-home money with the least amount of risk.
I believe through simulations, using the strategies thought of so far, your goal should be somewhere from 100K to 200K, then quit.
Posted 04 Jan 2009 at 1:30am #
Unfortunately there is no perfect strategy. Everyone starts with nothing, so why would you call them an idiot if the last two cases were 1 million and 500,000? I'd say knowing the odds at that point would be pretty important. The reason it has received so much attention is because there was a long debate about whether it was similar to the Monty Hall "Let's Make a Deal" game. You're right that the focus was originally on the strategy, but it's difficult when there are so many variables to consider.
Simulations have proven to be somewhat pointless, since there doesn't seem to be a concrete algorithm and whatever system is in place seems to change week to week. Simulations need to have some constants and there aren't many. I think a very rough strategy could help, but the riskiness is what makes it fun. In the end common sense should suffice (although not many people on the show seem to have that).
Posted 14 Oct 2009 at 9:12pm #
[quote comment="17841"]Marty, your brother is correct. See The Monty Hall Problem for reference.[/quote]
Actually that proves his brother is wrong. The Monty Hall problem assumes you learn something from taking away options. Well, that's apparent in Deal or No Deal, as they tell you what is taken away.
So initially you guess a box with a 1/26 chance. Then once all but 2 boxes are eliminated, the odds that you have the $1M are 1/2. It's not 1/26 STILL; that's silly.
There is really no reason to switch boxes if you know one has the $1M. Realize that in the Monty Hall game, Monty is GIVING you information. He's removing the boxes. In Deal or No Deal, you just come to have the $1M by dumb luck (either you picked it initially or you randomly picked away at the other boxes until the $1M was the only one left to open). Either way the odds are even; neither is more probable.
As for the algorithm, I am guessing it's an interesting decision algorithm based on human interaction. It changes by round, as suggested before. But it definitely does not use averages at all. I'm guessing it groups numbers into blocks. So if you have 1, 10, 25, 1000 and 500,000, it may group 1, 10, 25, then 1000, then 500,000, and throw up an offer somewhere in between.
The key to winning Deal or No Deal is understanding the algorithm, because the odds are against anyone going all the way to the end and having the $1M in their case. So you need to know what the offers will be under any scenario and then decide whether it's smart to proceed.
The best way to figure it out is just watch the show and write down what happens under each scenario to get a pattern.
Posted 15 Oct 2009 at 10:38am #
Time to continue this thread from years ago?
:o)
Not much to say. You are correct. Deal or no deal is not Monty Hall because Monty only opens duds. He won't open the one with the prize, but on deal or no deal you open at random.
Somewhere in the thread above, however, we did discuss a very technical issue. In Bayesian statistics you could say the probability at the end is 1/2. But in classical statistics, you can't do that, since a probability must be associated with a random variable. To say what we want to say in that language, you have to say that "the conditional probability of winning the prize when there are two cases left, conditional on having opened 24 cases with less than 1,000,000, is 1/2." "Probability" (without the "conditional") in classical statistics could only be talking about the initial pick.
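That conditional probability is a two-line check, assuming uniformly random picks throughout:

    from fractions import Fraction

    p_joint = Fraction(1, 26)  # your case holds the $1M (so no open can ever reveal it)
    p_given = Fraction(2, 26)  # 24 random opens missed the $1M: it's in one of the 2 unopened cases
    print(p_joint / p_given)   # 1/2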
Posted 14 Oct 2009 at 10:02pm #
Here's proof that the odds are even at the end if one of the last two boxes has the $1M.
Initially you guess a box, you have a 1/26 chance of guessing right at the beginning.
Now what are the odds that you were lucky enough to pare away every box except the $1M and that you did not pick the $1M to begin with?
prob that you pared away = prob you did not choose it at the beginning X the prob that you guessed all but the $1M
The prob you did not choose it at the beginning is simply 25/26.
the prob that you guessed all but the $1M is 1/25. It's the same as guessing it.
So the prob that you pared away all the boxes = 25/26 X 1/25 = 1/26.
So the odds are even that you guessed it in the beginning or you guessed wrong but pared away the remaining boxes.
So there would be no reason to switch boxes.
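The arithmetic checks out with exact fractions (a quick Python check, assuming the 26-case setup):

    from fractions import Fraction

    p_picked_it = Fraction(1, 26)                      # you held the $1M from the start
    p_pared_away = Fraction(25, 26) * Fraction(1, 25)  # didn't hold it, but left it for last
    print(p_picked_it == p_pared_away)                 # True: 1/26 each, so 50/50 at the end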
Posted 15 Oct 2009 at 11:22am #
[quote comment="55772"]Time to continue this thread from years ago?
:o)
Not much to say. You are correct. Deal or no deal is not Monty Hall because Monty only opens duds. He won't open the one with the prize, but on deal or no deal you open at random.
Somewhere in the thread above, however, we did discuss a very technical issue. In Bayesian statistics you could say the probability at the end is 1/2. But in classical statistics, you can't do that, since a probability must be associated with a random variable. To say what we want to say in that language, you have to say that "The conditional probability of winning the prize when there are two cases left, conditional on having opened 24 cases with less than 1,000,000 is 1/2. "Probability" (without the "conditional" in classical statistics could only be talking about the initial pick.[/quote]
Yes, I came across this thread trying to figure out the DoND algorithm. I've pretty much come up with my own algorithm on how to play DoND now based on what I've seen. It's really not that complicated of a game.
And yeah, by Monty "helping you out" you gain information. So he helps your odds. In DoND, no new information is given to you.
And finally, yes, technically speaking P(A|B), i.e. Bayes' Theorem, is how you determine the final probability. I think this makes the most sense to the average person. Considering that you have 2 options and both had a 1/26 chance of being the $1M... that's not very intuitive. But if you came in at the end of the show with the volume off, then yes, that would make perfect sense. 😉
Posted 15 Oct 2009 at 1:47pm #
The odds are most important if you get to play more than one game. The odds are important to the house, but to the player the risk is more important. Even if the odds are 10:1 that you will pick the 1MM case, if there is a one-in-ten chance you'll lose everything, are you going to put it all on the line? If 0.01 and 1MM were the only cases left on the board, would anyone in their right mind "go for it?"
Once I hit 50K I'd settle. I would never in real life withdraw 50 thousand dollars from my savings and put it on the table, even if the odds were "favorable," and once you have the 50K in the game, you are doing just that.
I love how people become high rollers just because the money wasn't theirs to start with. Once it's on the board, it is yours!
Posted 15 Oct 2009 at 2:39pm #
Well you probably wouldn't take $5000 out either and throw it on the table....but would you settle for an initial offer of $5000?
The key to the game is to get the bankers offers as high as you can, not to try and end up with a particular box.
If you had $500,000, $75,000 and $5 on the board, would you settle for a $50,000 offer? I wouldn't. Mainly because the algorithm would increase the next offer NO MATTER which box you chose.
It's one thing to not take a generous offer; it's another to not take advantage of the game and make more money. I remember this one family passed up on something like $350,000 only to win $500 or some low amount.
There is a bit of misdirection in how the game plays. When you see 4 high numbers on the board and 1 small number, you assume the odds are in your favor. The only problem is, you are REMOVING boxes, not picking among the boxes. So if you continue, you have an 80% chance of losing one of your high numbers. Go again, and it's 75% that you'll lose a big number.
The only reason to move forward is to increase your offer if the offers are low.
That's why, if I were offered $50,000 with $500,000 and 4 low numbers on the board, I'm going to pass on the offer. Even though the odds seem against me, they are in my favor to move ahead.
Why? Because there's an 80% chance I will eliminate one of the small numbers, and probably move into the $100,000s range for an offer.
If I'm gutsy, I would go again with a 75% chance of removing another small number and hence increasing my offer maybe into the $200,000s. At which point I would have to quit, as the odds shrink to 66% of eliminating another low number.
Really the best scenario is to have at least 2 large amounts. Then you can keep pushing it 1 box at a time as the offers increase. And once one of the large amounts gets eliminated, then take the offer.
It's all about going as long as you can to hit the max offer.
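Those percentages chain together, which is worth seeing explicitly. Here is a small Python check of the chance that n picks in a row all avoid a single big amount when k small amounts are also in play (by symmetry, it makes no difference whether one of the in-play amounts is your own held case):

    def survive(k, n):
        """Probability that n random picks each remove a small amount."""
        p = 1.0
        for i in range(n):
            p *= (k - i) / (k + 1 - i)  # each pick must hit one of the remaining smalls
        return p

    print(survive(4, 1))  # 0.8 -- one safe pick, the 80% case above
    print(survive(4, 2))  # 0.6 -- two safe picks in a row: 4/5 x 3/4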
Posted 15 Oct 2009 at 3:58pm #
I understand what you're saying, but to me 10:1 odds in your favor aren't that good when you're the one guy who loses everything. Thus, I'd play to a modest 50K unless there was virtually no way to lose money. I don't want the highest statistical payout, I want the most confident payout. Remember, you only get to play one time. Odds are great at the poker table, but not so much comfort to me when there could be a year's salary at stake and no round 2.
Posted 15 Oct 2009 at 4:39pm #
You can't really shoot for a number like $50K, because you have to deal with what is on the board. $50K is actually not as easy as you make it sound.
The problem with your concept of the game is A) what does "confident" mean? and B) that you can throw out a target number and play for that number.
If you pick badly, you may never get an offer over $20K. And why would you walk away from the game with $50K when it's obvious you could get much more? Or a high probability of at least a little more?
The game is totally random, so there really is no "confidence". It's not like you are guaranteed $50K if you leave early.
Like I said above, there are situations where even if you get offered over $50K, there is a 100% chance you will get a higher offer if you choose one more box.
I realize you want to be "safe". And even a free $5,000 is nothing to sneeze at. But you really have to play the hand you're given. And in some instances you are very safe in moving forward even if it seems greedy.
Would you stop at $50K if you had $25K, $100K and $1,000,000 on the board?
If you play the game well, then you won't make stupid decisions. Really the only time you are vulnerable is when you are forced to choose more than 1 case. As long as you have at least 2 large amounts when you get to eliminating 1 case at a time, you should press it until only 1 large amount stands. There really isn't any risk in doing that.
Posted 15 Oct 2009 at 4:58pm #
A few comments.
There definitely is a psychological factor that makes people treat $$ they might win differently than $$ they already have. (See the work of Kahneman and Tversky.) This is not "rational" in the way they define it.
Another thing to consider is that money is more valuable if you don't have much. The marginal utility of money decreases as you have more. So optimal behavior for a poor family and a rich family may differ.
If I were on the show I would keep playing as long as there are 2 large amounts on the board. After that... it depends... but I wouldn't push it very hard. Certainly if I had $100K+ and only one big amount left, I would stop.
Posted 15 Oct 2009 at 5:07pm #
Dave,
No doubt. IMHO, people don't always play very well. I gave the example of the family passing on $300K+ because it came so easily to them up to that point, and then they ended up with less than $1000.
If you don't know what you're doing, then by all means quit when you are offered a lot of money.
Even though the game is simple, it's not as intuitive as people think it is. Again, you're removing cases and not choosing from cases. That makes the logic of probabilities a bit confusing.
You can tell many don't understand the probability involved based on how they choose.
Posted 29 Oct 2009 at 3:10pm #
Woo, the zombie thread has arisen, just in time for Halloween!
[quote comment="55777"]Well you probably wouldn't take $5000 out either and throw it on the table[/quote]
Actually I'm quite certain he would. He would be risking $5000 for a 99.99% chance of getting more money than he started with.
[quote comment="55777"]Would you stop at $50K if you had $25K, $100K and $1000000 on the board? [/quote]
This situation can never happen; he would have had a $50k offer earlier if that much money were still on the board.
[quote comment="55777"]The problem with your concept of the game is A) what does "confident" mean? and B) that you can throw out a target number and play for that number.[/quote]
You can't precisely determine an exact number to shoot for, but you can put an estimate out there. "If I get to around $50,000, I'm going to stop." Then if you get an offer of $48,000, you may decide that's close enough and just take the deal.
"Confident" means that there is a high likelihood of achieving that goal. In the vast majority of DOND games, at one point there is an offer of over $50,000. It's probably 95% of the games, at least.
He would be choosing to trade a high-risk, high-reward situation for an almost-sure thing of $50,000. If you hate to gamble, and you place an extremely high value on not ending up with a small amount, this is a great strategy.
When making decisions like you do on this game show, you can't just look at the dollar amount. You have to assign a personal value to each potential result in order to make the decisions that are best for you.
For many people, winning $500,000 and winning $1,000,000 hold almost the same personal value. You have nearly the same level of excitement, the same level of "now I can buy whatever I wanted"... but those are vastly different dollar amounts.
On the other hand, there is a vast personal value difference between winning $20,000 and winning $1. Even though the dollar value difference is 1/25 of the previous example, one result is a huge disappointment and one is an exciting win.
If you have a $50,000 debt that's been hanging over your head for years that you'll do anything to get rid of, you'll place a high value on ending the game without the risk of going below that amount.
If you want to buy a new house but you'll need $100,000 minimum to make it happen, an $89,000 offer has a much lower personal value to you than it would to other people.
Posted 29 Oct 2009 at 7:02pm #
[quote]Actually I'm quite certain he would. He would be risking $5000 for a 99.99% chance of getting more money than he started with.[/quote]
Well, I think 99.99% is a bit of an exaggeration. Just based on the strict odds of each slot, 15 of the 26 dollar values are at or below $5000. So if you played all the way through to the end, you'd have about a 58% chance of getting $5000 or LESS.
Of course it's impossible to determine the odds without understanding what the AI algorithm is that determines the bank's offers.
[quote]"Confident" means that there is a high likelihood of achieving that goal. In the vast majority of DOND games, at one point there is an offer of over $50,000. It's probably 95% of the games, at least.[/quote]
Well, if you had stats on all the DOND shows - and it would have to be a significant number of shows, not just 10 or 30, at least 100 - then that would give you some confidence.
Just because you seem to recall a $50,000 offer every game does not mean that actually coincides with the real probabilities. Again, the chance of your case holding $50,000 or more is 9 in 26, or only about 35%.
If you wait around for a $50,000 offer, you may actually pass up other, better offers and end up with less.
Obviously, the smaller the number you "settle on," the more confident you can be that it will show up.
[quote]He would be choosing to trade a high-risk, high-reward situation for an almost-sure thing of $50,000. If you hate to gamble, and you place an extremely high value on not ending up with a small amount, this is a great strategy.
When making decisions like you do on this game show, you can't just look at the dollar amount. You have to assign a personal value to each potential result in order to make the decisions that are best for you.[/quote]
I'd argue the other way. You obviously have to be very logical about your choices BUT you shouldn't dump a strong position in the game (all large values on the board) because $50,000 is a lot of money to you.
Choosing a value and leaving when it shows up is a fine strategy especially if you know REAL stats on the true offers of many games. In other words, you know the algorithm.
But to me if you truly understand the algorithm then picking a specific value based on averages of other games is a bad decision. You need to pick target values AS YOU PLAY.
If you eliminate the whole left side of the board, why would you settle for the next offer to come up on the board just because it's over $50,000? There are solid strategies and probabilities that suggest playing just 1 more box will most likely increase your offer. But because you are basing your game on averages of past games, you would screw yourself by quitting.
[quote]If you have a $50,000 debt that's been hanging over your head for years that you'll do anything to get rid of, you'll place a high value on ending the game without the risk of going below that amount.[/quote]
Well that's fine. But again, there are two problems with this approach:
1) you get bad luck (and don't say it won't happen or is impossible) and you get no offers near $50,000. Let's say the max offer is $30,000 and the whole right side is eliminated. Because your only strategy was to play for $50,000, you probably will play to the end. And the odds of getting less than $1000 are 50/50 if you play to the end.
2) you jump on the first offer over or near $50,000; this is a bad move if you're in a great position on the board. There are several situations where it's highly likely that you will get a higher offer if you don't stop. So you may have a 95% chance of getting a higher offer, but because you quit, you blew the extra money.
Knowing the stats on previous games is a good strategy. But it's all a crap shoot. I've played the online version and wiped out most of the high numbers and had a string of bad luck.
You really need to be emotionless and know the odds. You need to change targets during the game and not have preconceived ideas of what a good offer is.
Based on your suggestion, someone may have $25,000 credit card debt and jump on the very first offer. IMHO, it's best to understand the game. Mainly because there are several instances where guessing another box most likely will not hurt your last offer but has potential to drastically increase your next offer.
Confidence changes during the game, fwiw. You can know what past games were, but it doesn't really mean anything if you hit a good or bad patch of luck. You may be giving up a good opportunity or keep playing when you shouldn't.
Realize that the goal you should have is to play to your best possible offer. And that can vary at any time during the game. Being too short-sighted is comparable to being too greedy.
The best way to play is to play the cards you are dealt and decide as you go along.
Posted 31 Oct 2009 at 10:54pm #
I'm too lazy to quote and edit it all...
About the "pay $5000 to play", it is virtually guaranteed that at some point (most likely your first offer), you will see an offer of over $5000. Just take that and quit the game.
About the "I'll quit when I see a $50k offer", you're calculating it wrong. Just because only 9/36 of the cases are worth 50k or more, that does not mean there is a 9/36 chance of seeing an offer of 50k or more at some point during the game. In fact, the vast majority of games have an offer of 50k or higher at some point.
"you shouldn't dump a strong position in the game (all large values on the board) because $50,000 is a lot of money to you"
This would never happen. To reach the position of having all large values on the board, you need to turn down offers when the board is 2/3 large, 1/3 small, for example. During this time you would have received an offer of over 50k and taken the deal, so you would never get a chance to see the "large cases only" board.
"1) you get bad luck "
This is possible. But the player has chosen to accept the approx. 5% chance of failure in exchange for the 95% chance of getting at least 50k.
"2) you jump on the first offer over or near $50,000, this is a bad move if you're in a great position on the board"
Again, if your offer is in the neighborhood of 50k, you will never be in a great position on the board. First because playing the 50k strategy means you never reach a "large cases only" situation, and second because if you were in that situation the offer would be much larger than 50k.
"Based on your suggestion, someone may have $25,000 credit card debt and jump on the very first offer."
Well, it all depends on their situation. If this debt has been ruining his life, he might take the offer. If he makes $80k per year and has $50k in investments he could sell to pay off the debt anytime he wanted, he's much more willing to take risks in order to get more money. And if he's Bill Gates, the optimal strategy is to skip the entire game and just keep his selected case.
Posted 01 Nov 2009 at 10:00am #
It's funny that you replied today. Just last night I saw a woman who eliminated the ENTIRE right-hand side of the board and never had an offer over $10,000. (They replay DOND on GSN.)
I think you vastly overestimate the chances of seeing a $50,000 offer. And the way you talk, you sound like you're just making up these statistics based on what you ASSUME and not what you KNOW.
Realize that the bank offers are not predicated on an average anyway; otherwise you'd see $100,000 offers at the very beginning. And you get rewarded for continuing to play.
So if you get a $50,000 offer early, that means more likely than not, you have a REAL probability of winning more if you continue. So why screw yourself and quit early?
The average case value when you start is about $131,000. So if everyone played until the end, on average the bank would pay out about $131,000 per person. YET, initial offers are usually miserable, even when you eliminate many smaller amounts.
So if you realize that your possible winnings average has shot up to, say, $200,000, why would you stop?
The bank offers are not averages. Early offers are less than the expected value of what you could win. So stopping early is a mistake if you are getting large offers like $50,000.
Likewise, if you have 5 boxes under $1000 and 1 box at $100,000, taking an offer of $20,000 is a smart move, because that is better than your real odds.
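That checks out. With, say, $1, $10, $100, $500 and $750 as the five small survivors (hypothetical values), the average amount in play is under $17,000, so the $20,000 offer beats the expected value of pressing on:

    small = [1, 10, 100, 500, 750]      # hypothetical sub-$1000 survivors
    print((sum(small) + 100000) / 6)    # ~16893.5, less than the $20,000 offer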
So sticking to a preset number is the wrong strategy. If you get that offer early, you're hurting your likelihood of winning more, and if you wait too long for that offer, you're increasing your chances of walking away with less than $1000.
You need to play the board and know your odds. Quit earlier when the board is against you, keep going when it's in your favor. But it seems to make sense that you should keep going to at least the rounds of guessing 1 box at a time. The only time that wouldn't make sense is if you eliminated most of the high value boxes.
Posted 06 Nov 2009 at 1:54am #
Todd: I'm certainly not saying it's impossible to go a whole game without an offer over $50k, just that it's very likely you'll see one. I wish I still had that little program I wrote a couple of years ago; I might have played with it to try to find the approximate odds.
Actually, I just scrolled up and saw that I did some math before that does apply to this-
11 cases remaining: (under 30k average) 3.2%, (30k-80k average) 18.6%, (80-180k average) 57.5%, (over 180k) 20.7%.
Going by those approximate numbers, about 78% of the time you'll have an offer of 50k or higher with 11 cases remaining (this assumes that an 80k average will get you a 50k offer).
So, you have a 78% chance to succeed at just that one point in the game alone. In the other 22% of the games, you'll have many other chances both earlier and later in the game to reach the 50k mark.
Maybe my guess of 95% was high, but I would be surprised if you don't hit 50k in over 88% of the games.
I will agree with one thing... if you eliminate 5 very low cases to start and immediately get an offer of $50k, it's much worse to choose to stop there. This is because early offers are a very low percentage of the remaining case value on the board (sometimes like 30% of the average), whereas later on the offer will be 70-80% of the average or even more.
Posted 01 Nov 2009 at 12:50pm #
There is no set number to reach for; it was just an arbitrary example. The main point I was trying to make is that playing the odds as you would in a casino is the wrong strategy. Those rules are only "smart" if you get to play more than one time. You must keep in mind that good odds aren't any consolation when you're the losing guy. Of course there are some people who will be very unlucky and lose anyway.
The goal is to maximize the chance of YOU winning. (Minimize risk.)
To play the odds alone is to maximize the chance of the HOUSE losing. (Maximize payout.)
One thing I can say absolutely is I would never pick another case if there were just one high-value case left. Being the unlucky person (even if it's 1 in 10) is not worth the risk, even if the offer is crappy. There's nothing crappy about free money.
Posted 01 Nov 2009 at 1:39pm #
You should always keep in mind your odds, but because the bank offers are not strictly based on probability you really need to play to what the bank offers will be.
This is the easiest strategy, IMHO, and it seems to work 99% of the time:
As long as you have at least 2 boxes above $50,000, then keep guessing boxes. Once you get to 1 box at a time you can either keep going until only 1 box above $50,000 is left OR take an offer that seems like a good deal.
The only exception to this is if you are still choosing multiple boxes (4-6 at a time) and you've almost wiped out the entire right side. Then I would jump on a big offer.
The game is really not that challenging in terms of how to play. When I play online, I don't even look at the boxes it picks. I just randomly pick boxes until it gets down to 1 box at a time. Then I figure out when to take the deal.
Posted 06 Nov 2009 at 12:17pm #
Edmond,
That's the thing. The offers are not strictly based on averages. They obviously are to some extent. But figure that after the first 6 boxes are removed, if the $1 million and $500,000 are still there:
$1.5 million / 20 remaining cases = $75,000 from those two alone
So as long as the $1M and $0.5M are both on the board, the average can never be less than about $57,000 ($1.5 million / 26).
But you know that's not the case. Initial offers are usually like $20,000 and most times LESS.
Offers reward you for moving ahead. And typically if there's a large offer, that means you have a good shot of increasing the offer with another box.
You have to be willing to have a little wiggle room. Let's say you get a $75,000 offer, but you know the chances are high of eliminating another small box. But if you choose a big box, the offer may go to $50,000. Why not take the shot?
I compare it to jumping on lily pads in a pond. Do you move ahead or stay safe? How safe is the next box? What are the rewards/losses for jumping to the next lily pad? If it's relatively safe, i.e. your offer won't drastically change, then do it. Otherwise stay put.
Posted 18 Mar 2010 at 2:28pm #
Wow. Can't believe this discussion has gone on so long. Too many comments to read them all, so I apologize if someone else has already cleared this up.
To answer the brother vs. engineer question correctly, you need two basic things:
1. An understanding of the rules of the game (yes)
2. An understanding that there is a difference between probability and odds.
So here we go:
You choose a case from the board to be yours. The PROBABILITY that this specific choice yielded a case with $1M in it is 1/26. This will NEVER change.
Your current ODDS of your case containing the $1M are 1:25
The PROBABILITY of you winning $1M based on the next case you open is 0 and does not improve because the most you can win at this point in the game is what the Banker offers.
Each successive case that you remove that doesn't contain the $1M improves the ODDS of your case containing the $1M, but not the PROBABILITY because the PROBABILITY that your case has $1M in it is linked to the single event of you choosing that case.
If you make it all the way down to the point in the game at which there are only two cases remaining AND you refuse the banker's final offer:
1. PROBABILITY of your chosen case having the $1M is 1/26
Because this choice happens only once
2. ODDS of your case having $1M is 1:1
$1M vs. not $1M
3. PROBABILITY of you winning $1M is 1/2 (Here is the 50/50 chance that everyone gets hung up on)
take your case, or trade for the other case
4. ODDS of you winning $1M is 1:1
$1M vs. not $1M
You can't actually calculate the probability of winning this game, because you can't accurately calculate what each individual who plays will consider a good deal, and hence you can't calculate the probability that they will take the deal at any given point.
Posted 19 Mar 2010 at 11:21am #
Hi Damien,
Yes we pretty much resolved the whole thing a while back, but every once in a while the thread comes back to life. I've followed it for awhile now.
My background: I have a couple of quantitative degrees, and I currently work as a statistician. I'm (still) working on publishing a paper related to probability theory, and (just by coincidence) I happen to be taking a probability theory class right now and living and breathing the stuff (with the goal of adding another degree or at least a certificate to the list).
O.K., so here is how I would describe things.
According to classical probability and statistics, the probability of a case containing the 1,000,000 is 1 in 26, and that never changes. Your pick is a random variable, and does not change.
However, in classical probability and statistics you can then talk about a "conditional probability". Once you have opened 24 cases and not found the 1,000,000 the conditional probability of winning 1,000,000 GIVEN the 24 open cases is 1/2.
There is also another school of probability thought called "Bayesian" probability. Here the probability at the end would be 1/2.
Your distinction about "odds" vs. "probability" is not correct, however. Odds are just a simple calculation you can make from the probability, which just states the same thing in a different way. A horse might have a 1/6 probability of winning and we say the odds are 5:1. This is just (1-p)/p.
However, with a little more detail, your distinction could be made correct. "Odds" are most often associated with gambling, and so is Bayesian probability. It is well established in decision theory.
So if you say the "classical probability" is 1/26, and the "Bayesian odds" are 1/2, you would be correct.
But the clearest statement would be to say that at the end the conditional probability is 1/2. Everyone in both schools would be clear as to what you meant.
Just trying to keep the thread informational at this point.
:o)
Dave
Posted 19 Mar 2010 at 4:18pm #
You can know all the probability in the world, but it doesn't really matter.
The game is pretty simple, you play for the offer. 99% of the time you don't go down to the wire and choose your case and the last case. 99% of the time you take an offer.
Odds and probability are overrated in analyzing this. As long as you have some big numbers on the board, you're going to get a big offer.
The longer you go, the better the offers get.
The key is simple: keep playing while several large numbers are on the board, and quit when there are few. Don't keep playing once the chance of eliminating one of the big numbers gets too high.
Bayes and all that other garbage won't help because the offers are not based strictly on probability. And you always take an offer.
Posted 19 Mar 2010 at 4:29pm #
Todd,
Well, I wasn't offering a strategy for playing the game. Part of the discussion that took place on this thread involved the "Monty Hall" game, and whether or not DOND was an example of such a game (it isn't). My comments are just to help people who might stumble across the thread get the terminology correct.
As far as playing the real game, I agree with you.
It would be helpful if we knew the banker's formula, but we don't. Here on this page I have two versions of the formula used by the on-line game, but neither matches the real game.
http://www.davegentile.com/stuff/Deal_or_no_deal.html
Dave
Posted 15 Nov 2010 at 5:14am #
[quote comment="63292"]Actually wrong on both accounts. There is a gross methodological error whenever someone tries to automatically turn theoretical probability calculation into decision strategy. Theoretical probability in valid only in the limit of large numbers and unless you are measuring something that has underlying, physical, cause you are not guartanteed any convergence of sampling at all with finite number of samples. Also, your sample is of size 1 - you won't be able to repeat the game and average results.[/quote]
It's not wrong, and the fact that your real life experience would only involve 1 attempt is irrelevant.
If the givens are the same (Monty always reveals a losing door every time), then your odds of winning are 2/3 if you switch. You certainly won't be guaranteed to come out ahead, as you would be if you played a large number of times. But even in 1 attempt, the odds are in your favor, and that's all that this is about.
[quote]So Marilyn Vos Savant was actually cheating by presenting the problem as if viewed by a God (over infinite number of samples) and then using the calc for individual case as if there is underlying cause. We are talking the fundamental fallacy of Bayesian "inference" applied to finite, short and ultimately singular samples without an underlying cause.[/quote]
I'm not sure what your complaint is here. The odds in the long run are the same as the odds in a single game. Using Bayesian inference is appropriate.
[quote]Back to actual DOND -- you actually do have more info here -- the banker's offers. With the assumption that the game is fair (that the banker is not allowed to mislead you), the sequence of banker's offers is a function of your 1st choice. If your 1st choice was very high, that will shift the balance in the banker's calc in a way that will make them smoother, just as a very high last value will shift offers towards the average sooner as you approach the end.[/quote]
It's not just a function of your 1st choice, it's a function of your luck in the game. You could pick the $1 case and have very lucky choices after that, getting down to the final 3 cases without ever eliminating the $750,000 and $1,000,000. The banker's offers would be no different from if you picked the million dollar case and kept the $750,000 and $1 in play for a long time. There's no way to get out a calculator and figure out what's in your case.
Posted 21 Nov 2010 at 8:10pm #
My friend and I found the perfect formula (it was a 2-month project at our university). I can't give it out right now because I'm going to use it to play on television soon, but I will post it eventually.
Posted 21 Nov 2010 at 8:13pm #
Hint (just for fun): Edsger Dijkstra
Posted 12 Apr 2011 at 3:13pm #
Holy shit, I ran into this post looking to see if anyone knew whether the banker knows what's in the cases.
IMO it's very simple: as you eliminate cases, your percentage goes up. You start at a little under 4% (1/26) that you have chosen the 1M. These are fantastic odds compared to the lottery or any betting game. Unless of course you have a large bankroll in Vegas - and even then, here you are not betting any of your own money, not even a dollar for a ticket, so the odds in comparison to other games are awesome. That's what makes the game great.
With each case you open, if you don't reveal the million, your odds of winning the million get better. If you do reveal the million dollar case, your odds keep improving, but you are now playing for a smaller prize.
You don't need an algorithm to play this game; you need basic math.
If you have 8 cases, with 1 case holding 100,000 and the rest low cash amounts like .01, 5, 100, etc., and they offer you 30,000 dollars, you need to remember you only have about a 12% chance of holding that big prize. That means you have an 88% chance of leaving poor at that moment. The alluring part is that if you take the chance and start eliminating cases, your odds quickly grow... first by 2%, then by 4%, then 5%, etc., until you reach the ultimate goal of 50/50. Is it worth it? Mathematically speaking you are climbing up a steep mountain from the beginning; however, you started the climb without any risk, so psychologically you'll be no worse off than when you started. And toward the end game the percentage increments are greatly in your favor, but the money risks are usually high. Equating to fun fun FUN!
Personally, if the game had a 1M dollar top prize, my personal goal would be to get to a 100,000 dollar deal and then hit that button. Yet when the blood starts pumping, who knows what someone will do.
Posted 28 Jun 2011 at 7:52pm #
Funny that this turned into a Monty Hall vs. DOND debate with some algorithm chatter sprinkled in. As in the Monty Hall Paradox, it benefits you to switch, for the same reason. 25 out of every 26 times the game is played, the Million is on stage (with the girls), not in your case. If you're lucky enough to have 24 stage cases eliminated without the Million exposed, you'd better go for that 25/26 chance, rather than your 1 in 26. The odds on your case stay the same throughout, but the odds on that last case on stage went up every time another case was eliminated. Keep in mind that the stage cases have a 25/26 (96%) chance of containing the Million. As lower dollar cases go away, fewer and fewer stage cases are sharing that 96%, while your lonely case is resting on the sad odds it had at the beginning... and that's because neither the dollar amounts nor the cases were redistributed at any point during the game, which would be the one scenario that would "reset" the odds and make them equal for ALL remaining cases, even yours.
Posted 01 Jul 2011 at 2:52pm #
Hi Craig,
This is a very old thread, but it comes to life now and then. The issue gets settled, then a new person chimes in. I’ll go another round with it, however.
:o)
My background: 3 quantitative degrees, enrolled in MS Stat, and working as a Statistician.
I'll try it a couple of different ways: classical statistics, Bayesian statistics, and English.
For DOND in classical statistics the random variable is what happens when you select a case. The probability that the case has the $1,000,000 is 1/26 and it stays that way. However, the conditional probability changes as you go along. At the end, the conditional probability that you have the $1,000,000 GIVEN that you’ve opened 24 cases and not found it, is 50%.
For MH in classical statistics the random variable is your initial door selection. You have 1/3rd chance of a car. Now the key here is that Monty does NOT randomly select a door (as you do when you open cases). Monty will always open a door with a goat. So your conditional probability does not change when he shows you the goat. The probability your door has a car, given that Monty has shown you a goat is still 1/3rd, just like when you started.
In Bayesian terms we don't need to talk about conditional probability. "Probabilities" here depend on the available information. In DOND your starting probability of the million is 1 in 26, and when there are two cases left it is 1/2. For Monty, your Bayesian probability is 1/3rd, both before and after he shows you the goat. Opening a case that MIGHT have had the million, and didn't, gives us information that changes our Bayesian probabilities. Monty showing us a goat, when we knew for sure he would, gives us no information, and does not change the probabilities.
Finally in English, let’s think about Monty. There are 3 possibilities. 1) You pick the car, and Monty shows you one of the goats. 2) You pick the first goat and Monty shows you the second goat 3) You pick the second goat and Monty shows you the first goat. In situations 2 and 3 you get a car by switching. In situation 1, you lose. You should switch, because there is a 2/3 chance of winning by switching, and only a 1/3rd chance by standing your ground.
In DOND there are 26 situations. 1) You pick the million for your case 2) The million is in the first case you open 3) The million is in the second case you open…26) The million is in the last case you open. Now ASSUMING we get to the last two cases without finding the million, we want to know if switching helps. Situations 2 through 25 no longer matter, they violate our assumption that we have arrived at the last two cases without finding the million. That is – if we run the experiment 26 times, on average we only get to our target situation 2 times. Of those, one gives us the million, and one does not. Our conditional probability of having the million, given that we’ve arrived at this point is 50%.
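A quick simulation makes the contrast concrete. The only assumed mechanics are the ones just described: Monty knowingly opens a goat door, while DOND cases open at random.

    import random

    def monty_hall_switch(trials=100000):
        """Win rate when you always switch after Monty opens a goat door."""
        wins = 0
        for _ in range(trials):
            doors = [0, 0, 1]  # 1 marks the car
            random.shuffle(doors)
            pick = random.randrange(3)
            # Monty opens a goat door that isn't yours (which goat doesn't matter).
            opened = next(d for d in range(3) if d != pick and doors[d] == 0)
            switched = next(d for d in range(3) if d not in (pick, opened))
            wins += doors[switched]
        return wins / trials

    def dond_final_two(trials=200000):
        """Of games that reach the final two with the $1M alive, how often do YOU hold it?"""
        held_it = reached = 0
        for _ in range(trials):
            cases = list(range(26))  # label 0 = the $1M case
            random.shuffle(cases)
            held, opened = cases[0], cases[1:25]
            if 0 in opened:
                continue             # the $1M was revealed along the way; game doesn't qualify
            reached += 1
            held_it += (held == 0)
        return held_it / reached

    print(monty_hall_switch())  # ~0.667: switching wins 2/3 of the time
    print(dond_final_two())     # ~0.5: a coin flip, so switching gains nothing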
Hope that was clearer than mud.
Dave G.
Posted 01 Jul 2011 at 10:10pm #
Haha, I love that this thread is five and a half years old and still getting posts. My first post here was almost four years ago.
Craig: I'm afraid you have made a common mistake in thinking that it is better to switch cases at the end of DOND. You've forgotten that in the 26th game, you're guaranteed to have the million dollars remaining when you reach the final two, because the million dollar case is in your hands.
There are two ways to reach the final two cases and have the million dollars in play. 1) Get very lucky and select the million for your own case, and 2) Select a losing case, but get very lucky and leave only the million dollar case on the stage. The odds on both of those are the same, so if you find yourself at the end of the game with the million still in play, you are equally likely to have it in your hands or on the stage. So there is no benefit to switching cases.
Posted 04 Jul 2011 at 2:48pm #
I still say playing for the best odds is not the best approach when you only get one chance. I would not play for the best statistical peak offer; I would play for the least risk of loss. I'd rather take very little risk and walk away with $20k guaranteed. The only thing that causes people to keep going is they think the 1MM is so far away that they have lots of time to change their mind and take the money.
Posted 05 Jul 2011 at 2:01pm #
OMG, best thread ever! Thanks guys, I have never heard the DOND piece explained in such a way. I was absolutely forgetting that nearly all scenarios squash the million dollar 'showdown' before it even has the chance to ensue. Much appreciated.
Posted 13 Jul 2011 at 7:01pm #
Though I know this thread is dead as a doornail, not to mention well off-topic, I do hope that one of you can point out a mathematical flaw in the Monty Hall probability tree that a colleague of mine constructed. I saved the jpg to SkyDrive (link below), but in short: he shows the probability of event 1 as 1/3 (agreed). In step 2, he has the host's door-opening probabilities: for each of the two events that began with a goat being chosen, the probability of the host opening the other goat is 1. And for the 3rd scenario, with the car in door 3, the tree splits to accommodate the host opening the first goat door (0.5) or the second goat door (0.5). I also agree with this piece of it. It's the third step where the voodoo probability math takes place. For every branch of the probability tree, he has a new split, one for keeping and one for switching, and he assigns 0.5 probability to each. If you multiply the probabilities through each branch, they indeed come up 50/50 when added. Where is the flaw?? He needs to be taken down, and I've warned him that I'd seek help. Thanks!
https://skydrive.live.com/redir.aspx?cid=ec2da24e100aca13&page=play&resid=EC2DA24E100ACA13!114
Posted 14 Jul 2011 at 1:49pm #
Craig,
I'm afraid your colleague is correct. Think about this. You know that if you refuse to switch your odds of winning are 1/3rd. Switching results in winning 2/3rds of the time. It should not be surprising that if you choose randomly between those two possibilities you win 1/2 of the time.
Another way to look at it is that the history of the game up to the last choice does not matter if the last choice is random. One door with one goat has been opened. There are two doors left, one with a goat, and one with a car. You choose randomly between them - so you win 1/2 of the time.
Posted 14 Jul 2011 at 1:55pm #
Eric,
I agree that it might not always be best to play the odds. The correct strategy might depend on how wealthy you already are. In economics this would be called "utility theory". The first dollars you get are more useful to you than the later dollars; they have greater utility and therefore should be valued more. If you are very poor and $20,000 is a huge amount of money to you, then you should be very risk averse. On the other hand, Bill Gates should just play the odds. Another factor would be if there were some important threshold you needed to reach, e.g. "My house will be foreclosed unless I can get x dollars".
Posted 15 Jul 2011 at 9:42am #
Dave, now I am confused once again. My colleague's probability tree was created specifically to support the theory that choosing to always switch from the original selection was fruitless, and would only yield a car in 50% of the trials. I want to be careful about the wording. His document is not supporting random selection of the final door, as the Monty Hall Paradox, in its traditional representation, doesn't involve randomly choosing stay or switch. His document flies in the face of 'always switching yields victory 67% of the time', a statement we, but not he, know to be true. So do I just have to accept that two theories which cannot co-exist are somehow both correct? Am I wrong to think that the probability tree would show switching wins 67% of the time? Either his math is wrong, or his diagram doesn't represent what he says it does. Right?
Posted 15 Jul 2011 at 10:19am #
Ah, O.K. I didn't know what argument he was trying to make. He has the problem in that case.
In the last row he has the probabilities of our second pick all as 0.5. This means that you randomly choose whether to switch or not. But that is not what we want. We want to switch from our original door with probability = 1, and we want to stay with our first pick with probability = 0. Put those numbers in the diagram and do the math. Now we win with probability = 2/3 , and lose with probability = 1/3.
Hope that helps.
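If the diagram alone doesn't convince him, a quick simulation might. Here is a minimal Python sketch (mine, not from his document; the names are made up) that plays the game with an adjustable switch probability: set it to 0.5 and you reproduce his 50/50; set it to 1 and you get the 2/3.

import random

def play_monty_hall(switch_prob):
    """One game: pick a door, the host opens a goat door, and the player
    switches with probability switch_prob. Returns True if the player wins."""
    doors = [1, 0, 0]  # 1 = car, 0 = goat
    random.shuffle(doors)
    pick = random.randrange(3)
    # The host opens a door that is neither the player's pick nor the car.
    host = random.choice([d for d in range(3) if d != pick and doors[d] == 0])
    if random.random() < switch_prob:
        pick = next(d for d in range(3) if d != pick and d != host)
    return doors[pick] == 1

TRIALS = 100_000
for label, p in [("always stay", 0.0), ("coin-flip", 0.5), ("always switch", 1.0)]:
    wins = sum(play_monty_hall(p) for _ in range(TRIALS))
    print(f"{label:>13}: {wins / TRIALS:.3f}")  # ~0.333, ~0.500, ~0.667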
Posted 15 Jul 2011 at 4:49pm #
Your colleague's probability tree is crap. It's a confusing mess, and its primary flaw is that it assumes you will choose to switch cases 50% of the time.
If you did switch cases 50% of the time, then you would always be 50% to win. But that's not the problem being worked on here.
I fixed his tree to show what it looks like if you choose always to switch: http://i.imgur.com/E1TuG.jpg
There is no need to make an "always not switching" tree, because then you are simply trying to pick a winner out of the three cards: clearly a 1 in 3 chance, and it needs no further proof.
Posted 16 Jul 2011 at 12:04am #
Yes!!!! You guys rock. This is what I was looking for. I didn't realize that his tree assumed random selection during the last step. Woohoo! The probability tree (and its associated YouTube video) are going down!!! Thank you!
Posted 20 Jul 2011 at 1:36am #
Wow, I just stumbled on to this blog post today when I was looking for the algorithm used to determine the banker's offer. I just read every comment. It is interesting seeing the logical traps that people fall for. It is also interesting how confident people are that they are right, even when they are wrong.
===================================
Craig, your colleague's tree is actually just about correct, except he seems to make a silly arithmetic error at the end.
If I'm reading it correctly, the tree claims p(win when switching) = a + c = p(lose when switching) = e + g.
But a = c = 0.165 and e = g = 0.0825, so a + c = 0.33, which does not equal e + g = 0.165. Rather, a + c is exactly double e + g, so one does have a 2/3 chance of winning if one switches.
Or, maybe I am just misreading the tree (its resolution seems to have been reduced from an original version). In any case, Edmond's version is clearer and more concise.
I hope you return to this comment section to tell us your colleague's reaction. Based on what I have observed, he will not be convinced.
Posted 20 Jul 2011 at 1:49am #
This also reminds me of another long-debated problem: the airplane on a treadmill.
http://www.kottke.org/06/02/plane-conveyor-belt
The commenters there who got it wrong were very confident, to the point of calling everyone else idiots.
I think these debates should all teach us a big lesson in humility.
Posted 23 Jul 2011 at 4:53pm #
Jesse,
I will most certainly be posting the reaction, which, I'm in complete agreement, will not exactly be a retraction or a change of heart. But first I'm waiting for his reply: I recently asked him how the sample trials in support of the theory went. I think the experimental piece is being conveniently left out, as we are meant to believe the theoretical tree stands just fine on its own.
--That Jet on a Treadmill one is a classic. Admittedly, I was a flip-flopper the first time I heard it. It was around the time I first heard that problem that I realized that when someone asks a question like that, it's usually not simply to confirm that the most popular and likely answer from the masses is correct.
Posted 24 Jul 2011 at 1:36pm #
The Monty Hall problem doesn't apply to DOND because Monty is adding information to the situation. In DOND there is no additional info added. If you were allowed to switch your case during the game, that would make it more like Monty.
Intuitively you may think it's harder to pick away at cases to get down to one remaining than it is to choose the right case from the beginning. But they are the same odds.
As for the Monty problem, the best way to think of it intuitively is to use 1000 doors rather than just 3 doors.
If you chose a single door out of 1000 and Monty removed 998 doors, intuitively you would know that the OTHER door is the best choice. Right? I mean, initially you had essentially no chance of getting it right, so the other door must almost certainly be it.
When you only use 3 doors, the odds are so close, unlike with 1000 doors, that it almost seems like you aren't getting an advantage. But it's the same concept. Your first guess was 1/3, meaning the odds are still in favor of you being wrong. Monty is telling you where the most likely door is.
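The same intuition is easy to check by simulation for any number of doors. A minimal sketch (assuming the host always leaves exactly one other door closed; the names are mine):

import random

def monty_n_doors(n, switch):
    """Monty Hall with n doors: the host opens n-2 goat doors, leaving the
    player's pick and one other closed door. Returns True on a win."""
    car, pick = random.randrange(n), random.randrange(n)
    # The one other closed door is the car unless the player already has it,
    # in which case it is a random goat door.
    other = car if car != pick else random.choice([d for d in range(n) if d != pick])
    return (other if switch else pick) == car

N, TRIALS = 1000, 100_000
for switch in (False, True):
    wins = sum(monty_n_doors(N, switch) for _ in range(TRIALS))
    print(f"switch={switch}: {wins / TRIALS:.4f}")
# Staying wins about 1 time in 1000; switching wins about 999 in 1000.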
Posted 24 Jul 2011 at 6:19pm #
Hey,
I'm making a Deal or No Deal computer game... I would like to know the best formula people have settled on for calculating the banker's offer.
For now I am using just an extremely simple formula that works decently for the simpletons. It is as follows.
Offer = SumOfCases / 4
Thanks.
Posted 24 Jul 2011 at 7:20pm #
Oops,
I would still like to have a better algorithm.
But the simpleton algorithm I put up is wrong from what I actually used.
It is Offer = MeanOfCases / 4 (doesn't do that great).
After many more tests in an Excel graph computing the values, I made a more 'slider' algorithm. This can probably be done with one correct value, but the slider algorithm follows the offer line nicely at the same level. It is as follows (varying based on offer #):
1st offer (20 cases) = Mean / 3.5
2nd offer (15 cases) = Mean / 3
3rd offer (11 cases) = Mean / 4
4th offer (8 cases) = Mean / 5
5th offer (6 cases) = Mean / 3
6th offer (5 cases) = Mean / 3
7th offer (4 cases) = Mean / 2
8th offer (3 cases) = Mean / 2
9th offer (2 cases) = Mean / 1.5
10th offer (1 case) = Case Value
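In case it's useful to anyone building the game, here is that schedule transcribed into a minimal Python sketch (the divisors are from the list above; the function and variable names are just for illustration):

# The schedule above, transcribed directly.
DIVISORS = [3.5, 3, 4, 5, 3, 3, 2, 2, 1.5]  # offers 1 through 9

def banker_offer(remaining_cases, offer_number):
    """remaining_cases: dollar amounts still in play; offer_number: 1-10."""
    if offer_number >= 10:  # one case left: the offer is its value
        return remaining_cases[0]
    mean = sum(remaining_cases) / len(remaining_cases)
    return round(mean / DIVISORS[offer_number - 1])

print(banker_offer([400_000, 500_000], 9))  # 450000 / 1.5 = 300000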
Posted 25 Jul 2011 at 8:52am #
[quote comment="67750"]Oops,
I would still like to have a better algorithm.
But the simpleton algorithm I put up is wrong from what I used.
it is Offer = MeanOfCases / 4 (doesn't do that great)
After many more tests in an exel graph with it computing the values I made a more 'slider' algorithm. Probably this can be done with the correct value but the slider algorithm is follows the offer line nicely at the same level. it is as follows. (varrying based on offer #)
1st offer (20 cases) = Mean / 3.5
2nd offer (15 cases) = Mean / 3
3rd offer (11 cases) = Mean / 4
4th offer (8 cases)
= Mean / 5
5th offer (6 cases)
= Mean / 3
6th offer (5 cases)
= Mean / 3
7th offer (4 cases)
= Mean / 2
8th offer (3 cases)
= Mean / 2
9th offer (2 cases)
= Mean / 1.5
10th offer (1 case)
= Case Value[/quote]
I see some problems.
Let us say that the two remaining cases are 400k and 500k. Your algorithm would then offer 450k/1.5 = 300k, which makes no sense.
Also, why does the 4th offer use mean/5 when the third was mean/4?
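One possible fix, offered only as a sketch of the idea and not as the show's actual formula: tie the offer to the current mean with a fraction that ramps smoothly toward 1 as cases are opened, so late offers approach the average instead of overshooting it.

# A guess, not the show's formula: the offer is a fraction of the mean,
# and the fraction rises from 1/4 early on to 1.0 at the very end.
def ramped_offer(remaining_cases, total_cases=26):
    mean = sum(remaining_cases) / len(remaining_cases)
    progress = 1 - (len(remaining_cases) - 1) / (total_cases - 1)  # 0 at start, 1 at end
    fraction = 0.25 + 0.75 * progress ** 2  # starts near mean/4, ends at the mean
    return round(mean * fraction)

print(ramped_offer([400_000, 500_000]))  # ~423,500: near the 450k mean, never above it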
Posted 25 Jul 2011 at 1:32pm #
For the algorithm - here is my page about what the on-line game uses:
http://www.davegentile.com/stuff/Deal_or_no_deal.html
I'm not sure if that will be helpful or not.
Dave
Posted 25 Jul 2011 at 1:43pm #
On the psychology of why some people don't get this one, even after a lot of thinking about it: I think some people think of probability as a physical process. Bayesian probability, or conditional probability in classical statistics, involves taking information (a non-physical thing) into account. If probability is a physical thing, and the two final doors are physically the same, then the probability should be 50%, or so that line of reasoning goes.
I did see an online video defending the 50/50 idea. The person there kept saying, "Don't pick anything at first, then the host will show a goat, then you will have a 50/50 chance." True, but he changed the problem, of course. It did make me notice that we give the host some information too: we must tell the host our pick, which limits his options, and then he has to show us a door. There is a two-way information exchange. Just one more tiny detail in a very over-analyzed problem.
Posted 25 Jul 2011 at 2:18pm #
Regarding the plane:
========
Here's the original problem essentially as it was posed to us: "A plane is standing on a runway that can move (some sort of band conveyer). The plane moves in one direction, while the conveyer moves in the opposite direction. This conveyer has a control system that tracks the plane speed and tunes the speed of the conveyer to be exactly the same (but in the opposite direction). Can the plane take off?"
========
The problem is ill-posed. What must the conveyor belt's speed (relative to the ground) be equal to: the speed of the plane relative to the ground, or the speed of the plane relative to the belt? If the belt speed must equal the speed of the plane relative to the ground, then obviously the plane can take off. At some point the plane will be moving 200 mph forward, the belt will be moving 200 mph backwards, and the speed of the plane relative to the belt will be 400 mph.
On the other hand, if the belt speed must equal the speed of the plane relative to the belt, then the only way this is possible is if the plane remains motionless relative to the air and ground. Such a plane cannot take off. But the question then is really "Is such a belt even possible?" Since the plane is pushed forward by its jets, not by motorized wheels, can ANY belt possibly keep the plane still? Probably not. The wheels have some friction, so a moving belt can pull the plane back a little, but not much. So in this case the answer is that such a belt does not seem possible, but to the extent that it is possible, the plane can't take off.
Posted 28 Jul 2011 at 12:45pm #
Yeah, the plane problem is confusing; I think that was intentional too. Most people will assume the plane's engine and means of moving forward on the ground are similar to a car's, when that isn't true at all. The whole question is just a trick: a belt like that is impossible, so the plane will take off.
Posted 29 Jul 2011 at 2:47pm #
As for the plane puzzle, the answer is yes under both assumptions: whether the wheels drive it or the jets push it.
If the jets push the plane, then the wheels are insignificant. The wheels will simply rotate twice as fast as the jets are causing the plane to advance. Think of it as having a pencil between both of your hands and pushing your hands in opposite directions. It just spins the pencil.
So the plane can get up to speed, as it's not being hindered from gaining velocity.
If the wheels drive the plane then again, the plane can take off. Here is why.
If you have a control system tracking the plane to counter its speed, then there must be a fraction of time during which the plane is going faster than the conveyor. The only way this could be avoided is if the conveyor knew the plane's speed in advance and matched it simultaneously. That's not the case.
So if the plane needed to get to 200 mph to take off, theoretically it would need to accelerate to that velocity in the fraction of time that the conveyor would be tracking the speed.
So let's say it took 1 sec for the conveyor to determine the plane's speed.
At time 0 sec, plane = 0 mph, conveyor = 0mph
At time 1 sec, plane = 200 mph, conveyor = 0 mph
At time 2 sec, plane = 200 mph, conveyor = 200 mph; but the plane has taken off!
Obviously it depends on the "control system" and the ability of the plane to accelerate. But theoretically there is a time window where the conveyor is standing still. Also, the plane could gradually gain speed due to the time lag of the control system.
At time 0 sec, plane = 0 mph, conveyor = 0mph
At time 1 sec, plane = 50 mph, conveyor = 0 mph
At time 2 sec, plane = 100 mph, conveyor = 50 mph
At time 3 sec, plane = 200 mph, conveyor = 100 mph
Posted 31 Dec 2011 at 2:55pm #
[quote comment="68904"]I know its been years but curiousity caught me to your post. Your brother is right. Since your choice in case is made in the beginning revealing cases has no actual effect on your actual chances of winning statistically. Sure at that point in time with four cases left you can deduce that you will win 25% of the time. However, your choice was not effected. Just because you were lucky enough non million cases to be revealed doesn't change the fact that the choice was made with a 1/27ish.[/quote]
I love that this discussion still gets new posts!
James I'm sorry but you've made a mistake... Marty was correct and his brother was mistaken.
It is true that the contestant's chosen case had a 1/27 chance of containing the million. But so did the other 3 cases that are remaining on the stage! There are four cases in play, all of which began with a 1 in 27 chance of containing the million dollars.
Since there is no other place that the million could be, it's clear that they can't all still be 1 in 27 odds, because that only adds up to a 4/27 chance, when you know the reality is there is a 27/27 chance that the million is still in play.
So yes... the odds do change as the game is played out. Each of the cases now has a 1 in 4 chance of containing the million (the contestant's case and the three cases on stage are treated no differently... they are simply the four cases that haven't been opened yet). Marty's brother is wrong - switching cases does not help or harm your chances, it's 25% no matter what you do.
Please note, this ONLY applies if this stage of the game has been reached through good luck (which is required to get down to 4 cases and still have the million in play).
If it has been reached through someone manipulating the game (such as a gameshow host removing cases that he knows are not the million dollar case), then the odds have been affected by this outside interference, and in this situation you WOULD be better off switching.
Posted 02 Jan 2012 at 12:13am #
[quote comment="68906"]
I love that this discussion still gets new posts!
James I'm sorry but you've made a mistake... Marty was correct and his brother was mistaken.
It is true that the contestant's chosen case had a 1/27 chance of containing the million. But so did the other 3 cases that are remaining on the stage! There are four cases in play, all of which began with a 1 in 27 chance of containing the million dollars.
Since there is no other place that the million could be, it's clear that they can't all still be 1 in 27 odds, because that only adds up to a 4/27 chance, when you know the reality is there is a 27/27 chance that the million is still in play.
So yes... the odds do change as the game is played out. Each of the cases now has a 1 in 4 chance of containing the million (the contestant's case and the three cases on stage are treated no differently... they are simply the four cases that haven't been opened yet). Marty's brother is wrong - switching cases does not help or harm your chances, it's 25% no matter what you do.
Please note, this ONLY applies if this stage of the game has been reached through good luck (which is required to get down to 4 cases and still have the million in play).
If it has been reached through someone manipulating the game (such as a gameshow host removing cases that he knows are not the million dollar case), then the odds have been affected by this outside interference, and in this situation you WOULD be better off switching.[/quote]
Ah, how long ago was this post of mine? I am surprised how naive I was about statistical mathematics.
I may have explained it wrong back then, but you are not entirely correct either.
At the start of the game, when you select the first case, you had a 1/27 chance of picking the correct one. On this, you agree with me.
Each other case has a 1/27 chance of being correct. On this, you also agree with me.
However, the odds of "any other" case being correct are 26/27 = about 97%, a very high percentage!
As the number of cases dwindles to four, new statistics are created, but the odds from before are not manipulated. Even with four cases left, you still have a 97% chance of being wrong.
Divided between the three remaining cases, that's roughly a 32% chance. Any of those three would be a better choice than the one in your hand, ten-fold.
Therefore, it is in your best interest to switch cases when given an opportunity to do so, even after each case is checked.
And to touch on outside interference, if such a thing happened, then nobody would ever win the million unless the game show wanted them to, in which case they have all the control of the statistics, not you.
Posted 02 Jan 2012 at 11:59am #
James wrote: Any of those three would be a better choice than the one in your hand, ten-fold.
Therefore, it is in your best interest to switch cases when given an opportunity to do so, even after each case is checked.
Dave says:
That's not correct. Here is how the problem would be described in the language of classical statistics. The "random" variable is your initial choice. There is a 1/28 chance you pick the million and a 27/28 chance that you pick something else. As the game progresses, that probability remains unchanged. BUT the conditional probability that you have the 1,000,000 DOES change. If we get to 4 remaining cases and the million has not been revealed, then the CONDITIONAL PROBABILITY of any of the remaining cases having the million, GIVEN the known event that all the opened cases did not have the 1,000,000, is 1/4, and there is no benefit to switching. The conditional probability is all we care about for decision making at that point, because we know that the events we are conditioning on are true.
Another example. Suppose the first case we reveal has the 1,000,000. The probability that you chose the right case is STILL unchanged. It is a fixed event. It is still 1/28. BUT... the conditional probability is now zero that you have the right case, GIVEN that the 1,000,000 has already been revealed.
For the record, there is a benefit to switching in the Monty Hall game, but the situation there is a little different, and I don't want to confuse it with the above situation.
Also for the record, if we wanted to talk about Bayesian statistics then the probabilities do change during the game, because Bayesians use the same words to mean different things when compared to the Classical stat nomenclature.
But no matter what lingo you use - there is no benefit to switching cases in this game.
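For anyone still unconvinced, this is easy to verify by simulation: play many games, keep only the runs where the million survives to the final four, and compare staying with switching. A minimal Python sketch (28 cases to match the arithmetic above; all names are mine):

import random

def final_four_trial(n_cases=28, keep=4):
    """One play-through. Returns (stay_won, switch_won), or None if the
    million was revealed before reaching the final `keep` cases."""
    million = random.randrange(n_cases)
    pick = random.randrange(n_cases)
    others = [c for c in range(n_cases) if c != pick]
    random.shuffle(others)
    opened, still_closed = others[:n_cases - keep], others[n_cases - keep:]
    if million in opened:
        return None  # condition not met: discard this run
    switch_to = random.choice(still_closed)  # swap to a random other unopened case
    return pick == million, switch_to == million

stay = switch = kept = 0
for _ in range(300_000):
    result = final_four_trial()
    if result is None:
        continue
    kept += 1
    stay += result[0]
    switch += result[1]
print(f"{kept} runs reached the final four with the million still in play")
print(f"stay wins {stay / kept:.3f}, switch wins {switch / kept:.3f}")  # both ~0.25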
Posted 02 Jan 2012 at 5:42pm #
[quote comment="68919"]At the start of the game, when you select the first case, you had a 1/27 chance of picking the correct one. On this, you agree with me.
Each other case has a 1/27 chance of being correct. On this, you also agree with me.
However, the odds of "any other" case being correct are 26/27 = about 97%, a very high percentage!
As the number of cases dwindles to four, new statistics are created, but the odds from before are not manipulated. Even with four cases left, you still have a 97% chance of being wrong...[/quote]
Sorry, but this is where you are making your mistake- the odds just don't work that way.
With four cases left, each of them began the game with a 97% chance of being a losing case. But with only four left in play, that cannot still be true. If the million dollars is still in play, then there cannot also be only a combined 12% chance that one of the four cases is holding it. It has to be a 100% chance.
Let me come up with another similar situation - suppose you buy a raffle ticket where 100 tickets are sold, and there is 1 guaranteed prize - the ticket gives you a 1% chance to win the prize.
Suppose for entertainment purposes, they drew two raffle tickets, asked the ticket owners to come up on stage, and then there would be a coinflip to determine who wins the prize.
Once you are up on stage, you would not say your chances of winning are still 1% just because that's what your starting odds were. Because you have been very lucky to get this far, you have found yourself in a situation where your odds to win are now 50%.
It's the same thing with the 4 cases in DoND. You have been very lucky if you find yourself that far into the game and you still have the million dollars in play. Your odds of holding the winner are now 25%, not the 3% you began with.
[quote]
Divided between the three remaining cases, that's roughly a 32% chance. Any of those three would be a better choice than the one in your hand, ten-fold.
Therefore, it is in your best interest to switch cases when given an opportunity to do so, even after each case is checked.
And to touch on outside interference, if such a thing happened, then nobody would ever win the million unless the game show wanted them to, in which case they have all the control of the statistics, not you.[/quote]
Posted 02 Jan 2012 at 5:47pm #
Oops, I hit submit too quickly.
[quote comment="68919"]
And to touch on outside interference, if such a thing happened, then nobody would ever win the million unless the game show wanted them to, in which case they have all the control of the statistics, not you.[/quote]
I brought that up because this interference is exactly what happened in the Monty Hall "three doors" game. The host would often interfere by intentionally removing losing prizes from the game. Because of this, it actually is beneficial to switch when the game is being manipulated like that.
Fortunately there's never been a game show where they remove winning prizes on purpose.
Posted 02 Jan 2012 at 8:22pm #
Interestingly, the Monty Hall problem scenario DOES apply here, because we are under the assumption that upon reaching the four cases left, one of those remaining four is the million. Whether or not the "game master" was the one who removed them is irrelevant; all that matters is that the condition above exists.
I do agree that the instantaneous probability does change, clearly, because at that instant in time there is a 25% chance that the case has a million.
I do not agree that this instantaneous probability takes precedence or re-evaluates the original probability, which is that picking the first case carried a 97% fail rate. Regardless of how many cases are left, that 97% remains, under the presumption that the game is still in progress (e.g. if the million is revealed before only four cases are left, the game of finding the million is already over). This is true because a moment in time existed when the case had a 97% fail rate.
In response to the "combined percentage being 100%": that may be true, but keep in mind that when considering probabilities from multiple instants in time, we are discussing averages, not summations. The combined percentage of 12% corresponds to a [97% * 4 cases = 388%] scenario, where (388 + 12)/4 = 100%.
Posted 02 Jan 2012 at 9:17pm #
Well, first I'll reassert my above argument. Also, I suppose I should add that I am a statistician by profession, and I actually went to school and studied things like this.
But now I'll add the description of the Monty Hall game from the classical stat point of view.
In that game we start with a 1/3 probability of having picked the car and a 2/3 probability of having picked a goat. That probability is fixed and cannot change, just as in the millionaire game. However, the conditional probability in both games can evolve. In MH we are told that Monty ALWAYS reveals a goat. No matter what you picked in the first place, he will always show you a goat. So what is the conditional probability that you picked the car, given that Monty showed you a goat? Still 1/3rd, because 100 percent of the scenarios are still possible. (1/3) / 1.00 = 1/3. We should switch to the remaining door.
Let's return to the millionaire game. Let's suppose that we pick case #1 and start revealing 28, 27, etc., until we get to 4 cases left: 1, 2, 3, and 4. Now let's think about our probability. GIVEN that cases 5-28 are no good, what is the conditional probability of having picked correctly? 1 in 4. (1/28) / (4/28) = (1/4). That is: there is a 1/28 initial probability of picking correctly, but that many cases will be opened successfully in only 4 of 28 scenarios, and that is what we are conditioning on.
Posted 02 Jan 2012 at 11:36pm #
Thank you for demonstrating your credibility by stating you are a statistician; that makes this discussion easier for both of us.
Take note that the original post stated: "after the 22 other cases have been opened and none of them have the $1m in them"
This correlates to the idea you stated, "ALWAYS reveals a goat," because all 22 cases had the proverbial goat in them. Thus the Monty Hall problem remains, just with 27 doors, and with you, not the host, revealing the goats.
And as a statistician, you will already know that the Monty Hall analysis is correct: switching is the proper choice.
Posted 03 Jan 2012 at 6:31am #
[quote comment="68926"]Interestingly, the Monty Hall problem scenario DOES apply here, because we are under the assumption that upon reaching the four cases left, one of those remaining four is the million. Whether or not the "game master" was the one who removed them is irrelevant, all that matters is that the condition above exists..[/quote]
The Monty Hall scenario does not apply here. It is extremely relevant as to how this situation was reached.
In the Monty Hall scenario, the gameshow host always interferes with the game's odds, and always does so by removing a losing prize. The contestant will always find himself in a situation where the winning prize is still in play.
In DoND it's completely different. There is never any interference from outside. The contestant will very rarely find himself in a late-game situation where the million dollars is still in play - and if he does, it's because he's been very lucky already.
In Monty Hall, the winning prize is ALWAYS there, and the contestant will ALWAYS be in a situation where he can win it.
In DoND, the winning prize will often be eliminated early, and it's rare that the contestant will be in a situation where he can win it.
They're entirely different scenarios, and you cannot use the same math for both. One is a single-decision, rigged-gameplay show... the other is a step-by-step show with many decisions and a lot of luck involved. They aren't the same at all.
[quote]
I do agree that the instantaneous probability does change, clearly, because at that instant in time there is a 25% chance that the case has a million.
I do not agree that this instantaneous probability takes precedence or re-evaluates the original probability, which is that picking the first case carried a 97% fail rate. Regardless of how many cases are left, that 97% remains, under the presumption that the game is still in progress (e.g. if the million is revealed before only four cases are left, the game of finding the million is already over). This is true because a moment in time existed when the case had a 97% fail rate.
[/quote]
Let's say the contestant chose case #1, and has already opened cases 5 through 27.
Please tell me why you believe that case #2, #3, and #4 have better odds of containing the million than case #1.
Now let's suppose that the contestant chose case #4, and left #1-3 in play. Would case #4 magically be an inferior choice now, while #1 somehow has its chances improved?
The reality is that all four cases are the ones that the contestant has chosen to leave in play at this point in the game, and all four have equal chances to hold the million. The fact that one of them has been sitting in front of the contestant is irrelevant to the odds.
Posted 03 Jan 2012 at 6:35am #
[quote comment="68930"]Thank you for demonstrating your credibility by stating you are a statistician, that makes this discussion easier for both of us.
Take note that the original post stated: "after the 22 other cases have been opened and none of them have the $1m in them"
This correlates to the idea you stated, "ALWAYS reveals a goat," because all 22 cases had the proverbial goat in them. Thus the Monty Hall problem remains, just with 27 doors, and with you, not the host, revealing the goats.
And as a statistician, you will already know that the Monty Hall analysis is correct: switching is the proper choice.[/quote]
In DoND, you do not always open 22 cases and reveal 22 losers.
If that was guaranteed to happen (by the host interfering and removing losing cases for you), then the Monty Hall situation applies and you would be increasing your odds by switching cases.
But it's not.
Posted 03 Jan 2012 at 11:12am #
Whether or not that's how it works in an ordinary game of Deal or No Deal is irrelevant.
The question he asked is one scenario of Deal or No Deal where all but four cases have been revealed not to have the million. In this scenario it fits the Monty Hall problem. Yes, you are correct that Deal or No Deal itself does not guarantee you will keep the million in play, but in this one instance he created in his question, it did play out to have the million left in a remaining suitcase. That is what matters here. In this one single scenario he created in his question, the Monty Hall problem exists, only because he has already guaranteed (to use your own word) that 22 cases are without the million.
Posted 03 Jan 2012 at 3:14pm #
[quote comment="68934"]Whether or not that's how it works in any ordinary game Deal or No Deal is irrelevant.
The question he asked is one scenario of Deal or No Deal where all but four cases have been revealed to not have the million. In this scenario, it fits the Monty Hall problem. Yes, you are correct that Deal or No Deal itself does not guarantee you will keep the million in play, but in this one instance he creates in his question, it did play out to have the million left in a remaining suitcase. That is what matters here. In this one single scenario he created in his question, the Monty Hall problem exists, only because he has already guaranteed (to use your own word) 22 cases to be without the million.[/quote]
All I can say is that there is a flaw in your logic here. The odds just don't work that way. You seem to have decided that a situation looks like the Monty Hall situation, therefore it must be the same, and that just isn't correct.
I mishandled the HTML in a previous response; let me ask this again:
Let's say the contestant chose case #1, and has already opened cases 5 through 27.
Please tell me why you believe that case #2, #3, and #4 have better odds of containing the million than case #1.
Now let's suppose that the contestant initially chose case #4, and left #1-3 in play. Would case #4 magically be an inferior choice now, while #1 somehow has its chances improved?
The reality is that all four cases are the ones that the contestant has chosen to leave in play at this point in the game, and all four have equal chances to hold the million. The fact that one of them has been sitting in front of the contestant (instead of left on the stage) is irrelevant to the odds.
Posted 03 Jan 2012 at 11:35am #
James, there is still a nuance of difference here, and it is important.
First, before we get to conditional probabilities, here are the initial straight probabilities:
In Monty Hall, the random variable at the start has three possible outcomes: car, goat 1, and goat 2.
In Millionaire - the random variable at the start has 28 possibilities - the million, other#1, other#2, other#3 and then other#4 etc...
So there is a tree with 3 branches in the first case, and a tree with 28 branches in the second case.
Now for the conditionals. In Monty Hall we are told (GIVEN) that we are on a branch where Monty will reveal a goat. Well, he reveals a goat ON ALL THREE BRANCHES. Saying that we will see a goat eliminates none of the possible future worlds. Had we chosen differently, it would not have been otherwise; we still would have seen a goat.
But... in the other game, saying that we get down to 4 cases tells us something different. It tells us that we are not in one of the worlds where the million turns up in an opened case. Those branches are eliminated; we know we don't go down those paths (or didn't go down those paths), so now we are looking at a tree with only 4 branches that we have not eliminated, and the chance that we are on the winning branch is 1 in 4.
So, at the start there are 28 possible worlds. 24 of those possible worlds end up with the million being revealed before we get to the final 4. 3 of those worlds have the million in some remaining case, other than ours at the final 4, and 1 world has us winning. At the start 1 of the 28 possible worlds has us winning. By 4 cases, however, we know we are not in the other 24 worlds, and one of the 4 possible worlds is a winner for us.
Again, in Monty Hall, there were three possible worlds at the start. After he reveals the goat, we could still be in any of the three possible worlds. Our chance of winning with our original choice is still 1 in 3.
Posted 03 Jan 2012 at 12:49pm #
Doing it with cards - borrowed from Wikipedia:
In Monty Hall: use the red twos and the Ace of Spades, and deal one card to the player. Look at the remaining two cards and discard a red two. Then record whether the player would win by switching. They will win by switching any time they were dealt a red 2 (2/3rds of the time).
In Millionaire (let's go with 27 total cases): use all the red cards and the Ace of Spades, and deal one card to the player. Now throw out all but the bottom 3 cards of the remaining deck. Is one of them the Ace of Spades? If not, the results of this experiment are discarded; this is one of the 23-of-27 scenarios where the Ace of Spades is already gone by this point. In the scenarios we do record, about 1/4th will result in the player winning with his original choice. If the Ace was the top card of the original deck, he wins. If it was one of the bottom 3 cards, it is a scenario we count. If the Ace was anywhere else, we're not looking at that scenario. The Ace is on top in 1 of the 4 scenarios that matter to us, and we win with our original choice in 1/4th of those scenarios.
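For anyone who would rather not shuffle actual cards, here is a rough Python version of both experiments as described above (a sketch; the function names are mine):

import random

def monty_with_cards(trials=100_000):
    """Two red twos and the Ace of Spades; the dealer discards a red two
    from the two cards the player didn't take. Counts wins by switching."""
    wins = 0
    for _ in range(trials):
        cards = ["2H", "2D", "AS"]
        random.shuffle(cards)
        rest = cards[1:]  # the player holds cards[0]
        rest.remove("2H" if "2H" in rest else "2D")  # discard a red two
        wins += rest[0] == "AS"  # the card we would switch to
    return wins / trials

def millionaire_with_cards(trials=400_000, deck_size=27):
    """26 red cards plus the Ace of Spades. Keep only the runs where the
    Ace survives (player's card or bottom 3); count original-pick wins."""
    wins = counted = 0
    for _ in range(trials):
        deck = ["red"] * (deck_size - 1) + ["AS"]
        random.shuffle(deck)
        player, bottom3 = deck[0], deck[-3:]
        if player != "AS" and "AS" not in bottom3:
            continue  # Ace already gone: discard this run
        counted += 1
        wins += player == "AS"
    return wins / counted

print(f"Monty Hall, always switching: {monty_with_cards():.3f}")       # ~0.667
print(f"Millionaire, original pick:   {millionaire_with_cards():.3f}")  # ~0.250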
Posted 09 Feb 2012 at 10:48pm #
I don't really care about the whole DOND vs Monty Hall problem, but I do have a comment regarding the formula that governs the banker's offer. There is indeed an algorithm in play, and it is always the same for each episode. Also, it is strictly mathematical and not adjusted by the "banker" based on the contestant's prior choices.
Proof: Whenever there are 5 or fewer cases in play and the contestant makes a deal, the host asks the contestant to simulate what would have happened if they had kept going to the final 2 cases. Each time the contestant chooses a case, Howie says "and your offer would have been $_____". The offer is immediately calculated and displayed, with no time to fudge the offer at all. There have been a few episodes where there are identical cases remaining, and the offers have all been identical.
Also, there were two episodes (one in season 2 and one in season 3) where there were identical cases remaining, and the offer was identical in both cases.
To me, this proves that the banker's offer is a strict algorithm, quickly calculated via computer and displayed on-screen following an imaginary "phone call" from the banker. In fact, the banker serves no purpose whatsoever, other than to play the "bad guy" to increase suspense and give the show more "filler" to take up 22 minutes per episode.
However, what exactly the algorithm is, I haven't been able to solve. When I do, I will post it here first.
Posted 21 May 2013 at 8:16am #
The problem that I have noticed is that people play the game because they want to walk out with a million instantly, yet they ignore so many offers. The online version is very tricky, but the simple one to play has a very simple formula that works all the time: 26 cases, pick 1 case, and you are left with 25 cases; open 6 cases, then 5 cases, then another 5 cases; then look at the deal being offered to you. If it's higher than the amount you put in as your stake, take the deal, walk away, and keep on playing.
Posted 04 Apr 2016 at 6:51pm #
If the show should come back, I would suggest a Fair Value, similar to equity futures: the Fair Value is subtracted from the Future Charge/Change to give the Implied Open. I don't know if Deal or No Deal (DoND) actually used a fair value system; if they didn't, and should the show come back some day, I suggest:
if NoDeal then Open = FinalReveal
Suggested offer for case XX, with $1 / $1,000,000 still in play:
WIN   FUTURE CHARGE   FAIR VALUE        IMPLIED OFFER
1M    500,000         + 99 = 16,853     = $517,000
1U    500,000         + 99 = 25,944     = $526,000
ACTUAL OFFER:
AverageDeltaSolution > Round + LSolution
AverageDelta: HighestImplOffer + LowestImplOffer
HighestImplOffer - [(HIO * LIO) / FV]:
526K - [271.942M / 99] = 521K (variable LSolution)
LowestImplOffer + [(HIO * LIO) / FV]:
517K + [271.942M / 99] = 522K (variable HSolution)
Delta: 522K - 521K = 1K (variable DeltaSolution)
Average: 1K = 500U > 1K + 521K = 522K
OFFER: $522K
This can be a potential final bank offer. It can also be a below-FutureCharge offer, calculated with AverageOfferIncrease > 40, 50, 60, 70, 80, 90, 100, 110 or 120% of the FirstOffer for Rounds 1 to 9 respectively. That means if someone gets a first offer of $36K and then $43K (increase 16%), the fair value would be down instead of up, and that is variable between 4 and 12 with 8 as the center. 4 means the offer would be $488K, 12 means $464K, if the player turned down every offer.
This was part 1. Part two in a week!