bid_em_up Posted July 18, 2007

What Han and bid_them_up are talking about I don't understand :) What I understand is that you can pick a random number equally probable to any other between X and Y, but you cannot do that between X and infinity. I think Hannie is saying that because you cannot assign a number to infinity, you cannot define 1/2 of infinity or two times infinity, so the problem is not possible to solve mathematically. Is this correct, Han?

It doesn't matter what number you pick, is what I am saying. Given that infinity is infinity, you can let infinity = A. But if A = infinity, you can always come up with 2 times infinity (because it's infinity, it can always be twice as large as itself, or 1/2 as much as it originally was). You are using infinity as a finite number in this case (even though it really is not). Logically it makes sense (to me) to let infinity be equal to N+1, in which case the problem still breaks down to:

A = N+1
B = 1/2(N+1) or B = 2(N+1)
2B = 2 1/2 (N+1)
B = 1 1/4 (N+1)
1 1/4 (N+1) > N+1

Han is way above my head though in his examples. :)
helene_t Posted July 18, 2007

Here is a similar problem: There are two identical boxes, each box contains two identical envelopes. In the first box the envelopes contain $2,500 and $5,000; in the second box the envelopes contain $5,000 and $10,000.

I don't think this is a similar problem. If I run a simulation on this, it would be correct to switch:

$2,500 -> switch: you will win +$2,500 or +$7,500
$5,000 -> switch: you will win +$5,000 or lose -$2,500
$10,000 -> do not switch (-$5,000 or -$7,500)

Which relates to the initial problem: the higher the amount you have, the less you want to switch. So what am I missing?

Han's two-box problem is a special case of the original problem. He's postulating the particular probabilities:

(2500; 5000): 50%
(5000; 10000): 50%

Given those particular probabilities, you can compute the solution. In this case it happens to be: switch if the first envelope contains $5,000 or $2,500, otherwise don't switch. You can try with some other probabilities. Suppose, for example, there are 2,000 boxes, 999 of which contain (2500; 5000) and one contains (5000; 10000), the remaining 1,000 being irrelevant once you know the first envelope contains $5,000. In that case you should not switch if the first envelope contains $5,000.

I think Hannie is saying that because you cannot assign a number to infinity, you cannot define 1/2 of infinity or two times infinity, so the problem is not possible to solve mathematically. Is this correct, Han?

No, 2*Inf = Inf and Inf/2 = Inf. That's not a problem. The problem is that if there's an infinite number of boxes, some must have higher probabilities than others, since otherwise each particular box would have probability 1/Inf = 0 and therefore it would be impossible to open a box; events with zero probability never happen. I know this sounds very counter-intuitive; it took me a long time to fully understand it myself.
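For the simulation-minded, Han's two-box special case is easy to check directly. Here is a quick Python sketch (the trial count and seed are arbitrary choices of mine). Conditioned on the first envelope showing $5,000, the two boxes are equally likely, so switching averages $6,250:

```python
import random

def simulate(trials=200_000, seed=1):
    # Two equally likely boxes, each holding two envelopes (Han's setup).
    boxes = [(2500, 5000), (5000, 10000)]
    rng = random.Random(seed)
    keep_total = switch_total = count = 0
    for _ in range(trials):
        pair = rng.choice(boxes)      # pick a box at random
        i = rng.randrange(2)          # open a random envelope from it
        first, other = pair[i], pair[1 - i]
        if first != 5000:             # keep only the trials where we see $5,000
            continue
        keep_total += first
        switch_total += other
        count += 1
    return keep_total / count, switch_total / count

keep_ev, switch_ev = simulate()
print(round(keep_ev), round(switch_ev))  # keeping stays at 5000; switching averages ~6250
```

Under the 999-boxes-of-(2500; 5000) prior mentioned above, the same loop (with `boxes` reweighted accordingly) flips the answer to "don't switch".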
BebopKid Posted July 18, 2007

I think when looking at when to switch, this question must be asked: is reducing my take by 50% acceptable? When I watch game shows, I see people frustrated at losing who could have left with a win. The only reason I would want to look at the odds of increasing my take by 100% is if I can accept the result of keeping 50% of my take. When we're greedy, we tend to look at the odds of increasing our take first. I think that's the purpose behind the question.
bid_em_up Posted July 18, 2007

I think Hannie is saying that because you cannot assign a number to infinity, you cannot define 1/2 of infinity or two times infinity, so the problem is not possible to solve mathematically. Is this correct, Han?

No, 2*Inf = Inf and Inf/2 = Inf. That's not a problem. The problem is that if there's an infinite number of boxes, some must have higher probabilities than others, since otherwise each particular box would have probability 1/Inf = 0 and therefore it would be impossible to open a box; events with zero probability never happen.

I'm talking envelopes. There are only two of them. :) It doesn't matter if envelope A has $1, $5, $50, $1,000,00 or $N+1 in it. The other envelope will contain 1/2 or two times that amount. I don't get the part about some must have higher probabilities than others. But that's ok, I don't really want to either. :D
goobers Posted July 18, 2007

Han is right. Fluffy's reference to an upper limit is close but not quite. Suppose the probabilities of the first envelope containing x dollars are:

$x  prob
1   1/2
2   1/4
3   1/8
4   1/16
5   1/32
6   1/64
... etc.

This allows for no upper limit, but Han's argument still holds.

The problem with these statements is that the envelopes in question contain a finite (and given) amount: $5,000 in the first envelope, call it Envelope A. All the other "probabilities" are meaningless. The only thing that matters in this problem is: does Envelope B hold 1/2 as much as Envelope A, or does it hold twice as much as Envelope A? B must equal either 1/2 A or 2A, and A is equal to $5,000 in the problem as Justin has proposed it. So there is either $2,500 or $10,000 in envelope B; no other numbers matter. Person B is saying his Expected Value is greater if he switches envelopes. I happen to agree. His expected value is the sum total of what he receives by selecting B.

(B = 1/2A) + (B = 2A) = (2B = 2 1/2 A) = (B = 1 1/4 A)

His expected value (on average) is 1 1/4 A. Let's say he does this 4 times, and meets the 50/50 odds. He draws 2500, 2500, 10000, 10000 = 25000/4 = 6250, which is greater than the 5000 he started with. If he does it 1000 times, 10000 times or a million times, his average expected value still works out to be 6250, which is greater than his current amount of $5,000. QED. :D (Yes, I am sure there are flaws in my math arguments, but the logic is the same.)

There's something wrong with this. You say (B = 1/2A) + (B = 2A), but that isn't correct. Your value for A is inconsistent. It should either be 1/2A and A, or A and 2A, but not 1/2A and 2A. I think. This problem is giving me a headache.
Echognome Posted July 18, 2007

I know they formulated a harder version of the problem, but at least to me it's clear we cannot assume that the envelopes are equally likely, since in the setup they haven't defined a clear prior distribution over the envelopes. They only state that there are an infinite number of them. Thus they cannot be equally likely, or we would not have a probability function.
helene_t Posted July 18, 2007

Maybe it's easier to understand if we replace the boxes and envelopes with something for which we actually have a feeling for what the probabilities are. Suppose you're told that in heterosexual couples the male is 50% heavier than the female (of course this is not always the case, but let's assume it for the sake of argument). Now you pick a random person, and you can choose either to receive the person's weight (in dollars per kilogram) or the person's spouse's weight. By the envelope argument you may reason that you're better off choosing the spouse, which can't be true, of course. The solution is that it depends on the weight of the first person. If the first person weighs 100 kilograms, you reason that it's probably the male, so you'd better stay put. If the first person weighs 40 kg, you reason that it's probably the female, so you'd better switch.
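The weight analogy is also easy to simulate. A sketch in Python, where the wife's weight is assumed uniform between 45 and 75 kg (my assumption, just to make the example concrete): switching blindly gains nothing over staying put, but switching based on the observed weight does:

```python
import random

def play(strategy, trials=200_000, seed=7):
    # Payoff: you collect the chosen person's weight in dollars per kilogram.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        wife = rng.uniform(45, 75)       # assumed weight distribution (mine)
        husband = 1.5 * wife             # "the male is 50% heavier"
        if rng.random() < 0.5:           # you meet a random member of the couple
            seen, other = wife, husband
        else:
            seen, other = husband, wife
        total += other if strategy(seen) else seen
    return total / trials

always_stay = play(lambda w: False)
always_switch = play(lambda w: True)
informed = play(lambda w: w <= 75)       # above 75 kg it can only be the husband
print(always_stay, always_switch, informed)
```

Always-stay and always-switch both average the same (about 75 here), while the weight-based rule does strictly better, which is exactly helene_t's point: the right decision depends on the amount you observe.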
Guest Jlall Posted July 18, 2007

Very interesting stuff guys, thanks :D
bid_em_up Posted July 18, 2007

Your value for A is inconsistent. It should either be 1/2A and A, or A and 2A, but not 1/2A and 2A. I think. This problem is giving me a headache.

In the problem as given, you are told one envelope is half the value of the other. You are then allowed to open one of the envelopes; it contains $5,000. Call this envelope A. But you do not know if it was originally A or originally B. :D A is $5,000. You are told this. The unopened envelope must contain either $2,500 (1/2 of A) or $10,000 (making A = 1/2 of B, i.e. B = 2A). The problem with your idea is you are assuming you always opened envelope A, when in fact you do not know which one it originally was.
Fluffy Posted July 18, 2007

Since otherwise each particular box would have probability 1/Inf = 0 and therefore it would be impossible to open a box; events with zero probability never happen. I know this sounds very counter-intuitive; it took me a long time to fully understand it myself.

I tried to set up a simulation for that (picking any number of random numbers and making a fully equally probable set of infinite numbers), and I knew it would be impossible; I didn't try very hard because I knew the answer before I started, actually. It seemed very intuitive to me. Anyway, my point is that when trying to set up a simulation (I am a computer engineer, not a mathematician), I have to analyze every case. I've solved some problems this way: when I try to 'digitize' the real world, I come up with the answer. A good example is the 'ropes and pieces' puzzles, where two things seem to be attached to each other, but actually if you move them the right way you can separate them without breaking anything. If you try to simulate them, you have to determine which movements are possible and which are not. But by the time you find which movements are possible, you don't need any simulation to solve the problem.
sceptic Posted July 18, 2007

Easy, this one: you are going to walk away with either 2500 or 10000; you will not walk away with 5000, unless that is an amount of money that would make a difference to your life. I would think this is a psychological issue, not a maths one.
han Posted July 18, 2007

I think Hannie is saying that because you cannot assign a number to infinity, you cannot define 1/2 of infinity or two times infinity, the problem is not possible to solve mathematically. Is this correct, Han?

No. I don't even know what this infinity you talk about is.
han Posted July 18, 2007

I know they formulated a harder version of the problem, but at least to me it's clear we cannot assume that the envelopes are equally likely, since in the setup they haven't defined a clear prior distribution over the envelopes. They only state that there are an infinite number of them. Thus they cannot be equally likely or we would not have a probability function.

I like this explanation.
Blofeld Posted July 18, 2007

Han is of course correct. The fallacy is this: although you have picked between the two envelopes randomly, it is in general false that the other envelope will have more money in it 50% of the time.

To see that this cannot be true, imagine you open envelope A and find 1 cent in it. Envelope B can't have half a cent in it, so must have 2 cents, and the probability it has more money is 100% rather than 50%. This is just an illustration of the fact that our intuition in this area is often wrong, and it relies on the accidental fact that money isn't infinitely subdivisible. If you allow for arbitrarily small amounts of money, the argument doesn't hold. But the one that Han gave rules out any assignment of possible amounts of money to envelopes under which the other envelope always has a 50% chance of holding more money. In fact, what the "paradox" shows is precisely that such a distribution cannot exist!
han Posted July 18, 2007

Here is a similar problem: There are two identical boxes, each box contains two identical envelopes. In the first box the envelopes contain $2,500 and $5,000; in the second box the envelopes contain $5,000 and $10,000.

I don't think this is a similar problem. If I run a simulation on this, it would be correct to switch:

$2,500 -> switch: you will win +$2,500 or +$7,500
$5,000 -> switch: you will win +$5,000 or lose -$2,500
$10,000 -> do not switch (-$5,000 or -$7,500)

Which relates to the initial problem: the higher the amount you have, the less you want to switch. So what am I missing?

It is similar in that you have $5,000 and the other envelope contains either $2,500 or $10,000, and it is equally likely. Here you would want to switch. The original problem is not possible. It starts with an impossible assumption.
Trumpace Posted July 18, 2007

Two people were having the following discussion:

A: Say you have 2 envelopes filled with money. The value of one envelope is half the value of the other. These values can approach infinite $.
B: Ok.
A: Say you are handed one envelope and open it; it contains 5000 dollars. You are offered the choice to switch envelopes. Do you?
B: Of course. The other envelope could have either 10,000 dollars or 2,500 dollars. My EV is higher by switching.

Person B is obviously wrong, but I cannot find the flaw in his argument. Can you prove specifically why the last statement of person B is wrong? (I'm not looking for "there are 2 possible cases" etc.; I'm looking specifically for the flaw in person B's thinking, which must be there and which I'm missing.)

I am not sure if people have already said what I am going to say, but I will say it anyway. The answer is, "it depends". Unless we are given the "method" with which the numbers were chosen, we cannot tell if switching is bad. For instance, we can pick the numbers with probabilities in such a way that, if the envelope you open contains $5000, you are better off not switching. By using a different "method" we can pick the numbers in such a way that, if the envelope you open contains $5000, you are better off switching. (If you want concrete examples, I refer you to http://www.ocf.berkeley.edu/~wwu/cgi-bin/y...14781;start=25)

Basically, there is no _always switch_ or _never switch_ strategy which works. It is totally dependent on the underlying "method" with which the numbers were chosen.
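The point that it depends on the "method" can be made concrete with a tiny calculation (the two example priors below are mine, not from the thread). Both candidate pairs contain $5,000, and each pair yields a $5,000 draw with the same probability 1/2, so the posterior after seeing $5,000 simply mirrors the prior over the pairs:

```python
def switch_ev(prior_low, prior_high):
    # prior_low  = P(pair is ($2,500, $5,000)), i.e. $5,000 is the larger amount
    # prior_high = P(pair is ($5,000, $10,000)), i.e. $5,000 is the smaller
    # Each pair produces a $5,000 draw with chance 1/2, so those factors cancel
    # and the posterior given a $5,000 draw is just the prior.
    return prior_low * 2500 + prior_high * 10000

print(switch_ev(0.75, 0.25))  # 4375.0: worse than keeping $5,000, don't switch
print(switch_ev(0.25, 0.75))  # 8125.0: better than keeping $5,000, switch
```

With a 50/50 prior the function returns 6250.0, reproducing Han's two-box case; shifting the prior either way flips the decision, which is exactly the "it depends" answer.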
han Posted July 18, 2007

To see that this cannot be true, imagine you open envelope A and find 1 cent in it. Envelope B can't have half a cent in it, so must have 2 cents, and the probability it has more money is 100% rather than 50%.

I don't really like this argument, but I agree with the rest of your post. (Although you shouldn't call it "Han's argument", because I didn't come up with it myself; it's a well-known problem.)
Trumpace Posted July 18, 2007

The original problem is not possible. It starts with an impossible assumption.

I don't think the problem makes an impossible assumption. The problem never stated any uniform distribution of the numbers; it was B who thought so. The problem is perfectly valid, and if asked whether you should switch or not, you should just say "incomplete information". :D
han Posted July 18, 2007

I don't think the problem makes an impossible assumption. The problem never stated any uniform distribution of the numbers; it was B who thought so. The problem is perfectly valid, and if asked whether you should switch or not, you should just say "incomplete information". :D

You are absolutely correct.
Blofeld Posted July 18, 2007

I don't really like this argument, but I agree with the rest of your post. (Although you shouldn't call it "Han's argument", because I didn't come up with it myself; it's a well-known problem.)

OK, I admit that I don't really like that argument either.
bid_em_up Posted July 18, 2007

To see that this cannot be true, imagine you open envelope A and find 1 cent in it. Envelope B can't have half a cent in it, so must have 2 cents, and the probability it has more money is 100% rather than 50%.

I will be happy to scan a U.S. Half Cent for you, so I don't like it either. :D (Yes, at one time they did exist...)
BebopKid Posted July 18, 2007

Does this make any difference? What if...

You are given three envelopes: one green and two white. You are told that you can look in the green envelope. It contains twice as much as one of the white envelopes and half as much as the other. You may select any of the three envelopes to leave with.
Echognome Posted July 18, 2007

By the way, the problem as postulated by Barry Nalebuff is fairly well known from his book Thinking Strategically. However, the way he posed the problem there, you had a finite number of envelopes. I think the example he gave was $5, $10, $20, $40, $80, and $160. Note that here the equally-likely assumption holds without issue. He then said: suppose two people are given envelopes and told that one contains twice as much as the other. Say person A opens his envelope and finds $20. If he postulates that person B's envelope contains $40 half the time and $10 half the time, then he should switch. Then if person B opens and finds $40, he might postulate that the other envelope contains $20 half the time and $80 half the time.

This finite version of the problem is easy to solve via backward induction. You start with a person who opens an envelope with $160. He will *never* agree to switch. Thus, if you open an envelope with $80, you should reason: if the other person has $160, they will never agree to switch, so I shouldn't agree to switch. Thus, the person with $80 will not agree to switch. Then the person who opens the $40 envelope will reason that the person who opens $160 won't switch, therefore the person with $80 won't agree to switch, thus I should not agree to switch. And so on down to the person who finds the smallest amount, who always agrees to switch. The extension to an infinite number of envelopes takes away this endpoint. However, you can extend the reasoning to any finite number of envelopes.
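The backward induction can be sketched in a few lines of Python. The envelope chain follows the post; reading "switch" as a mutual trade that needs both holders to agree is my framing of it:

```python
# Backward induction for Nalebuff's finite version with the $5..$160 chain.
amounts = [5, 10, 20, 40, 80, 160]

agrees = {160: False}        # $160 can only be the larger amount: refuse
for a in [80, 40, 20, 10]:
    # The 2a-holder refuses (already computed), so a completed trade would
    # mean the partner holds a/2, a sure loss; refuse as well.
    agrees[a] = agrees[2 * a]
agrees[5] = True             # $5 can only be the smaller amount: agree

# A trade over the pair (a, 2a) happens only if both holders agree:
trades = [a for a in amounts if agrees[a] and agrees.get(2 * a, False)]
print(agrees)
print(trades)  # [] -- no trade is ever completed
```

Only the $5 holder ever agrees, and the $10 holder refuses, so no trade happens at all. That is the endpoint argument the infinite version destroys.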
Blofeld Posted July 18, 2007

Does this make any difference? What if... You are given three envelopes: one green and two white. You are told that you can look in the green envelope. It contains twice as much as one of the white envelopes and half as much as the other. You may select any of the three envelopes to leave with.

Yes, this makes a difference. Now the argument with expectations works, because you know you've got a 50% chance each of getting half as much and twice as much, so you should pick a white envelope (assuming your aim is to maximise expected winnings).
david_c Posted July 19, 2007

But the one that Han gave rules out any assignment of possible amounts of money to envelopes under which the other envelope always has a 50% chance of holding more money. In fact, what the "paradox" shows is precisely that such a distribution cannot exist!

Hmm, I believe that last sentence is incorrect, and I think people are missing the point of what's really paradoxical about this.

Owen here makes the good point that the conditional probability P(our envelope contains the smaller amount | our envelope contains $M) is not necessarily the same as the a priori probability (which is 1/2, of course). We're trying to work out the expectation from switching given that our envelope contains $M, and in order to do this we need to know the probability that our envelope contains the smaller amount given that it contains $M. That is, the a priori probabilities are useless; what we need to know are the conditional probabilities. And Han's very first post proved that it is not possible for all the conditional probabilities to be 1/2. (Or to put it another way, we can't arrange things so that all possible amounts of money are equally likely.)

BUT: even though the conditional probabilities can't all be 1/2, we can make them as close to 1/2 as we like.

For example, we can put amounts into the envelopes according to a probability distribution in such a way that all the conditional probabilities, for every possible amount $M, are between 0.49 and 0.51. [For the smart math people: just take a very slowly decaying distribution. For example, let the probability that the envelopes contain $2^n and $2*2^n be k.(1-epsilon)^(|n|), for every integer n.]

In this case, when analysing whether it's right to switch for a particular amount M, the calculations are so close to what they would be if the probabilities were 1/2 that it makes no difference. The conclusion is: for every amount M you see in the envelope, your expectation if you switch is greater than M. (In fact, greater than 1.2M, say. We can get any multiple less than 1.25.)

Is this a paradox? It shouldn't be; it's true. Do you believe me?
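This construction can be checked numerically. A sketch with q = 1 - epsilon = 0.99 (my choice of epsilon); an envelope showing $2^m is either the smaller member of pair m or the larger member of pair m-1:

```python
# Pairs are ($2^n, $2*2^n) for every integer n, with prior weight q^|n|.
# The weights sum to (1+q)/(1-q), which is finite, so this really is a
# normalisable distribution.
q = 0.99

def p_pair(n):
    return q ** abs(n)               # unnormalised prior weight of pair n

def p_smaller(m):
    # You saw $2^m: it is the smaller amount of pair m, or the larger amount
    # of pair m-1. Within a pair either envelope is opened with chance 1/2,
    # so the 1/2 factors cancel and only the pair weights matter.
    return p_pair(m) / (p_pair(m) + p_pair(m - 1))

for m in (-50, -1, 0, 1, 50):
    ps = p_smaller(m)
    ev_ratio = 2 * ps + 0.5 * (1 - ps)   # E[other envelope] / (amount seen)
    assert 0.49 < ps < 0.51              # every conditional probability is near 1/2
    assert ev_ratio > 1.2                # so switching "gains" over 20% at every amount
print("all conditional probabilities in (0.49, 0.51); switch EV > 1.2 * M")
```

Every conditional probability works out to either q/(1+q) or 1/(1+q), both within 0.005 of 1/2 for this q, so the switching expectation exceeds 1.2M at every amount, exactly as claimed.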