Guest Jlall Posted July 18, 2007

Two people were having the following discussion:

A: Say you have 2 envelopes filled with money. The value of one envelope is half the value of the other. These values can approach infinite $.
B: Ok.
A: Say you are handed one envelope, open it, and find that it contains 5000 dollars. You are offered the choice to switch envelopes. Do you?
B: Of course. The other envelope could have either 10,000 dollars or 2,500 dollars. My EV is higher by switching.

Person B is obviously wrong, but I cannot find the flaw in his argument. Can you prove specifically why the last statement of person B is wrong? (I'm not looking for "there are 2 possible cases" etc.; I'm looking for the specific flaw in person B's thinking, which must be there and which I'm missing.)
han Posted July 18, 2007

B is indeed wrong, or rather, the set-up of the problem isn't possible. Before you can answer the problem, you would have to describe how likely each combination is. The problem assumes that every combination is equally likely, i.e. $2,500 + $5,000 is just as likely as $5,000 + $10,000, and likewise for every other combination. But this isn't possible.

To see why, let's assume that we do have such a distribution of probabilities. The chance that the lowest amount lies between 1 and 2 must be some positive number, say "x". But then the chance that it lies between 2 and 4 must also be x, and the chance that it lies between 4 and 8 must also be x. So the chance that the lowest amount lies between 1 and 2^n is x + x + x + ... = n*x, and for large n this is larger than 1! That isn't possible.

I'm playing bridge atm so I'll try to do a better job later.
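In symbols, the counting argument above reads as follows, writing L for the lower of the two amounts (the label L is just a convenience for this sketch, not part of the post):

    \Pr(1 \le L < 2^{n}) \;=\; \sum_{k=0}^{n-1} \Pr(2^{k} \le L < 2^{k+1}) \;=\; n\,x,

which exceeds 1 as soon as n > 1/x, so no prior that makes every doubling equally likely can exist.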
Guest Jlall Posted July 18, 2007

> This is a well known paradox. Let me think about how to explain it best, I'll edit this post.

I knew you would know ♥. Maybe it would have been well known to me if I hadn't dropped out of school :)
Fluffy Posted July 18, 2007

I know a very similar problem. If you run a simulation you will find the answer: whatever the limit is, it is not infinite, and when the amount you hold is close to the limit, your expected income from switching is exactly half of what you have. Since that is the maximum possible loss, it just evens out the (comparatively small) expected wins in the normal cases.
Fluffy Posted July 18, 2007

> This is a well known paradox. Let me think about how to explain it best, I'll edit this post.

Come on, you are a cheater. I saw the post before: you 'took the place' without answering, just to get in first. Grrrrrrrrrr :)
G_R__E_G Posted July 18, 2007

There are a number of interesting "solutions" listed here: Two Envelopes Problem
Guest Jlall Posted July 18, 2007

> I know a very similar problem. If you run a simulation you will find the answer: whatever the limit is, it is not infinite, and when the amount you hold is close to the limit, your expected income from switching is exactly half of what you have. Since that is the maximum possible loss, it just evens out the (comparatively small) expected wins in the normal cases.

I don't think this is the answer. (This is what person A initially suggested as well.)
Guest Jlall Posted July 18, 2007

> There are a number of interesting "solutions" listed here: Two Envelopes Problem

Woah, I didn't realize this was such a well known problem, lol. I feel better about not being able to come up with a satisfactory solution.
Fluffy Posted July 18, 2007

> There are a number of interesting "solutions" listed here: Two Envelopes Problem

Thanks Greg. I found this one brilliant:

> Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y, but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

BTW, the problem I knew was the one that appears at the bottom:

> Two people, equally rich, meet to compare the contents of their wallets. Each is ignorant of the contents of the two wallets. The game is as follows: whoever has the least money receives the contents of the wallet of the other (in the case where the amounts are equal, nothing happens). One of the two men can reason: "Suppose that I have the amount A in my wallet. That's the maximum that I could lose. If I win (probability 0.5), the amount that I'll have in my possession at the end of the game will be more than 2A. Therefore the game is favourable to me." The other man can reason in exactly the same way. In fact, by symmetry, the game is fair. Where is the mistake in the reasoning of each man?
Guest Jlall Posted July 18, 2007

> B is indeed wrong, or rather, the set-up of the problem isn't possible. Before you can answer the problem, you would have to describe how likely each combination is. The problem assumes that every combination is equally likely, i.e. $2,500 + $5,000 is just as likely as $5,000 + $10,000, and likewise for every other combination. But this isn't possible.
>
> To see why, let's assume that we do have such a distribution of probabilities. The chance that the lowest amount lies between 1 and 2 must be some positive number, say "x". But then the chance that it lies between 2 and 4 must also be x, and the chance that it lies between 4 and 8 must also be x. So the chance that the lowest amount lies between 1 and 2^n is x + x + x + ... = n*x, and for large n this is larger than 1! That isn't possible.

Hi, I'm talking in a pure math sense; I don't care about the applied form. It is completely hypothetical in my mind, in a world where infinity exists. There is no upper limit. This is also the solution Taleb offers, but it seems flawed to me; then again, you obviously know more than I do. I am now reading the wiki article on this.
Guest Jlall Posted July 18, 2007

To the infinity people... from the wiki:

> However, Clark and Shackel argue that blaming it all on "the strange behaviour of infinity" doesn't resolve the paradox at all, neither in the single case nor in the averaged case. They show this by providing a simple example of a pair of random variables both having infinite mean, but where one is always better to choose than the other. [12] This is the best thing to do at every instant as well as on average, which shows that decision theory doesn't necessarily break down when confronted with infinite expectations.
Fluffy Posted July 18, 2007

> > I know a very similar problem. If you run a simulation you will find the answer: whatever the limit is, it is not infinite, and when the amount you hold is close to the limit, your expected income from switching is exactly half of what you have. Since that is the maximum possible loss, it just evens out the (comparatively small) expected wins in the normal cases.
>
> I don't think this is the answer. (This is what person A initially suggested as well.)

I probably didn't explain it very well. If there is a limit on the money (for example, there is a limit to the real money you can put in a bag), then whenever you have more than half the limit you cannot win by switching; you have an 'automatic' loss. But you can always lose by switching (except maybe when you have 1, but that case is so small compared to the other possibilities that it carries no weight in the calculation).

Now, if you say there is no limit, I guess my answer doesn't fit and the answer is what Han said: there is a distribution such that when A is big enough, the chance of B being greater is very low, low enough to even things out.
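A quick simulation sketch of the capped version Fluffy describes (the uniform prior on the smaller amount, the cap of 100, and all names are assumptions for illustration, not part of the original problem): overall, always switching gains nothing, while switching when the observed amount exceeds the cap can only lose.

    import random

    CAP = 100
    TRIALS = 200_000

    def trial():
        s = random.randint(1, CAP)          # smaller amount, uniform up to the cap
        pair = (s, 2 * s)
        keep = random.choice((0, 1))        # which envelope we happened to open
        return pair[keep], pair[1 - keep]   # (amount kept, amount if we switch)

    kept, switched, above_cap_gains = 0, 0, []
    for _ in range(TRIALS):
        a, b = trial()
        kept += a
        switched += b
        if a > CAP:                         # we must be holding the larger envelope
            above_cap_gains.append(b - a)   # so switching from here always loses

    print("average if we never switch :", kept / TRIALS)
    print("average if we always switch:", switched / TRIALS)
    print("average gain from switching when the amount exceeds the cap:",
          sum(above_cap_gains) / len(above_cap_gains))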
helene_t Posted July 18, 2007

Han is right. Fluffy's reference to an upper limit is close, but not quite it. Suppose the probabilities of the first envelope containing x dollars are:

    $    prob
    1    1/2
    2    1/4
    3    1/8
    4    1/16
    5    1/32
    6    1/64
    ...  etc.

This allows for no upper limit, but Han's argument still holds.
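Under one possible reading of this prior (the smaller amount N is a whole number of dollars with P(N = n) = 2^-n, and the pair is (N, 2N); that reading, and the function name below, are assumptions for illustration), the conditional expected gain from switching can be worked out directly, and it is not always positive:

    def expected_gain_from_switching(a):
        """Expected gain from switching, given that the opened envelope shows a dollars."""
        w_small = 0.5 * 2 ** -a                                  # weight of: we hold the smaller of (a, 2a)
        w_large = 0.5 * 2 ** -(a // 2) if a % 2 == 0 else 0.0    # weight of: we hold the larger of (a/2, a)
        p_small = w_small / (w_small + w_large)
        return p_small * a + (1 - p_small) * (-a / 2)

    for a in (1, 2, 3, 4, 6, 8, 10):
        print(a, round(expected_gain_from_switching(a), 3))

    # Odd amounts must be the smaller envelope, so switching gains +a there;
    # a = 2 is exactly break-even; from a = 4 upward switching loses on average.

The point is that once a genuine prior is in place, seeing the amount tells you something: a large amount is far more likely to be the bigger envelope of a small pair than the smaller envelope of a big one, so the two cases are no longer 50/50.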
G_R__E_G Posted July 18, 2007

I've never heard this problem before either, Justin - thanks for posting it. I'm still struggling with some of the explanations on Wiki, as the basic "I'm either losing 50% or gaining 100%, so I'll switch" seems to make sense. Anyway, the bottom line to this whole thing is.... to hell with the math - I'm switching. I feel lucky today. :)
bid_em_up Posted July 18, 2007

> Han is right. Fluffy's reference to an upper limit is close, but not quite it. Suppose the probabilities of the first envelope containing x dollars are:
>
>     $    prob
>     1    1/2
>     2    1/4
>     3    1/8
>     4    1/16
>     5    1/32
>     6    1/64
>     ...  etc.
>
> This allows for no upper limit, but Han's argument still holds.

The problem with these statements is that the envelopes in question contain a finite (and given) amount: $5000 in the first envelope, call it Envelope A. All the other "probabilities" are meaningless. The only thing that matters in this problem is: does Envelope B hold half as much as Envelope A, or does it hold twice as much? B must equal either 1/2 A or 2A, and A is equal to $5000 in the problem as Justin has proposed it. So there is either $2500 or $10000 in envelope B; no other numbers matter.

Person B is saying his expected value is greater if he switches envelopes. I happen to agree. His expected value is the average of what he receives by selecting B: 0.5*(1/2 A) + 0.5*(2A) = 1.25*A. His expected value (on average) is 1 1/4 A.

Let's say he does this 4 times and meets the 50/50 odds. He draws 2500, 2500, 10000, 10000 = 25000/4 = 6250, which is greater than the 5000 he started with. If he does it 1000 times, 10000 times or a million times, his average expected value still works out to be 6250, which is greater than his current amount of $5000. QED. :)

(Yes, I am sure there are flaws in my math arguments, but the logic is the same.)
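A two-line check of the calculation bid_em_up describes, taking as given his assumption that the $2,500 and $10,000 cases are equally likely (that 50/50 assumption is precisely what the replies below question):

    import random

    TRIALS = 100_000
    draws = [random.choice((2_500, 10_000)) for _ in range(TRIALS)]
    print(sum(draws) / TRIALS)    # close to 0.5*2500 + 0.5*10000 = 6250, which is > 5000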
han Posted July 18, 2007

> Hi, I'm talking in a pure math sense; I don't care about the applied form.

♥'s back for that comment, Justin! But I was talking about the pure math sense. I tried to explain it without using math lingo. With the lingo it would go like this: "there isn't a probability measure on the positive real numbers that assigns equal mass to equal intervals", or something like that.
han Posted July 18, 2007

Here is a similar problem. There are two identical boxes, and each box contains two identical envelopes. In the first box the envelopes contain $2,500 and $5,000; in the second box the envelopes contain $5,000 and $10,000. Anne and Beth pick one of the boxes, and then each picks an envelope. Anne opens her envelope and sees that it contains $5,000. She is given the option of switching envelopes. Should she?

When you think about this problem, you'll notice that the solution Fluffy quoted:

> Let the amounts in the envelopes be Y and 2Y. Now by swapping, if you gain you gain Y, but if you lose you also lose Y. So the amount you might gain is equal to the amount you might lose.

doesn't work. So apparently there is something else going on in the original problem; it is not as easy as that.
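A simulation sketch of this two-box variant (the box contents come from the post above; the uniform choice of box and envelope, and all names in the code, are assumptions for illustration). Conditional on Anne seeing $5,000, switching really is profitable here, which is why the "gain Y, lose Y" argument can't be the whole story:

    import random

    BOXES = [(2_500, 5_000), (5_000, 10_000)]
    TRIALS = 400_000

    keep_total, switch_total, count = 0, 0, 0
    for _ in range(TRIALS):
        box = random.choice(BOXES)          # the box Anne and Beth end up with
        i = random.choice((0, 1))           # the envelope Anne happens to open
        opened, other = box[i], box[1 - i]
        if opened == 5_000:                 # keep only the situations Anne actually faces
            keep_total += opened
            switch_total += other
            count += 1

    print("keep  :", keep_total / count)    # exactly 5,000
    print("switch:", switch_total / count)  # about 6,250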
helene_t Posted July 18, 2007

> > Han is right. Fluffy's reference to an upper limit is close, but not quite it. Suppose the probabilities of the first envelope containing x dollars are 1/2 for $1, 1/4 for $2, 1/8 for $3, 1/16 for $4, 1/32 for $5, 1/64 for $6, etc. This allows for no upper limit, but Han's argument still holds.
>
> The problem with these statements is that the envelopes in question contain a finite (and definite) amount, $5000. All the other "probabilities" are meaningless. The only thing that matters in this problem is: does Envelope B hold half as much as Envelope A, or does it hold twice as much? B must equal either 1/2 A or 2A, and A is equal to $5000 in the problem as Justin has proposed it. So there is either $2500 or $10000 in envelope B; no other numbers matter.

I'm talking about the prior probabilities, before the first envelope was opened. Some would complain that there are no such probabilities since we don't know anything, but that simply isn't possible. There is always some probability distribution that describes our (lack of) prior knowledge.
han Posted July 18, 2007

> > Han is right. Fluffy's reference to an upper limit is close, but not quite it. Suppose the probabilities of the first envelope containing x dollars are 1/2 for $1, 1/4 for $2, 1/8 for $3, 1/16 for $4, 1/32 for $5, 1/64 for $6, etc. This allows for no upper limit, but Han's argument still holds.
>
> The problem with these statements is that the envelopes in question contain a finite (and definite) amount, $5000. All the other "probabilities" are meaningless.

It is not meaningless. You will only be able to understand the problem if you take the complete distribution of probabilities into account. I hope my previous post (where all the irrelevant amounts are discarded) clarified this.
han Posted July 18, 2007

Oh, and to the people who want to switch, I always found this an entertaining thought: if you are going to switch anyway, why open the first envelope? Choose which one you will pick and then quickly pick the other one!
bid_em_up Posted July 18, 2007

> Oh, and to the people who want to switch, I always found this an entertaining thought: if you are going to switch anyway, why open the first envelope? Choose which one you will pick and then quickly pick the other one!

And then swap back and forth indefinitely, since it is now equally likely that you have picked the wrong one. :)
Fluffy Posted July 18, 2007

> Here is a similar problem. There are two identical boxes, and each box contains two identical envelopes. In the first box the envelopes contain $2,500 and $5,000; in the second box the envelopes contain $5,000 and $10,000.

I don't think this is a similar problem. If I run a simulation on it, it would be correct to switch:

$2,500 -> switch: you will win +$2,500 or +$7,500
$5,000 -> switch: you will win +$5,000 or lose -$2,500
$10,000 -> do not switch (-$5,000 or -$7,500)

Which relates to the initial problem: the higher the amount you have, the less you want to switch. So what am I missing?
bid_em_up Posted July 18, 2007

> > > Han is right. Fluffy's reference to an upper limit is close, but not quite it. Suppose the probabilities of the first envelope containing x dollars are 1/2 for $1, 1/4 for $2, 1/8 for $3, 1/16 for $4, 1/32 for $5, 1/64 for $6, etc. This allows for no upper limit, but Han's argument still holds.
> >
> > The problem with these statements is that the envelopes in question contain a finite (and definite) amount, $5000. All the other "probabilities" are meaningless.
>
> It is not meaningless. You will only be able to understand the problem if you take the complete distribution of probabilities into account. I hope my previous post (where all the irrelevant amounts are discarded) clarified this.

Yes it is. 1.25*(N+1) is still greater than N+1. :) And it doesn't matter what amounts are in the envelopes, as long as the amount in the other envelope is always exactly 1/2 A or 2A.

I won't argue math with you, though. But logically, it does not make sense. At least not to me.
Fluffy Posted July 18, 2007

I don't understand what Han and bid_em_up are talking about :) What I do understand is that you can pick a random number, each value equally probable, between X and Y, but you cannot do that between X and infinity.
luke warm Posted July 18, 2007

well i'm not a mathematician, but (someone said this above) my thinking is that, given the original problem justin gave, i can either lose 50% of the value i already have or gain 100%... is that correct or not? if so, i'll switch... as for han's question (why not just pick an envelope and immediately 'change your mind'), why not indeed?