benlessard Posted August 5, 2008

Belgian mathematician Maurice Kraitchik proposed this puzzle: Two people, equally rich, meet to compare the contents of their wallets. Each is ignorant of the contents of the two wallets. The game is as follows: whoever has the least money receives the contents of the wallet of the other (in the case where the amounts are equal, nothing happens). One of the two men can reason: "Suppose that I have the amount A in my wallet. That's the maximum that I could lose. If I win (probability 0.5), the amount that I'll have in my possession at the end of the game will be more than 2A. Therefore the game is favourable to me." The other man can reason in exactly the same way. In fact, by symmetry, the game is fair. Where is the mistake in the reasoning of each man?
Echognome Posted August 5, 2008

See the very long and involved thread on the envelope problem.
helene_t Posted August 5, 2008

Indeed, this is the envelope problem.
cherdano Posted August 5, 2008

Why is this tougher for mathematicians than for bridge players?
NickRW Posted August 6, 2008

I ain't heard of no envelope problem. But if A is indeed < B, then the first person will be quite a bit better off. It is equally possible, though, that B < A, in which case the first person has lost more than the second person would have made with his blind stake. Either way, collectively they started with A+B and that is what they end up with.
benlessard Posted August 6, 2008

Wasn't able to find the thread about the envelope problem. Can you post the link? Thanks.
barmar Posted August 6, 2008

I think this is the thread: http://forums.bridgebase.com/index.php?showtopic=21801
Mbodell Posted August 6, 2008

"Wasn't able to find the thread about the envelope problem. Can you post the link? Thanks."

I don't know about the thread, but if you type "envelope problem" into Google and hit the "I'm Feeling Lucky" button, you'll arrive at the two envelope problem as described on Wikipedia, which has much discussion. It is a very, very famous problem.
helene_t Posted August 6, 2008

Our discussion is here: http://forums.bridgebase.com/index.php?showtopic=20292 But most of what came up during our discussion is also in the Wikipedia article, and in a more concise form, so I can recommend reading the Wikipedia article. It is an extremely interesting problem. To understand the solution fully you will have to understand the concepts of probability and infinity at a very deep level.
kenberg Posted August 6, 2008

There seem to be some (maybe mild) differences from the envelope problem. It seems to me the usual issue is trying to inject probability into a situation where it doesn't apply. For example, in the formulation here, surely a player is entitled to make use of the fact that his counterpart is willing to play. If he had one dollar in his wallet, he would be willing to play. If he had just sold his Rolls for cash, he would not be willing to play. A willingness to play is evidence of a lesser amount of cash in the wallet. Of course you can change the rules and say that someone puts a gun to the head of each person and insists that they play. Still, there are facts about how much each usually carries in his wallet that would be relevant. You can introduce probability only after you have rather strictly defined exactly how the game is to be played. To say, before defining the rules, that each person has a 50-50 shot at winning is surely not correct. Actually, once the wallets are filled, the person who has less money in his pocket has a 100% chance of winning, the other a zero percent chance, and neither person can estimate which is which until more is stated about the rules. A person who has the opportunity to play, but is not forced to play, will surely base his decision at least partly on the amount of money in his wallet. Seems more akin to poker than to either bridge or mathematics.
helene_t Posted August 6, 2008

Ken, you obviously interpret the game differently from how I interpret it. The rules said that neither player knows how much is in either wallet. This may sound weird, since one usually has more information about the contents of one's own wallet than about someone else's, even if that someone else is equally rich. However, that's how I read it.
kenberg Posted August 6, 2008

Definitely I am guilty of careless reading. Not for the first time. Still, I would say there is a need for greater precision before one can apply probability. Mainly, we have to get at "not knowing how much is in either wallet". Since wealth is stipulated as equal, and presumably limited, each player knows that he does not have more than his wealth. And then we presumably assume any amount up to his wealth is equally likely, although this is a stronger assertion than not knowing how much he has. Suppose that the possible contents of the wallets range from 1 to 5000 units. Let us say that not knowing how much is in either wallet means that any amount from 1 through 5000 is equally likely to be in either wallet. With this agreement, but not without it or some other agreement, probability applies. Now it is just a matter of observing that while Joe has a fifty percent chance of winning, and will at least double his money whenever he wins, it does not follow that the game favors him. When he holds 5 units in his wallet he has a fine chance of winning and at least doubling his 5 units. When he holds 4995 he stands a fine chance of losing, and then he loses it all. Doubling 5, or even more than doubling 5, has to compete with the times he loses 4995. No doubt we could add up all the numbers, but symmetry tells us what the answer will be: he will hold 5, and his opponent 4995, exactly as often as the reverse. The envelope problem contends (in some formulations) with the possibility of arbitrarily large amounts of money; here the wealth is limited, and it seems all paradox disappears. But you are right, I didn't read so carefully. I still do not see the connection with bridge. Mostly it is a warning against careless reading (got me) and against jumping to conclusions based on careless use of probability.
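kenberg's 1-to-5000 formulation is easy to check numerically. The sketch below (my own illustration, not from the thread) draws both wallet amounts uniformly from 1 to 5000 and applies the rule that the smaller wallet wins the other's contents; player 1's average gain comes out near zero, confirming the symmetry argument.

```python
import random

def wallet_game_gain(trials=200_000, max_amount=5000, seed=1):
    """Average gain for player 1 when both wallets are uniform on 1..max_amount.
    Rule: the player with less money receives the other player's wallet contents."""
    random.seed(seed)
    total = 0
    for _ in range(trials):
        a = random.randint(1, max_amount)  # player 1's wallet
        b = random.randint(1, max_amount)  # player 2's wallet
        if a < b:
            total += b   # player 1 has less, so wins player 2's contents
        elif b < a:
            total -= a   # player 2 has less, so player 1 loses his contents
        # tie: nothing happens
    return total / trials

print(wallet_game_gain())  # close to 0: the game is fair
```

The same loop, restricted to trials where player 1 holds a fixed small or large amount, also shows kenberg's point that holding 5 is a likely small win while holding 4995 is a likely total loss.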
EricK Posted August 6, 2008

It's a much easier problem for mathematicians than for bridge players. Mathematicians will always have nothing in their wallet, so the bet will be in their favour, whereas the contents of a bridge player's wallet will vary depending on how his last rubber bridge session went.
benlessard Posted August 7, 2008

I've noticed a fallacy on the Wikipedia page about the two envelope problem: "Let the amounts in the envelopes be Y and 2Y. Now by swapping, the player may gain Y or lose Y. So the potential gain is equal to the potential loss." This is complete BS. If you lose Y, you had the 2Y envelope, so you lose a Y that is worth half of the amount in your envelope. But if you hold the first envelope with Y and you switch, you win a Y that is the full amount of your envelope. Say you have $50 and switch: if the pair was Y = $50 and 2Y = $100, you win Y, and that Y is worth $50; if the pair was Y = $25 and 2Y = $50, you lose Y, but Y here is worth only $25. It's clear that over an infinite range, the numbers can't all have the same probability; otherwise, by adding them up you would get more than 1. With the wallet problem, it's obvious that the more money you have in your wallet, the more likely the other wallet is smaller than yours. So the probability of having the bigger wallet is 50% only if you don't look at the amount you have. Once you know the amount in your own wallet, the probability depends on that amount, even over an infinite range.
Rossoneri Posted August 7, 2008

"It's a much easier problem for mathematicians than for bridge players. Mathematicians will always have nothing in their wallet, so the bet will be in their favour."

Lol!
helene_t Posted August 7, 2008

"Suppose that the possible contents of the wallets range from 1 to 5000 units. Let us say that not knowing how much is in either wallet means that it is equally likely to have any amount from 1 through 5000 in either wallet. With this agreement, but not without it or some other agreement, then probability applies."

Spot on.
EricK Posted August 7, 2008

"I've noticed a fallacy on the Wikipedia page about the two envelope problem... This is complete BS. If you lose Y, you had the 2Y envelope, so you lose a Y that is worth half of the amount in your envelope."

You can't argue like that, because you are changing the definition of "Y". This is what gives rise to the paradox in the first place. One envelope contains Y, the other contains 2Y. There is a 50% chance that you chose the envelope with Y in it and a 50% chance that you chose the envelope with 2Y in it. Your expected holding if you stick is Y/2 + 2Y/2 = 3Y/2, and your expected holding if you swap is 2Y/2 + Y/2 = 3Y/2. The whole point of the envelope paradox is not whether it gains to swap, because quite obviously it doesn't. The point is to find the flaw in the argument which appears to show that it gains to swap.
dburn Posted August 9, 2008

"It's clear that over an infinite range, the numbers can't all have the same probability."

I am not sure what is meant by this. Whereas there is no uniform probability distribution over the infinite number of rational numbers between 0 and 1, there is such a distribution over the (greater) infinite number of real numbers between 0 and 1. This means, as I understand the literature, that if the money in the envelopes had been determined by a rational person there would be no paradox, but if it were real money there would be. It is possible, though, that I may have misunderstood.
benlessard Posted August 9, 2008

"There is a 50% chance that you chose the envelope with Y in it and a 50% chance that you chose the envelope with 2Y in it. Your expected holding if you stick is Y/2 + 2Y/2 = 3Y/2, and your expected holding if you swap is 2Y/2 + Y/2 = 3Y/2."

This is false: if there really were a 50% chance of getting 2Y by switching, you should swap the envelope. Saying that the envelopes already contain Y and 2Y, so that by switching you gain Y or lose Y, is a fallacy, since Y won't have the same value in both cases. The correct calculation starts from your envelope containing x, with either x/2 or 2x in the other envelope. Switching then yields 1/2·2x + 1/2·x/2 = 5x/4, which is higher than the x you get if you do not switch. The resolution of the problem is that over an infinite range, if you have Y in your envelope, the other envelope is more likely to have Y/2 than 2Y. With a predetermined set of envelopes, you should always switch unless you are holding the highest envelope. Take 1, 2, 4, 8, 16, 32, 64: even a rule as simple as "keep the highest number (64) when you get it, otherwise switch" makes switching profitable.
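benlessard's ladder claim can be checked with a small simulation (my own sketch, not from the thread, and one particular way of modelling "a predetermined set of envelopes": an adjacent pair from the ladder is chosen uniformly at random). "Always stick" and "always swap" both average 15.75, while "swap unless you hold 64" does strictly better.

```python
import random

def ladder_strategies(trials=300_000, seed=3):
    """Ladder 1, 2, 4, ..., 64; a pair (x, 2x) of adjacent rungs is drawn
    uniformly at random, and you receive one envelope of the pair at random.
    Returns average payoff of: always stick, always swap,
    and 'swap unless you hold the top value'."""
    random.seed(seed)
    rungs = [1, 2, 4, 8, 16, 32, 64]
    stick = swap = smart = 0
    for _ in range(trials):
        i = random.randrange(len(rungs) - 1)               # pick an adjacent pair
        chosen, other = random.choice([(rungs[i], rungs[i + 1]),
                                       (rungs[i + 1], rungs[i])])
        stick += chosen
        swap += other
        # smart rule: keep only if you are holding the known maximum
        smart += chosen if chosen == rungs[-1] else other
    return stick / trials, swap / trials, smart / trials

print(ladder_strategies())  # roughly (15.75, 15.75, 18.4)
```

Analytically, sticking and swapping both give (3/2)(1+2+4+8+16+32)/6 = 15.75, while the keep-the-top rule gains on the (32, 64) pair, raising the average to about 18.42, which matches benlessard's point that the "always switch" advantage survives only below the top of a bounded ladder.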
EricK Posted August 9, 2008

"The correct calculation starts from your envelope containing x, with either x/2 or 2x in the other envelope. Switching then yields 1/2·2x + 1/2·x/2 = 5x/4, which is higher than the x you get if you do not switch."

In my formulation of the problem, Y always has the same value independently of the envelope I happened to choose: it is simply the smaller of the two values. In your formulation, your x changes depending on which envelope you happened to choose. Imagine repeating this experiment a thousand times with a thousand different people. In each case, the envelopes contain $10 and $20. Those who swap either gain or lose $10. The average amount held by those who stick with their choice is $15, and the average amount held by those who swap is also $15. This highlights the flaw in the "1/2·2x + 1/2·x/2 = 5x/4" formula. In this case, if you open the envelope and find $10, there is actually (unbeknownst to you) a 100% chance of the other containing $20 (2x) and a 0% chance of it containing $5 (x/2). And something similar would apply if you opened your envelope and found, say, $40. But just because you don't know the probabilities doesn't mean you can arbitrarily assume they're equal to 1/2, plug them into a formula and hope to get a sensible answer.
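EricK's thousand-people experiment is easy to reproduce. A quick sketch (my own, not from the thread): the envelopes always contain $10 and $20, each player picks one at random, and both the stickers and the swappers end up averaging $15.

```python
import random

def stick_vs_swap(trials=100_000, seed=7):
    """Envelopes always contain $10 and $20; each player picks one at random.
    Returns (average held by stickers, average held by swappers)."""
    random.seed(seed)
    stick_total = swap_total = 0
    for _ in range(trials):
        chosen, other = random.choice([(10, 20), (20, 10)])
        stick_total += chosen   # sticker keeps the chosen envelope
        swap_total += other     # swapper takes the other envelope
    return stick_total / trials, swap_total / trials

print(stick_vs_swap())  # both averages are close to 15
```

Swapping moves $10 one way or the other on each trial, but the gains and losses cancel exactly, which is EricK's point: the 5x/4 formula smuggles in fifty-fifty probabilities that the fixed pair does not actually have.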