
Smart math people, help?



For instance, consider the following classic problem, known as Bertrand's paradox.

 

(Note: I am using the site http://www.cut-the-knot.org/bertrand.shtml for the write-ups.)

 

Question: "Given a circle. Find the probability that a chord chosen at random be longer than the side of an inscribed equilateral triangle."

 

There are at least three ways of looking at it:

 

1) Probability = 1/3

 

We have to choose randomly two points on a circle and measure the distance between the two. Therefore, the only important thing is the position of the second point relative to the first one. In other words, the position of the first point has no effect on the outcome. Thus let us fix the point A and consider only the chords that emanate from this point. Inscribe the equilateral triangle with one vertex at A; the chord is longer than the triangle's side exactly when the second endpoint falls on the arc between the other two vertices, which covers 1/3 of the circumference. Thus 1/3 of the outcomes will result in a chord longer than the side of an equilateral triangle.

 

2) Probability = 1/4

 

A chord is fully determined by its midpoint. Chords whose length exceeds the side of an equilateral triangle have their midpoints inside a smaller circle with radius equal to 1/2 that of the given one. Hence its area is 1/4 of the area of the big circle, which also gives the proportion of favorable outcomes: 1/4.

 

3) Probability = 1/2

 

A chord is fully determined by its midpoint. Chords whose length exceeds the side of an equilateral triangle have their midpoints closer to the center than half the radius. If the midpoints are distributed uniformly over the radius (instead of over the area, as was the case in the second solution), the probability becomes 1/2.

 

Which of the three answers is right?
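
A quick Monte Carlo check makes the disagreement concrete. This is only an illustrative sketch (not part of the original write-up): take a unit circle, so the inscribed equilateral triangle has side √3, and sample chords by each of the three methods above.

```python
# Illustrative sketch only: sample "random" chords of a unit circle three ways
# and count how often the chord beats sqrt(3), the inscribed triangle's side.
import math
import random

N = 200_000
SIDE = math.sqrt(3)

def by_endpoints():
    # Method 1: two independent uniform points on the circumference.
    a, b = random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi)
    return 2 * abs(math.sin((a - b) / 2))

def by_midpoint_area():
    # Method 2: midpoint uniform over the disc (rejection sampling).
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return 2 * math.sqrt(1 - x * x - y * y)

def by_midpoint_radius():
    # Method 3: midpoint uniform along a fixed radius.
    d = random.uniform(0, 1)
    return 2 * math.sqrt(1 - d * d)

for name, chord in [("endpoints", by_endpoints),
                    ("midpoint over area", by_midpoint_area),
                    ("midpoint along radius", by_midpoint_radius)]:
    p = sum(chord() > SIDE for _ in range(N)) / N
    print(f"{name}: {p:.3f}")  # prints roughly 0.333, 0.250 and 0.500
```

All three samplers are perfectly "uniform"; they just answer different questions.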

I think I've got this one. The problem is with the informal phrase "chosen at random". To calculate the probability of a result, you need to know the probability distribution of the input. Even if you assume it means a uniform distribution (the typical layman definition), is it uniform along the circumference (result 1), uniform over the area of the disc for the midpoint (result 2), or uniform along a radius for the midpoint (result 3)? Going 1/4 of the way around the circumference doesn't result in the same chord as choosing a midpoint 1/4 of the way towards the center.

Exactly!

 

Consider the following game now:

 

I choose a chord at random.

 

I give you two options:

 

i) If it is longer than the side of the equilateral triangle, I give you $2.50; otherwise you give me $1.

 

ii) If it is longer than the side of the equilateral triangle, you pay me $2.50; otherwise I pay you $1.

 

You can choose whichever option you prefer and let me know before we start the game.

 

Will you play this game with me and which option will you choose?
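
For what it's worth, here is the expected value of each option under the three interpretations above (my arithmetic, not part of the original post):

```python
# Expected value per round of the two options, for each P(chord is long).
for label, p in [("1/3", 1 / 3), ("1/4", 1 / 4), ("1/2", 1 / 2)]:
    ev_i = p * 2.5 - (1 - p) * 1.0   # option i: win $2.50 if long, pay $1 otherwise
    ev_ii = -ev_i                    # option ii is the exact reverse
    print(f"P(long) = {label}: option i {ev_i:+.3f}, option ii {ev_ii:+.3f}")
# Which side of the bet is good depends entirely on how "at random" is defined.
```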

 

[edit] 100th! :) [/edit]


but i still don't quite understand

 

the 'givens' of justin's problem were:

1) there are 2 and only 2 envelopes

2) the value of 1 of the envelopes is exactly half the value of the other

3) you open an envelope and it contains $5,000

 

are these statements accurate?

1) if i switch i'll either have $2,500 or $10,000

2) if i switch and am "wrong" i'll lose 50% of what i have in hand

3) if i switch and am "right" i'll gain 100% of what i have in hand

 

assuming one can live with oneself if wrong, why is it not better to switch? layman's terms please

You are assuming that being right and being wrong are equally likely. This can't be right for all possible amounts of money you find in the envelope, for the reasons explained by several in this thread.

arend, i'm only assuming these things, and only because they are givens of the problem:

 

1) there are 2 and only 2 envelopes

2) the value of 1 of the envelopes is exactly half the value of the other

3) you open an envelope and it contains $5,000

 

from an intuitive pov i can't see why it matters what "... all possible amounts of money..." can be, since i'm told that the envelope i'm given has $5,000 in it.. i understand what you and others are saying about theoretical amounts, but once it's known how much is in my envelope, it's also known how much is in the other (either 1/2 mine or 2x mine)

 

now that is either correct or it isn't... if it is, and if (i'm approaching this as if it actually happened) i have a choice between guaranteed gains of money regardless of my decision (after all, i had nothing), and if the "pot" pays better odds to gamble, i'll gamble (since i can't lose in any case)... it seems to me that i'll in essence be betting my $5,000... if i lose the bet, i win $2,500... if i win the bet, i win $10,000... why am i wrong to think this?

We have done our best. It's not that easy to explain; actually, I didn't fully understand the problem before David pointed me to the consequences of assuming a probability distribution with infinite mean.

yes, i know you and others have done your best, and i appreciate it, but i don't see why it matters what the probability distribution etc is since i'm known to have a $5,000 envelope in hand... the only possibilities are: the remaining envelope contains either $2,500 or $10,000... why is that incorrect? why should i not switch every time?

It completely depends on the way the numbers are chosen.

why? once i'm known to hold an envelope worth either 1/2 or 2x the remaining envelope, why should i care how the numbers were chosen? i know i'm a winner regardless, the only question is how big a winner


from an intuitive pov i can't see why it matters what "... all possible amounts of money..." can be, since i'm told that the envelope i'm given has $5,000 in it.. i understand what you and others are saying about theoretical amounts, but once it's known how much is in my envelope, it's also known how much is in the other (either 1/2 mine or 2x mine)

 

now that is either correct or it isn't... if it is, and if (i'm approaching this as if it actually happened) i have a choice between guaranteed gains of money regardless of my decision (after all, i had nothing), and if the "pot" pays better odds to gamble, i'll gamble (since i can't lose in any case)... it seems to me that i'll in essence be betting my $5,000... if i lose the bet, i win $2,500... if i win the bet, i win $10,000... why am i wrong to think this?

OK, to play this game assume that you had to pay the amount in the envelope which was shown to you.

 

Now, would you switch?


I still think that the male-and-female-weight problem should be easier to understand but since no-one seems to appreciate it maybe I should stop promoting it  :)

I liked it.

 

This is the best water cooler thread I've seen. Particularly because I assumed from the title it was going to be the 3-envelope restricted choice problem which has been done to death many times.

 

Would you believe I studied maths for 7 years at what is usually considered one of the world's top universities for mathematics and had never even met this problem?


I still think that the male-and-female-weight problem should be easier to understand but since no-one seems to appreciate it maybe I should stop promoting it  :)

 

Helene, what I wasn't sure about in your example was if you were adding a whole other dimension to the problem that I didn't believe was there. The dimension I am querying is whether you believe this is a signalling problem (in the economics sense, not the bridge sense). That is to say, if we are going to adjust our probabilities based on the amount we have, then the amounts are acting as a signal. The classic example of the signalling problem is Spence's model of going to college or not to signal whether you are smart or not. (Modern versions stress that it's not just intelligence, but intelligence and diligence combined.) I didn't feel this was really part of the issue. I think it does attest just fine to the fact that the probabilities aren't equally likely, but in a very specific way. We have already noted that the ex ante probabilities of the envelope amounts cannot all be equally likely, so we are left with some distribution of envelopes that we have not been told. In the wiki article they do offer up a distribution that works. Finally, we know that if the number of possible amounts is finite, we have a solution already.


Helene, what I wasn't sure about in your example was if you were adding a whole other dimension to the problem that I didn't believe was there. The dimension I am querying is whether you believe this is a signalling problem (in the economics sense, not the bridge sense). That is to say, if we are going to adjust our probabilities based on the amount we have, then the amounts are acting as a signal.

I had never heard the term "signal" in this context before, but if it just means that the amount in envelope A carries information that helps you estimate the probability that envelope A is the one with the large amount, then yes, that's the whole point. Just like the event that person A weighs 100 kg influences the probability that it's a male. Exactly the same thing. I didn't add any new dimension, just used people instead of envelopes because they are more familiar.
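
To put a number on the analogy, here is a simple Bayes update. The weight distributions and the 50/50 prior below are my own illustrative assumptions, not something from the thread; the point is only that a 100 kg observation shifts the odds, just as a large amount in envelope A would.

```python
# Tiny Bayes sketch with assumed weight distributions (purely illustrative).
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

prior_male = 0.5
w = 100  # observed weight in kg
like_male = normal_pdf(w, mu=85, sigma=12)    # assumed male weight distribution
like_female = normal_pdf(w, mu=68, sigma=10)  # assumed female weight distribution
posterior_male = (like_male * prior_male) / (
    like_male * prior_male + like_female * (1 - prior_male))
print(f"P(male | 100 kg) ~ {posterior_male:.2f}")  # well above 0.5 under these assumptions
```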


I had never heard the term "signal" in this context before, but if it just means that the amount in envelope A carries information that helps you estimate the probability that envelope A is the one with the large amount, then yes, that's the whole point. Just like the event that person A weighs 100 kg influences the probability that it's a male. Exactly the same thing. I didn't add any new dimension, just used people instead of envelopes because they are more familiar.

http://en.wikipedia.org/wiki/Signaling_%28economics%29

 

You can decide if this is what you meant or not. :)


Jimmy, it is not a given of the problem that switching will be right 50% of the time and wrong 50% of the time.

(And some posts have tried to explain why this can't ever be a given.)

i agree with you arend (and the others) on that point... again, i'm only looking at this from a practical pov, and that pov tells me that since i have $0 when i start i can walk away with $2,500; $5,000; $10,000... now that is either true or it isn't

 

that pov also tells me that a gain of 100% is worth the risk of a 50% loss, most especially since that 50% loss *still* results in a net gain of $2,500... since, to me, it doesn't matter whether or not it's 50/50 that i'll be 'right' to switch, but only that a 100% gain is possible vs. a 50% loss, i'll switch

OK, to play this game assume that you had to pay the amount in the envelope which was shown to you.

 

Now, would you switch?

absolutely not... now i am guaranteed $5,000 if i walk away, a 100% gain if i switch to the envelope with $10,000, or a 200% loss if i switch to the envelope with -$5,000... am i wrong in that?


OK, to play this game assume that you had to pay the amount in the envelope which was shown to you.

 

Now, would you switch?

absolutely not... now i am guaranteed $5,000 if i walk away, a 100% gain if i switch to the envelope with $10,000, or a 200% loss if i switch to the envelope with -$5,000... am i wrong in that?

No. If you walk away you get nothing.

 

In order to get any money, you have to pay the amount of the first envelope you saw.

 

So if you decide not to switch, you pay $5000 and get back $5000. If you decide to switch, you may lose $2500 or gain $5000.


I had never heard the term "signal" in this context before, but if it just means that the amount in envelope A carries information that helps you estimate the probability that envelope A is the one with the large amount, then yes, that's the whole point. Just like the event that person A weighs 100 kg influences the probability that it's a male. Exactly the same thing. I didn't add any new dimension, just used people instead of envelopes because they are more familiar.

http://en.wikipedia.org/wiki/Signaling_%28economics%29

 

You can decide if this is what you meant or not. :)

Oh, but that's a two-player game. The envelope problem is a one-player game, totally different. Except maybe if you involve some "bank" that gives you the envelopes and tries to influence your decision for some reason. Suppose the bank wants to keep its loss (defined as the switching loss; the amount in envelope A is paid for by external funding) as small as possible, by choosing a probability distribution that makes your expected gain from switching as close as possible to zero. Maybe that's a meaningful (non-paradoxical) game. (There would have to be some constraints, otherwise the bank could just choose to put $0 in every envelope.)

 

This signaling thing is also a concept in evolutionary biology. For example, it could be that the male peacock developed its impractical tail to signal that it's strong enough to afford risky behavior. Some sociobiologists think that risky behavior by young males (crime, drug abuse, sky-diving, 5-card preempts) is the same kind of phenomenon.


That informational advantage is that we know that we are on the winning side of the equation because we know that the amount is positive. If the amount was negative (do you owe $5000 or do you owe double or half that), then a switch would not benefit us.

Yes, but it's not impossible to set up the game such that the amounts are known to be positive. So requiring symmetry between negative and positive possible amounts would just avoid the paradox, not solve it. What Han, I and others (and the Wiki article) have tried to explain solves the paradox, at least in the case where the expected amount is finite.
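
As a concrete finite-mean example (my own sketch, not from the thread): let the smaller amount be uniform on $1..$100, so the pair is (x, 2x), and you open a random envelope. Always switching gains nothing on average, and conditioning on the amount you see shows exactly where the naive 50/50 reasoning goes wrong.

```python
# Simulation sketch of a finite-mean two-envelope setup (illustrative only).
import random
from collections import defaultdict

N = 200_000
keep = switch = 0.0
gain_given_seen = defaultdict(list)
for _ in range(N):
    x = random.randint(1, 100)        # smaller amount, uniform on $1..$100
    pair = [x, 2 * x]
    random.shuffle(pair)
    opened, other = pair
    keep += opened
    switch += other
    gain_given_seen[opened].append(other - opened)

print(f"average if you always keep:   {keep / N:.2f}")    # both come out ~75.75
print(f"average if you always switch: {switch / N:.2f}")
# Conditional on what you see, the picture is lopsided: small amounts favor
# switching, amounts above $100 can only be the 2x envelope, so switching loses.
avg = lambda xs: sum(xs) / len(xs)
print(f"avg gain from switching after seeing  $60: {avg(gain_given_seen[60]):+.2f}")
print(f"avg gain from switching after seeing $150: {avg(gain_given_seen[150]):+.2f}")
# The two effects cancel exactly, which is why always-switching gains nothing.
```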


OK, to play this game assume that you had to pay the amount in the envelope which was shown to you.

 

Now, would you switch?

absolutely not... now i am guaranteed $5,000 if i walk away, a 100% gain if i switch to the envelope with $10,000, or a 200% loss if i switch to the envelope with -$5,000... am i wrong in that?

No. If you walk away you get nothing.

 

In order to get any money, you have to pay the amount of the first envelope you saw.

 

So if you decide not to switch, you pay $5000 and get back $5000. If you decide to switch, you may lose $2500 or gain $5000.

ahh i misunderstood the game.. since i can now lose my own money, i'd probably just walk away... no harm, no foul... i'd have seen this game as a big waste of time and wouldn't even have played, since there's a gamble involved and i think there are better gambling games that give me better odds


OK, to play this game assume that you had to pay the amount in the envelope which was shown to you.

 

Now, would you switch?

absolutely not... now i am guaranteed $5,000 if i walk away, a 100% gain if i switch to the envelope with $10,000, or a 200% loss if i switch to the envelope with -$5,000... am i wrong in that?

No. If you walk away you get nothing.

 

In order to get any money, you have to pay the amount of the first envelope you saw.

 

So if you decide not to switch, you pay $5000 and get back $5000. If you decide to switch, you may lose $2500 or gain $5000.

ahh i misunderstood the game.. since i can now lose my own money, i'd probably just walk away... no harm, no foul... i'd have seen this game as a big waste of time and wouldn't even have played, since there's a gamble involved and i think there are better gambling games that give me better odds

Well, this is all hypothetical.

 

If you reason as B in the first post (by Jlall) did, you expect to gain by switching.

 

And if you play this game a few thousand times, according to B, it must be a pretty good game for the player (B would gladly play this game).

 

The point of the game is to explain a paradox (which might actually be applicable in the real world gambling games you wish to play).

 

If you look at theory from a practical viewpoint, most of it would seem like a big waste of time :lol:
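
Spelling out the calculation B presumably has in mind (my reconstruction from the thread, not a quote): treat "the other envelope is double" and "the other envelope is half" as 50/50 after seeing the $5,000.

```python
seen = 5000
naive_ev_if_switch = 0.5 * 2 * seen + 0.5 * seen / 2  # = 6250
print(naive_ev_if_switch - seen)                       # apparent gain of +1250 per play
# The catch, as others explain above: no proper distribution over the amounts
# makes this 50/50 split true for every observed amount, and the "always gain
# by switching" conclusion only survives with an infinite-mean prior.
```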


Small side note: the martingale (or Gambler's Ruin) usually fails in the modern world against the house's bet limit, not the gambler's initial stake. This makes the ruin happen more often, and the house would rather nail someone for $10,000 several times than $1,000,000 on the off chance it happens. Anyone who plays the martingale isn't going to get it when they hit the losing run anyway.

 

Loved the story, however, about the Pelayo family. OTOH, that takes more work than I am willing to spend.

Michael.


i'm not sold on the gambler's ruin being a loser, per se... i know a guy who always bet the 'don't come' in craps, starting with $100 and doubling up... he always started with $10,000 and would quit when he won $200... it didn't matter if he won that $200 on the first two rolls, he'd quit when up $200... i never saw him lose and he never admitted to losing money, but who knows

Gambler's ruin is not particularly interesting. In finite time you get a small chance of an enormous loss and a big chance of a small gain. The expectation is clearly zero. Whether your subjective utility of such a deal is good or bad is a matter of taste, of course, but I think most economists would say it's bad and that people who claim it's good for them have wrong ideas about their own preferences.

 

And in infinite time, the problem becomes meaningless.


And you have to take into consideration that in a normal gambling world it is always the case that the ante is lower than the expectancy so the gambler's ruin will have a total of a negative expectancy.

And you have to take into consideration that in a normal gambling world it is always the case that the ante is lower than the expectancy so the gambler's ruin will have a total of a negative expectancy.

Not sure how you mean to use the word ante, however, this statement doesn't seem quite right.

 

Let's consider a game of poker with 6 identically skilled players. In this case, the expected value for any player on any hand is going to be zero. By definition, any positive ante is going to be greater than the expected value. If the casino is charging a table fee or a rake, it will only increase the skew.


not speaking for gwen, but it seems that she might be simply comparing the GR to the expectancy of a skilled poker player, and if that's true i certainly hope the ante is lower than what one would expect to win (over the course of a session)... i don't know how richard's example figures in, except maybe in a strict mathematical sense, but i don't think there's any such thing as a table full of identically skilled players

not speaking for gwen, but it seems that she might be simply comparing the GR to the expectancy of a skilled poker player, and if that's true i certainly hope the ante is lower than what one would expect to win (over the course of a session)... i don't know how richard's example figures in, except maybe in a strict mathematical sense, but i don't think there's any such thing as a table full of identically skilled players

I stand by my original point. The notion of an “ante” is at best tangentially related to “Gambler’s Ruin”. Both “ante” and “Gambler’s Ruin” have very specific meanings, and the very fact that “Gambler’s Ruin” applies to games like Blackjack and Craps that don't have any kind of ante should be a dead giveaway.

 

An ante is best illustrated using the game of poker. At the start of each deal, each player contributes a set amount of money into the pot. The amount of the ante could be 25 cents, or a dollar, or whatever. The ante serves a very important purpose. Without an ante, players could adopt an ultra-conservative playing style (for example, check every hand and only place a bet if you are dealt a straight flush; this would cause the game to degenerate into something ridiculous). Adding an ante to the game accomplishes two ends:

 

1. Each and every hand eats away at a player's bankroll. You need to play a more aggressive style or you are going to get nickel-and-dimed to death.

 

2. There is suddenly some money in the pot to fight over.

 

There are variants to the ante system (for example, with “dealer antes” the dealer contributes the ante for the entire table). Regardless, the basic purpose of the ante structure is to make the game interesting.

 

Gambler's ruin is completely different. Gambler's ruin is a simple function of the fact that players can't wager infinite amounts of money. These players either have a finite bankroll or (alternatively) there is a limit on the maximum amount that one can bet. Let's look at a simple martingale strategy using LukeWarm's example. We have a player with a $10,000 bankroll who wants to win $200. Furthermore, let's assume that he is making a “fair” 50-50 bet with an expected value of zero. If he wins, he will take his money and run. If he loses, he's going to increase his bet sufficiently that he can cover his entire loss plus win the $200 that he originally wanted.

 

50% of the time, your friend will win $200 on the first roll of the dice. If he wins, he walks away.

 

If he loses, your friend increases his bet size to $400. 50% of the time, he’ll win and walk away. 50% of the time, he’ll lose and be forced to increase his bet size to $800. (Right now, he’s in the hole $600 for the first two losing bets, plus he still wants to win $200)

 

Once again, your friend will win $200 50% of the time. Unfortunately, if your friend loses again, he needs to increase his bet size to $1600. Guess what happens: 50% of the time, your friend gets to walk away with $200. If he loses, he needs to increase the bet size to $3200.

 

And now, we hit the point where things get ugly. 50% of the time, your friend will finally win his $200. Unfortunately, guess what happens if your friend loses. Your buddy has now lost $6200. In order to have any chance of recovering his losses and winning the original $200, your friend would now need to bet $6400. Unfortunately, his bankroll is now down to $3800. Game over. Your friend is forced to walk away from the table after losing $6200.

 

In order for this to happen, your friend would have to lose 5 bets in a row. The chance that this would happen is very slim: 0.5^5 = 0.03125. 96.875% of the time, your friend gets to walk away with an extra $200. However, the remaining 3.125% of the time, he has lost $6200.

 

96.875% of the time your friend is going to win $200 (96.875% * $200 = $193.75)

3.125% of the time your friend is going to lose $6200 (3.125% * $6200 = $193.75)

 

Expected value = zero

 

In actuality, casinos don’t offer “fair” 50% / 50% games. They are in business to make money and a naïve “strategy” like a martingale doesn’t offer any protection.
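
A short simulation of exactly this doubling sequence confirms the arithmetic and shows what a house edge does to it. This is a sketch under the same assumptions as above ($10,000 bankroll, $200 target, each bet sized to recover all losses plus the target); the 0.493 win probability is just a stand-in for a typical small house edge.

```python
# Simulation sketch of the escalating-bet strategy described above.
import random

def martingale_session(bankroll=10_000, target=200, p_win=0.5):
    # Bet enough to recover all losses so far plus the target; stop on a win,
    # or when the next bet can't be covered out of what's left of the bankroll.
    lost = 0
    while True:
        stake = lost + target
        if stake > bankroll - lost:
            return -lost          # ruin: walk away down 'lost' (here, $6200)
        if random.random() < p_win:
            return target         # win: walk away up 'target'
        lost += stake

def average_result(p_win, trials=200_000):
    return sum(martingale_session(p_win=p_win) for _ in range(trials)) / trials

print(f"fair bet (p = 0.50):          {average_result(0.5):+.2f} per session")
print(f"small house edge (p = 0.493): {average_result(0.493):+.2f} per session")
# With p = 0.5 the average hovers around zero, exactly as computed above;
# any house edge pushes it clearly negative.
```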


David: Wow! I hadn't thought about the problem enough. I find it very interesting, as well as highly counter-intuitive, that there can be a distribution such that the expectation is always higher when you swap[1]. Thanks for doing the analysis on that.

 

[1] Though it perhaps shouldn't be a surprise that this kind of distribution leads to counter-intuitive results. After all, the martingale strategy (keep escalating your stake to cover all previous losses) is a guaranteed way of making money even with 99% odds of losing each bet, if there is no limit to the funds available to you (or to the size of stake you can wager).


