Just reposting the game from that article:

**WARNING: THESE ARE THEORETICAL GAMES. Try not to bias yourself by how much YOU value $1000 compared to how a millionaire values $1000 (for the purpose of the puzzle, the utility of money is the same for all people). Also, we assume God has an infinite amount of money with him and does not lie when he says he will pay you, so please don't give arguments like "put the money on the table and I will play" (replace $1 by $0.001 or any such figure if you want to satisfy yourself practically).**

God offers you the option of playing a game, *exactly once*, against me. This is how the game works. God will toss a fair coin until a tail (T) turns up. The sequence of tosses H^{n}T earns you 2^{n} dollars. More explicitly, a T on the first toss gives you 1 dollar, a head followed by a tail (HT) gives you 2, HHT gives you 4, HHHT gives you 8, and so on. As soon as the T turns up, we settle accounts and leave, never to see each other again. However, there is a constant pre-agreed charge P you must pay to play this game against me (say $10,000). Up to what price P are you willing to play this game?

*Analysis:* The probability of T is 1/2, of HT is 1/4, of HHT is 1/8, and so on (1/2 · 1/2 · 1/2 …, as the tosses are independent).

Hence your expected value of earnings for this game,

E(earnings)

= P(T)·Earnings(T) + P(HT)·Earnings(HT) + P(HHT)·Earnings(HHT) + …

= (1/2 · 1) + (1/4 · 2) + (1/8 · 4) + (1/16 · 8) + …

= 1/2 + 1/2 + 1/2 + 1/2 + … = ∞ (infinite).
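As a quick numerical sanity check (my own Python sketch, not part of the original post): every term of the series equals exactly 1/2, so the partial sums grow without bound.

```python
# Each outcome H^n T has probability 1/2**(n+1) and pays 2**n dollars,
# so every term of the expectation series is exactly 1/2.
terms = [(0.5 ** (n + 1)) * (2 ** n) for n in range(100)]
print(terms[:4])   # [0.5, 0.5, 0.5, 0.5]
print(sum(terms))  # 50.0 -- the partial sum over the first k outcomes is k/2
```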

However, the 1000th term of this series of halves corresponds to an extremely improbable outcome (probability 1/2^1000). If you believe in expected values, you should be willing to pay any finite amount of money to play this game.

But if you think it over, the probability that you will win at least $1000 is only about 0.001 (you need the first ten tosses to be heads, probability 1/2^10 ≈ 0.001), which is very small. So you should not be willing to pay an arbitrarily large amount of money; your intuition will not allow you to stake unbounded sums. Can you explain the paradox?
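One way to feel why intuition balks is to simulate the game (a hedged sketch; `play_once` is my own helper, not from the post). Even over a hundred thousand plays, the sample mean of the payouts typically stays modest, nowhere near the infinite expectation.

```python
import random

def play_once():
    """Toss a fair coin until tails; the payout is 2**(number of heads)."""
    heads = 0
    while random.random() < 0.5:  # heads with probability 1/2
        heads += 1
    return 2 ** heads

# Exact chance of winning at least $1024 (first ten tosses all heads):
p_big_win = 0.5 ** 10
print(p_big_win)  # 0.0009765625, roughly 1 in 1000

# Empirical mean payout: heavy-tailed, but small for any realistic N.
N = 100_000
mean_payout = sum(play_once() for _ in range(N)) / N
print(mean_payout)
```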

**Solution:** See the Wikipedia article on the St. Petersburg paradox.

You know, what I really had in mind is something like probability re-weighting, where really low-probability events are ignored, as in prospect theory. I need to read up a bit...

Nice observation.

So, there are two ways to see this. I don't know which is better. Should we use both together?

1) Low-probability events should be given less weight, or ignored altogether.

2) Too much money is really too much. So, the law of diminishing marginal utility should be taken into account.
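Point 2 is essentially Daniel Bernoulli's classical resolution of the paradox. Under a logarithmic utility u(w) = ln(w) (a sketch of mine that ignores the player's starting wealth), the expected utility of the game is finite, and the certainty equivalent comes out to just $2:

```python
import math

# E[u] = sum over n of P(H^n T) * ln(2**n) = sum 2**-(n+1) * n * ln(2).
# Truncating at 200 terms; the tail is negligibly small.
expected_log_utility = sum(0.5 ** (n + 1) * n * math.log(2) for n in range(200))
certainty_equivalent = math.exp(expected_log_utility)
print(round(certainty_equivalent, 6))  # 2.0
```

So a log-utility player would pay at most about $2 to play once, which matches intuition far better than the unbounded expected value.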

The way I look at it: I will pay any fixed amount provided I can play the game as many times as I like. But if it is a one-time game, then I would introduce a marginal utility function, not for the winnings, but for the fee.

When basing your decisions on expectation, you must allow the law of large numbers to do its work.

Can anyone comment on "decisions in infrequent games"?

@Asad..

Precisely.

But in general, for each successive game, we make decisions on the basis of expected value alone. The expected value is the amount expected at that time, and decision making on the basis of expectations is well established.

In practical scenarios, expectation-based decision making is employed by parties that deal with large numbers. A case in point is insurance. Statistically speaking, one should never buy insurance, since the expected return for the insurer is definitely positive and thus negative for the insured (assuming the insurer's risk assessment is at least as good as the insured's, i.e. no information asymmetry favouring the insured). Then why should anyone buy insurance? In my view, one needs to introduce a utility function somewhere to make the returns positive for both parties.
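To make the insurance point concrete, here is a toy calculation with made-up numbers (the wealth, loss size, probability, and premium are all my own illustrative assumptions): with a concave utility such as ln(w), buying insurance raises the insured's expected utility even though the premium exceeds the expected loss, so the insurer profits in expectation too.

```python
import math

# Hypothetical numbers: wealth $100,000; a 1% chance of losing $90,000;
# premium $1,200, which is above the expected loss of $900.
wealth, loss, p_loss, premium = 100_000.0, 90_000.0, 0.01, 1_200.0

def u(w):
    """Log utility: each extra dollar matters less as wealth grows."""
    return math.log(w)

eu_uninsured = p_loss * u(wealth - loss) + (1 - p_loss) * u(wealth)
eu_insured = u(wealth - premium)  # premium paid for sure, loss fully covered

print(premium > p_loss * loss)    # True: insurer's expected return is positive
print(eu_insured > eu_uninsured)  # True: insuring still wins for the insured
```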

@Asad.. :) You really want to go deep into this :P

Read about decision theory on Wikipedia: http://en.wikipedia.org/wiki/Decision_theory#Choice_under_uncertainty

This theory has been accepted since 1670, and for all theoretical and non-practical purposes (:P), we can "decide" using expected value.