Many of you are probably familiar with the St. Petersburg paradox. It goes as follows. I offer to play a game with you: I will flip a fair coin repeatedly until it comes up heads. If the first time it comes up heads is on the nth toss, I will pay you 2^n dollars. How much are you willing to pay to play this game?
Well, the probability that the coin comes up heads on the first toss is 1/2; in this case you get 2 dollars. The probability that it comes up heads for the first time on the second toss is 1/4; in this case you get 4 dollars. In general, the probability that the coin comes up heads for the first time on the nth toss is 1/2^n, and in this case you get 2^n dollars. So your expected winnings are
2(1/2) + 4(1/4) + 8(1/8) + 16(1/16) + ...
and each term here is 1; the series diverges. So you should be willing to pay an infinite amount of money to play this game. Yet you're not. (If you are, let me know. I'd like to have an infinite amount of money. Notice that you will only win a finite amount of money playing this game, so even after I pay you I will still have an infinite amount of money.)
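If you want to see the divergence empirically, here is a minimal Python sketch (the function name is mine) that simulates the game. The sample average never settles down as you play more rounds, which is what a divergent expectation looks like in practice.

```python
import random

def st_petersburg():
    """Play one round: flip a fair coin until heads; if heads first
    appears on toss n, the payout is 2^n dollars."""
    n = 1
    while random.random() < 0.5:  # tails with probability 1/2
        n += 1
    return 2 ** n

# The running average drifts upward (erratically) as the number of
# rounds grows, instead of converging to any finite expected value.
for rounds in (100, 10_000, 1_000_000):
    avg = sum(st_petersburg() for _ in range(rounds)) / rounds
    print(f"{rounds:>9} rounds: average payout ${avg:.2f}")
```

Any single run is finite, of course; the point is that no matter how many rounds you average over, rare huge payouts keep dragging the mean up.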
You may suspect that this is because you know the person you're betting against doesn't have an infinite amount of money, so your actual expected winnings don't come from summing the whole infinite series. For example, let's say Bill Gates is willing to play this game with you, and let's say his net worth is 2^36 dollars. Then your expected winnings are
2(1/2) + 4(1/4) + 8(1/8) + ... + 2^35(1/2^35) + 2^36(1/2^36) + 2^36(1/2^37) + ...
where the first thirty-six terms are all 1; then what follows is 1/2, 1/4, 1/8, and so on, which sums to 1. So you should be willing to spend up to $37 to play this game if the Gates fortune is backing it.
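This capped sum is easy to check numerically; the sketch below truncates the series at 200 terms, which is more than enough for the geometric tail to vanish at floating-point precision.

```python
# Expected winnings when payouts are capped at a 2^36-dollar fortune:
# each term is min(2^n, 2^36) / 2^n -- the first 36 terms contribute
# 1 apiece, and the tail 1/2 + 1/4 + ... contributes 1 more.
cap = 2 ** 36
ev = sum(min(2 ** n, cap) / 2 ** n for n in range(1, 200))
print(ev)  # ≈ 37.0
```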
Of course, you might argue -- and a lot of economists have -- that $2n is not worth twice as much to you as $n. The usual assumption here is that the utility of $n is something like log_2(n) "utils" (I'm not sure how they handle the problem of the units here), and that people play to maximize their expected number of utils, not their expected number of dollars. Then the "expected value" of the previous game, in utils, is
1(1/2) + 2(1/4) + 3(1/8) + ... = 2
and you should be willing to pay that amount of money which is worth 2 utils to you, namely $4.
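The same two lines of arithmetic, as a sketch: under the assumed utility u($x) = log_2(x), heads first on toss n pays 2^n dollars, worth n utils, with probability 1/2^n.

```python
# Expected utility of the original game under u($x) = log2(x):
# sum of n / 2^n for n = 1, 2, 3, ... converges to 2.
expected_utils = sum(n / 2 ** n for n in range(1, 60))
print(expected_utils)  # ≈ 2.0

# 2 utils is the utility of the dollar amount x with log2(x) = 2,
# i.e. x = $4 -- the fair price under this utility function.
fair_price = 2 ** expected_utils
print(fair_price)  # ≈ 4.0
```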
But I can construct a similar paradox. If the coin comes up heads on the first toss, I pay you $2. If it comes up heads on the second toss, I pay you $4. If it comes up heads on the third toss, I pay you $16. If it comes up heads on the fourth toss, I pay you $256, and so on, paying 2^(2^(n-1)) dollars on the nth toss. Then you receive 1 util if the coin comes up heads on the first toss, 2 utils on the second toss, 4 on the third toss, 8 on the fourth toss, and so on; each term of the expected-utility series is 2^(n-1)/2^n = 1/2, so the series diverges and you should still be willing to pay infinitely much to play this game. And in general one can construct such a payoff sequence for any unbounded utility function.
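A quick sketch of the doubled-up game confirms that every term of the expected-utility series is exactly 1/2:

```python
import math

# Super-exponential payoffs: heads first on toss n pays 2^(2^(n-1))
# dollars, which is worth 2^(n-1) utils under u($x) = log2(x).
# Each term of the expected-utility series is 2^(n-1) / 2^n = 1/2.
for n in range(1, 6):
    payout = 2 ** (2 ** (n - 1))       # 2, 4, 16, 256, 65536
    utils = math.log2(payout)          # 1, 2, 4, 8, 16
    term = utils / 2 ** n              # always 0.5
    print(f"toss {n}: ${payout}, {utils:g} utils, series term {term}")
```

The same construction works against any unbounded utility function: just pick the payout on toss n to be an amount worth 2^(n-1) utils.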
The somewhat counterintuitive resolution that I heard for this recently is that utility functions must be bounded. So say $1 gives me a certain amount of utility. Then in order to make it impossible to construct a St.-Petersburg-style wager, for which I would be willing to pay an infinite amount of money, there must be some K such that any amount of money gives me at most K times as much utility as $1. I'm not sure I believe this, either... it just goes to show that sometimes expected value might not be the way to go.