Marginal Utility

moved from AnAcceptableWayOfFailing

Wouldn't the economic theory of diminishing returns explain this behaviour? Simply put, the first $10 you own has more value to you than the next $10 you own. Call $10 a unit for simplicity: if you have $100, you have 10 units. You can toss a coin to gain or lose 1 unit. Since the 10th unit (the one you might lose) has more value to you than the 11th unit (the one you might gain), the potential gain does not justify the potential loss given a mere 50% chance of winning. Not tossing the coin is the rational thing to do!

-- OliverChung
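
Here's a minimal sketch of that reasoning in Python. It assumes an arbitrary concave utility curve (the square root of wealth), since the argument above doesn't commit to a particular one; any diminishing-returns curve gives the same answer.

  import math

  def utility(dollars):
      # Hypothetical concave utility curve: square root of wealth.
      # Any diminishing-returns curve leads to the same conclusion.
      return math.sqrt(dollars)

  wealth = 100   # ten $10 units
  stake = 10     # one unit riding on the toss

  keep = utility(wealth)
  toss = 0.5 * utility(wealth + stake) + 0.5 * utility(wealth - stake)

  print(keep)    # 10.0
  print(toss)    # ~9.99 -- lower expected utility, so keep your money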

This is probably related to why millionaires don't play the lottery as much as poor people. I remember reading an excerpt of a speech Ted Turner gave about how making a million dollars was cool, but making a billion dollars was kind of just blah. I believe the quote was "Really great sex is better than being a billionaire."

If I had $100, would I make a fair 50%-50% bet to win or lose $100?

No, not if I'm rational: With $100 I can take my girl out to a nice dinner. With $200 (should I win), I can't get a much nicer dinner. With $0 (should I lose), I'm really up a creek.

No thanks, I'd rather keep and use this $100. I'd have to stand a good chance of large gain to risk it all.

The rational decision, maximizing utility, deals with alternatives. Lawn mowing is worth more than $10 to me (regardless of whether someone is doing it for me or I am doing it for someone else) but not to the kid, which is why I won't do it for myself or my neighbor for ten bucks.

Also, when considering the 50% chance of gaining or losing $10 of $100, the $10 gained is worth less, from a utility perspective, than the $10 lost. So the rational decision is to hold in that case. Consider an extreme illustration: If you have only $100 and you need $95 to pay your loan shark, do you toss the coin?

-- GregWiley?

Conversely, if you need $105 to pay your loan shark, the rational move would be to toss the coin.

OK, scratch that illustration. Disregarding my lack of illustrative powers, however, each $10 you accumulate is worth less than the previous $10. --GregWiley?

Economics calls this "MarginalUtilityTheory".

As someone mentioned above, you still need to look at your utility function. You can't just reduce everything to an expectation value in dollars. (For you Canadians out there, that's dollar-fifties.) [Or in the UK, currently 70 pence, but that would make the math very awkward.]

So, in the initial loan-shark example, your utility function looks like this:

  f(x) where x >= 95 is (x-95) foos
  f(x) where x < 95  is (x - P) foos

A foo is an arbitrary unit of utility approximately equal to a candy bar, and P is the number of candy bars it would take to make up for the pain of having your legs broken. This varies from person to person, as does the amount owed to the loan shark. Let's fix it for the sake of discussion at 10,000 (which is close to the number of candy bars your doctor will want for setting those broken legs).

What's your expectation value in foos if you make the bet? The average of f(90) and f(110). Working it out: -9,910 and 15 average to -4,947.5.

What's your expectation value in foos if you don't make the bet? Just f(100), which is 5. This is still much, much better than -4,947.5. The rational move is to not take the bet. Of course, if you habitually made rational moves, you probably wouldn't have that little loan-shark problem.

If you owe Rocco $105, the numbers shift: f(100) = (100 - 10,000) = -9,900, f(90) = -9,910, and f(110) = 5. So now the bet gives you an expectation value of -4,952.5 foos (slightly worse than before), and no bet gives you one of -9,900 foos -- much worse than before, and much worse than the expectation value for the bet. So if the bet's your only option (and it probably isn't; quit thinking that way if you want Rocco out of your life), you should take it.
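
Those numbers can be double-checked with a few lines of Python -- a sketch of the same arithmetic, with f and P as defined above (the helper and parameter names are my own):

  P = 10000   # candy bars that make up for broken legs (fixed above)

  def f(x, debt):
      # Utility in foos: surplus over the debt if you can pay,
      # otherwise your cash minus the pain of not paying.
      return x - debt if x >= debt else x - P

  def expected_foos(debt, bet):
      # 50/50 chance of ending up with $90 or $110 if you bet; $100 if not.
      if bet:
          return 0.5 * f(90, debt) + 0.5 * f(110, debt)
      return f(100, debt)

  print(expected_foos(debt=95, bet=True))    # -4947.5
  print(expected_foos(debt=95, bet=False))   #      5  -> don't bet
  print(expected_foos(debt=105, bet=True))   # -4952.5
  print(expected_foos(debt=105, bet=False))  #  -9900  -> take the bet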

Anyway, the point being seconded here is that people don't value all dollars the same, and given the complexities of their lives, probably shouldn't. You can still apply the mathematics of expectation values, but don't just blindly apply it to dollar amounts. Figure out a (possibly very rough) utility function, and look at expectation values of that.

[Somebody want to carefully extract all the rational-expectations stuff from this page and give it its own? I would, but no obvious title leaps to mind.]

-- GeorgePaci

I think that the economics newbies among us would appreciate if somebody wrote a short page explaining how to read an EconomicUtilityFunction.

Here's a simpler example. Which of these would you choose? 1) Receiving a definite $10 million, or 2) Having a 50/50 chance of winning $50 million

Unless you're already a multi-millionaire, you'd probably choose the first one, even though choice 2 has a higher expectation value, in dollars. In terms of quality of life, choice 1 makes more sense, since the first $10 million contributes far more to your life than the next $40 million. -- JaredLevy
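
To put rough numbers on that, here's a small sketch that assumes a logarithmic utility of total wealth (the kind HubertMatthews describes below) and a hypothetical $50,000 of existing wealth; the exact figures are only illustrative.

  import math

  def U(wealth):
      # Roughly logarithmic utility of total wealth (see HubertMatthews below).
      return math.log(wealth)

  current = 50_000   # hypothetical existing wealth

  sure_thing = U(current + 10_000_000)
  gamble = 0.5 * U(current + 50_000_000) + 0.5 * U(current)

  print(sure_thing)  # ~16.12
  print(gamble)      # ~14.27 -- the certain $10 million wins, even though
                     # the gamble's expected dollar value is $25 million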

Marginal utility is something we all understand. An extra £100 (that's $145 to you guys over there) is meaningless to a millionaire but a great deal to a pauper. We all think in terms of percentages: pay rises, discounts, inflation, etc. The maths behind RiskAvoidance goes like this:

Let U(m) be the utility (i.e. the desirability) of an amount of money m. In general, because of our view of percentages, U(.) is a slowly increasing, roughly logarithmic function. Since such a function is concave, the mean 1/2 U(m + x) + 1/2 U(m - x) is less than U(m): the expected utility if you take a fair risk is less than the utility if you don't. Hence, people are risk averse. -- HubertMatthews
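
A quick numerical check of that inequality, taking U to be the natural logarithm and picking an arbitrary m and x for illustration:

  import math

  U = math.log           # "roughly logarithmic" utility
  m, x = 1000.0, 100.0   # arbitrary wealth and stake

  print(U(m))                              # ~6.9078
  print(0.5 * U(m + x) + 0.5 * U(m - x))   # ~6.9027 -- less than U(m),
                                           # so the fair bet is declined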

The Bernoullis were aware of this, and may have first noticed it in correspondence with a mathematician named Gabriel Cramer. In general, though, Daniel Bernoulli evaluated gambling decisions in dollar terms. I think von Neumann and Morgenstern developed the concept of utility extensively. (Reference: Risk, Ambiguity and Decision by Daniel Ellsberg.)


CategoryEconomics

