Economists and some psychologists theorize that most people are risk averse due to evolution. The idea is that people who are risk neutral or risk seekers die more often than people who avoid risk. I wish I could find my old economics textbook that describes this now...
This and other evidence caused my research partner and me to suggest that brokerage firms would do well to put their traders on Prozac. Nobody, including our professors, took us seriously. -- MarkAddleman
I recall it has to do with risk management. Give them $100, and the chance to enhance or lose 10% with a coin toss, they'll choose not to toss the coin. Variations tested, showing people will not take a risk if it means there's a possibility of losing. "Hoard what I have" might be the thought. -- AlistairCockburn
This is reasonable - the average return per game is zero (so the choice doesn't matter), and, even worse, a long chain of such games tends towards total loss of the wager: assuming the number of games won and games lost both equal n, the stake is multiplied by (90%)^n * (110%)^n = (90% * 110%)^n = (99%)^n, which converges to zero, and the probability of ending up near zero approaches one. (There is a correspondingly tiny chance of insanely large returns, though.)
That must be an invalid calculation. For each game, the expected return is zero. Hence no matter how many games are played, the expected return is still zero.
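Both statements are actually consistent, and can be checked exactly: the arithmetic expectation stays at $100 no matter how many tosses are played, while the typical outcome shrinks by a factor of 0.99 per win/lose pair. A small Python sketch (the function name is mine):

```python
from math import comb

def outcome_distribution(n, start=100.0):
    """Exact distribution of the stake after n fair win-10%/lose-10% tosses.

    With k wins out of n tosses, the stake is start * 1.1^k * 0.9^(n-k),
    occurring with probability C(n, k) / 2^n.
    """
    return [(start * 1.1**k * 0.9**(n - k), comb(n, k) * 0.5**n)
            for k in range(n + 1)]

n = 100
dist = outcome_distribution(n)

# The arithmetic expectation really is $100, however many tosses:
# E = 100 * (0.5 * 1.1 + 0.5 * 0.9)^n = 100 * 1^n = 100.
expected = sum(value * prob for value, prob in dist)

# But the most likely single outcome (50 wins, 50 losses) has shrunk,
# because each win/lose pair multiplies the stake by 1.1 * 0.9 = 0.99:
typical = 100 * 0.99 ** (n // 2)   # about $60.50

print(round(expected, 6), round(typical, 2))
```

So the expected return per game is indeed zero, yet almost every individual long run of games ends up well below the starting stake - the expectation is propped up by rare, enormous wins.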
The more interesting question is why people actually participate in games that offer a negative average return, such as lotteries and casino games, at least most of the time (when a jackpot has accumulated, the average return for that single game can be positive). The reason is that these games play on people's psychology. For example, Roulette looks like a fair (zero-return) game that offers the possibility of very high returns, at least if you do not consider the 0 and 00 fields. Once the 0 comes into play, the even-money fields (black/red, even/odd, high/low) become losing bets, returning on average only 36 parts out of 37. And the 00, where present, makes the remaining fields losing bets too, and thus every strategy a losing strategy. Still, the average loss per game is so small that people don't notice. But over the course of an evening with, let's say, 40 games, it adds up to a loss of about two thirds.
That calculation relies on assumptions that are atypical. Most people would not stake all their chips on a single game; they would assess their loss in relation to all the chips they started with, even if some were never staked.
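For what it's worth, the "about two thirds" figure checks out under that atypical assumption (whole bankroll staked on an even-money bet every spin, European wheel with a single 0):

```python
# European wheel: 18 winning numbers out of 37 for an even-money bet
# paying 1:1, so the expected fraction of the stake kept per spin is 36/37.
per_spin = (18 / 37) * 2        # = 36/37, roughly 0.973

# Staking the whole bankroll every spin for 40 spins compounds this:
kept_after_40 = per_spin ** 40  # roughly one third kept
print(f"{1 - kept_after_40:.0%} lost")
```

As noted above, a player who re-stakes only part of the bankroll each spin loses far less in expectation, so this is an upper bound for this style of play, not a typical evening.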
A lottery typically has a higher average loss per game, and this is necessary for the operator to make a substantial profit, because the time intervals between games are longer. A typical figure is a 50% loss. Why do people play lottery games? I don't know, but I liken it to a sort of insurance. In an insurance contract, you pay a fixed monthly sum so that the insurance company will compensate you in the unlikely event of catastrophic personal loss. Most people want insurance even though insurance contracts result in an average net loss (due to company overhead, taxes, shareholder profits, etc.). A lottery is similar, except that you aren't compensated (with low probability) for a catastrophic event; instead, the high sum is the prize, and the winners are determined more or less randomly. Just as insurance protects you against the small probability of ruin, the lottery offers you the small probability of winning a personal fortune.
Additionally, lottery games may be supported by memory bias (the few times that you won $100 or so are remembered much better than the hundreds of times that you spent $5 without winning anything), and a form of fallacious thinking known as misleading vividness. When people hear of high prizes in a lottery, and try to imagine what they would do with the money, these vivid fantasies will create the impression that this outcome (winning the jackpot), though still improbable, is much more likely than it actually is.
Also, many consider the utility cost of playing a $1/week lottery against the possible gains - sure, I'm out $52 at the end of the year, but if I win the big prize, I need never work again, so the $52 loss is minimal in comparison.
A few more comments on lotteries on GamblingAddiction.
Gambling on outcomes: a case of "a bird in the hand is worth two in the bush"?
Perhaps a more relevant analogy to software development would be if Alistair provided me $100 and I decided whether to bet on the coin toss. If I lose, I give Alistair $90. If I win, I give Alistair $100 and keep $10. For me, this would be an acceptable way of failing; given the correct risk management documents, I might also convince Alistair of this.
In addition, one thing to consider is that even contemplating the odds takes time, and therefore in some situations acceptable failure might be a better option than thinking about alternatives. Let's imagine you're doing business in a foreign city that you don't know, and you don't have a guidebook. You only have a short time to get some lunch, and you see a McDonalds. Now, McDonalds isn't your favorite restaurant - if you were home you could think of 100 different restaurants you'd rather go to - but are you really going to stumble around this foreign city looking for something better? Are you going to stand there on the sidewalk for five minutes, trying to quantify your odds of finding a better restaurant? If you go to McDonalds, you know exactly what you'll get, and you probably won't pay an outlandish price for it either. This is formally known in economics as a Search Cost.
The real problem in life is nothing has guaranteed probabilities. If you have $100, and decide to decline a 50%-50% bet to win or lose $100, you do not have a 100% chance of still having the money tonight. You could be robbed, the bill you have could turn out to be counterfeit, etc. In software development situations, there is no way to assign a numeric probability for "success". It is not even possible to fully define success. In the end, it comes down to subjective evaluation.
I find it interesting that we persist in using the 100 -> 90 or 110 example. From the discussion at the top of the page, I'm pretty confident that the irrationality of a lot of humans that's being described would take the same choice (risk avoidance), even if it was 100 -> 90 or 120. So I think a lot of the maths about 'same expectation' is a RedHerring. -- AlexChurchill
The version of the experiment that I've heard goes like this:
People don't really believe in change. You can tell them 50/50, but they mentally discount in favour of the status quo. So a 50/50 chance of gain is treated as 40/60, while a 50/50 chance of loss is treated as 60/40. Hence people take the certain $100 gain, but prefer to punt rather than accept the certain $90 loss.
-- BenAveling
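One way to make that discounting concrete for this page's coin toss (the 40/60 weights are Ben's; the arithmetic is mine):

```python
# Perceived value of the $90-or-$110 coin toss if a 50/50 chance of
# improvement is mentally discounted down to 40/60, as described above:
perceived = 0.4 * 110 + 0.6 * 90   # = 98
print(perceived)

# 98 is worse than the certain $100 in hand, so the discounted gambler
# rationally (by their own subjective weights) declines the toss.
```

The same weighting run in the loss direction makes a gamble look better than a sure loss, which matches the observed preference for punting out of losses.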
(This may be because people are doing the math for the long-run scenario. In Scenario 1, they win $100 in each iteration with no risk; taking the risk, they would only gain $55 per iteration on average. The same is true of Scenario 2.)
Holding at $100 is perfectly rational. The net expected result for tossing is 50% * $110 + 50% * $90, or $100, but the actual result has a 50% chance of being $90. Unless winning has a substantially higher value than what I have currently, not losing is a rational strategy. -- JohnBrewer
I don't follow why 50% losing and 50% winning chances make for selecting not to play the game a rational choice. Does not compute. Seems you could just as easily argue that "the actual result has a 50% chance of being $110. Unless losing has a substantially lower value than what I have currently, winning is a rational strategy". -- Alistair
This makes sense if you use a geometric rather than an arithmetic mean: i.e., people gauge by magnitude. By that metric, holding at $100 becomes an even deal if the payoffs are $90/$111.11. -- PaulMurray
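Paul's figure is easy to check: under a geometric-mean criterion, the toss is an even deal when the two payoffs multiply to 100^2, i.e. when sqrt(90 * win) = 100. A quick sketch:

```python
from math import sqrt

# Solve sqrt(90 * win_payoff) = 100 for the win payoff that makes the
# toss an even deal by geometric mean, given a $90 losing payoff:
win_payoff = 100**2 / 90          # about 111.11
print(round(win_payoff, 2))

# Sanity check: the geometric mean of the two payoffs is exactly $100.
print(sqrt(90 * win_payoff))
```

So the $90/$110 toss offered on this page is a slightly losing deal by that metric, which is one way of rationalizing the preference to hold.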
If the odds are even - and the net expected result is exactly what you have, as JohnBrewer notes - then both playing and not playing are entirely rational. If the vast majority of people decline to flip the coin in this case, perhaps it is not because they are emotionally magnifying the risk of loss, but rather because they have better things to do at that moment? -- BrettNeumeier
In the case of this coin-tossing example, it seems clear to me that neither "toss" nor "don't toss" is obviously irrational. Given the chance to win 50% or lose 10% with a coin toss, it's probably reasonable to describe declining as irrational, but with more evenly balanced gains and losses I don't understand why minimizing risk at a small cost in expectation is regarded as crazy. And in this particular case, where the expectations are the same either way, I can't imagine why anyone would think it irrational either to prefer not risking loss or to prefer having the chance of gain. -- GarethMcCaughan
Another take on this is to note that if you play twice (with a fair coin), your possible results are 121 (+10,+11), 99 (+10,-11), 99 (-10,+9), and 81 (-10,-9). While the expected value is still balanced, you're now looking at a 75% chance of losing some value. And it just gets worse the more you play. -- NathanWallwork
Sorry, Nathan - you are jumping to conclusions. Tossing twice is a very bad idea (because of what you said above), but three, four or five times give exactly a 50% chance of being ahead. The real issue is that after falling behind at any stage, it always takes two consecutive good tosses to get ahead again. In other words, after falling below 100, you have only a 25% chance of ever recovering. Funny, that... -- JanosGaram
We'll have to disagree on the math, I guess. What I get is that four tosses gives you these possibilities: 146.41 + (4*119.79) + (6*98.01) + (4*80.19) + 65.61. Still a net balance, but only 5 chances in 16 of being better off than where you started, 6 chances in 16 of losing 1.99 for your trouble, and 5 chances in 16 of being noticeably worse off. -- NathanWallwork
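The four-toss case is small enough to enumerate outright; a short Python check, which comes out in support of Nathan's figures:

```python
from itertools import product

# Enumerate all 16 equally likely win/lose sequences of four +-10% tosses,
# starting from a $100 stake.
outcomes = sorted(
    round(100 * 1.1**seq.count('W') * 0.9**seq.count('L'), 2)
    for seq in product('WL', repeat=4)
)
print(outcomes)   # 65.61 x1, 80.19 x4, 98.01 x6, 119.79 x4, 146.41 x1

ahead = sum(1 for v in outcomes if v > 100)
print(f"{ahead} of 16 sequences leave you ahead")   # 5 of 16
```

The expected value across all 16 outcomes is still exactly $100, but only 5 of the 16 equally likely paths end above the starting stake.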
There's another reason that holding the money is rational, even with a single coin toss. If you evaluate the probability that the coin is biased against you at greater than 0, your expected return on the toss is less than $100. That may sound nitpicky, but consider the situation: I walk up to you on the street, offer you $100 and then invite you to gamble with it. Something fishy is definitely going on, and even though you're not quite sure what, you do know one thing: you've got $100 you didn't have a minute ago. -- ColinPutney
I believe that the detailed math is largely irrelevant, and that RiskAverse behavior results simply from the fact that positive and negative directions of resource change are not symmetrical, since negative resource change takes one closer to a cliff (zero resources) which has no counterpart in the positive direction.
People will, of course, differ in their relative preferences for balancing the risk of a possible approach to this cliff with the benefit of a possible retreat from it. This may or may not be the same thing as having different estimates of the cost of falling off the cliff.
It seems to me that this natural human behaviour may be the cause of a few InevitableIllusions.
I suppose that one solution is to decrease risk so much that success looks like a good way to stave off boredom. It'd be interesting to test that one. -- MichaelFeathers
Perhaps what is perceived as "risk aversion" is simply an allowance for the likelihood of the person making a mistake. For example, people are likely to conclude (after stocks have gone down) that they are likely to continue to go down, so they are more likely to buy high and sell low in risky investments. An article with some interesting data on this phenomenon is Christine Benz (of Morningstar.com)'s article on November 13, 2006 titled "How Did Investors Really Do?" (http://biz.yahoo.com/ms/061113/178504.html). During the 10-year period her article covered, people who invested in low risk investments tended to buy-and-hold. They generally earned the rate of return that would be expected from looking at typical statistics on the investments' performance. On average, people who invested in high risk investments tended to buy high and sell low just enough to reduce their returns to what the investors in low risk investments earned.
This seems to make me not normal. If somebody gives me $100 and offers me a $10 win/lose based on a coin toss, I will take the coin toss. If I win, I get $110, and if I lose I still get $90; either way I win. Take it all the way to a $100 win/lose, or even win=$100 / lose=$0; I still haven't actually lost anything. Seems like a dumbass undergrad economics example to me. People forget that undergrad learning, like all schooling, is *still* only an approximation to the truth.
Now playing the lottery; yes I do this. Not because of the numbers. Not because a jackpot win would be a life-changing event (do I really want my life to change so drastically anyway?). But because there is entertainment attached to buying the ticket. I am prepared to pay for this entertainment, much like I am prepared to buy a music CD or see a film at the cinema / movie theatre. Make the jackpot smaller and the 'entertainment value' decreases to a point where the pleasure return is not worth the cost.
The same applies to casino / gambling games.
The same applies to *anything*.
This holds even if the money is a sensible currency like sterling or euros, and not the monopoly / toy money called $$$$ :).
-- BarnySwain
If you have been given the $100, it's yours to keep. You should therefore assess future risks against what you have after receiving the $100, not what you had before receiving $100. The rational decision would be based on weighing the advantage of receiving $10 against the disadvantage of losing $10 (in the given context of having $100 already).
Experience shows that many more lottery tickets are sold when the top prize is really high. Apparently, the possibility of becoming a multi-millionaire is a lot more attractive than the possibility of becoming a millionaire!
Personally, I play when the annuitized payout tops $2M/year - that way I will clear $1M/year (based on current tax rates) for 20-30 years (depending on the lottery), which should be enough to satisfy my hobbies.
See Also: FukushimaAndShuttleManagementLessons
Refactoring Note: This page refactored from: AnAcceptableWayOfFailing