Suppose that a disease, or a monster, or a war, or something, is killing people. And suppose you only have enough resources to implement one of the following two options:
- Save 400 lives, with certainty.
- Save 500 lives, with 90% probability; save no lives, 10% probability.
Most people choose option 1. Which, I think, is foolish; because if you multiply 500 lives by 90% probability, you get an expected value of 450 lives, which exceeds the 400-life value of option 1. (Lives saved don't diminish in marginal utility, so this is an appropriate calculation.)
"What!" you cry, incensed. "How can you gamble with human lives? How can you think about numbers when so much is at stake? What if that 10% probability strikes, and everyone dies? So much for your damned logic! You're following your rationality off a cliff!"
Ah, but here's the interesting thing. If you present the options this way:
- 100 people die, with certainty.
- 90% chance no one dies; 10% chance 500 people die.
Then a majority choose option 2. Even though it's the same gamble. You see, just as a certainty of saving 400 lives seems to feel so much more comfortable than an unsure gain, so too, a certain loss feels worse than an uncertain one.
Read the whole thing.
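For anyone who wants to check the numbers, here is a minimal sketch (assuming the same pool of 500 people at risk in both framings, as in the quoted example) showing why the two presentations are the same gamble:

```python
# A quick check of the arithmetic in the quoted example, assuming 500 people
# are at risk in both framings.

# Framing 1: lives saved.
sure_saved = 400                      # option 1: 400 saved with certainty
expected_saved = 0.9 * 500 + 0.1 * 0  # option 2: 90% chance of 500, 10% chance of 0

# Framing 2: deaths. "100 die with certainty" is 400 saved out of 500;
# "90% chance no one dies; 10% chance 500 die" is the same gamble on saving
# everyone or saving no one.
sure_saved_reframed = 500 - 100
expected_saved_reframed = 0.9 * (500 - 0) + 0.1 * (500 - 500)

print(sure_saved, expected_saved)                    # 400 450.0
print(sure_saved_reframed, expected_saved_reframed)  # 400 450.0
```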
I don't think this is too difficult to understand. In both situations, the deciders don't want to think of themselves as possibly responsible for avoidable deaths. In the first scenario, you don't want to be the guy who took the gamble and everyone died. In the second, you don't want to choose for 100 people to die. People make different choices in the two situations because they want to minimize moral culpability.
Is that rational? Strictly speaking, maybe not. Is it human? Absolutely!
1 comment:
Thanks for posting. This is an area of interest for me, and you are creating POSITIVE externalities!