
Rational Decision: The Descriptive View

According to the normative view of rational decisions, people are assumed to be trying to maximize their general "happiness": what John Stuart Mill, a 19th-century philosopher, called "utility". In other words, given a choice, one takes the option with the highest "expected utility". This implies that choices are always rational and consistent: for example, if somebody prefers apples to oranges, and oranges to pears, he should also prefer apples to pears.
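The normative rule itself is simple to state: weight each outcome's utility by its probability and choose the option with the highest sum. A minimal sketch, with invented options and numbers purely for illustration:

```python
# Expected utility: sum of utility(outcome) * probability(outcome).
# The options and numbers below are purely illustrative.

def expected_utility(lottery):
    """lottery: list of (utility, probability) pairs."""
    return sum(u * p for u, p in lottery)

# Option A: a sure gain of utility 50.
option_a = [(50, 1.0)]
# Option B: utility 100 with probability 0.6, utility 0 otherwise.
option_b = [(100, 0.6), (0, 0.4)]

best = max([("A", option_a), ("B", option_b)],
           key=lambda named: expected_utility(named[1]))
print(best[0])  # B: expected utility 60 beats the sure 50
```

The descriptive findings below are, in effect, a catalogue of the ways real choices deviate from this rule.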

However, looking at how people actually make decisions, it is obvious that they do not follow the normative view, even though it supposedly epitomizes rationality. Many considerations may cause people to choose an option which is not ideal according to the normative approach. The rest of this essay describes the considerations that govern how people make decisions in practice.

When possible, we also try to show how these considerations might affect the decision to exit. Such considerations either bias the evaluation of the difficulties or risks involved in the options of life and death, or they affect the process of choice itself. The considerations are presented according to their relevance to this decision, the most relevant first and the least relevant last.

Loss Aversion

According to Kahneman and Tversky[2], people are "loss averse": they have an asymmetric attitude to gains and losses, gaining less utility from winning, say, $100 than they would lose by losing $100. Those subject to loss aversion do not weigh risk consistently: they take fewer risks that might result in losses than they would if they were acting as rational utility maximizers.
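Cumulative prospect theory [2] quantifies this asymmetry with a value function in which losses are scaled up by a loss-aversion coefficient of roughly 2.25. A minimal sketch, using the median parameter estimates reported in [2]:

```python
# Prospect-theory value function with the median parameter
# estimates from Tversky and Kahneman (1992)[2]:
# alpha = beta = 0.88, loss-aversion coefficient lambda = 2.25.

ALPHA = 0.88   # curvature for gains
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # losses loom larger than gains by this factor

def value(x):
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# Gaining $100 feels much smaller than losing $100 feels bad:
print(value(100))   # ~57.5
print(value(-100))  # ~-129.5, i.e. 2.25 times as intense
```

On these estimates, a 50/50 gamble to win or lose $100 has negative subjective value, so it is refused even though its expected monetary value is zero.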

This may apply to the decision of exiting. Any method of suicide has risks which may lead to a "loss" in utility, such as permanent injury. These risks are not trivial, although efforts can be made to minimize them. Even so, due to loss aversion, the fear of such losses may be disproportionate to the actual risk.

Status Quo Bias

People are willing to take bigger gambles to maintain the status quo than they would take to acquire it in the first place. In one common experiment, mugs are allocated randomly to some people in a group. Those who have a mug are asked to name a price at which they will sell it; those without one are asked to name a price at which they will buy. Usually, the average selling price is considerably higher than the average offer price.

The obvious application to suicide is that even if, according to the normative model, suicide is rational for a person, the current situation may still be preferred merely because change is more difficult. The current situation is valued at more than its real worth, simply to maintain the status quo.

Cognitive Dissonance

Cognitive dissonance occurs when one holds a belief which contradicts the evidence or the truth[3]. It arises from the need for consistency between one's opinions and one's behavior.

One experiment[3] involved a group of teenage girls. Each girl was asked to rate pop records by preference. She was then offered a choice between two records she had rated as "medium", and the chosen one was given to her as a gift. After this choice, she was asked to rate the records again. This time, the chosen record rated higher than the one not chosen. The explanation is that, in order to justify their choice, the girls increased their preference for the chosen record, making their thoughts consistent with their actions.

One way cognitive dissonance can affect the decision to exit is by adapting our thoughts to the way we have acted so far. For example, a person suffering from physical pain for many years, yet never having considered suicide, may become aware of the possibility yet decline it, just because he is unable to justify to himself why he has not exited so far. To make his thoughts consistent with his actions, he may convince himself that his situation is not bad enough to justify suicide.

Little Picture Thinking

Expected-utility theory assumes that people look at individual decisions in the context of the big picture. But psychologists have found that, in fact, they tend to compartmentalize, often on superficial grounds. They then make choices about things in one particular mental compartment without taking account of the implications for things in other compartments.

This is particularly relevant to the decision to exit. There are so many factors to consider: "Is suicide right for me?", "What are my chances of succeeding in exiting?", "What are the risks?", "Will I be able to get over the fear of taking the last step?", "How much weight should I give to the suffering of my family, if I choose to exit?", etc. It is not humanly possible to take so many issues into consideration at the same time. There is a natural tendency to limit one's attention to a small number of factors, which then determine the decision.

Fear of Regret

People appear to be disproportionately influenced by the fear of feeling regret, and will often pass up benefits within reach in order to avoid even a small risk of feeling they have failed.

Fear of regret is especially present in important and irreversible decisions. People might fear to exit because they will regret it. This sounds strange, since once you succeed you cannot feel anything; in particular, you cannot feel regret. However, the fear of regret can still play a part while you are alive, even though, rationally, it has little meaning.

Probability

People regularly miscalculate probabilities [2]: they assume that outcomes which are very probable are less likely than they really are, that outcomes which are quite unlikely are more likely than they are, and that extremely improbable, but still possible, outcomes have no chance at all of happening.
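Cumulative prospect theory [2] captures this pattern with a probability weighting function that overweights small probabilities and underweights large ones. A minimal sketch, using the gain-domain parameter estimate of roughly 0.61 reported in [2]:

```python
# Tversky-Kahneman (1992)[2] probability weighting function:
# w(p) = p^g / (p^g + (1-p)^g)^(1/g), with g ~= 0.61 for gains.

GAMMA = 0.61

def weight(p):
    """Decision weight actually given to an objective probability p."""
    num = p ** GAMMA
    return num / (num + (1 - p) ** GAMMA) ** (1 / GAMMA)

print(weight(0.95))  # ~0.79: very likely outcomes are underweighted
print(weight(0.05))  # ~0.13: unlikely outcomes are overweighted
print(weight(0.0))   # 0.0: the truly impossible stays impossible
```

The distortion is largest at the extremes, which matches the description above: near-certainties are discounted while long shots are inflated.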

This can affect the estimation of many probabilities related to exiting: for example, the probability of success of specific methods, and the likelihood that life will improve if one decides to continue living.

Emotionalism

People often become emotional, damaging themselves in order to damage others. One of the psychologists' favorite experiments is the "ultimatum game", in which one player, the proposer, is given a sum of money, say $10, and offers some portion of it to the other player, the responder. The responder can either accept the offer, in which case he gets the sum offered and the proposer gets the rest, or reject the offer, in which case both players get nothing. In experiments, very low offers (less than 20% of the total sum) are often rejected, even though it is rational for the responder to accept any offer (even one cent!) which the proposer makes. Responders seem to reject offers out of sheer indignation at being made to accept such a small proportion of the whole sum, and they seem to get more satisfaction from taking revenge on the proposer than from maximizing their own financial gain.
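The gap between the normative response and the observed one can be sketched directly. The 20% indignation threshold below is an illustrative assumption, matching the rejection rates described above:

```python
# Ultimatum game: proposer offers a split of TOTAL; responder
# accepts (both get their share) or rejects (both get nothing).

TOTAL = 10.00

def rational_responder(offer):
    """A utility maximizer accepts any positive offer."""
    return offer > 0

def indignant_responder(offer, threshold=0.2):
    """Rejects offers below `threshold` of the total (illustrative)."""
    return offer >= threshold * TOTAL

offer = 1.00  # a 10% offer
rational_payoff = offer if rational_responder(offer) else 0.0
indignant_payoff = offer if indignant_responder(offer) else 0.0
print(rational_payoff, indignant_payoff)  # 1.0 0.0
```

The rational responder walks away with $1; the indignant responder, by punishing the proposer, walks away with nothing.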

Some people choose to commit suicide in order to take revenge. The revenge is carried out by blaming somebody: "Look what a terrible thing you caused me to do...". This makes no sense according to the normative view: the result for the one committing suicide is of extremely low utility.

Availability Heuristic

People focus excessive attention on a particular fact or event, rather than the big picture, simply because it is more visible or fresher in their mind.

For example, if a successful suicide is reported in the news, people are more likely to use that same method, even though there may be other, better methods available to them.

Other Factors

For completeness, we also present factors and considerations which cause "irrational" decisions but have apparently little or no effect on the decision to exit.

Hindsight bias: once something happens, people overestimate the extent to which they could have predicted it.

Memory bias: when something happens, people often persuade themselves that they actually predicted it, even when they didn't.

Magical thinking: occurs when people attribute to their own actions outcomes that had nothing to do with them, and thus assume that they have a greater influence over events than is actually the case. For instance, an investor who luckily buys a share that goes up may become convinced that he is a skillful investor rather than a merely fortunate one.

Anchoring: occurs when people are overly influenced by outside suggestion. People can be influenced even when they know that the suggestion is not being made by someone who is better informed. In one experiment, volunteers were asked a series of questions whose answers were in percentages, such as: what percentage of African countries is in the United Nations? A wheel with numbers from one to 100 was spun in front of them; they were then asked to say whether their answer was higher or lower than the number on the wheel, and then to give their answer. These answers were strongly influenced by the randomly selected, irrelevant number on the wheel. The average guess when the wheel showed 10 was 25%; when it showed 65, it was 45%.

Overconfidence: people are persistently, and irrationally, overconfident[1]. Asked to answer a factual question, then asked to give the probability that their answer was correct, people typically overestimate this probability.

Representativeness Heuristic: a tendency to treat events as representative of some well-known class or pattern. This gives people a sense of familiarity with an event and thus confidence that they have accurately diagnosed it. This can lead people to "see" patterns in data even where there are none.

Conclusion

Supposedly, this essay can be used to detect our own biases and reach a more rational choice with regard to suicide. However, it is very difficult to correct these biases in one's own thinking.

Some biases are almost impossible to deal with. Compartmentalizing is almost unavoidable, since our minds are not capable of dealing with so many different factors at once. Other factors, such as the biased estimation of probabilities or loss aversion, are difficult to counter because the estimation is subjective and not amenable to rigorous reasoning. Finally, factors like cognitive dissonance, fear of regret, and emotionalism are based on emotions, over which we have very little control.

Indeed, one might claim that these considerations are not biases at all. Rather, they point out limitations in the normative model: it is too simple, and therefore its results do not fit reality.

However, we hope that the contrast between the normative and descriptive views may provide new insights to people who are deliberating. If you find that your reasoning contradicts the normative model, you can search for the reason using the descriptive view. You may choose to accept your original reasoning, or you may make a conscious effort to counter whatever you consider to be distorting the process of deliberation. In the end, it is your choice.


REFERENCES

[1] Kahneman, D. and Tversky, A. (1973), "On the Psychology of Prediction", Psychological Review 80.

[2] Tversky, A. and Kahneman, D. (1992), "Advances in Prospect Theory: Cumulative Representation of Uncertainty", Journal of Risk and Uncertainty 5, 297-323.

[3] Festinger, L. (1957), A Theory of Cognitive Dissonance, Evanston, IL: Row, Peterson.

