July 12, 2006
Why Can't We Learn From Our Mistakes?
Dave
In The Limits to Learning, James Montier says "The major reason we don't learn from our mistakes (or the mistakes of others) is that we simply don't recognise them as such. We have a gamut of mental devices all set up to protect us from the terrible truth that we regularly make mistakes." Don't we ever!
I have decided that in government, in large industry, and in personal endeavors, people don't want to study decision-making, in part or in whole, because it proves much easier to pretend to themselves that they know more than they can know, or at least to pretend to others that they know more than they do.
In either case, the lucky get rich and/or famous, proving up on Kurt Vonnegut's line: "If you would be unloved and forgotten, be reasonable." P.S. The "unlucky" are just collateral damage from the systems, never to be remembered, never to be mourned except by the few who knew them personally and who also tend to be disgusted by the "systems that be," by the "powers that be."
Montier continues by identifying several "biases" that thwart learning:
Self-attribution bias: heads is skill, tails is bad luck
We have a relatively fragile sense of self-esteem; one of the key mechanisms for protecting this self-image is self-attribution bias. This is the tendency for good outcomes to be attributed to skill and bad outcomes to be attributed to sheer bad luck. This is one of the key limits to learning…. This mechanism prevents us from recognizing mistakes as mistakes, and hence often prevents us from learning from those past errors. …
Hindsight bias: I knew it all along
One of the reasons I suggest that people keep a written record of their decisions and the reasons behind their decisions is that, if they don't, they run the risk of suffering from the insidious hindsight bias. This simply refers to the idea that once we know the outcome we tend to think we knew it was so all along. …
Illusion of control
We love to be in control. We generally hate the feeling of not being able to influence the outcome of an event. It is probably this control freak aspect of our nature that leads us to behave like [B.F.] Skinner's pigeons. ["Skinner's theory was based around operant conditioning. As Skinner wrote, 'The behavior is followed by a consequence, and the nature of the consequence modifies the organism's tendency to repeat the behavior in the future.'"] ...
Feedback distortion
Not only are we prone to behave like Skinner's pigeons but we also know how to reach the conclusions we want to find (known as 'motivated reasoning' amongst psychologists). …
We have outlined four major hurdles when it comes to learning from our own mistakes. Firstly, we often fail to recognize our mistakes because we attribute them to bad luck rather than poor decision making. Secondly, when we are looking back, we often can't separate what we believed beforehand from what we now know. Thirdly, thanks to the illusion of control, we often end up assuming outcomes are the result of our actions. Finally, we are adept at distorting the feedback we do receive, so that it fits into our own view of our abilities.
Some of these behavioural problems can be countered by keeping written records of decisions and the 'logic' behind those decisions. But this requires discipline and a willingness to re-examine our past decisions. Psychologists have found that it takes far more information about mistakes than it should to get us to change our minds.
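To make that written-record advice concrete, here is a minimal sketch of what such a decision journal might look like. This is my own illustration in Python, not anything Montier prescribes; the record fields and the example entry are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DecisionRecord:
    """One journal entry, written down before the outcome is known."""
    made_on: date
    decision: str
    reasoning: str            # the 'logic' behind the decision
    expected_outcome: str
    actual_outcome: str = ""  # filled in later, so hindsight can't rewrite it

journal = []
journal.append(DecisionRecord(
    made_on=date(2006, 7, 12),
    decision="Approve the fuels-reduction project",
    reasoning="Recent fire history; stand density well above target",
    expected_outcome="Lower crown-fire risk within five years",
))

# Months later, fill in what actually happened and compare:
journal[0].actual_outcome = "Project delayed on appeal; risk unchanged"
for rec in journal:
    print(f"{rec.made_on}: expected '{rec.expected_outcome}' "
          f"vs. actual '{rec.actual_outcome}'")
```

The point of the discipline is the timestamp: because the reasoning and the expectation are recorded ex ante, hindsight bias can't quietly rewrite them afterward.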
In a more broadly framed article titled "Part man, part monkey" [PDF: 12 pp.], James Montier helps us understand how our decision-making errors can be traced to four common causes: self-deception, heuristic simplification, emotion, and social interaction. Looking for biases that limit our learning, Montier develops a broader list under his four categories, which he calls a "Taxonomy of Biases," reproduced below.
Taxonomy of Biases:
Self-deception biases (limits to learning):
- Over-optimism (derives from Illusion of Control – people are "surprised more often than they expect to be")
- Overconfidence (derives from Illusion of Knowledge – tendency to believe that the "accuracy of your forecasts increases with more information.") "Over-optimism and overconfidence are a potent combination. They lead you to over-estimate your knowledge, understate the risk, and exaggerate your ability to control the situation."
- Self-attribution bias (prone to "attribute good outcomes to our skill," bad outcomes to "the luck of the draw")
- Confirmation bias (clinging "tenaciously to a view or forecast," "looking for information that agrees with us," "thirst for agreement rather than refutation is known as confirmatory bias.")
- Hindsight bias (tendency for people knowing the outcome to believe that they would have predicted the outcome ex ante.)
- Cognitive dissonance (mental conflict that people experience when they are presented with evidence that their beliefs or assumptions are wrong.)
- Conservatism (…tendency to cling tenaciously to a view or a forecast. Once a position has been stated most people find it very hard to move away from that view.)
Heuristic Simplification (information processing errors):
- Representativeness (People judge events by how they appear, rather than by how likely they are.)
- Framing [cognitive heuristic in which people tend to reach conclusions based on the 'framework' within which a situation was presented.]
- Categorization {Iverson note: I suspect this one ought to be linked to Stereotypes.}
- Anchoring/Salience (grabbing at irrelevant anchors when forming opinions)
- Availability bias [causes people to base their decisions on the most recent and meaningful events. See also availability heuristic]
- Cue competition […involves a comparison between the probability of the outcome given the target cue and the probability of the outcome given the competing cue.]
- Loss aversion/Prospect theory [Decision making under risk can be viewed as a choice between prospects or gambles.]
Emotion/Affect:
- Mood {/Affect} [Human interactions are how people share information and communicate emotion and mood. The cues obtained from others influence one's own opinions. A shared attitude, or social mood, is propagated.] See also: Social Mood and Financial Economics, John R. Nofsinger [PDF: 43 pp.]
- Self-Control Bias (Hyperbolic Discounting) ["conflict between a person's overarching desires and their inability, stemming from a lack of self-discipline, to act concretely in pursuit of those desires."] See also: Self-Control Bias [PDF: 15 pp.], Chapter 14 in Behavioral Finance and Wealth Management, Michael M. Pompian. 2006
- Ambiguity aversion ["…an attitude of preference for known risks over unknown risks"]
- Regret theory [reflecting on how much better an individual's position would have been had they chosen differently]
Social Interaction:
- Imitation [advanced animal behaviour whereby an individual observes another's behaviour and replicates it itself.]
- Contagions [...cross-country transmission of shocks or the general cross-country spillover effects.]
- Herding [Following the trend.]
- [Information] Cascades [a situation in which every subsequent actor, based on the observations of others, makes the same choice independent of his/her private signal.] See also: http://cascades.behaviouralfinance.net/, and the toy simulation sketch just below this list.
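Information cascades are easy to watch in a toy model. The sketch below is my own Python illustration (loosely after the Bikhchandani, Hirshleifer, and Welch urn model, not code from Montier; the function name and parameters are hypothetical): each agent receives a private signal that points to the true state with probability p, observes every earlier public choice, and follows the crowd whenever predecessors lean two or more choices to one side.

```python
import random

def simulate_cascade(n_agents=20, p=0.7, true_state=1, seed=None):
    """Toy information cascade: agents choose 0 or 1 in sequence.

    Each agent sees all earlier public choices plus one private
    signal (correct with probability p). A simplified vote-count
    stands in for full Bayesian updating: a lead of two or more
    among predecessors outweighs any single private signal.
    """
    rng = random.Random(seed)
    choices = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < p else 1 - true_state
        ones = sum(choices)
        zeros = len(choices) - ones
        if ones - zeros >= 2:        # cascade on 1: imitate the crowd
            choice = 1
        elif zeros - ones >= 2:      # cascade on 0: imitate the crowd
            choice = 0
        else:                        # no cascade yet: trust the signal
            choice = signal
        choices.append(choice)
    return choices

print(simulate_cascade(seed=7))
```

Once the first two or three agents happen to agree, everyone after them rationally ignores their own signal; a run of unlucky early signals can lock the whole sequence onto the wrong choice, which is exactly why imitation and herding limit learning.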
Notes:
- Parenthetical references "( )" are quotes from Montier's "Part man, part monkey" [PDF: 12 pp.]
- Bracketed references "[ ]" are drawn from other sources as noted.
- Bias category hyperlinks are drawn from Behavioral Finance or Wikipedia. If I could not find a suitable hyperlink in either, I found one where I could. If others find better hyperlinks (or if mine are off-base) please let me know.
See also:
- Winner's Curse: "If we assume that on average the bidders are estimating accurately, then the person whose bid is highest has almost certainly overestimated the good's value. Thus, a bidder who wins after bidding what they thought the good was worth has almost certainly overpaid." A small numerical check of this claim appears after this list.
- Gambler's Fallacy: "With gambler's fallacy, [people] expect reversals to occur more frequently than actually happens."
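That winner's-curse claim is simple enough to verify numerically. The following Monte Carlo sketch is my own illustration in Python (the function name and parameter values are hypothetical): every bidder's estimate of the good is unbiased on average, yet the highest estimate, which wins the auction, systematically overshoots the true value.

```python
import random

def average_winning_estimate(true_value=100.0, n_bidders=8,
                             noise_sd=20.0, n_auctions=10_000, seed=1):
    """Monte Carlo check of the winner's curse.

    Each bidder's estimate is the true value plus zero-mean Gaussian
    noise, so the average estimate is unbiased; but the maximum
    estimate (the winner's) is biased upward.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_auctions):
        estimates = [true_value + rng.gauss(0.0, noise_sd)
                     for _ in range(n_bidders)]
        total += max(estimates)
    return total / n_auctions

print("True value: 100.0")
print(f"Average winning estimate: {average_winning_estimate():.1f}")
# With 8 bidders and noise_sd = 20, the winner overestimates by about
# 28 on average (the expected maximum of 8 standard normals is ~1.42).
```

Averaged over many auctions, bidding one's honest, unbiased estimate loses money whenever several bidders compete, which is the quoted point in numerical form.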
Finally, I always wonder how much sense it makes to create lists. Certainly it doesn't make sense if lists are all we have. In addition to Montier's work, the classic works I keep close by on my shelf include:
- A Primer on Decision-Making: How Decisions Happen. James G. March. 1994.
- Policy Paradox: The Art of Political Decision Making. Deborah Stone. 2001 (Revised Edition).
- System Effects: Complexity in Political and Social Life. Robert Jervis. 1997.
- Decision Traps: The Ten Barriers to Brilliant Decision-Making and How to Overcome Them. J. Edward Russo and Paul H. Schoemaker. 1989.
- The Logic of Failure: Recognizing and Avoiding Error in Complex Situations. Dietrich Dörner. 1996.
- Making Sense of the Organization. Karl E. Weick. 2001.
- Managing the Unexpected. Karl E. Weick and Kathleen M. Sutcliffe. 2001.
- How the Way We Talk Can Change the Way We Work: Seven Languages for Transformation. Robert Kegan and Lisa Laskow Lahey. 2001.
Decision making, decision framing, and related matters are all the stuff of such interest and intrigue, "stuff" that seems resistant to reductionistic rendering. Still, there is something in our education and/or culture that drives us to lists.
As for "learning," Robert Heinlein may have said it best through his Sci Fi character Lazarus Long, paraphrasing: "People don't learn from the mistakes of others. They seldom learn from their own mistakes. Never underestimate the power of human stupidity."
Adapted from a July 7 post at Economic Dreams-Economic Nightmares
Posted by Dave on July 12, 2006 at 11:36 AM | Permalink