Humility and Rationality
By Wade Lee Hudson

A review
Thinking, Fast and Slow
Daniel Kahneman
Farrar, Straus and Giroux, 2011, 512 pages

Controlling emotions, instincts, intuitions, and biases is like riding an elephant. As Jonathan Haidt wrote: “The emotional tail wags the rational dog.” In his magnum opus, Thinking, Fast and Slow, Nobel Prize winner Daniel Kahneman sums up decades of research and urges readers to strengthen “slow thinking” to better manage “fast thinking.” Rationality demands discipline, practice, and effort, but, overconfident, we often fail. A humble understanding of why and how we don’t always choose the most rational action can help us be better human beings.

Kahneman argues that humans

often need help to make more accurate judgments and better decisions, and in some cases policies and institutions can provide that help. The assumption that agents are rational provides the intellectual foundation for the libertarian approach to public policy: do not interfere with the individual's right to choose, unless the choices harm others. For behavioral economists, however, freedom has a cost, which is borne by individuals who make bad choices, and by a society that feels obligated to help them. The decision of whether or not to protect individuals against their mistakes, therefore, presents a dilemma.

Whether to require motorcyclists to wear helmets is an example. Requiring everyone to get health insurance is another.

Social-change activists have much to learn from Kahneman’s work, which calls for a commitment to overcome the arrogance that interferes with learning from mistakes. No wonder pride has been considered the number-one sin, and humility the number-one virtue. 

In Democracy Is for the Gods, Costica Bradatan concludes:

One element that is needed for democracy to emerge is a sense of humility. A humility at once collective and internalized, penetrating, even visionary, yet true. The kind of humility that is comfortable in its own skin, one that, because it knows its worth and its limits, can even laugh at itself. A humility that, having seen many a crazy thing and learned to tolerate them, has become wise and patient…. To live democratically is, mainly, to deal in failure and imperfection, and to entertain few illusions about human society. The institutions of democracy, its norms and mechanisms, should embody a vision of human beings as deficient, flawed and imperfect.

Kahneman’s wide-ranging book addresses multiple issues, including business and public policy, and presents the results from many experiments (including multiple-choice questions) that support his conclusions. Here I focus on insights relevant to social-change activists.

Kahneman makes a distinction between two basic elements of the human personality, which he calls “System 1” and “System 2” — the “automatic system” and the “effortful system.” Though these two systems “do not really exist in the brain or anywhere else,” he treats them as “fictitious identities,” and suggests thinking in terms such as: “This is your System 1 talking. Slow down and let your System 2 take control.” 

The relevance of Kahneman’s work is demonstrated in a Pod Save the People interview with Jennifer L. Eberhardt, PhD, author of Biased: Uncovering The Hidden Prejudice That Shapes What We See, Think, And Do. Eberhardt reports that the social media platform Nextdoor consulted with her about their members’ racial profiling. Concerning implicit, or unconscious, bias, Eberhardt says:

Even though we are all prone to bias, that doesn't mean we're acting on bias all the time. There are certain situations that trigger it. Having power over that bias or having some control over it or even responsibility for it means understanding the situations under which bias can get activated and lead you to make decisions that you wouldn’t otherwise make. 

Here was a situation where people were in a heightened state, like when you're feeling frightened or fearful or stressed and bias can emerge. Bias can also emerge when people are having to act quickly, so you're trying to make split-second decisions and you don't have time to think. That means you're going to fall back on well-practiced associations that you have, say between Blackness and crime. You see a Black person [on your street] and you're worried about criminal activity. So you go to the Nextdoor platform and you want to alert your neighbors.

What Nextdoor had to do was think about: How do you slow people down so they can make better decisions? For them this was difficult because a lot of tech products, including theirs, are designed so you can get to where you want to go quickly. That's where the bias can emerge. They were concerned that if they curbed bias they might get a lot of drop-off from the platform because of the friction.

But they decided they were going to go ahead and put the friction in there because the issue was kind of core to why they exist. So they decided to add a checklist. Now when people go to the platform and click the Crime & Safety tab, you have to answer a series of questions. And the first question is: What is it about this person's behavior that’s suspicious? It can't be “suspicious black man,” which is what was happening before. And the second category is: Describe this person in detail. And the third thing they did was to define what racial profiling is, and they said it is prohibited. They set a social norm on the site that you can’t do this. And they found that creating that friction led to a 75% decrease in racial profiling. 
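
Eberhardt is describing a simple design pattern: put required, structured questions between the impulse and the post. As a rough illustration only (the field names and thresholds below are hypothetical, not Nextdoor’s actual code), such a gate might look like this in Python:

    # Hypothetical sketch of posting "friction," loosely patterned on the
    # checklist Eberhardt describes; not Nextdoor's actual implementation.
    RACIAL_PROFILING_NOTICE = (  # shown above the form to set the social norm
        "Racial profiling is prohibited. Describe behavior, not appearance alone."
    )

    def can_submit_report(suspicious_behavior: str, description: str) -> bool:
        """Allow a Crime & Safety post only once the checklist is answered in detail."""
        behavior_given = len(suspicious_behavior.split()) >= 5   # what the person did
        detail_given = len(description.split()) >= 10            # detailed description
        return behavior_given and detail_given

    print(can_submit_report("suspicious man", "tall"))  # False: vague, bias-prone
    print(can_submit_report(
        "tried the door handles of several parked cars over ten minutes",
        "adult in a gray hooded jacket and jeans, carrying a red backpack",
    ))  # True: behavior and detail are specific

The point of the sketch is the friction itself: the extra questions slow the poster down, giving the deliberate system a chance to override a split-second association.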

Kahneman writes:

Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in a comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine — usually.

System 1 includes heuristics (simple procedures or rules of thumb) that help find answers to questions, “the entirely automatic mental activities of perception and memory,” and expert intuitions. System 2 involves a “slower, more deliberate and often effortful form of thinking.” Both systems affect how we make judgments and decisions.

System 2 “can follow the rules, compare objects on several attributes, and make deliberate choices between options.” System 1 cannot because it works automatically. 

Unfortunately, we often display “excessive confidence in what we believe we know,” and seem unable

to acknowledge the full extent of our ignorance and the uncertainty of the world we live in.... We documented systematic errors in the thinking of normal people, and we traced these errors to the design of the machinery of cognition rather than to the corruption of thought by emotion…. We can be blind to the obvious, and we are also blind to our blindness. 

System 1 operates “quickly, with little to no effort and no sense of voluntary control,” “cannot be turned off,” is complex, rich, and often unconscious, and is “the origin of most of what we do right — which is most of what we do.” It is rooted in “associative memory,” which “continually constructs a coherent interpretation of what is going on in our world at any instant,” “holds the vast repertoire of skills we have acquired in a lifetime of practice,” can execute skilled responses and generate skilled intuitions (“after adequate training”), “distinguishes surprising from normal events in a fraction of a second,” “automatically searches for some causal interpretation of surprises and events as they take place,” and is rarely dumbfounded.

But System 1: 

  • is also “the origin of much that we do wrong.”

  • “may substitute a response that more easily comes to mind” rather than answering the question that was asked. (These “heuristic answers are not random, and they are often approximately correct. And sometimes they are quite wrong.”)

  • is not reality bound.

  • is prone to predictable biases and cognitive illusions.

  • produces intuitive predictions that “tend to be overconfident and overly extreme…. Even when we know our predictions are little better than random guesses, we continue to feel and act as if they are valid.” (“Considering how little we know, the confidence we have in our beliefs is preposterous — and it is also essential.”)

  • “overweights low probabilities.”

Emotionality is one source of problems. System 1:

  • “is rarely indifferent to emotional words: mortality is bad, survival is good, and 90% survival sounds encouraging whereas 10% mortality is frightening.”

  • leads individuals to be “impulsive, impatient, and keen to receive immediate gratification.”

  • “links a sense of cognitive ease to illusions of truth, pleasant feelings, and reduced vigilance.”

  • “exaggerates emotional consistency (halo effect).”

  • “responds more strongly to losses than to gains (loss aversion).” (“He suffers from extreme loss aversion, which makes him turn down very favorable opportunities.” “Considering her vast wealth, her emotional response to trivial gains and losses makes no sense.” “He weighs losses about twice as much as gains, which is normal.” See the sketch after this list.)

  • “may know that the probability is low, but this knowledge does not eliminate the self-generated discomfort and the wish to avoid it.”

  • prefers the status quo. (“Giving up a bottle of nice wine is more painful than getting an equally good bottle is pleasurable….  Negativity and escape [are more powerful than] positivity and approach…. Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals.”)
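
To make the loss-aversion asymmetry concrete: behavioral economists often model it with a value function that is steeper for losses than for gains. Here is a minimal sketch using the commonly cited Tversky–Kahneman parameter estimates (roughly α = 0.88, λ = 2.25); these numbers come from their later research, not from this book’s text:

    # Sketch of a prospect-theory-style value function. The parameters are
    # the commonly cited Tversky-Kahneman (1992) estimates, used here only
    # to illustrate "losses loom larger than gains."
    ALPHA = 0.88    # diminishing sensitivity to the size of gains and losses
    LAMBDA = 2.25   # losses weigh roughly twice as much as gains

    def subjective_value(x: float) -> float:
        """Felt value of gaining (x > 0) or losing (x < 0) an amount."""
        if x >= 0:
            return x ** ALPHA
        return -LAMBDA * ((-x) ** ALPHA)

    print(subjective_value(100))   # about  57.5: the pleasure of a $100 gain
    print(subjective_value(-100))  # about -129.4: the pain of a $100 loss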

Cognitive complications are also problematic. System 1:

  • “is biased to believe and confirm.” (“I call it theory-induced blindness: once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.”)

  • has trouble reconstructing “past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”

  • “understands sentences by trying to make them true, and the selective activation of compatible thoughts produces a family of systematic errors that make us gullible and prone to believe too strongly whatever we believe.”

  • “is strongly biased toward causal explanations and does not deal well with ‘mere statistics.’” (“People who are taught surprising statistical facts about human behavior may be impressed to the point of telling their friends about what they have heard, but this does not mean that their understanding of the world has really changed…. Even compelling causal statistics will not change long-held beliefs or beliefs rooted in personal experience.”)

  • “infers and invents causes and intentions.”

  • “neglects ambiguity and suppresses doubt.”

  • “focuses on existing evidence and ignores absent evidence.”

  • “generates a limited set of basic assessments.”

  • “represents sets by norms and prototypes,” such as automatically activating stereotypes.

  • “computes more than intended (mental shotgun).”

  • has expectations that do not correspond to the actual probability. 

  • “frames decision problems narrowly, in short-term memory, in isolation from one another.”

  • infers the general from the particular, and has trouble deducing the particular from the general.

  • takes the “inside view” by focusing on specific circumstances and searching for evidence in our own experiences.

  • generates moral intuitions, which are sometimes conflicting. 

To deal with these problems, we must rely on System 2, which “is who we think we are [and is] often associated with the subjective experience of agency, choice, and concentration.” System 2:

  • must slow down “to distinguish between a skilled and [an incorrect] response.” 

  • “articulates judgments and makes choices.”

  • is called in to deal with situations that System 1 lacks the skill to handle.

  • “monitors System 1, and maintains control as best it can, within its limited resources.” (It tames System 1, but cannot vanquish it.)

  • “is not a paragon of rationality. Its abilities are limited and so is the knowledge to which it has access… Often we make mistakes because we (our System 2) do not know any better.”

  • can program System 1 “to mobilize attention when a particular pattern is detected.”

  • can develop the habit of looking for the “outside view.” (“He's taking an inside view. He should forget about his own case and look for what happened in other cases.”)

  • “allocates attention to the effortful mental activities that demand it, including complex computations.”

  • often involves difficult statistical thinking.

  • enables children who develop more self-control as four-year-olds to have “substantially higher scores on tests of intelligence” later in life. 

  • generally has no moral intuitions of its own to answer questions. It must evaluate intuitions that come from System 1.

  • “is not merely an apologist for System 1; it also prevents many foolish thoughts and inappropriate impulses from overt expression...” with “intense monitoring and effortful activity [when] cues to likely errors are available.” (“Continuous vigilance is not necessarily good, and it is certainly impractical…, but the chance to avoid a costly mistake is sometimes worth the effort.”)

  • attempts “to construct an answer on its own, which it is reluctant to do because it is indolent,...easily tired, [and] normally lazy” — partly because “disbelieving is hard work [and] reframing is effortful.” (But “people sometimes expend considerable effort for long periods of time without having to exert willpower,” as when they are “in the flow.”)

  • involves both rationality and intelligence, which are different. “'Superficial’ or ‘lazy’ thinking is a flaw in the reflective mind, a failure of rationality.”

In addition, Kahneman reports on recent research that “has introduced a distinction between the experiencing self and the remembering self, which do not have the same interests.” The remembering self “is the one that keeps score and governs what we learn from living,” but it is sometimes wrong. 

One example is that although “we want pain to be brief and pleasure to last,” when one option induces a less painful experience at the end of the experiment, subjects choose to repeat that option even though it lasts longer and involves more pain overall. System 1 remembers the end (or the peak) more than it remembers the whole experience. “We believe that duration is important, but our memory tells us it is not.” This is the essence of duration neglect and the peak-end rule.
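
The peak-end rule lends itself to a small worked example. In the sketch below, the per-minute pain ratings are made up, patterned on Kahneman’s cold-hand experiment; remembered pain is approximated as the average of the worst moment and the last moment:

    # Peak-end sketch: memory tracks the average of the worst and final
    # moments, not total pain. The ratings below are invented for illustration.
    def remembered_pain(ratings: list[float]) -> float:
        """Peak-end approximation of how an episode will be remembered."""
        return (max(ratings) + ratings[-1]) / 2

    short_trial = [6, 7, 8]        # ends at its worst moment
    long_trial = [6, 7, 8, 5, 3]   # same minutes plus a milder ending

    print(sum(short_trial), remembered_pain(short_trial))  # total 21, remembered 8.0
    print(sum(long_trial), remembered_pain(long_trial))    # total 29, remembered 5.5

Although the long trial contains strictly more pain, its peak-end memory is milder, which is why subjects choose to repeat the objectively worse option.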

Kahneman also analyzes the impact of “emotional framing.” He illustrates the point with one of his typical thought experiments. Ask yourself:

Would you accept a gamble that offers a 10% chance to win $95 and a 90% chance to lose $5?

Would you pay $5 to participate in a lottery that offers a 10% chance to win $100 and a 90% chance to win nothing?

Chances are you, like most people, preferred the second option. But the two options are identical. In each you 

must decide whether to accept an uncertain prospect that will leave you either richer by $95 or poorer by $5. Someone whose preferences are reality bound would give the same answer to both questions, but such individuals are rare…. 

A bad outcome is much more acceptable if it is framed as the cost of a lottery ticket that did not win than if it is simply described as losing a gamble. We should not be surprised: losses evoke stronger negative feelings than costs.

This example demonstrates that the way issues are framed can undermine rationality. “People's emotional evaluations of outcomes…play a central role in guiding decision making.”
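
A quick expected-value calculation confirms that the two framings describe the same prospect:

    # Both framings leave you richer by $95 (probability 0.10)
    # or poorer by $5 (probability 0.90).
    gamble = 0.10 * 95 + 0.90 * (-5)             # accept the gamble
    lottery = 0.10 * (100 - 5) + 0.90 * (0 - 5)  # pay $5 for the ticket
    print(gamble, lottery)  # 5.0 5.0: identical prospects, different feelings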

Kahneman says, “I am not generally optimistic about the potential for personal control of biases.” But with effort, some control is possible. We can improve our ability “to recognize situations in which errors are likely.” Unfortunately, however, “the voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision.”

With regard to intuition, he writes:

Significant effort is required to find the relevant reference category, estimate the baseline prediction, and evaluate the quality of the evidence. The effort is justified only when the stakes are high and when you are particularly keen not to make mistakes. Furthermore, you should know that correcting your intuitions may complicate your life.
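
Kahneman’s recipe for taming an intuitive prediction can be stated in one line: start at the baseline (the reference-category average) and move toward the intuition only in proportion to how well the evidence actually predicts the outcome. A minimal sketch, with made-up numbers:

    # Sketch of correcting an intuitive prediction: regress it toward the
    # baseline in proportion to the evidence's predictive validity (0 to 1).
    def corrected_prediction(baseline: float, intuition: float,
                             correlation: float) -> float:
        return baseline + correlation * (intuition - baseline)

    # Made-up example: the class's average GPA is 3.0, a glowing interview
    # suggests 3.8, but interviews predict GPA with a correlation of only 0.3.
    print(corrected_prediction(baseline=3.0, intuition=3.8, correlation=0.3))
    # 3.24: much closer to the baseline than the overconfident intuition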

To contribute to progress on this front, Kahneman writes:

My aim [is to] improve the ability to identify and understand errors of judgment and choice, in others and eventually in ourselves, by providing a richer and more precise language to discuss them. In at least some cases, an accurate diagnosis may suggest an intervention to limit the damage that bad judgments and choices often cause…. Much like medicine, identification of judgment errors is a diagnostic task, which requires a precise vocabulary. 

These corrections are facilitated if, when considering others’ reactions, “we don't focus exclusively on the average. We should consider the entire range of normal reactions.”

We can remember that “the common admonition to ‘act calm and kind regardless of how you feel’ is very good advice: you're likely to be rewarded by actually feeling calm and kind.”

We can ask ourselves difficult questions about whether “to cut losses when doing so would admit failure.” That can be hard because

we are biased against actions that could lead to regret, and we draw an illusory but sharp distinction between omission and commission, not doing and doing, because the sense of responsibility is greater for one than for the other. The ultimate currency that rewards or punishes is often emotional…. The sunk-cost fallacy keeps people for too long in poor jobs, unhappy marriages, and unpromising research projects….

Regret is an emotion, and it is also a punishment that we administer to ourselves. The fear of regret is a factor in many of the decisions that people make…. People expect to have stronger emotional reactions (including regret) to an outcome that is produced by an action than to the same outcome when it is produced by inaction…. 

We spend much of our day anticipating, and trying to avoid, the emotional pains we inflict on ourselves. How seriously should we take these intangible outcomes?...  Is it reasonable, in particular, to let your choices be influenced by the anticipation of regret?... You should not put too much weight on regret; even if you have some, it will hurt less than what you now think.

Concerning public policy, Kahneman suggests:

Widespread fears, even if they are unreasonable, should not be ignored by policymakers. Rational or not, fear is painful and debilitating, and policymakers must endeavor to protect the public from fear, not only from real dangers….

[The book,] Nudge, which quickly became an international bestseller, is the Bible of behavioral economics. [People] also need protection from others who deliberately exploit their weaknesses — and especially the quirks of System 1 and the laziness of System 2. The flagship example of behavioral policy, called Save More Tomorrow, was sponsored in Congress by an unusual coalition that included extreme conservatives as well as liberals. [An employee might, for example, allocate 50% of any future salary increases to their pension plan.]... [Also,] the best single predictor of whether or not people will donate their organs is the [use of a] default option that will be adopted without having to check a box.

Concerning our cultural environment, Kahneman points out:

Living in a culture that surrounds us with reminders of money may shape our behavior and our attitudes in ways that we do not know about and of which we may not be proud….

Stories of how businesses rise and fall strike a chord with readers by offering what the human mind needs: a simple message of triumph and failure that identifies clear causes and ignores the determinative power of luck and the inevitability of regression [to the mean]….

Except for the very poor, for whom income coincides with survival, the main motivators of money seeking are not necessarily economic…. Money is a proxy for points on a scale of self-regard and achievement. These rewards and punishments, promises and threats, are all in our heads. We carefully keep score of them…. 

We can counter these pressures by affirming fairness.

A basic rule of fairness, we found, is that the exploitation of market power to impose losses on others is unacceptable…. They showed indignation only when a firm exploited its power to break informal contracts with workers or customers, and to impose a loss on others in order to increase its profit…. The replacement worker has no entitlement to the previous worker’s reference wage, and the employer is therefore allowed to reduce pay without the risk of being branded unfair.

One lesson that can be derived from Kahneman’s book is that strategies for nurturing social change are most effective if they use statements that are concrete and based on “richer and more fluent,” rather than abstract, language, because “fluency, vividness, and the ease of imagining contribute to decision weights.” Simple messages of triumph and failure are also powerful.

Fortunately, “organizations are better than individuals when it comes to avoiding [cognitive] errors.”

They naturally think more slowly and have the power to impose orderly procedures…. Organizations can also encourage a culture in which people watch out for one another as they approach minefields… The operative concept is routine…. 

There is much to be done to improve decision-making. One example out of many is the remarkable absence of systematic training for the essential skills of conducting efficient meetings.

A particular method developed by Gary Klein is the premortem:

The procedure is simple: when the organization has almost come to an important decision but has not formally committed itself, Klein proposes gathering for a brief session a group of individuals who are knowledgeable about the decision. The premise of the session is a certain short speech: “Imagine that we are a year into the future. We implemented the plan as it now exists. The outcome was a disaster. Please take 5 to 10 minutes to write a brief history of that disaster.”

If the leaders of an organization really listen to their members or subordinates, they can learn a great deal.

There is a direct link from more precise gossip at the water cooler to better decisions. Decision-makers are sometimes better able to imagine the voices of present gossipers and future critics than to hear the hesitant voice of their own doubts. They will make better choices when they trust their critics to be sophisticated and fair, and when they expect their decision to be judged by how it was made, not only by how it turned out.

The findings documented by Kahneman and his associates are mind-boggling. The human mind has a dazzling bag of tricks at its disposal, which can either be useful or interfere with seeing reality for what it is. Appreciating these complexities induces humility.

Those of us who want to improve the world and cultivate compassion would be well advised to tame our gut reactions, engage in relentless self-examination, admit mistakes, resolve not to repeat them, and pick personal and political battles wisely. Humans are not selfish machines that merely pursue their self-interest in zero-sum competitions. Rather, down deep, we are motivated by a sense of fairness and a commitment to family, tribe, nation, and all humanity — as well as self-interest.