Michael Lewis’ The Undoing Project and History as Choice:
Taking Control of Your Mind through Listening to Others.
If historians can explain the past as logical,
and the present as inevitable, why can’t they predict the future?
The Undoing Project and Intellectual Bias: Conventional Wisdom Undone.
“Conventional wisdom [consists of] ideas which are esteemed at any time for their acceptability.” (emphasis added) John Kenneth Galbraith.
The Undoing Project: A Friendship That Changed Our Minds, by Michael Lewis, is the story of two psychologists, Amos Tversky and Daniel Kahneman, who wrought a Copernican revolution in the way we think about thinking. Their experiments and findings on how people evaluate evidence, reach conclusions, and make decisions forced experts in a wide variety of fields to change their assumptions and methods of doing business. The effect on the field of economics was so great that Kahneman was awarded a Nobel Prize in economics in 2002, despite having no formal training in economics. He later summarized his and Tversky’s findings in the book Thinking, Fast and Slow. This essay applies their findings to the study of history, and suggests that they support an approach that makes history a study of people making choices, as opposed to the conventional textbook approach that makes history into a chain of causation.
The conventional wisdom among both scholars and policymakers during the post-World War II era was that people are basically rational thinkers, subject to distortions in their thinking and cognitive disorders that can result from the influence and interference of their emotions. That is, people are rational unless they are led astray by their emotions, and the key to rational thinking is, therefore, to control one’s emotions. The conclusion was that if one controls one’s emotions, one can rely on one’s reasoning abilities. Policymakers can, in turn, either assume that people will control their emotions and think rationally, or can factor into their policies a variable that compensates for people’s emotions. This assumption of rationality was especially influential in the field of economics. Mainstream economics was constructed around the so-called economic man who ostensibly made decisions based solely on rational cost-benefit analyses.
Tversky and Kahneman upended this conventional wisdom. They conclusively demonstrated that humans are programmed intellectually with a host of thinking biases and shortcuts that short-circuit rational decision-making. We are hardwired to respond instinctively and immediately to problematic situations. In the evolutionary scheme of things, these biases and shortcuts may have been useful when our ancestors were reptiles and, later, mini-mammals who had to make instantaneous life-and-death decisions in the primordial swamps and pampas. But these instinctive reactions are not consistent with logical reasoning or cost-benefit analyses, and they can lead us humans to wrongheaded and harmful conclusions in our civilized societies.
Contrary to conventional wisdom, emotional distortion is not the source of these thinking disorders. These are not emotional biases or biases that stem from emotions, although they may at times be connected to emotion. They are purely intellectual biases that operate with or without emotions. We instinctively resort to them, usually without knowing it, and usually without being able on our own to avoid it. We think we are having a brilliant intuition, but it is really an instinctive bias, and probably wrong. Left to our own devices, we will almost invariably fall into these biases. For better and mainly for worse, they are part and parcel of the way we think. They are our inherited conventional wisdom. And although they affect our thinking about almost everything, they especially affect the way we think about the past, that is, our memories and our history.
The moral of the story of The Undoing Project is that we must find ways to pull back from many of our intuitive reactions. We must find ways of forestalling our instinctive fast thinking, and force ourselves to engage more frequently in reflective slow thinking. We must undo our first thoughts to arrive at better second thoughts. Kahneman’s Thinking, Fast and Slow exemplifies this message. Kahneman is a good writer, and the book is genially written. But it is long and repetitive. This is because Kahneman is very generous in attributing the origins and the sources of his findings to predecessors and colleagues. He also meticulously describes the history of their various research projects, the ways in which they discovered and undid the biases they brought to their own research, and the processes of reflection through which they came to their conclusions.
The premise of this essay is that most historians bring to their work the intellectual biases described by Kahneman and Tversky. A consequence of this is that the conventional wisdom about history is often unhistorical and not very useful. The argument of this essay is that approaching history as people making choices is a way to undo the intellectual biases we bring to the study of the past. We can, thereby, achieve a more rational and useful history.
The Revenge of the Reptiles: Prehistoric Thoughts in our Brain Stems.
“We have met the enemy, and he is us.” Pogo
The human brain is the product of eons of evolutionary development. Because it assembled itself in stages, our present-day brains incorporate capabilities, traits, and parts that emerged at different points in evolutionary history. For better and for worse, most of the older parts are still intact inside our heads, and these older parts sometimes cooperate and sometimes conflict with the newer ones. The oldest operating part is the brain stem, the core of which we inherited from our reptilian forefathers. This reptilian core operates largely on a “fright, then fight or flight” basis, an unthinking, instinctive reaction to danger that was a successful strategy for our relatively small forebears, who had to survive among much larger and more voracious carnivores.
The brain stem is also the repository of most of the intellectual biases that Tversky and Kahneman describe. These biases were developed in our humanoid progenitors, who had a greater ability to think than our reptilian ancestors, but still needed to think quickly. They combined the “fright, then fight or flight” reflex, which was already programmed into their brain stems, with intellectual shortcuts, which also became hardwired. The combination helped them to survive and thrive among their slower-thinking competitors. Our human ancestors later inherited both the reptilian reflex and the humanoid shortcuts, which are experienced by us as forms of intuition. But what worked for our humanoid predecessors does not always work for us humans.
Reflecting the more complex world in which they lived, our human ancestors developed the intellectual ability to reflect on problems, rather than merely react to them. This ability resides in our cerebral cortex. It has historically been the pride of the human race, and our excuse for lording it over other creatures. The embarrassing fact that Tversky and Kahneman uncovered is that we all too rarely take advantage of our higher intellectual capabilities, and persist in reacting to problems like reptiles and humanoids when we should be reflecting on them like humans.
Reflective thinking is hard, and it takes considerable time and effort to mobilize the cerebral cortex to think deeply about things. It is much easier and quicker to just react. It is also the case that it would be impossible for us to get much done if we tried to reflectively think about everything we do. So, we don’t. The problem is that we are not very good at distinguishing between decisions that we can safely make instinctively, and decisions that we need to think through more thoroughly, and think about with the help of others. Making that distinction itself requires reflective thinking. So, we are often caught in a vicious circle of thinking too quickly.
The goal of Tversky and Kahneman’s “undoing project,” and of Kahneman’s admonition that we should think more slowly, is essentially to substitute, on important matters, reflective judgments derived in the cerebral cortex for instinctive reactions emerging from the brain stem.
Instinctive Biases: What We Don’t Know Can Hurt Us.
“I think unconscious bias is one of the hardest things to get at.” Justice Ruth Bader Ginsburg.
In his book Thinking, Fast and Slow, Kahneman describes four main biases that distort our thinking, and that are particularly relevant to the study of history. They are the “aversion bias,” the “planning fallacy,” the “outcome bias,” and the “availability bias.” The aversion bias and the planning fallacy distort the ways in which we process information. The outcome bias and the availability bias distort our access to information and to our memories.
1. The Aversion Bias. Probably the most persistent and powerful bias with which we are plagued is what Tversky and Kahneman call “loss aversion,” here termed the “aversion bias,” or what I think could be called a “sky is falling” reaction to adversity. We humans are programmed to react quickly and drastically to potential adversity. It is what helped our hapless ancestors to survive among bigger, stronger, and faster adversaries. The aversion bias, however, leads people to overweigh and overreact to small possibilities of loss, so that our “worry [about a threat] is not proportional to the possibility of the threat.” We are instinctive worrywarts, and that can be worrisome.
When faced with almost any adversity, people tend to react as though the sky is falling. Unless forestalled by others’ better judgments or by their own reflective second thoughts, people will frequently make short-sighted panicky decisions based on little evidence. Making quick judgments based on little evidence was not an unreasonable operating procedure for our pre-human ancestors. The threats to them tended to be direct and simple, and a successful reaction to those threats could also be direct and simple. They had to decide quickly whether to fight or flee, and they had to do it fast before their adversaries got in a first and fatal blow. This do-or-die reaction is the core of the aversion bias. It is a reaction that was seemingly helpful to pre-humans, but is often unhelpful in the more complex world in which we humans live.
The aversion bias takes two main forms that are logically inconsistent, but that make sense together as sky-is-falling reactions to adversity. In the first form, when people are faced with a choice between keeping a tolerable status quo, or opting for a change that will most likely make things better but might make them slightly worse, most people will choose to stay with the status quo. Reflecting the conventional wisdom that a bird in the hand is worth two in the bush, people are generally unwilling to risk upsetting a tolerable status quo, even when the probabilities of a successful change are great, and a cost-benefit analysis clearly favors the proposed change. We are, Kahneman claims, innately conservative creatures, and this interferes with rational thinking.
Any loss is unacceptable to most people most of the time, and is seen by them as a sky-is-falling outcome. A consequence of this inherent conservatism is that many people remain in situations in which they are unhappy, while forgoing favorable opportunities to be happier. The aversion bias has social and political ramifications as well. Conservative politicians routinely appeal to the aversion bias as part of their campaigns. Be afraid of change, they preach. The aversion bias also has historical implications. Did, for example, American Tories fall prey to the aversion bias when they refused to join in criticizing the British government during the 1770’s, and did their aversion to any change help incite radicals to make a revolutionary change?
In the second form of the aversion bias, when people are given the choice between either accepting a manageable loss, or risking a disaster on the small chance of avoiding any loss, most people go for broke and risk everything to avoid what would have been a manageable loss. They do this even when the odds and a cost-benefit analysis favor going with the manageable loss. This willingness to act radically to avoid small losses, even at the risk of suffering disastrously large losses, seems inconsistent with the first form of the aversion bias, in which people act conservatively to avoid small losses even at the expense of forgoing gains. But it isn’t. The common core of both forms of the aversion bias is that people see any loss as a sky-is-falling result and, in turn, are generally unable to distinguish between a small loss and a disastrous loss.
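The asymmetry that both forms share, though Lewis does not put it mathematically, has a formal counterpart in Tversky and Kahneman’s prospect theory. As a rough sketch (the parameter estimates below come from their later published work, not from The Undoing Project), outcomes are evaluated not by their objective value x but by a value function that is steeper for losses than for gains:

```latex
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \quad \text{(a gain)} \\[4pt]
-\lambda\,(-x)^{\alpha}, & x < 0 \quad \text{(a loss)}
\end{cases}
\qquad \alpha \approx 0.88, \quad \lambda \approx 2.25
```

Because the loss-aversion coefficient λ is roughly two, a loss of a given size weighs about twice as much as an equal gain, which fits Kahneman’s summary of the bias. And because α is less than one, the curve flattens as losses grow, one way of expressing the finding that people have trouble distinguishing a manageable loss from a disastrous one.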
Radical politicians of both the revolutionary Left and the fascistic Right have made appeals to this go-for-broke form of the aversion bias a standard part of their operating procedures. In American history, for example, did the Sons of Liberty in the 1770’s fall prey to the aversion bias, or prey upon the aversion bias, when they vehemently rejected British proposals to increase taxation? The increases were small and the British intended to spend the taxes on protecting the American colonies from foreign attacks. The Sons of Liberty claimed, however, that accepting any taxes, no matter how small, would lead to total oppression. Was this a reasonable reaction?
Appeals to both forms of the aversion bias could be seen in the Trump campaign during the 2016 presidential election, particularly in the ways Trump denied the evidence on global warming, gun control, and immigration, and played on people’s fears of change. With respect to global warming, the evidence is overwhelming that human activities are the main cause of rising temperatures and erratic weather patterns. There is also a consensus among environmental scientists that global warming will adversely affect human life. And there is a consensus among economists that ameliorating global warming by going green would generate jobs and economic growth, which would greatly benefit the public as well as the environment.
Nonetheless, Trump, along with the oil and coal billionaires who support him, and those who speak and act on their behalf, such as the new EPA Administrator Scott Pruitt, have been able to generate widespread fear that if the government acts on global warming, people might have to give up their SUV’s and pickup trucks. This is seemingly an instance of people choosing to forgo a significant benefit out of fear of a small loss, the first form of the aversion bias.
Likewise, with respect to gun control, the evidence is overwhelming that we are safer both individually and as a society without guns in the hands of private individuals. If you own a gun, there is virtually no chance that you will ever use it to thwart an attack on yourself or someone else. In fact, if you own a gun, your chances of being shot increase several-fold, and it will most likely be with your own gun. Significant measures of gun control would be in the best interests of almost everyone. Nonetheless, Trump, along with a handful of gun fanatics, and the fascists who run the NRA, such as Wayne LaPierre, have stirred up fears that gun control will make people less safe because people won’t be able to defend themselves with their own guns. Again, this is an example of people forgoing a significant benefit out of fear of a small potential loss.
Trump’s rantings against immigrants and immigration, fed by racists such as his top advisor Steve Bannon, exemplify go-for-broke politics, the second form of the aversion bias. The demographic facts are that the United States is going to have a population within the next twenty-five years in which over fifty percent of the people will be minorities. The historical facts are that immigrants have always been, and still are, the backbone of economic and cultural advancement in our country. Trump, nonetheless, won the election in large part by stirring up fears among white European-Americans that they might lose their top dog status in our society, and might have to share prestige and power with other ethnic groups. Instead of recognizing the possibility of this minor loss of status as a small price to pay for positive social change, Trump and his supporters have chosen to wage an all-out pejorative campaign against immigrants, minorities, and foreigners. It is a reckless policy that portends potential disaster for the country.
2. The Planning Fallacy. The aversion bias is a powerful motivating force. Fear of loss will usually trump hope of gain because, as Kahneman claims, “Losses are weighted about twice as much as gains” in our instinctive thinking. But the aversion bias is not all-powerful. Hope can sometimes triumph, and hope is our only hope in defeating fear mongers who would rule us through our aversion bias. Hope, however, also has its pitfalls. When hope becomes optimism, overweening optimism can lead us astray. The problem is that optimists almost inevitably fall prey to what Kahneman calls the “planning fallacy,” or what I think could be diagnosed as a “narcissistic intellectual disorder.”
If the aversion bias leads us to be overly pessimistic about what is happening to us, the planning fallacy leads us to be overly optimistic about what we are doing about it. When we decide to do something, whether it be to stick with the status quo, go for broke, or do otherwise, we almost inevitably overestimate the likelihood for success of the things we plan to do. We become enamored of our plans, and overconfidence often leads to the failure of our enterprise.
Narcissism plays a big part in this mistake because when we decide to do things, we tend to focus solely on our own abilities, our own actions, and how well we have prepared to do them. We fail to pay sufficient attention to what others are doing, or what bad luck could befall us, that might foil our plans. We focus on what we are putting into the project, but fail to focus on the context in which we are operating. Optimism is one of the main reasons that entrepreneurs start so many new small businesses every year, but overweening optimism is one of the main reasons that some eighty percent of them fail within the first year and a half.
The planning fallacy and the narcissism bias produce miscalculations on the part of political actors as well as businessmen, and reckless and regrettable behavior can be the result. Did the American revolutionaries, for example, fall prey to the planning fallacy when they started a war with Britain which they expected to win quickly and easily, but which went on for over seven bloody years? Is it an instance of the narcissism bias, taken to a seemingly pathological extreme, when despite four bankruptcies and three marriages, among other bungled enterprises, Donald Trump claims to be able to do anything, to have been successful at everything, and is unable to acknowledge any sort of mistake or failure?
3. Memory Tricks: The Availability Bias. Biases affect not only the way we process information, but the way we store information. Our memories are the storehouses of the information we use to reach conclusions and make decisions. But our memories play deceptive tricks on us. One of these is what Kahneman calls the “availability bias,” which could be described as a “last in, first out” mindset. People give more weight to recent events than they reasonably should, especially if the events are dramatic. The last thing we have experienced becomes the first thing we think about when evaluating a situation and reaching a decision. The evidence that is most readily available, that is, the last evidence to be stored in our memories, is the first and most influential evidence that we consider, even if it is not the best evidence.
People are also short-sighted. They tend to see things within a narrow and short-term frame of reference. They give too much weight to small pieces of anecdotal evidence, and too little consideration to the big picture and the long-term. “We are by nature narrow framers,” Kahneman claims. People also tend to be enchanted by melodramatic stories, and turned off by statistics and abstract arguments. It is easier to access and process small pieces of simple information than to retrieve and reflect on complex conglomerations of evidence. As a result, we often fail to put events into a big picture or see them in long-run terms. We give too much weight to either bad news or good news, and tend to overreact either pessimistically or optimistically to situations because we fail to consider the weight of all the best evidence.
The availability bias has historical and political implications. Did, for example, the American revolutionaries overreact to the actions of King George III based on narrowly framing what he was doing? The revolutionaries claimed that because the King was working actively with Parliament, he was trying to become a dictator when, in fact, the King and Parliament were working toward the parliamentary government that still prevails in England today. Was Donald Trump also guilty of narrow framing in the recent election when he harped on a few isolated stories of harm caused by immigrants, while failing to acknowledge the bigger picture of the good things immigrants have contributed and continue to contribute to our country?
4. The Outcome Bias. In addition to the availability bias, our memories are also subject to what Kahneman calls an “outcome bias,” which could be characterized as an “all’s well that ends well mindset” and a “winners get to write the history syndrome.” Kahneman reports that if the outcome of a decision is good, people do not generally care how the result was achieved. And they generally remember the process of having decided to do the thing, and the way the thing was done, as having been good, even if that wasn’t the case.
The outcome bias can lead to dangerously false conclusions about a person’s perspicacity. “A few lucky gambles,” Kahneman claims, “can crown a reckless leader with a halo of prescience and boldness.” Winners get to write the history, even if it is wrong. This bias can also lead to dangerously false conclusions about the successfulness of aggressive ways of acting. We tend, for example, to forget the death and destruction of a war if our side won. We almost completely ignore questions of whether the war was necessary, let alone worth it. And we avoid questions of whether the same or better results could have been achieved without the war, and without the death and destruction. In our memories, all’s well that ends well, even if it really wasn’t.
All’s well that ends well is the conventional wisdom promoted in most history textbooks. This approach is especially the case with American history, which is conventionally portrayed as an inexorable march of progress, freedom, and goodness. As applied to the American Revolution, for example, since the Revolutionaries won, and the country grew bigger and better thereafter, the Revolution must have been a good thing. All you need to know, in the conventional view, is that we have made it to the present, and the present is pleasant. So why question whether there were better alternatives to the ways in which we got here? This conventional wisdom is rebutted by an approach to history as people making choices. In that approach, questions need to be asked about whether there were alternatives to the way we got here, and voices other than those of the winners need to be heard, if we are to learn from the past and prepare for the future.
Donald Trump’s reaction to the recent presidential election is an example of the outcome bias, and why we need to listen to multiple voices. Trump is outraged that people might be concerned with how he won the election, and whether his campaign colluded with Russian intelligence agents to undermine his opponent. As far as he is concerned, the election is over. He won. And the winners get to write the history. Trump thinks people should remember the election as one in which his qualifications and strategies prevailed over his opponent’s, and that all is well that has ended well. That is how most people usually remember things and that, Trump insists, is how people should think about his election. But maybe not this time.
Undoing the Past and Learning from the Losers: Reflection as Collective Thinking.
“The greatest of faults is to be aware of none.” Thomas Carlyle.
Tversky and Kahneman are skeptical, but not pessimistic, about the possibilities that people can think rationally and make reasonable decisions. The best way to approach almost any problem, they contend, is with other people, so that you can identify and critique each other’s biases. Even if you share the same biases with your colleagues, it is easier to recognize and reject biases in the thinking of others than in your own thinking. So, we can give each other a lift. “Organizations,” Kahneman claims, “are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures” on how conclusions are reached and decisions are made. In his description in The Undoing Project of the ways in which Tversky and Kahneman worked together and critiqued each other, Michael Lewis essentially conveys the underlying message of their findings: that we need critical input from others, and cooperative effort with them, to overcome the biases that are built into our thinking processes.
Kahneman goes on to suggest that if you cannot work with others, and are faced with a problem on your own, you should note your first intuition or instinct as to how to solve the problem, and then reject it. You should then force yourself to reflect on the problem and on your first response to it, explore alternative options for a solution, and select the one that seems to be based on the best evidence and arguments. In so doing, you should listen to the voices in your head of people whose judgments you generally respect and trust, and you should subject your memories and your ideas to vicarious critique by those significant others.
Reflection is primarily a process of listening to competing voices in our heads. It is a symposium of influential books we have read, convincing speakers we have heard, and significant people whose points of view we have incorporated into our internal dialogues. Through listening to these voices, we can conjure up memories we might otherwise miss, and consider arguments we might otherwise ignore. Reflection can, thereby, help keep us from instinctively reaching wrongheaded conclusions. Once we have reflected on a problem, and given ourselves the best chance of thinking rationally, we should decide on how to deal with it. This is the method that I promote in approaching history as people making choices.
Approaching history as a study of people making choices treats the subject as a collective enterprise in which people from the past and the present participate. The method can help us see whether the people we are studying fell prey to the intellectual biases identified by Tversky and Kahneman, and whether and to what extent we ourselves are prone to those biases in studying the history of those people. Tversky and Kahneman teach us that the subjects of our historical studies probably should not have followed their first instincts, and may have been mistaken in their decisions if they did. The same goes for us as students of history.
Approaching history in this way requires us to examine our historical subjects’ reactions to the problems they faced, see if they made reflective decisions or instinctively reacted, and speculate on whether they could have made better decisions than the ones they made. In studying the American Revolution, for example, this means looking at what was said by those Americans who opposed the Revolution, who were the losers in the debate over whether to revolt, and to decide in retrospect who had the better of the argument.
Reconsidering past decisions in this way is sometimes disparaged as twenty-twenty hindsight, and a cheap shot at the past. But that is neither fair nor accurate. We humans are inveterate second-guessers, and we routinely evaluate our own decisions, so why not similarly evaluate the decisions of our predecessors? Whether we are businesspeople having made a deal, soldiers having fought a battle, or Little League baseball managers having called for a squeeze play, we invariably look back at what we did, revisit the alternatives we had, and speculate on what might have happened if we had made a different choice. And this is not a waste of time and effort. Evaluating our past decisions helps prepare us for our next decision. It is the same with history.
The point of historically studying people’s choices is to understand why and how they got things right and got things wrong. It is not to condemn or demean them. The goal is to learn from their successes and mistakes, just as we try to learn from our own successes and mistakes. It is not, for example, a condemnation of the Founding Fathers if we were to conclude that the American Revolution was a mistake, and that things might have been better if Americans had achieved independence gradually and peacefully, as did Britain’s other English-speaking colonies.
It is, in turn, no disloyalty on our part to the Founders or to the United States that we want to try to get a past decision right retrospectively as an aid to getting our next decisions right prospectively. It is, I would contend, a patriotic act. While moral turpitude may attach to an ill-intentioned decision by an ill-meaning person, there is no moral turpitude attached to a well-intentioned and well-meaning decision that turns out to be a mistake because of an unwitting bias. That is one of the differences between Donald Trump and George Washington.
The Method of Approaching History as People Making Choices.
“There are always choices…Our responsibility as historians is as much to show that there were paths not taken as it is to explain the ones that were.” (emphasis in original) John Lewis Gaddis.
The method of approaching history as people making choices can be outlined in six main steps. First, we must decide what historical event we want to study. History is virtually infinite in scope, and there are an almost infinite number of events we could choose to study. So, we need to make a choice and, in order to avoid falling prey to an availability bias, we need to reflect on the reasons we are choosing to study a particular event.
Whether we are aware of it or not, we invariably study problems in the past, and ask questions about them, that relate to issues in which we are interested in the present. It is almost inevitable that an event we choose to study is somehow related to a current social issue. That connection is not a problem so long as we are aware of it, and do not let our predilections toward the current issue lead us to predetermine our response to the historical problem. The purpose of studying history is to let our conclusions about past events help inform our judgments of present issues. That educational purpose is foiled if we merely judge past events based on our current biases.
Social issues change, and so do the list of historical events in which we are interested and the questions we ask about those events. This is the main reason history books are continually being rewritten, and the histories of particular subjects revised. New history books are rarely a result of significant new evidence but are, instead, usually the result of changing interests. During the 1940’s, for example, social conformity and political apathy were issues of concern. Historians, in turn, looked at the American Revolution as a case study of how masses of people might be motivated to act. During the turbulent 1960’s, historians looked at the Revolution as a case study of how masses of people might be directed toward constructive ends. Historians today, in the wake of the recent election, may be choosing to focus on the Revolution as a study of ways and means of countering a potentially tyrannical ruler.
Having chosen the subject of study, the second step is to delineate the plausible options that people had in deciding what to do about the problem they were facing. We need to resurrect and understand the arguments that different groups of people made in support of various options. Since history is generally written by and on behalf of the winners, we will likely need to recover and listen to some lost voices in this process. What, for example, were the arguments of the Tories during the American Revolution? How do their arguments look in retrospect?
As the third step, we need to examine why and how the winning argument prevailed, and a choice was made. How did the winners win, and what happened to the losers in the debate? What happened, for example, to those who initially opposed the Revolution? Why and how did some opponents come to support it, while others did not? And what ultimately became of these people?
In the fourth step, we need to examine the consequences of the choice that prevailed. How did the prevailing choice affect people then and afterwards, and how does it affect us now? Since conventional history is generally written as all’s well that ends well, we need to distinguish between current circumstances that are a consequence of that choice, and things that might have come to pass even without that choice.
For example, the United States is today a relatively prosperous and free country. Conventional histories generally attribute our current circumstances to our having undertaken and won the American Revolution. But is that so? The other English-speaking former British colonies, i.e., Canada, Australia, and New Zealand, gained their independence gradually and generally peacefully during the nineteenth century. They did not suffer the death and destruction of a violent revolution. And they are today at least as prosperous and free as the United States. Comparing their histories to ours raises the questions of whether the success of the United States can be attributed to the Revolution, and whether we could not have done as well or better without it.
This leads to the fifth step, which is that we need to speculate as to what might have happened if a different choice had been made. This is the second-guessing part of the project, to which many historians object, but which I think is crucial to getting beyond an outcome bias that all is well that ends well.
As the sixth step, we should apply what we have concluded about the historical event to the present-day issue that led us to study that event in the first place. The premise of studying history as people making choices is that things might have been different, for better or for worse, if different choices had been made, and that exploring those past possibilities might help us make better choices in the present. Maybe the losers in the debate over the American Revolution were right. Or maybe they weren’t. It is enlightening to consider the possibilities.
Saving History from the Post Hoc Fallacy: Choice versus Causation.
“The supposition that the future resembles the past is not founded on any arguments, but is derived entirely from habit.” David Hume.
History is story. Like other stories, history starts with a “Once upon a time” scenario. This starting scenario is a dynamic situation from which a narrative unfolds, and from which events pass in time from “Once upon a time.” But time can take on very different meanings depending on the narrative form of a story, and whether events are portrayed as flowing randomly as a function of chance, predetermined as a result of causation, or determined freely as a consequence of people’s choices. History can take the form of chance, causation, or choice.
Chance is luck, something that happens unpredictably without discernible human intention or observable cause, so that history as chance is a story of happenstance that people can neither predict nor control. History as chance is seemingly arbitrary and unfathomable. And dangerous. It is the world of small children baffled and intimidated by adults, and by the host of things they do not understand and cannot control, many of which might hurt them. It is also the world of our reptilian ancestors, and a realm in which the instinctive “fright, then fight or flight” response of our prehistoric brain stems would seem appropriate.
If history is the result of chance, there is little reason to study it, and little to be learned from studying it, other than the worldly wisdom of stoic resignation. History as chance is a rationale for an aversion bias. If history is chance, then aversion would seem to be the proper response to any potential change in a tolerable status quo, no matter what the promised benefits of the change. For in a world dominated by chance, who knows what might come next? Better the devil you know than the one you don’t, as the stoic saying goes.
Unlike chance, causation is inexorable, with consequences flowing inevitably from circumstances, so that history approached as causation appears to be the product of forces and factors that control events behind our backs and despite our intentions. Causation is the form in which most conventional history is presented. Most of us remember, for example, having to memorize in school the six or eight or ten so-called “causes” of the American Revolution, the American Civil War, and other important historical events. In this approach, history is portrayed as a chain of causes and effects that we can understand but cannot control.
However, if history is a chain of causation in which one thing follows logically from another, then the future ought to be predictable from the past, and the study of history ought to make us fortune tellers. But, it doesn’t. Causation history exemplifies the outcome and availability biases described by Tversky and Kahneman, and it is an instance of the logical fallacy known as “post hoc ergo propter hoc.” In approaching history as causation, one assumes that because something came after something else, the first thing must have caused the second thing. But, this is not logical, or even empirical. And it leaves us with nothing to do but contemplate our navels as we watch events unfold.
History as causation takes the current state of the world, and then outlines the stream of events that led to it. It delineates the events in a chain of causes and effects that can look like an inevitable path from the past to the present. But, it leaves out all the paths not taken, all the plausible options not chosen, and all the real-life contingencies faced by people in the past and by us today. It is an abstraction that is neither interesting nor useful. Like history as chance, history as causation can serve as a rationale for quietism and political passivity.
Unlike chance and causation, choice is deliberate, so that history as choice is a story of people making decisions in the face of circumstances they may not be able entirely to predict or control, but with the belief that they can freely choose among plausible options and reasonably predict what might be the consequences of their actions. History as people making choices is realistic and seemingly reasonable. It is the way we experience life, as people debating and choosing among options within prescribed circumstances. If history is a matter of choices, then time is a medium of opportunity and not futility, and life is not merely a matter of waiting for arbitrary or inevitable things to happen. History as choice is a rationale for social and political activism.
Approaching history as people making choices allows us to relate consequences from the past to circumstances in the present without falling into an outcome bias. The method makes connections between the past and the present debatable rather than inevitable. The same events that are conventionally presented as a chain of causes and effects can be reconceived as a series of circumstances, choices, and consequences. This is a narrative distinction that makes a big difference in the meaning and moral of a history. With respect to the American Revolution, instead of seeing the Revolution as an inevitable result of causation, we can approach it as a series of debates about who should govern, and how government should operate. These debates, in turn, helped form subsequent debates about government and democracy that have permeated American history from then to now. That is a much more useful history.
In sum, approaching history as people making choices is a method of studying how and why people think the way they do, and make the choices that they do. Historical events are approached essentially the way those events were approached by the people who experienced them, and the way we approach situations in our own lives, as contingencies that could go different ways depending on the choices that are made. Past decisions are, in turn, related to problems and choices facing us in the present day. Studied in this way, history becomes an important life skill, and an education in avoiding the intellectual pitfalls described by Tversky and Kahneman.
Postscript: For Further Reading…
The purpose of this essay has been to introduce the findings of Tversky and Kahneman, and promote the method of approaching history as people making choices. Although conventional history textbooks do not reflect it, most of the best scholarly historians have either explicitly or implicitly approached history as people making choices. If you are interested in seeing how this method works, I have written a book that is based on my reading of some of the best historical works of the past fifty years, and that exemplifies the method. It is titled Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice (AuthorHouse, 2013).
The book describes ways of teaching American history as people making choices, and includes a thematic history of the United States that exemplifies the method. The book examines thirteen turning points in American history from the early 1600’s through the late 1900’s. It focuses on the decision-making processes of the people involved, uncovers many of their biases, explores debates among historians about those turning points, and challenges some of the historians’ conclusions. The book is intended as an encouragement for readers to explore historical events for themselves, debate their own and others’ ideas, and arrive at their own considered conclusions about history.
 Michael Lewis. The Undoing Project: A Friendship that Changed Our Minds. New York: W. W. Norton, 2016.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013.
 For a discussion of how one might teach history as people making choices, and a thematic history of the United States using that method, see my book Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice. Bloomington, IN: AuthorHouse, 2013.
 See the Walt Kelly cartoon at http://www.igopogo.com/we_have_met.htm (1953). After a long, arduous, and comic search for the source of the world’s problems, and the enemy that is plaguing us, Pogo Possum concludes that we are the source of our problems, and that we must start to think differently in order to resolve them.
 David Sloan Wilson. Evolution for Everyone. New York: Delacorte Press, 2007. p.285.
 Jared Diamond. The Third Chimpanzee: The Evolution and Future of the Human Animal. New York: Harper Perennial, 1993. pp.220-221.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.320-322, 329.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013. p.364.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.339, 341-342.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.12, 350.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.212-213.
 This is being written in early March, 2017 when the ways and means of the election are still a considerable source of controversy.
 Daniel Kahneman. Thinking Fast and Slow. New York: Farrar, Straus & Giroux, 2013. p.436.
 Ronald Dworkin. Justice for Hedgehogs. Cambridge, MA: Harvard University Press, 2010. p.127.
 Richard Hofstadter. The American Political Tradition. New York: Random House, 1948. pp.3-17.
 Staughton Lynd. Intellectual Origins of American Radicalism. New York: Random House, 1968.
 John Lewis Gaddis. The Landscape of History: How Historians Map the Past. New York: Oxford University Press, 2002. p.9.
 Isaiah Berlin. Historical Inevitability. London: Oxford University Press, 1954. pp.3, 20-21, 68.
 I have an extended discussion of this narrative distinction in my blog post “What to do about the Big Bad Wolf: Narrative Choices and the Moral of a Story.”