 The Undoing Project and History as Choice:

Reprogramming Your Intellect through Listening to Others.

Burton Weltman

If historians can explain the past as logical,

and the present as inevitable, why can’t they predict the future?

Anonymous.

The Undoing Project and Intellectual Bias: Conventional Wisdom Undone.

“Conventional wisdom [consists of] ideas which are esteemed at any time for their acceptability.” (emphasis added) John Kenneth Galbraith.

The Undoing Project: A Friendship that Changed Our Minds, by Michael Lewis[1], is the story of two psychologists, Amos Tversky and Daniel Kahneman, who wrought a Copernican revolution in the way we think about thinking.  Their experiments and findings on how people evaluate evidence, reach conclusions, and make decisions forced experts in a wide variety of fields to change their assumptions and methods of doing business.  The effect on the field of economics was so great that Kahneman was awarded a Nobel Prize in economics in 2002, despite having no special knowledge of economics.  He later summarized his and Tversky’s findings in the book Thinking, Fast and Slow.[2]  This essay applies their findings to the study of history, and suggests that they support an approach that makes history a study of people making choices, as opposed to the conventional textbook approach that makes history into a chain of causation.[3]

The conventional wisdom among both scholars and policymakers during the post-World War II era was that people are basically rational thinkers, subject to distortions in their thinking and cognitive disorders that can result from the influence and interference of their emotions.  That is, people are rational unless they are led astray by their emotions, and the key to rational thinking is, therefore, to control one’s emotions.  The conclusion was that if one controls one’s emotions, one can rely on one’s reasoning abilities.  Policymakers can, in turn, either assume that people will control their emotions and think rationally, or can factor into their policies a variable that compensates for people’s emotions.  This assumption of rationality was especially influential in the field of economics.  Mainstream economics was constructed around the so-called economic man who ostensibly made decisions based solely on rational cost-benefit analyses.

Tversky and Kahneman upended this conventional wisdom.  They conclusively demonstrated that humans are programmed intellectually with a host of thinking biases and shortcuts that short-circuit rational decision-making.  We are hardwired to respond instinctively and immediately to problematic situations.  In the evolutionary scheme of things, these biases and shortcuts may have been useful when our ancestors were reptiles and, later, mini-mammals who had to make instantaneous life-and-death decisions in the primordial swamps and pampas.  But these instinctive reactions are not consistent with logical reasoning or cost-benefit analyses, and they can lead us humans to wrongheaded and harmful conclusions in our civilized societies.

Contrary to conventional wisdom, emotional distortion is not the source of these thinking disorders.  These are not emotional biases or biases that stem from emotions, though they may at times be connected to emotion.  They are purely intellectual biases that operate with or without emotions.  We instinctively resort to them, usually without knowing it, and usually without being able on our own to avoid it.  We think we are having a brilliant intuition, but it is really an instinctive bias, and probably wrong.  Left to our own devices, we will almost invariably fall into these biases.  For better and mainly for worse, they are part and parcel of the way we think.  They are our inherited conventional wisdom.  And although they affect our thinking about almost everything, they especially affect the way we think about the past, that is, our memories and our history.

The moral of the story of The Undoing Project is that we must find ways to pull back from many of our intuitive reactions.  We must find ways of forestalling our instinctive fast thinking, and force ourselves to engage more frequently in reflective slow thinking.  We must undo our first thoughts to arrive at better second thoughts.  Kahneman’s Thinking, Fast and Slow exemplifies this message.  Kahneman is a good writer, and the book is genially written.  But it is long and repetitive.  This is because Kahneman is very generous in attributing the origins and the sources of his findings to predecessors and colleagues.  He also meticulously describes the history of their various lines of research, the ways in which they discovered and undid the biases they brought to their own research, and the processes of reflection through which they came to their conclusions.

The premise of this essay is that most historians bring to their work the intellectual biases described by Kahneman and Tversky.  A consequence of this is that the conventional wisdom about history is often unhistorical and not very useful.  The argument of this essay is that approaching history as people making choices is a way to undo the intellectual biases we bring to the study of the past.  We can, thereby, achieve a more rational and useful history.

The Revenge of the Reptiles: Prehistoric Thoughts in our Brain Stems.

“We have met the enemy, and he is us.”   Pogo[4]

The human brain is the product of eons of evolutionary development.  Because the brain assembled itself in stages, our present-day brains incorporate different capabilities, traits, and parts that emerged at different times over the course of evolution.  For better and for worse, most of the older parts are still intact inside our heads, and these older parts sometimes cooperate and sometimes conflict with newer parts.  The oldest operating part is the brain stem, the core of which we inherited from our reptilian forefathers.  This reptilian core operates largely on a “fright, then fight or flight” basis, an unthinking instinctive reaction to danger that was a successful strategy for our relatively small forebears who had to survive among much larger and more voracious carnivores.[5]

The brain stem is also the repository of most of the intellectual biases that Tversky and Kahneman describe.  These biases were developed in our hominid progenitors, who had a greater ability to think than our reptilian ancestors, but still needed to think quickly.  They combined the “fright, then fight or flight” reflex, which was already programmed into their brain stems, with intellectual shortcuts that also became hardwired.  The combination helped them to survive and thrive among their slower-thinking competitors.  Our human ancestors later inherited both the reptilian reflex and the hominid shortcuts, which are experienced by us as forms of intuition.  But what worked for our hominid predecessors does not always work for us humans.

In response to the more complex world in which they lived, our human ancestors developed the intellectual ability to reflect on problems, rather than merely react to them.  This ability resides in our cerebral cortex.  It has historically been the pride of the human race, and our excuse for lording it over other creatures.[6]  The embarrassing fact that Tversky and Kahneman uncovered is that we all too rarely take advantage of our higher intellectual capabilities, and persist in reacting to problems like reptiles and hominids when we should be reflecting on them like humans.

Reflective thinking is hard, and it takes considerable time and effort to mobilize the cerebral cortex to think deeply about things.  It is much easier and quicker to just react.  It is also the case that it would be impossible for us to get much done if we tried to reflectively think about everything we do.  So, we don’t.  The problem is that we are not very good at distinguishing between decisions that we can safely make instinctively, and decisions that we need to think through more thoroughly, and think about with the help of others.  Making that distinction itself requires reflective thinking.  So, we are often caught in a vicious circle of thinking too quickly.

The goal of Tversky’s and Kahneman’s “undoing project,” and of Kahneman’s admonition that we should think more slowly, is essentially to substitute reflective judgments, derived in the cerebral cortex, for instinctive reactions, emerging from the brain stem, on important matters.

Instinctive Biases: What We Don’t Know Can Hurt Us.

“I think unconscious bias is one of the hardest things to get at.”  Justice Ruth Bader Ginsburg.

In his book Thinking, Fast and Slow, Kahneman describes four main biases that distort our thinking, and that are particularly relevant to the study of history.  They are the “aversion bias,” the “planning fallacy,” the “availability bias,” and the “outcome bias.”  The aversion bias and the planning fallacy distort the ways in which we process information.  The availability bias and the outcome bias distort our access to information and to our memories.

1. The Aversion Bias. Probably the most persistent and powerful bias with which we are plagued is what Tversky and Kahneman call “loss aversion,” which I will call the “aversion bias,” or what I think could be called a “sky is falling” reaction to adversity.  We humans are programmed to react quickly and drastically to potential adversity.  It is what helped our hapless ancestors to survive among bigger, stronger, and faster adversaries.  The aversion bias, however, leads people to overweight and overreact to small possibilities of loss, so that our “worry [about a threat] is not proportional to the possibility of the threat.”[7]  We are instinctive worrywarts, and that can be worrisome.

When faced with almost any adversity, people tend to react as though the sky is falling.  Unless forestalled by others’ better judgments or by their own reflective second thoughts, people will frequently make short-sighted panicky decisions based on little evidence.  Making quick judgments based on little evidence was not an unreasonable operating procedure for our pre-human ancestors.  The threats to them tended to be direct and simple, and a successful reaction to those threats could also be direct and simple.  They had to decide quickly whether to fight or flee, and they had to do it fast before their adversaries got in a first and fatal blow.  This do-or-die reaction is the core of the aversion bias.  It is a reaction that was seemingly helpful to pre-humans, but is often unhelpful in the more complex world in which we humans live.

The aversion bias takes two main forms that are logically inconsistent, but that make sense together as sky-is-falling reactions to adversity.  In the first form, when people are faced with a choice between keeping a tolerable status quo, or opting for a change that will most likely make things better but might make them slightly worse, most people will choose to stay with the status quo.  Reflecting the conventional wisdom that a bird in the hand is worth two in the bush, people are generally unwilling to risk upsetting a tolerable status quo, even when the probabilities of a successful change are great, and a cost-benefit analysis clearly favors the proposed change.  We are, Kahneman claims, innately conservative creatures, and this interferes with rational thinking.

Any loss is unacceptable to most people most of the time, and is seen by them as a sky-is-falling outcome.  A consequence of this inherent conservatism is that many people remain in situations in which they are unhappy, while forgoing favorable opportunities to be happier.  The aversion bias has social and political ramifications as well.  Conservative politicians routinely appeal to the aversion bias as part of their campaigns.  Be afraid of change, they preach.  The aversion bias also has historical implications.  Did, for example, American Tories fall prey to the aversion bias when they refused to join in criticizing the British government during the 1770’s, and did their aversion to any change help incite radicals to make a revolutionary change?

In the second form of the aversion bias, when people are given the choice between either accepting a manageable loss, or risking a disaster on the small chance of avoiding any loss, most people go for broke and risk everything to avoid what would have been a manageable loss.  They do this even when the odds and a cost-benefit analysis favor going with the manageable loss.  This willingness to act radically to avoid small losses, even at the risk of suffering disastrously large losses, seems inconsistent with the first form of the aversion bias, in which people act conservatively to avoid small losses even at the expense of forgoing gains.  But it isn’t.  The common core of both forms of the aversion bias is that people see any loss as a sky-is-falling result and, in turn, are generally unable to distinguish between a small loss and a disastrous loss.
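A hypothetical illustration, with numbers invented here for clarity rather than drawn from Tversky and Kahneman’s experiments, may make the two forms concrete.  Offered a sure gain of $50 or an eighty percent chance at $100, most people take the sure $50, even though the gamble is worth $80 on average.  Offered a sure loss of $50 or an eighty percent chance of losing $100, most people take the gamble, even though it costs $80 on average.  The arithmetic is the same in both cases, but the prospect of a certain loss turns cautious people reckless.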

Radical politicians of both the revolutionary Left and the fascistic Right have made appeals to this go-for-broke form of the aversion bias a standard part of their operating procedures.  In American history, for example, did the Sons of Liberty in the 1770’s fall prey to the aversion bias, or prey upon the aversion bias, when they vehemently rejected British proposals to increase taxation?  The increases were small and the British intended to spend the taxes on protecting the American colonies from foreign attacks.  The Sons of Liberty claimed, however, that accepting any taxes, no matter how small, would lead to total oppression.  Was this a reasonable reaction?

Appeals to both forms of the aversion bias could be seen in the Trump campaign during the 2016 presidential election, particularly in the ways Trump denied the evidence on global warming, gun control, and immigration, and played on people’s fears of change.  With respect to global warming, the evidence is overwhelming that human activities are the main cause of rising temperatures and erratic weather patterns.  There is also a consensus among environmental scientists that global warming will adversely affect human life.  And there is a consensus among economists that ameliorating global warming by going green would generate jobs and economic growth, which would greatly benefit the public as well as the environment.

Nonetheless, Trump, along with the oil and coal billionaires who support him, and those who speak and act on their behalf, such as the new EPA Administrator Scott Pruitt, has been able to generate widespread fear that if the government acts on global warming, people might have to give up their SUVs and pickup trucks.  This is seemingly an instance of people choosing to forgo a significant benefit out of fear of a small loss, the first form of the aversion bias.

Likewise, with respect to gun control, the evidence is overwhelming that we are safer both individually and as a society without guns in the hands of private individuals.  If you own a gun, there is virtually no chance that you will ever use it to thwart an attack on yourself or someone else.  In fact, if you own a gun, your chances of being shot increase several-fold, and it will most likely be with your own gun.  Significant measures of gun control would be in the best interests of almost everyone.  Nonetheless, Trump, along with a handful of gun fanatics, and the fascists who run the NRA, such as Wayne LaPierre, has stirred up fears that gun control will make people less safe because people won’t be able to defend themselves with their own guns.  Again, this is an example of people forgoing a significant benefit out of fear of a small potential loss.

Trump’s rantings against immigrants and immigration, fed by racists such as his top advisor Steve Bannon, exemplify go-for-broke politics, the second form of the aversion bias.  The demographic facts are that, within the next twenty-five years, the United States will have a population in which over fifty percent of the people will be minorities.  The historical facts are that immigrants have always been, and still are, the backbone of economic and cultural advancement in our country.  Trump, nonetheless, won the election in large part by stirring up fears among white European-Americans that they might lose their top dog status in our society, and might have to share prestige and power with other ethnic groups.  Instead of recognizing the possibility of this minor loss of status as a small price to pay for positive social change, Trump and his supporters have chosen to wage an all-out pejorative campaign against immigrants, minorities, and foreigners.  It is a reckless policy that portends potential disaster for the country.

2. The Planning Fallacy. The aversion bias is a powerful motivating force.  Fear of loss will usually trump hope of gain because, as Kahneman claims, “Losses are weighted about twice as much as gains” in our instinctive thinking.[8]  But the aversion bias is not all-powerful.  Hope can sometimes triumph, and hope is our only hope in defeating fearmongers who would rule us through our aversion bias.  Hope, however, also has its pitfalls.  When hope becomes overweening optimism, it can lead us astray.  The problem is that optimists almost inevitably fall prey to what Kahneman calls the “planning fallacy,” or what I think could be diagnosed as a “narcissistic intellectual disorder.”

If the aversion bias leads us to be overly pessimistic about what is happening to us, the planning fallacy leads us to be overly optimistic about what we are doing about it.  When we decide to do something, whether it be to stick with the status quo, go for broke, or do otherwise, we almost inevitably overestimate the likelihood for success of the things we plan to do.  We become enamored of our plans, and overconfidence often leads to the failure of our enterprise.

Narcissism plays a big part in this mistake because when we decide to do things, we tend to focus solely on our own abilities, our own actions, and how well we have prepared to do them.  We fail to pay sufficient attention to what others are doing, or to the bad luck that could befall us, either of which might foil our plans.  We focus on what we are putting into the project, but fail to focus on the context in which we are operating.  Optimism is one of the main reasons that entrepreneurs start so many new small businesses every year, but overweening optimism is one of the main reasons that some eighty percent of them fail within the first year and a half.[9]

The planning fallacy and the narcissism bias produce miscalculations on the part of political actors as well as businessmen, and reckless and regrettable behavior can be the result.  Did the American revolutionaries, for example, fall prey to the planning fallacy when they started a war with Britain which they expected to win quickly and easily, but which went on for over seven bloody years?  Is it an instance of the narcissism bias, taken to a seemingly pathological extreme, when despite four bankruptcies and three marriages, among other bungled enterprises, Donald Trump claims to be able to do anything, to have been successful at everything, and is unable to acknowledge any sort of mistake or failure?

3. Memory Tricks: The Availability Bias. Biases affect not only the way we process information, but also the way we store and retrieve information.  Our memories are the storehouses of the information we use to reach conclusions and make decisions.  But our memories play deceptive tricks on us.  One of these is what Kahneman calls the “availability bias,” which could be described as a “last in, first out mindset.”  People give more weight to recent events than they reasonably should, especially if the events are dramatic.  The last thing we have experienced becomes the first thing we think about when evaluating a situation and reaching a decision.  The evidence that is most readily available, that is, the last evidence to be stored in our memories, is the first and most influential evidence that we consider, even if it is not the best evidence.

People are also short-sighted.  They tend to see things within a narrow and short-term frame of reference.  They give too much weight to small pieces of anecdotal evidence, and too little consideration to the big picture and the long-term.  “We are by nature narrow framers,” Kahneman claims.   People also tend to be enchanted by melodramatic stories, and turned off by statistics and abstract arguments.  It is easier to access and process small pieces of simple information than to retrieve and reflect on complex conglomerations of evidence.  As a result, we often fail to put events into a big picture or see them in long-run terms.  We give too much weight to either bad news or good news, and tend to overreact either pessimistically or optimistically to situations because we fail to consider the weight of all the best evidence.[10]

The availability bias has historical and political implications.  Did, for example, the American revolutionaries overreact to the actions of King George III based on narrowly framing what he was doing?  The revolutionaries claimed that because the King was working actively with Parliament, he was trying to become a dictator when, in fact, the King and Parliament were working toward the parliamentary government that still prevails in England today.  Was Donald Trump also guilty of narrow framing in the recent election when he harped on a few isolated stories of harm caused by immigrants, while failing to acknowledge the bigger picture of the good things immigrants have contributed and continue to contribute to our country?

4. The Outcome Bias. In addition to the availability bias, our memories are also subject to what Kahneman calls an “outcome bias,” which could be characterized as an “all’s well that ends well mindset” and a “winners get to write the history syndrome.”  Kahneman reports that if the outcome of a decision is good, people do not generally care how the result was achieved.  And they generally remember the process of deciding to do it, and the way it was done, as having been good, even if that wasn’t the case.

The outcome bias can lead to dangerously false conclusions about a person’s perspicacity.  “A few lucky gambles,” Kahneman claims, “can crown a reckless leader with a halo of prescience and boldness.”[11]  Winners get to write the history, even if it is wrong.  This bias can also lead to dangerously false conclusions about the successfulness of aggressive ways of acting.  We tend, for example, to forget the death and destruction of a war if our side won.  We almost completely ignore questions of whether the war was necessary, let alone worth it.  And we avoid questions of whether the same or better results could have been achieved without the war, and without the death and destruction.  In our memories, all’s well that ends well, even if it really wasn’t.

All’s well that ends well is the conventional wisdom promoted in most history textbooks.  This approach is especially the case with American history, which is conventionally portrayed as an inexorable march of progress, freedom, and goodness.  As applied to the American Revolution, for example, since the Revolutionaries won, and the country grew bigger and better thereafter, the Revolution must have been a good thing.  All you need to know, in the conventional view, is that we have made it to the present, and the present is pleasant.  So why question whether there were better alternatives to the ways in which we got here?  This conventional wisdom is rebuffed by an approach to history as people making choices.  In that approach, questions need to be asked about whether there were alternatives to the way we got here, and voices other than those of the winners need to be heard, if we are to learn from the past and prepare for the future.

Donald Trump’s reaction to the recent presidential election is an example of the outcome bias, and why we need to listen to multiple voices.  Trump is outraged that people might be concerned with how he won the election, and whether his campaign colluded with Russian intelligence agents to undermine his opponent.  As far as he is concerned, the election is over.  He won.  And the winners get to write the history.  Trump thinks people should remember the election as one in which his qualifications and strategies prevailed over his opponent’s, and that all is well that has ended well.  That is how most people usually remember things and that, Trump insists, is how people should think about his election.  But maybe not this time.[12]

Undoing the Past and Learning from the Losers: Reflection as Collective Thinking.

“The greatest of faults is to be aware of none.” Thomas Carlyle.

Tversky and Kahneman are skeptical, but not pessimistic, about the possibility that people can think rationally and make reasonable decisions.  The best way to approach almost any problem, they contend, is with other people, so that you can identify and critique each other’s biases.  Even if you share the same biases with your colleagues, it is easier to recognize and reject biases in the thinking of others than in your own thinking.  So, we can give each other a lift.  “Organizations,” Kahneman claims, “are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures” on how conclusions are reached and decisions are made.[13]  In his description in The Undoing Project of the ways in which Tversky and Kahneman worked together and critiqued each other, Michael Lewis essentially conveys the underlying message of their findings: that we need critical input from others, and cooperative effort with them, to overcome the biases that are built into our thinking processes.

Kahneman goes on to suggest that if you cannot work with others, and are faced with a problem on your own, you should note your first intuition or instinct as to how to solve the problem, and then reject it.  You should then force yourself to reflect on the problem and on your first response to it, explore alternative options for a solution, and select the one that seems to be based on the best evidence and arguments.  In so doing, you should listen to the voices in your head of people whose judgments you generally respect and trust, and you should subject your memories and your ideas to vicarious critique by those significant others.

Reflection is primarily a process of listening to competing voices in our heads.  It is a symposium of influential books we have read, convincing speakers we have heard, and significant people whose points of view we have incorporated into our internal dialogues.  Through listening to these voices, we can conjure up memories we might otherwise miss, and consider arguments we might otherwise ignore.  Reflection can, thereby, help keep us from instinctively reaching wrongheaded conclusions.  Once we have reflected on a problem, and given ourselves the best chance of thinking rationally, we should decide on how to deal with it.  This is the method that I promote in approaching history as people making choices.

Approaching history as a study of people making choices treats the subject as a collective enterprise in which people from the past and the present participate.  The method can help us see whether the people we are studying fell prey to the intellectual biases identified by Tversky and Kahneman, and whether and to what extent we ourselves are prone to those biases in studying the history of those people.  Tversky and Kahneman teach us that the subjects of our historical studies probably should not have followed their first instincts, and may have been mistaken in their decisions if they did.  The same goes for us as students of history.

Approaching history in this way requires us to examine our historical subjects’ reactions to the problems they faced, see if they made reflective decisions or instinctively reacted, and speculate on whether they could have made better decisions than the ones they made.  In studying the American Revolution, for example, this means looking at what was said by those Americans who opposed the Revolution, who were the losers in the debate over whether to revolt, and to decide in retrospect who had the better of the argument.

Reconsidering past decisions in this way is sometimes disparaged as twenty-twenty hindsight, and a cheap shot at the past.  But that is neither fair nor accurate.  We humans are inveterate second-guessers, and we routinely evaluate our own decisions, so why not similarly evaluate the decisions of our predecessors?  Whether we are businesspeople having made a deal, soldiers having fought a battle, or Little League baseball managers having called for a squeeze play, we invariably look back at what we did, revisit the alternatives we had, and speculate on what might have happened if we had made a different choice.  And this is not a waste of time and effort.  Evaluating our past decisions helps prepare us for our next decision.  It is the same with history.

The point of historically studying people’s choices is to understand why and how they got things right and got things wrong.  It is not to condemn or demean them.  The goal is to learn from their successes and mistakes, just as we try to learn from our own successes and mistakes.  It is not, for example, a condemnation of the Founding Fathers if we were to conclude that the American Revolution was a mistake, and that things might have been better if Americans had achieved independence gradually and peacefully, as did Britain’s other English-speaking colonies.

It is, in turn, no disloyalty on our part to the Founders or to the United States that we want to try to get a past decision right retrospectively as an aid to getting our next decisions right prospectively.  It is, I would contend, a patriotic act.  While moral turpitude may attach to an ill-intentioned decision by an ill-meaning person, there is no moral turpitude attached to a well-intentioned and well-meaning decision that turns out to be a mistake because of an unwitting bias.  That is one of the differences between Donald Trump and George Washington.

The Method of Approaching History as People Making Choices. 

“There are always choices…Our responsibility as historians is as much to show that there were paths not taken as it is to explain the ones that were.” (emphasis in original) John Lewis Gaddis.

The method of approaching history as people making choices can be outlined in six main steps.  First, we must decide what historical event we want to study.  History is virtually infinite in scope, and there are an almost infinite number of events we could choose to study.  So, we need to make a choice and, in order to avoid falling prey to an availability bias, we need to reflect on the reasons we are choosing to study a particular event.

Whether we are aware of it or not, we invariably study problems in the past, and ask questions about them, that relate to issues in which we are interested in the present.  It is almost inevitable that an event we choose to study is somehow related to a current social issue.  That connection is not a problem so long as we are aware of it, and do not let our predilections toward the current issue lead us to predetermine our response to the historical problem.  The purpose of studying history is to let our conclusions about past events help inform our judgments of present issues.  That educational purpose is foiled if we merely judge past events based on our current biases.

Social issues change, and so do the historical events in which we are interested and the questions we ask about those events.  This is the main reason history books are continually being rewritten, and the history of particular subjects continually revised.  New history books are rarely a result of significant new evidence but are, instead, usually the result of changing interests.[14]  During the 1940’s, for example, social conformity and political apathy were issues of concern.  Historians, in turn, looked at the American Revolution as a case study of how masses of people might be motivated to act.[15]  During the turbulent 1960’s, historians looked at the Revolution as a case study of how masses of people might be directed toward constructive ends.[16]  Historians today, in the wake of the recent election, may be choosing to focus on the Revolution as a study of ways and means of countering a potentially tyrannical ruler.

Having chosen the subject of study, the second step is to delineate the plausible options that people had in deciding what to do about the problem they were facing.  We need to resurrect and understand the arguments that different groups of people made in support of various options.  Since history is generally written by and on behalf of the winners, we will likely need to recover and listen to some lost voices in this process.  What, for example, were the arguments of the Tories during the American Revolution?  How do their arguments look in retrospect?

As the third step, we need to examine why and how the winning argument prevailed, and a choice was made.  How did the winners win, and what happened to the losers in the debate?  What happened, for example, to those who initially opposed the Revolution?  Why and how did some opponents turn to supporting it, while others did not?  And what happened to these people?

In the fourth step, we need to examine the consequences of the choice that prevailed.  How did the prevailing choice affect people then and afterwards, and how does it affect us now?  Since conventional history is generally written as all’s well that ends well, we need to distinguish between current circumstances that are a consequence of that choice, and things that might have come to pass even without that choice.

For example, the United States is today a relatively prosperous and free country.  Conventional histories generally attribute our current circumstances to our having undertaken and won the American Revolution.  But is that so?  The other English-speaking former British colonies, i.e. Canada, Australia, and New Zealand, gained their independence gradually and generally peacefully during the nineteenth century.  They did not suffer the death and destruction of a violent revolution.  And they are today at least as prosperous and free as the United States.  Comparing their histories to ours raises questions of whether the success of the United States can be attributed to the Revolution, and whether we could not have done as well or better without it.

This leads to the fifth step, which is that we need to speculate as to what might have happened if a different choice had been made.  This is the second-guessing part of the project, to which many historians object, but which I think is crucial to getting beyond an outcome bias that all is well that ends well.

As the sixth step, we should apply what we have concluded about the historical event to the present-day issue that led us to study that event in the first place.  The premise of studying history as people making choices is that things might have been different, for better or for worse, if different choices had been made, and that exploring those past possibilities might help us make better choices in the present.  Maybe the losers in the debate over the American Revolution were right.  Or maybe they weren’t.  It is enlightening to consider the possibilities.[17]

Saving History from the Post Hoc Fallacy: Choice versus Causation.

“The supposition that the future resembles the past is not founded on any arguments, but is derived entirely from habit.”  David Hume.

History is story.  Like other stories, history starts with a “Once upon a time” scenario.  This starting scenario is a dynamic situation from which a narrative unfolds, and from which events pass in time from “Once upon a time.”  But time can take on very different meanings depending on the narrative form of a story, and whether events are portrayed as flowing randomly as a function of chance, predetermined as a result of causation, or determined freely as a consequence of people’s choices.  History can take the form of chance, causation, or choice.

Chance is luck, something that happens unpredictably without discernable human intention or observable cause, so that history as chance is a story of happenstance that people can neither predict nor control.  History as chance is seemingly arbitrary and unfathomable.  And dangerous.  It is the world of small children baffled and intimidated by adults, and by the host of things they do not understand and cannot control, many of which might hurt them.  It is also the world of our reptilian ancestors, and a realm in which the instinctive “fright, then fight or flight” response of our prehistoric brain stems would seem appropriate.

If history is the result of chance, there is little reason to study it, and little to be learned from studying it, other than the worldly wisdom of stoic resignation.  History as chance is a rationale for an aversion bias.  If history is chance, then aversion would seem to be the proper response to any potential change in a tolerable status quo, no matter what the promised benefits of the change.  For in a world dominated by chance, who knows what might come next?  Better the devil you know than the one you don’t, as the stoic saying goes.

Unlike chance, causation is inexorable, with consequences flowing inevitably from circumstances, so that history approached as causation appears to be the product of forces and factors that control events behind our backs and despite our intentions.  Causation is the form in which most conventional history is presented.  Most of us remember, for example, having to memorize in school the six or eight or ten so-called “causes” of the American Revolution, the American Civil War, and other important historical events.  In this approach, history is portrayed as a chain of causes and effects that we can understand but cannot control.

However, if history is a chain of causation in which one thing follows logically from another, then the future ought to be predictable from the past, and the study of history ought to make us fortune tellers.  But, it doesn’t.  Causation history exemplifies the outcome and availability biases described by Tversky and Kahneman, and it is an instance of the logical fallacy known as “post hoc ergo propter hoc.”  In approaching history as causation, one assumes that because something came after something else, the first thing must have caused the second thing.  But, this is not logical, or even empirical.  And it leaves us with nothing to do but contemplate our navels as we watch events unfold.

History as causation takes the current state of the world, and then outlines the stream of events that led to it.  It delineates the events in a chain of causes and effects that can look like an inevitable path from the past to the present.  But, it leaves out all the paths not taken, all the plausible options not chosen, and all the real-life contingencies faced by people in the past and by us today.  It is an abstraction that is neither interesting nor useful.  Like history as chance, history as causation can serve as a rationale for quietism and political passivity.[18]

Unlike chance and causation, choice is deliberate, so that history as choice is a story of people making decisions in the face of circumstances they may not be able entirely to predict or control, but with the belief that they can freely choose among plausible options and reasonably predict what might be the consequences of their actions.  History as people making choices is realistic and seemingly reasonable.  It is the way we experience life, as people debating and choosing among options within prescribed circumstances.  If history is a matter of choices, then time is a medium of opportunity and not futility, and life is not merely a matter of waiting for arbitrary or inevitable things to happen.  History as choice is a rationale for social and political activism.

Approaching history as people making choices allows us to relate consequences from the past to circumstances in the present without falling into an outcome bias.  The method makes connections between the past and the present debatable rather than inevitable.  The same events that are conventionally presented as a chain of causes and effects can be reconceived as a series of circumstances, choices and consequences.  This is a narrative distinction that makes a big difference in the meaning and moral of a history.[19]  With respect to the American Revolution, instead of seeing the Revolution as an inevitable result of causation, we can approach it as a series of debates about who should govern, and how government should operate.  These debates, in turn, helped form subsequent debates about government and democracy that have permeated American history from then to now.  That is a much more useful history.

In sum, approaching history as people making choices is a method of studying how and why people think the way they do, and make the choices that they do.  Historical events are approached essentially the way those events were approached by the people who experienced them, and the way we approach situations in our own lives, as contingencies that could go different ways depending on the choices that are made.  Past decisions are, in turn, related to problems and choices facing us in the present day.  Studied in this way, history becomes an important life skill, and an education in avoiding the intellectual pitfalls described by Tversky and Kahneman.

Postscript: For Further Reading…

The purpose of this essay has been to introduce the findings of Tversky and Kahneman, and promote the method of approaching history as people making choices.  Although conventional history textbooks do not reflect it, most of the best scholarly historians have either explicitly or implicitly approached history as people making choices.  If you are interested in seeing how this method works, I have written a book that is based on my reading of some of the best historical works of the past fifty years, and that exemplifies the method.  It is titled Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice (AuthorHouse, 2013).

The book describes ways of teaching American history as people making choices, and includes a thematic history of the United States that exemplifies the method.  The book examines thirteen turning points in American history from the early 1600’s through the late 1900’s.  It focuses on the decision-making processes of the people involved, uncovers many of their biases, explores debates among historians about those turning points, and debates the conclusions of historians.  The book is intended as an encouragement for readers to explore historical events for themselves, debate their own and others’ ideas, and arrive at their own considered conclusions about history.

[1] Michael Lewis.  The Undoing Project: A Friendship That Changed Our Minds. New York: W. W. Norton, 2016.

[2] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013.

[3] For a discussion of how one might teach history as people making choices, and a thematic history of the United States using that method, see my book Was the American Revolution a Mistake? Reaching Students and Reinforcing Patriotism through Teaching History as Choice.  Bloomington, IN: AuthorHouse, 2013.

[4] See the Walt Kelly cartoon at http://www.igopogo.com/we_have_met.htm (1953).  After a long, arduous, and comic search for the source of the world’s problems, and the enemy that is plaguing us, Pogo Possum concludes that we are the source of our problems, and that we must start to think differently in order to resolve them.

[5] David Sloan Wilson. Evolution for Everyone. New York: Delacorte Press, 2007. p.285.

[6] Jared Diamond. The Third Chimpanzee: The Evolution and Future of the Human Animal. New York: Harper Perennial, 1993. pp.220-221.

[7] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.320-322, 329.

[8] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013.  p.364.

[9] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013.  pp.339, 341-342.

[10] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.12, 350.

[11] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013. pp.212-213.

[12] This is being written in early March, 2017 when the ways and means of the election are still a considerable source of controversy.

[13] Daniel Kahneman.  Thinking, Fast and Slow. New York: Farrar, Straus & Giroux, 2013. p.436.

[14] Ronald Dworkin. Justice for Hedgehogs. Cambridge, MA: Harvard University Press, 2010. p.127.

[15] Richard Hofstadter. The American Political Tradition. New York: Random House, 1948. pp.3-17.

[16] Staughton Lynd. Intellectual Origins of American Radicalism. New York: Random House, 1968.

[17] John Lewis Gaddis. The Landscape of History: How Historians Map the Past. New York: Oxford University Press, 2002. p.9.

[18] Isaiah Berlin. Historical Inevitability. London: Oxford University Press, 1954. pp.3, 20-21, 68.

[19] I have an extended discussion of this narrative distinction in my blog post “What to do about the Big Bad Wolf: Narrative Choices and the Moral of a Story.”

Distrust in the Hinterlands:

Bozo the Clown Promotes Fear and Hate, and It Ain’t Funny.

Burton Weltman

“I won’t close my eyes, I can’t close my eyes, I never close my eyes.

See, they’re always there, with that funny hair.  Oh, I’m so scared.”

Can’t Sleep, Clowns Will Eat Me.

Alice Cooper.

Prologue:  Here’s Johnny!

“Whoever is careless with the truth in small matters cannot be trusted with important matters.”

Albert Einstein.

“You can always tell when Donald Trump is lying.  He says ‘Believe me.'”

Jon Stewart

“Who Do You Trust?” was an ungrammatically named television game show emceed by Johnny Carson and announced by Ed McMahon during the 1950’s, before they went on to “Tonight Show” fame.  The show tested how much husbands and wives trusted each other.  In focusing on trust, the game exemplified a key theme in modern American culture.  Although we live in a culture based on an individualistic ideology, and we are bombarded with the mantra that each of us should think for himself or herself, no one, not even Einstein, could know everything and think about everything for himself or herself.  It is a fact of life that we inevitably must depend on others for most things, including our ideas.  Most of our ideas, judgments, and decisions are derived from others.  In this context, “Whom do you trust?” is probably the most important question that people must answer in their lives.

Each of us resides psychologically in what could be called a community of trust, which includes those family members, friends and other significant others whose ideas we absorb, and upon whom we rely for guidance when we make judgments and decisions.  We live, however, in a society in which face-to-face relationships have been increasingly replaced by long-distance and electronic contacts.  Increasingly, we must rely on people we do not know and will never see.  Experts, reporters, government officials, manufacturers, scientists – the list of people in whose hands we routinely place our lives is almost endless.  That can be discomfiting for people raised on the idea of self-reliance, and it raises the stakes on the question of whom one can trust.

It is the importance of trust in our lives that makes the election of Donald Trump as President so perplexing to many people, including me.  Trump is a chronic and seemingly compulsive liar.  He has repeatedly cheated on his several wives, and repeatedly violated contracts with people working for him.  He is a narcissist who seemingly cares for nothing but massaging his own ego.  Trump is also a vulgar person, who regularly behaves in repulsive ways, insulting anyone who disagrees with him, including the Pope, and sexually abusing women and then bragging about it.

With his ever-present smirk and phony orange hair, Trump was widely the butt of late-night jokes long before “Saturday Night Live” recently got on his case.  In a monologue in 1992, for example, Johnny Carson deadpanned on “The Tonight Show” that Gennifer Flowers, who claimed to have been Bill Clinton’s mistress and had recently been fired as a receptionist, had just been hired by Trump as his backup mistress in case his current mistress was unavailable.

Donald Trump is a man who has clearly shown that he cannot be trusted in matters either small or great.  That this clownish character is now the President is keeping a lot of people up at night, scared and unable to sleep.  Nonetheless, enough Americans decided he could be relied upon to be their President so that he was elected.  How can that be?

Bozo the Clown Becomes President: Trump Trumps Trust with Fear.

“Bozo the Clown.  Do we really need ‘the Clown?’  Are we going to confuse him with Bozo the Tax Attorney?  Bozo the Pope?”

Jerry Seinfeld.

“How about Bozo the President of the United States?”

Anonymous.

It has been said that the Republican Party could run Bozo the Clown as its candidate for President and still get the support of the 33% of the voters who make up the hard core Republican base.  Bozo would get the troglodytes who fantasize about returning to the laissez-faire ways they think prevailed in the United States during the nineteenth century, but really didn’t, and to whom Republicans have historically appealed with a mantra of free enterprise.

Bozo would also get the racists who still cannot accept the end of segregation, let alone that we have had a black President, and to whom Republicans have been appealing through coded racist messages since the mid-1960’s.  And he would get opponents of abortion who think abortion is mass murder, and therefore have no moral choice but to vote for an anti-abortion Republican, no matter how offensive he or she might otherwise be.  So, even Bozo would have a base vote of some 33%.

The problem for Democrats is that the Republicans actually did run Bozo the Clown in the recent Presidential election, and he got 46% of the popular vote.  His total was significantly less than the percentage of the popular vote received by his opponent, but still enough to gain a victory in the Electoral College, thereby defying almost everyone’s expectations, seemingly including those of Bozo himself.

In past elections, victorious Republican candidates have succeeded by tacking toward the ideological middle after they are nominated, and making a rational and hopeful appeal to the broader electorate beyond the Republican base. In the recent election, the Republican candidate did no such thing.  His campaign was extremist, ridiculous and scandalous from beginning to end.  A truly Bozo production.  Yet, to the astonishment of many, and the dismay of most, he won.  So, how did Bozo pick up that extra 13% of the vote that he needed to win?  More particularly, how did he sway voters in the so-called swing states in the American hinterlands (Pennsylvania, Ohio, Michigan, Wisconsin) that he needed to take an Electoral College victory?

To be clear, this was a very close election in which any number of things could have been the proximate cause that tipped it toward Trump.  Putin’s enmity, Comey’s infamy, Clinton’s overconfidence, and voters’ apathy at what was supposed to be a Clinton landslide, are just a few of the things.  We should not jump to broad conclusions from this election about the irrationality of American voters, or about some sort of growing fascist sentiment in the country.

Some ninety years ago, the great American cynic H.L. Mencken predicted that “As democracy is perfected, the office of president represents, more and more closely, the inner soul of the people.  On some great and glorious day the plain folks of the land will reach their heart’s desire at last and the White House will be adorned by a downright moron.”  Many disgruntled liberals have resurrected that prediction as an epitaph for Trump’s election victory, along with various dystopian literary fantasies about the demise of democracy and the rise of fascism in America that have been published over the last century.

But that response seems a gross overreaction.  After all, Clinton got more votes than Trump.  And she won virtually all the major cities and the most productive states.  Likewise, as of this writing in mid-February, Trump’s post-election approval ratings have been the lowest for any President in history, and dropping, with a majority of Americans disapproving of him and a majority wishing Obama were still President.  The reaction of most Americans to most of Trump’s early actions as President has also been generally negative.

Trump’s electoral success cannot, however, be dismissed as a fluke.  Trump seems to have triumphed through an appeal to the fears of a great swath of so-called middle Americans who are afraid they are being left out and left behind by forces beyond their control.  And his election seems to reflect a longstanding cultural gap in American society between those who are willing to entertain new theories and practices in the arts, religion, science, and public policies, and are willing to embrace diversity in our population, and those who want to maintain what they see as tried and true traditional practices, and white European homogeneity in our population.  It is a split that could be described as between progressives and traditionalists.  Trump’s extra 13% of the electorate seems to have come from traditionalists who voted to protect their entrenched vision of the world, fearing for its demise, along with what they saw as their self-interests.

Many Americans, especially those in the so-called rust belt, coal belt, Bible belt, and farm belt, are afraid of the wider world and what they do not know about it.  Most importantly, they do not trust scientists, experts, intellectuals, government officials, and immigrants whose ideas derive from involvement in the wider world, and whom they never see or see only occasionally on national news programs.  This is, I think, a key to our political situation.  Many Americans do not trust the messengers of modern science and progressive government and, so, they reject the message.

They do not trust scientists, so they refuse to believe in climate change, despite the overwhelming scientific evidence that it is real and that it would hurt people like them most.  Likewise, they do not trust foreigners, so they reject international trade agreements, despite the evidence that these agreements work to the advantage of most of them.  For the same reason, they do not trust international organizations, such as NATO and the UN, even though these organizations have overwhelmingly supported American initiatives over the years.  And they don’t speak Spanish or any language other than English, so they fear and reject immigrants, even though immigrants are a key component of our country’s success and their own well-being.

They do not trust faceless bureaucrats in Washington, DC, so they hate the federal government and government programs, despite the fact that most of them depend on all sorts of federal government programs to survive.  “Keep the government’s hands off of my Medicare” is a common demand among people who regularly proclaim their fear and hatred of the federal government.  I have heard it from people I know.  Similarly, there are many people from these areas who want to get rid of Obamacare but not the Affordable Care Act.

Fear rather than facts seems to drive this group of people.  The reality is that Red states receive more money in aid from the federal government than they pay in taxes.  If the Republicans follow through on their proclaimed goals of cutting back on federal programs and cutting down on economic and environmental regulations, it will be Red states and Republican voters that are hurt the most.  Many red states also depend heavily on immigrant labor, so that Trump’s promised crackdown on immigrants will harm the very people who voted for him.  Distrust was, however, Trump’s trump card, and much to the amazement of most Democrats, many middle American voters distrusted Hillary Clinton more than Trump.

It would be hard to find another person in America who less represents what these people stand for than Trump.  He is a dissolute libertine, who has been married three times to trophy wives, conducted numerous extramarital affairs, routinely assaulted women sexually, and then bragged about it.  He is a draft dodger who inherited his wealth from his father, and used it to engage in financial manipulations and to build high-end hotels, golf courses and other amenities for the very wealthy.  He has spent his life cavorting with the rich and famous.  But, as the saying goes, the enemy of my enemies is my friend, and Trump savagely disparaged the people whom many middle Americans most fear.  And on that, they believed him.

Trump persistently appealed to the fears of these people by attacking scientists, bureaucrats and foreigners, who he contended were bent on destroying the traditionalists’ vision of America.  He persistently proclaimed that Clinton was untrustworthy, and that she represented the forces and the people that traditionalists believe are wrecking their world.  Clinton was unable to convince these traditionalists that scientists, bureaucrats and foreigners need not be feared, and that they and she could be trusted and included within their community of trust.

Instead of trying to counter the fears of traditionalists, Democrats emphasized their fears of Trump, and trumpeted the ways in which a Bozo presidency might harm the country.  Democrats, thereby, countered Trump’s fear mongering with their own fear mongering.  But fear mongering is a game that conservatives generally win.  And they did so once again in this election.  In the struggle between fear and trust, fear was the consensus winner, trust the loser.  But this is nothing new, and Democrats should have known better and done better.

The Public and Its Problems: Dewey and Lippmann Debate Democracy.

“The fact that some geniuses were laughed at does not imply that all who are laughed at are geniuses.  They laughed at Columbus, they laughed at Fulton, they laughed at the Wright Brothers.  But they also laughed at Bozo the Clown.”

Carl Sagan.

“And what if Bozo thinks he is a genius, then what?”

Anonymous.

The election of Donald Trump is symptomatic of splits between cultural progressives and traditionalists, and between social progressives and conservatives, that almost all societies have faced since the rise of the first civilizations, and that have plagued the United States since the early twentieth century.  Progressives look to the wider world for ideas, and look forward to social and cultural change.  Traditionalists look inward to their local communities, and backward to their traditions.  Traditionalists have usually occupied middle to lower-middle positions in society.  Conservatives look to maintain the position and power of the social elite.  While most progressives have historically been members of the social elite, they have generally sought social and cultural change at the risk of some loss of position and power.

For most of history, a literacy gap between those who could read and write and those who could not was the foundation of cultural divisions in most societies, divisions that in turn mirrored social, economic and political differences.  Most societies were made up of literate city dwellers and illiterate rural peasants, with the city dwellers ruling over and looking down on their rural brethren.  Literacy provided a sharp and obvious dividing line between those in the know and in power, and those not.  Illiteracy and poverty were also obvious reasons why traditionalists could not reach out to the wider world, and may not even have known that a wider world existed.

In ancient Greece, for example, Athenian philosophers looked to Egypt and Mesopotamia for new ideas.  During the Renaissance, European intellectuals looked to India and the Ottoman Empire.  Even during the Middle Ages in Europe, which used to be called the Dark Ages because cultural conservatism was the order of the day, intellectuals in the Catholic Church looked to ancient Greece and Rome for ideas.  Meanwhile, peasants in most societies around the world inhabited cultures that changed so slowly that most people did not notice the changes at all.

With the rise of literacy in most parts of the world over the last century and a half, literacy ceased to be a major dividing line in society or a defining point in culture.  By the early twentieth century, literacy was almost universal in the United States, and almost all Americans had access to the wider world of culture, if they wanted it.  Nonetheless, a culture gap persisted between progressives who looked outward for cues and forward toward change, and traditionalists who looked inward and backward.  A new way of defining this culture gap was needed, especially by conservatives who resisted the egalitarian implications of universal literacy, and wanted to maintain their elite status.  They came up with a distinction between highbrows and lowbrows.

Highbrows were described as people involved in the fine arts and engaged in challenging intellectual pursuits.  Lowbrows were people involved in popular culture and parochial pursuits.  Highbrows were ostensibly highly intelligent, their thinking was complex, and they were connected to a wide world of culture.  Lowbrows were supposedly unintelligent, their thinking was simple and simplistic, and they were narrowly confined to their local culture.  Highbrows, who were almost invariably members of the upper classes in America, were ostensibly the natural leaders of the country.  Lowbrows were natural followers, if they only knew it.

This distinction between highbrows and lowbrows had significant political implications.  Highbrows were supposedly capable of understanding and dealing with the complexities of modern twentieth century society, including the intellectual challenges of an ongoing technological revolution, the social problems of mass immigration and urbanization, and the managerial conundrums of large-scale industries and other organizations.  Lowbrows were supposedly stuck in the obsolete theories and practices of the small towns, the family farms, and the local businesses of the nineteenth century.  They looked for simplistic solutions to complex problems, and could not be trusted with the management of the country.

The question that faced political leaders in America in the early twentieth century was how to deal with what most saw as an ignorant majority of lowbrows in an ostensibly democratic society.  It is a question that persists to the present day.  Although the names have changed over time, the essence of the distinction between highbrows and lowbrows has remained the same.  The euphemism “high information voter” versus “low information voter” is widely used today.  But a lowbrow by any other name is still a person being demeaned and degraded, and a problem for those who think of themselves as at a higher intellectual and cultural level.  For both conservatives and progressives, the solution to the problem has lain in education and the mass media, but with very different approaches to both.

Conservatives have tended toward Mencken’s belief that “Democracy is a pathetic belief in the collective wisdom of individual ignorance.”  They have complained that democracy inevitably panders to the lowest common denominator among people, and that, left unchecked, it would produce idiotic leaders and moronic public policies.  Mitt Romney’s claim during the 2012 election that forty-seven percent of Americans essentially want to live on welfare benefits, and vote Democratic for that reason, is only a recent example of that sentiment.  So, during the 1920’s, conservatives developed a scare tactic to appeal to erstwhile lowbrow traditionalists, and pander to their ignorance and fears.  This approach was designed to scare the masses into following the lead of their betters.  It would put a check on egalitarian policies and democratizing politics by melodramatizing the need for plutocratic policies and authoritarian politics.

Demonizing ostensibly dangerous immigrants, violent blacks, anarchistic terrorists, traitorous Communists, arrogant liberals, effete intellectuals, and atheistic scientists became the stock-in-trade of most Republicans from that time to the present.  And these tactics have worked.  Republican policies have invariably favored the rich and powerful, and have never been in the best interests of most of the people who have voted Republican.  The policies of Democrats have almost always been in their better interests.  Republican scare tactics have, nonetheless, historically worked with alienated groups of distrustful and fearful Americans, and this largely explains how it is that Republicans have been able to win elections.  Donald Trump merely represents an extreme version of the brand.

The effectiveness of Republican scare tactics is exemplified by the ability of even Trump to win with them.  He personally represents anything but the traditional theories and practices of the middle Americans who voted for him.  But many of those people seemed willing to suspend their disbelief in his villainy because they distrusted even more the people he disparaged.  In turn, the willingness of Trump to adopt Republican scare tactics also demonstrates the shallowness and hollowness of the distinction between so-called highbrows and lowbrows.

Trump was born and bred a New Yorker, and has spent his whole life consorting with so-called highbrows in that city.  But none of their supposed intellectualism or cultivation seems to have rubbed off on him.  He thinks he is a genius, and constantly says so, but he is actually a cultural boor, and an extremely ignorant and inarticulate person.  He may have been in, but he was not of, the highbrow class.  If anything, he fits the definition of a lowbrow, which is why he has been able to campaign so sincerely as an anti-intellectual who scorns science, facts, and truth.  With Trump, there is not much there there.  What Trump has, however, is an amazing natural talent for promoting himself and fostering a cult of his person and personality.

Cultivating Trust in the Hinterlands: Policies versus Personalities.

“You’re gonna like this.”

Bozo the Clown.

“Naw, I don’t think so.”

Anonymous.

Trump’s success in this election, and his ability to foster a cult of his personality among middle Americans, who had every good reason to shun him, seems to demonstrate the need for progressives to develop a better strategy if they are going to consistently thwart Republican scare tactics in the future.  The middle Americans who put Trump over the top, the 13% who made the difference, did not vote for Trump because they were enamored of him.  They voted for Trump because they were afraid of Clinton more than of him.  They did not trust her and the cast of characters she represented.  How then should progressives reach out to middle Americans?

This is a question that liberals have been debating since at least the 1920’s, when Walter Lippmann and John Dewey engaged in a famous debate on the future of democracy in a mass society.  It is a debate that pits those who could be called technocratic progressives against those who could be called participatory democratic progressives.

In Public Opinion (1922) and The Phantom Public (1925), Lippmann described what he saw as the implications for public opinion and politics of recent insights in behavioral psychology and psychoanalysis, and recent developments in the technology of the mass media, particularly radio and the motion pictures.  He claimed that modern psychology had demonstrated ways in which public opinion could easily be manipulated, and modern mass media provided the means for doing so.  He noted that these ways and means had already been successfully deployed by advertisers who used them to manipulate people into buying their clients’ products.  Lippmann applied these psychological insights and technological developments to politics.

Lippmann was a technocrat who derided the idea that public policy could be made by the public or in public.  He claimed that the public invariably wanted simple answers to complex questions, and inevitably chose leaders based on the personalities these leaders projected rather than the effectiveness of their policy proposals.  This desire of the masses for simplistic answers had been aggravated by the ever-increasing complexity of modern society.  Their focus on celebrities and personalities had been encouraged by the pervasive mass entertainment industry.  Lippmann concluded that the American public had been reduced to a “mass of absolutely illiterate, feeble-minded, grossly neurotic, undernourished and frustrated individuals.”[1]

Lippmann claimed that public policy must first be made by experts and technocrats, and then sold to the public through advertising techniques.  The public’s role in public policy should be merely plebiscitary.  That is, the people could reject leaders and policies through elections.  The way to gain support for liberal policies was, in turn, to cultivate the public’s trust in celebrity liberal leaders, who would then be able to sell their policies to the people the way advertisers sold their products through the endorsement of entertainment celebrities.  Lippmann warned, however, that public opinion would invariably be molded by whoever was in control of the media. The battle for the future would be fought over control of the mass media.

John Dewey was what we might call a participatory democratic progressive.  In The Public and Its Problems (1927) and in Individualism, Old and New (1929), he agreed with Lippmann that public opinion was often shallow and transitory, and that the complex problems of modern society required experts to solve them.  But while he acknowledged the problems that Lippmann described, Dewey claimed the solution was through more public participation, not less.

Dewey rejected the idea that ordinary people were feeble-minded or irrational lowbrows.  The underlying problem, he contended, was the isolation of local communities from each other, and from the broader national and international communities.  Isolation, he claimed, leads to distrust which leads to fear.  His idea was to cultivate local communities, and then connect them to each other and to the wider world through their consideration of common problems and solutions.[2]

The culture gap for participatory democrats such as Dewey is not a difference in intelligence, but a difference in whom people trust.  Most people, whether highbrow or lowbrow, make most of their choices based on who and what they trust.  No one understands everything he or she accepts as valid.  We all must accept the validity of things we do not understand, which is most things, on trust.  Most progressives reside intellectually within a community of trust that includes scientists, scholars, public officials, experts, politicians, and public institutions of various sorts.  This community is essentially a web of trust based on people personally trusting people who trust other people who trust other people and so on.  It is a web that expands to many degrees of separation between people, but all of whom reside within the realm of trust.  The question for progressives is how to gain the trust of people so that they choose to inhabit a community of trust that is similar to ours and that overlaps with ours.

The answer for Dewey was education.  Rather than selling liberal policies through a top-down advertising program, and gaining the public’s support for liberal policies by promoting liberal celebrities, Dewey wanted to use newly developed progressive educational methods to gain public support for policies which could then be transferred to support for liberal leaders.  His was a bottom-up method of political organizing.  “Democracy must begin at home,” Dewey claimed, “and its home is the neighborly community.” [3]

Dewey acknowledged the educational power of the mass media, but because public opinion was susceptible to control by the rich and powerful people who own the mass media, he rejected a reliance on the media as a means of educating people.  He thought, instead, that public schools could be the primary vehicles for organizing people locally, and then connecting them to national and international institutions.  Public schools are run by local people, but they teach students about the wider world of social science, physical science and public policy.  School teachers are local people, but through their advanced educations, their professional organizations, and the subjects they teach, teachers are connected to the most advanced learning in the world.  Schools could help children, and maybe their parents as well, expand what I have called their community of trust to include the scientists and other thinkers who best understand the world.

The proposals and predictions of both Lippmann and Dewey have proved partially correct over time.  Since 1960, for example, with the rise of television as a principal means of campaigning, the mass media has become the major terrain on which political battles have been fought.  As a consequence, political parties, which had historically been a means of bringing together disparate communities in the country, have declined.  Television has trumped political organization.

In the recent election, Bernie Sanders, who was not a Democrat, ran in the Democratic primary and almost won.  Donald Trump, who was not a member of the Republican Party, essentially ran against the party in the Republican primary, yet won the party’s nomination.  He then ran in the general election with almost no support from the party, and won.  With his Bozo the Clown act, Trump received an extraordinary amount of free publicity from the mass media.  Then, with his own money and money from wealthy donors, he was able to buy more media time.  A media star to begin with, Trump parlayed that status into an election victory.  Lippmann was correct in predicting this sort of thing.

But Dewey was also correct in predicting the progressive educational effect of public schools.  The fact is that most Americans trust modern science and scientists.  Most accept, for example, the fact of global climate change and the effect that human activities have on that change.  Most accept racial equality, gender equality, and gay rights.  We have, after all, twice elected a black intellectual as President, something that would have seemed impossible in Dewey’s day.  These enlightened developments in public opinion did not come from nowhere.

Science classes and social studies classes in the public schools were major factors in developing enlightened public opinion in recent years.  Schools are where children have learned how science works, how government works, and how people are people, no matter their race, religion, or gender.  From that knowledge, they have also learned that it is possible for scientists and government officials to be trustworthy, and how to know whether and when they are.  Local schools run by local school boards, and staffed by local teachers who are connected intellectually to scholars and scholarship worldwide, made the difference.  This is the bottom-up, participatory democratic change that Dewey promoted.

At the same time, while most Americans are connected to the wider world, there are pockets of isolated people who essentially constitute communities of distrust of the wider world.  In these communities, local control of schools and the media has worked to the disadvantage of progressive education and enlightened thinking.  There are, for example, many states and localities in which school boards openly forbid or subtly discourage teaching about climate change and social justice issues.

These communities are often caught within vicious cycles of self-reinforcing distrust of modern science, scientists, federal officials, and foreigners.  Modern technology reinforces this vicious cycle by enabling people in these communities to connect with television news channels and websites that reinforce their narrow opinions, without any contact with alternative views.  The ability of Republicans to control these states and to control Congress through gerrymandering depends on these communities of distrust, as did Trump’s presidential campaign.

The debate between technocratic progressives and participatory democratic progressives on how best to counter conservatives has gone on for close to one hundred years.  In recent years, the debate has included Bill and Hillary Clinton on the technocratic side, with their famous focus on triangulating public policy and public opinion, and Bernie Sanders and Elizabeth Warren on the participatory democratic side.  The long-term goals of these two groups of progressives are not that different, but their short-term goals and methods are.  Should liberals focus on the cult of personality or the cultivation of policy?  Should liberals pursue a top-down media-driven political strategy, or a bottom-up grass roots organizing strategy?

The cult of personality is at best a fifty-fifty proposition.  For every magnetic liberal personality such as Bill Clinton, you get a non-magnetic liberal personality such as Hillary Clinton.  The cult of personality also leaves you liable to the libeling of personality, as with the ridiculing of Jimmy Carter in 1980, the Swift-boating of John Kerry in 2004, and the vilifying of Hillary Clinton in 2016, which saddled us respectively with Ronald Reagan, George W. Bush, and Donald Trump.

In this last election, the technocrat Hillary Clinton may have been the most qualified presidential candidate in American history.  The clownish Donald Trump may have been the most unqualified presidential candidate in American history.  Basing his campaign on appealing to communities of distrust in the United States, the clown raged his way to victory.  There are many things that Democrats could have done during this election that might have tipped things their way.  Trump’s election, nonetheless, highlights many things that are wrong in our system of elections.  These include the inordinate influence of private money on elections, the disproportionate attention from the mass media that a flamboyant candidate gets, and the undemocratic nature of the Electoral College.  These things ought to be fixed, but almost certainly won’t be.

But reaching out to disaffected groups who either voted for Trump or didn’t vote, especially those in the hinterlands, and bringing them into a progressive community of trust, is something that could be done.  That means developing a grass roots participatory democratic strategy for penetrating communities of distrust, and encouraging trust in progressive policies that can translate into electing progressive politicians.  Maybe we can reach the 13% of the electorate that put Trump over the top, and work on the Republican base of 33% as well.  That might help keep thoughts of the scary clown at bay, so that maybe we can all get some sleep at night.

[1] Walter Lippmann. Public Opinion. New York: The Free Press, 1922. p. 48.

[2] John Dewey. The Public and Its Problems. Chicago: Swallow Press, 1980. pp. 169, 178, 208-209.

[3] John Dewey. The Public and Its Problems. Chicago: Swallow Press, 1980. pp. 213, 216.

The Election of Donald Trump and the Law of Small Numbers: A Statistical Note on the Choice of Incompetent Presidents.

The Election of Donald Trump and the Law of Small Numbers:

A Statistical Note on the Choice of Incompetent Presidents.

Burton Weltman

Making too much out of too little.

Statisticians have long warned us not to violate what they call the Law of Small Numbers, which is the mistake of making too much out of too little, and drawing big conclusions from a small sample of evidence.  At the same time, psychologists have told us that we are hardwired to do just that, and are programmed to reach hasty conclusions that would not survive reasoned reflection.  Without someone or something to make us stop and think, we all too often make decisions that we later regret.  And all of this, they tell us, is a result of evolution.

The tendency to reach hasty conclusions was, in fact, an evolutionary advantage for our puny ancestors, little rat-like mammals scurrying around trying to avoid being eaten by large predators.  For them, for example, seeing a potential predator in a given place more than once was probably a good reason to avoid that place forever more.  Taking extra precautions such as this was a key to survival for them.  But, the fact of the matter was that the appearance and reappearance of that predator in that place was often more likely a matter of chance than a pattern of behavior.  There was probably nothing to fear, but better safe than sorry was the order of the day.  The extra precaution was wise, albeit it was not statistically necessary.

We humans today are still operating under that primitive imperative of better safe than sorry, and we almost inevitably jump to broad conclusions based on limited data.  But what was a wise thing for our ancestors to do may be unwise for us.  Concluding, for example, that since your buddy was able to pick five winners in five horse races, you should place your life’s savings on his sixth tip, is probably unwise.  The sample of five winners in five races is just too small to reach a reasonable conclusion that your buddy knows what he is doing.  Unless, of course, you know that he has inside information and that the fix is in for the sixth race.
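
To put a rough number on that intuition, here is a minimal simulation sketch.  It is my own illustration, not part of the original essay, and the figures are assumed (eight-horse races, 100,000 bettors picking at random).  It shows that a few perfect five-race records will turn up by sheer chance, which is exactly the kind of small sample the Law of Small Numbers warns us not to trust.

    import random

    # A minimal sketch (illustrative only): simulate bettors who pick horses at
    # random and count how many hit five winners in five races by pure luck.
    # Assumed figures: 8 horses per race, each equally likely to win, 100,000 bettors.
    NUM_BETTORS = 100_000
    NUM_RACES = 5
    HORSES_PER_RACE = 8

    def perfect_streak() -> bool:
        """Return True if a random bettor happens to pick the winner of all five races."""
        # Calling horse 0 the "winner" is arbitrary; any fixed horse works because
        # every horse is assumed equally likely to win.
        return all(random.randrange(HORSES_PER_RACE) == 0 for _ in range(NUM_RACES))

    if __name__ == "__main__":
        lucky = sum(perfect_streak() for _ in range(NUM_BETTORS))
        # The expected count is 100,000 * (1/8)**5, i.e. about 3 "expert" bettors whose
        # perfect records are pure chance -- far too small a sample to bet one's savings on.
        print(f"Bettors with five wins in five races by chance alone: {lucky}")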

Making a silk purse out of a sow’s ear.

So, what does that have to do with the presidential election of Donald Trump?  Just this.  If you look at our system of electing Presidents in the United States, and read what commentators have been saying about it since the adoption of our Constitution, it is hard not to conclude that it is an extremely inefficient process.  Since at least the first quarter of the nineteenth century, the process has not worked to select the best and most qualified people to be our Presidents.  And with the degeneration and declining influence of our political parties, and the increasing influence of big money and the mass media in the election of our Presidents, the process has gotten even worse in recent decades.  It is, at best, a random process, giving us maybe a fifty-fifty chance of having a decent or a disastrous President.

Under these circumstances, we have actually been very lucky in this country that we have had so few disastrous Presidents in our history.  Yet, we are surprised when someone as disastrous as Donald Trump gets elected.  We should not be surprised.  Our surprise is a function of our being fooled by the Law of Small Numbers.  Since a good majority of our Presidents have been at least decent, with disastrous Presidents seemingly as exceptions rather than the rule, we think we have a system of elections that works reasonably well.  Well, we don’t.  And it has once again been proven to us in spades.
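
The luck argument can be given a back-of-the-envelope check.  The following sketch is my own illustration, with assumed numbers (forty-five presidencies, and a hypothetical count of ten or fewer "disastrous" ones): if each presidency really were a fifty-fifty proposition, a record that good would itself be a long-shot lucky streak.

    from math import comb

    # Back-of-the-envelope check (illustrative assumptions, not the essay's figures):
    # treat each of 45 presidencies as an independent coin flip between "decent"
    # and "disastrous", and ask how likely it is to get ten or fewer disasters.
    N_PRESIDENCIES = 45      # assumed count
    MAX_DISASTERS = 10       # assumed, hypothetical threshold for "so few"

    prob = sum(comb(N_PRESIDENCIES, k) for k in range(MAX_DISASTERS + 1)) / 2 ** N_PRESIDENCIES
    # Comes out to roughly 0.0001 -- about one chance in ten thousand -- so under a
    # truly random process our relatively good record would itself be a lucky run.
    print(f"P(10 or fewer disasters out of 45 coin flips) = {prob:.2e}")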

There are many specific reasons why Trump was elected.  It was seemingly a perfect storm of things that went wrong or went against Hillary Clinton’s campaign, and went right or in favor of Trump’s.  But there are also many things wrong with our system of electing Presidents which contributed to his victory, from the undemocratic Electoral College to the extraordinary length of our election campaigns.

Some of these things will likely never be fixed, including the Electoral College, but some things can.  In particular, we need strong political parties that are organized from the bottom on up, and a strong program of public financing of election campaigns.  For information and ideas about public campaign financing, you can consult the website of Democracy Matters, a grass roots organization dedicated to taking big money out of our politics.  With stronger political parties and public financing, we can minimize the influence of demagoguery through the mass media of the sort that Trump successfully engaged in during the last election, and we can minimize the undemocratic influence of billionaires and big corporations on our elections.

These things are doable.  And we should, at least, learn from this last election not to trust in the illusion of conclusions that violate the Law of Small Numbers.  Given the nature of our electoral system, and the odds of the game, something like Trump was to be expected.  We can do better.

1/31/17

 

So what if Horton heard a Who? The Ethics of Hobbes, Hutcheson and Dr. Seuss in the Age of Trump.

So what if Horton heard a Who?

The Ethics of Hobbes, Hutcheson and Dr. Seuss in the Age of Trump.

Burton Weltman

Horton’s World: A person is a person, no matter how small.

In Dr. Seuss’s story Horton Hears a Who!, Horton is an elephant who lives in a jungle.  Since elephants have big ears, Horton is able to hear a tiny voice emanating from a tiny person on a speck of dust that is a tiny world.  The tiny person, who says he is a Who, is calling for help because the tiny world of the Whos has come unmoored and is blowing in the wind toward a pond in which the Whos will all drown.  To save the Whos, Horton grabs the speck of dust and places it on a flower.  He then promises the Whos that he will plant the flower in a safe place to secure their long-term safety.

But Horton is overheard by a group of his friends, a diverse bunch of animals, none of whom has ears as big as an elephant’s and none of whom can hear the Whos.  To them, Horton is seemingly talking to a flower, and they think he is delusional.  To save Horton from his delusions, they overpower him, seize the flower, and declare their intention to destroy it.  Horton resists and prevails upon the Whos to shout in unison until, finally, when the last little Who child adds his small voice to the chorus, Horton’s colleagues can hear the Whos clamoring for help.  At this point, they immediately adopt Horton’s mantra that “A person is a person, no matter how small,” and the book ends with them pledging to help him protect the Whos’ world.

But why?  Why should Horton’s jungle mates care about protecting a bunch of insignificant creatures on a minuscule piece of dust?  The answer to that question is the key to the moral and the message of this story, and most of Dr. Seuss’s other stories as well.  The story is not merely about Horton’s heroics, it is even more about the willingness of his colleagues to change their minds when confronted with convincing evidence, and their ability to demonstrate empathy toward other creatures no matter how different and how insignificant.

The world of Dr. Seuss is one in which people care for each other, differences among people can be reconciled, and one can reasonably expect people to be reasonable.  This, I contend, is one of the main reasons Dr. Seuss’s stories remain enormously popular among parents and children some sixty to eighty years after their publication.  And, I contend as well, the continuing popularity of Dr. Seuss’s books is a sign of hope for us in the coming Age of Trump.

Hobbes, Hutcheson, and Horton: All against all, or all for one and one for all.

The moral and message of a story are contained not merely in the words and actions of the main characters, but in those of the surrounding characters and in the overall ambience of the story.[1]  Does a story portray the struggles of heroically good individuals against a corrupt society and a generally malignant populace?  Or does it portray the efforts of good people to convince other basically good people to do the right thing?  The messages of these two types of stories are very different as to what children will face in the world and how they should behave.  The former message is the gist of the philosophy of Thomas Hobbes, a mid-seventeenth century English thinker.  The latter is the gist of the philosophy of Francis Hutcheson, an early eighteenth century Scottish thinker.

Anglo-American ethical thinking has been dominated by two main streams of thought since the eighteenth century, streams which are represented by Hobbes and Hutcheson.  Hobbes claimed that humans are essentially selfish, and that society is a zero-sum game in which one person’s gain is another person’s loss.  The suffering of others is nothing compared to the convenience to ourselves, Hobbes contended.  Life is a war of all against all.  If Hobbes were writing the story of Horton and the Whos, the story would likely end with Horton’s colleagues destroying the flower, since protecting the Whos was too much trouble, and who cares about Whos anyways.

Hobbes’s ethical position has been advanced over the centuries by a long train of social thinkers.  The position was represented in the eighteenth century by Bernard Mandeville’s advocacy of cutthroat laissez-faire capitalism because “Private vice makes for public good.”  That is, cheating, bullying, lying, greed, self-indulgence, and meanness are what make the world go around.  In the nineteenth century, this philosophy was represented by the so-called Social Darwinism of William Graham Sumner.  The rich are rich, Sumner claimed, because they are better people.  The poor deserve their poverty because they are worse.

In the twentieth century, Hobbes’s war of all against all was rationalized in the trickle-down theories of David Stockman.  It is better for everyone, he claimed, if the rich get richer because some of their wealth will trickle down to the poor.  The stock in trade of plutocrats in all ages, Hobbes’s thinking is currently the mantra of Donald Trump, for whom little people and refugees like the Whos are merely losers to be set aside while winners like him get on with life.

Hutcheson represented a contrary position.  He contended that humans are essentially social, and that society should be properly understood and operated on a mutual aid basis in which the gain of each is the gain of all.  He claimed that people are essentially empathetic, and that we inevitably share in the suffering and happiness of others.  Denying our responsibility for others in pursuit of selfish individualism is a self-defeating proposition, which only leaves one insecure and a loser, no matter how much one ostensibly wins.  Triumph over others is defeat for oneself.

In the eighteenth century, Hutcheson’s position was represented by Thomas Jefferson in The Declaration of Independence.  Jefferson took the phrase “pursuit of happiness” directly from Hutcheson, for whom it meant seeking one’s own happiness through helping others.  Pace Donald Trump and his Tea Party haters, the country was actually founded in empathy.

In the nineteenth century, Hutcheson’s theory was reflected in the cooperative ideas of Jane Addams, whose Hull House was a model of sharing and caring.  In the twentieth century, it was represented in Franklin Roosevelt’s declaration of the Four Freedoms to which all people are entitled – freedom of speech and expression, freedom of religion, freedom from want, and freedom from fear.  Embodied in the phrase one for all and all for one, the theory has been the stock in trade of liberals in all ages.  It has been the gist of Barack Obama’s slogan “Yes, we can,” with the emphasis on the “we,” and is currently the mantra of Bernie Sanders.  And it is the moral represented by Horton and his friends.[2]

Dr. Seuss’s World: Doing the Right Thing.

Dr. Seuss’s stories are above all else about our responsibility for each other and, especially, the responsibility of those with power to assist those without.  Sharing and caring are the keys.  The tension in his stories generally comes from disagreements about what is the responsible thing to do.  In Horton Hears a Who, it is the disagreement between Horton, who insists that he must protect the Whos, and Horton’s colleagues, who insist that they must help free Horton from his delusions.  But once Horton’s friends realize that Horton is not delusional, they immediately accept their responsibility as more powerful creatures to help the less powerful Whos.

One of the important points in the book is that no one, no matter how big and powerful, can succeed on his/her own.   Horton the elephant is by far the biggest animal in the story, but even he is liable to be overpowered by the combined efforts of the other smaller jungle animals.  Success, Dr. Seuss is saying, is social.  In turn, no one is too small and weak to make a difference.  It was the squeak of the last and smallest Who that finally enabled Horton’s friends to hear the Whos, and to realize the harm they were about to do. Failure, Dr. Seuss warns, can be individual.  So, everyone must help.  This message permeates all of Dr. Seuss’s books.

In Horton Hatches the Egg, Horton once again accepts a responsibility to take care of someone at risk, in this case a bird’s egg that has been abandoned by its mother.  Horton sits for what seems like months on the egg, through storm and stress, consoling himself with the mantra that “An elephant’s faithful – one hundred per cent.”  When the egg finally hatches, the infant is half bird and half elephant, a biological impossibility, but an ethical justice.  Most important, no one in the story rejects the baby elephant-bird as deformed or different.  The story is not just about Horton’s faithfulness, and the duty of those with power to help those without, but also about the willingness of others to accept diversity.

In Green Eggs and Ham, the conventional tables are turned, and an adult is being harassed by a child to try something new and different, something the adult thinks he won’t like.  It is normally the case that children are adjured by parents, teachers and other adults to try new things, things the kids think they won’t like.  In the end, the adult tries the green eggs and ham, and finds that he likes them.  The key to the story is that the adult is willing to admit he was wrong.  He does not merely try the green eggs and ham to get the kid off his back, and then save face by insisting that he still does not like them.  He is willing to swallow his pride, along with the green eggs and ham.  This is another instance of those with power accepting responsibility to support others.

Most of Dr. Seuss’s other stories – from The Sneetches to Yertle the Turtle to The Lorax to How the Grinch Stole Christmas and The Butter Battle Book – turn in the end on the idea that most people will do the right thing, the socially responsible and cooperative thing, if and when they realize what needs to be done.  Dr. Seuss is not a Pollyanna.  There are bad people in his books, and bad things happen to good people in his stories.  But there is always the possibility of reconciliation and consensus as an outcome.

Dr. Seuss treats what used to be called “the common man” and “the people” with respect.  People may be wrong, wrong-headed and ignorant, but they are not idiots.  He would seemingly support Lincoln’s claim that you can fool all of the people some of the time, and some of the people all of the time, but you cannot fool all of the people all of the time.  Dr. Seuss’s stories illustrate Lincoln’s adage, with the underlying assumption that most people can be reasoned with, and will change their minds and ways when they are given adequate evidence and appropriate arguments.

In this respect, Dr. Seuss’s stories stand in sharp contrast to children’s stories in which characters inevitably and irreconcilably fight one another, and in which the world is chronically ominous, dangerous and downright scary.  The stories of the three little pigs and the big bad wolf, Tweety Bird and Sylvester the Cat, and the Road Runner and Wile E. Coyote are prime examples of this.  In these stories, large predator animals seek to kill small prey animals.  Given their biological differences and genetic imperatives, there is no basis for reconciliation or consensus between the enemies.  The large animals are meat eaters, and the small animals are their meat.

In these stories, the small animals are made to look and sound like little children.  Since small children are intended to identify with the small creatures, these stories portray a scary world for children.  And even though there is some consolation in that the predators in the stories never get their prey, the message to children is that the world is a dangerous place full of big creatures trying to kill little creatures like themselves.  In a similar way, stories such as “Sleeping Beauty” and “Snow White,” in which an innocent young heroine is threatened by an evil adult witch, convey to children the message that evil is real, that evil is all around us, and that you can never tell who is hiding their evil intentions behind a benign smile.

These stories represent the world that Donald Trump inhabits, a realm of false smiles and perpetual fighting for domination, in which doing dirty unto others before they can do unto you is the law of the land.  But Trump’s world is even scarier than these storybook worlds, because in his world the three little pigs, Tweety Bird, and the Road Runner would be considered weaklings and losers, and they would get eaten.  Trump’s is a world in which sharing and caring, doing the responsible and empathetic thing, have no place.

Trump’s America.  Or is it?

I think that those of us who are appalled at the election of Donald Trump as President of the United States need to distinguish between three things to be able to go forward with some degree of optimism.  We need to distinguish between Trump the person, Trump the President, and Trump the ostensible representative of the American people.

Trump the person is abominable, and he is a classic loser despite his success.  The man is without couth or class and, seemingly, without conscience.  He is a perpetual adolescent, trying to assert himself amongst people whom he secretly seems to see as superior to himself.  So, he denigrates them, but he is really denigrating himself in the process.  He is a bully who relies on others to fight his battles, a billionaire who took his father’s money and did very little with it, a businessman whose only successful business has been in selling his name to a credulous portion of the public.  His racism, misogyny, ethnocentrism, and selfish self-centeredness represent most of the worst elements in American society.  As I write this essay, he is a seventy-year-old man about to become the most powerful person in the world, but he is still acting out in tweets and in rants the insecurity of a pimply adolescent.

As awful as Trump is as a person, it is not clear that he will be able to translate all that awfulness into his presidency.  As President, he will need to cope with his own ignorance, incompetence and short attention span.  He will also need to deal with a sharply divided Republican Party, most of whose leaders dislike him, and with a Congress, most of whose members face election in less than two years.   He will also face a public that does not like him, and that gave his opponent a significant majority of the popular vote in the election.  So, it is not clear how much of his awfulness can be translated into policy.

Finally, it is quite clear that Trump does not represent the values and political preferences of a majority of the American people.  He not only lost the popular vote, but it seems that most of his votes came from people who were opposed to Clinton, not in favor of him.  There is a plethora of reasons why he won the election or, rather, why Hillary Clinton lost the election, and his candidacy and election have unleashed some of the worst elements and tendencies in our society.  But it is not the case that the populace has in recent years turned to the far right.  And the continued popularity of Dr. Seuss is one small proof.

Dr. Seuss’s characters represent almost all that is best about America, and not merely his main characters, the heroes of the stories, but the supporting cast as well.  That is the key to the morals and ethics of his stories.  Most of us see ourselves not as heroes, but as members of the supporting cast in society.  Dr. Seuss portrays his supporting cast of characters as basically good people, who are empathetic and responsible.  That is the role in which he casts people like most of us and our children in his books.  He tells us and our kids that good in the world comes not merely from powerful heroic individuals such as Horton, but from the support of ordinary people like us who end up supporting Horton.  That parents and children continue to find comfort, amusement and instruction in Dr. Seuss’s stories is a source of hope that the ethics of Horton and Hutcheson will prevail in the long run, and that we will emerge as a decent society from the reign of Donald Trump.

[1] For a discussion of storytelling and the moral messages of different narrative forms, I have posted an essay on this blog site entitled “What to do about the Big Bad Wolf: Narrative Choices and the Moral of a Story.”

[2] For a discussion of the devolution of conservatism and the evolution of liberalism in America, I have posted an essay entitled “Do unto others before they do unto you: The Devolution of Conservatism from Burke to Trump and the Evolution of Pragmatic Liberalism from Madison to Obama” on this blog site.

 

False Equivalencies Equal Bad History and Bad Politics: Populism vs. Nativism/Sanders vs. Trump.

False Equivalencies Equal Bad History and Bad Politics:

Populism vs. Nativism/Sanders vs. Trump.

 Burton Weltman

The current election cycle has featured two candidates for President, Bernie Sanders and Donald Trump, who were outsiders within the Democratic and Republican Parties, respectively.  Both succeeded beyond anyone’s expectations, seemingly even theirs.  Sanders came close to winning the Democratic nomination, and Trump actually won the Republican nomination.  Pundits and politicians have been grasping for months for an explanation of these candidates’ success.

A frequent explanation given by observers is that both candidates are “populists,” and that both are channeling the motives and emotions that were represented by the Populists of the late nineteenth century.  In turn, the challenges that Sanders and Trump have made to their parties’ establishments have been considered by the pundits to be equivalent.[1]  It is, however, neither historically nor politically accurate to label them both as populists, and it does a disservice to political discourse to propagate the idea that their challenges are equivalent.

Populism/populism.  Populism (with a capital “P”) was a late nineteenth-century political movement that hoped to sustain the viability of small farmers and small factories in this country through cooperative programs that would give them the economies of scale of big businesses and big farms (cooperative purchasing and selling agreements, sharing expensive equipment, working collectively on various tasks), and through government regulations that would keep big businesses and big banks from trampling on the little guys (limits on railroad rates, storage fees, bank loans, and price gouging).

Contrary to much present-day popular belief, Populists did not naively promote wild-eyed programs that had no chance of implementation or success.  Many Populist proposals were adopted at the state level, and they worked.  Some were enacted at the federal level, and they worked, too.  There is, in fact, no good reason why we have to have giant corporate farms or giant corporate businesses in most things.  Small can be beautiful, and can work.

The United States Supreme Court, however, was controlled in the late nineteenth century by a group of Justices who literally believed that laissez-faire was written into the Constitution, and they overturned most Populist legislation on Constitutional grounds.  This did not mean that Populists were unrealistic.  New Dealers faced a similar obstacle with a conservative Supreme Court during the 1930’s.  They were eventually able to overcome that hurdle.  Populist ideas were widely popular.  Given some changes in circumstances, Populism could conceivably have become the conventional wisdom of the country, and Populist policies might have resulted in a very different and possibly better America.

William Jennings Bryan is one of the reasons Populism failed.  He is often labeled a Populist, but he was not a Populist.  He ran his 1896 campaign as essentially a Silverite.  The Silverites believed that if the federal government backed its currency with silver and not merely gold, all would be well with small farmers and businesses.  Populists supported the coining of silver money, but did not see it as a cure-all.  Rather than promoting Populism, Bryan’s campaign for President, with its single-minded focus on a silver bullet solution, helped to kill it.  He is responsible in large part for the Populists’ reputation as being unrealistic.

Populist programs were revived during the New Deal by Secretary of Agriculture Henry Wallace through the Department of Agriculture.  He promoted the TVA (there were going to be five such projects), cooperative farm programs, cooperative small factories, and Greenbelt cities.  And all of these programs worked.  Wallace’s programs died as a result of the conservative political backlash of the late 1930’s, and so did Populism.

Populism was a big and broad movement.  As such, it included many different factions and tendencies.  Populism had a racist and xenophobic element at its fringes.  So did the Socialist movement, the Progressive movement, and the labor movement at that time.  In fact, the taint of racism and xenophobia has attached itself to almost every movement in American history.  But they were not major themes in Populism and they have not been major elements in Leftwing movements generally, as they have been in Rightwing movements to the present day.

Populism (with a capital “P”) was a producers’ movement that focused on the ways and means of producing things, with a goal of sustaining smallish scale production.  The populism (with a small “p”) that survives to the present day is a consumers’ movement that focuses on getting better wages, working and living conditions, and social services for ordinary people and people who are hard up.  The Bernie Sanders campaign was part of that movement.  And his campaign began with some very sensible proposals about what could be done now at the state level (single payer healthcare, higher minimum wage, environmental protections and many other populist programs can be adopted at the state level) and at the federal level (executive orders can do a lot).  As success went to his head, Sanders’ claims became progressively less realistic, but that does not mean that his campaign was based on naivete and wild-eyed proposals.

Nativism.  Nativism is not populism.  It is the Rightwing response to populism.  Nativism is the use of fear of racial and ethnic minorities as a means of promoting the social status quo and protecting the social position of those in control of a society.  Nativism, along with its components, racism and xenophobia, has typically risen in this country in times of populist upheaval and social reform.  Nativism is an attempt to squelch social change through fear and hatred.  Its mantra is that change will help only Them and will hurt Us.

Contrary to popular opinion among pundits, George Wallace was not a populist.  He was a racist nativist.  Donald Trump is not a populist.  He is a racist and xenophobic nativist.  Nativism, racism and xenophobia are founded on the status anxiety of people with a little something, who are willing to support those who have the best and most of everything, in order to fend off and stay above those who have little or nothing.  Racism is the answer to the question of why Southern white small farmers supported with their lives a system of slavery that well served a few rich planters, but hurt them in every way other than in their ability to feel themselves above the black slaves.  Racism is the answer to the question of why so many whites today are so opposed to Obamacare.

Populism is a positive program of reform based on hope.  Nativism is a negative program of hate based on fear.  Populists and nativists are appealing to some of the same constituencies, but there is no equivalency in the appeals.

Consequences.  In turn, I would predict that there will be little equivalency in the consequences of the Sanders and Trump campaigns.  Assuming that Trump loses, his has been a campaign based on the lies that immigrants are taking American jobs, committing lots of crimes, and living on the dole paid for by good white American taxpayers.  His base has been old white people.  The lies will out and the old people will die out.  That could and should be the end of his influence.  Sanders’ campaign was based on truths about healthcare and wages, and his base has been young people.  His truths and his base will likely only grow so that even if he is personally done, his movement could and should live on.  Of course, if Trump wins, then all bets are off and God help us.

9/23/16

[1] See, for example, the recent article by John Judis, “All the Rage,” The New Republic, September 19, 2016.

Do unto others before they do unto you: The Devolution of Conservatism from Burke to Trump And the Evolution of Pragmatic Liberalism from Madison to Obama.

Do unto others before they do unto you:

The Devolution of Conservatism from Burke to Trump

And the Evolution of Pragmatic Liberalism from Madison to Obama.

 

Burton Weltman

 

“We’ve got what they want, and we aim to keep it.”

Vice President Spiro Agnew

 

Prelude: A Concern with Unintended Consequences.

My purposes in writing this essay are twofold.  First, I will outline what I see as the devolution of conservatism from its high point at its start in the eighteenth century to its low point as blatant racism, ethnocentrism and mere obstructionism in the present day.  I will focus on the historic concern of conservatives with the potential for unintended negative consequences in undertaking social reform, and their claim that negative results invariably overwhelm any positive change.  Edmund Burke, the father of conservatism, voiced this concern during the eighteenth century as a legitimate question of whether and how we can predict the results of social reform.

What began as a legitimate concern about unintended consequences devolved over the years into an excuse by conservative politicians to oppose any change that might negatively impact their wealthy sponsors.  That practice eventually devolved into a justification for opposing any program that might help racial and ethnic minorities, a coded appeal to the racial fears of white people.  In the current election cycle, what had been a coded appeal to bigotry has become open fearmongering and hate peddling by Donald Trump.  I will argue that the turning point in the devolution of conservatism came with the advent of Social Darwinism at the turn of the twentieth century, and the acceptance of its basic premises by most conservative politicians.

Second, I will argue that the evolution in the early twentieth century of pragmatism as a comprehensive social theory and practice undermined the rationale for conservatism and transformed the rationale for liberalism.  Backed by the methods of the then newly emerging social and physical sciences, pragmatism offered a way for social reforms to be subject to experimental methods, ongoing evaluation, and continuous revision.  This pragmatic review process could effectively mitigate most legitimate concerns about the unintended consequences of reform, rendering conservatism obsolete.  Politics could safely become a realm of continuous social reform, which is the position represented by President Obama.

Act I.  Actions, Reactions, and Reactionaries: The Birth of Liberalism and Conservatism.

“To every action there is always opposed an equal reaction.”

                 Isaac Newton.

“Ambition must be made to counteract ambition.”

               James Madison.

“We must all obey the law of change.  It is the most powerful law of nature.”

              Edmund Burke.

 

Setting the Scene: Let us reason together.

It was the turn of the eighteenth century.  Europeans had suffered through almost two centuries of political upheaval and religious wars.  The Protestant Reformation had precipitated the Catholic Counter-Reformation, which had led to Protestants and Catholics slaughtering each other, and to both Christian groups killing Muslims and Jews.  At the same time, the decline of feudalism had precipitated the economic upheaval of nascent capitalism, with land enclosures creating massive unemployment and unrest.

Europe was, however, about to enter a period that contemporaries called the Enlightenment in which prominent intellectuals and their backers tried to leave behind the superstitions, authoritarianism and violence of previous centuries.  And it was a period of relative calm compared to the recent past, despite the imperial rivalry of England and France, who engaged in a series of imperial wars from the 1690’s through the 1810’s.  During one of those wars, the French helped a group of North American colonies gain their independence from England, and establish the United States.  Calmness and control were watchwords in culture and society during the period.  These goals were reflected in the scientific and political theories and practices of the time, which included the rise of liberalism and conservatism as political philosophies.[1]

Isaac Newton’s World: Inertia, Friction and Orderly Change.

The eighteenth century marks the definitive opening act of modern science and politics.  By modern, I mean the theories and practices from which we most closely derive our own ideas today.  There are many people who can be cited as precursors of modernity, for example Bacon and Galileo in the physical sciences.  But their ideas were not given full exposition until the work of Isaac Newton at the beginning of the eighteenth century.  Newton established a framework that dominated the physical sciences for some two hundred years.  Most notably, in his Three Laws of Motion, Newton reversed scientific theories that dated back to Aristotle, and rejected common sense human experience as well.

In his First Law of Motion, Newton claimed that something in motion would continue moving in a straight line forever unless it was disturbed by some change in circumstances, some force that pushed it out of its inertial course.  That law was in direct contradiction to the ancient theory of Aristotle and to our common sense experience that a thing must be continuously pushed by a force in order to continue in motion.  In our common experience, things grind to a halt unless they are pushed.  That is mainly the result of friction, but since we live in a world of friction, we usually take it for granted, and do not factor it in as a countervailing force in our thinking about things.  Since we have little experience of things moving in a vacuum, in which there is no friction, Newton’s First Law is counter-intuitive to most of us.

Newton’s Second Law of Motion describes the change in circumstances, that is, the force, necessary to change the inertial course of something – to start it, stop it, or redirect it.  His Third Law emphasizes that for every action, there is an equal and opposite reaction.  Push and you will be pushed back.  This also seems counter-intuitive to most of us, as we do not experience as pushback the inertial resistance of something we are pushing.  We merely think of it as the heaviness of the object, not that the object is pushing back at us.
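For readers who prefer the symbolic shorthand, the Second and Third Laws are conventionally summarized in modern notation (which is not Newton’s own) roughly as follows:

F = ma, where F is the net force acting on a body, m is its mass, and a is the resulting acceleration (the Second Law); and

F(A on B) = −F(B on A), that is, the force that body A exerts on body B is equal in magnitude and opposite in direction to the force that body B exerts back on A (the Third Law).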

In his Laws of Motion, theories of gravity, and other work, Newton described a mechanical universe of complementary and competing forces, in which things take their customary course ad infinitum, unless they are forced to change by natural or unnatural circumstances.  These Laws of Motion were not only counter-intuitive to common sense experience, they also described a more orderly picture of the world than was experienced by most people.  Most Europeans were still reeling from the consequences of the religious and political wars of the sixteenth and seventeenth centuries, and the social and economic upheaval of nascent capitalism.  Most ordinary people lived precarious lives in circumstances that seemed in constant turmoil.  In the religious and political beliefs of most people, the only thing that kept things going and kept them in order was the constant intervention of God, the King and/or some strong outside force.

Newton disagreed.  Although he was a deeply religious man, who spent more time and effort on his studies of religion and ethics than he did on science, Newton’s scientific theories delineated a universe that was very different from that portrayed in conventional religious and political theory.  Contrary to the conventional view of the world as constantly teetering on the brink of turmoil, he portrayed a universe which was essentially stable, and in which ordinary people could choose to keep things the same or change them.  He was, thereby, describing the essence of our modern world view.[2]

Newtonian Politics and The Rise of Conventional Political Ideology. Developments in political theory and practice during the eighteenth century followed a course similar to that of physics.  Sharing a Newtonian view of the universe, newly evolving political theories described a political world which operated mechanically and predictably, instead of on the edge of chaos, and in which people could choose their governments, being no longer tethered to Divine Right Kings.  In this political development, liberalism came first, and conservatism came in reaction.

The liberal and conservative ideologies that emerged during this time dominated political theory and practice in England and America for some two hundred years.  They are still influential today.  Aspects of these ideologies were developed by Thomas Hobbes and John Locke during the seventeenth century, Hobbes a conservative forerunner, Locke a liberal forerunner.  Their ideas were given full exposition during the eighteenth century in the theories and practices of the liberal James Madison and the conservative Edmund Burke.[3]

Liberalism: The Obvious Truth.  The term “liberal” began as an ethical concept that denoted generosity.  A liberal person was someone, usually a person of station and means, who gave generously to the less fortunate in society.  During the eighteenth century, the term was extended to politics.  In politics, a liberal was a social reformer and social planner, usually a person of station and with a formal education, whose proposals were designed to make society fairer and more efficient, and were generously intended to help the less fortunate and oppressed in society.

Political liberals, like most devotees of the Enlightenment, believed in the power of Reason (with a capital R).  They generally held that one could derive self-evident truths through reasoning, and then develop social policies based thereon.  They were planners, who thought that if something was wrong, they could rationally design a fix for it.  They were impatient with tradition, which they saw as the sepulchral grip of the dead hand of the past choking the present, and insisted on change as the function of reason.  Nature was, to them, something to be tamed and made to work for humans.  Finely landscaped gardens, neatly plowed and hedged wheat fields, and clearly mapped roads and routes were their ideal of nature.  Human nature similarly had to be tamed and bounded, even as social problems were being solved.

Most eighteenth century liberals assumed a social hierarchy in which the People would instinctively defer to their natural leaders, that is, to those in the social and educational elite of society, so long as those leaders fulfilled their natural obligations to rule on behalf of the People.  Government was the result of a contract with the People, and the People acted as a check on the elite.  The Declaration of Independence and the Constitution of our Founding Fathers exemplified eighteenth century liberal theories.  Based on “self-evident truths,” the Declaration outlines a philosophy of equality and of “life, liberty and the pursuit of happiness” that is derived from Reason, and that balances the rights and duties of subjects with the powers and duties of their rulers.

The Constitution follows the philosophy of the Declaration in establishing a government of separate powers that were expected to check and balance each other, even as they worked together to “promote the general welfare” and provide other social goods for “We, the People.”  The Constitution describes a Newtonian political universe of actions and reactions.  Its original provisions even established different mechanisms and constituencies for the selection of members of the different branches of government.  The purpose of this complicated process was to ensure that no one group in society would dominate the government, and that the majority could not oppress minority groups.  It was also intended to facilitate the selection of members of the elite to most offices.

While the Founders were concerned with restraining politicians from running wild and ruining things, the Constitution also assumes an active government and continuous social reform.  It provides the federal government with powers to make changes in almost every area of society, including the government itself.  It is a short document, short on specifics, and therefore needs constantly to be interpreted and re-interpreted according to changes in society.  It also contains provisions for amending itself and, thereby, assumes that government must be changed as society changes.  Liberal social reform is incorporated into the fabric of the Constitution.

Critics of the Enlightenment have frequently contended that liberals of that time foolishly believed in the inevitability of progress.  That is not the case.  While many Enlightenment liberals, including Thomas Jefferson, the primary author of the Declaration of Independence, and James Madison, the primary expositor of the Constitution, may have in some ways been fools, they believed only in the possibility, and not the inevitability, of progress.  The weakness in their proposals was often in the paucity of evidence on which they were based.  Relying heavily on examples from ancient history, especially those of Greece and Rome, and on inevitably biased accounts of recent events, the Founding Fathers often rushed to judgments that proved wrong.  Although they relied on the best available evidence, that evidence was often not good enough.

The rationale for the American Revolution was, for example, based on an inappropriate comparison of George III with Charles I, and on inaccurate reports from England about the doings and desires of the King.  The Revolution may have been a mistake.  The Founding Fathers were also seemingly mistaken in their expectations of the outcome of the Revolution, which is why they so quickly abandoned the Articles of Confederation for which they had fought, and established a very different government in the Constitution.  Government and politics under the Constitution turned out, in turn, to be very different from what the Founders had intended and expected.[4]  This weakness in the predictive powers of liberal reformers opened the door for a conservative counterattack.

Conservatism: Old Truths are the Best.  Edmund Burke is almost universally considered the father of modern conservatism.  He was also almost universally considered by contemporaries to be a man of principle.  As an example, although Burke opposed the liberal philosophies embodied in the Declaration of Independence and the Constitution, he supported the American revolutionaries in their battle for independence from British rule.  A conservative supporting a revolution, and bucking his own political party and party leadership to boot.  To most of us today, this seems like odd behavior for the ur-conservative.  But that is the difference between what most people think of as conservatism today, as it is represented by most so-called conservatives of the Social Darwinian school, and what conservatism represented in the past.

The term “conservative” began as an ethical concept that denoted caution and frugality.  The term was extended to politics during the late eighteenth and early nineteenth centuries as part of the Romantic revolt against the Enlightenment and against liberal rationalism.  It is popularly thought that conservatives have always opposed all social change, and that they have wanted everything to stay the same or even go back to the way things were in the past.  This is not the case.  Conservatives have historically accepted cautious social change.  People who oppose any and all progressive social change are more accurately called “right-wingers.”  Right-wingers generally represent interest groups that benefit from the status quo, and that fear that social reform would entail a loss of power, profit and/or status.  And it is so-called “reactionaries,” not Burkean conservatives, who peddle nostalgia for the so-called “good old days” (which usually weren’t so good), and who want things to go back to the way they supposedly were in the past.

In contrast to right-wingers and reactionaries, Burke believed in incremental evolutionary change.  He rejected planned change, but accepted adaptive change.  He believed that society is strongest when it changes so gradually that the changes are barely noticed from generation to generation, and may only be recognized from a long historical distance.  He believed that tradition was the distilled wisdom of the ages.  And he believed that human reason was too weak and short-sighted to safely predict the consequences of social planning.  Burke insisted that the unintended negative consequences of social reforms were almost inevitably going to be greater than the positive effects.  However bad things might be now, they would likely be worse if people took action to remedy the situation.

Burke’s insistence on the limits of reason and his concern with the unintended consequences of reform comprise the most powerful legacy that he left to conservatives.  These ideas have historically been conservatives’ strongest argument against social reform.  They constitute an almost universal argument that can be used against almost any proposed reform.  Burke did not, however, oppose all reform.  He would support social reform if the survival of the social system seemed to require it, and if conscience and human decency seemed to demand it.

Burke believed in a hierarchical society controlled by an elite upper class.  But Burke’s elite could not merely pursue their own self-interest, even if it was justified with some sort of trickle-down theory of social benefits, as right-wingers proclaim today.  Burke’s elite were burdened with the obligation of caring for society, which included the noblesse oblige of the upper class to take care of the masses, a sort of mandatory charitable giving.  He was a vehement opponent of democracy, which he warned would lead to the subjugation of society by an ignorant mass.  But he also opposed oppression of the masses and persecution of racial and religious minorities by the elite.  His insistence on treating people decently was considered a matter of honor among conservatives during the nineteenth century, even if it was a principle that was almost always more honored in the breach.  It is a legacy that is all but gone among so-called conservatives today.

It was out of what he considered respect for tradition and the demands of decency that Burke supported the American revolutionaries.  He claimed that the King and Parliament had taken advantage of the British victory over the French in America in 1763 to radically change the terms on which the American colonies were being governed, and that tradition was being violated.  He thought also that the British government was being too harsh in its treatment of the colonists, and that noblesse oblige was being violated.  As a result, he believed the Americans were justified in rebelling against British misrule.

Burke had a deep respect for the facts.  The historical facts, the facts of evolutionary social change, and the facts of present-day problems were the foundation of his conservative ideology.  He accepted what was, and he did not hanker after what could be or what had been.  He challenged both liberals and reactionaries with what he saw as the facts.  This respect for facts also made him flexible.  He disdained Reason (with a capital R), but attempted to be reasonable.  He was the founder of conservative ideology, but he was not a conservative ideologue.

Act II. Dogmatism versus Pragmatism: Ideologues versus Ideology.

“It is not the strongest of the species that survive, nor the most intelligent, but the most responsive to change.”

“If the misery of the poor be caused not by the laws of nature, but by our institutions, great is our sin.”

             Charles Darwin.

“The social order is fixed by laws of nature precisely analogous to those of the physical order.”

“Millionaires are a product of natural selection…Poverty and misery will exist in society just so long as vice exists in human nature.”

            William Graham Sumner.

“Our institutions, though democratic in form, tend to favor in substance privileged plutocracy.”

“Selfishness is the outcome of limited observation and imagination.”

           John Dewey.

Setting the Scene: Trying to find order in the midst of disorder.

It was the turn of the twentieth century. Change was the order of the day.  The nineteenth had been a century of revolution.  Europeans and Americans had suffered through the beginnings of the industrial revolution, which had produced enormous wealth for plutocrats but misery for the working classes, huge cities ringed by wealthy suburbs but with slums in their center, an abundance of goods but want among the masses, powerful inventions but large-scale environmental degradation, and miraculous medical advances but widespread disease.  There had also been a host of political revolutions, civil wars and other upheavals, as democratic aspirations gradually overcame aristocratic opposition in Europe and America.

The intellectual world was upended by the emergence of the specialized physical and social sciences, with their empirical and statistical methods, replacing the traditional emphasis on the classics and on Reason.  A cultural revolution was instigated by the publication in 1859 of Charles Darwin’s The Origin of Species, a revolution that gathered force over the latter half of the century.  The book put the theory of evolution and the consequences of evolution at the center of moral, intellectual and political life, where they remain today.

Charles Darwin’s World: Pragmatism, Relativism, and Probabilities.

The turn of the twentieth century was the age of Darwin.  Evolution was both the rage and a source of outrage.  Agnostics and atheists saw it as vindication of their beliefs or non-beliefs.  Protestant fundamentalists and Biblical literalists, in turn, damned it as sacrilege.  Scientists saw it as encouragement to take a more probabilistic and relativistic view of their fields.  Philosophical positivists and intellectual absolutists damned that as nihilism.  And it led some leading liberals and conservatives to revise their respective political beliefs, much to the chagrin of purists in both camps who damned that as unprincipled and immoral backsliding.

Theories of Evolution.  Darwin’s was not the first theory of evolution.  In the early nineteenth century, Jean-Baptiste Lamarck had proposed what became a widely popular theory of evolution in which he claimed that creatures could genetically pass on to their progeny characteristics that they had acquired during their lifetimes.  Under Lamarck’s theory, for example, it could be said that giraffes acquired their long necks by dint of successive generations of giraffes stretching up to reach leaves at the tops of trees.  This theory implied that human families, ethnic groups, and racial groups could improve themselves through personal achievements that they then passed down to their descendants.  This seemed to mean that people were ultimately responsible for their own biological and social successes and failures.  A moral value could be attached to biological characteristics and to social success or failure.  People got what they deserved.

Darwin rejected Lamarck’s theory.  His own theory was based instead on two key ideas, random variation and natural selection, which generated most of the opposition to it among religious fundamentalists.  Darwin claimed that new characteristics are not acquired through personal effort but through random genetic variation, essentially through what we would call mutation.  We cannot tell how or why these mutations occur.  It is pure happenstance to us.

This idea outraged many religious people and was greeted with glee by atheists.  It does not, however, necessarily mean that God is out of the evolutionary picture.  What is random to us humans could be planned by God.  It does not even mean that the creation stories in the Book of Genesis are invalid, if you read the stories metaphorically rather than literally.  The Catholic Church and most liberal Protestant groups read the Bible metaphorically and, therefore, have had no problems with Darwin’s theories.  But Protestant fundamentalists and Biblical literalists have rejected this view, and have rejected evolutionary theory.  They have, in turn, from that time to the present created havoc with the science programs in many American school districts.

Darwin also claimed that species survive and thrive based on their adaptability, through a process he called natural selection.  Natural selection works on the ability of creatures either to respond successfully to environmental changes and challenges, or to fail and disappear.  Living things survive by trying to fit themselves into the existing environment.  They are assimilationists.  But they also try to better fit the environment to themselves.  They are social and environmental reformers.  The impetus for social reform is, thus, built into the structure of life.  Without it, we would die out.

Cultural relativism and ethical pragmatism are implicit in Darwin’s theory, and political and religious dogmatists have rejected Darwinian ideas for this reason.  According to Darwin, the ability of living creatures to survive and thrive is based on the adaptability of their beliefs and practices.  If they adopt beliefs that do not work toward survival, they will disappear along with those beliefs.  If circumstances change and they are not willing or able to change with them, they will not survive.  Humans and other living creatures must take a tentative and probabilistic approach to beliefs and practices, willing and able to change them as circumstances require.

Darwin is popularly known for two main ideas, neither of which was his, but which were the foundation of Social Darwinism.  They are the idea of survival of the fittest, and the idea that there are inevitably losers as well as winners in evolution.  The latter idea derives from the population theories of Thomas Malthus.  Malthus claimed that population growth inevitably outpaces resources, so that there are never enough resources to satisfy everyone.  In Malthus’ view, it is only through war, disease and famine that the human population has been kept under relative control.  And he opposed charity for the poor because it would only encourage them to have more children.

Malthus’ ideas are the inspiration for what is today known as the “zero-sum” theory of economics.  According to this theory, there is a limited amount of wealth in the world, not enough to make everyone well-off, and if some people get more, others must get less.  Darwin was inspired by Malthus’ population growth theory as an explanation for the rise and fall of the population of some species, but he did not use it as a general explanation of evolution.  Nor did Darwin think that human evolution was inevitably Malthusian.

Survival of the fittest was a term invented by Herbert Spencer.  Spencer had been a devotee of Lamarck’s evolutionary theory, and he believed that fitness was a moral achievement.  Social success as well as biological success were personal achievements that made a person fit to survive and thrive.  Social failure, according to Spencer, was a sign of genetic unfitness and unfitness to survive.  Darwin adopted the phrase “survival of the fittest” in his later works, but without any of the moral overtones that Spencer gave it.

Fitness did not mean for Darwin that one was the strongest, smartest, most powerful, most socially successful, or best in any other way except that one was able to fit oneself to the environment and fit the environment to oneself.  Spencer became a well-known supporter of Darwin’s biological theories, but used them to support his own so-called Social Darwinian social and economic theories, that neither Darwin nor Darwin’s theories supported.[5]

The Influence of Evolution on Philosophy and Science.  The theory of evolution ushered in a sea change in science from a positivist emphasis on finding absolute natural laws to proposing relativistic and probabilistic theories.  Mendel’s genetic principles in biology, Einstein’s theories of relativity in physics, and Heisenberg’s uncertainty principle in quantum mechanics were among the scientific advances of the late nineteenth and early twentieth centuries that promoted a relativistic approach to truth.  William James’ radical empiricism and John Dewey’s experimentalism were among the philosophical applications of evolutionary theories.  This turn toward relativism on the part of scientists and philosophers generated an emotional reaction against science and philosophy among religious fundamentalists that continues to the present day in the United States.

It is a reaction that is based on misunderstanding.  Relativism does not mean that anything goes, or that there are no standards.  Relativism is not nihilism.  In saying that something is relative, one must always be willing to respond to the question “Relative to what?”, and be able to delineate some stable benchmark that provides a standard for evaluating the relativity of the thing.  In evolutionary theory, for example, survival is the standard by which things are evaluated.  In pragmatist philosophy, whether something works as an answer to a question is the standard.[6]           

The Evolution of Evolutionary Politics: Pragmatist Action, Dogmatist Reaction.  During most of the nineteenth century, liberals and conservatives shared many basic ideas, and their programs often overlapped.  Both liberal and conservative movements were broad-based, with a wide range of beliefs within each movement, and with the left-wing of conservatism shading into liberalism and the left-wing of liberalism shading into socialism.  Both groups had to adapt to the democratic trends of the time, and both hoped to bring order to democracy through the leadership of a meritocratic elite, albeit they had different types of elite in mind.

Conservatives generally looked to the rich to lead society.  Thomas Carlyle, among others, eulogized capitalists as “captains of industry” who ought to take command of society.  Liberals generally focused on education as the primary criterion for leadership, as they for the most part still do today.  John Stuart Mill, the leading liberal of the nineteenth century, advocated that those with more formal education should get more votes than those with less education, and Karl Marx, the leading socialist, promoted leadership by political theoreticians such as himself.

Both liberals and conservatives sought to promote industrialization, but with different emphases on how wealth should be distributed, and what sort of role government should play in the economy.  Both groups believed that government should encourage growth, and discourage corruption and crass exploitation.  Conservatives generally favored government intervention in the economy only if a problem was so severe that it threatened the social system.  Liberals generally supported government action to deal with a wide range of social ills.  Conservatives did, however, support reform on humanitarian grounds.  It was English conservatives in the early nineteenth century who first proposed labor laws to protect working women and children.  And Abraham Lincoln, the ur-Republican, was a corporate lawyer who also supported labor rights as well as an end to slavery.

During the last half of the nineteenth century, economic and political events challenged the ideologies of both liberals and conservatives in the United States.  Economic depressions, violent labor disputes, rampant infectious diseases, overcrowded cities, rising crime rates, and other crises upset the orderly ideas of both groups.  Darwinian ideas of evolution came along at a time when both liberals and conservatives were looking for explanations of what was going on.

Avant-garde intellectuals and activists among both liberals and conservatives seized on evolutionary ideas, but with very different applications and very different results.  The application of Darwin’s ideas to politics produced major splits within the ranks of liberals and conservatives, with the old guard in both groups fighting rear-guard actions to the present day.  An ever-widening split also developed between the Darwinian liberals and Darwinian conservatives who increasingly came to dominate the Democratic and Republican parties.

Social Darwinism: Every Man for Himself.  Social Darwinism was adopted by many erstwhile conservatives at the turn of the twentieth century as a rationale for control of society by the wealthy, and as a strategy for convincing the masses to support rule by the rich.  Historians have debated exactly how many people used the term Social Darwinism to describe themselves.  It is clear, however, that the ideas and the strategy represented by the term became increasingly influential among conservatives starting in the late nineteenth century and continuing to the present, even as conservatives increasingly rejected Darwinian theories of evolution.

These ideas can be summed up in two phrases, Malthusian catastrophe and survival of the fittest.  The strategy can be summed up in one word, fear.  A Malthusian catastrophe occurs when the downtrodden masses rise up and use up all the resources that the rest of us need to thrive, so that we all go down to a hellish existence together.  Malthusianism is the prediction of dystopia unless the masses are kept strictly in check.  It is an idea that gained currency when the closing of the American frontier in the 1890’s seemed to presage the closing down of opportunity, and it has gained traction again in the present day, when globalization seems to have a similar import.

Survival of the fittest means the cultivation of wealth and a cult of the wealthy.  According to this theory, laissez-faire capitalism is the competitive law of nature translated into an economic system, and it is ostensibly the single greatest vehicle for human evolution.  The winners in cutthroat capitalism are the best specimens of humanity, and having won the economic race are the ones who should lead the human race.  The losers in the race should be left behind, lest they become a drag on the rest of us.  This winner-takes-most theory is sometimes rationalized as what has come to be called “trickle-down” economics and culture.  The claim is that when the rich get more of something, some collateral benefits will trickle down to the rest of society.

Fear-mongering was the strategy to implement this theory.  It was a means of convincing those people who have little to support the reign of those people who have a lot in order to protect themselves against those people who have nothing.  Social Darwinism was an ideology and a strategy that allowed conservatives to eschew concern for the welfare of the masses that Burke had considered a matter of honor.  The poor get what they deserve, which is nothing, as do the rich, which is a lot.  Those who have a little bit are frightened into aligning with the rich.

In this theory, the last shall stay last because they chose their own fate.  This view of the poor gave conservatives an even more powerful argument against social reforms than Burke’s concern with unintended consequences.  According to this theory, giving to the poor only wastes precious resources and threatens catastrophe for the rest of us.  As Vice President Spiro Agnew once opined, the downtrodden want what we’ve got, and we’ve got to make sure they don’t get it.  Fear trumps decency, and we have to do unto them before they do unto us, meaning the masses have to be tricked into compliance when possible, repressed into compliance when necessary.[7]

From Herbert Spencer, William Graham Sumner and Andrew Carnegie at the turn of the twentieth century, to William Buckley, Joseph McCarthy, Richard Nixon, and Spiro Agnew in the mid-twentieth century, to George Will, George W. Bush, Dick Cheney, and Donald Trump in the twenty-first century, the proponents of Social Darwinian ideas and strategies have gained increasing prominence among so-called conservatives, and especially within the Republican Party.  Some conservative followers of Ayn Rand, such as Rand Paul and Paul Ryan, have taken to calling themselves libertarians, but they are still Social Darwinians.  All of them should really be called right-wingers or reactionaries, not conservatives in the Burkean sense.

Whatever they call themselves, their ideology is based on the twin principles of zero-sum and laissez-faire economics, and on a strategy of fear.  The strategy promotes nativism, since only those like us can be trusted, and racism, since those unlike us must be feared, especially those who look different.  And Social Darwinian right-wingers are constantly looking for an enemy to fear.  Although Burke and his conservative descendants were by no means loath to use extreme force and fierce repression against those they considered dangers to the social order, they did not work overtime to invent dangers in order to justify their rule, as have generations of Republicans in the United States.

From the swarthy tramps, immigrants and anarchists at the turn of the twentieth century, to the blacks and bearded Communists in the mid-twentieth century, to the blacks, Hispanics, Arabs, Muslims, and olive-skinned immigrants in the early twenty-first century, fear-mongering has increasingly been the primary strategy of Republicans.  The Other is the danger, and repression is the answer.

With the decline and fall in the late twentieth century of the Soviet Union and Communists as threats, conservatives were hard put to find an enemy with which to scare the public.  George H.W. Bush was so desperate that he invaded Panama to overthrow Manuel Noriega, a former CIA operative and well-known drug trafficker, who had somehow become a grave danger to America.  Noriega is still in jail today, and drug trafficking is more widespread than ever.  The desperation implicit in this type of scaremongering demonstrates the depth of the worry among right-wing politicians that without a dangerous Other to fear, the public might no longer support their retrograde policies.  In the same vein, George W. Bush invaded Iraq to destroy weapons of mass destruction that were not there, with disastrous consequences that continue to the present.

The history of the Republican Party during the twentieth century has been the gradual decline, and now almost complete fall, of Burkean conservatives within the party.  This is a development which is popularly characterized as the disappearance of so-called moderate Republicans.  From Teddy Roosevelt, to Wendell Willkie, to Nelson Rockefeller, the Republican Party had for much of the twentieth century a progressive wing that curtailed the extremism of Republican right-wingers, and was willing to work with moderate Democrats toward bipartisan policies.

But with the rise of Newt Gingrich as Speaker of the House of Representatives in the 1990’s, who shut down the federal government rather than cooperate with President Bill Clinton, and with the advent of the current Speaker Paul Ryan along with Senate Majority Leader Mitch McConnell, who have stonewalled every proposal of President Obama for the last seven and one-half years, right-wing Social Darwinians have taken over the Republican Party.  The recent nomination of Donald Trump for President only confirms what has been obvious for some time.

Darwinian Pragmatism and Progressivism.  The term Social Darwinism was a misnomer twice over.  It was not a social but an anti-social doctrine, a doctrine of selfish, self-centered individualism.  And it was not a Darwinian but an anti-Darwinian doctrine, one that ran contrary to Darwin’s conclusion that humans have thrived because of their pro-social tendencies.  The pro-social implication of Darwinism was one of the reasons that conservatives increasingly came to reject Darwin’s actual theories of evolution over the course of the twentieth century, even as they increasingly embraced Social Darwinian ideas and strategies.

Darwin contended that socialization rather than individualism was the key to human success.  It was because of our cooperativeness, not our competitiveness, that we humans have done as well as we have.  And, Darwin complained, it is largely as a result of competitiveness and our sometimes selfish individualism that we have frequently done so poorly.  The pro-social implications of Darwinism were first given extensive treatment in 1883 in Lester Frank Ward’s book Dynamic Sociology.  In one of the first texts of the emerging field of sociology, Ward outlined a pragmatically socialist Darwinism as the genuine evolutionary theory.

Pragmatism was one of the outcomes of Darwin’s evolutionary theories, seemingly an unintended consequence, but one that was quite influential and helpful.  Pragmatism is a philosophy that describes the world as a succession of circumstances, actions and consequences, with the consequences of an action becoming the circumstances that lead to the next round of actions.  Pragmatism is a philosophy of action.  Pragmatists focus on the convergence of theory and practice into action, or what is sometimes called praxis, and they explain the world as a confluence of interconnected actions.  Pragmatism is a preeminently pro-social philosophy, and it is an approach that can be applied to almost all human activities and fields of study.

Pragmatism developed from humble beginnings to become a comprehensive philosophy.  The term pragmatism was first proposed in the late nineteenth century by Charles Sanders Peirce as a contribution to lexicology, that is, a theory about the meaning of words.  Peirce claimed that the meaning of a word was our reaction to it and the action which it implies.  That is, what the word does to us and what we do as a result of the word.  A word, according to Peirce, is a call to action.[8]  Others took his concept of pragmatism as a call to action in a widening circle of fields.

William James took up Peirce’s ideas and applied them first to psychology.  His was a psychology of action, interaction and reaction.  Portraying the mind as “a stream of consciousness,” in which thoughts flow from one to the next in a constant interaction with each other and with the world, James claimed that the mind is neither a passive recipient of knowledge from the outer world nor an organ of logical conjugation.  Thinking is a dynamic activity in which the mind reaches out to the world, and interacts with it.  Thinking is a process of action and interaction.

James claimed, in turn, that our personal identities are defined by how we act toward people and things, and how they react to us.  We are our actions and interactions.  Contrary to Descartes’ claim that personal identity results from the reflection that “I think, therefore I am,” James proffered the explanation that “I think, therefore we are.”  That is, the only way I can know that I am, and who I am – the only way I can say “I” and be referring to my singular self — is through comparing and contrasting myself with others.  And the only way I can know who others are is by doing things with them.  Action, interaction and reaction are all we can know of ourselves.

James later extended these ideas to epistemology, that is, into a theory of knowledge.  Rejecting the Enlightenment idea of Reason (with a capital R) that ostensibly produced self-evident truths, he insisted that we know about things only from interacting with them.  We learn through doing, through action and reaction, precipitated by problems that we need to resolve.  Without the prod of problems, we would function solely on the basis of habit, and never think about anything in any significant way.  When problems arise that interfere with our habitual existence, we ask questions of the world, seek answers to those questions by looking for relevant evidence, and then either find answers or not.  Knowledge is a product of problem-solving, and expanding the realm of knowledge is a product of asking bigger questions and making wider and deeper connections among things.[9]

John Dewey took James’ idea of learning through doing and made it the cornerstone of his pedagogical theories.  It is a fact of life, he said, that we learn through what we do.  For example, a student who passively sits and takes notes about a subject in class is going to mainly learn how to sit still and take notes.  He or she is not going to learn very much about the subject.  It is only by actively engaging with the subject, and doing something with it, that the student will learn much of lasting value.  In formulating his educational theories, Dewey did something that pragmatists have frequently tried.  He took a fact of life and derived a proposed reform from it, in this case, a successful educational practice.

Dewey also extended the idea of learning through doing into an ethical theory which essentially embodies the Golden Rule that we should love our neighbors as ourselves, and we should do unto others as we would have them do unto us.  In formulating his educational ideas, Dewey took a fact of life and made it into an admonition.  In his ethical theories, he took an admonition and claimed it was a fact of life.  Dewey claimed that we do, in fact, love our neighbors in the way that we love ourselves.  The problem is that many of us do not think much of ourselves and, as a result, think the same of others.  People who think well of themselves will think well of others, Dewey concluded, and people who think well of others will think well of themselves.

Dewey claimed, in turn, that we do, in fact, treat others as we think they will treat us.  The problem is that many of us are afraid that other people will treat us badly, so we treat them that way first.  Too many people operate under the Social Darwinian principle of “Do unto others before they do unto you,” meaning that you should grab your goods before others get them.  Dewey would reinterpret that mantra and have us do good unto others before they do anything to us.  People who treat others well will likely be treated well by others, he claimed.  He proposed this tactic as a means of establishing a virtuous cycle of people treating each other well, as opposed to the Social Darwinian vicious cycle of people treating each other badly.[10]

Pragmatism was a theory and practice that underlay the emergence of the physical and social sciences at the turn of the twentieth century.  Through most of the nineteenth century, most of what we today call the physical sciences were studied and taught under the umbrella of natural philosophy, and most of the social sciences were studied and taught as moral philosophy.  There was, however, an explosion in the number of academic fields toward the end of the century, with the rise of the multitude of specializations in the physical and social sciences that have produced most of the scientific advances of the twentieth century.  These scientific advances were powered by newly developed experimental and statistical methods, and pragmatism was a driving force in these developments.

Pragmatism was, in turn, a driving force behind the emergence of the Progressive movement in the early twentieth century.  Progressivism was a broad-based and multifarious social movement, encompassing politics, culture, education, and virtually every aspect of modern life, from fashion to the arts to social policy.  It was a movement, not merely a party or a faction, and, as such, it included many different tendencies, and even some conservatives who bowed to its popularity.  In the midst of the swirling trends, Dewey and other pragmatist scholars, journalists and politicians developed a progressive social theory that ran directly counter to the Social Darwinism that was gaining strength among conservatives.

They took as a main theme Hegel’s claim that the self-development of each person is dependent on the self-development of others, and Marx’s formulation of this as “the self-development of each is dependent on the self-development of all,” and vice versa.  That is, a person can only make something worthwhile of him/herself while working with others, so that each of them and the society as a whole prospers.  Social Darwinians claimed that we live in a top-down zero-sum world, and relied on fear to rally support among the masses.  Progressives countered that we live in a cooperative world in which all boats rise together.  They promoted hope as their means to gain popular support.  Theories based on cooperation and strategies based on hope underlay almost all of the progressive social, political, educational, and cultural developments during the twentieth century, and are the gist of pragmatic liberalism to the present.

Act III. The Obsolescence of Conservatism and the Birth of Fascism?

“We build too many walls and not enough bridges.”

            Isaac Newton.

“In republics, the great danger is, that the majority may not sufficiently respect the rights of the minority.”

            James Madison.

“The only thing necessary for the triumph of evil is for good men to do nothing.”

            Edmund Burke.

Where do we go from here?

Fast forward a hundred years from the turn of the twentieth century to the turn of the twenty-first.  Pragmatic liberalism has become the predominant philosophy of the Democratic Party.  The Progressive Era reforms under Woodrow Wilson, the New Deal under Franklin Roosevelt, the Great Society of Lyndon Johnson, and the healthcare reforms of Barack Obama have all been a product of that philosophy.  The fact of the matter is that pragmatic methods, backed by the tools of the social and physical sciences, can make social reform safe and successful.

Social reform in Burke’s time was a blunt instrument.  Social reformers conceived of a reform, and then tried it.  They had little ability to predict the consequences of a reform, or to monitor and revise the reform as it was being implemented.  If it worked, that was fine.  If it didn’t, that was too bad, and people had to live with the negative consequences.  With the specialization of the social and physical sciences that emerged in the late nineteenth century, social reform was revolutionized, and a pragmatic approach to social reform became possible.  Since that time, we have developed statistical methods, social and economic models, testing regimes of all sorts, and a myriad of ways to evaluate whether or not a proposed social change is working.  The development of computers has enormously enhanced our abilities in this regard.  We can monitor the progress of a reform, see whether it is producing unintended negative consequences, and make adjustments accordingly.  We can protect ourselves against most of the unintended negative consequences that might arise from a reform.

The means and methods of pragmatic liberalism have absorbed and resolved the concerns of conservatives and the rationale for conservatism.  The flexibility that the new techniques and technologies bring to the process of social reform has undermined the core concern of conservatives about unintended consequences.  Conservatism has essentially become obsolete, and pragmatic social reform should be the order of the day.  Social reform can and should become the conventional wisdom of our society.  But only if the politics of our society will permit it.

Most of the problems that have developed in social programs over the last century have, in fact, been political problems, the result of either liberal proponents overselling their proposals or right-wing opponents obstructing the programs.  The present-day problems with Obamacare are only the latest example.  Burkean conservatives should have no big problem with Obamacare.  It is a market-based system that is motivated by common decency.  But Republican right-wingers have been determined to wreck the program, regardless of its successes, and irrespective of harm to individuals and society.  The program has, as a result, suffered from right-wing political obstruction, and reformers have been severely hampered in their efforts to revise and reform the program.

The Republican Party has, unfortunately, turned aggressively against the Progressive Republicanism that was promoted by Theodore Roosevelt and Bob La Follette in the early twentieth century, and against the moderate policies of the so-called Rockefeller Republicans of mid-century.  The Party has turned, instead, toward a radical Social Darwinism that is today epitomized by Donald Trump.  Over the last six years, the right-wing Republicans who control Congress have stonewalled every pragmatic proposal from President Obama and obstructed his work at every turn.  Meanwhile, Trump, the Republican presidential candidate, is flirting with fascism as his theory and practice.  We are a long way from the days of Newton, Madison and Burke, but their actions and their words still speak loudly, and they do not speak well of Trump or the Republican Party.

Postscript.

Not the End of Ideology but the Beginning of Politics: Pragmatic not Technocratic.

In 1960, the sociologist Daniel Bell published a book called The End of Ideology in which he claimed that ideological conflicts were coming to an end, and were being superseded by the technocratic administration of things.  He claimed that the future society would be a managed capitalism, in which technocratic elites would administer things that needed coordinating, and in which conflicts would take place only among experts around the technical edges of things.  The grand battles over ideas and utopias that had previously occupied history were obsolete and over.

In this prediction, Bell, a one-time Marxist, had turned on its head one of Marx’s utopian hopes, that once capitalism was overthrown and a communist regime fully implemented, government would wither away, leaving only a minimal non-coercive administration of things that needed coordinating.  Bell, still a social democrat but no longer a radical, applied the idea to capitalism.

The possibility of a capitalist system managed by a technocratic elite was not a new idea in 1960.  The Comte de Saint-Simon and Auguste Comte had proclaimed similar things during the nineteenth century.  Adolf Berle and Gardiner Means had predicted the evolution of competitive capitalism into managerial capitalism during the twentieth century.  Francis Fukuyama has predicted similar things in more recent years.  I don’t agree, and I think pragmatic politics should not be confounded with technocratic administration.

The gist of my argument in this essay is that the knee-jerk conservative objection to social reform, that we cannot sufficiently predict the unintended consequences of a reform, has lost its legitimacy.  We can sufficiently monitor most social reforms to make sure they are working as they should, and adjust them if they wander off course.  But that does not mean we will be ruled over by apolitical technical experts.

Our ability to plan and monitor social reforms does not mean the end of ideology or politics.  To the contrary, there will always be differences among people as to values and goals.  These will almost inevitably take the form of ideologies, and lead to political debates and struggles.  Rather than ending ideology and politics, the new pragmatic liberalism opens the door to ideologies and politics that are not bogged down by the knee-jerk nay-saying of conventional conservatism.  We should all be pragmatic liberals of one sort or another, but the differences will still be significant.

[1] On the Enlightenment, see Peter Gay. The Enlightenment: An Interpretation. New York: Vintage Books, 1968.

[2] On Isaac Newton, see James Gleick. Isaac Newton. New York: Pantheon Books, 2003.

[3] On James Madison, see Garry Wills. James Madison. New York: Henry Holt and Company, 2002.

On Edmund Burke, see Conor Cruise O’Brien. The Great Melody: A Thematic Biography of Edmund Burke.  Chicago: University of Chicago Press, 1994.

[4] On the coming of the Revolution and the making of the Constitution, see Gordon Wood. The Creation of the American Republic, 1776-1787. Chapel Hill, NC: University of North Carolina Press, 1969.  I have written extensively on whether and how the Revolution and the Constitution may have been based on mistaken analyses and expectations in several posts on this blog and in my book Was the American Revolution a Mistake? Bloomington, IN: AuthorHouse, 2013.

[5] On Charles Darwin, see Loren Eiseley. Darwin and the Mysterious Mr. X. New York: E.P. Dutton, 1979.

[6] For the influence of Darwin on philosophy in general and pragmatism in particular, see John Dewey. The Influence of Darwin on Philosophy. Bloomington, IN: Indiana University Press, 1910.

[7] On Social Darwinism, see Richard Hofstadter. Social Darwinism in American Thought. Boston: Beacon Press, 1955.

[8] On Charles Sanders Peirce and the origins of Pragmatism, see Louis Menand.  The Metaphysical Club: A Story of Ideas in America.  New York: Farrar, Straus and Giroux, 2001.

[9] On William James, see Robert Richardson. William James: In the Maelstrom of American Modernism. Boston: Houghton Mifflin, 2007.

[10] On John Dewey, see Robert Westbrook. John Dewey and American Democracy. Ithaca, NY: Cornell University Press, 1991.

 

Donald Trump and the Contours of American Decision-Making: Do we suffer from a collective thinking disorder? And what can we learn from Star Trek?

Donald Trump and the Contours of American Decision-Making:

Do we suffer from a collective thinking disorder?

And what can we learn from Star Trek?

Burton Weltman

“Insanity is repeating the same mistakes over and over again and expecting different results.”

Attributed to Albert Einstein.

 A.  The Irony of American Decision-Making: A History of Self-Defeating Policies.

“It is curious how often you humans manage to obtain that which you do not want.” 

Mr. Spock on Star Trek.

Why, in the name of peace, has the United States been involved in more wars since the founding of our country than any other country during that period of time (we have been at war in over 200 of our 239 years)?[1]  Why, in the name of self-protection, does the United States have the highest rate of gun ownership (some 88.8 guns per 100 people), but also the highest rate of gun deaths of any industrialized country?[2]  Why, in sum, do American policies often seem to produce the things they are supposed to prevent?  Is this crazy, or what?

Take for example the invasion of Iraq in 2003.  The United States conquered Iraq and overthrew its government in order to eliminate weapons of mass destruction that might threaten us.  It was a preemptive strike to eliminate a potential threat.  There was no evidence that the weapons existed, but George Bush, Dick Cheney and Donald Rumsfeld inveigled the mass media, scared the public, and stampeded the Congress into supporting the invasion.  It turned out, of course, that there were no such weapons.  In fact, the patient containment policy of President Clinton during the 1990’s had seemingly led Saddam Hussein to destroy Iraq’s chemical weapons.

The conquest was also supposed to help bring stability to the Middle East.  As a consequence of the invasion, however, Iraq became a haven for terrorists who continue to pose an actual threat to us.  The invasion also ignited a firestorm of violence in the Middle East and around the world that we are still struggling to get under control.  The Iraq invasion stands as one example among many in our history of the irony of provoking violence in the name of preventing it.

Take also the example of America’s gun policies.  Americans are the most highly and widely armed people in the history of the world.  The ostensible goal is for people to be able to protect themselves against violence.  And every time there is a significant incidence of gun violence in the country, people buy even more guns as a preemptive move to protect themselves.  But this is a self-defeating policy both for individual people and the populace as a whole.

The data is clear that people who own guns are more likely to be shot than those who do not.  And the most likely persons to be shot with a gun that you own are you and people you know.  There is almost no chance that you will ever use your gun to protect yourself or anyone else.  Significantly, states within the United States with looser gun policies have higher rates of gun violence than those with tighter gun restrictions.  Nonetheless, the recent trend has been mainly toward even looser gun controls in those states with the greatest gun violence.  The National Rifle Association and other gun groups largely funded by gun manufacturers have manipulated the mass media, made people afraid of each other and of the government, and stimulated a national obsession with owning guns.

The consequence of these policies is that guns are so easily and widely available in the United States, that conflicts which in other times and places might be settled with fists or, at worst, clubs and knives, are often settled with automatic weapons.  It also seems to have become the case that guns are so widely possessed by people that whenever a police officer confronts someone with something in his hand, the officer feels he has to assume it is a gun, and frequently decides he has to shoot the guy.  A wallet, a toy truck, anything can be taken for a gun.  As a result, we are currently experiencing a reign of fear between the police and the people they are supposed to protect, with each group scared of the guns possessed by the other.

Our gun obsession has given the United States the highest rate of gun violence of any industrialized country, all in the name of self-protection.  American gun policies stand as another instance of the irony of trying to prevent violence with violence.[3]  Why do we so often adopt these sorts of self-defeating policies?

B.  Confounded Founding: Was the American Revolution a Mistake?

“In critical moments, men sometimes see exactly what they wish to see.”

 Mr. Spock on Star Trek

 Concerns about self-defeating policies are not new in our history. They date back at least to the founding of the country.  In the 1780’s, after having defeated the most powerful army in the world and gained independence for the United States from England, George Washington expressed the overwhelming sentiment of the Founding Fathers when he complained about the outcome of the Revolution.  “Have we fought for this?!” Washington lamented as he surveyed the social and political landscape of the United States.  The Founders had expected that peace and harmony would reign among Americans after their English overlords had been expelled.

But the country seemed, instead, to be in chaos.  Social classes were in constant conflict with each other, pitting rich against poor, farmers against bankers, cities against the countryside.  Worse still, the populace was refusing to follow the lead of the Founders, who had expected to be recognized as the natural and rightful leaders of the people once the British were gone.  Demagogic upstarts and hucksters were taking center stage, and vying with the Founders for political power.  What had the Founders gotten wrong?  The problem may have been of their own making, and its roots may be seen in the Declaration of Independence.

The Declaration of Independence opens with the words “When in the course of human events it becomes necessary to….”  The document then goes on to make a case for why it was necessary for the American colonists to revolt against English rule.  The argument consists of two main parts.  The first part is a concise statement of the natural rights theories of John Locke and Francis Hutcheson, to the effect that when a government becomes tyrannical, it is not merely the right but the duty of people to overthrow it.

The second part is a lengthy list of grievances against the King of England which purports to demonstrate that Americans have a duty to revolt against him.  This part opens with the words “The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States” (emphasis added).  That is to say, the claim was that the King was pursuing a long-term plan to become a tyrant, not that he already was one.  The revolt was framed as a precautionary and preemptive move against a king who intended to become a tyrant.  Consistent with this opening statement, the specific grievances that follow are mainly prospective in nature and effect.  The first is “He has refused his Assent to Laws, the most wholesome and necessary for the public good.”  That is, the King was keeping the colonists from doing what they wanted and, thereby, keeping them from growing in wealth and power, not that he was actually abusing them.

Most of the grievances are of this nature.  They complain that the King has ignored requests from the colonists and/or made things inconvenient for them.  The few grievances in the list that alleged actual harm to the colonies were punishments that England had imposed on New England because of the Boston Tea Party and other terrorist actions by the so-called Sons of Liberty.  The Declaration concludes, nonetheless, that the King has demonstrated his intent to become a tyrant, and that the colonies will likely be strangled to death if they do not revolt.

Although the fear expressed in the Declaration is sincere, the document does not make an argument that the colonies are currently being tyrannically oppressed.  England’s North American colonies were, in fact, the freest places for freemen in the world, and the colonists knew this.  The Declaration is an argument that current events indicated the King’s intention to tyrannically oppress the colonies, and that the colonists had better get out while the getting was still possible. It is essentially an argument to make a preemptive strike for independence before it is too late, before the colonists are enmeshed in tyranny and unable to resist the King.  The Declaration was a powerfully written statement issued by most of the most respected men in the colonies.  It scared enough colonists into supporting a revolution to make that revolution happen.

It was not, however, accurate in the main.  The Founding Fathers were wrong about the King’s intentions and the import of his actions.  The King was not trying to become a tyrant, or to reclaim the powers that English Kings had claimed during earlier centuries.  To the contrary, although George III was an active King, England was evolving into a parliamentary system in which the King had very limited powers.  The Founding Fathers, however, relying to a large extent on misinformation they had gleaned from English radicals, feared for their liberties.  Scorning negotiations that might have produced a peaceful compromise, they rushed into war, risking their lives to save freedoms they were not seriously in danger of losing.[4]

It was a long and brutal war that they eventually won.  But things did not turn out the way the Founders expected.  Instead of peace, they found themselves enmeshed in an unexpected new war of social classes among Americans that upset their plans for the new nation.

The American Revolution was not merely, or even primarily, a movement for national independence.  Most of the revolutionaries did not mind being considered Englishmen.  What they minded was being controlled by the kind of government that ruled England, and that the English were imposing on the colonies.  That is, they were opposed to centralized government, and to government with a strong chief executive that might morph easily into tyranny.  Their goal was to establish a decentralized national government with a weak chief executive.  The Founders were not adherents of small government.  They were adherents of local government, with local government having broad powers of control over the local economy and social life.

This goal was exemplified by the Articles of Confederation, the first constitution of the United States.  The Articles left most governmental power to the states.  The President of the United States under the Articles was essentially the chairman of the meetings of Congress and served for only one year.  But no sooner had the Revolution ended, than Founding Fathers such as George Washington, James Madison, John Adams, and Alexander Hamilton, among others, turned against the decentralized government for which they had been fighting.

The decentralized system of government the Founders had fought so desperately to establish turned out to be unwieldy and unworkable.  They had expected a new regime in which local elites would rule with the consent of the local masses.  But the masses proved to be unruly, and unwilling to defer to the local elites.  The Founders found that in opting for a preemptive strike to prevent the potential of royal tyranny, they had stirred up a hornets’ nest of grievances among ordinary Americans, who saw tyranny in the way local elites were asserting control over things.  Ordinary people wanted to control things, and demagogues vied for their support.  The Founders worried, in turn, that this might lead to a new form of tyranny, the tyranny of the majority.

So, in something of a panic over the way things were going, and too impatient to try to reform the Articles, the Founders moved preemptively and peremptorily to adopt a new Constitution with the very sort of centralized government and strong chief executive that they had fought a war to oppose.  It was a government that defused the influence of ordinary people by allocating most power to officials who were chosen at several removes from the populace.  It also contained checks and balances against the influence of demagogues.  It was a government the Founders believed they could control, and that properly balanced democracy with what we today would call meritocracy.  Unfortunately for them, government and politics under this new Constitution did not work out the way they expected and produced, instead, exactly the sort of political free-for-all they had hoped to avoid.  But that is another story.[5]

The contours of American decision-making seem to be shaped by impatience and impulsivity, tending toward self-defeating preemptive strikes, and often leading to unintended and unwanted results.  As a consequence of this pattern, we seem to make the same types of mistakes over and over again.  Why is this?  Are the Founding Fathers unwittingly to blame?  The Founders were brilliant and heroic.  They were also sincere in their fears and honest in their impulses.  But did the Founders initiate a pattern of national leadership based on fear, and of decision-making based on impulse, that has devolved to allow the very hucksters and demagogues they hoped to prevent to come to the fore?  Are they ultimately responsible for Donald Trump?[6]

C.  Ironies of the American Psyche: Self-Inflicted ADD, DID, and PPD.

“I object to intellect without discipline.”

Mr. Spock on Star Trek.

Donald Trump is not a new or novel phenomenon in American history.  Visitors to the United States during the nineteenth century frequently noted the prominence of self-promoting hucksters and fearmongering politicians, who were full of grandiose promises, bullying bombast, and braggadocio.  These visitors generally bemoaned the publicity that demagogues of this sort received from a mass media that favored sensationalism over facts.  It is a character type that first became prominent during the 1820’s and 1830’s, with the rise of the demagogic President Andrew Jackson and a new national ethos of laissez-faire individualism.

Jackson came to power by stoking fear of Native Americans, blacks, immigrants, bankers and intellectuals.  Jackson, like Trump today, was a colorful figure who was always good for a sensational news story.  And Jackson, like Trump, represented a race to the bottom in American politics that appalled the remaining Founding Fathers.  He was exactly the sort of thing the Constitution was supposed to prevent.  Was this, they complained, what they had fought for?[7]

Charles Dickens was one of the visitors who commented on this development.  A firm believer in the ideals of the American republic, he described the Trump-type American during the 1840’s in these terms: “If I was a painter, and was to paint the American Eagle… I should draw it like a bat, for its short-sightedness; like a bantam, for its bragging; like a Peacock, for its vanity; like an Ostrich, for putting its head in the mud, and thinking nobody sees it.”[8]

Dickens was dismayed by the influence of demagogues and hucksters over the public, and incredulous at the way newspapers encouraged them.  Dangerous frauds, these demagogues promoted selfishness in the name of progress, slavery in the name of freedom, and murder in the name of peacekeeping.  Their arguments were invariably ad hominem, and they inevitably portrayed social problems as the result of Others who needed to be attacked and defeated.  How was it that Americans were wont to follow these sorts of people?  Were they a cause or a symptom of America’s problems?  Was this some sort of collective mental illness?

Many commentators and policy analysts, both past and present, have concluded that Americans do suffer from a thinking disorder.  One of these is Professor Tara Sonenshine, a former Undersecretary of State of the United States.  Reviewing our history of preemptive actions, self-defeating decisions, short-sighted policies, and violence, she claimed in 2014 that “American impatience is not a passing fad nor is it minimal in scope.  How to reign in the impulsivity in us is a major task.  It might take national therapy — if we have time and patience to explore it.”[9]  How to find the time and patience to deal with our impulsivity and impatience?  This is an ironic question, but a crucial one if we are to avoid falling prey to Trump-type demagogues.  So, if we think of ourselves as suffering from a thinking disorder, how might we diagnose it?

1. Do we collectively exhibit symptoms of Attention Deficit Disorder (ADD), i.e. chronic impatience, impulsiveness and short attention spans?

The French social critic Alexis de Tocqueville visited the United States during the early 1830’s. Like Dickens, Tocqueville believed that American-style democracy was the wave of the future in the world.  But, also like Dickens, he complained that Americans were an impatient and impulsive people, who seemingly could not wait for anything, or see anything through to completion.  As a result, he claimed, American laws were “frequently defective and incomplete” because Americans did not think them through, and “even if they were good, the frequent changes they undergo would be an evil.”[10]  That is, Americans were so impatient and impulsive that even when they stumbled into a good public policy, they abandoned it for another policy if it did not succeed immediately.  The net result was that policymaking in America was erratically and poorly done, and violence was often substituted for reason.

Does the impatience and impulsiveness that Tocqueville and others have noted in Americans constitute ADD?  People diagnosed with ADD are generally incapable of making good decisions.  They are too impatient to go through a whole decision-making process, and consider all aspects of a problem.  They lack the concentration to arrive at a well-reasoned decision that is consistent with their own values and goals.  And they are too impulsive to make a decision that can be explained to others and understood by them.[11]  Is this diagnosis a description of American policy-making?  One might reasonably cite American educational policy over the last one hundred fifty years as an example of collective ADD.

Public education in America first took on its modern form during the 1840’s in Massachusetts, where Horace Mann oversaw the establishment of what were called “common school” methods in the elementary and secondary schools, and “normal school” methods for teacher preparation.  Modeled on the factories of the industrial revolution that was beginning in America at that time, common schools were assembly lines for the mass production of standardized education for children.  Mann and his colleagues invented the idea of grade levels, that is, that children should go through standardized stages of education, with all of them learning the same set of things at each stage.  They invented standardized curricula for each grade level, standardized textbooks and workbooks, and standardized tests to determine if a child was eligible to move along the assembly line to the next stage.  Normal schools were invented to train teachers in standardized teaching methods appropriate to the common schools.  Rote memorization and recitation were the main teaching methods.

The common schools were a leap forward in both the democratization of education and the use of schools for purposes of socialization and social control.  There was widespread concern in the country at this time about the massive immigration of European peasants to America to work in the new factories and live in the burgeoning cities.  Public education was deemed necessary to make their children into more efficient workers and effective citizens.  Common schools were seen as a prescription for making democracy workable.

The common schools were a cheap and efficient way to instill what were called the 4R’s, reading, writing, ‘rithmetic, and religion.  But the education they provided was unimaginative, uninteresting and unintellectual.  As a result, no sooner did common schooling become widespread during the mid-to-late nineteenth century, than alternative means and methods were proposed to make education more interesting, effective and intellectual.  Over the course of the next fifty years, two methods gained the most support among educational reformers.

The first is what came to be called Essentialism.  The second is what came to be called Progressivism.  Advocates of both methods rejected the rote teaching methods of common schooling.  Essentialists want schools to focus on teaching the recognized academic disciplines.  They want each of the major subjects taught separately, with the goal of making students into scholars in each of these fields.  Progressives want schools to have interdisciplinary curricula which focus on teaching for real-world problem-solving, with the goal of helping students to become active and effective citizens.  Advocates of common schooling have rejected both methods as frivolous.

The history of American educational reform over the last century and a half has been a struggle among advocates of Essentialism, Progressivism and common schooling, with common schooling as the default position of American public schools.  There has been a pattern to this struggle in which every fifteen to thirty years, there is a call for educational reform and a major proposal is issued by either the Essentialist camp or the Progressive camp, or both.  The proponents tout their proposals as revolutionary new ideas, ostensibly based on new educational research, although they are, in fact, really just repackaged past proposals.

In any case, the reform proposals get media attention, and are adopted in whole or in part by many schools in the country, with the expectation of revolutionary improvements in public education.  In the course of these events, Essentialists attack any Progressive initiatives, Progressives attack any Essentialist initiatives, and both are attacked by advocates of the common schooling status quo.  The mass media promote the controversy.  The reforms invariably have some limited short-term effects, but do not immediately bring radical improvements in education.  So, they are deemed a failure in the media.

Given the infighting between Essentialists and Progressives, and the inertial appeal of the status quo, the reforms are almost entirely abandoned after a few years, and forgotten.  That is, until they are resurrected the next time.  Meanwhile, common schooling defaults as the predominant method of teaching in our public schools, which it still is today.  Isn’t this ADD among educational policy makers and politicians?[12]

2. Do we as a people suffer from a Paranoid Personality Disorder (PPD), i.e. a Violent Hang-up on Violence?

H. Rap Brown, a leader of the Black Panthers during the 1960’s, once claimed that “Violence is as American as cherry pie.” If so, then trying to prevent violence is as American as apple pie. The problem is that Americans all too often choose preemptive violence to prevent violence.  The idea is to accept a small amount of violence in order to avoid a greater amount.  But that lesser violence frequently becomes the greater violence it was meant to prevent.

The problem dates back to the first English settlers in what became the United States.  The settlers moved into what they thought was a wilderness, and were terrified of being overwhelmed by wild Indians and wild animals.  So to cleanse the continent of Native Americans and native animals, they began a series of wars that lasted almost three hundred years.  They initiated a pattern of preemptive violence that has been repeated with dire results throughout our history.

One of the more ironic examples of self-defeating preventive violence is the secession of southern states in 1861 that led to the Civil War.  It was a suicidal act and an example of snatching defeat from the jaws of victory.  The facts of the matter were that southern slave owners had almost everything going their way in 1861, even with Lincoln’s election as President.  Lincoln got only 40% of the vote, with 60% being split among competing pro-slavery candidates.  If the pro-slavery forces could have agreed at the next election on a single candidate, Lincoln would almost certainly have been a one-term President.  In any case, Lincoln could not have done anything about slavery because pro-slavery forces still controlled Congress.  So, Lincoln’s election was at most a temporary political inconvenience.

Meanwhile, the Supreme Court’s Dred Scott decision of 1857 had held as a matter of Constitutional law that a slave owner could take his slaves anywhere in the country.  Slaves were property, and a man could safely take his property wherever he wanted within the United States.  This decision implied that slavery was legal everywhere in the country, and that there was no such thing as a “free state.”  It would have required a Constitutional amendment to change this decision, and there was no way an anti-slavery amendment ever would have gotten the necessary approval from three quarters of the states.  Rather than slave states seceding to protect slavery, it was northern states that should have seceded if they wanted to avoid slavery in their midst.  The only way in which slavery could have been undermined during the mid-nineteenth century was if southern states seceded so that northerners could become a majority in Congress and northern states could become a three-quarters majority in the Union.  And that is exactly what happened.

A group of so-called Fire Eaters among southern whites became convinced in the 1850’s that the only way they could save slavery was to take preemptive action to secede from the Union before northerners could become populous and powerful enough to conquer the South.  They were convinced that the South had a military and economic advantage over the North at that time, such that secession could be achieved.  And they were convinced that it was a now-or-never crisis.  They must decide for a little violence now to avoid catastrophic violence later.

The Fire Eaters gradually gained the support of the southern news media during the 1850’s.  With Lincoln’s election in 1860, they had convinced enough southern whites that most of the slave states were pushed into secession and the Civil War.  As a result, they got the catastrophe they had wanted to avoid.  Ironically, if the South had not seceded in 1861, slavery might still be the law of the land today.[13]

Catastrophic violence has often resulted because Americans have had problems with the idea of compromise, confusing it with appeasement, and have had problems being able to sustain strategies of containment instead of resorting to confrontation.  Compromise involves reaching a deal with an adversary in which each side is able to preserve its principles while giving up on some collateral issues.  Compromise is different from appeasement, in which one side gives up on its principles in order to assuage the other side.  Forging compromise generally requires patience and flexibility.  Containment involves accepting in the short run a status quo that includes things to which you are opposed, but applying pressure to attain gradual long-term change without resorting to violence.  Economic sanctions against an offending person or country constitute an example of a containment measure.  Containment also requires patience and flexibility.

Americans have repeatedly lacked the patience to work out compromises and wait out containments.  In the cases of the American Revolution, the War of 1812 and the Iraq War, Americans resorted to violence just as economic sanctions seemed on the verge of bringing about the goals we had set.  In the summer of 1776, British negotiators were sailing to America to offer the colonists home rule on terms that were consistent with the demands that the colonists had been making.  The radicals in the Continental Congress forced through a Declaration of Independence just a few weeks before the negotiators arrived, so that all parties were faced with a revolutionary fait accompli that scotched any further negotiations.

Likewise, war was declared by Congress in 1812 just days before news arrived from England that the English were acceding to the Americans’ demands.  Again, war was a fait accompli.  And the economic and military sanctions that President Clinton had placed on Iraq had achieved their intent of essentially disarming and disabling the regime of Saddam Hussein, but that did not stop President Bush from declaring war based on bogus claims.  Each of these is an example of snatching war from the grip of peace.  Does this amount to a collective case of PPD?

3. Do we exhibit symptoms as a nation of Dissociative Identity Disorder (DID), or Split Personality?

Americans frequently have been split into competing political and social groups. The split has often been characterized as between liberals and conservatives, though the definitions of liberal and conservative have evolved over time.  As these terms have been used over the last hundred years or so, liberals are seen as tending to think favorably about government involvement in economic, environmental and social welfare policies.  Conservatives tend to oppose these government activities.  The country has seesawed politically depending on whether liberals or conservatives have had the upper hand.  This split is exemplified today by the battle between so-called Red State conservatives and Blue State liberals.

But Americans have not only been split into different groups, they have also been split as individuals, and have frequently held competing inconsistent political and social positions on issues, often without even realizing it.  For example, Americans have consistently been distrustful of governmental authority, but have also insisted that the government impose law and order on society.  Americans have consistently extolled individual freedom, but expected social conformity.  Americans have generously contributed to private charities for poor people, but often refused to support public programs of welfare for the poor.  Americans have often opposed government programs of economic assistance, but coveted the benefits those programs provide.  It is even common today, for example, to hear the ironic refrain from some conservatives that “I just want to keep the government’s hands off my Medicare.” [14]

This split personality is particularly acute in Red States.  The problem is that many areas of the country are almost totally dependent economically on government expenditures and programs, and these areas are disproportionately in Red States.  To fund these expenditures and programs, the federal government takes in taxes from the country as a whole, which it then doles out to those areas most in need.  By a wide margin, Red States get back in government expenditures more money than they pay in taxes.  That is, Blue State taxpayers are financing Red State recipients, and Red State conservatives generally have no problem with taking money from federal government programs they ostensibly oppose.

Historically, Southern slave owners, for example, claimed to be in favor of “states’ rights,” and condemned as government oppression any attempt to regulate or restrict slavery.  But these same slave owners were overwhelmingly in favor of federal government enforcement of slavery and restrictions on abolitionist campaigns.  They claimed the states’ right to nullify any federal restriction on slavery, but adamantly insisted on the enforcement of federal fugitive slave laws against the nullification of those laws by northern states.  Some might call these inconsistencies mere hypocrisy, but might they not also be examples of an underlying DID?

The ways in which Americans have resolved their social ambivalence and political contradictions in action has frequently turned on how the issue has been framed to them.  Americans have historically responded positively to broad rhetorical appeals to freedom from government control, and to wholesale warnings about possible government oppression.  At the same time, these same Americans have frequently promoted strong government regulations and restrictions with respect to practical matters of interest to them.  It has consistently been the case since public opinion polling first developed during the 1930’s that when a question is asked in broad ideological terms, as in “Do you favor free markets or government economic regulation?”, about two-thirds of the respondents favor free markets.  But when a question is asked in specific pragmatic terms, such as “Do you favor laws that keep dangerous drugs off the market?”, at least two-thirds of respondents answer “Yes.”

As a result, politicians who are against economic regulation — generally conservatives — usually pitch their campaigns in broad ideological terms, while politicians who favor economic regulation — generally liberals — usually tailor their campaigns to specific issues and pragmatic programs of concern to voters.  Likewise, demagogues and hucksters, who are trying merely to manipulate public opinion and have no real solutions to problems, invariably try to frame the discussion in broad generalities, with broad generalizations and stereotypes.  It is generally incumbent on responsible progressive politicians to focus on practicalities of how things might work rather than on sensational generalities.

Fear can easily be generalized.  Hope needs to be particularized.  Trump-type demagogues thrive on appealing to the fears, hatreds, and worse angels of Americans, rather than their hopes, likes and better angels.  They try to resolve the split in our personality in a way that is self-destructive to us.  We need to counter them with pragmatic rationality.[15]

D.  What’s in a Brain and What Can We Learn from Star Trek?

“I find their [humans’] illogic and foolish emotions a constant irritant.”

 Mr. Spock on Star Trek.

So, maybe it is a question of our brains?  Although people are often described as thinking with their hearts, their stomachs or other anatomical parts, we actually think with our brains.  The human brain consists of two key parts, the brain stem and the cerebral cortex, with the cerebral cortex split into a right hemisphere and a left hemisphere. The brain stem is the earliest and least sophisticated portion of the human brain.  We inherited it from our pre-human ancestors.  The brain stem is the locus of the “fright, then fight or flight” reaction of our puny rat-like evolutionary precursors who had to make their way in a world of giant carnivores.  This sort of reaction was apparently a successful survival strategy for helpless mini-mammals.  But it may not be as useful, and may often be counterproductive, in the world of modern humans in which shooting first and asking questions later can lead to unnecessary wars and suffering.

The cerebral cortex evolved later in humanoids, and is the locus of human self-consciousness and critical thinking.  It is in the cerebral cortex that we do our rational thinking.  The cerebral cortex is split into a left hemisphere which is largely responsible for logical thinking and a right hemisphere which is largely responsible for creative thinking and intuition.  Psychologists have described good decision-making as a thought process that combines a person’s brain stem and both hemispheres of a person’s cerebral cortex.  Simplistically put, the brain stem will stimulate the process, the right hemisphere will imagine possible responses, and the left hemisphere will analyze the evidence for them.  A plausible conclusion will then be reached.[16]

There are many formulas for a good decision-making process, but some elements are common to most formulas.  A good decision-making process should be whole, coherent, and transparent.[17]  A process is whole if it is based on significant reflection and discussion, and covers all aspects of the problem under discussion.  A process is coherent if it results in a reasoned decision, the reasons make sense, and the values and goals embedded in the decision are consistent with the actions proposed to deal with the problem.  A process is transparent if the information and reasoning upon which a decision has been made are open to public scrutiny, and the reasoning behind the decision can be replicated by others.  Each of the parts of the brain, the emotional stimulus of the brain stem, the logic of the left hemisphere, and the intuition of the right hemisphere, is crucial to the process.  Making use of the various capacities of the whole brain, with each part checking and balancing the others, is key to an effective decision.[18]  The importance of these elements is illustrated in the 1960’s television series Star Trek.

Star Trek dramatizes the voyages of the spaceship Enterprise.  The three main characters in the crew of the Enterprise represent different personalities and decision-making styles.  There is Mr. Spock, the stoic Vulcan science officer and first mate, who applies cold logic to every situation.  Dr. “Bones” McCoy, the ship’s doctor, is a charming Southern gentleman who is erratically emotional.  And Captain James Kirk is a Western space cowboy who is always ready for action.  In the course of most episodes, these characters bounce off each other and eventually combine their respective insights to come up with a workable resolution of whatever crisis confronts them.  One of the aims of the TV show seems to be to illustrate the elements of a good decision-making process, and the ways in which disparate personalities can work together.

Using the brain as an analogy, McCoy represents the excessive influence of the brain stem, Kirk a rashly decisive right hemisphere, and Spock an overweening left hemisphere.  Each of them tends to take his tendencies too far and, thus, each of them needs the leavening effects of the others.  And while each is bedeviled by the others, each is also bedeviled by himself.  McCoy must tame his emotional overreactions to operate within the bounds of medical science.  Kirk has to balance his impulse to act rashly on his own with his sense of responsibility as captain to the whole crew of the spaceship.  Spock, who is half-Vulcan and half-human, is frequently torn between his Vulcan rationality and his human emotions and imagination.  But he invariably comes to his senses by focusing pragmatically on whatever is the specific problem at hand.  And that focus on practicality is the key lesson of the show.

McCoy’s hysterical overreactions to problems are almost always wrong.  He repeatedly calls for drastic preemptive actions, and for action based on fright.  Kirk’s initial responses to problems are often unduly aggressive and, thereby, also wrong.  He frequently wants to jump into the middle of things based on insufficient information and reflection, and to act based on courage.  Using the thinking disorder diagnoses, Kirk seems to suffer from ADD (impatience and impulsivity) in his need always to be in motion.  McCoy suffers from PPD (exaggerated fears of others), and is almost always stoking panic.  Spock suffers from a mild case of DID (split personality), but does not exhibit symptoms of either ADD or PPD.

Although Spock’s logic seems alien to his colleagues because it is so cold, his conclusions are almost invariably correct.  Unlike McCoy, who often gets carried away with ideological prejudices and generalized fears, Spock is able to focus on the practicalities of a situation. Without Spock’s Vulcan rationality, the others would many times have doomed the Enterprise through their emotional and impulsive decisions.  And the alienness of Spock’s logic seems to be the point that the show is trying to make.  Star Trek seems to be saying that we Americans need a bit more of Spock’s rationality, a trait which is alien to most of us, and a little less of McCoy’s emotionality and Kirk’s impulsivity, which we find more natural.

Produced during the height of the Cold War, when “un-American” was a term of highest opprobrium, and to suggest that America was not the best at everything was deemed un-American, this was a courageous stand on the part of the producers of the show.  It was a suggestion that could perhaps be safely made to a popular audience only through the guise of science fiction.  Spock, in particular, often said things about other worlds that were pointedly applicable to the United States, as when he said about an alien world that the Enterprise had visited “This troubled planet is a place of the most violent contrast.  Those who receive the rewards are totally separated from those who shoulder the burdens.  It is not wise leadership.”  He said that in the 1960’s, ostensibly about some other world, but it applied to the United States then and applies all too well to our world today.

E.  A Martian’s Eye View: Positioning oneself in but not of a situation.

“Life and death are seldom logical.”

Dr. McCoy on Star Trek.

“Intuition, however illogical, Mr. Spock, is recognized as a command prerogative.”

Captain Kirk on Star Trek.

“Logic is the beginning of wisdom, not the end.”

 Mr. Spock on Star Trek.

Although the other characters on Star Trek often disparage Spock as being coldly inhuman, Spock is not inhumane.  To the contrary, while he is dispassionate in his reasoning, he is also very compassionate in his responses.  He frequently cites as one of his moral imperatives that “The needs of the many outweigh the needs of the few,” and repeatedly puts himself in danger to save others.  Despite the repeated references in the show to Spock’s logic, his greatest strength, and I think the main point of the show, is his ability to see things from an outsider’s point of view.  I had a history teacher in high school who would often ask the class to imagine what a Martian would think of the events we were discussing.  That is what Spock essentially does.

Spock, as a Vulcan, is literally inhuman and an outsider, but he is also the colleague of a crew of humans on a spaceship.  So, he is the outsider who is an insider.  As such, he is able to appreciate the situation in the way his human colleagues do, but also break out of their cycle of insiders’ thinking, and break away from the pattern of ADD and PPD that McCoy and Kirk represent.  McCoy’s passion and Kirk’s intuition, which are at least partly a function of their being insiders, are important to the decision-making process on the Enterprise.  But Spock is able to devise pragmatic solutions to problems by seeing them from the outside, while the others, seeing them only from the inside, are overwhelmed by their enormity.  Spock’s compassionate and considerate rationality, an attribute he can bring to the situation in large part because he is an outsider, is the key to the crew’s survival.  And maybe to ours, as well?

If we want to break out of our vicious cycle of self-defeating policies, and our susceptibility to demagogues and hucksters, we need to adopt the position of outsiders, and see ourselves as outsiders might.  To define our thinking disorder in a nutshell, we Americans suffer from too much McCoy and Kirk, and too little Spock.  To describe our current political crisis in a nutshell, Donald Trump is McCoy on steroids pretending to be Kirk.  He is fooling a lot of people by playing on their fears of outsiders, and trapping those people in their mental cages.  Americans need, instead, to welcome outsiders and alternative points of view.

Abraham Lincoln once famously said that “You can fool all of the people some of the time, and some of the people all of the time, but you cannot fool all of the people all of the time.”[19]  Trump is fooling a lot of people.  We will soon find out how many and for how long.  When will we stop fooling ourselves so that we can stop being fooled and made fools of by the likes of Donald Trump?  Where are the Vulcans when we need them?

            July 26, 2016

[1] Alex Jones. “America Has Been At War 93% of the Time – 222 Out of 239 Years – Since 1776.”  www.infowars.com, 2/21/15.  See also “List of wars involving the United States.” Wikipedia. 7/19/16.

[2] Jonathan Masters. “U.S. Gun Policy: Global Comparisons.” Council on Foreign Relations. cfr.org 1/12/16.

[3] You can find an extended discussion of the Second Amendment and gun policies in the United States in my blog post on “History as Choice and the Second Amendment: Would you want to keep a musket in your house?”

[4] For the still definitive discussion of the politics of the American Revolution, see Gordon Wood’s The Creation of the American Republic, 1776-1787. Chapel Hill: University of North Carolina Press, 1969.

[5] You can find an extended discussion of how and why the Founders made the Revolution, their expectations for the Revolution, and their disappointments with the outcome and with both the Articles of Confederation and the Constitution in my blog posts on “George III’s Legacy,” “George Washington’s Lament,” “Would it have been better for the colonists and would it be better for us today if the American Revolution had not happened?” “How might things be worse if the American Revolution had not happened,” and in my book Was the American Revolution a Mistake? (AuthorHouse, 2013).

[6] For a Pulitzer Prize winning examination of political and intellectual hucksterism in American history, see Richard Hofstadter, Anti-Intellectualism in American Life. New York: Vintage Books, 1963.  Hofstadter would not be surprised at the rise of Donald Trump.

[7] For a brilliant discussion of early American history with a focus on demagoguery and hucksterism during the Jacksonian era, see Daniel Walker Howe, What Hath God Wrought: The Transformation of America, 1815-1848. New York: Oxford University Press, 2007.

[8] Charles Dickens. Martin Chuzzlewit. New York: Signet, 1965. p.581.

[9] Tara Sonenshine. “The Age of American Impatience: Why It’s a Dangerous Syndrome.”  huffingtonpost.com/tara-sonsenshine/the-age-of-american-impat_b_5916062  Accessed 3/24/15.

[10]  Alexis de Tocqueville. Democracy in America. (New York: Oxford University Press, 1947), 140.

[11] psychcentral.com/disorders/adhd/  Accessed 3/24/15.

[12] You can find an extended discussion of the history of American educational reform in my blog post “Struggling to Raise the Norm: Essentialism, Progressivism and the Persistence of Common/Normal Schooling in America.”

[13] You can find an extended discussion of why the South seceded, why the North did not, and the alternative realities that could have ensued in my three blog posts on “Would the United States still have slavery if the South had not seceded in 1861?”

[14] For an example of a prominent American who exhibited a liberal/conservative split personality, you can find a discussion of James Bryant Conant in my blog post “Progressivism, Postmodernism and Republicanism: The Relevance of James Conant to Educational Theory Today.”

[15] For a conservative view of America’s split personality, see Irwin Stelzer, “Split Personality America,” www.weeklystandard.com, 2016.  For a liberal view, see Andrew O’Hehir, “America’s Split Personality: Paranoid Superstate and Land of Equality,” www.salon.com, 2013.

[16]  Jared Diamond. The Third Chimpanzee: The Evolution and Future of the Human Animal. (New York: Harper Perennial, 1993), 220-221.  David Sloan Wilson. Evolution for Everyone. (New York: Delacorte Press, 2007), 51-57.

[17] onlinesuccesscentre.com/2011/04/three-characteristics-of-a-good-decision   Accessed 3/23/15.

[18] John Dewey. How We Think. (Lexington, MA: D.C. Heath, 1933), 96.  Jerome Bruner. “Going Beyond the Information Given,” in Contemporary Approaches to Cognition, J. Bruner, ed. (Cambridge: Harvard University Press, 1957), 66-67.

[19] You can see a discussion of Lincoln’s comment and the power and limits of demagogues in my blog post “Limiting the sum of Lincoln’s ‘Some:’ Democracy, Mobocracy, and Majority Rule.”