People Who Disagree with You on Politics Aren't Necessarily Evil
The master virtues of our republic should be humility and tolerance... political parties should aim for compromise rather than dominance.
Well, it is kind of amazing. We have a lot in common as human beings, including the way we tend to get things wrong even when we are quite certain we are right. In the Western tradition, some of the earliest philosophers to figure this out were Plato and Aristotle. Ironically, these great thinkers took the human tendency to err and went in different directions with it.
From the allegory of the cave in his Republic, Plato posited that the only unchanging reality is in the realm of ideas. In a very general sense, we can say that Plato placed the locus of error and disagreement in an overreliance on our senses, which can only provide us a distorted, shadowy projection of the ideal. Plato held that our only path to touch reality was through our minds. Further, our best hope of perfecting our reason sufficiently to achieve insight into the realm of ideas was through dialogue with others. Plato's ideal forms had power, though--to the extent that we had knowledge of the Form of the Good, for instance, we would be motivated to act in accordance with it. For Plato, the source of disagreement and conflict was our imperfect perception of the good.
Aristotle's empiricism, on the other hand, embraced the imperfection of sense as a part of our human way of being. The only reality, for Aristotle, was the reality presented to us by our senses, as imperfect as those senses might be. His tireless study of the world around him led him to posit that living things were combinations of form and matter--each living thing possessed a form or blueprint of sorts that caused its physical matter to change in accordance with its purpose or function. Human beings were social animals with senses and a capacity for reason. The purpose of individual humans was to realize the virtues or excellences of character that would make them exemplary citizens. The proper use of reason was to control emotions and desires and to understand the evidence presented by the senses in order to achieve the best society. The source of conflict and disagreement was a failure to achieve proper human virtue.
Two thousand years later, Immanuel Kant presented a synthesis of sorts. I find Kant's thinking powerful because he introduces the notion that the limits to our experience are in place before experience--a priori--and are insurmountable. Kant was an idealist--he agreed with Plato that our senses were a filter that prevented direct experience of the things in themselves that cause our sensory experience. However, unlike Plato, he denied that reason could gain access to those things in themselves. [1] Instead, in a nod to Aristotle, Kant used reason to construct an argument for a master ethical rule that applied to the world of perception--the world we experience. He called his rule the Categorical Imperative. Basically, Kant's idea was that people should act only on principles they could will everyone else to follow. But the important point for me in Kant's thinking is the notion that the limits to our ability to reason are pre-cognitive.
The fact that Kant introduced these limits as pre-cognitive "forms of intuition" was a big deal for him--he referred to it as a "Copernican revolution" in philosophy. In fact, it's not clear to me that Plato would have found anything new in the assertion that the limits of perception came before any experience. By itself, with the assumption that we all have the same basic constraints--the same forms of intuition--Kant's revolution doesn't really buy us much. We are still left with the challenge of reasoning to a shared conception of the Good based on different individual circumstances within the common constraints of human experience. There is ample room for different interpretations of the Form of the Good, or the virtues of a good citizen, or even the Categorical Imperative.
When you combine the notion of pre-cognitive constraints with modern psychology and economics, things get more interesting. Thomas Sowell's notion of conflicting visions and Daniel Kahneman's arguments for the role of evolutionary forces in shaping various pre-cognitive response mechanisms open the door to the idea that not only are we built to get things wrong as individuals, but even the differences in how we tend to get things wrong are products of systems that operate more by reflex than by intentional reason.
Remember from the last blog that Sowell described a conflict between two archetypal "visions" of the way people relate to issues and events--a constrained vision and an unconstrained vision. He further defined the concept of a vision as a "pre-analytic cognitive act" and, later, as "a set of assumptions not necessarily spelled out even in the individual's own mind." [2] Sowell's visions are a metaphor of sorts for individual decision paradigms.
Kahneman, a research psychologist who won the 2002 Nobel Prize in economics for his work on decision-making, uses several metaphors of his own to present a fascinating look at our decision-making processes. His book, Thinking, Fast and Slow, is worth an essay in itself, and we'll cover it in more detail in a subsequent blog. For now, it is sufficient to note that Kahneman describes "systematic errors in the thinking of normal people" which his research shows are the result of "the design of the machinery of cognition rather than... the corruption of thought by emotion." He uses "System 1" to describe our fast-response mechanism, which relies more heavily on emotion and heuristics (models) to simplify our complex world and enable us to react in the way most likely to keep us safe. System 2, on the other hand, is slower and engages the ability to reason to a much greater extent, while still accepting inputs from emotions and cognitive models. [3]
The cognitive systems Kahneman describes are broadly shared, but at least some of the content of our cognitive models, like the availability and anchoring heuristics, varies based on individual and group experiences. It is like the child's toy that pushes blocks of clay through forms to create exotic shapes: the forms are the same, but the color varies based on the type of clay that is put into the toy. One's level of education affords no necessary immunity to the error-inducing effects of our cognitive machinery. However, an understanding of the nature of the biases built into our systems of thought, along with conscious attention to mitigating those biases, can help.
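To make that toy metaphor a bit more concrete, here is a minimal sketch in Python--purely illustrative, and not taken from Kahneman's work. The anchored_estimate function, the adjustment factor, and the numbers are all invented for the illustration; only the anchoring idea comes from Kahneman.

    # A toy model of "same machinery, different clay." Everything here is
    # invented for illustration; only the anchoring idea comes from Kahneman.

    def anchored_estimate(anchor, evidence, adjustment=0.4):
        """A System 1-style guess: start from an anchor supplied by past
        experience and adjust only partway toward the evidence at hand."""
        return anchor + adjustment * (evidence - anchor)

    # Two people share the identical estimating function (the shared form)...
    # ...but their anchors come from different experience (the clay).
    low_anchor, high_anchor = 20.0, 80.0
    evidence = 50.0  # the same facts, presented to both

    print(anchored_estimate(low_anchor, evidence))   # 32.0 -- well below the evidence
    print(anchored_estimate(high_anchor, evidence))  # 68.0 -- well above it

Neither estimator is lying or stupid; the divergence falls out of shared machinery plus different starting material, which is roughly the point of the clay-toy analogy.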
The implications of all this are powerful. Rather than getting angry at those with different political views, we should acknowledge that those views are produced by cognitive machinery we share, coupled with System 1 inputs that may well be more reflexive than intentional. Our own views are as likely to be based on erroneous inputs as the views of those with whom we disagree. The best path to reducing our collective errors is to ensure all views are considered in the light of a common understanding of our biases. The master virtues of our republic should be humility and tolerance, and our political parties should aim for compromise rather than dominance.
[1] "The undetermined object of an empirical intuition, is called phenomenon. That which in a phenomenon corresponds to the sensation, I term its matter; but that which effects that the content of the phenomenon can be arranged under certain relations, I call its form. But that in which our sensations are merely arranged, and by which they are susceptible of assuming a certain form, cannot be itself sensation. It is, then, the matter of all phenomena that is given to us a posteriori; the form must lie ready a priori for them in the mind, and consequently can be regarded separately from all sensation.... From this investigation it will be found that there are two pure forms of... intuition, as principles of knowledge a priori, namely, space and time." Kant, Immanuel, Critique of Pure Reason, 1781, reprinted 2004 by Barnes and Noble, Inc., New York, NY, pp.
[2] Sowell, Thomas, A Conflict of Visions, William Morrow and Company, New York, NY, 1987, pp. 96-108.
[3] Kahneman, Daniel, Thinking, Fast and Slow, Farrar, Straus and Giroux, New York, NY, 2013, p. 35.