Rational Thinking and Emotional Attachments: How can we admit error?

God rest ye, Unitarians, let nothing you dismay;
Remember there's no evidence there was a Christmas Day;
When Christ was born is just not known, no matter what they say,
O, Tidings of reason and fact, reason and fact,
Glad tidings of reason and fact.

Our current Christmas customs come from Persia and from Greece,
From solstice celebrations of the ancient Middle East.
This whole darn Christmas spiel is just another pagan feast,
O, Tidings of reason and fact, reason and fact,
Glad tidings of reason and fact.

There was no star of Bethlehem, there was no angels' song;
There could not have been wise men for the trip would take too long.
The stories in the Bible are historically wrong,
O, Tidings of reason and fact, reason and fact,
Glad tidings of reason and fact!

These lyrics, based on the tune of "God Rest Ye Merry Gentlemen", were written by the Rev. Christopher Gist Raible of the First Unitarian Church of Worcester. I hope you all enjoyed your Christmas break.

Unitarians like to claim that they are primarily swayed by reason and fact, rather than by creed. The listed sources of the Unitarian-Universalist Association speak of a source in "Humanist teachings which counsel us to heed the guidance of reason and the results of science, and warn us against idolatries of the mind and spirit". Rev. Christine Robinson of the First Unitarian Church of Albuquerque said in her 2008 address "Faith and Reason" that "We Unitarian Universalists rely on our reason in matters of faith more than most religious people. We reject the absurd. We welcome the insights of science and reason, and we use tools of reason when we encounter mystery....". In his 2000 book "Building Your Own Theology", Rev. Richard Gilbert defines "Conviction ... [which] combines reason and feeling with the will to act."

We like to think that the Unitarians are a people who engage in a "free and responsible search for truth and meaning", and that, whilst respectful of traditions, we are not beholden to them. It would be good to think that this commitment means that a held opinion can change according to tests against the pragmatic validity claims that it raises: if it is a question of fact, that it will be tested against objective, external correspondence with the claim; if it is a question of morals, against the mutual intersubjective consensus of the relevant actors; and if it is a question of taste, against the sincerity of the expressing subject. We would like to think that we can recognise the limits of our knowledge and our errors, and correct them when we are confronted with evidence that our opinion is incorrect.

To a large degree, this is sadly overly-optimistic nonsense at best. Recent evidence from neuropsychology suggests that human beings are, alas, a lot less reasonable than previously hoped. When a person has a passionate opinion which lacks evidence, or worse still, can be shown to be untrue, their first reaction is not to admit error, but rather to defend their erroneous position and discount the competing claims. Perhaps you have heard that the United States has weather control machines operating from a radio frequency transmitter in Alaska that has caused hurricanes and earthquakes. Perhaps you have heard that Mossad and the U.S. government were directly responsible for the 9/11 terrorist attacks, for the purpose of promoting international war. Perhaps you have heard that homosexual desires are an unnatural adult sexual expression, caused by a distorted and damaged social environment. Perhaps you have heard that foreign workers from developing countries who seek work in Australia are greedy and selfish because they are taking jobs from Australians. Perhaps you have heard that grey aliens are part of a US government disinformation campaign against independent UFO researchers. These examples are given partially because they are almost certainly false on the basis of evidence, but also because they represent opinions that have been expressed by members of this very congregation, as extraordinary as that may seem.

The problem being confronted here is no mere issue of the prejudices that come from attachments to various cultural and religious associations or customs, although such beliefs and attachments do themselves serve as useful criticism of what is called "methodological individualism". This is the assumption that human beings are free-floating rational choosers whose will is entirely independent of social influences. A classic critical response to this position can be found in Steven Lukes' article in the British Journal of Sociology from 1968. Human beings are very much social animals and carry with them irrational social prejudices which can be very difficult to dislodge. Something as trivial as a man wearing a skirt serves as a case in point. Culturally acceptable as the lava-lava in the Pacific isles, or the Austronesian sarong, or ceremonially as the Celtic kilt, such attire is rarely seen in suburban Melbourne. It would be perceived as an example of transvestism and the wearer would have good reason to fear for their physical safety. It would be worse if the skirt was pink - never mind that the convention that "pink is a feminine colour and blue is a masculine colour" is less than a hundred years old.

These irrational and arbitrary cultural conventions are effectively fashions, and, as Thomas Paine pointed out in "Common Sense", they will inevitably change over time rather than through reason, even if the reaction to socially constructed transgressions can be extreme. What is even more disconcerting, however, is the stubborn and irrational attachment to non-arbitrary beliefs. As Steven Novella of the Yale School of Medicine points out in "Your Deceptive Mind" (2012), by default our beliefs come first from emotional responses, not from the critical reasoning faculties of the recently evolved parts of the brain. These more primal and subconscious emotional reactions - hate, fear, lust, hunger, disgust and so forth - provided a mechanism for quick decision-making that is largely adaptive. As a sapient species we have heuristics for pattern-recognition, which also helped our decision-making adaptability. As a conscious species, these emotions and heuristics mix with our socially-derived needs, such as esteem, repute, and actualisation. As a result we become very irrational indeed. Because we have a very visceral reaction against being wrong, we protect ourselves against that possibility - often with even worse consequences.

Because we like being right and dislike being wrong, we associate with those people who agree with what we believe to be the case. This is a type of selection bias. In statistics it is well known that such a sampling bias undermines the validity of a test. But the desire for social approval and self-esteem doesn't care about statistical validity. It cares for peer opinion or, to use Whyte's term from his classic 1952 article, "groupthink". It is, as Whyte explains, not "mere instinctive conformity" but rather "a rationalized conformity - an open, articulate philosophy which holds that group values are not only expedient but right and good as well" - whether it is true or false. In the 1970s and 1980s Irving Janis conducted groundbreaking research at Yale University concerning this issue and noted some major preconditions for groupthink to arise, including a high level of in-group cohesiveness, conformity and exclusivity, a partisan leadership and a lack of transparent and consistent procedures, and a history of failure and perceived external threats. It is, perhaps not surprisingly, a common feature in small and sectarian political and religious organisations - independent of the actual real power they wield.
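
As a purely illustrative sketch of why such a sampling bias matters - the population size, the true proportion of opinion-holders, and the five-to-one weighting below are invented for the purpose - a few lines of Python show how drawing one's "sample" mostly from like-minded peers inflates the estimate of how widely an opinion is held:

    import random

    random.seed(42)

    # A hypothetical population in which only 30% hold a given opinion.
    population = [True] * 300 + [False] * 700
    random.shuffle(population)

    # An unbiased sample estimates the true proportion reasonably well.
    unbiased = random.sample(population, 100)
    print(sum(unbiased) / len(unbiased))   # roughly 0.3

    # A "peer group" sample, in which each agreeing person is five times
    # more likely to be included, overstates the proportion considerably.
    weights = [5 if holds_opinion else 1 for holds_opinion in population]
    biased = random.choices(population, weights=weights, k=100)
    print(sum(biased) / len(biased))       # roughly 0.6-0.7 - selection bias at work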

When confronted with evidence that is contrary to our deeply held beliefs, we may suffer cognitive dissonance. Initial research and experiments were conducted by Leon Festinger in 1957, after studying examples of how humans feel discomfort when confronted with evidence or behaviour contrary to their beliefs and then strive for internal consistency - his earliest and most well known example was a religious UFO cult that prophesied the end of the world. The initial studies examined the reactions of individuals who were part of prophetic cults, and whose prophecies did not come to fruition, but which were then rationalised to allow a continuation and even strengthening of the belief and, importantly, to ensure group integrity. When faced with the inevitable choice of either admitting error and making a fundamental change to the behaviour or belief, or reinterpreting reality to compartmentalise the dissonance in a manner that satisfies the existing belief structure, the latter is almost invariably chosen. More recently the social psychologists Carol Tavris and Elliot Aronson have compiled a book of such incidents with the delicious title "Mistakes Were Made (But Not By Me)". The title is, in part, taken from a classic remark by Henry Kissinger in responding to claims that he had committed war crimes in Indochina and South America: "Mistakes were quite possibly made by administrations in which I served". In a similar manner, when it was discovered that Supreme Court Justice Antonin Scalia was about to take a holiday with Dick Cheney - despite the fact that Cheney had a pending case before the court - Scalia was surprised at the public reaction. "I do not think my impartiality could be reasonably questioned", he claimed.

A related issue is that of confirmation bias, which also correlates well with sampling bias and groupthink. Coined in the 1960s by the psychologist Peter Wason, it has been observed anecdotally since the writings of the ancient Hellenes, and was well illustrated by Charles Dickens' obsequious flatterer, Uriah Heep, who was certain of his own sincerity. Confirmation bias is a process of reinforcement whereby information that matches existing opinions - which, we note, are the opinions of the "in-group" - is treated in a far more positive manner and is actively sought, or information is interpreted in a manner that concurs with a pre-existing opinion; notably, the bias is strongest when the opinion is deeply held. Confirmation bias even changes our memory; the brain is not a passive recorder of information. It fills in gaps with thematic memory, emotional content, and our existing beliefs. It is extremely prone to the suggestions of peers, whether they are right or wrong, as those employed in marketing well understand. The conformity experiments of Solomon Asch in the 1950s were the beginning of a swathe of experimental tests that illustrated these issues, including the more controversial Stanley Milgram obedience experiments in 1963, Ron Jones' secondary school history experiment, "The Third Wave", in 1967, and Philip Zimbardo's Stanford Prison experiment of 1971 - a live-action roleplay that went a little too far.

The great issue is that our brain, like our heart, lungs, liver, and so on, is an imperfect organ. The brain rewards us not on the basis of whether we are right or wrong, but rather on whether our opinions conform with those of the people we have surrounded ourselves with and with our pre-existing beliefs. Where our knowledge is incomplete our brain encourages us to "fill in the gaps", to "connect the dots", as some might say (if people experimented with the latter, they would find that individual attempts to connect are quite random). In other words, the pattern-generating heuristic that has served us so well in making logical elaborations also leads us to believe things without appropriate evidence, through various logical fallacies. All considered, one could be very pessimistic - and yet, sometimes people do change their opinions, they do admit error, they do admit responsibility, and so forth. How is this possible? It is possible first through the more recently evolved parts of the brain, such as the frontal lobe, which can control the more primitive and emotional parts and take up our psychological discomforts as a challenge rather than avoiding them or seeking simple satisfactions. That provides us the possibility to admit error - but what about the actual process itself?

A number of very contemporary researchers, including Hugo Mercier of the University of Neuchatel, Dan Sperber of the Jean Nicod Institute, and Helene E. Landemore of Yale University, have conducted - and are conducting - a number of studies which strongly suggest that our ability to reason, rather than having an individual function, has a socially argumentative function. We reason in order to convince others and to be convinced. Reasoning is argumentative: we find and evaluate arguments so as to convince others, and are only convinced when the validity claims in a proposition are justified. Accordingly, reasoning works well as an argumentative device, but quite poorly otherwise, where the individual engages in self-justifying rationalisation instead. If these researchers are correct - and the available studies suggest that they are - then there is some cause for optimism, not only for rational thinking but for democratic theory as well, as long as democracy is grounded in and preceded by individual free speech, and is implemented in a way that encourages delegation in proportion to acceptance and seeks the admixture of ideas, rather than a model of majoritarian tyranny.

Opinions can become properly grounded if one researches a matter first and then forms an opinion, rather than the other way around. This means giving up all existing beliefs or, at the very least, following what is called Cromwell's Rule, adopting a principle of probability in all matters. Oliver Cromwell, in writing to the Church of Scotland, pleaded with them to reconsider their allegiance to Charles II: "I beseech you, in the bowels of Christ, think it possible that you may be mistaken". When evaluating propositions, concentrate on their content, not on who said it or who benefits. Propositions need to be tested among a wide audience; the more expert, the better, and the more contrary to one's own opinion, the better. It requires using rationalist methodology to test empirical claims dispassionately. It requires being accepting of uncertainty, and even more accepting of discovering that one's propositions are totally incorrect. Applying the principles of rational skepticism may be psychologically discomforting, and you may even initially lose social status. But in the longer run, deeply considered convictions are better for the mind than deeply ingrained prejudices, and logically grounded respect is preferable to being an uncritical loyalist. But most importantly, giving up our attachments is necessary because of the effects that these comfortable deceptions inevitably have in practice.
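
To make Cromwell's Rule concrete, here is a minimal sketch in Python (the likelihood values are invented purely for illustration): Bayes' rule can never move a prior probability of exactly zero, whereas even a small allowance that one may be mistaken lets accumulating evidence change one's mind.

    def update(prior, likelihood_if_true, likelihood_if_false):
        """One step of Bayes' rule: the posterior probability that a claim is true."""
        numerator = prior * likelihood_if_true
        denominator = numerator + (1 - prior) * likelihood_if_false
        return numerator / denominator

    # A dogmatic prior of exactly 0 (or 1) cannot be moved by any evidence.
    p = 0.0
    for _ in range(10):
        p = update(p, likelihood_if_true=0.9, likelihood_if_false=0.1)
    print(p)   # still 0.0

    # Cromwell's Rule: allow even a small probability that you may be mistaken,
    # and repeated evidence can eventually change your mind.
    p = 0.01
    for _ in range(10):
        p = update(p, likelihood_if_true=0.9, likelihood_if_false=0.1)
    print(p)   # very close to 1.0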

To conclude, as Eric Blair (also known as George Orwell) remarked with his usual excellent understanding:

".. we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield."

Address to the Melbourne Unitarian Church, Sunday, January 19, 2014