Epiphanies

Changing one’s mind.

In his famous book Thinking, Fast and Slow, Daniel Kahneman posits that the mind uses two basic systems of decision-making. The first is employed nearly all of the time in the thousands of quotidian decisions that confront people. Its judgments are generated so fast and require so little effort that we often do not even think of them as thinking. The second system is more analytical and deliberate. When we say that we are paying attention, we mean that System 2 is in charge. The funny thing about this arrangement is that we all identify with the second system even though the first one is clearly the mental workhorse. The commonly heard sentence “I don’t know why I did that” really means “System 2 does not know why System 1 did that.”

This made me think about epiphanies, which I would define as occasions on which someone’s System 2 decides that, because of some new information, it must override an area that had previously been assigned to System 1. I suppose that it could work the other way, too, when someone decides that he/she is just not going to worry about something any more. Going native, becoming an artist, or joining a monastery might qualify.

I experienced a very striking epiphany in 1973. I read a book with a theme somewhat similar to Kahneman’s entitled I’m OK, You’re OK. I learned soon enough that the divisions that the author, Thomas A. Harris, described among the Parent, Adult, and Child aspects of one’s thinking were not perfectly accurate. Even so, I found the medical research by Michael Gazzaniga that served as the basis for the Transactional Analysis described in the book even more compelling. The idea that the same human body possessed more than one thought process and, moreover, that the separate processes actually competed for control of the body’s functions affected me (and by me I mean my System 2) greatly. More than anything else, I found it very liberating. There was no reason for me (System 2) to feel guilty about something that he (System 1) had done. Of course, I could work hard to prevent him from doing it again, but if I had no idea of what was coming, no jury, not even a jury of a dozen nuns, could convict me.

A few years later I had an epiphany of a different sort. Once again a book, James Randi’s The Magic of Uri Geller, was the instigator. In those days I was quite interested in sleight-of-hand tricks. I had watched an astounded Charlton Heston on television — at the Amazing Kreskin’s direction — name the color of every card in a deck simply by holding the corner of each card between his fingers for a second. Even after I discovered how Kreskin did this incredibly simple trick, and I learned that Kreskin’s “mental powers” consisted of nothing more than memory and counting, I still thought that there must be some people who had paranormal abilities. This was not a thoroughly researched or deeply held conclusion. It just seemed reasonable that some of the thousands of people who claimed psychic abilities probably had powers that science could not yet understand. After all, bats and dolphins have abilities that are beyond those of ordinary humans. System 1 often bases its conclusions on impressions of that sort.

Randi’s book thoroughly exposed Geller, who was formerly a stage magician, and his assistant, formerly a stage magician’s assistant (!), as frauds. It also made it quite clear that the academics who had tested him and other such claimants did not understand how easy it was for one human being to fool another’s senses. They wrongly thought that they were smart enough to recognize tricks, and they may have wanted him to succeed.

The exposé induced me to pay more attention to the question of what deserves credence. Reading Randi’s book turned me into a skeptic about everything. Shortly thereafter my belief in just about everything invisible — psychic powers, ghosts, souls, guardian angels, etc. — fell away.

Recently I have been thinking about how one might induce an epiphany in someone else. Religious cults have developed a very effective technique. The first step is to smother someone with love and affection. The idea, in Kahneman’s terms, is to convince them to set aside System 2 processing for everything that has to do with the cult. System 1 quickly associates the cult with stimulation of the brain’s pleasure centers and thus forms a strongly favorable impression. Many become True Believers.

How can one reverse the process? How can one persuade another person to use System 2 in dealing with something (not necessarily a cult) that the target person had previously relegated to System 1? This is a thorny question. System 2 is inherently lazy, and employing it requires real effort that is difficult to sustain. I hope that Kahneman addresses this at some point in his book.

The dissertation that I abandoned long ago was going to focus on a very small aspect of this subject. A long string of well-known studies examined very specific questions that called for judgments in situations in which the probabilities of the outcomes had been specified. They showed that people consistently changed their minds about the advisability of certain options after a discussion with others. The direction of the shift is predictable: on some questions it moves toward taking risks; on others it moves in a more conservative direction. The effect is not perfect, but it is strong.

I speculated that some of the people who changed their minds probably did not understand the concept of probability well enough to calculate the best course. The discussion might have provided enough context so that System 2 decided that it had enough information to take over and make a different decision.
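For what it is worth, here is a minimal sketch, in Python, of the sort of calculation I had in mind. The scenario, the salary figures, and the probabilities are my own invented illustration of a choice-dilemma item, not material from the studies themselves; the point is only that picking the “best” course means comparing a sure payoff with the expected value of a gamble.

```python
# A toy version of a choice-dilemma item of the kind used in the risky-shift
# studies. All numbers are invented for illustration.

SAFE_SALARY = 50_000      # guaranteed salary at the secure job
RISKY_SALARY = 90_000     # salary at the startup, paid only if it survives

def expected_risky_salary(p_survival: float) -> float:
    """Expected salary of the risky option when the startup survives with probability p_survival."""
    return p_survival * RISKY_SALARY

# The classic questionnaires asked respondents for the lowest acceptable
# probability of success; here we simply scan candidate probabilities and
# note where the expected value of the risky option overtakes the sure thing.
for tenths in range(1, 11):
    p = tenths / 10
    better = "risky" if expected_risky_salary(p) > SAFE_SALARY else "safe"
    print(f"p = {p:.1f}: expected risky salary = {expected_risky_salary(p):>9,.0f} -> {better}")
```

Someone who cannot do that comparison in his or her head might reasonably latch onto whatever reasoning the group supplies, which is consistent with the shift that the studies observed.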

I wonder if I was right. I probably will never know.