Wednesday, September 23, 2009

Jonah Lehrer's "How We Decide"


This is a fun book covering current research in decision-making, full of interesting facts and anecdotes from the research literature.

The sad news is that we aren't really all that rational. Worse yet, Lehrer points out that the myth that the "rational" mind must control the emotional mind creates even more problems for us. There are lots of cases -- he gives research examples -- where less is more: situations where being less analytical actually produces better decisions. But he's not one-sided. He points out other research results showing when increased analysis is in fact useful. In short, the world is complicated.

I found this bit, discussing political viewpoints and how they affect decisions, very interesting:
The flawed thought process plays a critical role in shaping the opinions of the electorate. Partisan voters are convinced that they're rational -- it's the other side that's irrational -- but actually, all of us are rationalizers. The Princeton political scientist Larry Bartels analyzed survey data from the 1990s to prove this point. During the first term of Bill Clinton's presidency, the budget deficit declined by more than 90 percent. However, when Republican voters were asked in 1996 what happened to the deficit under Clinton, more than 55 percent said that it had increased. What's interesting about this data is that so-called high-information voters -- these are the Republicans who read the newspapers, watch cable news, and can identify their representatives in Congress -- weren't better informed than low-information voters. (Many low-information voters struggled to name the vice president.) According to Bartels, the reason knowing more about politics doesn't erase partisan bias is that voters tend to assimilate only those facts that confirm what they already believe. If a piece of information doesn't follow Republican talking points -- and Clinton's deficit reduction didn't fit the tax-and-spend-liberal stereotype -- then the information is conveniently ignored. "Voters think that they're thinking," Bartels says, "but what they're really doing is inventing facts or ignoring facts so that they can rationalize decisions they've already made." Once you identify with a political party, the world is edited to fit with your ideology.

At such moments, rationality actually becomes a liability, since it allows us to justify practically any belief. The prefrontal cortex is turned into an information filter, a way to block out disagreeable points of view. Let's look at an experiment done in the late 1960s by the cognitive psychologists Timothy Brock and Joe Balloun. Half of the subjects involved in the experiment were regular churchgoers, and half were committed atheists. Brock and Balloun played a tape-recorded message attacking Christianity, and, to make the experiment more interesting, they added an annoying amount of static -- a crackle of white noise -- to the recording. However, the listener could reduce the static by pressing a button, at which point the message suddenly became easier to understand.

The results were utterly predictable and rather depressing: the nonbelievers always tried to remove the static, while the religious subjects actually preferred the message that was harder to hear. Later experiments by Brock and Balloun that had smokers listening to a speech on the link between smoking and cancer demonstrated a similar effect. We all silence the cognitive dissonance through self-imposed ignorance.

Here's another bit, about war and killing, that I found very interesting. I already knew facts like this, but it is always interesting to get a new twist on old knowledge:
On the battlefield, men are explicitly encouraged to kill one another; the crime of murder is turned into an act of heroism. And yet, even in such violent situations, soldiers often struggle to get past their moral instincts. During World War II, for example, U.S. Army Brigadier General S.L.A. Marshall undertook a survey of thousands of American troops right after they'd been in combat. His shocking conclusion was that less than 20 percent actually shot at the enemy, even when under attack. ... When soldiers were forced to confront the possibility of directly harming other human beings -- this is a personal moral decision -- they were literally incapacitated by their emotions. "At the most vital point of battle," Marshall wrote, "the soldier becomes a conscientious objector."

After these findings were published, in 1947, the U.S. Army realized it had a serious problem. It immediately began revamping its training regimen in order to increase the "ratio of fire." New recruits began endlessly rehearsing the kill, firing at anatomically correct targets that dropped backward after being hit. As Lieutenant Colonel Dave Grossman noted, "What is being taught in this environment is the ability to shoot reflexively and instantly ... Soldiers are de-sensitized to the act of killing, until it becomes an automatic response." ... These new training techniques and tactics had dramatic results. Several years after he published his study, Marshall was sent to fight in the Korean War, and he discovered that 55 percent of the infantrymen were now firing their weapons. In Vietnam, the ratio of fire was nearly 90 percent. The army had managed to turn the most personal of moral situations into an impersonal reflex.

This book is well worth reading. It will expose you to new ideas.
