Leland Teschler, Executive Editor
On Twitter @DW_LeeTeschler
One of the strongest tendencies in the human thinking process is what’s called confirmation bias. It is a propensity to interpret new data in ways that support the opinions you already had.
Now, consider that most engineers are highly analytical people. Generally speaking, they have a strong ability to use quantitative data and can reason out a hypothesis pretty well. Unfortunately, having an analytical personality is bad news if you want to avoid confirmation bias. Though it may seem counterintuitive, psychologists say people with strong analytic personalities are more likely to twist data to confirm their own beliefs than are people with a lower ability to reason.
Researchers from Yale, Ohio State, Cornell, and the University of Oregon uncovered this tendency using an experiment where subjects had to solve a problem that depended on how well they could draw valid inferences from data presented to them. As you might expect, those who measured highest in numeracy did substantially better than less numerate people when researchers presented the data as though it resulted from a study of a new skin rash treatment.
But then the researchers took the same data and presented it as results from a study on whether gun control affected crime rates. This time, the people who scored highest in numeracy did worse than the rest at drawing valid inferences, even though the numbers presented for gun control were exactly the same as those for skin rashes.
Obviously, opinions were interfering with the ability to analyze data objectively. Tali Sharot, professor of cognitive neuroscience at University College London, points out that these findings debunk the idea that only less intelligent people are likely to twist facts. Ironically, she says, people may use their intelligence not to draw more accurate conclusions but to find fault with data they don't like.
Sharot also says this confirmation bias is why, when arguing with others, it may not help to call up facts and figures supporting our own position and contradicting theirs. Even smart people find it difficult to change their mind when presented with hard data indicating they’re wrong.
Keep these ideas in mind as you ponder another of Sharot's experiments. Sharot and her colleagues seated people at a computer keyboard and told them to press the space bar every time they saw a painting by Klee. The reward was a dollar for every correct spacebar press. But when a Picasso appeared, another button had to be pressed quickly to avoid losing a dollar. It turned out that people were quicker to press a key that gained them cash, and slower, or more likely to miss the keypress altogether, when pressing avoided a loss.
Sharot says biology explains the results. When we get the chance to acquire something good, our brains trigger a chain of biological events that makes us more likely to act fast. But when we anticipate something bad, our instinct is to withdraw, and the biological chain of events tends to inhibit a response. In a nutshell, we're more likely to execute an action when we anticipate something we like than when we're avoiding something bad, she says.
These results are worth keeping in mind amid the dynamics of an engineering project. Engineers with an axe to grind may not be swayed by contrary evidence. And managerial determination to push ahead with a questionable project may have little to do with grit and much to do with a biological predisposition to move forward. DW