Teschler on Topic
Leland Teschler • Executive Editor
It’s no secret that the media eagerly report potential technical breakthroughs with hyperventilating headlines. The most recent example is the buzz surrounding battery technology for electric vehicles. We might wonder why EVs don’t all have a 1,000-mile range given the number of news reports about spectacular advances coming out of battery research labs.
But the reason EVs still need to charge up after a few hundred miles is that the media often fail to clearly indicate the preliminary nature of the findings they trumpet. Even worse, they seldom report when the studies they previously hyped fail to pan out.
Researcher Estelle Dumas-Mallet of France’s University of Bordeaux tried to estimate the depth of this bias in news reporting. Dumas-Mallet and her colleagues examined 156 studies on disease risk that were written up in the popular press. All the studies getting this attention reported positive results. Thirty-five of these papers (covered in 234 news articles) were eventually proven invalid. Yet only four news articles were published pointing out the original stories were incorrect.
It seems that many journalists don’t understand the phrase “preliminary results” when it comes to science. This lack of comprehension adds to the sea of nonsense and half-truths now making up our news feeds. But it is only one aspect of why we seem to live in a never-ending stream of misinformation. Often, information is presented in ways that are either purposely or accidentally misleading.
You might think STEM grads would be better equipped to spot such shenanigans. Many of the engineers I’ve worked with have had a high opinion of their own mental faculties, particularly compared to the reasoning abilities of liberal arts majors. If your engineering courses were like mine, they focused heavily on problem solving. But the problems we were taught to solve were mainly those that yielded to physical principles and mathematical analysis, not exercises in sniffing out prevaricators.
Researchers studying misinformation say STEM grads are no better at detecting it than those with liberal arts degrees. In fact, STEM grads may be worse at ferreting out half-truths because STEM has an orientation toward problems with neat solutions.
“We generally do a good job teaching mechanics: students learn how to manipulate matrices, transfect cells, run genomic scans, and implement machine learning algorithms,” say University of Washington professors Carl Bergstrom and Jevin West. “But this focus on facts and skills comes at the expense of training and practice in the art of critical thinking.”
The UW professors say the problem with a single-minded focus on facts and skills is that it leaves STEM students unprepared to detect deceptive arguments and nonsense. Bergstrom and West, whose fields are biology and information technology, claim students in humanities get more practice in these areas than do those in STEM. “In the humanities and the social sciences, students are taught to smash conflicting ideas up against one another and grapple with discordant arguments. In STEM fields, students seldom are given paradoxes that they need to resolve, conflicting forms of evidence that they must reconcile, or fallacious claims that they need to critique,” the two say.
One problem the two see is that STEM grads tend to believe arguments backed up by numbers. But “Numbers are ideal vehicles for promulgating BS,” they say. “Numbers feel objective but are easily manipulated to tell whatever story one desires. … We are told that ‘the data never lie.’ But this perspective can be dangerous.”
Perhaps that’s a reason for some humility the next time you’re ready to believe breathtaking data from a battery lab. DW