Feelings matter more than facts alone: A challenge and opportunity for science advisers

To interact more effectively, science advisers should reconsider what facts, evidence, knowledge and reason really mean to people. Photo courtesy of ouvyt
By David Ropeik, Instructor, Harvard University

David contributed this article as part of our call for blogposts on conference themes. Submit your blogpost: http://bit.ly/1yyo1P2 

With a few exceptions, the idea of an independent government science adviser is not controversial. Few argue with the notion that governments can make wiser decisions about any given issue if they fully understand what the scientific evidence about that issue says. And because elected officials and their staff often lack the scientific expertise to achieve that understanding on their own, an independent office of science adviser is needed to provide that assistance and ensure that policy making is as fully informed as we all want it to be.

But many of the specific issues that science advisers are asked to advise on are controversial or emotionally charged, and that makes the actual work of the science adviser more difficult, for two reasons. One more obvious, one less so.

The obvious problem is that passion clouds reason. That leads to values-driven conflicts over what the evidence says, in which the facts are only weapons in a war about deeper worldviews and goals. Objective analysis of the evidence is caught in the crossfire of such affective combat, and often mortally wounded. This is the challenge to be discussed in Theme Three of the upcoming meeting of science advisers in New Zealand, Science Advice in the Context of Opposing Political/Ideological Positions.

In crisis circumstances, high emotions cloud reason (for well-understood neurological and biochemical reasons), leading to more instinctive and emotional perceptions of risk, which also makes it harder for people to hear the evidence objectively. This is the issue the conference will consider in Theme Two, Science advice in dealing with crises.

But the emotional nature of many science issues challenges science advisers in another, less obvious way. People of high academic and intellectual accomplishment place particular faith in the human mind’s capacity for reason. Central to how they see the world is the deep belief that human intelligence, and the application of the scientific method, can reveal facts and knowledge, truths, on which we can all agree. In short, people in the science and academic communities believe that humans are smart enough to objectively figure things out.

This leads most people of higher academic and intellectual achievement to assume that the way to deal with people who don’t understand the facts, or who disagree with the facts, is to educate them…to teach them…to explain the facts in ways that will help people objectively see ‘the truth’. This approach to science communication is culturally ingrained in people who have been students and teachers much of their professional lives.

And yet this ‘deficit model’ of communication either ignorantly or arrogantly ignores the findings from several fields of research, including neuroscience, psychology, sociology, and economics, that powerfully establish the fallacy of faith in pure objective reason.

A rich and growing body of evidence makes clear that human cognition is not, and can never be, purely dispassionate and objectively analytical (see Descartes’ Error by the neurologist António Damásio).

Our perceptions (and this applies to scientists and academics too) are a combination of the facts we may have and how we feel about those facts, what those facts mean to us. Melissa Finucane and Paul Slovic, pioneers in the study of the psychology of risk perception, call this the Affect Heuristic.

A substantial body of research reveals the individual cognitive components that shape how we feel. Kahneman et al. have identified a number of specific heuristics and biases that shape how we see the facts. Slovic and others have identified many specific psychological characteristics that make risks feel more worrisome, or less, the evidence notwithstanding. Research by Dan Kahan and others has advanced Mary Douglas’ Cultural Theory, establishing in the study of Cultural Cognition that we shape our views of the facts so they agree with those of the groups with which we most powerfully identify, a tribal affiliation the social human animal relies on for protection.

And all of this comes on top of neuroscientific discoveries by Joseph LeDoux and many others that the hardwiring and biochemistry of the brain ensure that we feel first and think/reason second, and that over time subconscious feelings and instincts shape our perceptions more than careful conscious cognitive deliberation. (These findings are summarized in my recent book, How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts.)

These insights explain, with reasonably granular precision, why people feel the way they do about the facts.

They explain why bright people can see the same facts in so many different ways.

They reveal the specific emotional and psychological elements behind the controversies in which scientific evidence, and the work of science advisers, are taken hostage by values and passions.

They are invaluable insights on which to base more effective risk communication, which must go beyond mere education and work to establish trust by demonstrating an understanding of, and respect for, how people feel.

They are tools to help with the challenges of Science Advice in the Context of Opposing Political/Ideological Positions and Science advice in dealing with crises.

And finally, these remarkable insights into human cognition pose a challenge to science advisers. They reveal that the way we figure things out doesn’t come close to meeting the Enlightenment ideal of pure Cartesian reason. As Ambrose Bierce put it in The Devil’s Dictionary, “the brain is only the organ with which we think we think.”

Facts alone have no meaning until our emotions, instincts, experiences and life circumstances give rise to how we feel about those facts. Interacting more effectively about science issues with governments, and with various publics, may require science advisers to reconsider what facts and evidence, and knowledge and reason, really mean.


David Ropeik is an Instructor in the Environmental Management Program of the Harvard University School of Continuing Education, a consultant in risk perception, risk communication and risk management, and author of “How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts.”

He can be reached at [email protected]