Falling on deaf ears
Psychological research can yield truths, but sometimes those truths are hard for people to swallow. When scientific evidence challenges an important belief, people often defend the belief by resisting the scientific conclusions. There are a variety of techniques by which people discount research evidence, including the scientific impotence excuse. People reject belief-threatening evidence by forming another belief – that scientific methods are unable (or impotent) to yield valid answers about the topic. Interestingly, the scientific impotence belief can also generalise to other topics, creating a general erosion of belief in the effectiveness of science to yield 'truths'. Is psychology particularly prone to this process? How can psychological researchers and educators spot and reduce such discounting?
A few years back, the stage magician and scientific sceptic James ‘The Amazing’ Randi appeared on a television show with a psychic who claimed to have telekinetic powers, the ability to move things with his mind. After the psychic appeared to successfully demonstrate telekinesis by making telephone book pages move, Randi suggested altering some conditions that would rule out any alternative explanations (namely, that the psychic was secretly blowing on the pages to make them move). Under these more rigorous, controlled conditions with Randi and the audience carefully observing, the psychic’s attempts at telekinesis failed.
When asked to explain, the psychic suggested that his powers were shy and disappeared under the harsh lighting of the studio, the whirring of the television cameras, and the sceptical eyes of the non-believers. Essentially, he suggested that his telekinesis could not be studied under scientifically rigorous conditions. Science was impotent to test whether his telekinesis existed or not. While this probably illustrates a con-man desperately grasping for an excuse to perpetuate his fraud, do people con themselves into believing that science is unable to address certain topics? Do we defend our own fraudulent beliefs in this way when scientific evidence challenges these beliefs? What impact does this have on people’s beliefs in the efficacy of science to answer a wide variety of questions?
The scientific impotence excuse
It is difficult to admit that you are wrong. In cases where a person is exposed to scientific conclusions that contradict their existing beliefs, one option would be to accept the scientific conclusions and change one’s beliefs. It sounds simple enough, and, for many topics, it is that simple. However, some of our beliefs are much more resistant to change. These are the ones that are important to us. They may be linked to important aspects of our identity or self-concept like core values (e.g. ‘I believe in the sanctity of human life’), social groups to which we are strongly connected (e.g. ‘I’m a union man like my father and grandfather before him’), or deeply-held emotions (e.g. ‘Homosexuality disgusts me’). When scientific conclusions challenge these kinds of beliefs, it is much harder to admit that we were wrong because doing so might require a rethinking of our sense of who we are, what values are important to us, with whom we align ourselves, and what our gut feelings tell us. Thus, a cognitively easier solution might be to not admit our beliefs have been defeated but to question the validity of the scientific conclusions. We might question the methodological quality of the scientific evidence (Lord et al., 1979; Munro & Ditto, 1997), the researcher’s impartiality (MacCoun & Paletz, 2009), or even the ability of scientific methods to provide us with useful information about this topic (and other topics as well). This final resistance technique is what I have called ‘scientific impotence’ (Munro, 2010).
In research published in the Journal of Applied Social Psychology, participants who either believed or disbelieved the stereotype that homosexuality is linked to psychopathology were given six brief scientific abstracts. The abstracts either all supported or all failed to support the stereotype. Abstracts were used to simulate the kind of real-world scientific information found on television news shows, newspaper and magazine columns, and even in textbooks, in which brief statements of the results and conclusions are provided with little or no description of the methodology. Because these abstracts focused on results and conclusions and failed to include methodological details, participants were not able to discredit the studies on methodological grounds. After reading the abstracts, participants rated the degree to which the question of whether or not homosexuality is linked to psychopathology is one that cannot be answered using scientific methods. Those reading belief-threatening abstracts endorsed the idea that scientific methods were impotent to address the question to a greater degree than those reading belief-supporting abstracts (and those in a control group). Interestingly, they also rated science as being more impotent to address a list of several other topics including the effectiveness of spanking, the physical health effects of herbal supplements, and the link between astrological sign and personality. In other words, exposure to scientific information that challenges an important belief can lead people to doubt the ability of science to answer real-world questions.
Social science impotence
Is psychological (and other social science) research particularly prone to provoke scientific discounting? In an interview with the APA Monitor, US Senator William Proxmire (famous for his ‘Golden Fleece’ awards given to government-funded research that he deemed wasteful of taxpayers’ money) stated the following about social science research: ‘It’s too bad they’re called sciences, because they are not quite. I don’t know what they are. They’re somewhere in between science and art’ (‘Proxmire speaks out’, 1975). The quote illustrates how social or ‘soft’ sciences are perceived differently than the ‘hard’ sciences.
There are a number of reasons why psychological science might be especially susceptible to being devalued. First, psychological research often does not resemble the science prototype. It is frequently conducted without the use of technologically sophisticated laboratories containing the fancy equipment that comes to many people’s minds when the word science is used. Thus, it may be perceived as lacking some of the qualities that give scientific evidence its epistemological power. Of course, some psychological research does employ high-tech laboratories (e.g. brain-imaging research), and research suggests that explanations that focus on the physiological underpinnings of psychological phenomena are indeed evaluated more positively (Weisberg et al., 2008). It would be interesting to test whether scientific impotence discounting is less likely in response to research evidence obtained from high-tech labs than to research evidence obtained from other methods (e.g. observational field studies).
Second, there is the belief that people’s thoughts and behaviours are less predictable, more mysterious, and affected by more variables than are inanimate objects like chemical molecules, planets in motion, or even the functioning of some parts of the human body (e.g. the kidneys). Furthermore, psychological science relies on probability (e.g. Variable A makes Behaviour B more likely to happen), and probability introduces the kind of ambiguity that makes conclusions easy to discount (Dunning et al., 1989).
Third, some psychological research is perceived to be derived from and possibly biased by the sociopolitical ideology of the researcher. That is, some believe that psychologists conduct their research with the goal of providing support for a particular political agenda. This is somewhat less common among the ‘hard sciences’ (although the controversy over climate change and the researchers who investigate it show that the ‘hard sciences’ are not immune to the problem of scientific discounting).
Fourth, people already have knowledge and expertise about human thought and behaviour (or believe that they do). Psychological research often investigates topics about which people already have lay theories or can easily call to mind experiences from their own lives that serve as comparisons to the research conclusions. When these opinions run counter to psychological research conclusions, then scientific discounting is likely. For example, there is a common belief that cathartic behaviours (e.g. punching a punchbag) can reduce the frustrations that sometimes lead to aggression. Psychological research, however, has contradicted the catharsis hypothesis (Lohr et al., 2007), yet the belief remains entrenched, possibly because it has such a strong intuitive appeal. Relatedly, psychological research (and social science generally) might be more likely to address topics that are linked to important beliefs, values, emotions and identities that people hold. Scientific conclusions on such topics are ripe for biased processing.
Short-circuiting scientific impotence discounting
How can strongly held beliefs be changed? How can scientific evidence break through the defensive tenacity of these beliefs? Scientific evidence can be threatening when it challenges an important belief. It can make a person question his or her intelligence, moral standing and group alliances in ways that make a person feel anxious, upset, angry and embarrassed (Munro & Ditto, 1997). Therefore, the most effective way to break the resistance to belief-challenging scientific conclusions is to present such conclusions in less threatening ways. For example, Cohen and his colleagues (2000, 2007) have shown that affirming a person’s values prior to presenting belief-challenging scientific conclusions reduces the usual resistance. In other words, the science is not so threatening when one has had a chance to bolster one’s value system. Relatedly, framing scientific conclusions in a way that is consistent with the values of the audience is more effective than challenging those values (Kahan, 2010). This can often be accomplished by identifying the proper narrative structure and content – the story that will resonate with the audience (Jones & McBeth, 2010).
Reducing the negative emotional reactions people feel in response to belief-challenging scientific evidence can make people more accepting of the evidence. For example, in the laboratory, resistance is decreased when people are given another source (something other than the scientific conclusions they read) to which they could attribute their negative emotional reactions (Munro et al., in press). While this might be difficult to implement outside of the laboratory, other factors can affect the degree to which negative emotional reactions occur. For example, a source who speaks with humility might be less upsetting than a sarcastic and arrogant pundit. Similarly, the use of discovery-type scientific words and phrases (e.g. ‘we learned that…’ or ‘the studies revealed that…’) might be less emotionally provocative than debate-type scientific words and phrases (e.g. ‘we argue that…’ or ‘we disagree with so-and-so’s position and contend that…’). In fact, anything that draws the ingroup–outgroup line in the sand is likely to lead to defensive resistance if it appears that the science or its source resides within the outgroup (Kahan et al., in press). Scientific evidence, even evidence that threatens a person’s beliefs, is more readily accepted when the source is a trusted member of one’s cultural ingroup, and this is especially true if the source appears to be presenting evidence against his or her own political self-interests (Eagly et al., 1978).
Finally, the media world is saturated with pundits and ‘truthiness’, where the validity of an argument is evaluated more by the conviction with which it was presented than on evidence and logic. The scientific world, on the other hand, tends toward measured conclusions containing caveats and exceptions that emerge at a deliberate pace. Thus, many scientists have difficulty communicating their conclusions in a way that can compete with the strong opinions of the talking heads that appear on the entertainment/propaganda shows that present themselves as television news shows (although see Box 1). In essence, scientists have trouble transitioning from the conservative communication style that is employed in the insulated scientific world of conference presentations and journal articles to the more straightforward evidence-based opinions that are required of a science advocate/public educator in the mediated world (Bushman & Anderson, 2001).
This analysis suggests the need for better science educators. Scientists should make public education a priority, not only in the media but also in the classroom. Building a solid framework of critical thinking skills in primary and secondary schooling as well as higher education may pave the way for a more scientifically informed citizenry. Frequent exposure to critical thinking skills, practice with critical thinking situations, and quality feedback about critical thinking allow people to understand how their own biases can affect their analysis of information, producing open-minded thinkers who are sceptical yet not defensive (Nisbett et al., 1987).
The discipline of psychology has made vast improvements in managing its public impression and is probably held in higher esteem than it was 50 or even 20 years ago. However, continued vigilance is essential against those (both within and outside of the discipline) who contribute to the perception of psychology as something less than science. The field of psychology has much to offer – it can generate important knowledge that can inform public policy and increase people’s health and happiness, but it cannot do so if its scientific conclusions fall on deaf ears.
Box 1: Scientists hit back?
Probably in response to an abundance of miscommunications about scientific evidence in the media, a number of hard-hitting science advocates have appeared on the mass media scene. For example, James Randi, Richard Dawkins and Ben Goldacre have all been accused of crossing the line from assertive to aggressive, and critics have suggested that they might do more harm than good.
The debate about these science educators illustrates the complexity of a broadcasting approach to science communication. When the target audience has such wide-ranging beliefs, it is impossible to effectively appeal to everyone. On the one hand, those who take a more forceful tone might be ‘fighting fire with fire’ or ‘standing toe-to-toe’ with the more straightforward approaches used by many who aim to discredit and deny scientific consensus (Bushman & Anderson, 2001). Their goal may be to gain an equal voice in the ears of undecided lay people, and a quarrelsome strategy may be quite effective. Among certain subcultures, failure to react strongly when provoked (e.g. the reserved tone used by most scientists) would be perceived negatively as a cowardly failure to defend one’s honour (Nisbett & Cohen, 1996). Also, those who already agree with the positions taken by these science advocates would likely find much validation in their delivery style. On the other hand, for those who hold strong beliefs against the positions taken by these science advocates, harsh and provocative communications will increase the threat, elicit negative emotional reactions, and lead to greater defensiveness. Their beliefs may harden rather than shifting toward the communication. Research evidence suggests that the best chance of changing the minds of the non-believers would be an artful combination of clear, strong logical argumentation mixed with value-affirming frames and presented in a humble manner that produces positive emotional reactions.
Bushman, B.J. & Anderson, C.A. (2001). Media violence and the American public. American Psychologist, 56, 477–489.
Cohen, G.L., Aronson, J. & Steele, C.M. (2000). When beliefs yield to evidence. Personality and Social Psychology Bulletin, 26, 1151–1164.
Cohen, G.L., Sherman, D.K., Bastardi, A. et al. (2007). Bridging the partisan divide: Self-affirmation reduces ideological closed-mindedness and inflexibility in negotiation. Journal of Personality and Social Psychology, 93, 415–430.
Dunning, D., Meyerowitz, J.A. & Holzberg, A.D. (1989). Ambiguity and self-evaluation: The role of idiosyncratic trait definitions in self-serving assessments of ability. Journal of Personality and Social Psychology, 57, 1082–1090.
Eagly, A., Wood, W. & Chaiken, S. (1978). Causal inferences about communicators and their effect on opinion change. Journal of Personality and Social Psychology, 36, 424–435.
Jones, M.D. & McBeth, M.K. (2010). A narrative policy framework: Clear enough to be wrong? The Policy Studies Journal, 38, 329–353.
Kahan, D. (2010). Fixing the communications failure. Nature, 463, 296–297.
Kahan, D.M., Braman, D., Cohen, G.L. et al. (in press). Who fears the HPV vaccine, who doesn’t, and why? An experimental study of the mechanisms of cultural cognition. Law and Human Behavior.
Lohr, J.M., Olatunji, B.O., Baumeister, R.F. & Bushman, B.J. (2007). The psychology of anger venting and empirically supported alternatives that do no harm. The Scientific Review of Mental Health Practice: Objective Investigations of Controversial and Unorthodox Claims in Clinical Psychology, Psychiatry, and Social Work, 5, 53–64.
Lord, C.G., Ross, L. & Lepper, M.R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37, 2098–2109.
MacCoun, R.J. & Paletz, S. (2009). Citizens’ perceptions of ideological bias in research on public policy controversies. Political Psychology, 30, 43–65.
Munro, G.D. (2010). The scientific impotence excuse: Discounting belief-threatening scientific abstracts. Journal of Applied Social Psychology, 40, 579–600.
Munro, G.D. & Ditto, P.H. (1997). Biased assimilation, attitude polarization, and affect in reactions to stereotype relevant scientific information. Personality and Social Psychology Bulletin, 23, 636–653.
Munro, G.D., Stansbury, J.A. & Tsai, J. (in press). A causal role for negative affect: Misattribution in biased evaluations of scientific information. Self and Identity.
Nisbett, R.E. & Cohen, D. (1996). Culture of honor: The psychology of violence in the South. Boulder, CO: Westview Press.
Nisbett, R.E., Fong, G.T., Lehman, D.R. & Cheng, P.W. (1987). Teaching reasoning. Science, 238, 625–631.
Proxmire speaks out on social science. (1975, May). APA Monitor, 6.
Weisberg, D.S., Keil, F.C., Goodstein, J. et al. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20, 470–477.