Published on: Aug 14, 2018

By Melanie C. Green, Ph.D., Associate Professor, Department of Communication, University at Buffalo

To make good decisions, we need good information. Every day, people form opinions on health treatments, political policies, and consumer products. Social sciences help us understand how people can separate accurate information from misinformation—information that is false or misleading. 

Communication researchers, psychologists, and political scientists have all provided valuable research highlighting the dangers of misinformation, the difficulties in correcting it, and the most effective strategies for resisting it. Social scientists are also tackling related topics like conspiracy theories and rumors. 

Misinformation can occur in several ways. First, there are sometimes honest mistakes in either reporting or research. For example, early reports from a crime scene might contain errors about the number of perpetrators or victims. Although this type of misinformation is usually corrected, the wrong information may have already been widely reported. 

Second, misinformation can be intentionally spread by individuals or organizations who stand to benefit from it. These benefits may be financial, political, or both. For instance, a company may promote a health benefit of its product even if research suggests that the claim is inaccurate. One classic case was the claim that Listerine mouthwash prevented colds. In the 1970s, the Federal Trade Commission required the manufacturers to correct this misinformation.

Political groups may spread false or misleading information about their opponents or about voting procedures. There are a variety of examples from the recent presidential election, ranging from made-up quotes to memes falsely claiming that individuals could vote by texting a candidate's name to a particular number.

One of the problems with misinformation is that it can be difficult to correct, particularly when the misinformation fits with individuals’ preconceptions (such as negative stories about a disliked politician). Furthermore, misinformation can be more dangerous than ignorance about a topic. People who have accepted misinformation may believe themselves to be well-informed and may resist learning correct information. 

Research from psychology, communication, and political science helps us understand why misinformation can be so resistant to change. First, misinformation can become integrated with our existing attitudes and beliefs, so that even if the information itself is corrected, the implications and conclusions drawn from it may remain. ("Even if that politician really didn't do that, it seems like the kind of thing they would do! I still don't like them.") Second, corrective information may be delayed or spread more slowly through media and social networks. Misinformation is often exciting or dramatic, which makes individuals more likely to share it, whereas the corrective information may not have the same emotional impact. Third, individuals may resist accepting corrections from disliked sources (for example, a person who is distrustful of the medical establishment may resist the message that vaccines do not cause autism).

On the bright side, social science research has also identified some best practices for reducing misinformation.

One important approach is “filling the gap.” That is, communicators should not just tell an audience what is not true, but when possible, they should provide an alternative (true) story. A common health myth is that people can get the flu from the flu shot. Simply telling people that this is not possible may not be enough; giving more information about how the immune response works may be needed to dispel the incorrect beliefs. People might not know that it takes two weeks for the immune response to occur, that side effects may mimic flu symptoms, or that the vaccine is not 100% effective. 

Another useful approach is to make sure the retraction or correct information is repeated. People encounter a lot of information every day, and we forget a lot of that information quickly. Reinforcing the correct information can be a big help. People tend to believe information that seems familiar and comes to mind easily. Additionally, communicators should be careful not to inadvertently repeat and reinforce the wrong information: ironically, formats such as “10 Health Myths!” can embed these myths more firmly in readers’ minds. 

Fostering healthy skepticism is a more general strategy. When possible, warning individuals about false information before they encounter it can help reduce their susceptibility. Additionally, instilling habits such as fact-checking or carefully considering sources can be valuable ways of stopping the spread of misinformation. Encouraging individuals to think before they share a story online is a useful practice.

When I first started studying misinformation, it seemed that misinformation was relatively infrequent; today, there is a flood of it. The challenges of misinformation are likely to be part of the media, health, and political landscape for the foreseeable future. Indeed, new technologies such as advances in video editing (“deep fakes”) that make individuals appear to be doing or saying things that they never did may lead to more powerful and problematic forms of misinformation.

Fortunately, social science is rising to the challenge. Our understanding of how human minds work, how our motivations affect our responses to information, and how to effectively communicate different types of messages will serve us well.

Why Social Science? Because tackling these problems is essential for helping protect individual well-being and democracy itself.


MELANIE C. GREEN is an Associate Professor in Communication at the University at Buffalo. Her work has focused on the power of stories. In particular, Melanie's research examines how narratives can change the way individuals think and behave, including the effects of fictional stories on real-world attitudes and the challenges of correcting story-based misinformation. Her theory of "transportation into a narrative world" focuses on immersion into a story as a mechanism of narrative influence. She has examined narrative persuasion in a variety of contexts, from health communication to social issues. She has edited two books on these topics (Narrative Impact and Persuasion: Psychological Insights and Perspectives, Second Edition) and has published numerous articles in leading psychology, communication, and interdisciplinary journals. Her work has also examined the influence of social media on interpersonal interactions and well-being. Her research has been funded by the National Science Foundation, the National Institutes of Health, the Spencer Foundation, and the Russell Sage Foundation. She received her Ph.D. in Psychology from Ohio State University.