Fake News, Big Lies: How Did We Get Here and Where Are We Going?
IPR experts explain how mis- and disinformation affect our lives and offer ideas for countering them
“Are we going to be a nation that lives not by the light of the truth but in the shadow of lies?” President Joe Biden asked the country on the first anniversary of the January 6 insurrection at the U.S. Capitol.
But distinguishing truth from lies can be difficult when, every day, Americans read and hear false “facts”—misinformation—and deliberately misleading information created to cause harm—disinformation.
IPR faculty experts have generated a noteworthy body of research across different disciplines that explores what drives people to believe in untruths—and how the U.S. may be especially susceptible to disinformation. They also examine how misinformation and disinformation have affected the media, our politics, and even our health.
- Why It’s Easy to Believe Misinformation and Disinformation
- How Misinformation and Disinformation Flourish in U.S. Media
- Declining Trust in News
- What About Social Media?
- Misinformation, Disinformation, and Polarization
- ‘Fake News’ in Presidential Elections
- Informational Distrust in the COVID-19 Era
- Resisting the ‘Shadow of Lies’
Although propaganda meant to persuade via argument, rumor, misunderstanding, and falsehood dates back at least to ancient Greece, misinformation and disinformation now sit at the center of debate and research. Scholars have identified an “information disorder syndrome”: the creation or sharing of false information out of error—misinformation—or to mislead or cause harm—disinformation or mal-information.
Why do people believe in misinformation and disinformation? Psychologist and IPR associate David Rapp, who studies how people learn through reading, finds that memory is key.
In experiments, he finds that when people read incorrect information, even about trivial subjects they already know, they often become confused and remember the inaccuracies. Subsequently, they answer questions using the incorrect statements.
“You can build memories for the things you’ve read that can then get resuscitated or recalled later in your decision making,” he said, especially when people are not carefully considering what they read.
Repeating false information over and over again—such as that the 2020 election was fraudulent— can lead to building memories for the information. And repeated information is often easy to retrieve, which can lead to problems, Rapp explained.
“If you can easily retrieve something, you tend to think it’s more true than if it’s something that’s hard to think of,” he said.
The more familiar people are with information they remember, including lies, the more likely they are to believe it is true, communication and policy scholar and IPR associate Erik Nisbet adds. In research with IPR associate research professor Olga Kamenchuk, he notes that people might believe misinformation or disinformation they recall even when they cannot recall whether its source was credible.
Moreover, people are more likely to believe the content they read or listen to that reflects the same emotions—anger, sadness, or anxiety—that they presently feel.
“Certain emotional states might make you more open to misinformation,” Nisbet said.
Breeding familiarity through repetition and seeing one’s emotional state mirrored in content are examples of a “mental shortcut,” according to Nisbet, and together they make people more likely to accept false information as true.
Examples of media bias charts that map newspapers, cable news, and other media sources on a political spectrum are easy to find. Can understanding bias in news sources help clarify why people fall prey to misinformation and disinformation?
Stephanie Edgerly, media scholar and IPR associate, suggests that a better place to start is with people’s individual biases, rather than those of news sources. In examining how people make sense of news sources, she points to the audience’s understanding of whether the source was news or entertainment—its genre—and its political orientation.
But how people perceive political orientation varies widely. Some see the media world as conservative vs. liberal with no middle ground. Others position news outlets in surprising places, such as the very conservative woman Edgerly interviewed who placed only Fox News in the center between right- and left-wing media.
“We need to be really careful about how we talk about media,” Edgerly said. “This either/or way of making sense of media is too reductive—it’s simplistic.”
She is also concerned that accusations of biased reporting—or worse—can backfire and lead people to lose trust in all sources.
“We’re in a moment where we give a lot of attention to what the negative sources, low quality, disinformation-prone sources, are doing,” Edgerly noted. “I see this as creating a narrative where people think: ‘There’s a lot of bad sources out there, I don’t know how to find good sources, and, therefore, I’m just not going to trust any of it.’”
For decades, the U.S. media market was known as apolitical, objective, and neutral, Edgerly points out, but that is no longer seen as true.
If people do not trust news sources, and there’s no general acceptance of where to find unbiased information, then misinformation and disinformation will likely continue to flourish, she says. In such a news environment, even fact-checking breaks down as a tool to change beliefs.
Media, technology, and society researcher and IPR associate Pablo Boczkowski explains that trust in news institutions, as well as political and social institutions, is declining in the U.S. as the country becomes more fractured.
In his research, Boczkowski shows that people view news reporting today as biased and polarized, and they are especially distrustful of news circulated via social media. They are also more concerned about the effects untrustworthy sources could have on others than on themselves.
He points out that an increase in the supply of misinformation does not necessarily imply an increase in its uptake.
“Most of the conversation—both academic and in news and policy circles—about issues of misinformation and disinformation focuses on the supply side: How much there is, and known distribution issues, how rapidly it propagates,” he said. He questions the implicit assumption that if there is more misinformation and disinformation, they must have proportionally more impact on the audience.
“I know that is not necessarily the case when I look at our research outside of the United States, at least,” he continued.
Social media such as Facebook and Twitter are often blamed as top disseminators of misleading and fabricated information. In public opinion surveys like this one on healthcare workers, respondents point directly at social media channels as spreaders of false information. While some IPR researchers hold social media channels responsible for misleading people, others note these outlets are easy targets of blame.
Boczkowski questions our “post-2016 fixation on the dystopic consequences of information technology,” pointing out that misinformation and disinformation are “as old as humanity itself.”
Nisbet offered, “I honestly believe that our focus on social media is a bit of a canard.”
“It’s easier to talk about regulating social media and dealing with social media as a problem than what I believe are the underlying political, economic, social, and cultural drivers of this ‘information disorder,’” he continued. “Social media might be a symptom or maybe amplifies like when you have a comorbidity—but it’s not the cause of our problems.”
IPR political scientist James Druckman, who studies the origins of partisanship and the role of persuasion in politics, sees “a mutually reinforcing relationship” between disinformation and polarization.
He describes those holding more polarized opinions as more likely to view information as biased and, therefore, more susceptible to partisan bias.
“That information may reinforce their polarized tendencies,” he explained.
“Yet what is less appreciated but equally concerning is false polarization where people have misinformation about the other side and that misperception fuels their own polarization,” he added. “They believe the other side is much more different and threatening than they actually are, and that breeds polarization with social and political consequences.”
In Nature Human Behaviour, Druckman and his co-authors note partisan media and social media’s contribution to partisan animosity, but they highlight other social and political causes as well.
“I would be hesitant to place all the blame for political ills on misinformation,” Druckman cautions. “There are equally, if not more crucial, social and institutional factors at play—such as demographic shifts and political institutions that were set up in ways that did not anticipate some of these shifts.”
Did misinformation and disinformation play a role in the 2016 and 2020 U.S. presidential elections?
As Nisbet notes, we know a good deal about false and misleading information and why people believe it. What we do not fully understand, according to his research, is the impact it has on people’s attitudes and behavior.
It may seem that “fake news” and social media conspiracy theories grew in size and importance. However, as Druckman points out, since we could not measure misinformation very well in the past, we do not really know its full impact on opinion.
“It remains unclear just how much misinformation is out there—most systematic studies suggest less than many think—and if it has changed, given we could not measure it as easily before,” he said.
During the 2016 campaign, candidates were the focus of misinformation and disinformation, Nisbet explains, much of it on social media and mainly about Democratic candidate Hillary Clinton. Those attacks ended after the election.
The press and social media were a bit savvier during the 2020 campaign, Nisbet says, about limiting the spread of misinformation. But after the election, a “deluge of misinformation” followed when Facebook eased up on its precautions.
“It was not about Biden. It was about the election results and electoral processes and the integrity of the election,” Nisbet explained. “So the timing and the nature of the misinformation/disinformation was very different in 2016 versus 2020.”
He sees the possible long-term effects of the spread of false information about election integrity as a huge concern.
“I think that misinformation/disinformation about candidates is bad,” Nisbet said. “But misinformation/disinformation that destroys trust in our political process, institutions, and the integrity of democracy itself is a thousand times worse.”
As we enter the third year of the COVID-19 pandemic, some IPR experts have turned to tracking how misinformation and disinformation affect people’s health and survival.
Much of Druckman’s recent research draws on the regular surveys collected and analyzed since April 2020 by the COVID States Project, a consortium of Northwestern, Northeastern, Harvard, and Rutgers universities that he co-leads. In July 2021, the project reported that people who relied on Facebook for news about COVID had substantially lower vaccination rates than the overall U.S. population and were more likely to believe falsehoods, such as that vaccines alter DNA or contain tracking microchips.
A November 2021 survey finds that nearly three-quarters (72%) of healthcare workers believe that misinformation has negatively influenced people’s decision to seek care for or get vaccinated against COVID-19.
Nisbet is also studying the effects of online COVID-19 information on health decisions in research supported by the National Science Foundation.
“One of the main effects of exposure, or at least endorsement, of COVID misinformation is reducing public trust in scientists and medical experts,” he said. “The more you believe false or misleading information about COVID the more you’re likely to be distrustful of public health or scientific experts.”
Communication studies researcher and IPR associate Ellen Wartella investigates how Twitter users promoted vaccine misinformation prior to the pandemic, connecting it with a decrease in vaccination rates for diseases such as measles and tetanus and with growing distrust of science and public health.
She sees a similar pattern of misinformation about the COVID vaccine.
“It’s absolutely the case that social media has been the main conveyor and mechanism by which anti-vaxxers can spread their message,” she said.
Rapp contributed to the “COVID-19 Vaccine Communication Handbook & Wiki,” an international collaboration created to improve vaccine communication and fight misinformation. To combat misinformation about the COVID vaccine, he suggests trying to find common ground with people to begin to persuade them.
“It’s going to take a concerted effort among many constituents,” he said.
Perhaps the biggest question hanging over the research is: How can we combat misinformation?
Druckman notes that a host of counter-techniques have been developed, such as media literacy courses.
Fact-checking is a very limited tool, as Edgerly and Nisbet observe, because it depends on the audience trusting the source of the checking.
Nisbet suggests what he calls “prebunking,” an “inoculation” against misinformation ahead of its distribution. For example, news organizations could have done more to publicize prior to the 2020 election that vote tallies would change overnight as mail-in ballots were added to the totals.
Rapp and Edgerly recommend scientists and journalists be more transparent about what they do.
“The general public largely doesn’t understand what journalists do, but they can recognize the power and importance of good journalism,” Edgerly said. She would like to see “a little bit of reminding the public about what journalism is supposed to do so it’s not tied into narratives about fake news and partisan bickering.”
Rapp encourages more “lateral reading” of different sources on the same subject—a technique endorsed in many classroom settings. He also suggests that academics, doctors, and politicians stop speaking about issues only in jargon and in a top-down way if they want to bridge the partisan divides exacerbated by misinformation and disinformation.
For Boczkowski, trust in institutions, including the media, is the fundamental issue. To restore trust, he says we must improve our institutions to work fairly for all groups, not just the privileged ones.
“Instead of spending so much time on [disinformation], we should spend all the energy we spend on that looking at what can we do to make our society more equitable, more just, more inclusive, to emphasize those that have been disenfranchised,” he said. “Otherwise, it’s like having a strep infection and thinking you’re going to cure it with Tylenol!”
Pablo Boczkowski is Hamad Bin Khalifa Al-Thani Professor in Communication Studies. James Druckman is Payson S. Wild Professor of Political Science and IPR associate director. Stephanie Edgerly is associate professor and director of research in the Medill School. Erik Nisbet is the Owen L. Coon Endowed Professor of Policy Analysis & Communication. David Rapp is professor of psychology. Ellen Wartella is Sheikh Hamad bin Khalifa Al-Thani Professor of Communication. All are IPR faculty members.
Published: January 26, 2022.