This article is republished from The Conversation under a Creative Commons license. Read the original article, which was published January 18, 2022.
Entering the new year, Americans are increasingly divided. They clash not only over differing opinions on COVID-19 risk or abortion, but also over basic facts like election counts and whether vaccines work. Surveying rising political antagonism, journalist George Packer recently wondered in The Atlantic, “Are we doomed?”
It is common to blame these divisions on people who intentionally distribute false information. Nobel Prize-winning journalist Maria Ressa says Facebook’s “[bias] against facts” threatens democracy. Others lament the loss of the “shared sense of reality” and “common baseline of fact” thought to be prerequisites for democracy.
Fact-checking, the rigorous independent verification of claims, is often presented as vital for fighting falsehoods. Elena Hernandez, a spokesperson for YouTube, states that “Fact checking is a crucial tool to help viewers make their own informed decisions” and “to address the spread of misinformation.” Ariel Riera, head of Argentina-based fact-checking organization Chequeado, argues that fact-checking and “quality information” are key in the fight against “the COVID-19 ‘infodemic.’”
Many people, including TV commentator John Oliver, are demanding that social media platforms do more to flag and combat the “flood of lies.” And worried Twitter engineers sought to “pre-bunk” viral falsehoods before they could spread during the United Nations’ Glasgow climate summit in 2021.
As a social scientist who researches the role of truth in a democracy, I believe this response to Americans’ deepening political divisions is missing something.
Fact-checking may be vital for media literacy, discouraging politicians from lying and correcting the journalistic record. But I worry that citizens hope for too much from fact-checking, and that fact checks oversimplify and distort Americans’ political conflicts.
Whether democracy requires a shared sense of reality or not, the more fundamental prerequisite is that citizens are capable of civilly working through their disagreements.
Curing misinformation?
Misinformation is no doubt troubling. COVID-19 fatalities and vaccine refusal are much higher among Republicans, who are more likely to believe unproven claims that COVID-19 deaths are intentionally exaggerated or that the vaccine harms reproductive health. And studies find that exposure to misinformation is correlated with a reduced willingness to get vaccinated.
Brookings Institution researchers found that fact-checking mostly influences the politically uncommitted: those who do not have much information about an issue, rather than those who hold inaccurate information. And debunking can backfire: Informing people that the flu shot cannot cause the flu, or that the MMR vaccine is safe for children, may make vaccine skeptics even more hesitant. Some participants in one study appeared to reject the information because it threatened their worldview. But some scientists say that fact-checking only very rarely backfires.
A 2019 experiment found that carefully crafted rebuttals to misinformation could dull the effects of false claims about vaccines or climate change, even for conservatives.
Still, a 2020 meta-analysis, a study that systematically combines dozens of research findings, concluded that fact-checking’s impact on people’s beliefs is “quite weak.” The more that a study looked like the real world, the less fact-checking changed participants’ minds.
Not that simple
The task of fact-checking also comes with its own set of problems. In my view, when the science is complex and uncertain, fact-checking’s biggest risk is exaggerating scientific consensus.
For example, the idea that COVID-19 might have emerged, or escaped, from a Wuhan, China, laboratory was labeled as “doubtful” in 2020 by The Washington Post’s fact-checkers. Facebook flagged it as “false information” in early 2021. But many scientists think the hypothesis merits investigation.
Or consider how USA Today labeled as “false” the idea that “natural” immunity protects as well as vaccination. The newspaper’s fact-checkers cited only a recent Centers for Disease Control and Prevention study and did not address earlier Israeli research suggesting the exact opposite. When fact-checkers present such a limited view of the facts in a scientific debate, they can leave citizens with the impression that the science is settled when it may not be.
Exaggerating the certainty of science can undermine public trust in science and journalism. When fact checks about masking flip-flopped in 2020, some people wondered whether the experts behind the fact checks were being honest with them.
Also lost in worries about the dangers of misinformation is the reality that factually dubious speech can be politically important. A screed against the MMR vaccine might repeat a discredited claim about immunization causing autism, but it also contains vital political facts: Some people distrust the U.S. Food and Drug Administration and the pharmaceutical industry and resent the amount of control they feel that state health officials wield over them.
Citizens don’t just need to be alerted to potential misinformation. They need to know why other people are skeptical of officials and their facts.
No winners, no losers
The problems that Americans face are often too complex for fact-checking. And people’s conflicts run far deeper than a belief in falsehoods.
Maybe it is better to let go, at least a little, of the idea that Americans must occupy a shared reality. The point of political systems is to peaceably resolve conflicts. It may be less important to our democracy that the media focus on factual clarity, and more vital that they help people disagree more civilly.
Psychologist Peter Coleman studies how people discuss contentious issues. He has found that those conversations aren’t constructive when participants think of them in terms of truth and falsehood or pro and con positions, which tend to spur feelings of contempt.
Rather, productive discussions about difficult topics happen by encouraging participants to see reality as complex. Simply reading an essay highlighting the contradictions and ambiguities in an issue leads people to argue less and converse more. The focus becomes mutual learning rather than being right.
But it isn’t clear how best to bring Coleman’s findings out of the laboratory and into the world.
I propose that news outlets offer not only fact checks but also “disagreement checks.”
Rather than label the “lab leak” hypothesis or “natural immunity” idea as true or false, disagreement checkers would highlight the complicated sub-issues involved. They would show how the uncertain science looks very different depending on people’s values and level of trust.
Disagreement checks would be less concerned, for instance, with whether it is correct to call ivermectin a “horse dewormer.” Instead, they would explore why some citizens might favor untested treatments over the vaccine, for reasons other than misinformation.
Maybe some combination of fact-checking and other tools can curb the public’s susceptibility to being misled. But by focusing a little less on the facts and more on the complexities of the problems that divide them, Americans can take one big step back from the abyss, and toward each other.
Written by Taylor Dotson, Associate Professor of Social Science, New Mexico Tech.