Surprising study results show limits of using recommended steps to debunk false content
New York University
Conventional wisdom suggests that searching online to evaluate the veracity of misinformation would reduce belief in it. But a new study by a team of researchers shows the opposite occurs: Searching to evaluate the truthfulness of false news articles actually increases the probability of believing misinformation.
The findings, which appear in the journal Nature, offer insights into the impact of search engines' output on their users, a relatively under-studied area.
"Our study shows that the act of searching online to evaluate news increases belief in highly popular misinformation -- and by notable amounts," says Zeve Sanderson, founding executive director of New York University's Center for Social Media and Politics (CSMaP) and one of the paper's authors.
This outcome may be explained by search-engine outputs: in the study, the researchers found that the phenomenon is concentrated among individuals for whom search engines return lower-quality information.
"This
points to the danger that 'data voids' -- areas of the information ecosystem
that are dominated by low quality, or even outright false, news and information
-- may be playing a consequential role in the online search process, leading to
low return of credible information or, more alarming, the appearance of
non-credible information at the top of search results," observes lead
author Kevin Aslett, an assistant professor at the University of Central
Florida and a faculty research affiliate at CSMaP.
In the newly published Nature study, Aslett, Sanderson, and their colleagues examined the impact of using online search engines to evaluate false or misleading views, an approach encouraged by technology companies and government agencies, among others.
To do so, they recruited participants through both Qualtrics and Amazon's Mechanical Turk, tools frequently used in behavioral science research, for a series of five experiments aimed at gauging the impact of a common behavior: searching online to evaluate news (SOTEN).
The first four studies tested the following aspects of online search behavior and impact:

· The effect of SOTEN on belief in both false or misleading and true news within two days of an article's publication (false popular articles included stories on COVID-19 vaccines, the Trump impeachment proceedings, and climate events)

· Whether SOTEN can change an individual's evaluation after they have already assessed the veracity of a news story

· The effect of SOTEN months after publication

· The effect of SOTEN on recent news about a salient topic with significant news coverage, in the case of this study, news about the COVID-19 pandemic
A fifth study combined a survey with web-tracking data in order to identify the effect of exposure to both low- and high-quality search-engine results on belief in misinformation. By collecting search results through a custom web browser plug-in, the researchers could examine how the quality of those results may affect users' belief in the misinformation being evaluated.
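To make the web-tracking design concrete, here is a minimal sketch of how a plug-in-collected search record might be structured for analysis. All field names and values are hypothetical; the study does not describe its instrumentation at this level of detail.

```python
# A minimal, hypothetical sketch of one search a participant ran while
# evaluating a news article. The field names are illustrative assumptions,
# not the study's actual data schema.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class SearchCapture:
    """One search captured by the browser plug-in."""
    participant_id: str
    article_id: str            # the news story being evaluated
    query: str                 # the search terms the participant used
    result_domains: list[str]  # domains of returned results, in rank order
    captured_at: datetime = field(default_factory=datetime.now)


# Example record with made-up domains
capture = SearchCapture(
    participant_id="p-0042",
    article_id="covid-vaccine-story-17",
    query="covid vaccine engineered claim",
    result_domains=["example-news.com", "example-blog.net", "example-fringe.org"],
)
print(capture.query, len(capture.result_domains))
```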
The study's source credibility ratings were determined by NewsGuard, a browser extension that rates news and other information sites to help users assess the trustworthiness of the content they encounter online.
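As an illustration of how such ratings can be applied to search output, the sketch below computes the share of low-quality domains in a result set. NewsGuard scores sites on a 0-100 scale; the specific domain ratings and the 60-point trust threshold used here are assumptions for illustration, not the study's data.

```python
# A minimal sketch of scoring search-result domains against
# NewsGuard-style source ratings. All numbers below are illustrative.
TRUST_THRESHOLD = 60  # assumed cutoff separating trusted from low-quality sites

# Hypothetical domain -> credibility score lookup
RATINGS = {
    "example-news.com": 95,
    "example-blog.net": 55,
    "example-fringe.org": 20,
}


def low_quality_share(result_domains: list[str]) -> float:
    """Fraction of returned domains rated below the trust threshold.

    Unrated domains are skipped here; a real analysis would need an
    explicit rule for handling them.
    """
    rated = [RATINGS[d] for d in result_domains if d in RATINGS]
    if not rated:
        return 0.0
    return sum(score < TRUST_THRESHOLD for score in rated) / len(rated)


print(low_quality_share(["example-news.com", "example-fringe.org"]))  # 0.5
```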
Across the five studies, the authors found that the act of searching online to evaluate news led to a statistically significant increase in belief in misinformation, whether shortly after the misinformation's publication or months later. This suggests that the passage of time, and ostensibly the opportunity for fact checks to enter the information ecosystem, does not lessen the effect of SOTEN on the likelihood of believing false news stories to be true. Moreover, the fifth study showed that the phenomenon is concentrated among individuals for whom search engines return lower-quality information.
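For a feel of what "statistically significant increase" means operationally, below is a minimal sketch of a two-proportion comparison between a search (treatment) group and a no-search (control) group. The counts are invented for illustration, and the paper's actual analysis uses richer models; this only shows the basic contrast being estimated.

```python
# A minimal sketch of a two-proportion z-test comparing belief rates in a
# search group versus a no-search group. All counts are hypothetical.
import math


def two_proportion_ztest(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value


# Hypothetical: 33% of searchers vs 23% of non-searchers rate a false story true
z, p = two_proportion_ztest(x1=165, n1=500, x2=115, n2=500)
print(f"z = {z:.2f}, p = {p:.4f}")
```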
"The
findings highlight the need for media literacy programs to ground
recommendations in empirically tested interventions and search engines to
invest in solutions to the challenges identified by this research,"
concludes Joshua A. Tucker, professor of politics and co-director of CSMaP,
another of the paper's authors.
The paper's other authors included William Godel and Jonathan Nagler of NYU's Center for Social Media and Politics, and Nathaniel Persily of Stanford Law School.
The study was supported by a grant from the National Science Foundation (2029610).