Social media experiment reveals potential to 'inoculate' millions of users against misinformation
University of Cambridge
Short animations giving viewers a taste of the tactics behind misinformation can help to "inoculate" people against harmful content on social media when deployed in YouTube's advert slot, according to a major online experiment led by the University of Cambridge.
Working with Jigsaw (https://jigsaw.google.com/), a unit within Google dedicated to tackling threats to open societies, a team of psychologists from the universities of Cambridge and Bristol created 90-second clips designed to familiarise users with manipulation techniques such as scapegoating and deliberate incoherence.
This "pre-bunking" strategy pre-emptively exposes people to tropes at the root of malicious propaganda, so they can better identify online falsehoods regardless of subject matter.
Researchers behind the Inoculation Science project (https://inoculation.science/) compare it to a vaccine: giving people a "micro-dose" of misinformation in advance helps prevent them falling for it in future -- an idea based on what social psychologists call "inoculation theory."
The findings, published in Science Advances, come from seven experiments involving a total of almost 30,000 participants -- including the first "real world field study" of inoculation theory on a social media platform -- and show a single viewing of a film clip increases awareness of misinformation.
The videos introduce concepts from the "misinformation playbook," illustrated with relatable examples from film and TV such as Family Guy or, in the case of false dichotomies, Star Wars ("Only a Sith deals in absolutes").
"YouTube
has well over 2 billion active users worldwide. Our videos could easily be
embedded within the ad space on YouTube to prebunk misinformation," said
study co-author Prof Sander van der Linden, Head of the Social Decision-Making
Lab (SDML) at Cambridge, which led the work.
"Our
research provides the necessary proof of concept that the principle of
psychological inoculation can readily be scaled across hundreds of millions of
users worldwide."
Lead author Dr Jon Roozenbeek from Cambridge's SDML describes the team's videos as "source agnostic," avoiding biases people have about where information is from, and how it chimes -- or not -- with what they already believe.
"Our
interventions make no claims about what is true or a fact, which is often
disputed. They are effective for anyone who does not appreciate being
manipulated," he said.
"The
inoculation effect was consistent across liberals and conservatives. It worked
for people with different levels of education, and different personality types.
This is the basis of a general inoculation against misinformation."
Google -- YouTube's parent company -- is already harnessing the findings. At the end of August, Jigsaw will roll out a prebunking campaign across several platforms in Poland, Slovakia, and the Czech Republic to get ahead of emerging disinformation relating to Ukrainian refugees. The campaign is designed to build resilience to harmful anti-refugee narratives, in partnership with local NGOs, fact checkers, academics, and disinformation experts.
"Harmful
misinformation takes many forms, but the manipulative tactics and narratives
are often repeated and can therefore be predicted," said Beth Goldberg,
co-author and Head of Research and Development for Google's Jigsaw unit.
"Teaching
people about techniques like ad-hominem attacks that set out to manipulate them
can help build resilience to believing and spreading misinformation in the
future.
"We've
shown that video ads as a delivery method of prebunking messages can be used to
reach millions of people, potentially before harmful narratives take
hold," Goldberg said.
The team argue that pre-bunking may be more effective at fighting the misinformation deluge than fact-checking each untruth after it spreads -- the classic 'debunk' -- which is impossible to do at scale, and can entrench conspiracy theories by feeling like personal attacks to those who believe them.
"Propaganda,
lies and misdirections are nearly always created from the same playbook,"
said co-author Prof Stephan Lewandowsky from the University of Bristol.
"We developed the videos by analysing the rhetoric of demagogues, who deal
in scapegoating and false dichotomies."
"Fact-checkers
can only rebut a fraction of the falsehoods circulating online. We need to
teach people to recognise the misinformation playbook, so they understand when
they are being misled."
Six initial controlled experiments featured 6,464 participants, with the sixth experiment conducted a year after the first five to ensure earlier findings could be replicated.
Data collection for each participant was comprehensive, ranging from basic information -- gender, age, education, political leanings -- to levels of numeracy, conspiratorial thinking, news and social media use, "bullshit receptivity," and a personality inventory, among other variables.
Factoring all this in, the team found that the inoculation videos improved people's ability to spot misinformation, and boosted their confidence in being able to do so again. The clips also improved the quality of "sharing decisions": whether or not to spread damaging content.
Two of the animations were then tested "in the wild" as part of a vast experiment on YouTube, with clips positioned in the pre-video advert slot that provides an option to skip after five seconds.
Google Jigsaw exposed around 5.4 million US YouTube users to an inoculation video, with almost a million watching for at least 30 seconds. The platform then gave a random 30% of users who watched a voluntary test question within 24 hours of their initial viewing.
The clips aimed to inoculate against the misinformation tactics of hyper-emotive language and false dichotomies, and the questions -- based on fictional posts -- tested for detection of these tropes. YouTube also gave the same test question to a "control" group of users who had not viewed a video. In total, 22,632 users answered a question.
Despite the intense "noise" and distractions on YouTube, the ability to recognise manipulation techniques at the heart of misinformation increased by 5% on average.
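The release does not spell out the statistical model behind that figure, but in a design like this the effect is essentially the gap in correct-answer rates between inoculated and control respondents. The sketch below is a minimal illustration of such a two-proportion comparison; the counts are made-up placeholders chosen only to sum to roughly the 22,632 respondents reported, not the study's actual data.

```python
from statistics import NormalDist

def proportion_lift(treat_correct, treat_n, ctrl_correct, ctrl_n):
    """Difference in correct-answer rates (treatment minus control)
    with a 95% confidence interval, via a normal approximation."""
    p_t = treat_correct / treat_n
    p_c = ctrl_correct / ctrl_n
    lift = p_t - p_c
    # Standard error of the difference between two independent proportions
    se = (p_t * (1 - p_t) / treat_n + p_c * (1 - p_c) / ctrl_n) ** 0.5
    z = NormalDist().inv_cdf(0.975)  # two-sided 95%
    return lift, (lift - z * se, lift + z * se)

# Hypothetical counts for illustration only -- NOT the study's data.
lift, ci = proportion_lift(treat_correct=8_100, treat_n=11_300,
                           ctrl_correct=7_500, ctrl_n=11_332)
print(f"Estimated lift: {lift:.1%}, 95% CI: ({ci[0]:.1%}, {ci[1]:.1%})")
```

With these placeholder counts the estimated lift comes out near the 5-percentage-point figure the researchers report, and the width of the interval shows why a sample of tens of thousands of respondents was needed to measure an effect of that size reliably.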
Google say the unprecedented nature of the experiment means there is no direct data comparison available. However, increases in brand awareness from advertising on YouTube -- known as "brand lift" -- are typically limited to 1% in surveys of under 45,000 users.
"Users
participated in tests around 18 hours on average after watching the videos, so
the inoculation appears to have stuck," said van der Linden.
Researchers say that such a recognition increase could be game-changing if dramatically scaled up across social platforms -- something that would be cheap to do. The average cost for each view of significant length was just US$0.05; at that rate, reaching a million viewers would cost roughly US$50,000.
Added Roozenbeek: "If anyone wants to pay for a YouTube campaign that measurably reduces susceptibility to misinformation across millions of users, they can do so, and at a minuscule cost per view."