Game combats political misinformation by letting players undermine democracy
A short online game in which players
are recruited as a “Chief Disinformation Officer” and use tactics such as
trolling to sabotage elections in a peaceful town has been shown to reduce
susceptibility to political misinformation in its users.
The free-to-play Harmony Square has been released to the public, along with a study on its effectiveness published in the Harvard Kennedy School Misinformation Review.
It was created by University of
Cambridge psychologists with support from the US Department of State's Global
Engagement Center and the Department of Homeland Security's Cybersecurity and
Infrastructure Security Agency (CISA).
The gameplay is based on
“inoculation theory”: the idea that exposing people to a weak “dose” of the common
techniques used to spread fake news allows them to better identify and
disregard misinformation when they encounter it in future.
In this case, by understanding how
to incite political division in the game using everything from bots and
conspiracies to fake experts, players get a form of “psychological vaccine”
against the product of these techniques in the real
world.
“Trying to debunk misinformation
after it has spread is like shutting the barn door after the horse has bolted.
By pre-bunking, we aim to stop the spread of fake news in the first place,”
said Dr Sander van der Linden, Director of the Cambridge Social Decision-Making
Lab and senior author of the new study.
Twitter has started using a “pre-bunk” approach:
highlighting types of fake news likely to be encountered in feeds during the US
election. However, researchers argue that familiarising people with the techniques
behind misinformation builds a “general inoculation”, reducing the need to
rebut each individual conspiracy.
In the 10-minute game Harmony
Square, a small-town neighbourhood “obsessed with democracy” comes
under fire as players bait the square’s “living statue”, spread falsehoods
about its candidate for “bear controller”, and set up a disreputable online news
site to attack the local TV anchor.
“The game itself is quick, easy and
tongue-in-cheek, but the experiential learning that underpins it means that
people are more likely to spot misinformation, and less likely to share it,
next time they log on to Facebook or YouTube,” said Dr Jon Roozenbeek, a
Cambridge psychologist and lead author of the study.
Over the course of four short levels, users learn about five manipulation techniques: trolling to provoke outrage; exploiting emotional language to create anger and fear; artificially amplifying reach through bots and fake followers; creating and spreading conspiracy theories; and polarising audiences.
In a randomised controlled trial,
researchers took 681 people and asked them to rate the reliability of a series
of news and social media posts: some real, some misinformation, and some fabricated
items created for the study in case participants had already come across the
real-world examples.
They gave roughly half the
sample Harmony Square to play, while the
other half played Tetris, and then asked them to rate another series of
news posts.
The perceived reliability of
misinformation dropped an average of 16% in those who completed Harmony
Square compared to their assessment prior to playing. The game also
reduced willingness to share fake news with others by 11%. Importantly, the
players’ own politics – whether they leaned left or right – made no
difference.
Having the “control group” who
played Tetris allowed the scientists to determine an “effect size” of 0.54 for
the study, said Van der Linden.
“The effect size suggests that if
the population was split equally like the study sample, 63% of the half that
played the game would go on to find misinformation significantly less reliable,
compared to just 37% of the half left to navigate online information without
the inoculation of Harmony Square,” he said.
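The article does not show how an effect size of 0.54 maps onto that 63/37 split, but the figures are consistent with a standard binomial effect size display, in which Cohen's d is first converted to a correlation r and the 50/50 baseline is then shifted by r/2 in each direction. A minimal Python sketch of that assumed conversion:

```python
import math

# Assumed reconstruction (not spelled out in the article): convert the quoted
# Cohen's d into a binomial effect size display (BESD), which re-expresses the
# effect as a "success rate" split between two equal-sized groups.
d = 0.54                                    # effect size reported for the study

r = d / math.sqrt(d**2 + 4)                 # d -> point-biserial r, equal group sizes
game_share = 0.5 + r / 2                    # half that played Harmony Square
control_share = 0.5 - r / 2                 # half that played Tetris

print(f"r = {r:.2f}")                                           # ~0.26
print(f"game: {game_share:.0%}, control: {control_share:.0%}")  # ~63% vs 37%
```

On that reading, the percentages describe relative likelihoods under an idealised 50/50 split rather than raw proportions observed in the 681-person sample.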
The project follows other playful
attempts by CISA to illustrate how “foreign influencers” use disinformation to
target “hot button” issues. A previous demonstration took the example of whether pineapple belongs
on pizza.
However, Harmony Square is
based on the findings of a number of studies from
the Cambridge team showing how similar gamified approaches to digital literacy
significantly reduce susceptibility to fake news and online
conspiracies.
The team behind the game, which
includes the Dutch media agency DROG and designers Gusmanson, have recently
worked with the UK Cabinet Office on Go
Viral!, an intervention that specifically tackles conspiracies
around COVID-19.
Harmony Square is geared
towards the politically charged misinformation that has plagued many
democracies over the last decade. “The aftermath of this week’s election day is
likely to see an explosion of dangerous online falsehoods as tensions reach
fever pitch,” said Van der Linden.
“Fake news and online conspiracies
will continue to chip away at the democratic process until we take seriously
the need to improve digital media literacy across populations. The
effectiveness of interventions such as Harmony Square is a promising start,”
he said.