People Willing To Spread Misinformation if They Believe It Could Become True in the Future
By AMERICAN PSYCHOLOGICAL ASSOCIATION
People may be willing to condone statements they know to be false and even spread misinformation on social media if they believe those statements could become true in the future, according to research published by the American Psychological Association.
Whether the situation involves a politician making a controversial statement, a business stretching the truth in an advertisement, or a job seeker lying about their professional skills on a resume, people who consider how a lie might become true subsequently think it is less unethical to tell because they judge the lie’s broader message (or “gist”) as truer. The study was published in APA’s Journal of Personality and Social Psychology.
“The rise in misinformation is a pressing societal problem, stoking political polarization and eroding trust in business and politics. Misinformation in part persists because some people believe it. But that’s only part of the story,” said lead author Beth Anne Helgason, a doctoral student at the London Business School. “Misinformation also persists because sometimes people know it is false but are still willing to excuse it.”
This study was sparked by cases in which leaders in business and politics have used claims that “it might become true in the future” to justify statements that are verifiably false in the present.
To explore why people might be willing to condone this misinformation, researchers conducted six experiments involving more than 3,600 participants. The researchers showed participants in each study a variety of statements, clearly identified as false, and then asked some participants to reflect on predictions about how the statements might become true in the future.
In one experiment, researchers asked 447 MBA students from 59 different countries who were taking a course at a UK business school to imagine that a friend lied on their resume, for example by listing financial modeling as a skill despite having no prior experience.
The researchers then asked some participants to consider the possibility of the lie becoming true (e.g., “Consider that if the same friend enrolls in a financial modeling course that the school offers in the summer, then he could develop experience with financial modeling”). They found that students thought it was less unethical for a friend to lie when they imagined how their friend might develop this skill in the future.
In another experiment, 599 American participants viewed six blatantly false political statements designed to appeal to either conservatives or liberals, including, “Millions of people voted illegally in the last presidential election” and, “The average top CEO makes 500 times more than the average worker.”
Each statement was clearly labeled as false by reputable, non-partisan fact-checkers. Participants were then asked to generate their own predictions about how each statement might become true in the future. For instance, they were told that “It’s a proven fact that the average top CEO currently makes 265 times more money than the average American worker,” then asked to respond to the open-ended prompt, “The average top CEO will soon make 500 times more money than the average American worker if …”
The researchers found that participants on both sides of the political aisle who imagined how a false statement could eventually become true were less likely to rate it as unethical than those who did not, because they were more likely to believe its broader meaning was true. This was especially the case when the false statement fit their political views. Importantly, participants knew these statements were false, yet imagining how they might become true made people find them more excusable.
Even prompting participants to think carefully before judging the falsehoods did not change how ethical they found the statements, said study co-author Daniel Effron, PhD, a professor of organizational behavior at the London Business School.
“Our findings are concerning, particularly given that we find that encouraging people to think carefully about the ethicality of statements was insufficient to reduce the effects of imagining a future where it might be true,” Effron said. “This highlights the negative consequences of giving airtime to leaders in business and politics who spout falsehoods.”
The researchers also found that participants were more inclined to share misinformation on social media when they imagined how it might become true, but only if it aligned with their political views. This suggests that when misinformation supports one’s politics, people may be willing to spread it because they believe the statement to be essentially, if not literally, true, according to Helgason.
“Our findings reveal how our capacity for imagination affects political disagreement and our willingness to excuse misinformation,” Helgason said. “Unlike claims about what is true, propositions about what might become true are impossible to fact-check. Thus, partisans who are certain that a lie will become true eventually may be difficult to convince otherwise.”
Reference: “It Might Become True: How Prefactual Thinking Licenses Dishonesty” by Beth Anne Helgason and Daniel Effron, PhD, London Business School, 14 April 2022, Journal of Personality and Social Psychology. DOI: 10.1037/pspa0000308