Wednesday, November 15, 2023

Republicans are more comfortable with lies

Democrats and Republicans have sharply different attitudes about removing misinformation from social media


Your political leanings go a long way to determine whether you think it’s a good or bad idea to take down misinformation. Johner Images via Getty Images
Misinformation is a key global threat, but Democrats and Republicans disagree about how to address the problem. In particular, Democrats and Republicans diverge sharply on removing misinformation from social media.

Only three weeks after the Biden administration announced the Disinformation Governance Board in April 2022, the effort to develop best practices for countering disinformation was halted because of Republican concerns about its mission. Why do Democrats and Republicans have such different attitudes about content moderation?

My colleagues Jennifer Pan and Margaret E. Roberts and I found in a study published in the journal Science Advances that Democrats and Republicans not only disagree about what is true or false but also differ in their internalized preferences for content moderation. These internalized preferences may be related to people’s moral values, identities or other psychological factors, or to people internalizing the preferences of party elites.

And though people sometimes strategically want misinformation that counters their political views to be removed, internalized preferences are a much larger factor in the differing attitudes toward content moderation.

Internalized preferences or partisan bias?

In our study, we found that Democrats are about twice as likely as Republicans to want to remove misinformation, while Republicans are about twice as likely as Democrats to consider the removal of misinformation to be censorship.

Democrats’ attitudes might depend somewhat on whether the content aligns with their own political views, but this seems to be due, at least in part, to different perceptions of accuracy.

Previous research showed that Democrats and Republicans have different views about content moderation of misinformation. One of the most prominent explanations is the “fact gap”: the difference in what Democrats and Republicans believe is true or false. 

For example, a study found that both Democrats and Republicans were more likely to believe news headlines that were aligned with their own political views.

But it is unlikely that the fact gap alone can explain the huge differences in content moderation attitudes. That’s why we set out to study two other factors that might lead Democrats and Republicans to have different attitudes: a preference gap and party promotion.

A preference gap is a difference in internalized preferences about whether, and what, content should be removed. Party promotion is a person making content moderation decisions based on whether the content aligns with their partisan views.

We asked 1,120 U.S. survey respondents who identified as either Democrat or Republican about their opinions on a set of political headlines that we identified as misinformation based on a bipartisan fact check. Each respondent saw one headline that was aligned with their own political views and one headline that was misaligned. 

After each headline, the respondent answered whether they would want the social media company to remove the headline, whether they would consider it censorship if the social media platform removed the headline, whether they would report the headline as harmful, and how accurate the headline was.

Deep-seated differences

When we compared how Democrats and Republicans would deal with headlines overall, we found strong evidence for a preference gap. 

Overall, 69% of Democrats said misinformation headlines in our study should be removed, but only 34% of Republicans said the same; 49% of Democrats considered the misinformation headlines harmful, but only 27% of Republicans said the same; and 65% of Republicans considered headline removal to be censorship, but only 29% of Democrats said the same.

Even in cases where Democrats and Republicans agreed that the same headlines were inaccurate, Democrats were nearly twice as likely as Republicans to want to remove the content, while Republicans were nearly twice as likely as Democrats to consider removal censorship.

We didn’t test explicitly why Democrats and Republicans have such different internalized preferences, but there are at least two possible reasons. First, Democrats and Republicans might differ in factors like their moral values or identities.

Second, Democrats and Republicans might internalize what the elites in their parties signal. For example, Republican elites have recently framed content moderation as a free speech and censorship issue. Republicans might use these elites’ preferences to inform their own.

When we zoomed in on headlines that were either aligned or misaligned with Democrats’ views, we found a party promotion effect: Democrats were less favorable to content moderation when misinformation aligned with their own views.

Democrats were 11% less likely to want the social media company to remove headlines that aligned with their own political views. They were 13% less likely to report headlines that aligned with their own views as harmful. We didn’t find a similar effect for Republicans.

Our study shows that party promotion may be partly due to different perceptions of the headlines’ accuracy. When we looked only at Democrats who agreed with our statement that the headlines were false, the party promotion effect was reduced to 7%.

Implications for social media platforms

We find it encouraging that the effect of party promotion is much smaller than the effect of internalized preferences, especially when accounting for accuracy perceptions. However, given the huge partisan differences in content moderation preferences, we believe that social media companies should look beyond the fact gap when designing content moderation policies that aim for bipartisan support.

Future research could explore whether getting Democrats and Republicans to agree on moderation processes – rather than moderation of individual pieces of content – could reduce disagreement. Also, other types of content moderation such as downweighting, which involves platforms reducing the virality of certain content, might prove to be less contentious. 

Finally, if the preference gap – the differences in deep-seated preferences between Democrats and Republicans – is rooted in value differences, platforms could try to use different moral framings to appeal to people on both sides of the partisan divide.

For now, Democrats and Republicans are likely to continue to disagree over whether removing misinformation from social media improves public discourse or amounts to censorship.

Ruth Elisabeth Appel, Ph.D. Candidate in Communication, Stanford University

This article is republished from The Conversation under a Creative Commons license. Read the original article.