Rational Riddles: Unpacking the Puzzle of Political Polarization
By PETER DIZIKES, MASSACHUSETTS INSTITUTE OF TECHNOLOGY
U.S. politics is heavily polarized.
This is often regarded as a product of irrationality: People can be tribal, are influenced by their peers, and often get information from very different, sometimes inaccurate sources.
Tribalism and misinformation are real enough. But what if people are often acting rationally as well, even in the process of arriving at very different views?
What if they are not being misled or too
emotional, but are thinking logically?
Rational Polarization in Political Views
“There can be quite reasonable ways people
can be predictably polarized,” says MIT philosopher
Kevin Dorst, author of a new paper on the subject, based partly on his own
empirical research.
This may especially be the case when people
face a lot of ambiguity in weighing political and civic issues. Those
ambiguities do not resolve neutrally: people consider evidence in
predictably different ways, leading them to different conclusions. That doesn’t
mean they are not thinking logically, though.
“What’s going on is people are selectively
scrutinizing information,” Dorst says. “That’s effectively why they move in
opposite directions, because they scrutinize and selectively look for flaws in
different places, and so they get overall different takes.”
The concept of rational polarization may help
us develop a more coherent account of how views differ, by helping us avoid
thinking that we alone are rational — or, conversely, that we have done no real
thinking while arriving at our own opinions. Thus it can add nuance to our
assessments of others.
The paper, “Rational Polarization,” appears
in The Philosophical Review. Dorst, the sole author, is an
assistant professor in MIT’s Department of Linguistics and Philosophy.
Challenging Belief Formation Models
To Dorst, rational polarization stands as a
useful alternative to other models about belief formation. In particular,
rational polarization in his view improves upon one type of model of “Bayesian”
thinking, in which people keep using new information to hone their views.
In Bayesian terms, because people use new
information to update their views, they will rationally either change their
ideas or not, as the evidence warrants. But in reality, Dorst asserts, things
are not so simple. Often when we assess new evidence, there is ambiguity
present, and Dorst contends that it is rational to be unsure how to resolve it.
This can generate polarization, because people’s prior assumptions influence
where they find ambiguity.
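To make the contrast concrete, here is a minimal sketch of the standard Bayesian picture Dorst is pushing back on, written in Python. The priors and likelihoods are illustrative assumptions, not figures from the paper; the point is only that, on this simple model, two agents who accept the same evidence at face value move in the same direction.

```python
# A minimal sketch of standard Bayesian updating (illustrative numbers,
# not from Dorst's paper). Two agents with different priors accept the
# same evidence at face value, so their beliefs move the same way.

def bayes_update(prior, p_e_if_true, p_e_if_false):
    """Posterior probability of a hypothesis H after seeing evidence e."""
    numerator = prior * p_e_if_true
    return numerator / (numerator + (1 - prior) * p_e_if_false)

# Evidence: a study reporting an effect, assumed twice as likely to
# appear if H is true (0.8) as if H is false (0.4).
for prior in (0.3, 0.7):  # a skeptic and a believer
    posterior = bayes_update(prior, 0.8, 0.4)
    print(f"prior={prior:.2f} -> posterior={posterior:.2f}")

# Output: prior=0.30 -> posterior=0.46, prior=0.70 -> posterior=0.82.
# Shared evidence pushes both agents in the same direction.
```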
Suppose a group of people have been given two
studies about the death penalty: One study finds the death penalty has no
deterrent effect on people’s behavior, and the other study finds it does. Even
reading the same evidence, people in the group will likely wind up with
different interpretations of it.
“Those who really believe in the deterrent
effect will look closely at the study suggesting there is no deterrent effect,
be skeptical about it, poke holes in the argument, and claim to recognize flaws
in its reasoning,” Dorst says. “Conversely, for the people who disbelieve the
deterrent effect, it’s the exact opposite. They find flaws in the study
suggesting there is a deterrent effect.”
Even these seemingly selective readings
can be rational, Dorst says: “It makes sense to scrutinize surprising
information more than unsurprising information.” Therefore, he adds, “You can
see that people who have this tendency to selectively scrutinize [can] drift
apart even when they are presented with the same evidence that’s mixed in the
same way.”
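One way to see the mechanical effect of that selective scrutiny is a small simulation. This is my own illustrative sketch, not a model from Dorst’s paper, and it deliberately sets aside the question of whether the scrutiny is rational: each agent double-checks only studies that cut against its current lean, and scrutiny turns up a disqualifying flaw some assumed fraction of the time.

```python
import random

def bayes_update(prior, lr):
    """Update a probability with a likelihood ratio lr = P(e|H) / P(e|not H)."""
    odds = (prior / (1 - prior)) * lr
    return odds / (1 + odds)

def run_agent(prior, studies, flaw_rate=0.6, seed=0):
    """Final belief in H after a stream of studies.

    studies: booleans, True meaning the study supports H (say, "the
    death penalty deters"). A study is surprising when it cuts against
    the agent's current lean; surprising studies are scrutinized, and
    with probability flaw_rate a flaw is found and the study discarded.
    Unsurprising studies are accepted at face value.
    """
    rng = random.Random(seed)
    belief = prior
    for supports_h in studies:
        surprising = supports_h != (belief > 0.5)
        if surprising and rng.random() < flaw_rate:
            continue  # flaw found: the study is thrown out, no update
        belief = bayes_update(belief, 2.0 if supports_h else 0.5)
    return belief

# A perfectly mixed evidence stream: 20 studies for H, 20 against.
studies = [True, False] * 20
print(f"believer: 0.70 -> {run_agent(0.70, studies):.2f}")  # drifts up
print(f"skeptic:  0.30 -> {run_agent(0.30, studies):.2f}")  # drifts down
```

On the same perfectly mixed evidence, the two agents drift toward opposite extremes, because each one’s prior determines which studies get scrutinized and thrown out.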
Online Experiment Illustrating Ambiguity’s Role
To help show that this habit exists, Dorst
also ran an online experiment about ambiguity, with 250 participants on the
Prolific online survey platform. The aim was to see how much people’s views
might become polarized in the presence of ambiguous information.
The participants were given an incomplete
string of letters, as one might find in a crossword puzzle or on “Wheel of
Fortune.” Some letter strings were parts of real words, and some were not.
The ambiguous, unsolvable strings of letters had a sharply polarizing effect on
how people reacted, depending on what kinds of additional information they
were given.
The process at work in the experiment, Dorst
says, is similar to what happens when people receive uncertain information, in
the news or in studies, about political matters.
“When you find a flaw, it gives you clear
evidence that undermines the study,” Dorst says. Otherwise, people often tend
to be uncertain about the material they see. “When you don’t find a flaw, it
[can] give you ambiguous evidence and you don’t know what to make of it. As a
result, that can lead to predictable polarization.”
The larger point, Dorst believes, is that we
can arrive at a more nuanced and consistent picture of how political
differences persist even when people process similar information.
Rethinking Rationality in Politics
“There’s a perception that in politics,
rational brains shut off and people think with their guts,” Dorst says. “If you
take that seriously, you should say, ‘I form my beliefs on politics in the same
ways.’”
Unless, that is, you believe you alone are
rational, and everyone else is not — though Dorst finds this to be an untenable
view of the world.
“Part of what I’m trying to do is give an
account that’s not subject to that sort of instability,” Dorst says. “You don’t
necessarily have to point the finger at others. It’s a much more interesting
process if you think there’s something [rational] there as well.”
Reference: “Rational Polarization” by Kevin
Dorst, 1 July 2023, The Philosophical Review.
DOI: 10.1215/00318108-10469499