Friday, January 6, 2023

Not everything you see on CSI is real

Is It Forensics or Is It Junk Science?

by Sophia Kovatch, Pamela Colloff and Brett Murphy for ProPublica

It’s been decades since the intersection of forensic science and criminal justice first became a pop culture phenomenon, fueled by countless TV shows, movies and books.

But the public’s growing awareness of forensic techniques obscures a far more complex field that’s chock full of bogus science — and the people who champion it, often for profit.

For years, ProPublica has reported on these dubious techniques as they’ve wormed their way into every corner of our real-life criminal justice system.

So, what’s legitimate forensic science and what’s junk? Let’s start with the basics.

What Is Junk Science?

Junk science refers to any theory or method presented as scientific fact without sufficient research or evidence to support it. Some types of junk science have virtually no supporting evidence, while others are oversimplifications of real but complex scientific findings.

Adding to the risk they pose to the justice system, many forms of junk science are highly subjective and depend heavily on individual interpretation.

How to Spot Junk Science in Forensics

When ProPublica has reported on junk science, we’ve found that it tends to share common traits, including:

  • It has limited or no scientific evidence or research supporting it.
  • It is presented as absolutely certain or conclusive, with no mention of error rates.
  • It relies on subjective criteria or interpretation.
  • It oversimplifies a complex science.
  • Its practitioners can become “experts” after just a few days of training.

Examples of Junk Science in Forensics and Law Enforcement

Tracing the spread of junk science through the criminal justice system can be difficult. But ProPublica has followed forensic junk science in various forms for years.

911 Call Analysis

Police and prosecutors trained in 911 call analysis are taught they can spot a murderer on the phone by analyzing speech patterns, tone, pauses, word choice and even the grammar used during emergency calls. These are known as “guilty indicators,” according to the tenets of the program. A misplaced word, too long of a pause or a phrase of politeness could reveal a killer.

Analysis of 911 calls appears in the criminal justice system in lots of different ways. Some detectives say it’s a tool to help build a case or prepare to interrogate a suspect. They have used it to help extract confessions. Others present their analyses to prosecutors or enlist Tracy Harpster, the program’s creator and a retired deputy police chief from Ohio, to consult on cases.

Over the course of his career, Harpster had almost no homicide investigation experience and no scientific background. He developed the 911 call analysis technique based on a small study for his master’s thesis in 2006. After he teamed up with the FBI to promote his findings nationwide, demand from law enforcement grew strong enough to support a full-fledged training curriculum.

Since the technique’s development, 911 call analysis has been used in investigations across the country. ProPublica documented more than 100 cases in 26 states where Harpster’s methods played a role in arrests, prosecutions and convictions — likely a fraction of the actual figure. In addition, Harpster says he has personally consulted in more than 1,500 homicide investigations nationwide.

Despite the seeming pervasiveness of the technique, researchers who have studied 911 calls have not been able to corroborate Harpster’s claims. A 2020 study from the FBI warned against using 911 call analysis to bring actual cases. A separate FBI study in 2022 said applying 911 analysis may actually increase bias. And academic studies from researchers at Villanova and James Madison universities came to similar conclusions.

In all, five studies have failed to find scientific evidence that 911 call analysis works.

In a 2022 interview, Harpster defended his program and noted that he has also helped defense attorneys argue for suspects’ innocence. He maintained that critics don’t understand the research or how to appropriately use it, a position he has repeated in correspondence with law enforcement officials for years. “The research is designed to find the truth wherever it goes,” Harpster said.

Example: ProPublica chronicled how 911 call analysis was used in the case of Jessica Logan, who was convicted of killing her baby after a detective trained by Harpster analyzed her call and then testified about it during trial.

Bloodstain-Pattern Analysis

Bloodstain-pattern analysis is a forensic discipline whose practitioners regard the drops, spatters and trails of blood at a crime scene as clues that can sometimes be used to reconstruct and even reverse-engineer the crime itself.

The reliability of bloodstain-pattern analysis has never been definitively proven or quantified, but largely due to the testimony of criminalist Herbert MacDonell, it was steadily admitted in court after court around the country in the 1970s and ’80s. MacDonell spent his career teaching weeklong “institutes” in bloodstain-pattern analysis at police departments around the country, training hundreds of officers who, in turn, trained hundreds more.

While there is no index of cases in which bloodstain-pattern analysis played a role, state appellate court rulings show that the technique has been a factor in felony cases across the country. It has also helped send innocent people to prison. From Oregon to Texas to New York, convictions that hinged on the testimony of a bloodstain-pattern analyst have been overturned and the defendants acquitted or the charges dropped.

In 2009, a watershed report commissioned by the National Academy of Sciences cast doubt on the discipline, finding that “the uncertainties associated with bloodstain-pattern analysis are enormous,” and that experts’ opinions were generally “more subjective than scientific.” More than a decade later, few peer-reviewed studies exist, and research that might determine the accuracy of analysts’ findings is close to nonexistent.

When MacDonell, who died in 2019, was asked whether he ever considered changing his course structure or certification process after seeing students give faulty testimony, he said he had not. “You can’t control someone else’s thinking,” he said. “The only thing you can do is go in and testify to the contrary.”

Example: ProPublica has also reported on how bloodstain-pattern analysis was used to convict Joe Bryan of killing his wife, Mickey.

Other Junk Science Examples

ProPublica’s reporting on junk science in forensics goes beyond bloodstain-pattern analysis and 911 call analysis; we have also covered a range of other dubious techniques.

How Does Junk Science Spread in Forensics?

Junk science can spread in many different ways, but some common patterns emerge in how it moves through forensics and law enforcement.

Often, junk science originates when an individual devises a forensic technique based on minimal or narrow experience and data. For example, the original 911 call analysis training curriculum was based on a study of just 100 emergency calls, most of which came from a single state.

The creators of these techniques then put together curricula and workshops targeting law enforcement at every level around the country. As more police officers take the courses, the techniques are employed more often to investigate crimes and interrogate suspects. And when those officers testify in court, junk forensics makes its way into the justice system.

Other times, prosecutors call the creators and trainees of these forensic methods as expert witnesses, as was common with bloodstain-pattern analysis.

In the courtroom, it’s up to the judge to decide whether certain evidence is admissible. While judges are experts in the law, they aren’t necessarily experts in the scientific disciplines that make up forensics. Once a type of junk science is admitted in a case, other prosecutors and judges can use that as precedent to allow it in future cases too. In this way, new junk science methods like 911 call analysis can spread quickly through the justice system.

How Long Has Junk Science Been a Problem in Criminal Justice?

Forensic science has had a junk science problem for decades. In the 1980s and ’90s, the FBI and other law enforcement agencies used faulty microscopic hair comparison in hundreds of cases, only formally acknowledging the problematic science in 2015. Since at least the 1990s, law enforcement has used a written content analysis tool with no scientific backing to interpret witness and suspect statements.

The 2009 report from the National Academy of Sciences, which reviewed the state of forensic science in the United States, found that a lot of forensic evidence “was admitted into criminal trials without any meaningful scientific validation, determination of error rates, or reliability testing to explain the limits of the discipline.” A 2016 report from the President’s Council of Advisors on Science and Technology found that despite efforts to fund forensic science research, there was still a major gap in understanding the scientific validity of many forensic methods.

In 2017, the Trump administration allowed the charter for the National Commission on Forensic Science to expire, further limiting the progress on validating forensic science methods. Since then, many forensic professionals have critiqued the junk science problems rampant in forensics and criminal justice.

ProPublica is a Pulitzer Prize-winning investigative newsroom.