
John McCormick, a 63-year-old cab driver, was on his porch the morning of July 26, 1978, when he was robbed and shot dead. His wife, awakened by the commotion, reported seeing the assailant flee wearing a stocking mask, which investigators later found near the scene. Police identified 17-year-old Santae Tribble as the prime suspect after an informant linked him to the gun. At Tribble's trial, an FBI agent testified that one of the hairs found in the stocking matched Tribble's "in all microscopic characteristics." Although half a dozen witnesses corroborated Tribble's alibi, the hair was enough to convince the jury. Tribble served 23 years before he was paroled in 2003. Nine years later, mitochondrial DNA testing on the same hair samples revealed that Tribble was no match at all. In fact, one of the hairs was canine.


Tribble is one of more than 350 people who have been exonerated by DNA testing after going to prison. Horrifically, 18 of the wrongly convicted made it to death row before the truth came to light. It's no secret that sloppy forensic science shoulders a lot of the blame. The Innocence Project has found, for example, that unvalidated or improper forensic analysis has contributed to more than 50 percent of its DNA exonerations. The National Research Council, meanwhile, released a blistering 328-page report in 2009 calling out qualitative forensic practices such as those routinely used to compare hair, bite marks, bullet markings, shoe patterns, and tire prints.

But it’s not just bad science that’s driving the problem. After all, some techniques, such as DNA analysis and blood typing, bear the imprimatur of rigorous, reliable research. The bigger issue is the way people perform the techniques—that is, largely without scientific training, oversight, or standardization.

Take the troubled crime lab of St. Paul, Minnesota. In 2012, it suspended all drug and fingerprint analysis after lawyers revealed that its operators had no standardized procedures, possessed little understanding of basic science, submitted illegible reports, and used dirty equipment. Like many crime labs in the U.S., it hadn’t been accredited by an independent forensic-science organization—something that’s required in only a few states. As the National Research Council report noted, “nearly anyone with a garage and some capital could theoretically open a forensics laboratory.”

Things are slowly improving. In January, the U.S. Departments of Justice and Commerce created a National Commission on Forensic Science to establish nationwide standards. The panel includes forensic scientists, lawyers, and police officers who are tasked with writing recommendations for the U.S. attorney general. Some rules, such as requiring crime labs to be clean and accredited, are no-brainers. But the justice system should also invest more in a woefully neglected area: forensic-science training programs. Most don't go beyond the undergraduate level, and many focus on the criminal-justice system rather than on science and statistics. As the mess in St. Paul revealed, even the best technology is useless if the analysts on the front lines don't know how to use it. Just ask Santae Tribble.

This article originally appeared in the June 2014 issue of Popular Science.