Thursday, February 10, 2022

It's not junk science, it's just junk.

If you've been shooting for a while, you probably remember when your new handgun came with a fired case in a little manila envelope... or maybe a sticker on the side of the box that said "NOT FOR SALE IN NY OR MD". Both New York and Maryland set up programs to keep on file a spent casing from every handgun sold, with the intention of using the magic of "ballistic fingerprinting" to match spent cases found at crime scenes to specific guns.

Both states ended their programs, the Empire State in 2012 and the Old Line State in 2015, because they were expensive flops that didn't solve anything, and for numerous reasons:
[T]he system Maryland bought created images so imprecise that when an investigator submitted a crime scene casing, the database software would sometimes spit out hundreds of matches. The state sued the manufacturer in 2009 for $1.9 million, settling three years later for $390,000.
The bigger problem is that "ballistic fingerprinting", like many other forensic techniques, relies on pattern matching and is highly subjective, despite being presented to juries as "science".

Here's Radley Balko, writing at The Daily Beast on the other kind of "ballistic fingerprinting": the claim that a fired projectile can be conclusively matched to a specific firearm:
Alicia Carriquiry is director at the Center for Statistics and Applications in Forensic Evidence at Iowa State. She and her team have been assembling a database of the ballistics marks left on bullets. Their research thus far has indicated there’s little support for the claim that every gun leaves unique marks on the bullets it fires—or at least not in a way that’s useful for distinguishing one gun from another.

Controlled studies have also shown that the entire field of forensic firearms analysis is inherently subjective. The Houston Forensic Science Center is one of the few crime labs in the country to take a strictly scientific approach to forensics. Director Peter Stout regularly administers blind proficiency tests to his analysts. He first gave his ballistics analysts “sensitivity tests,” in which they were asked to determine whether two bullets were fired by the same gun. The analysts reached the correct conclusion about 76 percent of the time—leaving a lot of room for reasonable doubt.

Stout also gave his analysts “specificity tests,” in which they were asked to determine whether two bullets were fired by different guns. Here, the success rate dipped to 34 percent.

Carriquiry points to another recent sensitivity study—funded by the FBI itself—in which the analysts’ success rate was just 48 percent. “A dispassionate observer would say that they would have made fewer mistakes if they had flipped a coin,” Carriquiry says. “Given that astonishingly low accuracy, it seems pure hubris to be recommending to examiners to ‘push back.’”
(Archive Link)
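To put those success rates in context, "sensitivity" and "specificity" are the standard ways of scoring a yes/no test: how often matching pairs are correctly called matches, and how often non-matching pairs are correctly called non-matches. A minimal sketch, using the rates from the quoted Houston blind tests and an illustrative (assumed, not reported) 50/50 mix of matching and non-matching pairs, shows how little the examiners beat a coin flip overall:

```python
# Sensitivity: rate of correctly saying "same gun" when the bullets DO match.
# Specificity: rate of correctly saying "different guns" when they do NOT.
# Point estimates taken from the quoted Houston blind proficiency tests:
sensitivity = 0.76   # same-gun pairs identified correctly
specificity = 0.34   # different-gun pairs identified correctly

# A coin flip scores 0.50 on both. Assuming (for illustration only) that
# half the test pairs match and half do not, overall accuracy works out to:
mix = 0.5
examiner = mix * sensitivity + (1 - mix) * specificity
coin = 0.5

print(f"examiner overall accuracy: {examiner:.0%}")
print(f"coin-flip baseline:        {coin:.0%}")
```

Under those assumptions the examiners land around 55 percent overall, barely above chance, which is the point Carriquiry is making about the 48 percent FBI-funded study: flipping a coin would have done about as well.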

Jurors have watched plenty of police procedurals on TV and think that projectile matching is some precise science, when in fact anything much beyond "Well, the octagonal polygonal rifling tells me this .45 caliber bullet was likely fired from a Glock" is educated guesswork.