The National Institute of Standards and Technology announced last week that it is launching a new study of certain types of DNA analysis used in criminal prosecutions. These methods, “if misapplied, could lead to innocent people being wrongly convicted,” according to the institute’s statement. NIST will invite all public and private forensics labs in the U.S. to participate by testing the same set of complex DNA samples, and will compare the results, which will be published online next summer. Its goal is to develop a new set of national standards for DNA analysis.
This study comes at a time when labs are seeking to identify suspects from especially small samples (such as “touch” DNA, which consists of just a few skin cells), and are using software to help analyze mixtures of more than one person’s genetic material. ProPublica recently investigated the use of two such disputed methods by New York City’s crime lab: high-sensitivity testing of trace amounts of DNA, and the Forensic Statistical Tool, a type of software known as “probabilistic genotyping.”
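To give a sense of what this kind of software computes: probabilistic genotyping tools weigh how likely the observed DNA evidence is under competing hypotheses (the suspect contributed to the mixture, versus unknown people did) and report the ratio. The sketch below is a deliberately simplified, textbook-style likelihood-ratio calculation for a two-person mixture at a single genetic marker; the allele names, frequencies, and structure are illustrative assumptions, not the actual algorithm used by the Forensic Statistical Tool or any other product.

```python
from itertools import combinations_with_replacement

# Hypothetical allele frequencies at one genetic marker (illustrative only).
freqs = {"A": 0.1, "B": 0.2, "C": 0.3, "D": 0.4}

def genotype_prob(g):
    """Hardy-Weinberg probability: p^2 for a homozygote, 2pq for a heterozygote."""
    a, b = g
    return freqs[a] ** 2 if a == b else 2 * freqs[a] * freqs[b]

# All possible two-allele genotypes a person could have at this marker.
genotypes = list(combinations_with_replacement(sorted(freqs), 2))

def p_suspect_plus_unknown(observed, suspect):
    """P(evidence | suspect and one unknown person contributed):
    sum over unknown genotypes that, with the suspect's alleles,
    account for exactly the observed allele set."""
    return sum(genotype_prob(g) for g in genotypes
               if set(suspect) | set(g) == observed)

def p_two_unknowns(observed):
    """P(evidence | two unknown people contributed):
    sum over ordered pairs of genotypes whose alleles
    account for exactly the observed allele set."""
    return sum(genotype_prob(g1) * genotype_prob(g2)
               for g1 in genotypes for g2 in genotypes
               if set(g1) | set(g2) == observed)

observed = {"A", "B", "C"}   # alleles detected in the mixed sample
suspect = ("A", "B")         # suspect's genotype at this marker

# Likelihood ratio: how much more probable the evidence is if the
# suspect is a contributor than if two unknown people are.
lr = p_suspect_plus_unknown(observed, suspect) / p_two_unknowns(observed)
print(round(lr, 2))  # → 6.25 with these made-up frequencies
```

Real probabilistic genotyping software goes much further, modeling signal strength, degraded DNA, and dropped-out alleles across many markers at once; the disputes NIST is studying center on how those additional modeling choices affect the reported number.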
John Butler, a DNA expert and the author of several textbooks on forensic DNA testing, will be leading a team of scientists for the NIST study. He spoke to ProPublica Tuesday from the institute’s offices in Gaithersburg, Maryland.
Why this study, and why now?
Just in the past two years, there has been a huge rush to go into the probabilistic genotyping field, and people are jumping into this without really thinking about a lot of these issues: how sensitivity impacts what they’re doing, how “transfer” and “persistence” of DNA can impact their results, and what they’re doing in terms of the way that they set…