On the Past, Present, and Future of Forensic Science in the United States
The AAAS Forensic Conference on November 12, 2019, held under the theme “An Update on Strengthening Forensic Science in the United States: A Decade of Development,” commemorates the 10th anniversary of the National Research Council (NRC) report “Strengthening Forensic Science in the United States: A Path Forward.”
Past
Historically, the field we know as “forensic science” has developed under less-than-ideal scientific conditions. Often, as is the case with bullet and cartridge case examination, forensic science techniques were developed by and for law enforcement, outside of rigorous scientific laboratory settings. Even some of the more rigorously studied methods, such as the fingerprint analysis popularized by Francis Galton in the late 19th century, did not have purely scientific motivations. Galton began his biometric exploration in an effort to quantify his theory about the inferiority of non-European populations. (He also coined the term “eugenics.”) Because of this not-so-scientific history, much of forensic science today relies on subjective assessments and expert opinion. A forensic comparison requires a trained, experienced person to look at the evidence -- a bullet, a fingerprint, etc. -- and decide which markings are important. In the last 20 years, however, these subjective methods have come under increasing scrutiny.
Present
The gold standard in forensic science today is analysis of single-donor DNA, which, in contrast to its predecessors, was developed in a controlled scientific setting. The father of modern DNA testing, Alec Jeffreys, developed his techniques at the University of Leicester, where he still serves as professor of genetics. DNA evidence has helped convict thousands of criminals, but it has also been instrumental in freeing hundreds of wrongly convicted people, many of whom were convicted on the basis of faulty subjective forensic science such as bitemark evidence. As criticism of forensic science has increased, reports such as the 2009 National Research Council report and the 2016 President’s Council of Advisors on Science and Technology report have called for the development of objective methods in the field.
The federal government, through the National Institute of Standards and Technology (NIST), the Center for Statistics and Applications in Forensic Evidence (CSAFE), and other organizations, funds research that aims to do just that. NIST has developed an algorithm for comparing 3D images of cartridge cases, and CSAFE has developed an objective bullet-matching method that uses data from 3D microscopic bullet scans. Though these methods are far from ready for use in court, they represent an important step toward objectivity. In the meantime, research scientists and forensic science practitioners need to work closely together to bring the field into the 21st century.
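Neither agency’s implementation is reproduced here, but the basic idea behind many of these comparison algorithms is to reduce a 3D surface scan to a one-dimensional “signature” of tool marks and then score how well two signatures line up. The Python sketch below is a hypothetical illustration of that idea only: the function name, the synthetic signatures, and the choice of normalized cross-correlation as a similarity score are assumptions made for this example, not the published NIST or CSAFE methods.

```python
import numpy as np

def best_alignment_score(sig_a: np.ndarray, sig_b: np.ndarray) -> float:
    """Slide one signature past the other and return the highest
    normalized cross-correlation found at any shift.

    A "signature" here is a 1D array of surface heights extracted from
    a 3D scan of a bullet land or cartridge case region; curvature and
    noise are assumed to have been removed in preprocessing."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    # Cross-correlate over all relative shifts and keep the best match.
    xcorr = np.correlate(a, b, mode="full") / min(a.size, b.size)
    return float(xcorr.max())

# Toy check: two noisy copies of the same striation pattern should score
# much higher than two unrelated surfaces.
rng = np.random.default_rng(0)
x = np.linspace(0, 20, 500)
striae = np.sin(3 * x) + 0.5 * np.sin(7 * x)           # shared tool marks
same_source = striae + 0.1 * rng.normal(size=x.size)   # same barrel, new noise
diff_source = rng.normal(size=x.size)                   # unrelated surface

print("same-source score:", round(best_alignment_score(striae, same_source), 3))
print("different-source score:", round(best_alignment_score(striae, diff_source), 3))
```

In the research methods themselves, a single score like this is only one ingredient; many such features are combined, and the resulting procedure is tested so that its error rates can be reported.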
Future
Increasing computing power has opened many doors for the advancement of forensic science. Expanded storage capabilities mean that high-resolution, detailed images can be saved and analyzed using intensive computational methods. Machine learning methods can also process and learn from more data than humans ever could. Ultimately, forensic science should move toward objective methods that are transparent, validated, and tested. This will require calculating error rates, validating new methods on realistic cases, and changing the culture so that any algorithm used in the criminal justice system is in the public domain and accessible to all. Development of methodology also needs to shift to independent scientists and away from police labs, which pose a risk of conflicts of interest. Strong partnerships between academia and the private sector can ensure that, even for commercial products, anyone who wishes to do so can “look under the hood.” The 2009 NRC report was a wake-up call, and forensic science still has a long way to go.
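To make the error-rate requirement concrete, here is a minimal sketch of how a false positive rate and a false negative rate fall out of a validation study. All of the counts below are invented for illustration; they do not come from any actual study of a forensic method.

```python
# Hypothetical validation-study results (invented counts).
# Ground truth is known because the study designers constructed the pairs.
same_source = {"identification": 480, "exclusion": 20}        # 500 true matches
different_source = {"identification": 5, "exclusion": 495}    # 500 true non-matches

# False negative: a true match that the method called an exclusion.
false_negative_rate = same_source["exclusion"] / sum(same_source.values())
# False positive: a true non-match that the method called an identification.
false_positive_rate = different_source["identification"] / sum(different_source.values())

print(f"False negative rate: {false_negative_rate:.1%}")   # 4.0%
print(f"False positive rate: {false_positive_rate:.1%}")   # 1.0%
```

Reporting both rates, along with how the test pairs were chosen, is what allows courts and researchers to judge whether a method is fit for use on real casework.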
About the Authors
Alicia Carriquiry, Ph.D., is a Distinguished Professor of Statistics at Iowa State University and the director of the Center for Statistics and Applications in Forensic Evidence. She researches applications of statistics in human nutrition, bioinformatics, forensic science, and traffic safety, and has published over 100 peer-reviewed articles.
Samantha Tyner is a 2019-20 STPF Fellow with the Office of Survey Methods Research at the Bureau of Labor Statistics. She is an applied statistician with interests in data science, data visualization, forensic science, machine learning, text mining, and network analysis. You can follow her on Twitter at @sctyner.
Banner image: Belleville Police Detective Dale Ashbury (right) and two other Belleville, Ontario, city police officers conducting a crime scene investigation, circa 1970 (image HC03834). Photograph by Ian Robertson, released to the public domain under CC0 1.0.