PlumX Metrics

Misuse of statistical method results in highly biased interpretation of forensic evidence in Guyll et al. (2023)

Law, Probability and Risk, ISSN: 1470-840X, Vol: 23, Issue: 1
2024
  • Citations: 4
  • Usage: 0
  • Captures: 13
  • Mentions: 0
  • Social Media: 0


Article Description

Since the National Academy of Sciences released their report outlining paths for improving reliability, standards, and policies in the forensic sciences (NAS, 2009), there has been heightened interest in evaluating the scientific validity of forensic science disciplines. Guyll et al. (2023) seek to evaluate the validity of forensic cartridge-case comparisons. They conducted an experiment to test the accuracy of firearms examiners. They then describe how triers of fact such as a judge or jury in a criminal case, who are initially unbiased and have not yet seen any evidence, should apply the results of their experiment to the case at hand. Specifically, Guyll et al. (2023) use Bayes’ rule to calculate the posterior probability that a cartridge case found at a crime scene was fired from a reference gun (often a gun linked to the defendant), given the decision of a firearms examiner. A key input to this calculation is the prior odds that the crime scene cartridge case was fired from the reference gun, which Guyll et al. (2023) set to 1 and claim to be unbiased. However, as we explain below, this prior is typically highly biased against the defendant and can lead judges and jurors in criminal trials to grossly misunderstand how to interpret forensic evidence. It is imperative to address this erroneous statistical argument of Guyll et al. (2023), which is being presented by the prosecution in an ongoing homicide case (DC Superior Court, 2023). We discuss some other aspects of the study design and statistical analysis of Guyll et al. (2023) as well. Our focus is on a specific set of issues in Guyll et al. (2023) and is not exhaustive.
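The abstract's central point turns on the odds form of Bayes' rule: the posterior odds that the crime-scene cartridge case was fired from the reference gun equal the prior odds multiplied by the likelihood ratio of the examiner's decision. The following is a minimal Python sketch of that calculation; the function name and all numerical inputs are purely illustrative assumptions, not figures taken from the article or from Guyll et al. (2023).

# Minimal sketch (illustrative only) of the odds form of Bayes' rule discussed above:
# posterior odds = prior odds x likelihood ratio, where the likelihood ratio compares
# the probability of an "identification" decision when the cartridge case was fired
# from the reference gun versus when it was not.

def posterior_probability(prior_odds: float,
                          p_id_given_same_gun: float,
                          p_id_given_different_gun: float) -> float:
    """Posterior probability of 'same gun' given an examiner's identification decision."""
    likelihood_ratio = p_id_given_same_gun / p_id_given_different_gun
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical error rates: sensitivity 0.93, false-positive rate 0.01.
print(posterior_probability(1.0, 0.93, 0.01))    # prior odds = 1, as in Guyll et al. (2023): ~0.99
print(posterior_probability(0.01, 0.93, 0.01))   # much smaller prior odds: ~0.48

The sketch shows why the choice of prior odds matters: the same examiner decision and the same error rates yield a near-certain posterior when the prior odds are set to 1, but a far lower posterior under a smaller prior, which is the sensitivity to the prior that the article highlights.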

Bibliographic Details

Michael Rosenblum; Elizabeth T. Chin; Elizabeth L. Ogburn; Akihiko Nishimura; Abhirup Datta; Daniel Westreich; Susan Vanderplas; Maria Cuellar; William C. Thompson

Oxford University Press (OUP)

Arts and Humanities; Decision Sciences; Social Sciences
