PlumX Metrics

The utility of multivariate outlier detection techniques for data quality evaluation in large studies: An application within the ONDRI project

BMC Medical Research Methodology, ISSN: 1471-2288, Vol: 19, Issue: 1, Page: 102
2019
  • 46 Citations
  • 1 Usage
  • 113 Captures
  • 0 Mentions
  • 0 Social Media

Article Description

Background: Large and complex studies are now routine, and quality assurance and quality control (QC) procedures are needed to ensure reliable results and conclusions. Standard procedures may comprise manual verification and double entry, but these labour-intensive methods often leave errors undetected. Outlier detection uses a data-driven approach to identify patterns exhibited by the majority of the data and highlights data points that deviate from these patterns. Univariate methods consider each variable independently, so observations that appear odd only when two or more variables are considered simultaneously remain undetected. We propose a data quality evaluation process that emphasizes the use of multivariate outlier detection for identifying errors, and show that univariate approaches alone are insufficient. Further, we establish an iterative process that uses multiple multivariate approaches, communication between teams, and visualization, which other large-scale projects can follow.

Methods: We illustrate this process with preliminary neuropsychology and gait data for the vascular cognitive impairment cohort from the Ontario Neurodegenerative Disease Research Initiative, a multi-cohort observational study that aims to characterize biomarkers within and between five neurodegenerative diseases. Each dataset was evaluated four times: with and without covariate adjustment using two validated multivariate methods, the Minimum Covariance Determinant (MCD) and Candès' Robust Principal Component Analysis (RPCA), and results were assessed in relation to two univariate methods. Outlying participants identified by multiple multivariate analyses were compiled and communicated to the data teams for verification.

Results: Of the 161 and 148 participants in the neuropsychology and gait data, 44 and 43 were flagged by one or both multivariate methods, and errors were identified for 8 and 5 participants, respectively. The MCD identified all participants with errors, while the RPCA identified 6/8 and 3/5 for the neuropsychology and gait data, respectively. Both outperformed the univariate approaches. Adjusting for covariates had a minor effect on which participants were identified as outliers, though it did affect error detection.

Conclusions: Manual QC procedures are insufficient for large studies, as many errors remain undetected. In these data, the MCD outperforms the RPCA for identifying errors, and both are more successful than univariate approaches. Therefore, data-driven multivariate outlier techniques are essential tools for QC as data become more complex.
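The abstract's central claim, that some problematic observations look unremarkable variable-by-variable but stand out when variables are considered jointly, can be illustrated with a small MCD-based sketch. The code below is not the authors' pipeline; it is a minimal illustration assuming scikit-learn's MinCovDet on synthetic correlated data, with a conventional 97.5% chi-squared cutoff that may differ from the threshold used in the paper.

import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)

# Correlated synthetic data: two variables that normally move together.
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=150)

# A row that is within ~2 SD on each variable separately but violates the
# joint correlation pattern -- the kind of case univariate checks miss.
X[0] = [2.0, -2.0]

# Robust location and scatter via MCD, then robust squared Mahalanobis
# distances for every observation.
mcd = MinCovDet(random_state=0).fit(X)
d2 = mcd.mahalanobis(X)

# Flag observations beyond the 97.5th percentile of the chi-squared
# distribution with p degrees of freedom (p = number of variables).
cutoff = chi2.ppf(0.975, df=X.shape[1])
flagged = np.where(d2 > cutoff)[0]
print("Flagged row indices:", flagged)

Run as written, the injected row is flagged even though neither of its values would trip a typical univariate (e.g., z-score) check, which mirrors the comparison the study makes between multivariate and univariate screening.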

Bibliographic Details

Kelly M. Sunderland; Derek Beaton; Stephen C. Strother; Malcolm A. Binns; Julia Fraser; Donna Kwan; Paula M. McLaughlin; Manuel Montero-Odasso; Alicia J. Peltsch; Frederico Pieruccini-Faria; Demetrios J. Sahlas; Richard H. Swartz; Robert Bartha; Sandra E. Black; Michael Borrie; Dale Corbett; Elizabeth Finger; Morris Freedman; Barry Greenberg; David A. Grimes; Robert A. Hegele; Chris Hudson; Anthony E. Lang; Mario Masellis; William E. McIlroy; David G. Munoz; Douglas P. Munoz; J. B. Orange; Michael J. Strong; Sean Symons; Maria Carmela Tartaglia; Angela Troyer; Lorne Zinman

Springer Science and Business Media LLC

Medicine
