PlumX Metrics

Data sharing and reanalysis of randomized controlled trials in leading biomedical journals with a full data sharing policy: Survey of studies published in the BMJ and PLOS Medicine

BMJ (Online), ISSN: 1756-1833, Vol: 360, Page: k400
2018
  • Citations: 153
  • Usage: 0
  • Captures: 193
  • Mentions: 2
  • Social Media: 2

Metrics Details

  • Citations: 153
    • Citation Indexes: 147
    • Policy Citations: 5
      • Policy Citation: 5
    • Clinical Citations: 1
      • PubMed Guidelines: 1
  • Captures: 193
  • Mentions: 2
    • Blog Mentions: 1
      • Blog: 1
    • News Mentions: 1
      • News: 1
  • Social Media: 2
    • Shares, Likes & Comments: 2
      • Facebook: 2

Most Recent News

The Open Data Explosion

“Research parasite.” When a pair of physicians publicized the term in a 2016 editorial in the New England Journal of Medicine, it was an…

Article Description

Objectives To explore the effectiveness of data sharing by randomized controlled trials (RCTs) in journals with a full data sharing policy and to describe potential difficulties encountered in the process of performing reanalyses of the primary outcomes.
Design Survey of published RCTs.
Setting PubMed/Medline.
Eligibility criteria RCTs that had been submitted and published by The BMJ and PLOS Medicine subsequent to the adoption of data sharing policies by these journals.
Main outcome measure The primary outcome was data availability, defined as the eventual receipt of complete data with clear labelling. Primary outcomes were reanalyzed to assess the extent to which the original results could be reproduced. Difficulties encountered were described.
Results 37 RCTs (21 from The BMJ and 16 from PLOS Medicine) published between 2013 and 2016 met the eligibility criteria. 17/37 (46%, 95% confidence interval 30% to 62%) satisfied the definition of data availability, and 14 of the 17 (82%, 59% to 94%) were fully reproduced on all their primary outcomes. Among the remaining RCTs, errors were identified in two, although the reanalyses reached conclusions similar to the original reports, and one paper did not provide enough information in the Methods section to reproduce the analyses. Difficulties identified included problems in contacting corresponding authors and a lack of resources on their part for preparing the datasets. In addition, data sharing practices varied widely across study groups.
Conclusions Data availability was not optimal in two journals with a strong policy for data sharing. When investigators shared data, most reanalyses largely reproduced the original results. Data sharing practices need to become more widespread and streamlined to allow meaningful reanalyses and reuse of data.
Trial registration Open Science Framework osf.io/c4zke.
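The abstract reports its headline proportions (17/37 available, 14/17 reproduced) with 95% confidence intervals. The paper does not state which interval method was used, but a Wilson score interval, a standard choice for binomial proportions with small samples, closely matches the reported bounds (for 17/37 the Wilson lower bound rounds to 31% rather than the published 30%, suggesting slightly different rounding or a related method). A minimal sketch:

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 17/37 RCTs met the data-availability definition (paper: 46%, 95% CI 30% to 62%)
lo, hi = wilson_interval(17, 37)
print(f"{17/37:.0%} ({lo:.0%} to {hi:.0%})")  # 46% (31% to 62%)

# 14/17 shared datasets were fully reproduced (paper: 82%, 95% CI 59% to 94%)
lo, hi = wilson_interval(14, 17)
print(f"{14/17:.0%} ({lo:.0%} to {hi:.0%})")  # 82% (59% to 94%)
```

The function and the comparison to the published figures are illustrative only; the authors' exact computation is not described in the abstract.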
