
ReproCheck 🔍

Systematic reproducibility assessments of published research

This repository contains detailed reproducibility analyses of influential research papers. Each analysis verifies calculations, reproduces statistical results from the reported data and methods, and documents any discrepancies, with the aim of promoting transparency and improving research reliability.

📋 Purpose

Many published studies contain calculation errors, methodological discrepancies, or irreproducible results that go undetected. ReproCheck aims to:

  • ✅ Verify calculations and statistical analyses from published papers
  • 📊 Reproduce results using reported data and methods
  • 📝 Document discrepancies and potential errors
  • 🔬 Promote research transparency and reproducibility
  • 🤝 Contribute constructively to scientific accuracy
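Verifying a published odds ratio usually means recomputing it from the reported 2×2 event counts. A minimal sketch in Python (the function name and the example counts are illustrative, not taken from any specific paper):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and 95% CI from a 2x2 table:
    a = events in treatment, b = non-events in treatment,
    c = events in control,   d = non-events in control."""
    or_ = (a * d) / (b * c)
    # Large-sample standard error of log(OR)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Illustrative counts only
or_, ci = odds_ratio(10, 90, 20, 80)
```

Comparing the recomputed value (and its confidence interval) against the figure printed in the paper is the core check applied in each case study.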

🗂️ Case Studies

1. Bucher et al. (1997) - Indirect Treatment Comparisons

Paper: Bucher HC, Guyatt GH, Griffith LE, Walter SD. The Results of Direct and Indirect Treatment Comparisons in Meta-Analysis of Randomized Controlled Trials. J Clin Epidemiol. 1997;50(6):683-691.

Status: ⚠️ Multiple discrepancies identified

Key Findings:

  • 3 of 23 individual study ORs contain calculation errors exceeding 25%
  • The published indirect comparison OR (0.37) cannot be reproduced using the described Bucher method
  • The correct calculation yields OR = 0.54, a 46% difference
  • Discrepancy changes the paper's main conclusion about inconsistency between direct and indirect evidence
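The Bucher calculation itself is straightforward to reproduce: the indirect log OR for A vs C via a common comparator B is the difference of the two direct log ORs, and their variances add. A sketch with illustrative inputs (not the values from Bucher et al. 1997):

```python
import math

def bucher_indirect(or_ab, se_ab, or_cb, se_cb):
    """Bucher indirect comparison of A vs C via common comparator B:
    log OR_AC = log OR_AB - log OR_CB, with variances summing."""
    log_or = math.log(or_ab) - math.log(or_cb)
    se = math.sqrt(se_ab**2 + se_cb**2)
    return math.exp(log_or), (math.exp(log_or - 1.96 * se),
                              math.exp(log_or + 1.96 * se))

# Illustrative inputs only, not data from the paper
or_ac, ci = bucher_indirect(0.60, 0.20, 1.10, 0.25)
```

Because every input to this formula is reported in the paper, a published indirect OR that cannot be reproduced from it indicates either a calculation error or an undisclosed deviation from the stated method.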

Files:


🚀 Getting Started

Prerequisites

Each case study folder lists its own dependencies (e.g., a requirements.txt for Python analyses). In general, you'll need:

  • R (≥ 4.0.0) or Python (≥ 3.8)
  • Common statistical packages (specified in each analysis)
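For a Python-based case study, setup might look like the following (the folder name is hypothetical; use the actual case-study directory):

```shell
# Hypothetical directory name; substitute the real case-study folder
cd bucher-1997
# Install the Python dependencies listed for that analysis
pip install -r requirements.txt
```

For R-based analyses, install the packages named at the top of the analysis scripts instead.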

Running an Analysis