February 11, 2020: ADAPTABLE Roundtable Produces Consensus Statement on Analysis and Integration of Patient-Reported Data in Clinical Trials

A roundtable discussion organized by the NIH Collaboratory in 2017 has produced consensus findings on the analysis and integration of patient-reported health (PRH) data in clinical trials. The report is part of an effort by the ADAPTABLE Supplement project team “to address best practices for capturing PRH data in pragmatic studies and optimal analytic approaches for integrating PRH with other data sources.”

The consensus statement was published online ahead of print this month in the Journal of the American Medical Informatics Association.

The report discusses strengths and limitations of PRH data, approaches for ascertaining and classifying study end points, and methods for addressing incompleteness, data alignment, and data concordance. Roundtable participants used experiences from the ADAPTABLE trial as a case study to inform their discussions.

ADAPTABLE, the first major randomized comparative effectiveness trial conducted by the National Patient-Centered Clinical Research Network (PCORnet), seeks to determine the optimal dose of aspirin therapy for secondary prevention of atherosclerotic cardiovascular disease. The trial relies on both existing EHR data sources and PRH data.

This work was supported by a supplemental grant award to the NIH Collaboratory Coordinating Center from the National Center for Complementary and Integrative Health.

December 6, 2019: Millions More People, Stronger Collaborations: The New and Improved NIH Collaboratory Distributed Research Network (Richard Platt, MD, Kevin Haynes, PharmD, Denise Boudreau, PhD, Jerry Gurwitz, MD, Christopher Granger, MD)


Richard Platt, MD, MS
Professor and Chair
Harvard Medical School
Department of Population Medicine

Kevin Haynes, PharmD, MSCE
Principal Scientist

Denise Boudreau, PhD
Senior Scientific Investigator
Kaiser Permanente Washington Health Research Institute

Jerry H. Gurwitz, MD
Professor of Medicine, Family Medicine and Community Health, and Population & Quantitative Health Sciences
University of Massachusetts Medical School
Executive Director, Meyers Primary Care Institute

Christopher B. Granger, MD
Professor of Medicine
Duke University


Millions More People, Stronger Collaborations: The New and Improved NIH Collaboratory Distributed Research Network


Embedded clinical research; Distributed research network; Administrative claims data; Multisite research; Sentinel System; Electronic health data; National registries; Common data model; Curated research data

Key Points

  • The NIH Collaboratory Distributed Research Network (DRN) enables investigators funded by the NIH and other not-for-profit sponsors to collaborate with investigators based in health plans that participate in the FDA’s Sentinel System.
  • Examples from an array of real-world research studies highlight strengths of conducting collaborative research using the DRN.
  • Among the DRN’s attributes are the abilities to embed a randomized clinical trial in real-world clinical settings, to conduct direct outreach to providers and patients/families, and to assess feasibility with high accuracy, allowing confident planning of ambitious clinical trials.

Discussion Themes

The DRN is optimized for multicenter research and depends on partnerships; it was developed to enable productive research collaborations.

How do investigators who are not embedded in participating health systems learn to work effectively in the DRN?

Read more about the NIH Collaboratory’s DRN.

#pctGR, @Collaboratory1, @DeptPopMed, @HealthCoreRWE

November 8, 2019: Lumbar Imaging with Reporting of Epidemiology: Initial Results and Some Lessons Learned (Jeffrey Jarvik, MD, MPH, Patrick Heagerty, PhD)


Jeffrey (Jerry) G. Jarvik, MD, MPH
Professor, Radiology, Neurological Surgery and Health Services
Adjunct Professor, Pharmacy and Orthopedics & Sports Medicine
University of Washington

Patrick Heagerty, PhD
Professor and Chair
Department of Biostatistics
University of Washington


Lumbar Imaging with Reporting of Epidemiology: Initial Results and Some Lessons Learned


Embedded pragmatic clinical trials; Radiology imaging; LIRE; Stepped-wedge; Cluster randomization; Epidemiology; Back pain

Key Points

  • The LIRE Demonstration Project evaluated whether prevalence benchmark data inserted into lumbar spine imaging reports would reduce overall spine-related healthcare utilization for patients referred from primary care.
  • The inserted intervention text urges caution when interpreting the presence of certain findings that are common in normal, pain-free volunteers.
  • While the study team found no decrease in spine-related healthcare utilization for the overall cohort, there was a small but potentially important effect on reducing opioid prescriptions.

Discussion Themes

A characteristic of the stepped-wedge study design is that it yields two comparisons: between-cluster comparisons (clinic A vs clinic B) and within-cluster comparisons (each clinic before vs after crossover). Because clusters cross over at different times, temporal trends can confound the treatment effect and must be adjusted for in the analysis.
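The need to adjust for temporal trends can be illustrated with a minimal simulation (an illustrative sketch, not the LIRE analysis itself): clusters adopt the intervention at staggered periods, and a secular improvement unrelated to the intervention inflates a naive before/after comparison, while a model with period fixed effects recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(1)

n_clusters, n_periods = 8, 5
TRUE_EFFECT = 0.5   # true intervention effect
TIME_TREND = 0.3    # secular improvement per period, unrelated to the intervention

# Staggered rollout: cluster c switches to the intervention at a fixed period
start = np.array([c % 4 + 1 for c in range(n_clusters)])

periods, treated, y = [], [], []
for c in range(n_clusters):
    for t in range(n_periods):
        tr = int(t >= start[c])
        periods.append(t)
        treated.append(tr)
        y.append(TRUE_EFFECT * tr + TIME_TREND * t + rng.normal(0, 0.05))

periods, treated, y = np.array(periods), np.array(treated), np.array(y)

# Naive before/after comparison: confounded, because treated observations
# come disproportionately from later (better) periods
naive = y[treated == 1].mean() - y[treated == 0].mean()

# Adjusted: include period fixed effects (a real analysis would also
# model cluster effects and within-cluster correlation)
period_dummies = np.eye(n_periods)[periods][:, 1:]  # first period is reference
X = np.column_stack([np.ones_like(y), treated, period_dummies])
adjusted = np.linalg.lstsq(X, y, rcond=None)[0][1]

print(f"true {TRUE_EFFECT}, naive {naive:.2f}, adjusted {adjusted:.2f}")
```

With these parameters the naive estimate roughly doubles the true effect, while the period-adjusted estimate lands near 0.5.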

For what type of intervention would a stepped-wedge design be suitable?

The hope is for wider dissemination of such interventions to settings where radiologic testing is performed and incidental findings are common.

Read more about the LIRE Demonstration Project.

#pctGR, #PragmaticTrials, @Collaboratory1

October 30, 2019: Baseline Covariate Imbalance Influences Treatment Effect Bias in Cluster Randomized Trials

In a study supported by the NIH Collaboratory, researchers found that imbalance in individual-level baseline covariates influences bias in the observed treatment effect in cluster randomized trials. Using race as an example, the study highlights the importance of reducing covariate imbalance in the design stage of cluster randomized trials and of using statistical analysis techniques to minimize the resulting bias.

The study, published in Contemporary Clinical Trials, used computer simulation models, validated with real-data simulations from a large clinical trial, to examine the influence of baseline covariate imbalance on treatment effect bias. The researchers found that bias was proportional to the degree of baseline covariate imbalance and the covariate effect size. In the simulations, trials with larger numbers of clusters had less covariate imbalance, and statistical models that adjusted for important baseline confounders were more effective than unadjusted models in minimizing bias.
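The mechanism the study describes can be sketched in a few lines (a hypothetical toy example, not the authors' simulation models): with few clusters, an individual-level covariate that affects the outcome can end up imbalanced across arms, biasing the unadjusted treatment effect estimate in proportion to the imbalance and the covariate effect size, while a covariate-adjusted model recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)

TRUE_EFFECT = 1.0
COVARIATE_EFFECT = 2.0  # effect of the individual-level covariate on the outcome

# Six clusters, three per arm; covariate prevalence deliberately imbalanced
# across arms to mimic what can happen by chance with few clusters
prev_treated = [0.7, 0.6, 0.8]
prev_control = [0.3, 0.2, 0.4]
n_per_cluster = 500

arms, xs, ys = [], [], []
for arm_label, prevs in ((1, prev_treated), (0, prev_control)):
    for p in prevs:
        x = rng.binomial(1, p, n_per_cluster)
        y = TRUE_EFFECT * arm_label + COVARIATE_EFFECT * x + rng.normal(0, 1, n_per_cluster)
        arms.append(np.full(n_per_cluster, arm_label))
        xs.append(x)
        ys.append(y)

arm = np.concatenate(arms)
x = np.concatenate(xs)
y = np.concatenate(ys)

# Unadjusted model: outcome on treatment only (biased by the imbalance)
unadjusted = np.linalg.lstsq(
    np.column_stack([np.ones_like(y), arm]), y, rcond=None)[0][1]

# Adjusted model: also controls for the imbalanced baseline covariate
adjusted = np.linalg.lstsq(
    np.column_stack([np.ones_like(y), arm, x]), y, rcond=None)[0][1]

print(f"true {TRUE_EFFECT}, unadjusted {unadjusted:.2f}, adjusted {adjusted:.2f}")
```

Here the roughly 0.4 difference in covariate prevalence between arms inflates the unadjusted estimate by about 0.4 × 2.0, whereas adjustment brings the estimate back near the true effect. (A full cluster-randomized analysis would also account for within-cluster correlation.)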

The authors recommend several design approaches and statistical analysis techniques for both reducing covariate imbalance and minimizing bias. Drawing on available prior data can help researchers identify important baseline confounders when designing cluster randomized trials.

This work was supported within the NIH Collaboratory by the NIH Common Fund through a cooperative agreement from the Office of Strategic Coordination within the Office of the NIH Director, and by a research supplement from the NIH Common Fund to promote diversity in health-related research.

September 27, 2019: Preparing for Clinical Trial Data Sharing and Re-use: The New Reality for Researchers (Rebecca Li, PhD, Frank Rockhold, PhD)


Rebecca Li, PhD
Executive Director, Vivli
Co-Director of Research Ethics, Harvard Center for Bioethics
Harvard Medical School

Frank W. Rockhold, PhD
Professor of Biostatistics and Bioinformatics
Duke Clinical Research Institute
Duke University Medical Center


Preparing for Clinical Trial Data Sharing and Re-use: The New Reality for Researchers


Data sharing; Individual patient data; Open access; Raw data; ICMJE; Research dissemination

Key Points

  • Open access to individual patient data from clinical trials is a critical tool for research in health care. Despite the challenges, the question is not whether data should be shared, but rather how and when access should be granted.
  • Preparing data for reuse is often an afterthought—yet it is a new reality for researchers and institutions.
  • As of January 1, 2019, the International Committee of Medical Journal Editors (ICMJE) requires registration of a trial’s data sharing plan at the time of trial registration.
  • Institutions or teams should begin their data sharing program planning at least 18 months before a major publication (or regulatory approval).

Discussion Themes

FAIR data are data that meet standards of findability, accessibility, interoperability, and reusability.

How do we manage scientific integrity, replication, and validity given that data sharing opens a study to multiple people asking the same or related questions in potentially different ways using different methods?

How do we plan for a future that rewards data quality and reuse?

Read more about data sharing from ICMJE, NIH Office of Science Policy, and the National Academy of Medicine.


#pctGR, @Collaboratory1, @VivliCenter, @FrankRockhold

August 16, 2019: Introducing the Digital Medicine Society (Andy Coravos, MBA, Jen Goldsack, MS, MBA)


Andy Coravos
CEO, Elektra Labs
Fellow, Harvard-MIT Center for Regulatory Science
Co-founder, Digital Medicine Society (DiMe)

Jen Goldsack, MS, MBA
Interim Executive Director, DiMe
Portfolio, Strategy & Ops, HealthMode


Introducing the Digital Medicine Society


Digital medicine; Mobile health; Digital technologies; Wearable health devices; Connected devices; Cybersecurity

Key Points

  • Digital medicine is a rapidly evolving field that is by nature multidisciplinary and introduces new considerations for the healthcare community.
  • The Digital Medicine Society (DiMe) sits at the intersection of two communities: healthcare and technology. The Society is helping to move the field of digital medicine forward by developing a common language for diverse stakeholders from engineers and ethicists to payers and providers.
  • The U.S. healthcare system has strong protections for patients’ biospecimens like blood or genomic data, but what about digital specimens?

Discussion Themes

Are digital medical technologies worthy of the trust we place in them?

Should there be a Hippocratic Oath for manufacturers, organizations, and individuals delivering care through connected medical devices?

Read more about the emerging field of digital medicine and learn more about the Digital Medicine Society (DiMe), the professional home for those who practice and develop products in the digital era of medicine.


#DigitalMedicine, #pctGR, @Collaboratory1, @_DiMeSociety

August 5, 2019: New Section of Living Textbook Addresses Missing Data in Intention-to-Treat Analyses

A new section of the NIH Collaboratory’s Living Textbook of Pragmatic Clinical Trials discusses challenges associated with missing data that result from noncompliance, crossover, and dropout.

Many randomized controlled trials use an intention-to-treat (ITT) analysis to measure the real-world effects of the intervention. The newly published section, Missing Data and Intention-to-Treat Analyses, considers the population-level causal effects in these trials when there is noncompliance or missing outcome data.

“One rationale for the ITT approach is that it evaluates the real-world effects of the intervention. However, a common misconception is that the ITT analysis will be unbiased regardless of crossover or missing data.”

The new section also introduces a white paper from the NIH Collaboratory’s Biostatistics and Study Design Core, “Analyses of Randomized Controlled Trials in the Presence of Noncompliance and Study Dropout.” This working document offers analysts a more detailed discussion of treatment effects in ITT analyses, including a case example and recommended strategies for estimating and reporting both ITT effects and average causal effects.

The Biostatistics and Study Design Core works with the Demonstration Project teams to create guidance and technical documents regarding study design and biostatistical issues relevant to pragmatic clinical trials.

June 7, 2019: Meeting Materials from the 2019 NIH Collaboratory Steering Committee Meeting

The Collaboratory has made available all of the presentations from its recent Steering Committee meeting, held May 1-2, 2019, in Bethesda.

Highlights of Day 1 included updates on the progress and sustainability of the NIH Collaboratory, perspectives on the landscape of embedded PCTs (ePCTs) and the need for real-world evidence, challenges and lessons learned from the UH3 Demonstration Projects, updates on progress and transition plans from the UG3 Demonstration Projects, and discussions on data sharing policy and planning. Day 2 featured an intensive workshop hosted by the NIH with the goal of starting discussions on statistical issues with ePCTs.

View or download the meeting materials on the website.

April 15, 2019: Registration Now Open for Workshop on the Design & Analysis of Embedded Pragmatic Clinical Trials (ePCT)

The NIH Health Care Systems Research Collaboratory is hosting a one-day workshop on the Design & Analysis of Embedded Pragmatic Clinical Trials (ePCTs) on May 2, 2019, in the Lister Hill Auditorium on the NIH Campus.

The workshop will include a series of moderated discussions that focus on issues of measuring trial outcomes from available data sources, potential randomization strategies, specific ePCT design considerations, and unique challenges associated with ePCTs. Panel discussions will utilize case examples from the Collaboratory repertoire and beyond to illustrate how clinical investigators and biostatisticians work to address research questions posed by specific trials.

The Workshop Website provides information on meeting logistics, the agenda, and registration, as well as details for attending remotely via the NIH Videoconference Center.

February 21, 2019: Living Textbook Offers New Content on Design and Analysis of Pragmatic Clinical Trials

Members of the NIH Collaboratory’s Biostatistics and Study Design Core contributed three new sections to the Living Textbook exploring issues in the design and analysis of pragmatic clinical trials. The new sections offer insights into emerging issues in embedded pragmatic clinical trials and lessons learned from the NIH Collaboratory’s first round of Demonstration Projects.

  • The Designing to Avoid Identification Bias section addresses a type of selection bias that can occur in pragmatic clinical trials that use information from electronic health records to determine study population eligibility and in which the study intervention influences who undergoes screening or receives a diagnosis in clinical care.
  • The Alternative Cluster Randomized Designs section describes alternative design choices for cluster randomized trials and their implications for statistical power and sample size calculations. Modified cluster randomized designs, such as cluster randomization with crossover, may reduce the sample size required for a pragmatic clinical trial and may be particularly feasible in trials embedded in healthcare systems with electronic health records.
  • Case Study: STOP CRC Trial explores challenges in design and analysis that were faced in the Strategies and Opportunities to Stop Colorectal Cancer in Priority Populations (STOP CRC) trial, one of the NIH Collaboratory Demonstration Projects. The case study illustrates how the study team dealt with pragmatic issues during the planning and conduct of the trial.
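The sample size implications of cluster randomization discussed above are usually quantified with the standard design effect, 1 + (m − 1)ρ, where m is the cluster size and ρ is the intraclass correlation. A quick back-of-envelope calculation (a generic illustration, not taken from the Living Textbook sections themselves) shows why even small intraclass correlations matter:

```python
def design_effect(cluster_size: int, icc: float) -> float:
    """Variance inflation for a parallel cluster randomized design
    relative to individual randomization: 1 + (m - 1) * rho."""
    return 1 + (cluster_size - 1) * icc

# e.g., 50 patients per clinic with a modest ICC of 0.02 roughly doubles
# the required sample size compared with individual randomization
print(design_effect(50, 0.02))  # approximately 1.98
```

Modified designs such as cluster randomization with crossover shrink this penalty because each cluster contributes to both arms, which is one reason they can make an embedded pragmatic trial feasible in a healthcare system of fixed size.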

In addition to contributing content to the Living Textbook, the Biostatistics and Study Design Core works with the NIH Collaboratory Demonstration Projects to address challenges in their statistical plans and study designs during the planning phase and to develop guidance and technical documents related to study design and biostatistical issues relevant to pragmatic clinical trials.