Fynman

Write Systematic Reviews That Actually Get Published

Stop getting overwhelmed by systematic review complexity. This step-by-step guide makes PRISMA simple and shows you exactly what to do.

The Only Systematic Review Guide You Need

Systematic reviews are the gold standard of research evidence. But they don’t have to be impossible to write.

This guide breaks down every step so you can write reviews that get published and actually help people.

What You’ll Actually Be Able to Do

  • Write a protocol that PROSPERO will accept
  • Design searches that find all the relevant studies
  • Screen thousands of papers without losing your mind
  • Judge study quality like an expert
  • Extract data that actually answers your question
  • Run meta-analyses that make sense
  • Write reports that follow PRISMA perfectly

Understanding Systematic Reviews

What Makes a Review “Systematic”?

Systematic reviews are different from regular reviews because:

  • Everything is documented: You write down exactly what you did so others can copy it
  • You search everywhere: Multiple databases plus the hard-to-find stuff
  • Clear rules for what gets included: No cherry-picking studies that support your opinion
  • You rate study quality: Some studies are better than others - you say which ones
  • Organized analysis: You don’t just summarize - you systematically analyze
  • Standard reporting: You follow PRISMA rules so everyone knows what you did

Types of Systematic Reviews

Intervention reviews:

  • Evaluate effectiveness of treatments or interventions
  • Often include randomized controlled trials
  • May include meta-analysis of effect sizes
  • Example: “Effectiveness of mindfulness interventions for anxiety”

Diagnostic accuracy reviews:

  • Assess performance of diagnostic tests
  • Focus on sensitivity and specificity
  • Use specialized quality assessment tools
  • Example: “Accuracy of rapid COVID-19 tests”

Prognostic reviews:

  • Examine factors predicting outcomes
  • Often observational studies
  • Focus on predictive accuracy
  • Example: “Factors predicting academic success in college”

Qualitative evidence synthesis:

  • Synthesize qualitative research findings
  • Meta-ethnography or thematic synthesis
  • Focus on experiences and meanings
  • Example: “Patient experiences with telehealth”

Mixed methods reviews:

  • Combine quantitative and qualitative evidence
  • Parallel or sequential synthesis
  • Comprehensive understanding
  • Example: “Effectiveness and acceptability of mental health apps”

When to Conduct a Systematic Review

Do a systematic review when:

  • There’s already decent research on your topic
  • You have a clear question you want to answer
  • You have serious time (6-18 months minimum)
  • You can access multiple research databases
  • You have a team with different skills

Don’t do a systematic review if:

  • Almost no research exists yet (do a scoping review instead)
  • Your topic is way too broad (do an umbrella review instead)
  • You need answers fast (do a rapid review instead)
  • You don’t have enough time or people (do a narrative review instead)

PRISMA 2020: Your Reporting Roadmap

Understanding PRISMA

PRISMA tells you exactly how to report your systematic review so journals will publish it.

PRISMA 2020 improves on the 2009 statement because:

  • 27 clear checklist items you must cover
  • Better guidance on judging study quality
  • Clearer instructions for complex studies
  • More focus on who gets left out of research
  • Updated flow diagram

Core PRISMA Elements

Title and abstract (Items 1-2):

  • Identify as systematic review in title
  • Structured abstract following PRISMA format
  • Registration number included

Introduction (Items 3-4):

  • Rationale and objectives
  • PICO(T) framework clearly stated

Methods (Items 5-12):

  • Protocol registration
  • Eligibility criteria
  • Search strategy
  • Selection process
  • Data collection
  • Risk of bias assessment
  • Synthesis methods

Results (Items 13-21):

  • Study selection flow
  • Study characteristics
  • Risk of bias results
  • Individual study results
  • Synthesis results
  • Additional analyses

Discussion (Items 22-24):

  • Summary of evidence
  • Limitations
  • Conclusions and implications

Other information (Items 25-27):

  • Registration details
  • Protocol access
  • Funding and conflicts

PRISMA Flow Diagram

The flow diagram tracks studies through each review stage:

Identification:

  • Records identified through database searching
  • Additional records through other sources
  • Total records before screening

Screening:

  • Records screened
  • Records excluded with reasons
  • Full-text articles assessed
  • Full-text articles excluded with reasons

Included:

  • Studies included in qualitative synthesis
  • Studies included in meta-analysis

The Systematic Review Process: Step by Step

Phase 1: Planning and Protocol (Weeks 1-4)

Step 1: Build your team

  • Lead reviewer: You - the person who coordinates everything
  • Second reviewer: Someone else who can check your work
  • Expert: Someone who knows the topic inside and out
  • Search specialist: Someone good at finding research
  • Stats person: Someone who can do meta-analysis (if needed)

Step 2: Write a crystal-clear question

Use the PICO framework, extended with time and study design (PICOTS), to structure your question:

  • Population: Who exactly are you studying?
  • Intervention: What treatment/program/thing are you testing?
  • Comparator: What are you comparing it to?
  • Outcome: What results do you care about?
  • Time: How long do studies need to follow people?
  • Study design: What types of studies count?

Example PICO:

  • P: Adults with Type 2 diabetes
  • I: Mobile health applications
  • C: Standard care or other interventions
  • O: Glycemic control (HbA1c), self-management
  • T: At least 3 months follow-up
  • S: Randomized controlled trials

Step 3: Write your protocol

Key protocol sections:

  1. Background and rationale
  2. Research question and objectives
  3. Methods:
    • Eligibility criteria
    • Search strategy
    • Study selection process
    • Data extraction plan
    • Quality assessment approach
    • Data synthesis methods
  4. Timeline and resources

Step 4: Register before you start

  • PROSPERO: The main place to register protocols
  • Do this first: Before you start searching
  • Why this matters: Proves you didn’t cherry-pick results
  • Changes allowed: But you have to document them

Phase 2: Search Strategy Development (Weeks 2-6)

Step 1: List everything people might call your topic

Take your PICO question and brainstorm terms:

  • What words describe your population?
  • What do people call your intervention?
  • What synonyms exist? Different spellings?
  • What do doctors call it vs. regular people?

Step 2: Turn concepts into search terms

Example - diabetes app study:

  • Population: diabetes, diabetic, “type 2 diabetes”, T2DM
  • Intervention: “mobile health”, mHealth, “health app”, “smartphone app”, telemedicine
  • Outcomes: “glycemic control”, HbA1c, “blood glucose”, “self-management”

Step 3: Build database-specific searches

PubMed/MEDLINE search example:

((diabetes[MeSH] OR diabetic OR "type 2 diabetes" OR T2DM) 
AND
("Mobile Applications"[MeSH] OR "mobile health" OR mHealth OR "health app" OR "smartphone app" OR "digital health")
AND
("Glycated Hemoglobin A"[MeSH] OR HbA1c OR "glycemic control" OR "blood glucose" OR "diabetes management")
AND
("Randomized Controlled Trial"[Publication Type] OR "randomized controlled trial" OR RCT))

Step 4: Search multiple databases

Must-search databases:

  • PubMed: The big medical database everyone knows
  • Embase: European studies and drug research
  • Cochrane Library: The highest-quality studies and reviews
  • PsycINFO: Psychology and mental health studies
  • CINAHL: Nursing and healthcare studies

Also search:

  • Google Scholar: Finds stuff other databases miss
  • Trial registries: ClinicalTrials.gov, WHO ICTRP
  • Conference websites: Where new research first appears
  • Reference lists: Check what your included studies cite
  • Ask experts: People in the field know hidden gems

Step 5: Document your search strategy

Record for each database:

  • Date of search
  • Database name and platform
  • Search terms used
  • Limits applied (date, language, study type)
  • Number of results
  • Search strategy (exact syntax)

Phase 3: Study Selection (Weeks 7-12)

Step 1: Remove duplicates

  • Use reference management software
  • Manual check for near-duplicates
  • Document removal process
  • Keep record of unique studies
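
Reference managers do most of this automatically, but if you export your combined records to a CSV file you can also script a quick exact-match pass. A minimal sketch in Python, assuming a hypothetical records.csv with doi and title columns:

import csv

def normalize(text):
    """Lowercase and strip non-alphanumeric characters so near-identical titles match."""
    return "".join(ch for ch in text.lower() if ch.isalnum())

def deduplicate(rows):
    """Keep the first occurrence of each DOI (or normalized title when the DOI is missing)."""
    seen, unique = set(), []
    for row in rows:
        key = row["doi"].strip().lower() or normalize(row["title"])
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

with open("records.csv", newline="", encoding="utf-8") as f:
    records = list(csv.DictReader(f))

unique_records = deduplicate(records)
print(f"{len(records)} records in, {len(unique_records)} after deduplication")

Whatever tool you use, follow the automated pass with a manual check for near-duplicates such as conference abstracts later published as full papers.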

Step 2: Title and abstract screening

How to screen:

  1. Two people work separately: Don’t influence each other
  2. When in doubt, keep it: Better to include too many than miss good ones
  3. Talk through disagreements: Get a third person if you can’t agree
  4. Write down why you excluded things: You’ll need this for your report

Screening criteria:

  • Does the study address your research question?
  • Does it meet basic eligibility criteria?
  • Is it the right study design?
  • Is it published in acceptable format?

Step 3: Full-text review

Obtain full texts:

  • Access through institutional subscriptions
  • Request from authors if unavailable
  • Consider interlibrary loan services
  • Document attempts to obtain texts

Full-text assessment:

  • Apply detailed eligibility criteria
  • Record specific reasons for exclusion
  • Maintain independent review process
  • Create final list of included studies

Step 4: Calculate inter-rater reliability

Assess agreement between reviewers:

  • Cohen’s Kappa: For categorical decisions
  • Percentage agreement: Simple measure
  • Target: Kappa ≥ 0.61 (substantial agreement)
  • If low: Additional training or criteria refinement
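
Cohen’s kappa is simple enough to compute yourself if your screening decisions live in a spreadsheet or CSV export. A minimal sketch in pure Python; the two decision lists are illustrative:

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa and raw agreement for two raters making the same categorical decisions."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal proportions
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected), observed

# Illustrative title/abstract decisions for ten records
reviewer_1 = ["include", "exclude", "exclude", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "include"]
reviewer_2 = ["include", "exclude", "include", "include", "exclude",
              "exclude", "include", "exclude", "exclude", "exclude"]

kappa, agreement = cohens_kappa(reviewer_1, reviewer_2)
print(f"Agreement: {agreement:.0%}, Cohen's kappa: {kappa:.2f}")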

Phase 4: Data Extraction (Weeks 10-16)

Step 1: Design extraction form

Key data categories:

Study characteristics:

  • Author, year, country
  • Study design and duration
  • Setting and recruitment
  • Sample size and power calculation
  • Funding source

Participant characteristics:

  • Demographics (age, gender, ethnicity)
  • Inclusion/exclusion criteria
  • Baseline characteristics
  • Comorbidities

Intervention details:

  • Description of intervention
  • Control/comparison condition
  • Duration and intensity
  • Implementation fidelity
  • Theoretical framework

Outcome measures:

  • Primary and secondary outcomes
  • Measurement tools and timing
  • Follow-up periods
  • Missing data handling

Results:

  • Sample sizes at each time point
  • Mean differences and effect sizes
  • Confidence intervals and p-values
  • Adverse events
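
One way to keep extraction consistent across reviewers is to define the form as a typed record, so every study gets exactly the same fields. A minimal sketch using a Python dataclass; the field names and values are illustrative and should mirror your own extraction form:

from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractionRecord:
    # Study characteristics
    study_id: str
    author_year: str
    country: str
    design: str
    sample_size: int
    funding_source: str
    # Intervention and comparator
    intervention: str
    comparator: str
    duration_weeks: Optional[int] = None
    # Primary outcome (continuous example)
    outcome_name: str = ""
    mean_intervention: Optional[float] = None
    mean_control: Optional[float] = None
    sd_intervention: Optional[float] = None
    sd_control: Optional[float] = None
    notes: str = ""

# Entirely made-up values, shown only to illustrate the structure
record = ExtractionRecord(
    study_id="S001", author_year="Example 2022", country="UK",
    design="RCT", sample_size=120, funding_source="Public grant",
    intervention="Diabetes self-management app", comparator="Usual care",
    duration_weeks=12, outcome_name="HbA1c (%)",
    mean_intervention=7.1, mean_control=7.6,
    sd_intervention=0.9, sd_control=1.0,
)
print(record)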

Step 2: Pilot test extraction form

  • Test on 2-3 diverse studies
  • Refine categories and definitions
  • Ensure consistency between extractors
  • Document any changes

Step 3: Extract data systematically

  • Dual extraction: Two reviewers extract independently
  • Standardized process: Use consistent procedures
  • Quality checks: Regular accuracy verification
  • Discrepancy resolution: Systematic approach to disagreements

Step 4: Contact study authors

When to contact authors:

  • Missing outcome data
  • Unclear methodology
  • Unpublished results
  • Additional analyses needed

Contact protocol:

  • Standard email template
  • Specific data requests
  • Reasonable timeline
  • Follow-up procedures

Phase 5: Quality Assessment (Weeks 14-18)

Choose appropriate tools:

For randomized trials: Cochrane Risk of Bias 2 (RoB 2):

  • Randomization process
  • Deviations from intended interventions
  • Missing outcome data
  • Measurement of outcome
  • Selection of reported result

For observational studies: ROBINS-I (Risk Of Bias In Non-randomized Studies of Interventions):

  • Confounding
  • Selection of participants
  • Classification of interventions
  • Deviations from intended interventions
  • Missing data
  • Measurement of outcomes
  • Selection of reported result

For diagnostic studies: QUADAS-2:

  • Patient selection
  • Index test
  • Reference standard
  • Flow and timing

Quality assessment process:

  1. Independent assessment: Two reviewers assess each study
  2. Structured approach: Use validated tools consistently
  3. Document rationale: Explain judgments with evidence
  4. Resolve disagreements: Discussion and consensus
  5. Create summary: Overall risk of bias assessment

Phase 6: Data Synthesis (Weeks 16-22)

Option 1: Narrative synthesis

When to use:

  • High heterogeneity between studies
  • Different outcome measures
  • Insufficient data for meta-analysis
  • Mixed study designs

Synthesis approach:

  1. Organize findings: Group by intervention, outcome, population
  2. Describe patterns: Identify consistent findings
  3. Explore heterogeneity: Explain differences between studies
  4. Assess robustness: Consider quality and bias
  5. Draw conclusions: Summarize evidence strength

Option 2: Meta-analysis

When appropriate:

  • Similar populations and interventions
  • Comparable outcome measures
  • Sufficient data quality
  • Low to moderate heterogeneity

Meta-analysis steps:

1. Choose effect measure:

  • Continuous outcomes: Mean difference, standardized mean difference
  • Dichotomous outcomes: Risk ratio, odds ratio, risk difference
  • Time-to-event: Hazard ratio

2. Select statistical model:

  • Fixed effects: Assumes one true effect size
  • Random effects: Allows for between-study variation
  • Generally prefer random effects for systematic reviews

3. Assess heterogeneity:

  • I² statistic: Percentage of variation due to heterogeneity
    • 0-40%: Low heterogeneity
    • 30-60%: Moderate heterogeneity
    • 50-90%: Substantial heterogeneity
    • 75-100%: Considerable heterogeneity
  • Chi² test: Statistical test for heterogeneity
  • Tau²: Estimate of between-study variance

4. Investigate heterogeneity:

  • Subgroup analysis: Pre-planned analyses by study characteristics
  • Meta-regression: Continuous moderator variables
  • Sensitivity analysis: Impact of study quality, outliers

5. Assess publication bias:

  • Funnel plots: Visual assessment of asymmetry
  • Egger’s test: Statistical test for small-study effects
  • Trim-and-fill: Adjust for missing studies
  • Fail-safe N: Number of null studies needed to change conclusions

Phase 7: Reporting and Dissemination (Weeks 20-26)

Writing your systematic review:

Abstract (250-300 words):

  • Background: Why this review matters
  • Methods: Key methodological details
  • Results: Main findings with numbers
  • Conclusions: Clinical/policy implications

Introduction:

  • Background: Current knowledge and gaps
  • Rationale: Why this review is needed
  • Objectives: Specific research questions

Methods:

  • Protocol registration: PROSPERO number
  • Search strategy: Databases and terms
  • Selection criteria: Detailed PICO
  • Data extraction: Process and forms
  • Quality assessment: Tools and procedures
  • Analysis methods: Statistical approaches

Results:

  • Search results: PRISMA flow diagram
  • Study characteristics: Descriptive summary
  • Risk of bias: Quality assessment results
  • Synthesis results: Main findings
  • Additional analyses: Subgroups, sensitivity

Discussion:

  • Summary: Key findings in context
  • Limitations: Review and study limitations
  • Implications: Practice and research implications
  • Conclusions: Clear take-home messages

Building Comprehensive Search Strategies

Search Strategy Principles

Comprehensiveness vs. precision:

  • Systematic reviews prioritize sensitivity (finding all relevant studies)
  • Accept lower precision (more irrelevant results)
  • Better to over-include than miss relevant studies

Multiple database searching:

  • Different databases have different coverage
  • Medical vs. psychological vs. educational literature
  • International vs. regional focuses
  • No single database captures all relevant literature

Database Selection Guide

Core medical databases:

MEDLINE (via PubMed):

  • Coverage: 1946-present, 5,000+ journals
  • Strengths: Comprehensive biomedical literature, MeSH terms
  • Best for: Clinical research, health interventions
  • Language: Primarily English with international coverage

Embase:

  • Coverage: 1947-present, 8,000+ journals
  • Strengths: European literature, drug research, conference abstracts
  • Best for: Pharmaceutical research, device studies
  • Unique features: EMTREE controlled vocabulary

Cochrane Library:

  • Coverage: High-quality systematic reviews and trials
  • Strengths: Rigorously peer-reviewed, methodologically sound
  • Best for: Finding existing reviews, high-quality RCTs
  • Components: Cochrane Reviews, CENTRAL, HTA database

Specialized databases:

PsycINFO:

  • Coverage: Psychology, psychiatry, behavioral sciences
  • Best for: Mental health, behavioral interventions
  • Unique features: PsycINFO subject headings

CINAHL:

  • Coverage: Nursing and allied health literature
  • Best for: Nursing interventions, patient care
  • Unique features: CINAHL subject headings

ERIC:

  • Coverage: Education literature
  • Best for: Educational interventions, learning outcomes
  • Access: Free through the U.S. Department of Education

Web of Science:

  • Coverage: Multidisciplinary, citation indexing
  • Best for: Highly cited papers, citation analysis
  • Unique features: Citation searching, impact metrics

Advanced Search Techniques

Boolean operators:

  • AND: Narrows search (diabetes AND exercise)
  • OR: Broadens search (diabetes OR diabetic)
  • NOT: Excludes terms (diabetes NOT type 1)

Proximity operators:

  • NEAR/n: Terms within n words of each other
  • ADJ/n: Terms adjacent within n words
  • Example: mobile NEAR/3 health (finds “mobile health,” “health mobile apps”)

Truncation and wildcards:

  • Asterisk (*): Multiple characters (child* finds child, children, childhood)
  • Question mark (?): Single character (wom?n finds woman, women)
  • Dollar sign ($): Zero or one character (colo$r finds color, colour)

Field searching:

  • Title: [ti] or [title]
  • Abstract: [ab] or [abstract]
  • Author: [au] or [author]
  • MeSH terms: [mh] or [mesh]
  • Example: diabetes[ti] AND mobile[ti]

Controlled vocabulary:

MeSH (Medical Subject Headings) in PubMed:

  • Hierarchical structure: Organized by body systems, diseases
  • Major topic: Focus of article indicated by asterisk
  • Subheadings: Specific aspects (therapy, diagnosis, prevention)
  • Example: “Diabetes Mellitus, Type 2”[Mesh] AND “Mobile Applications”[Mesh]
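
If you later need to re-run a documented PubMed search (for example, when updating the review), NCBI’s E-utilities can be queried from a script. A small sketch using Biopython’s Entrez module, assuming Biopython is installed; the query string builds on the MeSH example above:

from Bio import Entrez

# NCBI asks for a contact email with every E-utilities request
Entrez.email = "your.name@example.org"

query = (
    '"Diabetes Mellitus, Type 2"[Mesh] '
    'AND "Mobile Applications"[Mesh] '
    'AND "Randomized Controlled Trial"[Publication Type]'
)

# esearch returns matching PubMed IDs; retmax caps how many IDs come back
handle = Entrez.esearch(db="pubmed", term=query, retmax=200)
result = Entrez.read(handle)
handle.close()

print(f"Total records found: {result['Count']}")
print("First PMIDs:", result["IdList"][:10])

Record the date and result count every time you run the query, exactly as you would for a manual search.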

EMTREE terms in Embase:

  • Similar to MeSH: But different terminology
  • Map terms: Use Embase’s mapping feature
  • Explosion: Include narrower terms

Grey Literature Searching

Why include grey literature:

  • Reduces publication bias
  • Captures recent research
  • Includes negative results
  • Covers policy and practice reports

Grey literature sources:

Trial registries:

  • ClinicalTrials.gov: US-based registry
  • WHO ICTRP: International registry platform
  • EU Clinical Trials Register: European trials

Thesis databases:

  • ProQuest Dissertations & Theses: Comprehensive thesis database
  • EThOS: UK thesis collection
  • National thesis databases: Country-specific collections

Conference proceedings:

  • Conference Papers Index: Multidisciplinary coverage
  • Professional society websites: Field-specific conferences
  • Google Scholar: Often indexes conference abstracts

Government and policy reports:

  • Government websites: National health departments
  • Policy databases: OECD, WHO, professional organizations
  • Think tank reports: Policy research organizations

Preprint servers:

  • medRxiv: Medical preprints
  • PsyArXiv: Psychology preprints
  • SSRN: Social science preprints

Search Documentation Template

Document each search with:

Database information:

  • Database name and platform
  • Date range covered
  • Date of search
  • Interface used

Search strategy:

  • Complete search syntax
  • Number of results
  • Limits applied
  • Time taken

Example documentation:

Database: PubMed/MEDLINE
Platform: National Library of Medicine
Date range: 1946 to present
Search date: March 15, 2024
Interface: PubMed Advanced Search

Search strategy:
#1 "Diabetes Mellitus, Type 2"[Mesh] OR "type 2 diabetes" OR "diabetes mellitus" OR diabetic OR T2DM
#2 "Mobile Applications"[Mesh] OR "mobile health" OR mHealth OR "smartphone app" OR "health app" OR "digital health"
#3 "Randomized Controlled Trial" [Publication Type] OR "randomized controlled trial" OR "randomised controlled trial" OR RCT
#4 #1 AND #2 AND #3

Results: 847 records
Limits: English language, human studies
Time: 45 minutes

Managing Your Systematic Review Data

Reference Management

Choosing reference management software:

Zotero:

  • Pros: Free, excellent web integration, group libraries
  • Cons: Limited advanced features
  • Best for: Collaborative reviews, web-based research

EndNote:

  • Pros: Powerful features, institutional support
  • Cons: Expensive, steep learning curve
  • Best for: Large-scale reviews, advanced users

Mendeley:

  • Pros: Free basic version, PDF annotation
  • Cons: Storage limits, privacy concerns
  • Best for: Individual researchers, PDF management

Reference management workflow:

  1. Import directly: From databases when possible
  2. Organize by source: Separate folders for each database
  3. Remove duplicates: Use software tools and manual review
  4. Tag studies: Mark for different review stages
  5. Backup regularly: Export libraries and sync across devices

Study Selection Management

Screening software options:

Covidence:

  • Features: Automated screening workflow, conflict resolution
  • Pros: User-friendly, integrates with systematic review process
  • Cons: Subscription cost, limited customization
  • Best for: Traditional systematic reviews

Rayyan:

  • Features: AI-assisted screening, mobile app
  • Pros: Free for academic use, machine learning suggestions
  • Cons: Limited data extraction features
  • Best for: Large screening projects, international teams

DistillerSR:

  • Features: Comprehensive review management, custom forms
  • Pros: Highly customizable, full review lifecycle
  • Cons: Expensive, complex setup
  • Best for: Complex reviews, regulatory submissions

Manual screening workflow:

If using spreadsheets:

  1. Create standardized forms: Include all decision criteria
  2. Independent columns: Separate columns for each reviewer
  3. Track conflicts: Highlight disagreements
  4. Document decisions: Include reasons for exclusion
  5. Calculate agreement: Use kappa statistics
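
With a spreadsheet-based workflow, a few lines of pandas can surface conflicts and compute simple agreement (Cohen’s kappa can be calculated as in the earlier screening sketch). This assumes a hypothetical screening.xlsx file with reviewer_1 and reviewer_2 decision columns:

import pandas as pd

# Each row is one record; decisions are "include", "exclude", or "unsure"
df = pd.read_excel("screening.xlsx")

df["conflict"] = df["reviewer_1"] != df["reviewer_2"]
agreement = 1 - df["conflict"].mean()

print(f"Raw agreement: {agreement:.0%}")
print(f"{df['conflict'].sum()} conflicts to resolve by discussion or a third reviewer")

# Export only the conflicting rows for the consensus meeting
df[df["conflict"]].to_excel("conflicts_to_resolve.xlsx", index=False)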

Data Extraction Organization

Spreadsheet setup:

Study identification tab:

  • Study ID, Author, Year, Title
  • Database source, Full-text availability
  • Reviewer assignments, Status tracking

Study characteristics tab:

  • Design, Setting, Country, Sample size
  • Population demographics, Inclusion criteria
  • Intervention details, Comparison condition
  • Outcome measures, Follow-up duration

Results tab:

  • Baseline characteristics, Sample sizes
  • Primary outcome results, Secondary outcomes
  • Effect sizes, Confidence intervals
  • Statistical significance, Missing data

Quality assessment tab:

  • Risk of bias domains, Overall assessment
  • Reviewer judgments, Supporting evidence
  • Conflicts and resolutions

Database management principles:

  • Consistent formatting: Standardize entries
  • Version control: Track changes and updates
  • Backup procedures: Regular saves and cloud storage
  • Access controls: Manage reviewer permissions
  • Quality checks: Regular accuracy verification

Quality Assurance Procedures

Double data extraction:

  • Independent extraction: Two reviewers work separately
  • Comparison process: Systematic comparison of extractions
  • Discrepancy resolution: Structured discussion protocol
  • Documentation: Record all decisions and changes

Accuracy verification:

  • Random checks: Sample of extractions verified
  • Cross-validation: Compare extracted data to original papers
  • Expert review: Subject matter expert reviews key extractions
  • Consistency checks: Look for patterns in data

Missing data protocols:

  • Author contact: Systematic approach to requesting data
  • Multiple attempts: Follow-up procedures
  • Alternative sources: Conference abstracts, related papers
  • Decision rules: When to exclude vs. include with limitations

Conducting Meta-Analysis

When Meta-Analysis is Appropriate

Clinical homogeneity:

  • Similar populations studied
  • Comparable interventions
  • Similar comparison conditions
  • Meaningful outcome measures

Methodological homogeneity:

  • Similar study designs
  • Comparable follow-up periods
  • Similar outcome measurement
  • Adequate data quality

Statistical considerations:

  • Sufficient number of studies (typically ≥3)
  • Quantitative outcome data available
  • Acceptable statistical heterogeneity
  • Appropriate effect measures

Meta-Analysis Software

RevMan (Review Manager):

  • Developer: Cochrane Collaboration
  • Pros: Free, comprehensive, PRISMA-compliant
  • Cons: Limited advanced statistical features compared with R packages
  • Best for: Standard meta-analyses, Cochrane reviews

R packages:

  • meta: Comprehensive meta-analysis functions
  • metafor: Advanced meta-analysis methods
  • Pros: Free, flexible, cutting-edge methods
  • Cons: Programming required, steep learning curve

Comprehensive Meta-Analysis (CMA):

  • Pros: User-friendly interface, publication bias tools
  • Cons: Expensive, limited flexibility
  • Best for: Users preferring point-and-click interface

Stata:

  • Commands: metan, metabias, metafunnel
  • Pros: Powerful statistics software, good graphics
  • Cons: Expensive, statistics background needed

Effect Size Calculation

Continuous outcomes:

Mean Difference (MD):

  • When to use: Same outcome measure across studies
  • Formula: MD = Mean₁ - Mean₂
  • Example: Blood pressure measured in mmHg
  • Interpretation: Absolute difference in original units

Standardized Mean Difference (SMD):

  • When to use: Different scales measuring same construct
  • Formula: SMD = (Mean₁ - Mean₂) / Pooled SD
  • Example: Different depression scales
  • Interpretation: Cohen’s d conventions (0.2 small, 0.5 medium, 0.8 large)

Dichotomous outcomes:

Risk Ratio (RR):

  • Formula: RR = Risk₁ / Risk₂
  • Interpretation: RR = 1 (no effect), RR > 1 (increased risk)
  • Example: RR = 0.75 means 25% reduction in risk

Odds Ratio (OR):

  • Formula: OR = Odds₁ / Odds₂
  • Interpretation: OR = 1 (no effect), OR > 1 (increased odds)
  • Note: Approximates RR when outcomes are rare

Risk Difference (RD):

  • Formula: RD = Risk₁ - Risk₂
  • Interpretation: Absolute difference in risk
  • Example: RD = -0.05 means 5% absolute risk reduction
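
All of these measures can be computed directly from the summary statistics you extract. A minimal sketch of the formulas above in Python (numbers are illustrative; dedicated software such as RevMan or R’s meta/metafor also applies refinements like the Hedges’ g small-sample correction, which this sketch omits):

import math

def mean_difference(m1, m2):
    """MD: absolute difference in the original units."""
    return m1 - m2

def standardized_mean_difference(m1, sd1, n1, m2, sd2, n2):
    """SMD (Cohen's d): mean difference divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

def risk_ratio(events1, total1, events2, total2):
    """RR: risk in group 1 divided by risk in group 2."""
    return (events1 / total1) / (events2 / total2)

def odds_ratio(events1, total1, events2, total2):
    """OR: odds in group 1 divided by odds in group 2."""
    return (events1 / (total1 - events1)) / (events2 / (total2 - events2))

def risk_difference(events1, total1, events2, total2):
    """RD: absolute difference in risk."""
    return events1 / total1 - events2 / total2

# Illustrative data: 15/100 events with the intervention vs. 25/100 with control
print(f"RR  = {risk_ratio(15, 100, 25, 100):.2f}")
print(f"OR  = {odds_ratio(15, 100, 25, 100):.2f}")
print(f"RD  = {risk_difference(15, 100, 25, 100):.2f}")
print(f"MD  = {mean_difference(7.1, 7.6):.2f}")
print(f"SMD = {standardized_mean_difference(7.1, 0.9, 60, 7.6, 1.0, 60):.2f}")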

Forest Plots Interpretation

Key components:

  • Study squares: Effect estimate, size proportional to weight
  • Horizontal lines: Confidence intervals
  • Vertical line: Line of no effect (RR=1, MD=0)
  • Diamond: Overall pooled effect

Reading forest plots:

  • Left of the no-effect line: Usually favors the intervention (when lower values, such as fewer events, mean benefit)
  • Right of the no-effect line: Usually favors the control; check each plot’s labels, since the sides flip for outcomes where higher scores are better
  • Overlapping CIs: Consistent results
  • Non-overlapping CIs: Heterogeneous results

Weight calculation:

  • Inverse variance: Studies with smaller SEs get more weight
  • Fixed effects: Weight = 1/SE²
  • Random effects: Weight accounts for between-study variance
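
The inverse-variance logic is easy to see in a few lines of code. A minimal fixed-effect pooling sketch in Python, working from per-study effect estimates and standard errors (illustrative numbers; use an established package such as R’s meta or metafor for real analyses):

import math

# Illustrative per-study effects (e.g., mean differences) and standard errors
effects = [-0.40, -0.25, -0.55, -0.10]
ses = [0.15, 0.10, 0.20, 0.12]

# Fixed-effect weights: 1 / SE^2, so more precise studies carry more weight
weights = [1 / se**2 for se in ses]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lower, upper = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"Pooled effect = {pooled:.2f} (95% CI {lower:.2f} to {upper:.2f})")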

Heterogeneity Assessment

I² statistic interpretation:

  • 0-40%: Low heterogeneity, pooling likely appropriate
  • 30-60%: Moderate heterogeneity, investigate sources
  • 50-90%: Substantial heterogeneity, consider subgroups
  • 75-100%: Considerable heterogeneity, pooling may be inappropriate
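
Cochran’s Q, I², and tau² all follow from the same inverse-variance weights. A sketch continuing the pooling example above, using the DerSimonian-Laird estimator for tau² (values remain illustrative):

effects = [-0.40, -0.25, -0.55, -0.10]
ses = [0.15, 0.10, 0.20, 0.12]
weights = [1 / se**2 for se in ses]

k = len(effects)
fixed = sum(w * e for w, e in zip(weights, effects)) / sum(weights)

# Cochran's Q: weighted squared deviations from the fixed-effect estimate
Q = sum(w * (e - fixed)**2 for w, e in zip(weights, effects))

# I-squared: share of total variation attributable to between-study heterogeneity
I2 = max(0.0, (Q - (k - 1)) / Q) * 100 if Q > 0 else 0.0

# DerSimonian-Laird tau-squared: estimated between-study variance
C = sum(weights) - sum(w**2 for w in weights) / sum(weights)
tau2 = max(0.0, (Q - (k - 1)) / C)

# Random-effects weights add tau-squared to each study's variance before re-pooling
re_weights = [1 / (se**2 + tau2) for se in ses]
re_pooled = sum(w * e for w, e in zip(re_weights, effects)) / sum(re_weights)

print(f"Q = {Q:.2f}, I² = {I2:.0f}%, tau² = {tau2:.3f}")
print(f"Random-effects pooled effect = {re_pooled:.2f}")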

Sources of heterogeneity:

  • Clinical: Population, intervention, comparator differences
  • Methodological: Study design, risk of bias differences
  • Statistical: Measurement error, sampling variation

Investigating heterogeneity:

Subgroup analysis:

  • Pre-specified groups: Planned in protocol
  • Meaningful divisions: Clinical or methodological rationale
  • Adequate power: Sufficient studies per subgroup
  • Example: Subgroups by age, severity, intervention type

Meta-regression:

  • Continuous variables: Explore relationship with effect size
  • Examples: Year of publication, study quality score
  • Limitations: Observational analysis, multiple testing

Sensitivity analysis:

  • Study quality: Exclude high risk of bias studies
  • Outliers: Remove extreme results
  • Study design: Restrict to specific designs
  • Fixed vs. random: Compare different models

Publication Bias Assessment

Funnel plot analysis:

  • Principle: Plot effect size vs. precision (1/SE)
  • Expected pattern: Symmetric inverted funnel
  • Asymmetry suggests: Publication bias, small-study effects
  • Limitations: Other sources of asymmetry exist

Statistical tests:

Egger’s test:

  • Null hypothesis: No small-study effects
  • Significance: p < 0.05 suggests bias
  • Limitations: Low power with few studies
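
Egger’s test is an ordinary regression of each study’s standardized effect on its precision; an intercept far from zero suggests small-study effects. A sketch with statsmodels on illustrative data (specialist meta-analysis software reports this routinely):

import numpy as np
import statsmodels.api as sm

# Illustrative per-study effects and standard errors
effects = np.array([-0.40, -0.25, -0.55, -0.10, -0.60, -0.35])
ses = np.array([0.15, 0.10, 0.25, 0.12, 0.30, 0.18])

standardized = effects / ses   # effect divided by its standard error
precision = 1 / ses            # larger for more precise studies

X = sm.add_constant(precision)
model = sm.OLS(standardized, X).fit()

intercept, p_value = model.params[0], model.pvalues[0]
print(f"Egger intercept = {intercept:.2f}, p = {p_value:.3f}")

As the bullet above notes, the test has little power with only a handful of studies, so interpret non-significant results cautiously.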

Begg’s test:

  • Rank correlation: Between effect size and variance
  • Less powerful: Than Egger’s test
  • Robust: To outliers

Trim and fill:

  • Method: Imputes missing studies
  • Adjustment: Provides bias-corrected estimate
  • Limitations: Strong assumptions about bias mechanism

Advanced Meta-Analysis Methods

Network meta-analysis:

  • Purpose: Compare multiple interventions
  • Methods: Direct and indirect comparisons
  • Assumptions: Transitivity, consistency
  • Software: netmeta (R), WinBUGS, Stata

Individual participant data (IPD):

  • Gold standard: Access to raw participant data
  • Advantages: Standardized analyses, subgroup exploration
  • Challenges: Data sharing, harmonization
  • Methods: Two-stage, one-stage approaches

Meta-analysis of diagnostic tests:

  • Outcomes: Sensitivity, specificity
  • Methods: Bivariate model, HSROC
  • Software: mada (R), MetaDiSc
  • Challenges: Threshold effects, heterogeneity

Systematic Review Quality Checklist

Pre-Review Planning

Protocol development:

  □ Research question clearly defined using PICO framework
  □ Comprehensive search strategy developed with librarian input
  □ Inclusion/exclusion criteria explicitly stated
  □ Data extraction form designed and pilot tested
  □ Quality assessment approach selected and justified
  □ Statistical analysis plan specified
  □ Protocol registered in PROSPERO before beginning

Team composition:

  □ Lead reviewer with systematic review experience
  □ Second reviewer for independent screening/extraction
  □ Subject matter expert for clinical input
  □ Statistician for meta-analysis (if applicable)
  □ Information specialist for search strategy

Search and Selection

Search comprehensiveness:

  □ Multiple databases searched (minimum 3-4)
  □ Grey literature sources included
  □ Reference lists of included studies checked
  □ Clinical trial registries searched
  □ Expert consultation conducted
  □ No language restrictions (or restrictions justified)
  □ Search strategy peer-reviewed

Study selection process:

  □ Duplicate removal documented
  □ Independent screening by two reviewers
  □ Full-text review conducted independently
  □ Inter-rater reliability calculated and reported
  □ Conflicts resolved through discussion/third reviewer
  □ Reasons for exclusion documented
  □ PRISMA flow diagram completed

Data Extraction and Quality Assessment

Data extraction:

  □ Standardized extraction form used
  □ Pilot testing of extraction form conducted
  □ Independent extraction by two reviewers
  □ Study authors contacted for missing data
  □ Conflicts in extraction resolved systematically
  □ Data extraction accuracy verified

Quality assessment:

  □ Appropriate tool selected for study designs
  □ Independent assessment by two reviewers
  □ Quality assessment training provided
  □ Inter-rater reliability for quality assessment calculated
  □ Risk of bias summary created
  □ Quality assessment influences interpretation

Analysis and Synthesis

Data synthesis:

  □ Synthesis method appropriate for data type
  □ Heterogeneity assessed before pooling
  □ Random effects model used (unless justified otherwise)
  □ Subgroup analyses pre-specified in protocol
  □ Sensitivity analyses conducted
  □ Publication bias assessed
  □ GRADE approach used for evidence certainty

Statistical analysis:

  □ Appropriate effect measures selected
  □ Forest plots clearly labeled and interpretable
  □ Confidence intervals reported
  □ I² statistic calculated and interpreted
  □ Sources of heterogeneity investigated
  □ Meta-regression conducted if appropriate

Reporting and Interpretation

PRISMA compliance:

  □ All 27 PRISMA checklist items addressed
  □ PRISMA flow diagram included
  □ Search strategy fully reported
  □ Characteristics of included studies table provided
  □ Risk of bias assessment results reported
  □ Forest plots and summary statistics provided

Interpretation:

  □ Results interpreted in context of study quality
  □ Limitations clearly acknowledged
  □ Clinical significance discussed
  □ Implications for practice stated
  □ Recommendations for future research provided
  □ Conflicts of interest declared
  □ Funding sources reported

Common Quality Issues

Search limitations:

  • Insufficient databases searched
  • Language restrictions without justification
  • Grey literature not included
  • Search terms too narrow
  • No expert consultation

Selection bias:

  • Single reviewer screening
  • Inadequate inter-rater reliability
  • Post-hoc changes to inclusion criteria
  • Insufficient detail on selection process

Data extraction errors:

  • Single reviewer extraction
  • No verification of accuracy
  • Missing data not addressed
  • Authors not contacted

Analysis problems:

  • Inappropriate pooling despite heterogeneity
  • Fixed effects model without justification
  • No sensitivity analyses
  • Publication bias not assessed
  • Influence of low-quality studies not examined (e.g., through sensitivity analysis)

Reporting deficiencies:

  • PRISMA checklist not followed
  • Search strategy not fully reported
  • Risk of bias assessment inadequate
  • Limitations not acknowledged
  • Conflicts of interest not declared

Advanced Topics and Considerations

Rapid Reviews

When appropriate:

  • Urgent policy decisions needed
  • Limited time available (weeks to months)
  • Sufficient existing evidence
  • Lower precision acceptable

Streamlined methods:

  • Single reviewer screening: With verification sample
  • Limited databases: Focus on highest-yield sources
  • Restricted grey literature: Minimal grey literature searching
  • Simplified quality assessment: Use abbreviated tools
  • Narrative synthesis: Less complex analysis

Quality considerations:

  • Document all methodological shortcuts
  • Acknowledge limitations clearly
  • Consider updating to full review later
  • Use when speed more important than comprehensiveness

Living Systematic Reviews

Concept:

  • Continually updated systematic reviews
  • Regular surveillance for new evidence
  • Periodic updates when warranted
  • Maintain currency of evidence

Implementation:

  • Search automation: Regular database searches
  • Streamlined processes: Efficient screening and updating
  • Threshold for updates: Criteria for when to update
  • Version control: Clear versioning and change documentation

Challenges:

  • Resource intensive
  • Technology requirements
  • Methodological consistency
  • Reader notification systems

Systematic Reviews of Complex Interventions

Challenges:

  • Multiple components: Interventions have many parts
  • Implementation variation: Different ways of delivering
  • Context dependency: Effectiveness varies by setting
  • Mechanism complexity: Multiple pathways to outcomes

Approaches:

  • Logic models: Map intervention components to outcomes
  • Process evaluation: Include implementation data
  • Mixed methods synthesis: Combine quantitative and qualitative evidence
  • Realist synthesis: Focus on what works for whom in what circumstances

Equity and Social Justice Considerations

PROGRESS-Plus framework:

  • Place of residence: Rural vs. urban
  • Race/ethnicity: Racial and ethnic minorities
  • Occupation: Employment status and type
  • Gender/sex: Gender identity and biological sex
  • Religion: Religious affiliation
  • Education: Educational attainment
  • Socioeconomic status: Income, wealth
  • Social capital: Social networks and support
  • Plus: Additional factors (disability, sexual orientation, age)

Equity-focused methods:

  • Search for studies in diverse populations
  • Extract data on equity characteristics
  • Conduct subgroup analyses by equity factors
  • Consider applicability across populations
  • Discuss equity implications in conclusions

Troubleshooting Common Problems

Search Problems

Problem: Too few results

Solutions:

  • Broaden search terms (use more synonyms)
  • Remove unnecessary limits
  • Check spelling and syntax
  • Search additional databases
  • Include grey literature sources

Problem: Too many results

Solutions:

  • Add more specific terms
  • Use field searching (title, abstract)
  • Apply appropriate limits (study design, publication type)
  • Consider systematic search strategy refinement

Problem: Inconsistent results across databases

Solutions:

  • Adapt search strategy for each database
  • Use database-specific controlled vocabulary
  • Check for platform differences
  • Document variations in search approach

Screening Challenges

Problem: Low inter-rater reliability

Solutions:

  • Clarify inclusion/exclusion criteria
  • Provide additional reviewer training
  • Conduct calibration exercises
  • Discuss difficult cases as team
  • Consider third reviewer for persistent disagreements

Problem: Unclear eligibility

Solutions:

  • Develop detailed decision rules
  • Create decision tree for common scenarios
  • When in doubt, include for full-text review
  • Document rationale for decisions
  • Contact authors for clarification

Data Extraction Issues

Problem: Missing data

Solutions:

  • Contact study authors with specific requests
  • Check supplementary materials
  • Look for related publications
  • Consider excluding if critical data missing
  • Use multiple imputation if appropriate

Problem: Inconsistent reporting

Solutions:

  • Standardize data extraction forms
  • Create decision rules for common problems
  • Extract data as reported in original study
  • Note inconsistencies and limitations
  • Consider contacting authors

Analysis Difficulties

Problem: High heterogeneity

Solutions:

  • Investigate sources through subgroup analysis
  • Consider narrative synthesis instead of meta-analysis
  • Use random effects model
  • Explore clinical and methodological diversity
  • Present results with appropriate caveats

Problem: Few studies

Solutions:

  • Consider broadening inclusion criteria
  • Conduct narrative synthesis
  • Acknowledge limitations clearly
  • Discuss implications for future research
  • Avoid meta-analysis with <3 studies

Problem: Publication bias

Solutions:

  • Search grey literature thoroughly
  • Contact experts and authors
  • Search trial registries for unpublished and ongoing studies
  • Use statistical tests and plots
  • Discuss impact on conclusions

Systematic Review Tools and Resources

Essential Checklists and Guidelines

PRISMA 2020 Checklist: Comprehensive 27-item checklist for systematic review reporting. Download: [PRISMA Statement website]

PRISMA-P Protocol Checklist: Specific guidance for systematic review protocols. Download: [PRISMA-P Statement]

Risk of Bias Tools:

  • RoB 2: For randomized trials
  • ROBINS-I: For non-randomized studies
  • QUADAS-2: For diagnostic accuracy studies

Download: [Cochrane Methods website]

Software and Platforms

Free options:

  • RevMan: Cochrane’s review management software
  • R packages: meta, metafor, netmeta
  • Rayyan: AI-assisted screening (free for academics)
  • Zotero: Reference management

Commercial options:

  • Covidence: Comprehensive review platform
  • DistillerSR: Advanced review management
  • EndNote: Reference management
  • Comprehensive Meta-Analysis: User-friendly meta-analysis

Training Resources

Cochrane Training:

  • Interactive learning modules
  • Self-paced online courses
  • Webinar series
  • Hands-on workshops

University courses:

  • Systematic review methodology courses
  • Evidence-based practice programs
  • Research methods training
  • Statistics for meta-analysis

Professional development:

  • Campbell Collaboration training
  • AHRQ Evidence-based Practice Centers
  • Professional society workshops
  • Conference training sessions

Reporting Templates

PRISMA Flow Diagram Template: Standardized flow chart for study selection process. Format: PowerPoint, Word

Data Extraction Form Template: Comprehensive form covering all key data elements. Format: Excel, Word

Quality Assessment Summary: Template for presenting risk of bias assessments. Format: Excel

Search Strategy Documentation: Template for recording search strategies and results. Format: Word

Write Systematic Reviews That Change Things

Systematic reviews are the best evidence we have for making decisions. But they only work if they’re done right.

Systematic reviews are hard work, but they’re worth it. Start with a focused question, write a detailed plan, and stick to proven methods. Do each step well and you’ll get results people actually use.

Get AI Help Without Compromising Quality

Fynman speeds up your systematic review while keeping it rigorous.

Find Studies Faster

Quickly identify relevant research across databases. Get smart suggestions for studies you might have missed.

Screen Smarter, Not Harder

Speed up title/abstract screening with AI suggestions. You still make the final decisions.

Organize and Synthesize Better

Organize your findings, spot patterns, and draft sections with AI support. Faster reviews that don’t cut corners.