How To Do A Scoping Review In Medicine | Step By Step

In short: define a clear question, search widely, chart the data, and report with PRISMA-ScR, working from a public protocol.

Scoping reviews map what exists, show where knowledge is clustered, and signal gaps that merit deeper study or a later systematic review. In medicine, teams use this approach to size a field, clarify terms, catalog study designs, and surface outcomes patients and clinicians care about. The process is structured yet flexible, which makes it ideal when concepts are broad or heterogeneous.

Core Stages At A Glance

Stage | What You Do | Practical Tips
Set the purpose and questions | State the purpose, draft primary and secondary questions, and anchor them with the PCC lens: Population, Concept, Context. | Write questions that a database can answer. Keep verbs neutral: “map,” “identify,” “describe.”
Write the protocol | Record aims, eligibility, sources to search, draft search strings, screening rules, data items, and plans for synthesis and visuals. | Post the protocol on OSF and time-stamp it. Keep a change log as you refine methods.
Assemble the team | Assign a lead, a search specialist, two independent screeners, and a third reviewer to resolve ties. | Pilot core steps together. Hold short calibration sessions before screening and charting.
Define eligibility | Spell out in-scope populations, concepts, settings, designs, years, languages, and source types; list exclusions as well. | Turn vague rules into if/then statements. Pretest rules on 30–50 records.
Design the search | List databases and grey sources; combine controlled vocabulary with free-text synonyms; set date limits only when justified. | Ask a health sciences librarian to peer-review the strings. Save every run and its date.
Run searches and deduplicate | Export results from each source, remove duplicates, and store the master set with a distinct ID for each record. | Capture counts by source and date. Keep the raw exports for audit.
Screen titles and abstracts | Two reviewers screen independently, marking include, exclude, or maybe; meet to resolve conflicts. | Use a pilot of 100 records to set thresholds. Track reasons for exclusion.
Screen full texts | Retrieve PDFs, apply the same rules, and record a single, specific reason for exclusion per item. | Log missing or inaccessible texts; try interlibrary loan before discarding.
Chart the data | Create a form, pilot it on 10–20 studies, and refine; chart study traits, methods, outcomes, and central concepts. | Treat the form as iterative; document each revision with a date and reason.
Synthesize and map | Summarize counts and ranges, group by themes, and build tables or simple graphics that reflect the questions. | Use simple graphics with clear captions. Let the questions dictate the displays.
Stakeholder input (optional) | Invite patients, clinicians, or decision-makers to react to scope, gaps, and outputs. | Schedule one short session early and one near the end. Record insights separately from results.
Report with PRISMA-ScR | Describe methods, present a flow diagram, and follow the checklist so readers can repeat your process. | Add appendices: full search strings, the charting form, and a list of excluded full texts with reasons.

Doing A Scoping Review In Medicine: Stepwise Flow

Set The Purpose And Questions

Start by stating why the review is needed. Use the PCC lens to frame the scope. Population sets who or what the evidence concerns. Concept sets the core idea or intervention. Context sets settings or circumstances. Write one primary question that matches the core purpose and add a short list of secondary questions that guide subgroup mapping or outcome lists.

Draft A Protocol

Write a protocol before the first database run. Include objectives, questions, eligibility rules, sources of evidence, draft search strings, screening process, data items, and plans for synthesis and visuals. Register or share the protocol on OSF so readers can see what changed over time. Use plain dates for version control and keep a change log in an appendix.

State any protocol updates in the final paper and in the repository record. Give dates for each change.

Build The Team And Workflow

Name a lead for decisions, a search specialist to build and test strings, two screeners, and a third reviewer for ties. Set up a shared folder with a master spreadsheet, a log of searches, and templates for screening and charting. Date entries. Label tables.

Choose Eligibility Criteria

Translate the PCC into concrete rules. List in-scope study designs and source types: trials, cohort studies, case series, qualitative studies, guidelines, or reports. Set the date range and language rules only when they serve the purpose. Pretest the rules on a small batch of records. Rewrite any rule that still needs debate after the pilot.
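
One way to make the if/then framing concrete is to write the pilot rules as a short script that both screeners read the same way. The sketch below is illustrative only; the field names, the date limit, and the excluded designs are hypothetical placeholders, not recommendations.

    # Illustrative eligibility rules expressed as if/then checks.
    # Field names, the date limit, and the design list are placeholders; adapt them to your protocol.

    def check_eligibility(record: dict) -> tuple[bool, str]:
        """Return (eligible, reason) for one record from the pilot batch."""
        if record.get("year") is not None and record["year"] < 2010:
            return False, "published before date limit"
        if record.get("design") in {"editorial", "letter", "commentary"}:
            return False, "not an eligible source type"
        if not record.get("population_in_scope", False):
            return False, "wrong population"
        if not record.get("concept_in_scope", False):
            return False, "wrong concept"
        return True, "meets all criteria"

    # Pretest the rules on a pilot record.
    pilot = {"year": 2016, "design": "cohort", "population_in_scope": True, "concept_in_scope": True}
    print(check_eligibility(pilot))  # (True, 'meets all criteria')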

Design The Search

Pick databases that fit the scope: MEDLINE via PubMed or Ovid, Embase, CINAHL, PsycINFO, Web of Science, and subject-specific databases. Add grey sources such as conference proceedings, theses, major society sites, and trial registries. Blend controlled vocabulary with free-text synonyms for each concept. Add truncation and proximity where the platform allows. Document every string and the date it ran.
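
A worked fragment can help while drafting strings. Below is a minimal Python sketch, assuming the Biopython package and a hypothetical telehealth-in-heart-failure topic; the terms are illustrative rather than a validated strategy, and the string still needs adapting to each platform you search.

    from datetime import date
    from Bio import Entrez  # Biopython; assumed to be installed

    Entrez.email = "you@example.org"  # NCBI asks for a contact address

    # One block per concept, blending controlled vocabulary with free-text synonyms and truncation.
    # The terms below are illustrative placeholders, not a validated strategy.
    population = '("heart failure"[MeSH Terms] OR "heart failure"[Title/Abstract])'
    concept = ('("Telemedicine"[MeSH Terms] OR telehealth[Title/Abstract] '
               'OR "remote monitoring"[Title/Abstract] OR telemonitor*[Title/Abstract])')
    query = f"{population} AND {concept}"

    handle = Entrez.esearch(db="pubmed", term=query, retmax=0)
    count = Entrez.read(handle)["Count"]
    handle.close()

    # Log the platform, date, hit count, and string for the search appendix.
    print(f"PubMed | {date.today().isoformat()} | {count} hits")
    print(query)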

Run And Log The Searches

Export each set as RIS or XML, then merge and deduplicate. Give every record a distinct ID. Save a one-page search log in the appendix with the database platform, date, limits, and total hits for each run.
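
Reference managers handle most deduplication, but a scripted pass makes the master set reproducible and assigns the distinct IDs. A minimal Python sketch, assuming the merged exports have already been converted to one CSV with title, doi, source, and year columns; the file and column names are placeholders.

    import csv

    def norm_title(title: str) -> str:
        """Lowercase and strip punctuation so near-identical titles match."""
        return "".join(ch for ch in title.lower() if ch.isalnum())

    seen, master = set(), []
    with open("merged_exports.csv", newline="", encoding="utf-8") as f:  # placeholder file name
        for row in csv.DictReader(f):
            # Prefer the DOI as the duplicate key; fall back to a normalized title.
            key = (row.get("doi") or "").lower().strip() or norm_title(row.get("title", ""))
            if key and key in seen:
                continue  # duplicate: keep the first occurrence only
            seen.add(key)
            row["record_id"] = f"R{len(master) + 1:05d}"  # distinct ID for each record
            master.append(row)

    with open("master_set.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(master[0].keys()))
        writer.writeheader()
        writer.writerows(master)

    print(f"{len(master)} unique records kept")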

Screen Titles And Abstracts

Two reviewers screen independently with a short training round first. Use clear buttons: include, exclude, maybe. Meet to resolve conflicts and update rules if patterns emerge. If record numbers are huge, sample the maybes with a rule you set in advance.
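
If a number helps during calibration, Cohen's kappa on the two reviewers' pilot decisions is a common check of chance-corrected agreement. A small self-contained sketch; the decision lists are made up for illustration.

    from collections import Counter

    def cohens_kappa(rater_a: list[str], rater_b: list[str]) -> float:
        """Chance-corrected agreement between two screeners on the same records."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        expected = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(rater_a) | set(rater_b))
        return (observed - expected) / (1 - expected)

    # A toy pilot of ten records; "include", "exclude", "maybe" are the three buttons.
    a = ["include", "exclude", "maybe", "exclude", "include", "exclude", "exclude", "maybe", "include", "exclude"]
    b = ["include", "exclude", "exclude", "exclude", "include", "exclude", "maybe", "maybe", "include", "exclude"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.68 for this toy example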

Screen Full Texts

Pull PDFs, apply the same rules, and record one specific reason for exclusion per item: wrong population, wrong concept, not a study, or not in scope. When a PDF is missing, try interlibrary loan, author contact, or alternate hosts before marking it unavailable. Keep a list of all excluded full texts and reasons in an appendix.

Chart The Data

Build a form that matches the questions. Typical fields include citation details, country, setting, design, population traits, concept terms, comparators if any, outcomes, time frame, and notes on measurement. Pilot on a small set, compare entries, and refine the form. Allow “not reported” as an option to avoid guesswork. Capture direct text where exact phrasing matters.

Synthesize And Map

Summarize counts and ranges in clear tables. Group studies by design, setting, population, or concept strands. Describe patterns: rising output by year, gaps in regions, common outcomes, or measures that vary. Use simple graphics with clear captions. Avoid effect estimates; the goal is breadth, not pooled results.
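
When the charting form lives in a spreadsheet, the counts and a simple figure can come straight from it. A minimal sketch with pandas and matplotlib, assuming a charting.csv with one row per included study and year and design columns; the names are placeholders.

    import pandas as pd
    import matplotlib.pyplot as plt

    charted = pd.read_csv("charting.csv")  # placeholder: one row per included study

    # Counts that typically anchor the map: output by year and by study design.
    by_year = charted["year"].value_counts().sort_index()
    by_design = charted["design"].value_counts()
    print(by_design.to_string())

    # One simple graphic with a clear caption beats a dense dashboard.
    ax = by_year.plot(kind="bar")
    ax.set_xlabel("Publication year")
    ax.set_ylabel("Number of included studies")
    ax.set_title("Included studies by year")
    plt.tight_layout()
    plt.savefig("studies_by_year.png", dpi=300)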

Invite Stakeholder Input

Plan one check-in early and one near the end. Invite patient partners, clinicians, or policy leads who will use the map. Share a one-page brief with the purpose and draft outputs. Ask what would make the results more useful. Record feedback in a separate section and explain how it shaped the displays or next steps.

Report With PRISMA-ScR

Write methods with enough detail for repeatability. Use the PRISMA-ScR checklist and a flow diagram that shows counts at each stage: found, deduplicated, screened, full-text assessed, included, and excluded with reasons. Place full search strings, the charting form, and the list of excluded full texts in appendices. State any language or date limits and why they were set.
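
The flow-diagram numbers should be traceable to the logs rather than reconstructed from memory. A small sketch, assuming the screening log is one CSV row per deduplicated record with a decision column per stage; the file, column, and label names are placeholders.

    import csv
    from collections import Counter

    # Placeholder log: record_id, ti_ab_decision, full_text_decision (blank if never retrieved).
    with open("screening_log.csv", newline="", encoding="utf-8") as f:
        records = list(csv.DictReader(f))

    ti_ab = Counter(r["ti_ab_decision"] for r in records)
    full_text = Counter(r["full_text_decision"] for r in records if r["full_text_decision"])

    print(f"Records screened (after deduplication): {len(records)}")
    print(f"Excluded at title/abstract:             {ti_ab['exclude']}")
    print(f"Full texts assessed:                    {sum(full_text.values())}")
    print(f"Full texts excluded, with reasons:      {full_text['exclude']}")
    print(f"Sources included in the review:         {full_text['include']}")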

Methods That Meet PRISMA-ScR And JBI

Two anchors keep scoping reviews on track. PRISMA-ScR sets out what to report so readers can follow every step. The scoping reviews chapter of the JBI manual sets out how to plan and run each stage, including the PCC frame, protocol structure, study selection, charting, and presentation of results. Use both: report completely against PRISMA-ScR and plan with the JBI steps so methods match intent.

Search Strategy And Sources

Pick sources that mirror where clinicians and researchers publish on the topic. Pair biomedical databases with any that match allied fields. For grey literature, scan major organization sites, trial registries, theses, conference proceedings, and guideline portals. Build one master string per concept and adapt it to each platform. Keep a versioned copy of each string. Before launch, ask a librarian to peer-review the main strings. Rerun searches just before submission to catch the newest records.

Data Charting Fields That Help Clinicians

A good form captures features clinicians ask about when they scan results. Keep items lean and aligned with the questions. Use standardized labels when possible to make tables consistent across studies.

Item | Why It Matters | Example Entry
Population and sample size | Shows who was studied and at what scale. | Adults with heart failure, n=312
Setting and country | Shows where care happens and context. | Outpatient clinics, Japan
Design and period | Shows how evidence was gathered and when. | Cross-sectional survey, 2018–2023
Concept keywords | Ties the study to the map. | Telehealth, remote monitoring
Outcomes and measures | Shows what was captured and how. | Hospitalizations per 1000 patient-years
Main findings as stated | Preserves the author’s own words. | “Remote monitoring cut alerts by half.”
Notes and caveats | Flags quirks readers should know. | Outcome measured only in a subset
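
These fields translate directly into a charting template that every reviewer fills in the same way. A minimal sketch that writes the template with one example row drawn from the table above; the file and column names are placeholders, and “not reported” stays an allowed entry so no one has to guess.

    import csv

    # Charting fields drawn from the table above, plus a housekeeping ID; names are placeholders.
    fields = [
        "record_id", "citation", "country", "setting", "design", "period",
        "population_and_sample_size", "concept_keywords",
        "outcomes_and_measures", "main_findings_as_stated", "notes_and_caveats",
    ]

    with open("charting_form.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        writer.writerow({
            "record_id": "R00001",
            "citation": "Author A, 2021",  # hypothetical citation
            "country": "Japan",
            "setting": "Outpatient clinics",
            "design": "Cross-sectional survey",
            "period": "2018-2023",
            "population_and_sample_size": "Adults with heart failure, n=312",
            "concept_keywords": "Telehealth; remote monitoring",
            "outcomes_and_measures": "Hospitalizations per 1000 patient-years",
            "main_findings_as_stated": "not reported",
            "notes_and_caveats": "Outcome measured only in a subset",
        })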

Scoping Review In Medical Research: Common Pitfalls

Vague Questions

Vague aims bloat searches and dilute relevance. Write one clear primary question and a small set of secondary questions that steer subgroup tables.

Too Narrow Or Too Broad Scope

A scope that is too narrow yields thin maps; too broad yields unmanageable sets. Run a few scoping searches during protocol drafting to size the field and adjust the PCC and date span.

Unclear Inclusion Rules

Ambiguous rules trigger long disputes during screening. Rewrite rules as if/then lines and pretest on a sample. Add examples of include and exclude cases to the protocol.

One-Screener Decisions

Single-reviewer screening invites missed records. Use two reviewers for both stages, with a quick calibration before each. Keep a tie-breaker plan in writing.

Skipping The Protocol

Without a protocol, mid-stream changes can distort selection and charting. Post a protocol on OSF, cite its link in the paper, and keep a change log to show what shifted and why.

Treating Scoping Like A Systematic Review

Scoping reviews map evidence; they don’t pool effect sizes. Risk-of-bias appraisal isn’t required unless your purpose needs it. If you include an appraisal, explain the tool and how its results will be used.

Missing Grey Literature

Conference abstracts, preprints, theses, and agency reports often contain early signals and rare populations. Plan a pragmatic sweep of high-yield sources and record the paths you used.

Poor Reporting Without PRISMA-ScR

Missing flow numbers or search strings makes a review hard to trust. Use the checklist to avoid gaps and place full methods in appendices.

Ambiguous Data Charting

If two reviewers chart the same paper and disagree, the form likely needs clearer labels. Add definitions and examples inside the form and recode early entries if revisions change meanings.

Writing Up For Clinicians And Researchers

Keep the title, abstract, and section headings plain and descriptive. State the purpose, scope, and main patterns early. Use tables for counts and ranges, short paragraphs for themes, and figures where a picture saves space.

Handy Tools And Files

Simple tools are enough. Use a reference manager for deduplication. Use a spreadsheet for screening logs, full-text decisions, and charting.

Ethics, Quality, And Transparency

Most scoping reviews draw on published sources and don’t require ethics board review. When the project includes interviews or surveys for input, seek local advice. State funding and any ties. State limits candidly: databases not searched, languages excluded, or gaps in reporting you observed. Share the charting form and de-identified data in a public repository when you can.

Fast Starter Checklist

  • A clear purpose anchored by PCC.
  • A protocol with a link, a date, and a change log.
  • A librarian-reviewed search across core databases and grey sources.
  • Two-reviewer selection with logs and a flow diagram.
  • A piloted charting form and transparent synthesis.
  • Reporting against PRISMA-ScR with appendices that show your work.

Short Glossary

PCC: A lens for setting scope using Population, Concept, and Context.
Charting: Structured data extraction suited to scoping reviews.
Grey literature: Non-indexed or non-traditional sources such as theses, conference proceedings, or agency reports.
Stakeholder input: A planned touchpoint with end users that can improve relevance.

For reporting, see the PRISMA-ScR checklist. For methods, read the JBI Manual scoping reviews chapter.