A well-run scoping review usually needs 6–12 months from protocol to submission, depending on scope, team size, and search complexity.
This guide sets clear expectations for the time involved, the steps, and the levers that shorten or extend the calendar. You’ll see a phase-by-phase plan, common bottlenecks, and sample schedules for solo, small, and larger teams.
What A Scoping Review Involves From Start To Finish
Most teams follow the Arksey–O’Malley stages with updates by Levac. Reporting uses PRISMA-ScR, and many groups adopt JBI methods for protocol design and stakeholder input. That structure shapes the calendar more than any single tool or database.
Typical Phases And Time Ranges
Here’s a broad view of the phases. The table condenses the moving parts into time ranges typical of well-run projects.
| Stage | Typical Time | What You Do |
|---|---|---|
| Scope & Protocol | 3–6 weeks | Refine PCC, draft aims, set eligibility, plan sources, register protocol. |
| Search Strategy | 3–6 weeks | Build concepts with a librarian, peer review searches, pilot, finalize strings. |
| Searching & Harvest | 2–6 weeks | Run databases and grey sources, export, de-duplicate, set up screening. |
| Title/Abstract Screening | 2–8 weeks | Dual screen with calibration, resolve conflicts, track reasons. |
| Full-Text Screening | 3–10 weeks | Retrieve PDFs, dual assess, log exclusions, contact authors if needed. |
| Data Charting | 4–10 weeks | Design charting form, pilot, extract variables, check inter-rater alignment. |
| Synthesis & Maps | 3–8 weeks | Group evidence, build tables and figures, note gaps and clusters. |
| Write-Up | 4–8 weeks | Draft per PRISMA-ScR, refine visuals, edit, prepare submission package. |
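As a rough sanity check, you can total the week ranges above to see why the overall calendar usually lands near the 6–12 month band once phases overlap. The short Python sketch below is illustrative only; the phase names and week ranges simply mirror the table.

```python
# Rough calendar estimate from the phase table above (illustrative only).
# Phases overlap in practice, so the raw totals overstate the end-to-end time.
PHASES = {
    "Scope & Protocol":          (3, 6),
    "Search Strategy":           (3, 6),
    "Searching & Harvest":       (2, 6),
    "Title/Abstract Screening":  (2, 8),
    "Full-Text Screening":       (3, 10),
    "Data Charting":             (4, 10),
    "Synthesis & Maps":          (3, 8),
    "Write-Up":                  (4, 8),
}

low_weeks = sum(lo for lo, _ in PHASES.values())    # 24 weeks
high_weeks = sum(hi for _, hi in PHASES.values())   # 62 weeks
WEEKS_PER_MONTH = 4.35

print(f"Sequential total: {low_weeks}-{high_weeks} weeks "
      f"(~{low_weeks / WEEKS_PER_MONTH:.0f}-{high_weeks / WEEKS_PER_MONTH:.0f} months)")
# With phases overlapping (e.g., charting starts while full-text screening
# finishes), most teams land inside the 6-12 month band.
```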
Anchoring to published guidance helps. The PRISMA-ScR checklist sets reporting items, and JBI’s manual offers methods and tips. Both match what peer reviewers expect.
Why The Calendar Swings Between 6 And 12 Months
Three drivers move the schedule: scope breadth, team bandwidth, and search reach. Stretch any of those and the calendar expands; trim them to deliver faster without cutting quality.
Scope Breadth
A broad question, many populations, or multiple outcomes multiplies screening and charting time. Narrowing the question with a tight PCC statement keeps the record count in check and reduces round-trips during calibration.
Team Bandwidth
Dual screening and charting need two trained reviewers. A third team member eases conflict resolution and covers absences. Training time pays back through fewer errors, faster consensus, and cleaner audit trails.
Search Reach
Adding extra databases and deep grey searches increases yield. That helps mapping but inflates de-duplication, screening, and PDF retrieval. A librarian-led approach keeps strings precise so you harvest signal, not noise.
Evidence From Guidance And Handbooks
Several respected sources point to month-scale timelines. A 2021 overview of methods notes that projects can take many months end-to-end. Multiple university guides place the range near 6–12 months, with some teams needing a full year when staffing is lean. By comparison, full systematic reviews often run 12–18 months or more, which helps set expectations for mapping work that stops short of meta-analysis.
Two links worth bookmarking: the JBI chapter on scoping reviews and the PRISMA-ScR paper. Both outline the steps and reporting items that shape real-world schedules.
Practical Schedules For Different Team Sizes
Use these sample plans to size your project. They assume a well-defined topic, access to a health sciences librarian, and standard tools for screening and charting.
Solo Researcher With Help From A Librarian
Expect a longer calendar, since every step lands on one person. Block time deliberately: two short windows each week for screening and one longer window for charting. Leave slack for protocol edits and search peer review.
Two Reviewers Plus A Librarian
This setup fits many graduate projects. Parallel screening halves the longest step. One person can lead charting while the other cleans data, checks extraction, and drafts figures.
Four-Person Team
With two pairs, you can split records, finish calibration faster, and move to charting sooner. One member manages logs and the flow diagram. Another owns searches and updates.
Sample Calendar Blocks
These windows reflect typical projects and work as a baseline for Gantt planning.
| Scenario | Team Size | Expected Duration |
|---|---|---|
| Narrow clinical topic, 4–6 databases | 2–3 | 6–8 months |
| Cross-sector topic, broad grey search | 3–4 | 9–12 months |
| Rapid funding timeline, tight scope | 3–4 | 3–5 months |
| Solo graduate project, part-time | 1 + librarian | 9–14 months |
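If you plan in a spreadsheet or Gantt tool, it can help to turn phase durations into concrete calendar dates. The sketch below is one way to do that, assuming a chosen start date and mid-range weekly durations (both are placeholders to swap for your own scenario) and modeling no overlap between phases.

```python
from datetime import date, timedelta

# Hypothetical baseline: mid-range duration (in weeks) for each phase.
PHASES = [
    ("Scope & Protocol", 4),
    ("Search Strategy", 4),
    ("Searching & Harvest", 3),
    ("Title/Abstract Screening", 5),
    ("Full-Text Screening", 6),
    ("Data Charting", 7),
    ("Synthesis & Maps", 5),
    ("Write-Up", 6),
]

start = date(2025, 1, 6)  # example project start; swap in your own
for name, weeks in PHASES:
    end = start + timedelta(weeks=weeks)
    print(f"{name:<26} {start.isoformat()} -> {end.isoformat()}")
    start = end  # next phase begins when this one ends (no overlap modeled)
```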
Ways To Shorten The Timeline Without Cutting Quality
Lock The Question Early
Draft a crisp PCC statement and test it against sample records. Write clear inclusion and exclusion lines, then run a small pilot to confirm that two people apply the rules the same way.
Co-Design Searches With A Librarian
Use concept blocks, controlled vocabulary, and text words. Ask for a PRESS-style peer review of the strings, then keep a dated search log so updates are painless. Save the final strings in the protocol and the project repository for later audits.
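To make the concept-block idea concrete, here is a minimal sketch of how blocks combine into a Boolean string: OR within each block, AND across blocks. The terms and PCC labels are hypothetical placeholders; real strings should be built and peer reviewed with a librarian, and syntax varies by database.

```python
# Illustrative only: how concept blocks combine into a Boolean search string.
# Terms below are hypothetical placeholders, not a validated strategy.
CONCEPT_BLOCKS = {
    "population": ['"older adults"', "elderly", "geriatric*"],
    "concept":    ['"remote monitoring"', "telehealth", "telemonitoring"],
    "context":    ['"primary care"', '"community care"'],
}

# OR within each block, AND across blocks.
query = " AND ".join(
    "(" + " OR ".join(terms) + ")" for terms in CONCEPT_BLOCKS.values()
)
print(query)
# ("older adults" OR elderly OR geriatric*) AND ("remote monitoring" OR ...) AND ...
```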
Automate The Repetitive Bits
De-duplication, PDF retrieval, and reference checking can eat days. A modern screening tool with audit trails speeds calibration and helps with PRISMA-ScR flow counts.
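Most screening tools handle de-duplication on import, but a quick pass over your exports can catch obvious duplicates first. A minimal sketch, assuming your references sit in a CSV with `title` and `doi` columns (the file name and column names are assumptions):

```python
import csv

def normalize(text: str) -> str:
    """Lowercase and strip punctuation/whitespace so near-identical titles match."""
    return "".join(ch for ch in text.lower() if ch.isalnum())

seen, unique = set(), []
with open("exported_records.csv", newline="", encoding="utf-8") as f:  # hypothetical export
    for row in csv.DictReader(f):
        # Prefer the DOI as a key; fall back to a normalized title.
        key = (row.get("doi") or "").strip().lower() or normalize(row.get("title") or "")
        if key and key in seen:
            continue  # drop duplicate record
        seen.add(key)
        unique.append(row)

print(f"Kept {len(unique)} unique records")
```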
Standardize Charting
Design the form before screening finishes. Pilot on 10–20 papers, fix fields that cause confusion, and set clear rules for how to code study type, population, and outcomes.
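One low-tech way to standardize the form is to define the fields and any controlled options up front, so both extractors chart against the same structure and the pilot quickly surfaces confusing fields. A sketch with hypothetical field names and options:

```python
# Hypothetical charting form: field names and, where useful, controlled options.
CHARTING_FORM = {
    "study_id": None,          # free text, e.g. first author + year
    "country": None,
    "study_design": ["RCT", "cohort", "cross-sectional", "qualitative", "other"],
    "population": None,
    "concept_measured": None,
    "key_findings": None,
    "notes_for_synthesis": None,
}

def validate(record: dict) -> list[str]:
    """Return problems for one charted record, to review during the 10-20 paper pilot."""
    problems = [f"missing: {field}" for field in CHARTING_FORM if field not in record]
    for field, options in CHARTING_FORM.items():
        if options and field in record and record[field] not in options:
            problems.append(f"{field} must be one of {options}")
    return problems
```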
Set A Cadence
Weekly stand-ups keep work moving. Use short agendas: blockers, numbers screened, open conflicts, and next targets. Short meetings save hours of back-and-forth by email.
Common Bottlenecks And Fixes
PDF Retrieval Lags
Automate requests through library tools, set an email template for author contact, and track pending items. Move forward with charting on what you have while the queue clears.
Scope Creep During Screening
Freeze the protocol after the pilot. If a new concept truly belongs, log an amendment with the rationale and date so the audit trail stays clean.
Slow Consensus
Define tiebreak rules up front. Use short comment fields in the screening tool to capture reasons. Meet in quick bursts to resolve thorny conflicts instead of endless message threads.
Too Many Irrelevant Hits
Revisit the strings with your librarian. Add proximity operators, trim over-broad text words, and adjust filters that block known-relevant studies.
When A Mapping Review Takes Longer
Some topics swell the record count: multi-disciplinary areas, fast-moving fields with new preprints, and broad concepts without tight vocabulary. Multi-language inclusion and stakeholder consultations extend charting and write-up time. Plan for these from the outset.
Quality Signals Reviewers Look For
Editors and peer reviewers scan for a few markers: a registered or public protocol, a transparent search log, dual processes for screening and charting, and flow counts that match the final tables. Aligning to PRISMA-ScR and JBI helps hit those marks and speeds peer review.
Quick Planning Checklist
- Write a tight PCC and freeze it after a pilot.
- Book a meeting with a librarian before you draft strings.
- Pick one screening tool and set fields before import.
- Run dual screening with calibration and a tiebreak plan.
- Design the charting form early and pilot on live records.
- Schedule weekly stand-ups and a mid-project health check.
- Draft figures while charting wraps to keep momentum.
- Use the PRISMA-ScR checklist to structure the manuscript.
Bottom Line On Timelines
A careful mapping project often lands near 6–12 months. Small groups with tight questions can deliver in under half a year with sharp planning. Broader topics can run a year or a bit more, while full systematic reviews often need 12–18 months.