Most systematic reviews search at least 2–3 bibliographic databases plus trial registries and grey literature to reduce missed studies.
What Counts As A Database In A Systematic Review
A database is any indexed source you can query in a structured way to find eligible studies. Classic picks include MEDLINE via PubMed or Ovid, Embase, and CENTRAL. Others serve fields such as nursing, behavioral science, education, engineering, or economics. Citation indexes, trial registries, and thesis repositories also qualify. The goal is reach, not brand loyalty; choose the mix that maps to your topic and study designs.
How Many Databases For A Systematic Review: Realistic Ranges
There is no magic number that fits every project. In health topics, three core sources span a wide slice of trials and observational work: MEDLINE, Embase, and CENTRAL. Many teams add CINAHL or PsycINFO when outcomes or populations point that way. Outside health, two to three well-chosen databases plus a citation index often match the brief. Trial registries and grey literature sources sit alongside, not instead of, those core databases.
Common Databases And Why They Help
Area | Database | Why It Helps |
---|---|---|
Biomedicine | MEDLINE/PubMed | Broad coverage of biomedical journals and MeSH terms for precise search logic. |
Biomedicine | Embase | Strong pharmacology and device coverage; Emtree terms catch non-MEDLINE records. |
Trials | CENTRAL | Consolidates randomized trials from multiple sources, including handsearched records. |
Nursing | CINAHL | Journals on nursing, allied health, and patient care not always indexed in MEDLINE. |
Behavior | PsycINFO | Behavioral and mental health studies, tests, and measures. |
Education | ERIC | School-based and higher-ed research, programs, and policy documents. |
Multi-discipline | Web of Science | Citation tracking and broad journal reach for forward/backward searches. |
Multi-discipline | Scopus | Large abstract and citation database across science, social science, and arts. |
Engineering | IEEE Xplore | Standards, conference papers, and journals on hardware, software, and systems. |
Social science | EconLit/SocINDEX | Macroeconomics, labor, and social policy records. |
Global health | Global Index Medicus | Regional journals and trials from low- and middle-income countries (LMICs) that large indexes can miss. |
Agriculture | AGRICOLA | Food systems, nutrition, and rural health links. |
Why More Than One Source Matters
No single index covers everything. Journals differ in what they send to each vendor. Subject headings vary across platforms. Some datasets include conference records or regulatory filings, while others do not. Mixing sources reduces blind spots and offsets indexing delays. When the review hinges on harms, devices, or mental health outcomes, adding a focused database pays off through extra eligible records.
Pick Databases That Fit Your Question
Start with the population, intervention or exposure, comparator, and outcomes you plan to include. Add constraints such as setting, design, or language only when they serve the review. Then map those elements to sources with the best yield.
Clinical Interventions
Use MEDLINE, Embase, and CENTRAL as the base. Add CINAHL when care delivery leans on nursing or allied health. Add PsycINFO when outcomes relate to behavior or mental health. If devices or diagnostics sit center stage, scan IEEE Xplore or an engineering index for design papers that link to trials.
Public Health And Policy
Pair MEDLINE with a citation index such as Scopus or Web of Science. Add Global Index Medicus to expand reach into regional journals. When the topic crosses sectors, include ERIC or an economics source to capture program evaluations and cost outcomes.
Nursing And Allied Health
Use CINAHL with MEDLINE, then judge by topic whether Embase adds value. For rehab and physiotherapy, PEDro can add targeted trial records. Keep CENTRAL in play for randomized designs.
Behavioral Science And Education
Use PsycINFO with ERIC and a citation index. Add MEDLINE when outcomes touch health services or when trials run in clinical settings.
Tech And Engineering
Use IEEE Xplore with Scopus or Web of Science. If the review links to clinical outcomes, add MEDLINE to bridge the gap between prototypes and applied trials.
Social Science And Economics
Use EconLit or another social index with Scopus or Web of Science. Add MEDLINE only when health endpoints are central.
Global And Regional Sources
Include Global Index Medicus for health topics with LMIC settings. For Latin America, LILACS expands yield beyond English-language journals. Regional coverage helps when local policy or practice shapes the evidence base.
Trial Registries And Grey Literature Count Too
Registries reduce publication bias and reveal ongoing or terminated studies. Search ClinicalTrials.gov and the WHO ICTRP, then match records to publications. For theses and reports, use sources such as ProQuest Dissertations, OpenGrey, and agency portals. Conference proceedings can surface early data on harms and subgroup effects. Treat these as separate streams alongside your databases and record how you searched each one.
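If you script the registry-to-publication matching step, a minimal sketch might look like the one below. It assumes the public ClinicalTrials.gov v2 REST endpoint and uses placeholder NCT IDs, so verify both against the current API documentation before relying on it.

```python
import requests

# Hypothetical sketch: fetch the registry record for an NCT ID extracted from
# an included paper. The endpoint path below is an assumption about the public
# ClinicalTrials.gov v2 API -- check the current docs before use.
BASE = "https://clinicaltrials.gov/api/v2/studies"

def fetch_registry_record(nct_id: str) -> dict:
    """Return the registry entry for one trial, raising on a bad ID."""
    resp = requests.get(f"{BASE}/{nct_id}", timeout=30)
    resp.raise_for_status()
    return resp.json()

# Match each included trial back to its registration and flag misses.
included_nct_ids = ["NCT01234567", "NCT07654321"]  # placeholders, not real trials
for nct_id in included_nct_ids:
    try:
        fetch_registry_record(nct_id)
        print(nct_id, "found in registry")
    except requests.HTTPError:
        print(nct_id, "not found -- check the ID or search WHO ICTRP")
```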
Plan A Search That Scales With Time And Budget
Pick a set you can search well and document cleanly. Two thorough database runs beat five rushed ones. Pilot your logic in one platform, then translate it. Use both subject headings and text words. Capture synonyms, variants, and common misspellings. Log limits and date ranges. Deduplicate across sources before screening so the team spends time on new records, not repeats.
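As a minimal sketch of keeping that logic portable, the snippet below stores each concept block as data and renders one Boolean string. The terms and rendering rules are illustrative placeholders, not any vendor's actual syntax; real translation still needs platform-specific field tags and subject headings.

```python
# Keep the strategy as structured data so it can be piloted in one platform
# and re-rendered for another. Concept terms here are made-up examples.
CONCEPTS = {
    "population": ["adolescen*", "teen*", "young adult*"],
    "intervention": ["exercise", "physical activity", "training"],
    "outcome": ["depression", "depressive symptom*", "mood"],
}

def render_boolean(concepts: dict[str, list[str]]) -> str:
    """OR synonyms within each concept, then AND the concepts together."""
    blocks = ["(" + " OR ".join(terms) + ")" for terms in concepts.values()]
    return " AND ".join(blocks)

print(render_boolean(CONCEPTS))
# (adolescen* OR teen* OR young adult*) AND (exercise OR physical activity
# OR training) AND (depression OR depressive symptom* OR mood)
```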
When Three Or More Databases Make Sense
Use three or more when the topic spans disciplines, when devices or medicines are central, or when previous scoping shows low overlap between sources. Reviews that inform guidance or regulatory work often take this route.
When Two Databases Are Enough
Two can be enough when the field is narrow and well indexed, when your question targets one design only, or when earlier mapping shows strong overlap. Pair that with a registry search and one citation index to keep recall strong.
Practical Setups By Topic
The mixes below show common, repeatable setups you can tailor. Each row lists a base of two to three databases plus add-ons that raise recall for that topic.
Topic | Core Databases (2–3) | Add-Ons |
---|---|---|
Drug or device trials | MEDLINE, Embase, CENTRAL | ClinicalTrials.gov, EU CTR, FDA/EMA documents |
Hospital nursing | MEDLINE, CINAHL | CENTRAL, PsycINFO |
Mental health | MEDLINE, PsycINFO | CENTRAL, Web of Science |
School programs | ERIC, Scopus | PsycINFO, EconLit |
Informatics | MEDLINE, IEEE Xplore | Scopus, CENTRAL |
Global health | MEDLINE, Global Index Medicus | Scopus, LILACS |
Public policy | Scopus, Web of Science | EconLit, regional portals |
Reporting And Peer Review Keep The Search Honest
Document each source, date, platform, and the full strings you ran. Add a flow diagram that shows records from each stream. The PRISMA 2020 checklist sets the reporting bar and helps readers track what you searched and when. If you have access to a librarian or information specialist, ask for a PRESS (Peer Review of Electronic Search Strategies) check on your strategies before you lock them down.
Method Notes You Can Copy
Keep this short within your methods section. Review teams and peer reviewers look for clarity, not slogans.
Example Wording
We searched MEDLINE (Ovid), Embase (Ovid), and CENTRAL from inception to present with no language limits. Search strategies combined controlled vocabulary and text words for the population, intervention, and outcomes. We also searched ClinicalTrials.gov and WHO ICTRP, scanned reference lists, and used citation tracking in Scopus. Full strategies and dates appear in Supplement 1.
Quick Start Template (Copy And Adapt)
- Write a clear question and note designs you will include.
- Pick two to three databases that best match that question.
- Add one citation index for forward/backward searching.
- Add two trial registries and one grey source that fit the topic.
- Build and pilot the search in one platform; translate to the others.
- Export results with full fields; deduplicate; record counts by source.
- Update the search close to submission and report the new date.
Where To Read The Full Rules
For health intervention reviews, the Cochrane Handbook Chapter 4 explains why teams search multiple databases and how to plan strings that travel across platforms. Pair that with the PRISMA 2020 checklist above so your methods section stays clear and traceable.
How To Judge When Coverage Is Adequate
Run a quick yield check before you stop. Add one extra database and record how many eligible studies appear after deduplication. If that run surfaces few or no new eligible studies, you have likely reached saturation. Scan a sample of included papers and note where each was indexed; if the same sources recur, your base set is sound. Also match included trials to registry records to confirm your searches touched those pipelines.
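As a worked example of that yield check, the sketch below computes the marginal gain from one extra database. All counts are placeholders, and the 2% threshold is a judgment call for illustration, not a published standard.

```python
# Quick yield check: compare new, unique, eligible records contributed by
# one extra database against the base set. Numbers below are placeholders.
eligible_from_base_set = 148  # unique eligible records from the core databases
new_after_extra_db = 2        # eligible records the extra database adds post-dedup

marginal_gain = new_after_extra_db / eligible_from_base_set
print(f"Marginal gain: {marginal_gain:.1%}")  # here: 1.4%

if new_after_extra_db == 0:
    print("No new studies -- coverage looks saturated.")
elif marginal_gain < 0.02:  # threshold is a judgment call, not a standard
    print("Tiny gain -- document it and consider stopping.")
```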
Common Pitfalls That Cost You Studies
Single-platform searching, narrow text words, and missing subject headings cut recall. Hard limits on language or age bands can exclude studies whose relevant data sit in tables or supplements. Skipping registries or grey sources leaves unpublished trials off the radar. Poor translation across vendors breaks logic and loses hits. Weak deduplication inflates counts, wastes screening time, and can hide true duplicates with minor field differences.
Role Of An Information Specialist
Librarians and trained searchers tune strings, translate them across platforms, and spot indexing quirks. A short chat at the start can prevent days of rework later. Ask for a PRESS check when your draft strings are ready, and save versioned copies so changes stay traceable for readers.
Deduplication And Record Keeping
Export fields from each source, including record IDs, DOIs, and accession numbers. Use matching on several keys, then hand-check near matches. Keep a log that lists raw counts by source, the deduplicated total, and the number moved to screening. That log feeds your flow diagram and helps anyone follow the chain from search to inclusion.
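One possible shape for that log, sketched in Python under the assumption that each exported record carries source, doi, and title fields (adjust the keys to your export format):

```python
import csv
from collections import Counter

# Illustrative key-based deduplication across exported records. Near matches
# (e.g. punctuation differences in titles) still need a hand check.
def dedup_key(record: dict) -> str:
    """Prefer the DOI; fall back to a normalized title."""
    doi = (record.get("doi") or "").strip().lower()
    if doi:
        return doi
    return " ".join((record.get("title") or "").lower().split())

def deduplicate(records: list[dict]) -> tuple[list[dict], Counter]:
    raw_counts = Counter(r["source"] for r in records)  # raw count per source
    seen: set[str] = set()
    unique = []
    for r in records:
        key = dedup_key(r)
        if key and key not in seen:
            seen.add(key)
            unique.append(r)
    return unique, raw_counts

with open("exports.csv", newline="", encoding="utf-8") as f:
    records = list(csv.DictReader(f))

unique, raw_counts = deduplicate(records)
print("Raw counts by source:", dict(raw_counts))
print("Deduplicated total:", len(unique))  # feeds the flow diagram
```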
Bottom Line On Database Count
Pick sources that match your question and search them well. In many reviews the sweet spot is 2–3 databases plus trial registries, one citation index, and one grey source. Expand when topic breadth, device content, or earlier scoping shows low overlap. Shrink only when you can show coverage is tight and the extra time adds little to recall. Clear reporting and a PRESS check will help your review land on solid ground.