How To Do A Scientific Literature Review | Step By Step

Start by setting a sharp question, map the search, screen with set rules, extract and appraise, then write with transparent reporting.

What A Scientific Literature Review Does

A scientific literature review maps what is known, shows what is missing, and explains how evidence lines up on a topic. It is not a data dump. It is a guided tour that shows readers your trail: which sources you searched, what you kept out and why, how you read the studies, and what the body of evidence says.

Pick a review type that fits your aim and time. A rapid scan for a policy brief is not the same as an in-depth thesis review. Name the type up front and state the aim in one sharp line. Readers should see at a glance what the review covers and what it does not cover.

Stage | Core Output | Helpful Aids
Question | Clear scope and outcomes | PICO or PEO templates
Protocol | Inclusion and exclusion rules | Shared doc with version log
Search | Full strings per database | Saved queries and dates
Screen | PRISMA flow counts | Two readers and a tie-break plan
Extract | Clean data sheet | Fields defined before start
Appraise | Risk of bias notes | Tool matched to study design
Synthesise | Text, tables, or meta-analysis | Rules for grouping and models
Report | Full methods and limits | Checklists and flow diagram

Steps To Do A Scientific Literature Review That Stands Up

1) Frame A Focused Question

State the problem in a single sentence. Use simple templates like PICO for interventions or PEO for qualitative work. Define the Population, what is being done or studied, any Comparators if that applies, and the main Outcomes you will track. Set time frames and settings when they matter. Write out exact inclusion and exclusion rules so later choices stay consistent.

2) Draft A Protocol Before You Search

Write your plan while you are fresh. List databases, the date range, languages, study types, and how many people will screen and extract. Add how you will rate study quality and handle ties. A protocol reduces mid-stream drift and speeds up screening, since tricky calls have already been settled in advance.

3) Build And Save Search Strings

Choose a mix of controlled terms and free text. Use Boolean logic, phrase marks, truncation, and field tags. Test a few known key papers: if your search finds them, you are on track. Save each search with the date and export all results with full fields. For biomedicine, PubMed offers MeSH terms, filters, and export tools that make this smooth.
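As a sketch, a three-concept search for a hypothetical question on exercise and depression in adults might look like the block below. The terms and tags are illustrative, not a validated strategy; [Mesh] pulls indexed controlled terms and [tiab] matches free text in titles and abstracts.

```
("Depression"[Mesh] OR depress*[tiab] OR "low mood"[tiab])
AND
("Exercise"[Mesh] OR "physical activity"[tiab] OR exercis*[tiab])
AND
("Adult"[Mesh] OR adult*[tiab])
```

Each OR block holds one concept; AND links the blocks. Truncation (exercis*) catches word variants the controlled terms might miss.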

4) Record The Flow With A Standard

Track every record from import to final set. Note removals at the title and abstract stage, at full text, and reasons for exclusion. A simple count at each step supports a clear flow diagram. The PRISMA 2020 checklist and flow templates keep the trail auditable and clear.
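The counts can live in a small script or sheet so the flow diagram always adds up. This is a minimal Python sketch with invented numbers; the stage names loosely follow PRISMA 2020, and the assertions catch arithmetic slips before they reach the figure.

```python
# Minimal sketch: track record counts at each screening stage.
# All numbers are illustrative, not from a real review.
flow = {
    "records_identified": 1248,
    "duplicates_removed": 312,
    "records_screened": 936,       # titles and abstracts
    "records_excluded": 801,
    "full_texts_assessed": 135,
    "full_texts_excluded": {
        "wrong population": 41,
        "wrong outcome": 28,
        "not primary research": 19,
    },
    "studies_included": 47,
}

excluded_full_text = sum(flow["full_texts_excluded"].values())

# Each stage must reconcile with the one before it.
assert flow["records_screened"] == flow["records_identified"] - flow["duplicates_removed"]
assert flow["full_texts_assessed"] == flow["records_screened"] - flow["records_excluded"]
assert flow["studies_included"] == flow["full_texts_assessed"] - excluded_full_text
```

If an assertion fails, a count was mistyped or a record slipped through untracked, which is exactly what a reader checking the flow diagram would catch later.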

5) Screen Titles And Abstracts In Pairs

Two readers reduce missed hits. Pilot ten to twenty records to tune your rules before bulk screening. Resolve ties with a third reader or a short meeting with notes. Keep a live list of edge cases so your team makes the same call the next time the pattern appears.
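One way to check that two screeners agree on the pilot set is Cohen's kappa, which corrects raw agreement for chance. This Python sketch uses made-up include/exclude calls (1 = include, 0 = exclude); real pilots would use your actual screening decisions.

```python
# Sketch: Cohen's kappa for two screeners' include/exclude calls.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # Observed agreement: fraction of records where both made the same call.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal include rate.
    p_a1 = sum(rater_a) / n
    p_b1 = sum(rater_b) / n
    expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (observed - expected) / (1 - expected)

a = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]  # illustrative pilot calls
b = [1, 1, 0, 0, 1, 0, 0, 1, 1, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.8
```

A kappa well below about 0.6 on the pilot is a signal to tighten the inclusion rules before bulk screening, not to push on.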

6) Retrieve And Screen Full Texts

Pull PDFs through your library or open sources and log any that you cannot obtain. Apply the same rules you wrote earlier, and write down the main reason for each exclusion. That reason list feeds your flow diagram and helps readers trust the set you kept.

7) Extract Data With A Locked Sheet

Design the sheet first, then pilot it. Include study design, sample, setting, exposure or intervention details, outcomes, effect measures, follow-up time, and notes on funding or conflicts. Use data types that match each field to reduce entry errors. Pair up on a subset to check agreement, then proceed when agreement looks steady.

8) Appraise Study Quality With The Right Tool

Pick a risk of bias tool that fits the study design. Randomised trials, cohort studies, case control work, and qualitative studies need different lenses. The Cochrane Handbook explains common tools, domains, and judgement rules that keep ratings consistent.

9) Synthesise: Narrative Or Meta-Analysis

Plan how you will group studies: by design, setting, exposure level, outcome measure, time point, or risk of bias. If studies line up on design and measures, a meta-analysis can give pooled estimates; if not, a clear narrative with structured tables can show patterns. State any model choices and why the choice made sense for the data you had.

10) Write With Full Methods And Calm Claims

Readers skim first. Start with a tight abstract that names the scope, data sources, key counts, and the main message. In the paper, keep methods as a recipe others can repeat. In results, lead with study counts and a flow figure, then key tables. In the end matter, state limits, shortfalls in the field, and sharp next steps.

How To Write A Scientific Literature Review Section That Reads Clean

Break long blocks. Use short sentences with strong verbs. Keep one idea per paragraph. Use headings that mirror reader tasks: Question, Search, Screening, Data, Appraisal, Findings, Limits. Include a list of changes made after peer notes so the path is visible.

Make the first screen give value: the title matches the query, the first paragraph sets the answer, and the layout avoids giant banners or heavy widgets that push content down. Add alt text to images, and keep captions short and clear.

Search Strategy With PubMed: Mini Walkthrough

Build The Concept Set

List core concepts and their tags. For each concept, write synonyms and related terms. Turn each set into an OR block. Link blocks with AND. Add NOT terms only when noise is high and the risk of loss is low.

Use MeSH And Free Text

Find the best MeSH for each concept and explode when needed. Combine with title and abstract text for newer terms not yet indexed. Test both paths and compare yields.

Apply Filters With Care

Use study type and date filters when they match your plan. Avoid language limits that drop useful work. Save each version of the search and note the filter set so later updates can run the same way.

Export And Deduplicate

Export full records with abstracts and IDs. Deduplicate in your reference manager and keep a log of counts before and after. Store raw exports so a reader can rerun your steps.
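Most reference managers handle deduplication, but the logic is simple enough to audit by hand. This hedged Python sketch dedupes on normalized DOI and logs the before/after counts; the records and fields are invented, and items without a DOI are kept for manual checking rather than dropped.

```python
# Sketch: deduplicate exported records on normalized DOI, logging counts.
records = [
    {"doi": "10.1000/j.jmb.2023.01.001", "title": "Study A"},
    {"doi": "10.1000/J.JMB.2023.01.001", "title": "Study A (duplicate)"},
    {"doi": "10.1000/j.jmb.2023.02.007", "title": "Study B"},
    {"doi": None, "title": "Conference abstract, no DOI"},
]

seen, deduped = set(), []
for rec in records:
    key = rec["doi"].lower() if rec["doi"] else None  # DOIs are case-insensitive
    if key is not None and key in seen:
        continue  # drop exact DOI duplicate
    if key is not None:
        seen.add(key)
    deduped.append(rec)  # records without a DOI need a manual title check

print(len(records), "->", len(deduped))  # 4 -> 3
```

Keeping the raw export and the deduped set side by side lets a reader verify the count drop in your flow diagram.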

Picking Databases Beyond PubMed

Match sources to the field. Scopus and Web of Science cast a wide net across many areas. PsycINFO suits mental health topics. ERIC covers education. IEEE Xplore helps with engineering. CINAHL suits nursing. Embase adds rich indexing for drugs and devices. For regional work, add local databases and society journals so the set does not tilt toward a single region.

For grey literature, search conference books, thesis portals, and trial registries. Preprints can flag new signals in fast-moving topics, but label them clearly and treat claims with care during synthesis.

Data Extraction Fields: A Practical Sheet

Build a tidy sheet that cuts rework. Keep one row per study arm when needed and label arms in a clear way. Use short, reusable field names. Below is a lean set that covers most needs; add more only when the question calls for it.

  • Citation: first author, year, title, journal, DOI.
  • Design: trial, cohort, case control, cross-sectional, mixed methods.
  • Setting: country, site type, care level, or lab type.
  • Sample: size, age range, sex split, key inclusion traits.
  • Exposure Or Intervention: dose, tool, timing, comparator.
  • Outcomes: exact measures, units, time points.
  • Results: effect size with CI or p-value, and any model notes.
  • Follow-Up: length, attrition, reasons for loss.
  • Funding And Conflicts: source, role, and any declared ties.
  • Risk Of Bias: per domain notes and an overall call.
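The list above can be locked as a CSV header before extraction starts. This Python sketch writes the header plus one example row; the field names and values are invented for illustration, and a real sheet would carry whatever fields your question calls for.

```python
# Sketch of a locked extraction sheet: CSV header plus one example row.
# Field names and values are illustrative, not a prescribed standard.
import csv
import io

FIELDS = [
    "first_author", "year", "doi", "design", "setting_country",
    "sample_size", "intervention", "comparator", "outcome_measure",
    "effect_size", "ci_low", "ci_high", "followup_weeks",
    "funding", "rob_overall",
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerow({
    "first_author": "Lee", "year": 2022, "doi": "10.1000/example",
    "design": "RCT", "setting_country": "KR", "sample_size": 120,
    "intervention": "aerobic exercise", "comparator": "waitlist",
    "outcome_measure": "PHQ-9", "effect_size": -2.1,
    "ci_low": -3.4, "ci_high": -0.8, "followup_weeks": 12,
    "funding": "university grant", "rob_overall": "some concerns",
})
print(buf.getvalue())
```

Writing through DictWriter, rather than free typing into a spreadsheet, guarantees every row carries the same columns in the same order.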

Synthesis Choices In Plain Language

Think about heterogeneity early. When designs, measures, and settings match well, pooled estimates make sense. When they do not, group studies into clean bins and write a tight narrative that walks the reader through the bins in a steady order. Use plots and compact tables to show where results agree and where they do not.

If you pool, report the model and the reason for it. Show effect size, CI, and a measure of between-study spread. Test influence by removing outliers to see if the core signal holds. When measures differ, convert to a common yardstick and show that step in a short note.
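As a sketch, a fixed-effect pool with a DerSimonian-Laird random-effects check fits in a few lines of Python. The effect sizes and standard errors below are invented; real analyses should use an established package, but the arithmetic shows what the model choice actually does.

```python
# Sketch: inverse-variance pooling of illustrative effect sizes,
# fixed effect plus DerSimonian-Laird random effects. Data are made up.
import math

effects = [0.30, 0.45, 0.12, 0.60]  # per-study effect sizes
ses     = [0.10, 0.15, 0.08, 0.20]  # per-study standard errors

w = [1 / se**2 for se in ses]       # fixed-effect inverse-variance weights
fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)

# Cochran's Q and DerSimonian-Laird tau^2 for between-study spread.
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
df = len(effects) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights add tau^2 to each study's variance.
w_re = [1 / (se**2 + tau2) for se in ses]
pooled_re = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
ci = (pooled_re - 1.96 * se_re, pooled_re + 1.96 * se_re)

print(f"fixed={fixed:.3f} random={pooled_re:.3f} tau2={tau2:.3f}")
```

When tau2 comes out near zero, the two models agree and the choice matters little; when it is large, the random-effects estimate and its wider interval are usually the honest summary.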

Visual Aids That Carry Weight

A flow diagram earns trust by showing the full path from records to studies. Tables that list key study traits help readers see patterns without scrolling back and forth. Forest plots make pooled results easy to scan. Keep figure titles blunt and captions short so they work on small screens.

Ethics, Credit, And Clear Sharing

Respect rights when sharing PDFs and data. Share only what your license allows. Give credit to librarians and statisticians who shaped the search or the models. Add data and code in a public repo when the journal allows it. A short readme that lists files, versions, and how to run code saves time for the next team that builds on your work.

Update Plans And Living Reviews

Write when you will check for new studies and what will trigger an update. For fast-moving areas, short, regular updates keep guidance fresh. When you update, show what changed: the new search date, any new records, and how the message shifted, if at all.

Reporting Checklist You Can Copy

Many journals expect clear reporting from title to limits. The items below echo what major guides ask for and help readers find key details fast.

Item | What To Show | Where
Title | Name the review type and main topic | Title page
Abstract | Scope, sources, dates, counts, main message | Abstract
Rationale | Why the topic matters and the gap | Intro
Eligibility | Inclusion and exclusion rules | Methods
Information Sources | Databases, dates, and any hand search | Methods
Search Strategy | Full strings for at least one database | Appendix
Selection Process | How many readers and how ties were resolved | Methods
Data Items | All fields extracted and any rules | Methods
Study Quality | Risk of bias tool and judgements | Methods
Effect Measures | Metrics used in synthesis | Methods
Synthesis Methods | How you grouped and any models | Methods
Flow Diagram | Counts at each step with reasons | Results
Study Features | Table of designs, samples, settings | Results
Results | Main findings by outcome | Results
Certainty | Strength of the body of evidence | Results
Limits | Gaps in methods and in the field | Discussion
Registration | Link if you registered a protocol | End matter

Templates And Reproducibility

Use a shared folder with a clear file tree: protocol, search strings, raw exports, deduped sets, screening notes, data sheets, and code. Name files with dates and short tokens. A reader should be able to rebuild the path without asking you for extra files.
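One possible layout, with invented file names showing the date-plus-token naming pattern:

```
review-project/
  protocol/         2024-03-01_protocol_v2.docx
  search/           2024-03-05_pubmed_strings.txt
  exports/raw/      2024-03-05_pubmed_export.ris
  exports/deduped/  2024-03-06_deduped.ris
  screening/        2024-03-10_screening_notes.md
  data/             2024-03-20_extraction_sheet.csv
  code/             analysis.py
```

Dates first keeps files sorted by run order, and the short tokens tell a reader what each file holds without opening it.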

Create small, reusable snippets: a risk of bias note block, a standard footnote for funding, and a stock paragraph on data access. Reuse speeds writing and yields a uniform style across papers from the same lab.

Time-Saving Tools And Tactics

A citation manager such as Zotero or EndNote can tag, dedupe, and format references. Screening tools like Rayyan can speed blind screening and tie breaks. Spreadsheet filters and simple scripts can flag ranges and missing cells. Small gains add up on large sets.
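A flagging pass can be a few lines of plain Python over the extraction sheet. The column names, rows, and plausibility limits below are illustrative; the point is that missing and out-of-range cells surface as an explicit list instead of hiding in a big grid.

```python
# Sketch: flag missing cells and out-of-range values in an extraction sheet.
# Rows, columns, and limits are illustrative.
rows = [
    {"study": "Lee2022", "sample_size": "120", "followup_weeks": "12"},
    {"study": "Ng2021",  "sample_size": "",    "followup_weeks": "8"},
    {"study": "Ora2020", "sample_size": "45",  "followup_weeks": "520"},
]

# Plausibility limits per numeric field: (low, high).
LIMITS = {"sample_size": (1, 100_000), "followup_weeks": (1, 260)}

flags = []
for row in rows:
    for field, (lo, hi) in LIMITS.items():
        value = row[field]
        if value == "":
            flags.append((row["study"], field, "missing"))
        elif not lo <= int(value) <= hi:
            flags.append((row["study"], field, "out of range"))

for flag in flags:
    print(flag)  # each tuple names the study, the field, and the problem
```

Run after every extraction session, a pass like this catches entry errors while the paper is still on your desk.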

When teams split tasks, set weekly targets and use short syncs to clear roadblocks. Keep tasks small and visible so new members can help fast.

Quality Signals Reviewers Notice

State who did what. Name who built the search, who screened, who extracted, and who checked risk of bias. Show training or pilot steps if you used them. Share forms and data when possible so others can reuse them.

Be careful with claims. Map where evidence is strong, where it is thin, and where it conflicts. If you ran a meta-analysis, check model fit and show both fixed and random effects when the field expects that view.

Who, How, And Why: Make It Obvious

Who: Add a short bio line and a link to an about page if the venue allows it. Point to prior work on the same topic. How: Link to checklists and methods and include a flow figure. Why: Tie the findings to the use case that sparked the review and state who stands to use the results.

Final Checks Before You Hit Submit

Rerun the search on your cut-off date to catch last-minute records. Spell-check all author names and journal titles in your tables. Ask a colleague to replicate your flow counts from the raw exports. Test all links. Run the PRISMA 2020 items one by one. Keep a copy of every file you sent and note the exact date. When the paper goes live, post the data sheet and strings so that others can build on your work.

Handy Links: Search and export with PubMed; align reporting with the PRISMA 2020 checklist; pick appraisal and synthesis methods from the Cochrane Handbook.