How To Do The Review Of Related Literature In Health Sciences | Step By Step

Map a clear question, register a protocol, search widely, screen with criteria, extract and appraise, synthesize, and report with PRISMA.

Start With A Sharp Health Question

Good reviews start with a tight, testable question. Use a structure such as PICO to keep the topic narrow and measurable: define the population, the exposure or intervention, the comparator, and the outcomes. Name the setting and time window. List exact inclusion and exclusion rules. Predefine primary outcomes and any secondary outcomes. State the study designs you will accept, such as randomized trials, cohort studies, or qualitative syntheses tied to patient experience. Write the question in one sentence and keep it visible on your planning page.
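
One way to keep those elements visible is to write them down as named fields before drafting anything else. The sketch below does that in Python; the topic and every value are hypothetical placeholders meant only to show the structure, not a recommended question.

```python
# Hypothetical example of a review question broken into PICO-style fields.
# Every value below is a placeholder showing the structure, not a real protocol.
question = {
    "population": "adults admitted to hospital with community-acquired pneumonia",
    "intervention_or_exposure": "early mobilization within 48 hours of admission",
    "comparator": "usual care",
    "outcomes_primary": ["length of stay"],
    "outcomes_secondary": ["30-day readmission", "functional status at discharge"],
    "setting_and_timeframe": "acute care hospitals, studies published 2010 onward",
    "designs_accepted": ["randomized trial", "prospective cohort"],
}

one_sentence = (
    f"In {question['population']}, does {question['intervention_or_exposure']} "
    f"compared with {question['comparator']} change {question['outcomes_primary'][0]}?"
)
print(one_sentence)
```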

Decide the review type early. Options include scoping, systematic, and rapid formats. Pick the type that matches scope, resources, and deadlines. A scoping review maps themes and gaps. A systematic review asks a focused question and tries to bring a full set of eligible studies together, with or without meta-analysis. Rapid work trims steps but keeps core methods transparent.

Plan The Review Workflow

Sketch a protocol that anyone on your team can follow. List the question, objectives, eligibility rules, search plan, screening steps, data items, outcomes, risk-of-bias approach, and synthesis plan. Assign roles for searching, screening, extraction, appraisal, and writing. Decide how you will resolve disagreements. Write simple calibration steps so reviewers practice on the same set of records before full screening starts.
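
Calibration is easier to judge with a simple agreement number. The sketch below computes percent agreement and Cohen's kappa for two reviewers on a pilot batch; kappa is not required by any step named here, just a common way to check that the rules are being read the same way. The decisions are invented placeholders.

```python
# Percent agreement and Cohen's kappa for two reviewers on a pilot screen.
# The decision lists are invented placeholders for illustration only.
reviewer_a = ["include", "exclude", "exclude", "include", "unsure", "exclude", "include", "exclude"]
reviewer_b = ["include", "exclude", "include", "include", "exclude", "exclude", "include", "exclude"]

labels = sorted(set(reviewer_a) | set(reviewer_b))
n = len(reviewer_a)

observed = sum(a == b for a, b in zip(reviewer_a, reviewer_b)) / n
expected = sum(
    (reviewer_a.count(lab) / n) * (reviewer_b.count(lab) / n) for lab in labels
)
kappa = (observed - expected) / (1 - expected)

print(f"Percent agreement: {observed:.2f}")
print(f"Cohen's kappa: {kappa:.2f}")
```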

Register the protocol to make the plan public and time-stamped. A widely used registry for health topics is PROSPERO. Public registration reduces duplicated effort and flags changes from the original plan if they become necessary.

Build A Reproducible Search Strategy

Write searches that mix controlled vocabulary and free text. In PubMed, map terms to Medical Subject Headings and use field tags for precision. The MeSH training modules walk through the basics and advanced moves. Combine synonyms with OR, connect concept blocks with AND, and test known sentinel papers to see if they appear. Keep full search strings in a versioned note so anyone can rerun them later.
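
To make the string easy to rerun, it helps to build it from named concept blocks: OR within a block, AND across blocks. The sketch below does that in Python and, optionally, runs the result through NCBI's E-utilities via Biopython. The MeSH headings and keywords are placeholders for your own topic, and the Entrez call assumes you supply your own contact email as NCBI asks.

```python
# Build a PubMed search string from concept blocks, then (optionally) run it.
# All terms are placeholders; swap in the MeSH headings and keywords for your topic.
blocks = {
    "population": ['"Pneumonia"[MeSH]', 'pneumonia[tiab]'],
    "intervention": ['"Early Ambulation"[MeSH]', '"early mobilization"[tiab]', '"early mobilisation"[tiab]'],
    "design": ['randomized controlled trial[pt]', 'cohort[tiab]'],
}

# OR within a concept, AND across concepts.
query = " AND ".join("(" + " OR ".join(terms) + ")" for terms in blocks.values())
print(query)

# Optional: run the search with Biopython's Entrez wrapper (pip install biopython).
# from Bio import Entrez
# Entrez.email = "you@example.org"   # NCBI asks for a contact address
# handle = Entrez.esearch(db="pubmed", term=query, retmax=200)
# record = Entrez.read(handle)
# print(record["Count"], "records; first IDs:", record["IdList"][:5])
```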

Search more than one source. Databases cover different journals and study types. Add trial registers and preprint servers when the topic warrants speed. Scan reference lists of key papers and use forward citation tracking. Capture grey literature such as theses and reports if policy or program questions are in scope.

Search Plan Blueprint

| Source | Purpose | Notes For This Topic |
| --- | --- | --- |
| PubMed / MEDLINE | Core biomedical indexing with MeSH | Seed terms from the question; test automatic term mapping |
| Embase | Expanded drug and device coverage | Add Emtree terms; watch for duplicates across databases |
| CINAHL | Nursing and allied health | Useful for implementation and practice themes |
| Cochrane Library | Trials and prior reviews | Screen reviews for reference mining; check trial registries |
| Trial registers | Unpublished and ongoing studies | Include national and international registers as needed |

Document every run: database, platform, dates, searcher, full string, limits, and the number of records retrieved. Save results to a reference manager and remove exact duplicates before screening. Export a de-duplicated library to your screening tool so counts match across steps.
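
Reference managers catch most exact duplicates, but a quick scripted pass can confirm that the counts in your log and your screening tool agree. The sketch below drops exact DOI and normalized-title duplicates from a merged CSV export; the file name and column names are assumptions about how your library exports.

```python
# Remove exact duplicates from a merged export before screening.
# "records.csv" and its column names (doi, title) are assumptions about your export.
import csv
import re

def norm_title(title: str) -> str:
    """Lowercase and strip punctuation/whitespace so trivially different titles match."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

seen, unique, dropped = set(), [], 0
with open("records.csv", newline="", encoding="utf-8") as fh:
    for row in csv.DictReader(fh):
        key = row.get("doi", "").lower().strip() or norm_title(row.get("title", ""))
        if key in seen:
            dropped += 1
            continue
        seen.add(key)
        unique.append(row)

print(f"{len(unique)} unique records kept, {dropped} exact duplicates removed")
```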

Doing The Review Of Related Literature In Health Sciences: Screening And Selection

Run a pilot screen on a small sample to check that rules are clear. Two reviewers should screen titles and abstracts in parallel with a third person on call for ties. Move records to full-text screening when both reviewers say “include,” or when one says “include” and the other says “unsure.” Track every reason for exclusion at the full-text stage and keep those reasons consistent.
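
The include/unsure rule above is simple enough to encode, which also makes the pilot easier to audit. A minimal sketch, assuming each record carries one vote per reviewer:

```python
# Title/abstract decision rule from the text: advance to full text when both
# reviewers say "include", or when one says "include" and the other "unsure".
def advance_to_full_text(vote_a: str, vote_b: str) -> bool:
    votes = {vote_a, vote_b}
    return votes == {"include"} or votes == {"include", "unsure"}

assert advance_to_full_text("include", "include")
assert advance_to_full_text("include", "unsure")
assert not advance_to_full_text("include", "exclude")   # disagreement: goes to the third reviewer
assert not advance_to_full_text("exclude", "exclude")
```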

Use a flow diagram to show counts at each step. The PRISMA 2020 statement includes a flow diagram template with a layout that readers understand. Report the number of records identified, screened, assessed in full text, and included, along with exclusions by category.
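
A small arithmetic check keeps the diagram honest: the numbers at each stage have to reconcile. The counts below are placeholders, not results from any real review.

```python
# Placeholder counts showing how PRISMA flow numbers should reconcile.
identified = 1482                                    # records from all sources
duplicates_removed = 367
screened = identified - duplicates_removed           # titles/abstracts screened
excluded_at_screening = 1021
full_text_assessed = screened - excluded_at_screening
full_text_exclusions = {"wrong population": 41, "wrong design": 22, "no usable outcome": 18}
included = full_text_assessed - sum(full_text_exclusions.values())

assert included >= 0, "Counts do not reconcile; recheck the screening log"
print(f"Screened {screened}, assessed {full_text_assessed} full texts, included {included}")
```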

Extract Data The Smart Way

Design a form that captures study identifiers, setting, design, sample, intervention or exposure details, comparators, outcomes, effect metrics, follow-up, funding, and declarations. Define each field to avoid free-text sprawl. Pilot the form on three to five studies and adjust only where the pilot shows real friction. Use dual extraction on key fields and keep an audit trail of edits.
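
A flat field list, agreed before extraction starts, keeps the form from drifting. The sketch below turns the fields named above into a CSV header with a short definition for each; the wording of the definitions and the file name are illustrative.

```python
# Extraction form fields drawn from the list above; definitions are illustrative.
import csv

fields = {
    "study_id": "First author and year",
    "setting": "Country and care setting",
    "design": "Randomized trial, cohort, etc.",
    "sample": "Number enrolled and key characteristics",
    "intervention_or_exposure": "What was delivered, dose, duration",
    "comparator": "Usual care, placebo, or alternative",
    "outcomes": "Outcome names, time points, instruments",
    "effect_metrics": "Reported effect measure and value with CI",
    "follow_up": "Length of follow-up and attrition",
    "funding": "Funding sources",
    "declarations": "Conflicts of interest as reported",
    "calculated_values": "Anything derived by the review team, with the formula used",
}

with open("extraction_form.csv", "w", newline="", encoding="utf-8") as fh:
    csv.writer(fh).writerow(fields.keys())   # empty form ready for piloting
```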

Plan how to handle multiple reports from the same study. Decide which time points and outcome measures take priority. Note unit-of-analysis issues, cluster designs, and crossover designs. Mark any data you had to calculate from reported numbers so readers can check the math.

Judge Study Quality

Pair your review type with the right appraisal tool. For randomized trials, the Cochrane RoB 2 tool is a solid choice. For non-randomized studies of interventions, ROBINS-I is widely used. For diagnostic accuracy studies, look to tools such as QUADAS-2. Use domain-level judgments and avoid rolling up to a single letter grade that hides detail.

Common Bias Domains And What To Note

| Domain | Typical Signals | What To Record |
| --- | --- | --- |
| Randomization | Unclear sequence or concealment | Method used; any deviations from planned allocation |
| Blinding | Open-label procedures | Who was blinded; outcomes prone to detection bias |
| Missing data | High or imbalanced loss to follow-up | Attrition numbers; handling method; reasons for loss |
| Selective reporting | Outcomes in methods not reported in results | Protocol or register match; switched outcomes |
| Confounding | Baseline imbalance in observational work | Adjustment approach; residual concerns |

Record reasons behind each judgment. Decide in advance how risk-of-bias levels will influence synthesis. You may run sensitivity checks that drop high-risk studies or present stratified estimates by risk level. Keep these rules in the protocol and carry them through to the final report.
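
One way to carry that rule through is to tag each study with its overall risk level at extraction time and filter on it when the synthesis runs. A minimal sketch with invented study labels and numbers:

```python
# Invented study labels showing a sensitivity run that drops high-risk studies.
studies = [
    {"id": "Trial A", "risk": "low", "effect": -0.30, "variance": 0.04},
    {"id": "Trial B", "risk": "some concerns", "effect": -0.10, "variance": 0.09},
    {"id": "Trial C", "risk": "high", "effect": -0.55, "variance": 0.06},
]

primary = studies                                          # all eligible studies
sensitivity = [s for s in studies if s["risk"] != "high"]  # high-risk studies dropped

print("Primary analysis set:    ", [s["id"] for s in primary])
print("Sensitivity analysis set:", [s["id"] for s in sensitivity])
```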

Synthesize And Interpret

Put studies on one sheet first. Summarize designs, samples, interventions, and outcomes side by side so patterns emerge. If data line up, pool effect sizes with a random-effects model and test heterogeneity. Report the model, the effect measure, confidence intervals, and an assessment of inconsistency. If pooling is not sensible, use a structured narrative that groups studies by design, dose, setting, or outcome family.
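
When the data do line up, the pooling itself is a short calculation. The sketch below implements a DerSimonian-Laird random-effects pool with Cochran's Q and I² in plain numpy; the effect sizes and variances are invented placeholders, and in practice a maintained meta-analysis package is the safer choice for the final report.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool study effects with a DerSimonian-Laird random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w_fixed = 1.0 / variances                              # inverse-variance weights
    pooled_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)
    q = np.sum(w_fixed * (effects - pooled_fixed) ** 2)    # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_random = 1.0 / (variances + tau2)
    pooled = np.sum(w_random * effects) / np.sum(w_random)
    se = np.sqrt(1.0 / np.sum(w_random))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # inconsistency, percent
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)
    return pooled, ci, tau2, i2

# Illustrative log odds ratios and variances (placeholder numbers, not real data)
pooled, ci, tau2, i2 = dersimonian_laird([-0.30, -0.10, -0.45, 0.05],
                                         [0.04, 0.09, 0.06, 0.12])
print(f"Pooled effect {pooled:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, "
      f"tau^2 {tau2:.3f}, I^2 {i2:.0f}%")
```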

Check small-study effects with visual tools and tests when you have enough studies. If results hinge on one or two trials, say so and show alternative runs with those trials removed. State where indirectness, imprecision, or risk of bias could sway the message. Bring patient-relevant outcomes forward in your summary rather than burying them.
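
A quick way to see whether one or two trials drive the result is a leave-one-out run: recompute the pooled estimate with each study removed in turn. A self-contained sketch using a simple inverse-variance (fixed-effect) pool and invented numbers:

```python
# Leave-one-out check with an inverse-variance (fixed-effect) pool; data are invented.
import numpy as np

effects = np.array([-0.30, -0.10, -0.45, 0.05, -0.25])
variances = np.array([0.04, 0.09, 0.06, 0.12, 0.05])

def iv_pool(e, v):
    w = 1.0 / v
    return np.sum(w * e) / np.sum(w)

full = iv_pool(effects, variances)
print(f"All studies: {full:.3f}")
for i in range(len(effects)):
    keep = np.arange(len(effects)) != i
    loo = iv_pool(effects[keep], variances[keep])
    print(f"Without study {i + 1}: {loo:.3f} (shift {loo - full:+.3f})")
```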

Conducting The Review Of Related Literature In Health Sciences: Reporting And Sharing

Write the report so a reader can retrace every step. Link the methods to the protocol and flag any changes with a short reason. Follow the headings in the PRISMA checklist to keep items in the usual order. Cite the full search strings in an appendix and provide cleaned data and code where possible.

Use clear tables and figures. Place the flow diagram near the top so readers grasp the path from records to studies. State which outcomes drive the main messages and which ones are exploratory. If you ran a meta-analysis, share the dataset used for each forest plot and the calculator or software used for effect sizes.

When your topic touches clinical action or policy, show how the evidence links to real-world choices. Call out research gaps that matter for patient care. Keep the abstract tight and structured so busy readers can scan goals, methods, results, and limits fast.

Work With Tools And Teams

Pick tools that match team size and budget. A reference manager keeps imports tidy. Screening apps reduce manual tracking and make dual review easier. Spreadsheets work for simple extraction, while review platforms bring forms, workflows, and exports under one roof. Agree on file names, folder layout, and version control from day one.

Hold short stand-ups during active phases. Check counts after each step so the flow diagram and library stay in sync. Document decisions in a running log. When the draft is ready, give a fresh pair of eyes a day to spot gaps in methods, missing numbers, or fuzzy language.

Ethics, Equity, And Transparency

Treat studies and participants with care. Avoid language that stigmatizes groups. Plan subgroup analyses only when there is a clear, defensible reason. If you exclude non-English records, say why and note any likely effect on the findings. Disclose funding and any constraints on the review process.

Be open about limits. Short timelines, narrow access to databases, or a single reviewer on a step can change outcomes. State these limits in the methods and the final section so readers can judge the weight of the evidence for themselves.

Quick Reference Checklist

  • Question written, scope defined, and eligibility rules listed
  • Protocol drafted, roles assigned, and registry entry submitted
  • Search strings built, sources chosen, and runs documented
  • Dual screening with reasons for exclusion recorded
  • Extraction form piloted, dual fields set, and audit trail active
  • Risk-of-bias tool matched to study design and applied
  • Synthesis plan executed with sensitivity checks
  • Report written to PRISMA with data and code shared

Helpful Standards And Guides

For methods detail and worked examples, the Cochrane Handbook remains a trusted reference. For reporting, PRISMA 2020 offers checklists and flow diagrams that readers know well. For search skill building in PubMed, the MeSH course from the National Library of Medicine is a handy starting point.