How To Do A Medical Review Article | Step By Step

Pick a question, pre-register a protocol, search widely, screen in pairs, extract carefully, appraise quality, synthesize, and report with PRISMA.

Writing a medical review article isn’t a mystery. With a clear plan, a tidy workflow, and a small, well-coordinated team, you can produce a paper that experts trust and newcomers can follow. This guide keeps the process practical: what to do, when to do it, and how to avoid common missteps that slow a project or sink a submission.

You’ll see two tracks referenced throughout: systematic reviews, which use structured methods and can include meta-analysis, and narrative reviews, which summarise a topic without pooling results. Both can be done well. The steps below show where they differ and where they align.

What Counts As A Medical Review

Before you start, choose the review type that fits your question and timeline. Picking the right design up front makes every later step smoother, from search to write-up.

Review Type | Purpose | When It Fits
Systematic | Answers a tight, pre-specified question with a transparent, reproducible method; may include meta-analysis. | When patient care, policy, or guidelines need dependable evidence.
Scoping | Maps concepts, methods, and gaps without judging effects. | When a field is broad or fragmented and you need a map first.
Narrative | Builds a readable overview, placing studies in context. | When you need a balanced primer or a viewpoint backed by citations.
Rapid | Streamlined systematic approach with targeted limits. | When time is short and a full review isn’t feasible.
Umbrella | Summarises findings from existing systematic reviews. | When many reviews already exist and you must compare them.

Steps For Doing A Medical Review Article

Pick A Clear Question

Frame a focused question that readers can answer from your results. For treatments, the PICO format helps: Population, Intervention, Comparator, Outcomes. For diagnosis or prognosis, adjust the elements but keep the scope tight. Avoid stacking multiple unrelated outcomes into one question; split them or prioritise up front.

Write A Protocol

Document aims, eligibility, outcomes, time frame, and planned analysis. For systematic reviews, register the protocol in PROSPERO to show your intent and reduce duplication. A simple protocol also benefits a narrative review; it keeps the team aligned and your choices traceable.

Build A Search Strategy

Combine controlled vocabulary and free-text terms for each concept. Use Boolean operators, truncation, and proximity as supported by the database. At minimum search MEDLINE or PubMed; for clinical questions add Embase, and for nursing or allied health add CINAHL. Search trial registries and preprint servers if recency matters. Save and export every search string so others can reproduce it.

Example Boolean Block

(random* OR trial* OR placebo) AND ("drug X" OR "generic X") AND (asthma OR bronch*)

Note the quotation marks around multi-word terms: without them, most databases split the phrase into separate words and the block no longer means what you intended.
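Blocks like this are easier to keep consistent across databases if you generate them from concept term lists. A minimal sketch (the term lists are illustrative placeholders, not a validated strategy):

```python
# Sketch: compose a Boolean block from per-concept term lists.
# Terms within a concept are OR-ed; concepts are AND-ed together.

def boolean_block(*concepts):
    groups = ["(" + " OR ".join(terms) + ")" for terms in concepts]
    return " AND ".join(groups)

query = boolean_block(
    ["random*", "trial*", "placebo"],
    ['"drug X"', '"generic X"'],   # quoted multi-word phrases
    ["asthma", "bronch*"],
)
print(query)
# (random* OR trial* OR placebo) AND ("drug X" OR "generic X") AND (asthma OR bronch*)
```

Storing the term lists rather than the finished string also makes it painless to regenerate the query when a reviewer suggests an extra synonym.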

Run The Searches

Export results with full citation data and abstracts. Pull from each source on the same day so counts align. Keep a versioned log: the date, the database, the exact query, and the number of records retrieved.
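The versioned log can be as simple as one CSV row per search run. A sketch, assuming a file name and column set you would adapt to your team's template:

```python
# Sketch: append one row per search run to a shared CSV log.
# File name and columns are assumptions; match your team's template.
import csv
from datetime import date

def log_search(path, database, query, n_records):
    with open(path, "a", newline="", encoding="utf-8") as f:
        csv.writer(f).writerow(
            [date.today().isoformat(), database, query, n_records]
        )

# Illustrative entry only; the count is made up.
log_search("search_log.csv", "MEDLINE", "(asthma OR bronch*) AND random*", 412)
```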

Manage Records And Deduplicate

Import all files into a single library with DOIs and PMIDs preserved. Use a reference manager or a screening platform to remove duplicates. Spot-check a random set to ensure the matching rules didn’t over-merge distinct papers.
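If your platform's matching rules are opaque, a transparent fallback is to key records on DOI, then PMID, then a normalised title. A minimal sketch (record field names are assumptions; map them to your export format):

```python
# Sketch: deduplicate records by DOI, then PMID, then normalised title.
import re

def dedupe(records):
    seen, unique = set(), []
    for rec in records:
        key = (
            (rec.get("doi") or "").lower()
            or rec.get("pmid")
            or re.sub(r"[^a-z0-9]", "", rec.get("title", "").lower())
        )
        if key and key in seen:
            continue  # duplicate of an earlier record; skip it
        seen.add(key)
        unique.append(rec)
    return unique

records = [
    {"doi": "10.1000/xyz", "title": "Trial A"},
    {"doi": "10.1000/XYZ", "title": "Trial A (reprint)"},  # same DOI, different case
    {"doi": "", "pmid": "12345", "title": "Trial B"},
]
print(len(dedupe(records)))  # 2
```

Keying on identifiers before titles avoids over-merging distinct papers that happen to share similar titles, which is exactly what the spot-check is meant to catch.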

Screen Titles And Abstracts

Two reviewers should screen in parallel using the same inclusion rules. Resolve disagreements by a quick chat or a third reviewer. Calibrate on a pilot sample until decisions are consistent, then proceed to full screening. Document reasons for exclusion at the full-text stage.
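Agreement on the pilot sample is commonly summarised with Cohen's kappa, which corrects raw agreement for chance. A self-contained sketch (the include/exclude labels are arbitrary; any two-category coding works):

```python
# Sketch: Cohen's kappa for two screeners' include/exclude decisions.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    labels = set(rater_a) | set(rater_b)
    expected = sum(
        (rater_a.count(lab) / n) * (rater_b.count(lab) / n) for lab in labels
    )
    return (observed - expected) / (1 - expected)

a = ["in", "in", "out", "out", "out", "in", "out", "out"]
b = ["in", "out", "out", "out", "out", "in", "out", "out"]
print(round(cohens_kappa(a, b), 2))  # 0.71
```

A common rule of thumb treats values above roughly 0.6 as substantial agreement; below that, refine the inclusion rules and re-pilot.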

Assess Eligibility On Full Text

Retrieve PDFs through your library or direct author contact. Apply the same criteria you used for the abstract stage, with clarifications noted in the log. Keep a record of studies that look eligible but lack data so you can mention them in the write-up.

Extract Data

Design a form before you start. Capture study setting, design, population details, interventions or exposures, comparators, outcomes, follow-up, effect estimates, and any notes needed to interpret numbers. Pilot on three to five studies and refine confusing fields before the full pass.
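The form can live as a plain CSV with one column per field. A sketch of the shell (field names here are assumptions; match them to your protocol):

```python
# Sketch: an extraction sheet as a CSV header plus one illustrative row.
# Field names are assumptions; align them with your protocol.
import csv

FIELDS = [
    "study_id", "setting", "design", "population", "intervention",
    "comparator", "outcome", "follow_up", "effect_estimate", "notes",
]

with open("extraction_sheet.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow({          # made-up example entry
        "study_id": "Smith2021",
        "design": "RCT",
        "outcome": "FEV1 at 12 weeks",
        "effect_estimate": "MD 0.12 L (95% CI 0.03 to 0.21)",
    })
```

Piloting on a few studies quickly reveals which fields need splitting or clearer definitions before the full pass.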

Appraise Study Quality

Use a tool matched to study design: randomised trials often use RoB 2, observational treatment studies often use ROBINS-I, diagnostic studies often use QUADAS-2. Work in pairs and agree rules before rating. Summarise judgements in a table so readers can see patterns across domains.

Plan Your Synthesis

When studies are similar, a meta-analysis can pool effects. Record the metric you’ll use, the model, and your rules for combining outcomes. For heterogeneity, report study-level features and run sensitivity checks that test influential choices. If pooling isn’t sensible, craft a structured narrative that walks through outcomes and subgroups in a steady, consistent order.
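The simplest pooling model is fixed-effect inverse-variance weighting: each study is weighted by 1/SE², and the pooled standard error is the square root of the inverse of the summed weights. A sketch with made-up numbers (real analyses usually add a random-effects model and heterogeneity statistics):

```python
# Sketch: fixed-effect inverse-variance pooling.
# Effects and standard errors are made-up numbers for illustration.
import math

def pool_fixed(effects, ses):
    weights = [1 / se**2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

effect, (lo, hi) = pool_fixed([0.20, 0.35, 0.10], [0.10, 0.15, 0.08])
print(f"pooled = {effect:.3f}, 95% CI {lo:.3f} to {hi:.3f}")
```

In practice you would use a maintained package rather than hand-rolled code, but working one example by hand is a good check on your statistician's output.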

Create Figures And Flow

Prepare a PRISMA-style flow diagram to show how records moved from search to inclusion. Tables should be skimmable: one with study characteristics and one with main results. If you meta-analyse, include forest plots with the numeric data used to compute each point and interval.
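Before drawing the diagram, it is worth checking that the counts actually add up; mismatched totals are one of the most common reviewer complaints. A sketch with placeholder counts:

```python
# Sketch: sanity-check that PRISMA flow counts reconcile.
# All counts below are placeholders for illustration.

def check_flow(identified, duplicates, title_abstract_excluded,
               fulltext_assessed, fulltext_excluded, included):
    screened = identified - duplicates
    assert screened - title_abstract_excluded == fulltext_assessed, \
        "screening counts don't add up"
    assert fulltext_assessed - fulltext_excluded == included, \
        "full-text counts don't add up"
    return screened

screened = check_flow(identified=1240, duplicates=310,
                      title_abstract_excluded=812,
                      fulltext_assessed=118, fulltext_excluded=96,
                      included=22)
print(screened)  # 930
```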

Writing A Medical Review Article The Right Way

Title And Abstract

Keep the title precise and informative. If you performed a systematic review or meta-analysis, say so. Write a structured abstract that mirrors the main sections and includes the core numbers a reader needs. The PRISMA 2020 abstract items are a reliable checklist.

Introduction

Set the scene in a few short paragraphs: what’s known, what’s unclear, and the exact aim of the review. Cite the most relevant primary papers and any prior reviews, and state how your work adds new value.

Methods That Others Can Repeat

Describe the protocol, registration, eligibility rules, databases searched, full search strings, deduplication method, screening process, data fields, and risk-of-bias tool. Mention any software or packages used. If you changed the protocol, explain the change and why it was made before seeing the results.

Results People Can Scan

Open with the flow diagram totals, then the number of included studies and participants. Move next to study characteristics, then outcomes in a steady order. Use consistent labels across text, tables, and figures so readers don’t hunt for matching terms.

Synthesis And Meta-analysis

State the model and the effect measure in one sentence before each figure. Explain how you handled multi-arm studies, zero-event cells, cluster trials, or crossover designs. Report any subgroup or sensitivity checks sparingly and only where they add clarity.

Limitations And Certainty

Be frank about gaps: small samples, short follow-up, inconsistent measures, or risk-of-bias concerns. If you grade the certainty of evidence, explain the approach in a short note so readers can interpret confidence ratings correctly.

Data, Code, And Materials

Share the extraction sheet, analytic code, and figure data in a public repository when journal policy allows. This speeds peer review and lets later teams re-use your work.

Reporting Checklists

Use reporting guidance that matches your review. The Cochrane Handbook offers step-by-step detail for clinical effectiveness questions, and PRISMA provides itemised reporting prompts that keep methods transparent.

Quality, Ethics, And Registration

Transparency earns trust. Register a protocol when the design qualifies, disclose funding and any competing interests, and keep all analytic decisions visible. If you invite a specialist librarian or statistician to join the team, credit their role and list their contributions clearly.

Quality also relies on calibration. Train screeners on a sample set, pilot data extraction on a handful of studies, and rehearse the risk-of-bias tool with examples. These small drills prevent drift and save hours later.

Study Design | Common Bias Tool | Notes
Randomised trial | RoB 2 | Check randomisation, deviations, missing data, measurement, and reporting.
Non-randomised intervention | ROBINS-I | Pay attention to confounding, selection, and intervention timing.
Diagnostic accuracy | QUADAS-2 | Check patient selection, index test, reference standard, and flow.
Systematic review | AMSTAR 2 | Use when you assess prior reviews in an umbrella review.

Common Pitfalls And Quick Fixes

Vague Questions

Fix by tightening outcomes or narrowing the population. If the field lacks shared outcome measures, pick a small set of outcomes that matter to patients and clinicians and stick to them.

Shallow Searches

Run at least one librarian-reviewed strategy, include both controlled terms and keywords, and store each full string. Search grey literature where missing trials are likely.

Inconsistent Screening

Write short inclusion rules and test them on a pilot batch. Measure agreement and refine the wording until both reviewers make the same calls on the pilot set.

Data That Don’t Match

Decide unit-of-analysis rules before extraction. Document conversions, imputed values, and any calculation that changes what appears in the paper. Keep both the raw value and the derived value in your sheet.
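A common conversion is deriving a standard deviation from a reported standard error via SD = SE × √n; the sketch below shows one way to keep the raw value and the derived value side by side (the row fields and numbers are made up for illustration):

```python
# Sketch: store the raw reported value next to the derived one.
# SD = SE * sqrt(n) is the standard conversion for a mean's standard error.
import math

def se_to_sd(se, n):
    return se * math.sqrt(n)

row = {"study_id": "Lee2019", "n": 64, "se_raw": 0.5}   # made-up entry
row["sd_derived"] = se_to_sd(row["se_raw"], row["n"])
print(row["sd_derived"])  # 4.0
```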

Figures That Confuse

Label axes and effect measures in plain language. Use the same order of outcomes across text, tables, and plots. Avoid decorative chart junk that distracts from the message.

Timeline And Team Roles

Even a small team can move quickly when roles are clear. One person leads the protocol and shepherds deadlines. A search specialist crafts and runs strategies. Two reviewers screen and extract in parallel. A statistician advises on pooling rules and checks code. A content expert strengthens interpretation and keeps the story grounded.

Create a short schedule with milestones: protocol draft, search complete, screening complete, extraction complete, bias appraisal complete, synthesis complete, manuscript submitted. Tie each milestone to a date and a person.

Submission And Peer Review

Pick a target journal early, read its scope notes, and match the article type and length. Follow the house reference style and submit required checklists. Many journals ask for a completed PRISMA checklist and the flow diagram. Some request data and code at submission; host them on a stable repository and link in a data availability note.

During peer review, respond with a point-by-point letter. If a change would break your protocol, explain why and propose an alternative phrasing or an added sensitivity check. Keep the tone factual and friendly; the goal is a cleaner, more useful paper.

Final Pre-Submission Checks

  • Title names the design and the topic.
  • Abstract contains the core numbers and the main message.
  • Methods show full search strings and screening rules.
  • Results match tables and figures line for line.
  • All data used to compute pooled numbers are available.
  • PRISMA items and journal forms are complete.

Helpful Resources You Can Trust

The PRISMA website hosts checklists, abstract items, and ready-to-edit flow diagrams. The Cochrane Handbook offers step-by-step methods for clinical effectiveness questions, from planning and search to bias appraisal and meta-analysis. If your plan qualifies, the PROSPERO registry lets readers see your intent and compare your final paper to the original plan.

Keep a small library of templates nearby: a protocol shell, a screening log, a data extraction sheet, a bias summary table, a flow diagram, and a response-to-reviewers letter. Reusing tested layouts speeds each project and keeps formatting consistent across the team.

Ready To Start

Pick a tight question, sketch a brief protocol, and recruit a partner. Build the search, log each step, and keep files tidy. Small, steady moves win. Follow these steps and your review will come together in a form readers and editors can trust.

Practical Tips That Save Time

Set Up A Repeatable Folder System

Create a top-level folder for protocol, search strings, screening, extraction, analysis, figures, and manuscript. Inside each, add a readme that lists what lives there and who owns it. A tidy structure keeps files findable months later.
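The folder scaffold takes a minute to script, which also guarantees every project starts with the same layout. A sketch (folder names mirror the list above; rename to suit your team):

```python
# Sketch: create the project folders, each with a readme stub.
# Folder names are assumptions drawn from the list above.
from pathlib import Path

FOLDERS = ["protocol", "search_strings", "screening", "extraction",
           "analysis", "figures", "manuscript"]

def scaffold(root):
    root = Path(root)
    for name in FOLDERS:
        folder = root / name
        folder.mkdir(parents=True, exist_ok=True)
        readme = folder / "README.txt"
        if not readme.exists():
            readme.write_text(f"{name}: describe contents and owner here.\n")

scaffold("review_project")
```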

Name Files So Sorting Works

Use ISO dates and clear tags: 2025-09-16_pubmed_search.txt, 2025-09-18_screening_log.xlsx, 2025-09-20_forest_primary.png. Consistent names make version control painless and help collaborators avoid overlap.

Create A Living Glossary

Define outcomes, time points, and subgroup labels once, in a shared document. Paste the exact field names you’ll use in the extraction sheet. When the team speaks the same language, errors drop fast.

Keep Decisions In One Log

Record any rule you add or clarify during screening, extraction, or analysis. Note who decided, when, and why. The log becomes a goldmine when you draft the Methods and it keeps responses to reviewers grounded.

Pre-Write Tables

Sketch the study characteristics table and the main results table before extraction. Seeing the shape of the output sharpens what you collect and trims busywork. When the last extraction finishes, your tables fill themselves.