Assess methods and ethics, give specific evidence-based comments, state conflicts, and advise clear actions with a respectful, concise tone.
Peer review in medicine keeps clinical science honest and useful. A good review reads like a clear map: brief summary, fair critique, and practical advice. This guide shows how to judge a paper, write balanced comments, and back every point with evidence from the manuscript. The goal is simple: help editors decide and help authors improve without guesswork.
Core Principles For Medical Peer Review
Start with duties that never change. Keep manuscripts confidential. Declare any ties that could bias your view. Accept only papers you can judge well. Stay objective, stick to the science, and meet the deadline. The COPE ethical guidelines for peer reviewers set out these basics in plain terms.
Non-Negotiables
- Confidentiality: never share data or ideas from a manuscript.
- Conflicts: disclose funding, collaborations, rivalries, or IP interests.
- Competence: accept the task only if the methods and topic fit your skill.
- Independence: judge what is in the paper, not who wrote it or where they work.
- Civility: critique claims, not people; keep language neutral and precise.
- Timeliness: respond fast to an invitation and agree a due date you can keep.
What To Check First
What To Check | Why It Matters | How To Judge |
---|---|---|
Title and Abstract | Sets claims and scope | Match with outcomes and design stated in the paper |
Question | Defines clinical value | Is the PICO (population, intervention, comparator, outcome) or aim clear and answerable? |
Design | Links claims to methods | Trial, cohort, case-control, review, or lab model fits the question |
Ethics | Protects patients and data | IRB approval, consent, and trial registration where needed |
Methods | Enables replication | Eligibility, interventions, outcomes, and stats described with enough detail |
Bias Control | Limits error | Randomisation, allocation concealment, blinding, or confounder control |
Results | Show the evidence | Proper tables, confidence intervals, effect sizes, no data fishing |
Interpretation | Connects data to claims | Claims stay inside the data, limits are clear |
Writing A Peer Review In Medicine That Editors Trust
Editors look for reviews that are focused, traceable, and fair. Use section labels, cite page and line ranges, and separate major from minor points. When you request extra work, say why it changes the inference, not just the style.
Scan The Manuscript And Declare Conflicts
Read the abstract, figures, and conclusion, then skim the methods. If you spot a tie that could sway your view, alert the editor at once. If the tie is minor, the editor may keep you on while recording the disclosure.
Check Study Design Against Reporting Standards
Match the study type with the right checklist. Trials pair with CONSORT. Cohort, case-control, and cross-sectional work pair with STROBE. Systematic reviews pair with PRISMA. The EQUATOR Network hosts these tools and many more.
Evaluate Methods And Statistics
Look for a clear primary outcome and a matching analysis plan. Check sample size logic, inclusion and exclusion rules, and handling of missing data. Ask whether the model fits the measure: time-to-event, count, or continuous. Inspect any subgroup work for pre-specification and interaction testing.
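When the sample size justification looks thin, a quick back-of-the-envelope check can anchor your comment. The sketch below is a minimal Python example, assuming SciPy is available and using hypothetical numbers; it applies the usual normal approximation for comparing two proportions and is a plausibility check for the review, not a replacement for the authors' own calculation.

```python
from scipy.stats import norm

def n_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate per-arm sample size for comparing two proportions
    (normal approximation, two-sided alpha). Rough reviewer check only."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2

# Hypothetical check: can 200 patients per arm detect 30% vs 20% event rates?
print(round(n_per_group(0.30, 0.20)))  # ~290 per arm, so 200 looks underpowered
```

If your rough figure sits far from the authors' number, ask them to show the inputs behind their calculation rather than asserting that it is wrong.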
Appraise Results And Figures
Prefer effect sizes with intervals over bare P values. Confirm that denominators stay stable across tables. Check figure axes, units, and legends. Spot duplicate images or selective panels.
Judge Interpretation And Claims
Do authors stretch beyond their design? Associations do not equal causation. Temper claims that lean on post hoc tests. Point out where clinical impact is unknown or needs external data.
Ethics And Transparency Checks
Trials should show registration numbers and a dated protocol. Human work needs IRB approval and consent; animal work needs oversight and care standards. Ask for data sharing or a public repository link when journal policy requires it. The ICMJE recommendations outline disclosures, trial registration, and data reporting norms across medical journals.
Write The Review With A Clear Structure
Editors and authors value a stable format. Lead with a two-line summary of what the study claims and whether the data back it. Then give numbered major points, followed by minor edits. Close with a private note to the editor only if needed.
Suggested Outline
- Summary: one short paragraph on aim, design, and top finding.
- Major Comments: two to five items that change validity or clarity.
- Minor Comments: style, typos, small clarifications.
- Confidential To Editor: conflicts, duplication worries, or policy flags.
- Recommendation: accept, minor revision, major revision, or reject.
How To Do Medical Peer Review, Step By Step
Use a repeatable path from first read to final decision. Habits save time.
The Five-Block Review Template
- Block 1 — Context: State the clinical question and why the study matters to care or policy.
- Block 2 — Methods Fit: Say if design, setting, and analysis align with the question.
- Block 3 — Results Quality: Note signal size, precision, and any missingness or protocol drift.
- Block 4 — Claims And Limits: Mark any overreach and suggest tighter wording.
- Block 5 — Actionable Edits: List specific fixes and cite table or figure numbers.
Phrase Bank For A Neutral Tone
Helpful phrasing keeps feedback firm without heat. Try lines such as:
- “The analysis plan doesn’t match the primary outcome; please align the model with the endpoint.”
- “Randomisation is stated but concealment isn’t clear; add the process and materials.”
- “Please report absolute risks with intervals, not just P values.”
- “The subgroup work wasn’t pre-specified; move to exploratory and temper the claim.”
- “Registration details are missing; provide the registry, ID, and date.”
Handling Common Study Types In Clinical Peer Review
Different designs call for different checks. Trials hinge on randomisation, concealment, and predefined outcomes. Observational work hinges on selection, measurement, and confounding. Systematic reviews hinge on protocol fidelity, search depth, and bias tools.
Randomised Trials
Ask whether groups were created with a proper sequence and kept hidden until assignment. Ask how deviations from intended treatment were handled. Scan harms, adherence, and missing outcome data. CONSORT updates from EQUATOR give checklists and flow diagrams that make these points easy to verify.
Observational Studies
Check that exposure and outcome definitions match standard codes or validated tools. Look for time-varying bias, immortal time, and reverse causation. See whether the authors used directed acyclic graphs or a clear confounder plan. STROBE items help ensure complete reporting.
Systematic Reviews And Meta-Analyses
A solid review starts with a registry entry and a protocol. Search strings, databases, and dates need to be reproducible. Risk-of-bias tools must match designs. PRISMA 2020 from EQUATOR lists the reporting items and a flow diagram that editors expect.
Before you turn to the quick flags below, pause and check whether the paper answers one clear question, sticks to a prespecified plan, and reports what would matter to a clinician or patient. If that frame is missing, ask for tighter aims, trimmed outcomes, and clearer language around uncertainty.
Study Types And Quick Flags
Study Type | Typical Red Flags | Quick Fix |
---|---|---|
Randomised Trial | Unclear allocation or missing harms | Add concealment detail; expand harms table |
Cohort Study | Time-related bias or loss to follow-up | Use time-varying models; report retention |
Case-Control | Control selection or recall bias | Clarify source population and blinding of assessors |
Cross-Sectional | Overstated causality | Reword claims as associations; add limits |
Systematic Review | Weak search or missing protocol | Register, add full strings, and share the protocol |
Ethics, Confidentiality, And Integrity
Do not contact authors. Keep all files secure and delete drafts after the decision. If you detect plagiarism, duplicate images, or data reuse, flag it privately to the editor with evidence. If you need help on policy, check the ICMJE recommendations or the COPE flowcharts supplied by your journal.
Speed, Professional Habits, And Reviewer Credit
Reply to an invitation within two days. If the topic is close but not exact, accept with a note on any limits and offer to review the parts you know well. Set two sessions: one fast pass, one slow pass with notes. Keep a checklist and a library of phrases you reuse. Many journals share certificates or credit for timely, solid reviews.
Checklist Before You Submit
- Conflict statement completed and up to date.
- Study type matched to a reporting checklist via EQUATOR.
- Primary outcome, effect sizes, and intervals checked.
- Randomisation, concealment, and blinding assessed when relevant.
- Confounder plan judged for observational work.
- Risk-of-bias or quality tools applied for reviews and meta-analyses.
- Data sharing, registration, and ethics approvals verified.
- Major comments numbered and linked to exact sections or lines.
- Minor edits grouped cleanly; tone stays neutral and concise.
- Clear recommendation supplied to the editor.
Common Statistical Pitfalls In Medical Manuscripts
Numbers drive clinical claims, so a few focused checks pay off. Check whether the primary measure matches the type of data. Binary outcomes need risk ratios or risk differences with intervals. Small samples with zero cells may need exact methods or a clear correction, not default software output. Continuous outcomes need clear units and a plan for non-normal data. Time-to-event outcomes need survival methods with censoring handled cleanly.
Binary Outcomes And Effect Measures
Ask for both absolute and relative effects. Odds ratios can mislead when events are common, so risk ratios or risk differences help readers judge size. If a table shows a zero cell, check the method used and ask for a note on continuity corrections.
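To make such a comment concrete, you can recompute the effect measures from the 2x2 counts in a results table. This is a minimal sketch in plain Python with illustrative counts and Wald intervals; it also shows how the odds ratio drifts away from the risk ratio once events are common. A table with zero cells would need exact methods or a stated correction instead.

```python
import math

def two_by_two_effects(a, b, c, d, z=1.96):
    """Risk difference, risk ratio, and odds ratio with Wald 95% CIs.
    a/b = events/non-events in the treated arm, c/d = the same in controls.
    Not valid with zero cells; those need exact methods or a correction."""
    n1, n0 = a + b, c + d
    r1, r0 = a / n1, c / n0
    rd = r1 - r0
    rd_se = math.sqrt(r1 * (1 - r1) / n1 + r0 * (1 - r0) / n0)
    rr = r1 / r0
    rr_se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    odds = (a * d) / (b * c)
    or_se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return {
        "risk_difference": (rd, rd - z * rd_se, rd + z * rd_se),
        "risk_ratio": (rr, math.exp(math.log(rr) - z * rr_se),
                       math.exp(math.log(rr) + z * rr_se)),
        "odds_ratio": (odds, math.exp(math.log(odds) - z * or_se),
                       math.exp(math.log(odds) + z * or_se)),
    }

# With common events (60% vs 40%), the odds ratio (2.25) looks much larger
# than the risk ratio (1.5) - the distortion this section warns about.
print(two_by_two_effects(60, 40, 40, 60))
```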
Multiple Comparisons And Selective Analyses
Limit unplanned tests. Label new looks as exploratory and stick to the main outcome and a small set of secondary endpoints.
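If several secondary endpoints are reported as significant, it is fair to check whether they would survive a standard adjustment. A minimal sketch, assuming the statsmodels package and hypothetical p values, applying a Holm correction:

```python
from statsmodels.stats.multitest import multipletests

# Hypothetical raw p values for five secondary endpoints
p_values = [0.004, 0.03, 0.04, 0.21, 0.47]
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="holm")

for raw, adj, keep in zip(p_values, p_adjusted, reject):
    print(f"raw p = {raw:.3f}  Holm-adjusted p = {adj:.3f}  significant = {keep}")
# Only the 0.004 result stays below 0.05 after adjustment in this example.
```

You need not demand a specific method; asking the authors to state how they handled multiplicity, or to label the analyses as exploratory, is usually enough.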
Survival Analysis Traps
Kaplan–Meier curves need counts at risk and a clear time origin. Check proportional hazards or use a strategy that relaxes it. Competing risks need methods built for them; simple censoring may mislead.
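A minimal sketch of both checks, assuming the lifelines package and simulated, hypothetical data: a Kaplan-Meier plot with numbers at risk under the curves, and a proportional-hazards diagnostic on a simple Cox model.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.plotting import add_at_risk_counts

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "time": rng.exponential(24, 300),   # months to event or censoring (hypothetical)
    "event": rng.integers(0, 2, 300),   # 1 = event observed, 0 = censored
    "arm": rng.integers(0, 2, 300),     # 1 = intervention, 0 = control
})

fig, ax = plt.subplots()
fitters = []
for arm, label in [(0, "control"), (1, "intervention")]:
    kmf = KaplanMeierFitter()
    sub = df[df["arm"] == arm]
    kmf.fit(sub["time"], event_observed=sub["event"], label=label)
    kmf.plot_survival_function(ax=ax)
    fitters.append(kmf)
add_at_risk_counts(*fitters, ax=ax)     # numbers at risk beneath the curves

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
cph.check_assumptions(df)               # flags proportional-hazards violations
```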
Model Reporting
Models should list all terms, coding choices, and checks. Ask for full tables with coefficients and intervals. For machine learning, request splits, tuning, calibration, and a code link when policy allows.
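The sketch below, assuming statsmodels and using simulated data with hypothetical variable names, shows the kind of full coefficient table with intervals a reviewer can ask for in place of a prose summary.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.normal(65, 10, 500),
    "treated": rng.integers(0, 2, 500),
})
# Simulated outcome so the example runs end to end
logit = 0.03 * (df["age"] - 65) - 0.5 * df["treated"] - 0.2
df["event"] = (rng.random(500) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age", "treated"]])
model = sm.Logit(df["event"], X).fit(disp=False)

report = pd.concat([model.params, model.conf_int()], axis=1)
report.columns = ["coef", "ci_low", "ci_high"]
print(report)          # log-odds scale, every model term with its interval
print(np.exp(report))  # odds ratios with 95% CIs (intercept row is baseline odds)
```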
Practical Writing Tips For Reviewers
Write short, active lines. Tie each note to a page and line, then offer a clear, clean rewrite.
How To Frame Major Comments
Use a three-step formula. State the issue in one line. Give evidence from the manuscript or a standard rule. Offer a concrete fix. Here is a pattern that works: “Primary outcome not defined in methods; please name it and align the sample size and analysis plan with that outcome.”
Tone With Non-Native Authors
Many teams write in a second language. Mark unclear lines without sarcasm. Suggest a short model sentence, not a lecture. Flag jargon and write what you think the line means, then invite a correction.
Working With Editors
Use the confidential box to note anything that could change an editorial call. Examples include overlap with a preprint from the same group, a trial that stopped early, or a conflict you disclosed. If you think the paper fits another journal tier, say so and explain. If you and a co-reviewer disagree, keep your stance clear and evidence-based; the editor will weigh the points, not the tone.
Revisions And Rebuttals
On a revised paper, judge whether authors met the core requests. Track each major point and tick it off or say what still blocks acceptance. If a new analysis appears, scan for fresh risks and ask whether it matches the scope of the study.
When To Say No
Decline a review if timing is tight, the topic sits far outside your lane, or a conflict cannot be managed. Reply fast, suggest two alternate reviewers, and add a line on their skill. Saying no early helps the editor keep the process moving.
Quick Reference Card For Busy Days
Print a one-page card with your steps: accept or decline within two days, run the right checklist, verify methods against outcomes, write numbered comments, check tone, and send a clear recommendation. Small, steady habits build trust and make every medical review sharper and faster.