Check scope and ethics, review methods and results, judge novelty and clarity, then send structured comments with a clear accept-revise-reject call.
Every journal sets its own rules, yet the core stays stable across fields. Respect confidentiality, declare conflicts, and follow the venue’s instructions. For common expectations, see the ICMJE recommendations and the COPE reviewer guidelines. For reporting checks by study type, keep the EQUATOR library open while you read.
How To Peer Review A Medical Manuscript: Step By Step
Confirm Fit And Read The Files
Start by scanning the title, abstract, and cover letter. Check scope, audience, and study type. Open the main file, figures, data notes, and any supplements. If the journal uses blinded review, keep that wall intact. If you spot a clear conflict, recuse fast instead of reading further.
Check Competing Interests And Timing
Conflicts can be financial, personal, academic, or territorial. If you advised the project, mentored the authors, or share a grant, say so. If time is tight, decline fast so the editor can reassign. If you accept, set a calendar block to finish inside the deadline.
Pre-review Checklist
| Task | What To Verify | Quick Tip |
|---|---|---|
| Scope | Fits the journal’s audience and aims | Match title and abstract to the venue |
| Blinding | Author and institution details hidden if required | Avoid web searches that break masking |
| Conflicts | Any ties that could bias your view | Declare or decline without delay |
| Ethics | IRB approval, consent, trial registration as needed | Note missing IDs or unclear consent |
| Data | Availability statement and sharing plan | Flag closed data that blocks checks |
| Study type | Trial, cohort, case control, diagnostic, review, lab | Pick the right reporting checklist |
| Methods access | Enough detail for replication | List items that prevent repeat steps |
| Stats | Design, power, models, assumptions | Ask for a stats review when in doubt |
| Outcomes | Primary and secondary defined up front | Watch for switching or post-hoc focus |
| Results | Match methods and prespecified outcomes | Map each result to a method line |
| Figures | Clear scales, labels, legends, units | Request raw figure files if blurry |
| Citations | Current, balanced, free of self-citation bias | Suggest neutral, high-quality sources |
Build Your Review Setup
Skim, Then Read Deeply
First pass: learn the question, design, and main claim. Second pass: line-by-line notes on clarity, logic, and method detail. Keep a short log of page and line numbers. Quote short phrases when needed so authors can find the spot fast.
Track Notes With A Template
Use three buckets: summary, major points, minor points. The summary shows you read the paper; major points change claims or methods; minor points fix clarity, style, or small errors. Keep the voice calm and specific. If the English needs work, note that once and give a few examples.
Red Flags To Pause On
- No ethics approval where needed, or consent gaps
- Trial without registration or with mismatched registry fields
- Outcome switching, p-hacking hints, or selective plots
- Overstated claims that go beyond the data
- Image duplication or splicing concerns
- Data that cannot be shared when sharing is required
Medical Manuscript Peer Review Process: What Editors Expect
Title, Abstract, And Keywords
Do the title and abstract match the study and the results? Can a busy reader learn the design and the main outcome in a few lines? Point out buzzwords or claims that promise more than the data deliver. Suggest clear keywords that match search intent and indexing terms.
Introduction
Authors should set the gap and the aim in plain terms. Ask for a one-line research question. If the context is bloated, suggest trims and a sharper aim. If key sources are missing, add a few, leaning on high-quality reviews or guidelines instead of self-promotion.
Methods
Look for a clear design, prespecified outcomes, sample process, and planned analysis. For trials, call for a CONSORT flow and registry ID; for systematic reviews, a PRISMA flow; for observational studies, a STROBE-style structure. If blinding, randomization, or matching are vague, ask for exact steps. If the model list is long, ask which one drove the main claim and why.
Statistics
Check assumptions, missing data handling, and multiple comparisons. Ask for effect sizes with intervals, not only p-values. If the sample is small, watch for wide intervals and unstable models. If the analysis is beyond your skill, state that and request a statistical reviewer.
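If it helps to picture what “an effect size with an interval” means in practice, here is a minimal sketch in Python with made-up numbers; the group values, sample sizes, and variable names are illustrative only, not drawn from any manuscript.

```python
# Minimal sketch: report a mean difference with a 95% confidence interval,
# not only a p-value. All numbers below are invented for illustration.
import numpy as np
from scipy import stats

treatment = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.7, 5.2, 6.1])  # change scores, arm A
control   = np.array([4.2, 4.5, 4.0, 4.8, 4.3, 4.6, 4.1, 4.4])  # change scores, arm B

diff = treatment.mean() - control.mean()           # mean difference (the effect size)
v_t = treatment.var(ddof=1) / len(treatment)
v_c = control.var(ddof=1) / len(control)
se = np.sqrt(v_t + v_c)                            # Welch standard error
df = (v_t + v_c) ** 2 / (v_t**2 / (len(treatment) - 1) + v_c**2 / (len(control) - 1))
low, high = stats.t.interval(0.95, df, loc=diff, scale=se)  # 95% confidence interval

print(f"Mean difference {diff:.2f} (95% CI {low:.2f} to {high:.2f})")
```

The same request applies to any primary outcome: ask for the estimate, its interval, and the scale it sits on, not a bare p-value.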
Results
Confirm that each result maps to a method. Tables should carry labels, units, and clear footnotes. Avoid walls of numbers; suggest compact tables or figure panels. Make sure subgroup work was prespecified or clearly labeled as exploratory.
Figures And Data
Good figures carry readable axes, uniform scales, and legible text. Ask for raw image files when resolution hides detail. If the journal asks for data sharing, check that the statement gives a link, accession, or contact route that works.
Discussion
Authors should link back to the aim, state what the data show, and list limits. Ask for a clear take-home line that does not oversell. If the sample is narrow, if the setting is single-center, or if follow-up is short, ask for a plain note on that.
References
Look for balance across regions and schools of thought. Spot strings of self-citations without clear need. Suggest a small set of stronger sources where it helps a claim or a method step.
Write Clear, Actionable Comments
Open With A Short Summary
Start with three lines: question, design, and main finding. Keep it neutral. This snapshot helps the editor check fit and helps authors see you read the work with care.
Lay Out Major Points
Group high-level issues under short sub-bullets. Tie each point to a page or figure. Offer a path forward wherever you can. If a claim lacks support, suggest the exact analysis, dataset, or wording change that would fix it. If a fix needs new data that the team cannot collect, say so plainly.
List Minor Points
Use short bullets for clarity edits, figure tweaks, unit fixes, and small errors. Keep the tone friendly. Avoid edit-by-edit rewrites; give patterns and a few samples instead.
Tone That Builds Trust
- Speak to the work, not the people
- Use “please” and “could you” when asking for changes
- Quote the line you want changed
- Offer sample text for tricky rephrasing
- Thank the authors for fixes between rounds
Decision Language And Examples
Your core job is a clear call with reasons. Match the call to the size of the fixes. When the paper is sound but needs polish, lean to minor revision. When claims need fresh analysis or major rewrites, lean to major revision. When the basic design cannot support the claim, or when ethics or data barriers cannot be fixed, advise rejection with a brief note on a better home.
| Editor Decision | When It Fits | Sample Wording |
|---|---|---|
| Accept | Strong methods, clear data, tiny edits only | “The work is sound and ready; only small style edits remain.” |
| Minor revision | Good study with limited clarity issues | “Please refine the abstract, add effect sizes, and fix figure labels.” |
| Major revision | Core claims need added analysis or tighter methods text | “Please rerun the model with prespecified covariates and update the conclusions.” |
| Reject | Design cannot answer the question, or ethics/data issues block trust | “The claim cannot be supported with the current design; a new study would be needed.” |
Ethics, Confidentiality, And Credit
Do not share the manuscript or use its ideas in your own work. If you need help on a narrow point, ask the editor before inviting a co-reviewer. Keep all files off public drives. Follow the COPE guidance on conflicts, privacy, and fair conduct. Many journals also align with the ICMJE recommendations for roles, transparency, and corrections after publication.
Bias can sneak in through names, affiliations, methods you prefer, or regions you know best. Reread your own notes to make sure the tone stays neutral. Avoid “not novel” claims that rest only on personal taste. If the study answers a clear, useful question with solid work, say so even if you would have used a different path.
Some venues now ask reviewers to state any use of language tools or code assistants. If you use such tools to check grammar or to format references, disclose in the private note to the editor, follow the journal policy, and keep all manuscript text off public systems unless the journal allows it.
Fast Triage By Study Type
Randomized Trials
Look for a clear question, trial registration, a CONSORT flow diagram, prespecified outcomes, and a link to the protocol. Check randomization, allocation concealment, and any deviations. Ask for effect sizes with intervals and for adverse events reporting. If the registry lists a different primary outcome, ask for an explanation and a fix.
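As a concrete example of the effect measure you might request, this sketch computes a risk ratio and its 95% confidence interval from a 2×2 table; the arm sizes and event counts are invented for illustration.

```python
# Minimal sketch: risk ratio with 95% CI from event counts (made-up numbers).
import numpy as np

events_t, n_t = 30, 200   # events and participants, treatment arm (illustrative)
events_c, n_c = 45, 200   # events and participants, control arm (illustrative)

rr = (events_t / n_t) / (events_c / n_c)                      # risk ratio
se_log_rr = np.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)  # SE of log risk ratio
low, high = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)

print(f"Risk ratio {rr:.2f} (95% CI {low:.2f} to {high:.2f})")
```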
Systematic Reviews And Meta-analyses
Ask for a PRISMA flow, a registered protocol, and full search strings. Check risk-of-bias tools and how the team handled small-study effects. If the meta-analysis shows high heterogeneity, ask for reasons and for sensitivity work that tests the main claim.
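If you want a quick sense of what a heterogeneity statistic measures, here is a small sketch that computes Cochran's Q and I² from per-study estimates; the log odds ratios and standard errors are invented for illustration.

```python
# Minimal sketch: Cochran's Q and I^2 from per-study effects and standard errors.
import numpy as np

log_or = np.array([0.25, 0.40, -0.10, 0.55, 0.30])  # per-study log odds ratios (illustrative)
se     = np.array([0.12, 0.20, 0.15, 0.25, 0.18])   # their standard errors (illustrative)

w = 1.0 / se**2                                     # inverse-variance weights
pooled = np.sum(w * log_or) / np.sum(w)             # fixed-effect pooled estimate
Q = np.sum(w * (log_or - pooled)**2)                # Cochran's Q
df = len(log_or) - 1
i_squared = max(0.0, (Q - df) / Q) * 100            # I^2 as a percentage

print(f"Q = {Q:.2f} on {df} df, I^2 = {i_squared:.0f}%")
```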
Observational Studies
Seek a STROBE-style structure, a clear cohort or case definition, prespecified exposures and outcomes, and a directed acyclic graph or a short note on confounding. Check missing data handling and model diagnostics. If causal claims appear without strong design, ask for tempered wording.
Diagnostics And Prediction
For diagnostic accuracy, ask for STARD items, clear thresholds, and clinical setting detail. For prediction models, look for TRIPOD items, internal validation, calibration plots, and external validation where possible.
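For calibration, the core idea is simple: bin the predicted risks and compare them with the observed event rates. This sketch uses simulated data and scikit-learn's calibration_curve; the variable names and numbers are illustrative only.

```python
# Minimal sketch: calibration check for a risk model on simulated data.
import numpy as np
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)
pred_risk = rng.uniform(0.05, 0.95, size=500)   # model-predicted probabilities (illustrative)
observed = rng.binomial(1, pred_risk)           # outcomes drawn from those risks

frac_events, mean_pred = calibration_curve(observed, pred_risk, n_bins=10)
for obs, pred in zip(frac_events, mean_pred):
    print(f"predicted {pred:.2f} -> observed {obs:.2f}")  # ideally these track one another
```

In a real report, expect this comparison as a calibration plot, ideally repeated in an external dataset.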
How To Phrase Your Report
Editors like short, skimmable sections with labels. Use this frame: summary, strengths, limits, requests, decision. Keep the main file for comments to authors and a short private note for the editor. In the private note, flag any ethics or conduct issues and state your confidence in the review.
Strengths You Can Call Out
- Clear question tied to a real gap
- Prespecified analysis and shared code
- Useful dataset with enough follow-up
- Balanced discussion with plain limits
- Readable figures and tables
Limits You Can Flag
- Outcome not prespecified or measured inconsistently
- Underpowered analysis for the main claim
- Wide intervals that do not support a firm take-home
- Key confounders missing from the model
- Conclusions that stretch beyond the data
Time Management And Second Rounds
Set a steady pace: day one for first pass and notes, day two for methods and figures, day three for the write-up. If a second round arrives, read the response letter first. Check that each point was handled. If a fix is partial, say what still blocks a clear claim and how to finish the job.
Common Reviewer Pitfalls And Fixes
- Long lists of edits can hide the message; lead with the one or two points that change the claim.
- Overwriting the paper in your own style slows the authors and muddies their voice; point to examples and let the team rewrite.
- Personal tone creeps in fast; keep remarks tied to lines, figures, and results.
- Delay hurts authors; if you cannot finish on time, tell the editor early so a backup can step in.
- Private notes should be brief, factual, and saved for ethics flags or fit with the journal.
- If the study belongs in a different venue, suggest a general type, not a rival by name.
Sample Sentences You Can Reuse
These short lines help you stay neutral and clear:
- “Thank you for the chance to review this work. My expertise covers the design and the analysis used here.”
- “The paper asks a clear question and the dataset suits that aim.”
- “Please add a registry ID and align outcomes with the registered plan.”
- “The effect size is small with wide intervals; the claim should be hedged.”
- “I recommend major revision given the changes listed under Major Points.”
- “I do not see a path to support the claim with the current design; rejection is advised.”
Quick Reference Links
Keep these open while you work: the ICMJE recommendations for roles and process, the COPE reviewer guidance for conduct and privacy, and the EQUATOR reporting guidelines for study-type checklists.