Peer review keeps medical science honest, clear, and useful. A tight process helps editors, helps authors, and protects patients who rely on published work. This guide gives you a clean, repeatable way to review any clinical, public health, or basic science manuscript without getting lost in the weeds. You will see what to check first, how to test the strength of methods, what good reporting looks like, and how to write comments that land. You can follow it as a first timer or as a seasoned reviewer ready to sharpen your approach.
Doing A Peer Review Of A Medical Research Paper: Core Steps
Start with a conflict check and a gut check. If you know the authors, have a stake in the results, or feel too close to the topic, decline fast so the editor can reassign. If the paper sits outside your skill set, say so. If you can review, accept and set a firm time box. Many editors look for a one- to two-week turnaround. If you will need longer, send a short note first.
Next, scan the abstract, title, and cover letter. Is the research question clear? Does the design fit the question? Do outcomes match the stated aims? If the match looks weak or the paper seems off scope for the journal, flag that early in your confidential note to the editor. Then read the full text twice. First pass for the big picture and fit. Second pass line by line with notes.
Rapid Triage Checklist
| Item | What To Look For |
|---|---|
| Fit | Scope aligns with journal aims; question matters to readers; novelty or clear information gain. |
| Ethics | IRB or ethics approval stated; consent described; trial registration if interventional. |
| Design | Design fits the question (trial, cohort, case-control, cross-sectional, diagnostic, qualitative). |
| Methods | Pre-specified outcomes; inclusion and exclusion; sample size plan; bias controls. |
| Stats | Appropriate tests; model assumptions; handling of missing data; sensitivity checks. |
| Reporting | Checklist match: CONSORT for trials, PRISMA for systematic reviews, STROBE for observational work. |
| Transparency | Data availability, code access, protocol or preregistration, competing interests. |
Use this list to decide whether the paper needs major work or already tracks well. Then build your report in the structure below.
Set Up Your Review Workflow
Work with a simple template so you never miss the basics. Keep a plain text file or a note in your reference manager. Add the manuscript ID, title, and a one line summary of the claim. Then draft a short list of strengths before listing concerns. That balance helps the editor and shows the authors you read with care.
First Pass: Triage And Fit
Read without pausing to check citations or numbers. Ask three things. One: is the question clear and linked to patient or system outcomes? Two: does the design match the question? Three: does the paper tell you anything new or more precise than prior work? If any answer lands as a no, note it for the editor. If all three look fine, proceed.
Second Pass: Line-By-Line Assessment
Now annotate the PDF or printout. Mark unclear sentences. Circle outcomes and definitions. Underline statistical methods and note whether they match the data type. Write down every claim that needs a check. Do not rewrite the paper; point to issues, give a short fix when you can, and keep the tone steady.
Study Design And Methods That Hold Up
Good methods beat flashy results. Match the design to the question and look for pre-specification. Trials should name a primary outcome and a time point. Observational studies should define exposure, outcomes, and confounders in advance and explain missingness. Diagnostic studies need a reference standard. Qualitative work needs a stated sampling strategy and a description of the analytic approach.
Eligibility, Outcomes, And Bias
Check inclusion and exclusion. Are these consistent with the setting and the question? Look for outcome definitions that clinicians or public health workers can apply. Review blinding, allocation concealment, and steps to limit selection, performance, and detection bias. If the paper uses matching, check balance and whether the matching variables sit outside the causal path. If the paper uses weighting, look for diagnostics.
Sample Size And Power
Trials and many cohort studies should include a sample size plan tied to the primary outcome and a clinically meaningful effect. If the paper has no plan, note that. If the plan exists, check the inputs and whether the final sample met the target. Under-powered studies can still add value if precision is reported honestly and claims stay modest.
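When a sample size plan is present, you can recompute it rather than take the inputs on faith. Below is a minimal sketch of the standard two-group formula for comparing proportions, assuming a two-sided alpha of 0.05 and 80% power (the z quantiles 1.96 and 0.8416); the event rates in the example are illustrative, not drawn from any particular paper.

```python
import math

def n_per_group(p1, p2, z_alpha=1.96, z_beta=0.8416):
    """Approximate sample size per arm for comparing two proportions.

    z_alpha: standard normal quantile for a two-sided alpha of 0.05.
    z_beta:  quantile for 80% power. Swap in other quantiles as needed.
    """
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Example: control event rate 30%, expected treatment rate 20%.
print(n_per_group(0.30, 0.20))  # 291 per arm under these assumptions
```

If the manuscript's stated target differs widely from a quick recomputation like this, ask the authors to show their inputs.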
Analysis Choices And Assumptions
Ask whether the analysis fits the data and design. Binary outcomes often use logistic regression; time-to-event data often use Cox models; clustered data call for mixed models or GEE. Check model assumptions and diagnostics. Look for handling of missing data that goes beyond case deletion. Multiple imputation or a clear sensitivity analysis builds trust.
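One cheap sensitivity check you can request when a binary outcome has missing values is a best-case/worst-case bound: impute every missing value as an event, then as a non-event, and see whether the conclusion survives both extremes. A minimal sketch with made-up counts:

```python
def outcome_bounds(events_observed, n_observed, n_missing):
    """Bound the event proportion under extreme missing-data assumptions."""
    n_total = n_observed + n_missing
    complete_case = events_observed / n_observed
    best_case = events_observed / n_total                 # no missing patient had the event
    worst_case = (events_observed + n_missing) / n_total  # every missing patient had the event
    return best_case, complete_case, worst_case

# Example: 100 enrolled, 80 with outcome data, 20 events observed.
print(outcome_bounds(20, 80, 20))  # (0.2, 0.25, 0.4)
```

A wide gap between the bounds, as here, signals that the missing data could drive the result; multiple imputation with a clear model is the stronger fix.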
Ethics, Reporting, And Integrity
Medical research touches people. Review the ethics section with the same care as the statistics. Confirm IRB approval or an ethics waiver. Check consent, assent for minors, and data privacy steps. Interventional trials should provide a registration number. If these elements are missing, mark them as major issues.
Match the manuscript to a reporting checklist. Trials should align with CONSORT, systematic reviews with PRISMA, and observational studies with STROBE. The EQUATOR reporting guidelines hub lists the right checklist for most designs, including SPIRIT for protocols and STARD for diagnostic studies. A paper can be strong and still miss main items; your comments can set a clear path to fix them.
Guard fairness and confidentiality. Do not share the manuscript or use its ideas. Avoid loaded language. If you suspect plagiarism, image manipulation, or data fabrication, tell the editor in the confidential section and cite clear signals. For general standards, the COPE peer reviewer guidelines are your north star.
Journals also look for clarity on authorship, contributorship, funding, data sharing, and competing interests. If statements look vague, ask for detail. The ICMJE Recommendations set shared rules many journals follow, including guidance on trial registration and data transparency.
Statistics And Data Transparency
Reproducible work wins trust. Authors should describe software, versions, and main packages. Figures and tables should match the text. Flow diagrams should show screening and attrition. Confidence intervals should accompany p values. If the team used Bayesian methods, priors and sensitivity checks belong in the text or supplement. If the team built prediction models, look for internal validation and calibration plots.
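When a paper reports a proportion with only a p value, you can gauge the precision the authors should have reported. This sketch uses the normal-approximation (Wald) interval, which is adequate for quick reviewer checks when counts are not too small:

```python
import math

def wald_ci(events, n, z=1.96):
    """95% Wald confidence interval for a proportion (normal approximation)."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

lo, hi = wald_ci(30, 100)
print(round(lo, 3), round(hi, 3))  # 0.21 0.39
```

For small samples or proportions near 0 or 1, a Wilson or exact interval behaves better; the point here is only to illustrate the precision check, not to prescribe a method.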
Effect Sizes And Precision
Push beyond significance labels. Ask for effect sizes that clinicians can use, such as risk differences, numbers needed to treat, or median time saved. Ask for absolute risks next to relative measures. Wide intervals tell you the estimate is unstable; that is fine as long as claims match the level of precision.
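The absolute measures worth asking for follow directly from a two-by-two table. A sketch with illustrative counts:

```python
def absolute_effects(events_tx, n_tx, events_ctrl, n_ctrl):
    """Risk difference, relative risk, and number needed to treat from raw counts."""
    risk_tx = events_tx / n_tx
    risk_ctrl = events_ctrl / n_ctrl
    rd = risk_tx - risk_ctrl
    rr = risk_tx / risk_ctrl
    nnt = 1 / abs(rd)  # undefined when rd == 0
    return rd, rr, nnt

# Example: 12/120 events on treatment vs 24/120 on control.
rd, rr, nnt = absolute_effects(12, 120, 24, 120)
print(round(rd, 2), round(rr, 2), round(nnt, 1))  # -0.1 0.5 10.0
```

A halved relative risk sounds dramatic; the absolute risk difference of 10 percentage points and an NNT of 10 tell clinicians what that halving is worth.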
Multiplicity And Subgroups
Many manuscripts test many outcomes or subgroups. Ask whether the plan pre-specified them. If not, request a toned-down claim and clear language that the analysis is exploratory. If subgroups matter for care, ask for interaction tests and a caution on over-interpretation.
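When a manuscript reports many unadjusted p values across outcomes or subgroups, it can help to show the authors what a standard correction would do. Below is a sketch of the Holm step-down adjustment, one common choice; whatever method the authors pre-specified takes precedence over this illustration.

```python
def holm_adjust(pvals):
    """Holm step-down adjusted p values, returned in the original order."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, idx in enumerate(order):
        adj = min(1.0, (m - rank) * pvals[idx])
        running_max = max(running_max, adj)  # enforce monotonicity of adjusted values
        adjusted[idx] = running_max
    return adjusted

print([round(p, 3) for p in holm_adjust([0.01, 0.04, 0.03, 0.005])])
# [0.03, 0.06, 0.06, 0.02]
```

Note how two nominally significant results (0.04 and 0.03) cross above 0.05 after adjustment; that is exactly the shift in claims your review should request.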
Data Access And Sharing
Check the data availability statement. If privacy blocks sharing, that can be fine, yet the statement should say how qualified researchers could request access. Code archives on trusted platforms and links to cleaned, de-identified datasets help other groups reproduce results. When these exist, say so in your strengths list.
Write Constructive, Actionable Feedback
Editors want a clear bottom line and a path for revision. Lead with a two to three sentence summary of the paper in your own words. Then list strengths. After that, present major issues and minor issues. Keep your tone fair and specific. Avoid guesswork about the authors’ motives. Aim for fixes the team can carry out within the journal’s word and figure limits.
Constructive Language That Works
| Issue | Helpful Phrasing |
|---|---|
| Unclear question | “Please state the primary question and outcome in the last line of the introduction.” |
| Methods mismatch | “The design does not match the aim. A cohort or case-control approach may fit better, or the aim can be reframed to match the current design.” |
| Under-powered study | “Add a sample size section and frame claims around precision rather than significance.” |
| Selective reporting | “List all outcomes in a table and mark which were pre-specified. Move post-hoc analyses to a separate section with careful language.” |
| Over-stated claim | “Tone down language and align claims with the effect size and the confidence interval.” |
| Missing ethics info | “Add IRB approval number, consent process, and, for trials, registration details.” |
Short, neutral phrasing keeps authors on task and makes the editor’s decision easier.
How To Peer Review A Medical Research Paper With Confidence
Structure your report with the editor in mind. Begin with a one paragraph summary. Then list three to six strengths. Next, present major issues in priority order, each with a fix if one exists. After that, minor issues. Close with a brief recommendation on fit and interest for the journal’s readers. If you have concerns you do not want to share with the authors, add a confidential note to the editor.
Select A Decision
Many journals use four options: accept, minor revision, major revision, or reject. Accept is rare on a first round. Minor revision fits when the core is sound and fixes are small. Major revision fits when the study has value yet needs extra analyses, clearer reporting, or a reframed claim. Reject fits when the design cannot answer the question, ethics are unclear, or the work sits outside scope. When you choose a path, give a one sentence reason.
Calibrate Your Tone
Be direct without sharp edges. Use plain language. Avoid sarcasm. Thank the authors for the effort. When you ask for more work, explain why it matters. When you cannot see a fix, say so and move on. Your goal is a fair record for the editor and a useful to-do list for the team.
Be Specific About Evidence
Point to the line, table, or figure tied to each comment. If a statement lacks a citation, suggest one. If a measure is not standard, ask for a definition or a reference. If a result vanishes after a sensitivity check, ask the team to show that. When the paper uses a reporting checklist, ask the authors to upload the filled checklist with page numbers.
Common Pitfalls To Avoid
Do not ask the authors to write your preferred paper. If the study was not built to answer a different question, suggest a tighter claim, not a redesign. Skip line edits unless clarity blocks interpretation. Flag grammar only when it hides meaning. Avoid personal remarks and speculation about motives. Keep all manuscript content private.
Nitpicking Without Priority
Long lists of tiny edits bury the main message. Group minor edits and keep them short. Spend your energy on design, methods, and claims. Editors read for signal, not length.
Bias And Conflicts
Declare any links to the topic, the methods, or the authors. If you coauthored with an author in the recent past or share a grant, recuse. If you sense bias as you read, pause and reset. The job calls for fairness and care.
A Quick Template You Can Adapt
Summary: One short paragraph on the research question, design, setting, participants, main outcome, and main result.
Strengths: Three to six bullets on design quality, data quality, precise outcomes, clear reporting, or open materials.
Major Issues: Numbered points with evidence and a suggested fix when possible.
Minor Issues: Numbered points for clarity, figures, tables, and small edits.
Confidential To Editor: Scope fit, novelty, fairness, any ethics concerns, and your decision with a one line reason.
Next Steps After You Submit
Good reviews build better papers and stronger evidence. A clear process saves time for you and the journal. With practice, you will read faster, spot issues earlier, and write sharper comments. That helps readers and, down the line, patients who depend on sound research.