PRISMA turns a messy pile of search hits into a simple story. The flow diagram shows where records came from, what was screened, what was excluded, and what made the final cut. If you plan and log your work as you go, drawing the diagram near the end takes minutes. This guide walks you through clear steps for a literature review that needs a PRISMA diagram, with notes on wording, counts, and layout that align with PRISMA 2020.
What a PRISMA diagram shows
A PRISMA diagram is a flowchart that reports the numbers behind study selection. It starts with identification of records from databases and other sources, shows deduplication, tracks screening at title and abstract level, then documents full text review with reasons for exclusion. The final boxes report studies included in the review and, if used, in any meta-analysis. You can download the official PRISMA 2020 flow diagram from the PRISMA website.
| Stage | What To Record | Where You Get The Number |
|---|---|---|
| Records identified: databases | All hits from each database search before deduplication | Database export or screenshot |
| Records identified: registers | All hits from trial or study registers | Register export |
| Records removed before screening: duplicates | Difference between total identified and the set after deduplication | Reference manager or deduplication log |
| Records screened | De-duplicated records assessed at title and abstract stage | Screening tool export |
| Records excluded | Excluded at title and abstract stage | Screening log |
| Reports sought for retrieval | Full text reports you tried to obtain | Request log |
| Reports not retrieved | Reports you could not obtain | Request log |
| Reports assessed for eligibility | Full text reports read and judged against criteria | Full text screening log |
| Reports excluded with reasons | Full text reports excluded, grouped by one main reason | Screening log with codes |
| Studies included in review | Distinct studies that met criteria | Study inclusion log |
| Studies included in meta-analysis | Subset that entered a quantitative synthesis | Meta-analysis dataset |
PRISMA flow diagram for a literature review: steps that work
You will fill the diagram while you manage records. Save searchable notes and exportable counts so you can show exactly how each number was produced.
Set up your protocol
Write down your question, eligibility criteria, sources you plan to search, and screening rules before you begin. If your field expects registration, add the record link to your report and keep the wording consistent across the diagram and text.
Run and log your searches
Search multiple databases that fit your topic and add trial registers or preprint servers when relevant. Save each full strategy, the platform, the date, and the number of records retrieved. Export search results in a single format if you can. For methods on planning searches and selecting studies, see the Cochrane Handbook chapter on searching and selecting studies.
Remove duplicates
Combine all records into one library and run a deduplication step. Record the total before and the total after. The difference becomes “duplicates removed.” Keep the method brief in your methods section so readers can repeat it.
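If your reference manager does not report the deduplication numbers directly, a few lines of scripting can produce them from a combined export. Below is a minimal sketch, assuming a CSV export named combined_export.csv with title and year columns; the file name, field names, and matching rule are placeholders you would adapt to your own export.

```python
# Minimal sketch: deduplicate a combined export and log the counts the
# diagram needs. The file name and field names are illustrative; adjust
# them to whatever your reference manager exports.
import csv

def deduplicate(records):
    seen, unique = set(), []
    for rec in records:
        key = (rec["title"].strip().lower(), rec["year"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

with open("combined_export.csv", newline="", encoding="utf-8") as f:
    records = list(csv.DictReader(f))

unique = deduplicate(records)
print(f"Records identified:  {len(records)}")
print(f"Duplicates removed:  {len(records) - len(unique)}")
print(f"Records to screen:   {len(unique)}")
```

Whatever tool or script you use, the point is the same: keep the before and after totals, because their difference is the number that goes in the "duplicates removed" box.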
Title and abstract screening
Screen titles and abstracts against your criteria. If you screen solo, keep a clear audit trail. Record the number sent to screening and the number excluded at this point. Use a short rule set for typical reasons to keep decisions consistent.
Retrieve and screen full texts
Collect full texts for all records that pass the first screen, and for any record that is unclear. Log how many reports you attempted to retrieve, how many you could not obtain, and how many you assessed. When you exclude a report at this stage, pick one primary reason from a short coded list so the diagram can display grouped counts.
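One way to keep the grouped counts tidy is to store a single reason code per excluded report and tally the codes at the end. A minimal sketch with made-up report IDs and example reason labels:

```python
# Minimal sketch: tally one primary exclusion reason per full-text report
# so grouped counts can sit under the "reports excluded" box.
# The report IDs and reason labels below are examples, not a required list.
from collections import Counter

full_text_decisions = {
    "report_017": "wrong population",
    "report_023": "wrong outcome",
    "report_031": "wrong study design",
    "report_044": "wrong population",
    # ... one entry per excluded report
}

reason_counts = Counter(full_text_decisions.values())
for reason, n in reason_counts.most_common():
    print(f"{reason}: {n}")

# The sum across reasons must equal the "reports excluded" box count.
print(f"Total excluded: {sum(reason_counts.values())}")
```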
Include studies and build the diagram
Once full text decisions are complete, count studies and, if applicable, count reports as well. Record how many studies enter the review and how many were pooled in a meta-analysis. Now transfer the counts to the boxes. If you searched registers as well as databases, select the PRISMA 2020 template that covers both sources.
Design tips that match PRISMA 2020
Keep the label text from the template. Adjust box notes only when the template tells you to add detail, such as a list of exclusion reasons. Use the new wording for “records identified,” “records screened,” “reports sought,” and “reports assessed,” which aligns counts with how searches and screening run today. If you update a prior review, use the update template so new and old sources are clear. The PRISMA 2020 statement sets the current items, wording, and figure styles.
Labels and totals
Place the total for each box inside the box. If a box calls for a list of reasons, place the list under the box with one line per reason and a count for each line. The grand total across reasons should equal the number of reports excluded at full text.
Reasons for exclusion
Use the smallest set of reasons that still helps a reader. Common lines are wrong population, wrong intervention or exposure, wrong outcome, wrong study design, duplicate publications, and not available in full text. Pick one reason per excluded report to avoid double counting.
Automation and registers
If you used automation tools or a classifier during screening, the PRISMA template includes a place to report the number of records handled by automation and the number excluded. If you searched registers, use the version that splits records identified from databases and from registers. The flows then merge just before screening.
Doing a PRISMA diagram for literature reviews: worked example
Say your topic is wearable sleep trackers for adults with insomnia. You search four databases and one trial register. You export all records, remove duplicates, screen titles and abstracts, and then assess full texts. Below is a simple set of counts that shows how the pieces fit.
Example dataset and counts
Databases return 1,840 records. The register returns 65 records. After deduplication you keep 1,520 de-duplicated records. You screen these at title and abstract level and exclude 1,320. You seek 200 full texts, but you cannot obtain 8. You assess 192 full text reports and exclude 160 with reasons. You include 32 studies in the review, of which 18 enter a meta-analysis.
Map the counts to the boxes
In the top boxes you record 1,840 from databases and 65 from registers. In the next box you record 385 duplicates removed. In the screening box you record 1,520 records screened and 1,320 excluded. In the full text boxes you record 200 reports sought, 8 not retrieved, 192 assessed for eligibility, and 160 excluded with reasons. In the last boxes you record 32 studies included and 18 in the meta-analysis. Then you add a short list of reasons under the full text exclusion box with counts that add to 160.
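Before drawing the boxes, it helps to confirm the numbers are internally consistent. Below is a minimal sketch using the example counts above; note that the final assertion holds here only because each included study happens to have a single report.

```python
# Minimal sketch: check that the worked-example counts flow consistently
# from identification through inclusion before filling the diagram boxes.
counts = {
    "db_records": 1840,
    "register_records": 65,
    "duplicates_removed": 385,
    "records_screened": 1520,
    "records_excluded": 1320,
    "reports_sought": 200,
    "reports_not_retrieved": 8,
    "reports_assessed": 192,
    "reports_excluded": 160,
    "studies_included": 32,
    "studies_in_meta_analysis": 18,
}

# Identification minus duplicates must equal the records screened.
assert (counts["db_records"] + counts["register_records"]
        - counts["duplicates_removed"]) == counts["records_screened"]

# Records screened minus records excluded must equal the reports sought.
assert counts["records_screened"] - counts["records_excluded"] == counts["reports_sought"]

# Reports sought minus reports not retrieved must equal the reports assessed.
assert counts["reports_sought"] - counts["reports_not_retrieved"] == counts["reports_assessed"]

# Reports assessed minus reports excluded gives reports of included studies;
# it equals studies included here because each study has one report.
assert counts["reports_assessed"] - counts["reports_excluded"] == counts["studies_included"]

# The meta-analysis subset can never exceed the included studies.
assert counts["studies_in_meta_analysis"] <= counts["studies_included"]

print("All box counts are consistent.")
```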
Common pitfalls and fixes
Small slips create mismatched totals and confuse readers. The table below lists frequent problems and simple fixes that keep your numbers tidy and your diagram easy to read.
| Problem | How It Shows Up | Fix |
|---|---|---|
| Totals don’t add up | Box counts or reason totals conflict | Walk through the flow top to bottom and recalculate from your logs |
| Duplicates not reported | No number for deduplication | Save before and after counts and show the difference |
| Mixed records and reports | Counts swap between records and reports | Use records for screening and reports for full texts |
| Long reasons list | Too many granular reasons | Pick one primary reason per report and group short labels |
| Unclear sources | Databases or registers not named | Name each source in methods and mirror labels in the figure |
| Hidden date info | Search dates missing | Add dates in the methods and, if space permits, near the figure |
| Automation not explained | Classifier used but not reported | Add a one line note and show the numbers in the matching boxes |
| Meta-analysis count missing | Figure omits pooled studies | Add the final box for studies in the meta-analysis |
| Wrong update template | Old and new searches mixed | Use the PRISMA update template and split old vs new sources at the top |
Make it reader-friendly
Use a clear font and adequate spacing so each number is easy to spot. Keep box order left to right and top to bottom. Add alt text when you insert the figure so screen reader users get the same story. Near the figure, echo all main totals so readers do not need to zoom in to read the image.
Tools to draw the diagram
You can fill the PRISMA Word template, build a clean diagram in a slide deck, use a vector editor, or pick an online generator that exports vector formats. When a tool lets you lock styles, set consistent box widths, font sizes, and arrow weights so the figure looks polished. The PRISMA website also links to Word templates, which many journals expect or explicitly request.
What reviewers look for
Each number should be reproducible from your logs. Report dates for each search source. Keep the same inclusion and exclusion rules in your methods and screening forms. When counts change during revision, update both the figure and the text. Link to the checklist and the flow diagram template you followed so readers can verify the version.
Count studies and reports the right way
PRISMA 2020 draws a line between records, reports, and studies. Records are the entries you screen at title and abstract stage. Reports are the full texts you read. A study can have more than one report. That often happens with protocol papers, conference abstracts, and companion articles. When you reach the final boxes, you count studies, not reports. To avoid double counting, pick one report as the anchor for each study and link related reports to that anchor in your spreadsheet or reference manager.
During screening, treat records that describe the same report as duplicates. During full text review, treat multiple reports of the same study as a bundle that leads to one include or one exclude. If two reports describe different outcomes or time points from the same trial, they still map to one study in the “included” box. If one report meets criteria and another report from the same study does not, keep the study and note the extra report in your data charting file.
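A small mapping from each anchor study to its related reports keeps this distinction straight. Below is a minimal sketch with hypothetical study and report identifiers; any column or field names you use in your own spreadsheet will differ.

```python
# Minimal sketch: link companion reports to one anchor study so the final
# box counts studies, not reports. All IDs below are made up.
included_reports = {
    "NCT00000001": ["smith_2021_main", "smith_2022_followup", "smith_2020_protocol"],
    "chen_2019":   ["chen_2019_trial"],
    "mori_2020":   ["mori_2020_rct", "mori_2021_companion"],
}

n_studies = len(included_reports)  # goes in the "studies included" box
n_reports = sum(len(reports) for reports in included_reports.values())

print(f"Studies included: {n_studies}")
print(f"Reports of included studies: {n_reports}")
```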
Updates and living reviews
When you update a review, searches span at least two time frames. The update template has separate boxes at the top for new sources and the prior flow. Keep all counts from the original review intact, then add new records and reports under the update path. If you maintain a living review with regular search runs, keep a dated log and archive each diagram by cycle, then provide an aggregate figure that shows the cumulative totals and the current included set.
Data management that makes counting easy
A tidy workflow makes the diagram simple to fill. Name exports with a pattern that includes the source and the date. Keep one spreadsheet for counts with a row per source and columns for strategy link, platform, date, and hit count. Record deduplication settings and the before and after totals. Use screening software with exportable decisions, or if you screen by hand, save a sheet with record IDs and keep/drop calls. When you reach full texts, store PDFs by status and keep a sheet with one column for status codes.
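The counts sheet described above can be as simple as a CSV written from a short script. A minimal sketch follows; the file name, column names, sources, and hit counts are all placeholders.

```python
# Minimal sketch: one row per source in a counts sheet, written as CSV so
# every diagram number can be traced back to a dated export.
import csv

search_log = [
    {"source": "MEDLINE", "platform": "Ovid", "date": "2024-03-12",
     "strategy_file": "searches/medline_2024-03-12.txt", "hits": 612},
    {"source": "Embase", "platform": "Elsevier", "date": "2024-03-12",
     "strategy_file": "searches/embase_2024-03-12.txt", "hits": 744},
]

with open("search_counts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=search_log[0].keys())
    writer.writeheader()
    writer.writerows(search_log)

# This total feeds the "records identified: databases" box before deduplication.
print(f"Records identified from databases: {sum(row['hits'] for row in search_log)}")
```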
Quick template you can reuse
1) Write a short protocol with question, criteria, and sources. 2) Save each search exactly as run and record counts on the day. 3) Deduplicate once and record the method. 4) Screen titles and abstracts with a rule set. 5) Retrieve full texts and code one reason for each exclusion. 6) Count included studies and those in any meta-analysis. 7) Fill the PRISMA 2020 boxes and add the reasons list. 8) Export the diagram as vector art and add alt text.
