How To Do A Qualitative Systematic Review In Health Research | Plan, Search, Synthesize

Define a focused question, register a protocol, search widely, screen in pairs, appraise, synthesize transparently, and rate confidence with CERQual.

You can run a qualitative systematic review that readers trust and decision makers can use.
This guide gives you a clean, repeatable process that matches mainstream standards while staying practical for real projects.

The flow aligns with PRISMA 2020 reporting, the JBI approach to qualitative evidence, and GRADE-CERQual confidence ratings.
You will move from a sharp question to actionable findings without losing transparency.

Planning checklist and outputs

Step | What you produce | Tips and tools
Question and scope | Clear focus, context, and participants | Frame with a question stem that fits qualitative aims (e.g., experiences, barriers, views)
Protocol | Public record of methods | Register on OSF or PROSPERO; align with the PRISMA 2020 items for methods
Search | Reproducible strategies | Use subject headings and text words; capture grey literature; log dates and platforms
Screening | PRISMA flow and reasons for exclusion | Two reviewers working independently; pilot rules on a small set first
Appraisal | Study-level judgments | Use CASP or the JBI checklist; record quotes or page lines that justify calls
Extraction and coding | Structured data set and codebook | Code the findings sections line by line; revise the codebook as decisions converge
Synthesis | Analytic themes or translated concepts | Pick a method that fits the question: thematic synthesis, meta-ethnography, framework, or meta-aggregation
Confidence rating | CERQual profiles and a summary table | Judge methodological limits, coherence, adequacy, and relevance for each finding
Report | Full manuscript and appendices | Follow PRISMA 2020; use ENTREQ or eMERGe when suitable

What a qualitative systematic review is

A qualitative systematic review in health research brings together findings from primary qualitative studies to explain experiences, needs, barriers, enablers, or the workings of interventions and services.
The unit of analysis is the reported finding, not effect sizes. The result is an integrated set of concepts or themes that can inform policy, practice, guideline panels, and future trials.

Steps for a qualitative systematic review in health research

Work through these stages in order. Adjust depth to fit your topic, timelines, and team capacity.

1) Frame the question and scope

State the focus up front. For qualitative aims, write a question that targets experiences, views, needs, barriers, or implementation issues.
Define the population, setting, and any exposure or service of interest. Clarify what counts as qualitative evidence, such as interviews, focus groups, ethnography, open-ended survey data, or mixed-methods components with analyzable text.

Decide on inclusion and exclusion rules that match the aim. Typical rules cover study design, language, date limits, and health context. Keep rules simple and testable.

2) Write and register a protocol

Draft a protocol that fixes your question, sources, screening plan, appraisal tool, synthesis method, and a plan for assessing confidence in findings.
A public record reduces bias and helps others reuse your plan. Use the PRISMA 2020 checklist to shape the methods section, and lodge the protocol in an open registry such as PROSPERO (for health-related reviews) or OSF.

Name the team, including an information specialist for the search.

3) Build transparent searches

Databases and sources

Search widely. Databases often used for health topics include MEDLINE (via PubMed or Ovid), Embase, CINAHL, and PsycINFO.
Use both controlled vocabulary and text words. Combine synonyms with OR and join concept blocks with AND. Add study-method filters for qualitative research only when sensitivity remains high.
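The OR-within-block, AND-across-blocks pattern can be sketched in a few lines. This is a minimal illustration with made-up terms, not a validated strategy; real strings also need database-specific field tags and subject headings.

```python
def build_query(blocks):
    """OR-join synonyms within each concept block, AND-join the blocks."""
    return " AND ".join("(" + " OR ".join(terms) + ")" for terms in blocks)

# Illustrative concept blocks: condition, phenomenon, qualitative filter.
blocks = [
    ['"heart failure"', "cardiomyopathy"],
    ['"self-care"', '"self management"'],
    ["experience*", "perception*", "qualitative"],
]

print(build_query(blocks))
```

Keeping blocks in a structured file like this makes it easy to rerun and document the exact strings per platform.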

Grey literature and documentation

Reach beyond journals by scanning theses, conference abstracts, and reports from agencies or NGOs. Track backward and forward citations from your included studies. Peer review the strategy with a librarian and write down platform, coverage dates, and the strings you ran so others can repeat your work.

Export all records to a single library, remove duplicates, and archive the raw exports for your appendices.
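Deduplication across database exports is often scriptable. A minimal sketch, assuming records have been exported with title and year fields (the field names and sample records are illustrative); near-duplicates with differing titles still need a manual pass.

```python
import re

def norm(title):
    """Lowercase and collapse punctuation/whitespace for matching."""
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

def deduplicate(records):
    """Keep the first record per normalized (title, year) key."""
    seen, unique = set(), []
    for rec in records:
        key = (norm(rec["title"]), rec["year"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"title": "Patient views on X.", "year": "2021", "source": "MEDLINE"},
    {"title": "Patient Views on X",  "year": "2021", "source": "Embase"},
]
print(len(deduplicate(records)))  # prints 1
```

Archive the raw exports separately so the deduplication step itself stays auditable.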

4) Screen titles, abstracts, and full texts

Run a pilot round on fifty records to sharpen the rules. Then screen titles and abstracts in pairs, blind to each other’s decisions.
Retrieve full texts for possible includes and repeat paired screening. Record one primary reason for exclusion per study at the full-text stage.
Create a PRISMA flow diagram that shows numbers at each stage along with counts for duplicates and unresolved items.
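Paired screening also lets you report inter-rater agreement. A minimal sketch of Cohen's kappa over two reviewers' include/exclude decisions (the labels below are illustrative):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two reviewers' screening decisions."""
    assert len(r1) == len(r2), "Reviewers must screen the same records"
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    labels = set(r1) | set(r2)
    expected = sum((r1.count(l) / n) * (r2.count(l) / n) for l in labels)
    if expected == 1:
        return 1.0
    return (observed - expected) / (1 - expected)

r1 = ["include", "include", "exclude", "exclude"]
r2 = ["include", "exclude", "exclude", "exclude"]
print(round(cohens_kappa(r1, r2), 2))  # prints 0.5
```

Report kappa for the pilot round; low agreement is a signal to tighten the written rules before full screening.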

5) Appraise methodological quality

Use a structured checklist such as CASP qualitative or the JBI qualitative tool. Calibrate with two reviewers, compare judgments, and log notes that justify calls.
Focus on transparency of methods, fit between question and method, sampling approach, data collection, analysis, and reflexivity. Do not exclude solely on quality unless the study offers no usable findings.
Instead, carry low-confidence studies forward and reflect their limits in the confidence step.

6) Extract data and build a codebook

Extract the descriptive fields you need for context: setting, participants, method, and data source. Then pull the authors’ findings and any supporting quotes.
Code text line by line to capture meaning and action. Start with open codes, then merge into a shared codebook. Keep a decision log so the audit trail stays intact.

Use simple software that your team can share. Spreadsheets work for small reviews. For larger sets, a qualitative analysis tool can speed coding and retrieval.
Whichever route you take, version the files and back them up.
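For small reviews, the codebook and decision log can be plain CSV files that the whole team can open and version. A sketch under those assumptions; the column names, code, and file paths are illustrative placeholders, not a prescribed schema.

```python
import csv
from datetime import date

# Illustrative codebook entry: code name, definition, anchoring example.
codebook = [
    {"code": "access_barrier",
     "definition": "Obstacles to reaching or using care",
     "example": "Quote or line reference from an included study"},
]

with open("codebook.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["code", "definition", "example"])
    writer.writeheader()
    writer.writerows(codebook)

def log_decision(path, code, note):
    """Append a dated entry so the audit trail stays intact."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), code, note])

log_decision("decisions.csv", "access_barrier",
             "Merged 'transport' and 'cost' codes into one barrier code")
```

Because both files are append-friendly text, ordinary version control captures every revision of the codebook alongside the reasons for it.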

7) Choose a synthesis method that fits

Pick a method that matches your aim and the nature of the evidence. Thematic synthesis suits broad practice questions and mixed samples of designs.
Meta-ethnography suits concept building across rich qualitative studies. Framework synthesis works when a prior framework exists, such as a program theory or WHO model.
Meta-aggregation, used in the JBI tradition, groups findings into categories and synthesizes them into statements ready for use in guidance.

Write out the exact steps for your chosen method before you begin. For thematic synthesis, move from codes to descriptive themes and then to analytic themes that offer fresh insights.
For meta-ethnography, translate concepts across studies and produce a line-of-argument that explains the whole picture. For framework approaches, map coded data to the framework and show where new themes extend it.

8) Rate confidence in each finding

After you have a stable set of findings, grade your confidence using GRADE-CERQual. For each finding, judge four components: methodological limitations of the contributing studies, coherence across those studies, adequacy of data, and relevance to your question.
Write a brief profile that explains the judgments and assign a confidence level such as high, moderate, low, or very low. Link the profile to the finding in your main table so readers can see both the message and its strength.

Explain how confidence influenced any practice messages.
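The summary table that pairs each finding with its CERQual profile can be assembled programmatically, which also guards against leaving a component unjudged. A sketch with illustrative findings and judgments (the finding text, study labels, and wording of concerns are placeholders):

```python
# The four GRADE-CERQual components, judged per finding.
COMPONENTS = ("methodological_limitations", "coherence",
              "adequacy", "relevance")

def summarize(finding, judgments, level, studies):
    """One summary-table row pairing a finding with its CERQual profile."""
    missing = [c for c in COMPONENTS if c not in judgments]
    if missing:
        raise ValueError(f"Judge all four components; missing: {missing}")
    return {"finding": finding, "confidence": level,
            "studies": ", ".join(studies), **judgments}

row = summarize(
    "Patients value continuity with a known clinician",
    {"methodological_limitations": "minor concerns",
     "coherence": "no concerns",
     "adequacy": "moderate concerns",
     "relevance": "no concerns"},
    level="moderate",
    studies=["Smith 2019", "Lee 2021"],
)
```

Rows like this export directly to the main table while the fuller written profiles sit in an appendix.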

9) Report with clarity and make your search reproducible

Structure your write-up so readers can see what you did and what you found at a glance. Report the full search strings, databases, platforms, dates, and any limits.
Add the PRISMA flow and the study list with reasons for exclusion at full text. Use ENTREQ to make sure the narrative includes sampling, data handling, synthesis steps, and outputs.
If you used meta-ethnography, follow the eMERGe reporting items for the analytic phases and outputs.

Present a main table of findings that pairs a short finding statement with a short explanation and its confidence level. Keep appendices for long lists and raw material so the main text stays readable.

10) Plan for updates and sharing

State when an update is likely and how you will track new studies. Share your protocol, search files, screening forms, extraction tables, and codebook in a public repository.
This saves time for future teams and builds trust in your work.

Synthesis methods at a glance

Method | Best fit | Two moves that matter
Thematic synthesis | Practice questions with mixed designs | Code line by line; build descriptive and then analytic themes
Meta-ethnography | Concept building across rich studies | Reciprocal and refutational translation; craft a line-of-argument
Framework synthesis | Topics with an existing model | Map to the framework; expand it where data demand
Meta-aggregation | Decision-ready statements | Group findings into categories; synthesize categories into clear statements

Doing a qualitative systematic review in health research: frequent pitfalls

Many reviews stall or lose clarity because of recurring traps. Use the checks below to stay on track.

Keep the question tight

Over-broad aims pull in findings that do not speak to one another. Trim scope until your studies share a common thread in population, setting, or phenomenon.
State what will not be covered so readers know where the line sits.

Balance sensitivity and precision in searches

Thin searches miss studies. Bloated searches slow screening. Pilot strategies in one database and inspect a sample of hits. Add terms that retrieve missed landmark studies. Remove obvious noise terms that flood results without new includes.

Write decisions down

Teams often recall rules differently. Use a living log for screening and coding decisions with short examples. When a new edge case appears, add it to the log and alert the team.

Separate study quality from confidence in findings

A study can have limits yet still contribute insight. Keep appraisal at the study level and reflect those limits later in CERQual. Mixing the two can hide useful patterns.

Match the method to the aim

Do not force one synthesis method onto every question. If you need ready-to-use statements, meta-aggregation helps. If you need a fresh concept or program theory, meta-ethnography or framework routes may fit better.

Show the chain of evidence

A clear audit trail builds trust. Link each finding to sample quotes or authors’ statements and list which studies support it. Keep a short note on dissenting data and how you handled it.

Templates and deliverables you can reuse

Set up a small library of reusable files so each new review starts faster. Use the JBI Manual to model your protocol headings and critical appraisal forms.
For reporting, lean on the PRISMA 2020 resources for checklists and flow diagrams.
When you grade confidence, keep a simple profile template based on GRADE-CERQual so each finding has a short rationale and a clear level.

Store versions of the search strings, deduplication logs, and screening forms in the same folder as the protocol. Add a readme that lists file names and the order to read them.
Use the same naming convention across projects so teammates can jump in without a tour.

Ethics and reflexivity in review teams

Qualitative synthesis draws on interpretation. Name how the team’s backgrounds may shape choices during coding and theme building.
Keep short reflexive notes at milestones such as after the pilot coding round and after each synthesis cycle.
Report any conflicts of interest and explain data handling for unpublished materials such as theses or reports.

Plan fair credit. List contributors and their roles. Share authorship rules early so expectations stay clear.

Quality signals editors and reviewers look for

Clear scope and protocol

Editors want to see a sharp aim and a public protocol. A link to a timestamped registration helps.

Searches that others can repeat

Full strings, platforms, and dates belong in an appendix. A short description sits in the main text with a pointer to the full details.

Two-stage screening with a flow diagram

Paired screening and a clean PRISMA flow show care in study selection. The flow should tally across all stages without gaps.

Fit-for-purpose synthesis

The chosen method should match the aim and the data. A few sentences that state why the method fits will help readers trust the output.

Confidence linked to findings

Confidence ratings should sit next to findings, not as an afterthought. Readers should see what you found and how much weight to give it on one page.

Bring it all together

This path lets you conduct a qualitative systematic review that reads cleanly and stands up to scrutiny.
With a stable protocol, a transparent search and screening trail, a synthesis method that fits the aim, and confidence ratings that travel with each finding, your work will be ready for guideline panels, service leads, and peers who need credible messages from qualitative evidence. Close with a plain-language summary for readers who will not read the full report.