No, literature reviews synthesize published studies; they don’t gather new data, so they aren’t empirical research.
A lot of students and early-career researchers ask the same thing: are literature reviews empirical? The short answer above sets the baseline, but the story has nuance. Some review types apply strict methods, some apply statistics to study results, and some map a field without running tests or surveys. This guide draws the lines in plain language, shows how editors classify each review type, and gives you a quick way to label your own assignment or manuscript.
Are Literature Reviews Empirical? Where The Line Sits
Empirical work gathers original observations or measurements and analyzes those data to answer a question. A literature review does something else: it finds, reads, and compares prior studies, then writes a narrative from that body of work. In that sense, the review is secondary research. The units under the microscope are papers, not people or specimens. That’s why journals and style guides place most reviews in the non-empirical camp. You may see tight protocols and even meta-analysis inside a review, but the input still comes from other authors’ datasets. The output is a synthesis, not a new dataset. Sources below clarify this stance and supply naming and reporting rules used by editors and peer reviewers.
Fast Classifications You Can Trust
Use the table below to label common review types by whether they are empirical, what they build, and what readers should expect. This snapshot appears early so you can match your task and move on with confidence.
| Review/Study Type | Empirical? | What It Produces |
|---|---|---|
| Narrative Literature Review | No | Synthesis of themes, debates, and gaps drawn from prior studies |
| Systematic Review | No | Protocol-driven synthesis that follows pre-set search and screening steps |
| Meta-Analysis | No | Statistical pooling of effect sizes reported in prior studies |
| Scoping Review | No | Map of concepts, methods, and evidence across a topic |
| Umbrella Review | No | Summary of multiple reviews on a broader question |
| Rapid Review | No | Streamlined systematic approach with time-saving shortcuts |
| Bibliometric Review | Borderline | Counts and networks of publications and citations drawn from databases |
| Primary Empirical Study | Yes | New measurements from experiments, surveys, interviews, or observations |
Is A Literature Review An Empirical Study? Common Misreads
This is where confusion starts. A well-run review can look “data-heavy”: you see flow diagrams, risk-of-bias tables, and forest plots. Those figures summarize analyses of published results, not fresh measurements. Editors treat that as secondary analysis. Reporting guides like PRISMA 2020 were built for reviews of studies and tell you how to report search strings, screening steps, and synthesis methods. PRISMA does not turn a review into a trial or a survey; it just standardizes how you report a non-empirical synthesis. You can confirm that in the PRISMA overview and in the BMJ explainer for the 2020 update.
What Counts As “Empirical” In The First Place?
Empirical work collects raw observations and draws findings directly from them. Think sensors, lab assays, field notes, interview transcripts, or log files. A review draws from published results. It may extract numbers (sample sizes, means, odds ratios) and run statistics across studies, but those numbers were already measured by somebody else. The activity is rigorous, yet it remains synthesis. If your course or journal asks, “Is your piece empirical?” the safe answer for a review is no—unless you also ran a new study and the review is a background section inside that study.
Why Some People Call Reviews “Empirical”
Two edge cases create the label mix-up:
- Bibliometric reviews. These pull records from databases and count patterns. The counts are derived from metadata, not lab or field measures. Some authors call that “empirical” because numbers are involved. Most editors still file it under non-empirical secondary analysis; the sketch after this list shows why.
- Meta-analysis. This runs models on effect sizes and produces pooled estimates. The math is original, but the inputs are published. The work is analytical, yet the data source is still prior research.
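To see why editors treat bibliometric counting as secondary analysis, here is a minimal sketch in Python, assuming you have already exported a few database records; the field names and numbers are invented for illustration. Everything the script “analyzes” was recorded by databases and other authors before the review began.

```python
from collections import Counter

# Hypothetical metadata records exported from a bibliographic database.
# Nothing here was measured by the reviewer; every value was recorded upstream.
records = [
    {"title": "Study A", "year": 2019, "cited_by": 42},
    {"title": "Study B", "year": 2021, "cited_by": 17},
    {"title": "Study C", "year": 2021, "cited_by": 5},
]

# Bibliometric "analysis" is counting and grouping existing records.
papers_per_year = Counter(r["year"] for r in records)
total_citations = sum(r["cited_by"] for r in records)

print(papers_per_year)    # Counter({2021: 2, 2019: 1})
print(total_citations)    # 64
```

The output looks quantitative, but it never leaves the realm of published metadata, which is why the label stays at “borderline” rather than “empirical.”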
Core Traits: Review Vs Empirical Study
Here is a simple comparison that you can apply to any manuscript or assignment prompt.
Purpose
Review: Summarizes what is known, compares methods and outcomes, and explains where evidence converges or conflicts.
Empirical study: Tests a question with new observations and reports those results.
Inputs
Review: Published articles, reports, datasets, and their stated findings.
Empirical study: Raw measurements gathered by the author team.
Outputs
Review: A reasoned narrative, structured tables, and (in meta-analysis) pooled estimates with plots.
Empirical study: New data, statistical tests on those data, and a full methods section for replication.
Quality Signals Editors Check
- For reviews: Clear question, transparent search, inclusion/exclusion rules, critical appraisal, and a synthesis that matches the evidence. See the BMJ PRISMA 2020 paper for the reporting checklist and diagrams.
- For empirical studies: Sampling or recruitment plan, instruments or protocols, analysis plan, and data availability as allowed by ethics and policy.
Using The Main Keyword Naturally Across Your Text
If you are writing for readers who search “are literature reviews empirical?”, you can address that phrase directly in your intro, in one subheading, and once again when you wrap the guidance and give next steps. Keep the phrase intact where it reads smoothly, and lean on related terms elsewhere (systematic review, meta-analysis, scoping review, empirical study). That approach feels natural to readers and matches how academic librarians teach search behavior.
When A Review Feels “Data-Driven” But Isn’t Empirical
Some review formats look like data papers. A systematic review lays out a protocol, runs searches across databases, screens records, assesses bias, and then synthesizes evidence across included studies. The workflow is methodical. Still, the pipeline never touches participants or bench samples. That’s why the Cochrane Handbook defines a systematic review as a way to collate empirical evidence from prior research using explicit methods that aim to reduce bias. The phrase “collate evidence” marks it as synthesis, not primary data collection.
Meta-Analysis: New Statistics, Old Data
Meta-analysis pools effect sizes from included studies to raise precision. You will see forest plots, heterogeneity stats, and subgroup checks. These steps crunch numbers that were already published. The modeling is new; the measures are not. So journals treat meta-analysis as a review tool rather than a primary design.
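To make “new statistics, old data” concrete, here is a minimal sketch of fixed-effect inverse-variance pooling, assuming you have already extracted effect sizes and standard errors from published papers. The numbers are invented, and real meta-analyses typically rely on dedicated tools such as the R packages meta or metafor rather than hand-rolled code.

```python
import math

# Effect sizes and standard errors extracted from published studies.
# The values are invented for illustration; the reviewer measures nothing new.
studies = [
    {"name": "Trial 1", "effect": 0.30, "se": 0.10},
    {"name": "Trial 2", "effect": 0.45, "se": 0.15},
    {"name": "Trial 3", "effect": 0.20, "se": 0.08},
]

# Fixed-effect inverse-variance pooling: weight each study by 1 / se^2.
weights = [1 / s["se"] ** 2 for s in studies]
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")
```

Every line of modeling is new work by the review team, yet every input number comes from someone else’s measurements—exactly the distinction journals rely on when they file meta-analysis under review methods.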
Common Assignments And How To Label Them
Use this second table to label course or thesis components without guesswork.
| Assignment/Section | Empirical? | What To Include |
|---|---|---|
| Standalone Literature Review | No | Clear question, search scope, inclusion rules, critical synthesis |
| Systematic Review Paper | No | Protocol, databases, screening flow, bias appraisal, structured synthesis |
| Meta-Analysis Paper | No | Effect size extraction, model choice, heterogeneity, sensitivity checks |
| Thesis Chapter: Literature Review | No | Field map, gaps tied to your study, sources table, narrative linkages |
| Original Research Study | Yes | Participants/samples, instruments, procedures, analysis plan, results |
| Mixed Methods Thesis (Study + Review) | Partly | Separate chapters: review (non-empirical) and your study (empirical) |
| Bibliometric Mapping | Borderline | Database source, search keys, counting rules, network visuals |
How To Write A Strong Non-Empirical Review
Editors and instructors care about two things: a fair search and a fair synthesis. The following steps align with top guides and keep your review clear and checkable.
1) Set A Sharp Review Question
State the population or phenomenon, the concept or outcome, and the context. Keep the scope tight so the search and screening work stays manageable and you can synthesize with clarity.
2) Draft A Protocol You Can Follow
Even for a course paper, a one-page plan helps. Name the databases, date ranges, and languages. Add your inclusion and exclusion rules, even if they start out broad. If the paper is high-stakes, post the plan on a lab wiki or OSF page before you start to keep yourself honest.
3) Run A Transparent Search
List your exact search strings and the date you ran them. Capture the count of records returned from each database. Keep a copy of the strings in an appendix or repository so another reader can repeat the search later.
4) Screen With A Reproducible Flow
Record totals at each step: found, deduplicated, screened by title/abstract, screened full text, and included. A flow diagram keeps it clean. PRISMA offers a ready-made chart style that readers recognize instantly.
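As a minimal illustration, the tally below uses invented counts to show that each stage of the flow is bookkeeping on records rather than data collection; the stage labels loosely mirror a PRISMA 2020 flow diagram.

```python
# Hypothetical record counts at each screening stage (invented for illustration).
stages = [
    ("Records identified", 1240),
    ("After deduplication", 980),
    ("After title/abstract screening", 210),
    ("After full-text screening (included)", 38),
]

# A PRISMA-style flow diagram reports how many records drop out at each step.
for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    print(f"{prev_name} -> {name}: excluded {prev_n - n}")
print(f"Included in synthesis: {stages[-1][1]}")
```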
5) Appraise Study Quality
Choose a bias or quality tool that fits your topic (randomized trials, observational studies, qualitative reports, or mixed designs). Apply the same tool across all included studies and report the results in a compact table or heat map.
6) Synthesize Without Overreach
Group studies by design, population, or outcome and write the takeaways for each group. If results clash, lay out the plausible reasons: measurement differences, small samples, or context shifts. Keep claims tied to the evidence you actually have.
7) Cite Sources Readers Trust
Two links worth knowing: the Purdue OWL page on literature reviews for writing guidance, and PRISMA 2020 for reporting a systematic review and any linked meta-analysis. These are widely used across disciplines and give your paper a format readers can follow.
Edge Cases And How To Explain Them
Some projects sit near the boundary and can puzzle graders or reviewers. Here’s how to label them plainly in your cover letter or methods section.
Scoping Reviews
Use a scoping format when the field is messy or emerging and you need a map of concepts and approaches. This is still non-empirical. You are charting what exists, not running tests.
Mixed Projects
A thesis or grant may pair a review with a new study. Label each part on its own merits: the review chapter is non-empirical; the study chapter is empirical. Keep the methods separate so readers can see what came from where.
Service Evaluations And Audits
These collect new data inside one site or service. That makes them empirical even if the write-up looks like a report rather than a journal article. If you also include a review section, call that portion non-empirical to avoid confusion.
Where Editors Draw The Method Line
Editors use common yardsticks. The BMJ PRISMA 2020 article lists items a review should report: databases, search dates, selection process, risk-of-bias methods, and synthesis steps. The Cochrane Handbook chapter on starting a review describes systematic reviews as ways to gather and collate prior empirical evidence using methods that reduce bias. Both sources reinforce the same rule: a review studies studies; an empirical paper studies the world directly.
FAQ-Style Clarifications Without The FAQ Section
“My Instructor Says I Need An Empirical Paper. Can I Submit A Review?”
No. An empirical assignment asks you to gather data and report the results. A literature review fills the background or stands alone as synthesis. If you want to include a review inside an empirical paper, keep it as a distinct background section and make clear where your own data collection and analysis begin.
“Can A Review Include Numbers And Still Be Non-Empirical?”
Yes. Data extraction and meta-analysis apply math to study results that already exist. The analysis is new; the measurements are not. That keeps the work in the review category.
“Where Do Librarians Fit In?”
They help craft search strategies, pick databases, and set up screening tools. Involving a librarian, and crediting them as a coauthor or in the acknowledgments, raises both search quality and transparency.
Bottom Line For Writers And Students
When you see the question “are literature reviews empirical?” answer no in your abstract and cover letter, then show your method for a clean synthesis. If your project includes both a review and a new study, label each part plainly. Use Purdue OWL for writing help and PRISMA for reporting a systematic approach. With those anchors and the checklists above, your reader can tell exactly what kind of paper they’re reading—and you can defend that label with ease.
