Making Sense of Research: Literature Reviews and Meta-Analysis

When journalists want to learn what’s known about a certain subject, they look for research. Scholars are continually conducting studies on education topics ranging from kindergarten readiness and teacher pay to public university funding and Ivy League admissions.

One of the best ways for a reporter to get up to date quickly, though, is to read a study of studies, which come in two forms: a literature review and a meta-analysis.

A literature review is what it sounds like – a review of all the academic literature that exists on a specific issue or research question. If your school district or state is considering a new policy or approach, there’s no better way to educate yourself on what’s already been learned. Your news coverage also benefits from literature reviews: Rather than hunting down studies on your own and then worrying whether you found the right ones, you can, instead, share the results of a literature review that already has done that legwork for you.

Literature reviews examine both quantitative research, which is based on numerical data, and qualitative research, based on observations and other information that isn’t in numerical form. When scholars conduct a literature review, they summarize and synthesize multiple research studies and their findings, highlighting gaps in knowledge and the studies that are the strongest or most pertinent.

In addition, literature reviews often point out and explain disagreements between studies – why the results of one study seem to contradict the results of another.

For instance, a literature review might explain that the results of Study A and Study B differ because the two pieces of research focus on different populations or examine slightly different interventions. By relying on literature reviews, journalists also will be able to provide the context audiences need to make sense of the cumulative body of knowledge on a topic.

A meta-analysis also can be helpful to journalists, but for different reasons. To conduct a meta-analysis, scholars focus on quantitative research studies that aim to answer the same research question – for example, whether there is a link between student suspension rates and academic achievement or whether a certain type of program reduces binge drinking among college students.

After pulling together the quantitative research that exists on the topic, scholars perform a systematic analysis of the numerical data and draw their own conclusions. The findings of a meta-analysis are statistically stronger than those reached in a single study, partly because pooling data from multiple, similar studies creates a larger sample.

The results of a meta-analysis are summarized as a single number or set of numbers that represent an average outcome for all the studies included in the review. A meta-analysis might tell us, for example, how many children, on average, are bullied in middle school, or the average number of points SAT scores rise after students complete a specific type of tutoring program.
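
To make that pooling idea concrete, here is a minimal sketch in Python using entirely hypothetical numbers (the studies, effect sizes and standard errors below are invented for illustration, not drawn from real research). It shows one common approach, a fixed-effect weighted average, in which larger, more precise studies count for more in the final summary number.

```python
import math

# Hypothetical example: three made-up studies of the same tutoring program,
# each reporting an effect on SAT scores (in points) and a standard error.
# Larger studies report smaller standard errors, so they carry more weight.
studies = [
    {"name": "Study A", "effect": 30.0, "se": 10.0},  # small study
    {"name": "Study B", "effect": 18.0, "se": 5.0},   # medium study
    {"name": "Study C", "effect": 22.0, "se": 3.0},   # large study
]

# Fixed-effect pooling: weight each study by the inverse of its variance.
weights = [1.0 / (s["se"] ** 2) for s in studies]
pooled_effect = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

print(f"Pooled effect: {pooled_effect:.1f} points (standard error {pooled_se:.1f})")
# The single pooled number summarizes all three studies -- and, as noted below,
# it can hide real differences in how each study was designed and conducted.
```

Notice that the summary is not a simple average of the three results: the weighting pulls it toward the bigger, more precise studies, which is also why pooling several similar studies yields a statistically stronger estimate than any one of them alone.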

It’s important to note that a meta-analysis is vulnerable to misinterpretation because its results can be deceptively simple: Just as you can’t learn everything about students from viewing their credit ratings or graduation rates, you can miss out on important nuances when you attempt to synthesize an entire body of research with a single number or set of numbers generated by a meta-analysis.

For journalists, literature reviews and meta-analyses are important tools for investigating public policy issues and fact-checking claims made by elected leaders, campus administrators and others. But to use them, reporters first need to know how to find them. And, as with any source of information, reporters also should be aware of the potential flaws and biases of these research overviews.

Finding Research

The best places to find literature reviews and meta-analyses are peer-reviewed academic journals such as the Review of Educational Research, Social Problems, and PNAS (short for Proceedings of the National Academy of Sciences of the United States of America). While publication in a journal does not guarantee quality, the peer-review process is designed for quality control. Typically, papers appearing in top-tier journals have survived detailed critiques by scholars with expertise in the field. Thus, academic journals are an important source of reliable, evidence-based knowledge.

An easy way to find journal articles is by using Google Scholar, a free search engine that indexes published and unpublished research. Another option is to go directly to journal websites. Although many academic journals keep their research behind paywalls, some provide journalists with free subscriptions or special access codes. Other ways to get around journal paywalls are outlined in a tip sheet that Journalist’s Resource, a project of Harvard’s Shorenstein Center on Media, Politics and Public Policy, created specifically for reporters.

Another thing to keep in mind: Literature reviews and meta-analyses do not exist on every education topic. If you have trouble finding one, reach out to an education professor or research organization such as the American Educational Research Association for guidance.

Sources of Bias

Because literature reviews and meta-analyses are based on an examination of multiple studies, the strength of their findings relies heavily on three factors:

  1. the quality of each included study,
  2. the completeness of researchers’ search for scholarship on the topic of interest, and
  3. researchers’ decisions about which studies to include and leave out.

In fact, many of the choices researchers make during each step of designing and carrying out a meta-analysis can create biases that might influence their results.

Knowing these things can help journalists gauge the quality of a literature review or meta-analysis and ask better questions about it. This comes in handy for reporters who want to apply a critical lens to their coverage of these two forms of research, especially studies that claim to have made a groundbreaking discovery.

That said, vetting a review or meta-analysis can be time-consuming. Remember that journalists are not expected to be experts in research methods. When in doubt, contact education researchers for guidance and insights. Also, be sure to interview authors about their studies’ strengths, weaknesses, limitations and real-world implications.

Study Quality, Appropriateness

If scholars perform a meta-analysis using biased data or data from studies that are too dissimilar, the findings might be misleading – or outright incorrect. One of the biggest potential flaws of meta-analyses is the pooling of data from studies that should not be combined. For example, even if two individual studies focus on school meals, the authors might be looking at different populations, using different definitions and collecting data differently.

Perhaps the authors of the first study consider a school meal to be a hot lunch prepared by a public school cafeteria in Oklahoma, while the research team for the second study defines a school meal as any food an adult or child eats at college preparatory schools throughout Europe. What if the first study relies on data collected from school records over a decade and the second relies on data extracted from a brief online survey of students? Researchers performing a meta-analysis would need to make a judgment call about the appropriateness of merging information from these two studies, conducted in different parts of the world.

Search Completeness

Researchers should explain how hard they worked to find all the research that exists on the topic they examine. Small differences in search strategies can lead to substantial differences in search results. If, for instance, search terms are too vague or too specific, scholars might miss some compelling studies. Likewise, results may vary according to the databases, websites and search engines used.

Decisions About What to Include

Scholars are not supposed to cherry-pick the research they include in literature reviews and meta-analyses. But decisions researchers make about which kinds of scholarship make the cut can influence conclusions.

Should they include unpublished research, such as working papers and papers presented at academic conferences? Does it make sense to exclude studies written in foreign languages? What about doctoral dissertations? Should researchers only include studies that have been published in journals, which tend to favor research with positive findings? Some scholars argue that meta-analyses that rely solely on published research offer misleading findings.

Other Factors to Consider

As journalists consider how the process of conducting literature reviews and meta-analyses affects results, they also should look for indicators of quality among the individual research studies examined. For example:

  • Sample sizes: Bigger samples tend to provide more accurate results than smaller ones.
  • Study duration: Data collected over several years generally offer a more complete picture than data gathered over a few weeks.
  • Study age: In some cases, an older study might not be reliable anymore. If a study appears to be too old, ask yourself if there is a reason to expect that conditions have changed substantially since its publication or release.
  • Researcher credentials: A scholar’s education, work experience and publication history often reflect their level of expertise.

Denise-Marie Ordway is a veteran education reporter and the managing editor of Journalist’s Resource, a project of Harvard’s Shorenstein Center on Media, Politics and Public Policy aimed at bridging the gap between journalism and academia. She would like to thank James S. Kim, a professor at the Harvard Graduate School of Education, for reviewing and providing feedback on this article.

This reporter guide was made possible by a grant from the W.T. Grant Foundation.
