The Do’s and Don’ts of Covering Education Research

When it comes to education research, the biggest mistake journalists make is avoiding it

In her talk at EWA’s recent annual conference in Washington, D.C., Holly Yettick admitted that’s exactly what she did as a reporter: dismiss research as too difficult to cover, or as something best left to national publications.

Today, as the director of the Education Week Research Center, Yettick doesn’t want reporters to make the same error, and miss out on studies that can help them break news, add context to their stories, and hold public officials accountable.

Without research, for example, reporters in Denver wouldn’t have been able to show that a school closure, which the district touted as beneficial to students, actually hurt them. A study by local academics, Yettick said, found that the disruption caused by the closure set back students’ academic performance.

But not all research is good research, and Yettick offered tips on how to find studies that deserve attention:

Look for peer-reviewed research, one form of quality control. “Peer reviewed” means that, before a study is published, at least three readers who don’t know who wrote it evaluate it. Those reviewers usually have expertise in the field the study covers. The process isn’t perfect: Yettick compared it to a seat belt, because it reduces, rather than eliminates, the chances that journalists end up writing about shoddy research.

The best place to find peer-reviewed research, Yettick said, is in academic journals. Some of her recommendations: AERA Open (a new publication), Educational Evaluation and Policy Analysis, and the AERA journal. She also recommends going to academic conferences, or at least flipping through their programs. One of her favorites, which she called “the best kept secret in academia,” is the annual conference of the Association for Education Finance and Policy.

For studies that aren’t peer reviewed, do some vetting yourself. One way is to ask an academic to weigh in on a study’s design; many are happy to do so. Professors can also help identify the best research on a topic. Yettick called this the “tour guide” approach to research, which has benefits but also a few tourist traps.

She warned against “eduttantes,” who will provide a pithy quote on any subject. They should be avoided, Yettick said, because “nobody is an expert on everything.” Then there are the nihilists, who say all research is bad. “If you hear that all the time, you have to start getting a little skeptical,” she said. One sign you have a good tour guide: They often say, “I don’t know the answer to that, but I’ll give you the number to someone who does.”

Beyond ignoring research altogether, Yettick identified four other common mistakes in covering it:

Mistake 1: Ignoring research reviews. Yettick said reviews offer an efficient, effective way to get up to speed on a topic. “It’s better to know what we know cumulatively,” she said, “because any one study can be flawed.”

There are two types of reviews: meta-analyses, which statistically combine results from multiple studies into a single quantitative measure of the evidence, and qualitative reviews, which describe what the studies say. (Tip: Don’t assume you have to pay for reviews — or other studies — that are behind a paywall. Ask the publisher or the author, and they often will share the study for free. AERA will give journalists free access to studies in its publications, too.)
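For readers curious what “a quantitative measure of the evidence” looks like in practice, here is a minimal sketch of one common approach, a fixed-effect meta-analysis that weights each study by its precision. The study names, effect sizes, and standard errors are hypothetical, invented only to show the arithmetic.

```python
# Sketch of the arithmetic behind a fixed-effect meta-analysis: each study's
# effect size is weighted by its precision (1 / standard error squared), so
# large, precise studies count more toward the combined estimate.
# All numbers below are hypothetical.

studies = [
    {"name": "Study A", "effect": 0.30, "se": 0.10},
    {"name": "Study B", "effect": 0.10, "se": 0.05},
    {"name": "Study C", "effect": 0.25, "se": 0.15},
]

weights = [1 / (s["se"] ** 2) for s in studies]          # inverse-variance weights
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)

print(f"Pooled effect size: {pooled:.2f}")               # one cumulative estimate
```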

Mistake 2: Focusing on the size of the study’s sample (such as the number of students included) without looking at how the sample was selected. A huge sample isn’t valuable if it’s not chosen well, Yettick said.

One famous example, she said, is a 1936 poll of 2.4 million Americans conducted by Literary Digest. Despite the large number of respondents, the poll incorrectly predicted that Alfred Landon would beat Franklin D. Roosevelt in the presidential race. The reason? The 2.4 million respondents skewed toward the financially comfortable: in selecting people, the pollsters drew from sources dominated by people with money, such as (at the time) telephone listings. A random sample of just 1,000 people would have given more accurate results, Yettick said.
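A quick simulation makes the point concrete. The numbers below are invented, not the actual 1936 figures: a huge poll drawn from a skewed list misses badly, while a random sample of 1,000 lands within a few points of the truth.

```python
# Rough illustration of why how a sample is chosen matters more than how big
# it is. Invented numbers: suppose 60 percent of voters support Candidate R,
# but the lists the pollsters can reach (phone owners, in the 1930s) lean the
# other way, with only 35 percent support.
import random

random.seed(0)

population = ([1] * 600 + [0] * 400) * 1_000    # 1,000,000 voters, 60% support (1 = supports R)
phone_owners = ([1] * 350 + [0] * 650) * 200    # 200,000 reachable voters, 35% support

huge_biased_poll = random.sample(phone_owners, 100_000)   # enormous, but drawn from the wrong pool
small_random_poll = random.sample(population, 1_000)      # tiny, but drawn at random from everyone

print(sum(huge_biased_poll) / len(huge_biased_poll))      # ~0.35: wrongly predicts R loses
print(sum(small_random_poll) / len(small_random_poll))    # ~0.60, within a few points of the truth
```

Run it a few times and the small random sample keeps hovering near the true figure, while the biased poll never recovers, no matter how big it gets.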

Mistake 3: Ignoring what’s called “effect size,” which helps researchers gauge whether an outcome is meaningful, not just statistically detectable. (For more on understanding this important concept, take a look at EWA’s helpful explainer.) Reporters should also ask teachers, principals and other school staff: Given the expense of an effort, and the time needed to put it in place, is it worth doing?
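Effect size can be expressed in several ways; one common measure, Cohen’s d, divides the difference between group means by the pooled standard deviation. The sketch below uses hypothetical test scores just to show the calculation.

```python
# Cohen's d: difference between group means divided by the pooled standard
# deviation. The test scores below are hypothetical.
from statistics import mean, stdev

program_scores = [72, 75, 78, 80, 74, 77, 79, 76]   # students in the new program
control_scores = [70, 73, 74, 76, 71, 72, 75, 73]   # comparison group

n1, n2 = len(program_scores), len(control_scores)
s1, s2 = stdev(program_scores), stdev(control_scores)
pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5

d = (mean(program_scores) - mean(control_scores)) / pooled_sd
print(f"Cohen's d: {d:.2f}")   # about 1.4 here; rough benchmarks: 0.2 small, 0.5 medium, 0.8 large
```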

Mistake 4: Overgeneralizing conclusions. If a study involves college students, for example, it’s probably not applicable to 8-year-olds.

If journalists avoid all these mistakes — especially the first, ignoring research reviews — they can accomplish what Yettick sees as one of the goals of journalism.

“Part of your job,” she said, “is to help people avoid reinventing the wheel and repeating the past.”
