Families, educators and journalists alike are trying to make sense of COVID-19’s lasting impact on students. Understanding the academic and social effects will be crucial as public school districts and lawmakers revamp their recovery plans.
A new database can help reporters find and bolster these essential stories.
The Education Recovery Scorecard, released in October 2022, was spearheaded by Tom Kane and Sean Reardon at Harvard and Stanford universities, respectively. They and their teams examined state and national assessment data to measure the impact of the pandemic on individual public school districts.
By providing this data, researchers are encouraging journalists to compare achievement gaps of districts within states and across state lines, further showing how remote schooling, federal spending and other factors are affecting young learners at the local level.
Kane, professor of education and economics at Harvard, is challenging reporters not only to ask districts for a list of things they’re doing, but also to press them on why those efforts will be enough.
“Show me on paper,” Kane gave as an example. “What are the effect sizes you’re expecting from each of those things? And do they add up to the magnitude of our losses?”
“Many districts,” he said, “… would find they’re way short.”
This explainer will provide a better sense of what the Education Recovery Scorecard is, detail how reporters can use it for stories and list tips they should keep in mind.
Understanding the Education Recovery Scorecard
The scorecard includes metrics from nearly 4,000 public school districts in 29 states and Washington, D.C. It uses data from 2019 and 2022 on math and reading scores for students in third through eighth grade. This includes recent results from The Nation’s Report Card, or National Assessment of Educational Progress (NAEP).
Users can compare similar districts and analyze disparities based on student demographics.
Looking at districts in urban settings, for example? Reporters can now easily compare Portland Public Schools in Oregon to Minneapolis Public Schools in Minnesota or Charleston County School District in South Carolina. Of note, the scorecard does not provide data for individual schools or students.
Interactive maps and corresponding data offer insight into instructional spending per student and district allocations of the Elementary and Secondary School Emergency Relief Fund, known as ESSER.
Unlike other data sets, the scorecard works no matter how each state defines “proficiency,” and it puts the results in more easily understandable terms.
Instead of looking at the percentage of students meeting specific benchmarks, as state and federal reports often do, this data set shows the “mean,” or average, results. Researchers from the project said this better incorporates the best and worst student scores.
Lastly, the scorecard quantifies the results in “years of schooling missed,” making it easier to grasp the impact and make more apples-to-apples comparisons. Reporters will see this as “grade equivalents” on the scorecard. Researchers said a loss of one grade equivalent is roughly the amount of learning that typically occurs in a single school year.
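For reporters working with the downloadable data, the conversion behind “grade equivalents” can be sketched in a few lines. The numbers and the growth constant below are made up for illustration; the scorecard team’s actual methodology involves additional statistical adjustments.

```python
# Hypothetical sketch: expressing a change in mean standardized test
# scores as "grade equivalents." TYPICAL_ANNUAL_GAIN is an assumed
# amount of score growth in one school year, not the scorecard's figure.

TYPICAL_ANNUAL_GAIN = 0.40  # assumed annual growth in standardized-score units

def grade_equivalents(mean_2019: float, mean_2022: float) -> float:
    """Convert the change in mean scores into grade equivalents."""
    return (mean_2022 - mean_2019) / TYPICAL_ANNUAL_GAIN

# A district whose mean math score fell from 0.10 to -0.10 would show
# a loss of half a grade equivalent -- roughly half a year of schooling.
print(round(grade_equivalents(0.10, -0.10), 2))  # -0.5
```

The point of the unit is comparability: a half-grade-equivalent loss means roughly the same thing in Portland as in Charleston, regardless of how each state’s test is scaled.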
Demonstrating Years of Schooling Missed
Researchers found learning loss varied dramatically among districts in the same state, and losses were greater in higher-poverty districts.
The average U.S. public school student in third through eighth grade lost the equivalent of half a year of learning in math and a quarter of a year in reading between spring 2019 and spring 2022, according to their findings. Students in most states fell behind, and in some states, scores dropped to levels last seen decades ago.
“[For] five states, their mean score in eighth grade math was below what it was in 1990,” Kane said, referring to Oregon, North Dakota, Maine, Iowa and Montana. “And that’s a shame; because between 1990 and 2019, every state had increased, some by a large amount.”
Districts that stayed remote longer suffered greater losses, according to the project’s researchers, but they added that school closures weren’t the sole factor. The scorecard also demonstrates widening disparities affecting students of color and students from less affluent communities.
Kane said researchers present the findings as estimates because they have to make several statistical assumptions to convert percentages into a single estimate of mean achievement and put them on a comparable scale. Their findings aren’t based on individual test scores, but because more than 95% of students typically take the state tests, the data is still highly complete.
Low Academic Achievement and Student Outcomes
Researchers for years have studied test scores to understand long-term impacts on students’ futures, including income, graduation, college enrollment, teen motherhood and arrest rates, according to Kane.
“Yeah, these are just test scores,” he said. “But they’ve also been … a leading social indicator for what was going to be happening to the kids born in those states.”
Districts will need to do extensive tutoring, doubled-up classes, summer school and extended school years to make up a fraction of the learning they’ve lost during COVID, Kane said.
But districts made their recovery plans before this data was released and before educators knew the magnitude of the problem. Kane said he hasn’t seen any plans yet that will actually match the need.
Kane contends that school districts need to devote more dollars to academic recovery than they currently are. The federal government required districts to spend only 20% of their stimulus allocations on academic recovery.
In many cases, districts need to spend more, Kane said at EWA’s 75th National Seminar in July 2022.
Reporting on Disrupted Learning and Student Recovery
Eder Campuzano, statewide K-12 education reporter for the Star Tribune in Minneapolis, wrote about the scorecard the first week it came out. He said it allowed him to compare major urban districts in a way past data couldn’t.
“If you’re constantly checking in with principals throughout the school year, you know what’s coming in terms of student proficiency,” Campuzano said. However, now there’s “quantifiable proof.”
Campuzano sees the scorecard as a measuring stick for what schools have done since they were first given federal-relief money for COVID academic recovery.
In a lot of cases, their plans didn’t work. He said this is a chance to go back to districts and find out how they plan to do things differently, such as getting Black and Latino students on track.
Campuzano noted the importance of reporters interviewing people, such as students and school staff, who are directly affected by these decisions.
“The idea [is] that we speak truth to power,” he said. “[In] doing so, we should always be talking to people who have the least amount of it.”
Ready to embark on your coverage? Here are some things to remember:
Make sense of the numbers.
Don’t just identify key data points from your local districts. Put the data in context. Look at what has changed, why that matters and who is disproportionately impacted.
Focus on poorer districts and the students most affected by these achievement gaps.
Find your own data too.
Data is a snapshot. Do additional research outside the scorecard to ensure your coverage is as thorough and accurate as possible.
Kane said the scorecard research team pulled additional data for reporter Christopher Huffaker’s local coverage in The Boston Globe, adding they’d work with any journalist who asks.
Remember to research other factors that may have contributed to missed learning as well, such as COVID death rates and internet access, and check how your districts are approaching “full recovery,” not just academic recovery.
Look for future data and stories.
By the end of February, Kane said, researchers plan to release results for an additional 12 states, along with a research brief investigating the role of other factors — such as the share of parents working from home, broadband access, and COVID deaths and hospitalizations — in achievement loss.
Check back with the scorecard and your districts for follow-up coverage.
Talk to people in the classroom.
Don’t forget to include educator and student voices.
“The most important question that we as education reporters can ask any of our sources, especially [those] in the classroom, is, ‘Do you feel like you’re being set up for success?’” Campuzano said.
Build trust. Don’t make assessment coverage a “one and done.”
If reporters wait until test scores come around to establish a connection with a school district, Campuzano explained, educators aren’t likely to trust them.
“[People] in the buildings that we cover, right now, especially more than in most other times, are under incredible pressure to deliver on something that’s a little bit intangible,” he said.
Reporters can look for profiles and shorter stories to cover year-round to show their commitment.
“Covering education … is like ongoing source/relationship development to get people to trust you,” he said. “Because at the end of the day, what we’re here for is to make sure that people are doing right by kids.”