What to Do When Your State Test Melts Down

It’s hard to avoid writing about tests and test scores as an education reporter. Too often, though, the story gets done in a rush — with scores about to be released or already out in the world.

Speaking at the Education Writers Association conference in May, Marianne Perie, the director of the Center for Assessment and Accountability at the University of Kansas, urged reporters to take a step back.

Perie, who called herself the “expert on everything that can go wrong” with online testing, suggested key questions for reporters to ask if they find themselves trying to explain a testing meltdown, like Tennessee reporters had to this year. (The issues included login failures, screens going blank, and difficulty submitting completed exams.)

But first, Perie offered general tips for understanding the state and national assessments that education journalists write about.

A key takeaway? It’s really hard to write about a test if you don’t understand how it was created, how students take it, and what it’s trying to measure. So find someone who knows your assessment well and quiz them on background until you really get it.

Here are a few other tips she offered:

Dig into “item maps.” These are tools to explain student performance in a concrete way by pegging sample questions on an exam to achievement levels. Take the item map for fourth-grade math on the National Assessment of Educational Progress (NAEP). One question asks test-takers to identify the sets of shapes that make up all the faces of a pyramid. The map explains that this question was answered correctly by most students earning a 250 or higher, just clearing the proficiency bar. In addition, it indicates the proportion of test-takers who got the right answer.

Ask questions about any released questions. Making public a question that has already been used on a test is expensive, Perie said. So ask where sample or example questions came from, and check whether they have accompanying data about how students did. No data likely means the questions were rejected before making it onto a test, so they may not be good examples to share with readers.

“Sometimes the public items are the worst items,” Perie said.

Don’t forget that “cut scores” can be political. Panels of teachers are often convened to decide how questions correspond to a state’s proficiency bar. (Reporters should ask to be in the room when teachers are getting trained, Perie said.) And while those teachers recommend cut scores, policymakers make the final call. That means there’s potential for interested parties to try to monkey with the cut scores, and reporters need to watch closely.

Those tips are true for any standardized test. But the move toward computer-based testing has introduced a whole new set of challenges for reporters, like explaining whether the computer itself is affecting students’ performance.

Perie noted that early research on computer-based testing focused on how students performed on computers compared with how they did using paper and pencil. Now, research focuses more on how students using different devices compare to one another, and experts are finding that small screens (like those on iPad Minis) can hurt student performance.

Perie suggested asking state officials, test makers, and districts what analyses they have conducted to show that students are not at a disadvantage when using a particular electronic device to take an exam. In addition, she said reporters should ask whether students have experience with the device being used for testing.

Another challenge is explaining testing interruptions.

Questions for Test Vendors

A common reason for an assessment problem is a distributed denial-of-service (DDoS) attack, in which computer servers are deliberately overwhelmed with log-in attempts until they shut down. Perie, who dealt with DDoS attacks on tests she was overseeing at the University of Kansas, said reporters shouldn't call these attacks "hacks," a term that implies someone unauthorized gained access to internal data.

At that point, reporters need to ask the testing vendors: How will you stop a similar attack in the future? Why didn't you plan to stop this one ahead of time? Vendors should have answers, Perie said, since testing companies can contract with outside firms to protect their servers.

Some questions will be harder to answer. Knowing how many students were affected by an interruption, for example, can be tricky both because some states allow schools to test any time within a given window, and because once the servers have gone down, log-in attempts won’t be recorded.

“How many people knocked on your door when you weren’t home? You don’t know,” Perie said. “It’s basically the same thing.”

But testing companies should be able to explain certain aspects of the interruption, like how many students were kicked off tests they had started, how long the system was down, and why an attack happened.

“This year, I know reporters are frustrated because they’re not getting good answers,” Perie said.
