Interviews conducted by EWA Public Editor Emily Richmond
In the six years since reporter Heather Vogell joined the Atlanta Journal-Constitution, she has written about suspicious test scores, Georgia’s death penalty and a variety of other public-policy issues. But it’s the newspaper’s coverage of a massive cheating scandal in the Atlanta Public Schools that’s grabbing headlines now. The 15-year newsroom veteran spoke with EWA about the challenges of tackling such a complex and important story.
Heather Vogell
1. Were you surprised by the breadth and scope of the cheating scandal?
The level of organization behind the cheating in some schools, and the amount that was directed by principals and tacitly permitted by area superintendents, was surprising to a lot of people.
What made this [Atlanta cheating] situation unique was the thoroughness of the state investigation. It documented specific individuals. We suddenly had characters in a drama who were responsible for this, with evidence as to what they did, how they did it and why they did it.
Our paper began raising concerns about the validity of the test scores several years ago; the district originally denied there was a problem. As time went on and the evidence mounted, we decided to investigate further ourselves. We found the district took a long time to provide requested records. I had to argue that certain documents were not exempt, as the district tried to claim. We found out later that they were actually hiding records from us. At the time, I thought it might just be a matter of bad recordkeeping. I was surprised it was so intentional.
2. Where did you start with the data mining?
Working with John Perry, our paper’s data analyst, our first step was to review results for students statewide who failed the original high-stakes test, underwent remediation over the summer and then took a retest. We knew that, historically, the failure rate on the retest was around 50 percent. But here were schools where every student passed the second time around, and with very high scores.
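For readers who want a concrete sense of what such a screen might look like, here is a minimal sketch in Python that flags schools whose retest pass rates sit implausibly far above a roughly 50 percent baseline. The file name, column names and the use of a binomial test are illustrative assumptions, not the Journal-Constitution’s actual methodology.

```python
# Hypothetical retest screen: flag schools whose retest pass rates are
# far above the statewide historical baseline described in the interview.
# The input file and column names are assumptions for illustration.
import pandas as pd
from scipy.stats import binomtest

BASELINE_PASS_RATE = 0.50  # historical retest pass rate, per the interview

# Expected columns: school, n_retested, n_passed (one row per school)
df = pd.read_csv("retest_results.csv")

def is_implausible(row) -> bool:
    """True if a school's retest pass rate is significantly above baseline."""
    result = binomtest(
        int(row["n_passed"]),
        int(row["n_retested"]),
        BASELINE_PASS_RATE,
        alternative="greater",
    )
    return result.pvalue < 0.001  # deliberately conservative cutoff

df["flagged"] = df.apply(is_implausible, axis=1)
print(df[df["flagged"]].sort_values("n_retested", ascending=False))
```

A school where every retested student passed, as described above, would produce a vanishingly small p-value under this test once the number of retested students is more than a handful, which is why such results stand out so starkly against a 50 percent baseline.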
I learned a lot from working with John about how to look at test scores critically. I also took the EWA’s statistics boot camp a few years ago and that was very helpful as a precursor to this project. I would recommend it highly to anyone who has to deal with numbers.
3. What has the public’s response been to the stories?
After the first story ran, people started calling me and telling me things that were happening in their schools. The problem was they feared for their jobs and couldn’t go on the record. It was frustrating; I had no way to get at the story. That’s when I filed an open-records request for the internal investigations the district had been conducting, and we went back and examined all of the testing data for 2009. From there, the state got involved and eventually conducted an external investigation.
Most people have thanked us for writing about this, including school employees. There are a lot of people out there who suspected what was going on, didn’t want to participate and were miserable in their work environments.
4. This was obviously a story with significant potential for controversy and many moving parts. What steps did you take beyond the normal fact-checking?
When you do statistical analysis, you’re often out on a limb. We found experts to review our data, which gave us a lot more confidence in our conclusions. Investigative reporters don’t always share the inner workings of their stories before they’re published. In this instance, we did share our findings with both the district and the state to give them plenty of time to respond and to challenge our reporting. We were completely transparent, which I think gave us more credibility.
5. What suggestions can you share with education writers and editors who might be considering following your lead?
We treated this as a rolling investigation; we were producing stories regularly. This became easier when my colleague, investigative reporter Alan Judd, jumped in.
When you write about an issue frequently, you build momentum and that gets you more sources. The overarching story was shaped by the responses to our coverage. That made our reporting more nuanced than it might have been had we stepped back, dropped a big story and then disappeared for a few months.
The big picture is that years after these test scores became monumentally important for public schools, there is still no rigorous, consistent national oversight of their validity; it varies widely from state to state. I’m not suggesting there are going to be major problems in every district. But people are talking about using student test scores to make more and more decisions, including in teacher evaluations. Reporters need to look at how their states are policing that data. This is a big story everywhere.