
U.S. 8th Graders’ Scores Stagnate on National Civics, History, Geography Tests

American eighth graders continue to demonstrate lackluster knowledge and skills when asked basic questions about U.S. history, geography, and civics, with between 18 and 27 percent of students scoring proficient or higher, new data show.

The results from the National Assessment of Educational Progress, known informally as The Nation’s Report Card, are based on a representative sample of more than 29,000 U.S. eighth graders tested last year across the three subjects. (In history, for instance, more than 11,200 students were tested.)

Since 1998, scores have been inching upward in several topic areas, particularly for minority and low-income students. But overall, there have been no meaningful gains in student performance since the combined history, geography, and civics test was last given in 2010:

Source: National Assessment of Educational Progress.

While scores overall have stagnated, student groups with historically low performance continue to inch upward, and at a faster rate than their white peers, with particularly rapid improvement among Hispanic students.

Take a look at the NAEP sample tests to get an idea of what students are being asked to do – it’s a lot more than just multiple-choice questions. They must also interpret graphs and data, and demonstrate the kind of critical-thinking skills that the Common Core State Standards are intended to promote. Here’s one example shared with reporters during a press call Tuesday:

Source: National Assessment of Educational Progress.

I do want to offer a caveat: NAEP is just one indicator of student knowledge and skills, and it’s not designed to evaluate the merits of a particular educational program or intervention. Correlation is not causation. Stephen Sawchuk of Education Week uses the term “MisNAEPery” to describe the many instances of education advocates and policymakers using data from the assessments to grind their own particular axes.

I recommend Jessica Brown’s thoughtful overview of the full report’s findings over at Education Week. For more on what NAEP does – and doesn’t – measure, take a look at a fact sheet on the civics assessment created by the Center for Information and Research on Civic Learning & Engagement (CIRCLE), based at Tufts University.

That said, some experts note that one powerful use of NAEP is to offer an independent barometer to compare with results on state tests, especially when it comes to reading and mathematics. In other words, if state test results show 90 percent of students are proficient in a given subject but NAEP says it’s 25 percent, that may be a red flag that the state’s bar for “proficient” is not set very high. (The NAEP exam for history, civics, and geography does not report out state-by-state results, however.)

In any case, there are some interesting long-term trends across multiple NAEP subject areas. Peggy Carr, the acting commissioner of the National Center for Education Statistics (which administers NAEP), noted during Tuesday’s press call that U.S. students have made steady gains on the reading assessment, particularly among those in the lowest-performing quartile. She theorized that those gains in literacy might have helped them when it came time to take the latest history and civics tests.

If students are more confident readers, the history and civics questions might be less daunting “and they might be able to access these materials better than they were in past years,” Carr said. (Education historian Diane Ravitch made a similar argument when the high school NAEP history results – also lackluster – were released in 2011, saying gains by lower achievers potentially reflected improved literacy rather than a deeper grasp of the content.)

And if you think that simply adding more class hours would solve the problem, consider this: “There’s no association between how much time is spent on a subject in a given year and how well students perform,” Carr said.

Given that NAEP scores in history, civics, and geography have been sluggish since 1998, today’s report is no surprise. A 2011 NPR story contended that American students have always been weak in history, citing a laundry list of banner headlines lamenting poor scores dating all the way back to 1955. And despite that, the United States continues to lead the world in key areas, Ravitch said in the NPR interview.

“We have to temper our alarm,” Ravitch told NPR. “And realize we’re not a very historically minded country.”

To be sure, ensuring an engaged and informed citizenry was one of the original arguments made by the Founding Fathers who advocated for free public schools. There’s been a groundswell in recent years to make civics education a national priority, with several states increasing the curriculum requirements for K-12 students (the Wall Street Journal put together a handy chart). Robert Pondiscio, now with the conservative-leaning Thomas B. Fordham Institute, argued in 2013 that passing the U.S. citizenship test should be a requirement for high school graduation. In January, Arizona became the first state to take that step.

It’s worth remembering that the NAEP is what’s known as a low-stakes assessment. No teachers’ or principals’ jobs are on the line if kids don’t do well, and the students themselves often realize the outcomes have no bearing on their own academic trajectories.

Officials at the National Assessment Governing Board, which oversees NAEP, have said they’re taking a look at the impact of student motivation – or lack thereof – on test scores. I reported on one study, by a Boston College researcher, which found that offering high school students relatively modest incentives to try harder on a reading test similar to NAEP produced meaningful gains. But raising the stakes for NAEP could hurt its usefulness as a long-term barometer of student knowledge.

As Jack Buckley, the former NCES commissioner (and now with the College Board) told me in 2014: “How much pressure is going to be put on the data, and for what purpose? You need just enough pressure not to distort the outcome – that’s the ideal measure.”
