With the results of a global exam showing flat scores for American 15-year-olds in reading, math, and science, education journalists were busy this week parsing the data, providing context, and explaining why comparisons among countries’ results can be a tricky business.
The U.S. saw its international rankings climb in all three subjects tested because scores slipped in some other countries on the Program for International Student Assessment (PISA) exam, the results of which were published Tuesday.
PISA was launched in 2000 by the Organization for Economic Cooperation and Development (OECD) to provide a comparable measure of education systems around the globe. Students in about 80 countries and jurisdictions — largely industrialized — are tested on critical-thinking and problem-solving skills, as well as proficiency in core subjects. The PISA results are often used by education advocates to point to U.S. students’ relative readiness to compete in the global workforce.
U.S. 15-year-olds earned an average reading score of 505, a math score of 478, and a science score of 502 on the PISA scale of zero to 1,000. The U.S. ranked eighth in reading and 11th in science — above the OECD average. But the U.S. was 30th in the world in math, below the OECD countries’ average. The test was administered in 2018.
China — based on scores for students in four provinces — topped the rankings in all three subjects, with scores of 555 in reading, 591 in math, and 590 in science. Education researcher Tom Loveless used a Twitter thread to point to concerns about China’s PISA performance, including the practice of using high-stakes entrance exams to determine which students advance into the academic high school track and are ultimately among the testing pool. (For some fascinating insights into China’s public education system, listen to my EWA Radio conversation with author Lenora Chu, who enrolled her Chinese-American son in a local elementary school when she relocated to Shanghai.)
In addition to China, the top-scoring countries included Singapore, Estonia, Canada, and Finland.
Among the key findings for the U.S.: The gap between America’s top achievers and struggling students continues to widen, said Peggy Carr of the National Center for Education Statistics, which coordinates U.S. participation in PISA. She characterized the results as a troubling red flag.
While PISA tests different kinds of knowledge and skills than the National Assessment of Educational Progress, sometimes referred to as the nation’s report card, the gaps between the top and bottom quartiles are similar, Carr said. (Read more about NAEP and the disappointing results on the most recent tests of fourth and eighth graders here.)
U.S. Secretary of Education Betsy DeVos used the new PISA results as an opportunity to call — once again — for an increase in school choice options for students.
“The gap between the highest and lowest performing students is widening, despite $1 trillion in federal spending over 40 years designated specifically to help close it,” she said.
Others questioned whether the PISA scores point to a need to overhaul the nation’s teacher preparation programs or approaches to curriculum and instruction, as Dana Goldstein explored for The New York Times.
The National Center on Education and the Economy, a Washington, D.C.-based think tank that has long sought to distill lessons from top-performing countries, called for a “reset” of America’s education system. The group focused on what it sees as the “building blocks” common to higher-achieving countries, including access to high-quality early childhood education, a highly trained and supported teacher workforce, and support systems for students who need them.
Here are a few takeaways and things to keep in mind when considering the latest results:
The U.S. isn’t Finland. We can look to higher-achieving countries for insights into how they approach things like building a high-quality teacher workforce, but adopting one country’s instructional approach (Singapore Math, anyone?) or policy preferences may not move the needle very far, experts say. Reporter Matt Barnum of Chalkbeat offered a cautionary note when it comes to extrapolating from PISA scores: “[A]cademics have warned that international test scores can’t — or at least shouldn’t — be used to make policy prescriptions. The reason is that it’s extremely difficult to say what policies explain better or worse scores between different countries. And even if we knew which policies worked in one country, it’s not clear they could be exported elsewhere.”
Who cares about PISA? The Washington Post’s Moriah Balingit and Andrew Van Dam focused on student motivation as a factor in how hard students tried on their PISA tests and how that may have factored into a country’s scores:
“The exam is designed to accurately gauge the abilities of students from country to country because it is low-stakes, meaning more affluent students do not have an incentive to pay for special test preparation. But those administering the exams to teenagers have encountered serious motivation issues. Economists have found mounting evidence that the gap in scores between countries reflects a gap in effort as much as it does a gap in achievement. By both measures, the United States lags behind.”
Although only a handful of studies have examined motivation as a factor in PISA performance, student knowledge and effort both influence test scores and untangling them isn’t easy, Loveless told EWA. And how big a role motivation might play in student outcomes likely varies among countries, he said. He’s heard from educators in other countries that their students try hard on tests as a matter of national pride regardless of whether they receive an individual score. By comparison, “You can tell the average American 15-year-old that their score is going to add up and represent the United States, and I don’t think that has the same effect as in other parts of the world,” Loveless said. “I think in the U.S. that would be on the low side of important.”
While the test provides a snapshot of academic performance rather than a definitive statement, PISA does offer some interesting insights into how students apply what they learn. As reporter Tawnell Hobbs pointed out for The Wall Street Journal, only 13.5 percent of American students could distinguish between fact and opinion in reading selections on the test. (Here’s a sample question.) That’s better than the OECD average of about 10 percent.
“In the world of … fake news, it’s very important that students can actually navigate ambiguity and resolve conflicting dialogue,” Andreas Schleicher, OECD’s director for education and skills, told The Journal.
Knowing and doing are two different things. PISA doesn’t just measure students’ comprehension of core subjects — it tests how well they’re able to apply those skills to real-world situations. When it came to reading, U.S. students on average scored at level two (out of six). That’s comparable to “below basic” on NAEP, Carr of the NCES told reporters. Sarah D. Sparks of Education Week explained what that “level two” score really means:
“[T]he average U.S. 15-year-old could understand the main idea and draw basic inferences in a moderately long text, but would struggle to understand and compare texts that included multiple features or competing ideas, as is required in the next proficiency level. U.S. students could more readily reflect on texts given to them than locate information or understand and infer the meaning of what they read—results that mirror those in the latest Nation’s Report Card for reading, released last month.”
Sparks’ piece is worth a close read, including for the interesting findings on strong performance by immigrant students in the U.S., who largely outscored their native-born American classmates.
Unlike in some prior rounds of PISA, no individual U.S. states opted this time to test separate samples of students in order to produce their own comparable results. In 2012, Connecticut, Florida, and Massachusetts did so. In 2015, Massachusetts and North Carolina participated, as did Puerto Rico. Massachusetts scored above the average of participating countries in 2015 in both reading and math, and on par in science.
For another perspective on how American students differ from their peers abroad, take a look at a project supported by an EWA Reporting Fellowship: The Cleveland Plain Dealer’s Patrick O’Donnell and Olivera Perkins compared career education and apprenticeships both in Ohio and abroad. At a 2015 EWA seminar, we looked at global comparisons in education — what can be learned as well as the inherent limitations. We also looked at some of the applications of international test results. (You can watch a video replay of that discussion here.)
The U.S. is hardly alone in its PISA-heavy headlines, which could be found around the globe in Brazil, Australia, and beyond. Singapore bemoaned slipping to second place in the world rankings, while Estonia celebrated leading the European nations in all three subjects. Ireland was thrilled to be among the world’s top readers. At the same time, longtime international education darling Finland was found to have one of the largest gender gaps among the industrialized nations participating in the exam — girls read about two grade levels ahead of boys — raising questions (and spurring some Twitter snark).