Are 4th Graders Ready For Online Writing Tests?

Are fourth graders computer-savvy enough to have their writing skills measured in an online assessment? A new federal study suggests they are, although it’s not clear whether old-fashioned paper-and-pencil exams might still yield useful results.

The National Assessment of Educational Progress, sometimes referred to as “The Nation’s Report Card,” is administered every two years to a representative sampling of students in grades 4, 8, and 12. Because each state uses its own mix of assessments, NAEP (along with the SAT and ACT college entrance exams) is one of the few ways of making national comparisons on student performance. NAEP is expected to become a fully online assessment by 2017.

In preparation for that shift, the National Center for Education Statistics (NCES) tested an early version of an online writing assessment on 60 fourth graders to gauge their comfort level with the platform and design. After identifying common issues students struggled with, NCES redesigned the pilot assessment. Among the changes: swapping out a drop-down menu for icons, and giving students more frequent – and shorter – prompts. The revised assessment was then administered to 13,000 fourth graders.

Students were asked to respond to prompts that required one of three modes of writing:

  • Persuade: Write a letter to your principal, giving reasons and examples why a particular school mascot should be chosen.
  • Explain: Write in a way that will help the reader understand what lunchtime is like during the school day.
  • Convey: While you were asleep, you were somehow transported to a sidewalk underneath the Eiffel Tower. Write what happens when you wake up there.

Because the sample was not representative, NCES has cautioned against drawing conclusions about the online assessment or the students’ performance.

With that caveat, here are a few of the findings from the pilot study:

  • On a scale of 1 to 6, just over 60 percent of students scored a 3 or higher on the assessment. NAEP officials say that means “the majority of students wrote enough to be assessed, included ideas that were mostly on topic and used simple organizational strategies in most of their writing.”
  • Students scored higher when they had more time – 30 minutes vs. 20 minutes – to construct their responses.
  • Nearly all students – 92 percent – said they had previously taken some form of an online assessment.

While this may only have been a pilot study, it’s clear that many of the students who took part struggled to respond to these prompts. When students were given 30 minutes to work on the assignment, 26 percent were rated a 2 (marginal) while 14 percent were at 1 (little or no skill). At the other end of the spectrum, just 10 percent of students were rated a 5 for “competent,” while 4 percent earned the highest possible score of 6 for “effective.”

The pilot assessment was scored by human beings, not computers, said Cornelia Orr, executive director of the National Assessment Governing Board, which oversees NAEP. The switch to an online assessment gives NAGB the opportunity to develop a more engaging testing platform that holds students’ attention, Orr said during a webinar to discuss the findings.

“When students are interested in what they’re writing about, they’re better able to sustain their level of effort, and they perform better,” Orr said.

One downside to the NCES pilot study: It doesn’t compare student answers with similar questions answered in a traditional written exam setting. From Education Week’s overview:

Scott Marion, the associate director of the Dover, N.H.-based National Center for the Improvement of Educational Assessment, said he’d like to see a study in which students are each given two writing prompts, responding to one on paper and one on computer. The response written on paper would then be transcribed onto the computer for true comparability, since, as he noted, typed papers in general tend to receive higher scores than handwritten ones.

There is another concern: how well classroom instruction will line up with the new expectations of NAEP’s online writing assessment. Certainly a central goal of the Common Core is deeper knowledge, with students able to draw conclusions and craft analysis rather than simply memorize rote facts.

But teaching students to write is a complex animal all its own, said Peg Tyre, author of “The Writing Revolution,” a 2012 story for The Atlantic that detailed the debate in education circles over methodology and expectations. The foundation of strong writing is knowing how to “work with language and build strong sentences into paragraphs,” said Tyre, who is the director of strategy at the Edwin Gould Foundation. For those skills to become rote, writing has to be a regular part of the student’s experience. Tyre said she’s not seeing that happen in most of the public school classrooms she visits.

“Teachers aren’t taught how to teach writing,” Tyre told EWA. “While it’s not rocket science, it does require a dedicated approach and a dedicated amount of time.”

To be sure, NAEP isn’t alone in abandoning paper exams. Two testing consortia that have designed assessments aligned to the new Common Core State Standards – Smarter Balanced and the Partnership for Assessment of Readiness for College and Careers – use online assessments, which are being rolled out through 2015. (In some places, it’s been a bumpy start.) The GED high school equivalency exam has also gone digital.

Jacqueline King, director of higher education policy for Smarter Balanced, said the fourth-grade assessment includes components to measure students’ writing abilities. It will be left up to individual states to decide how to score the exams, within some parameters established by the consortium to ensure “accuracy and comparability of student scores,” King said.

The field test of the assessment was not machine-scored, King told EWA. However, the consortium is investigating whether some elements of the student responses, such as spelling and syntax, could be graded using a software program. The longer answers – where students are asked to write informative, narrative, or opinion pieces – take 90 minutes to complete, which includes time for students to read materials and craft their responses, King said. (That’s three times as long as students were given in the NAEP pilot study assessment.)

While there are digital programs designed to measure student writing, “We have determined that those programs are not yet able to adequately score all the elements of student writing that we are asking our human scorers to evaluate,” King said. “We will continue to watch that technology as it evolves.”

Additional EWA resources on NAEP, the Common Core, testing and assessments: 

For more on the Common Core tests, check out the “Angles on Assessment” session from EWA’s 67th National Seminar, held at Vanderbilt University in Nashville: we have video of the six featured experts, as well as a write-up by guest blogger Sarah Darville of Chalkbeat NYC.

EWA’s topics pages on the Common Core, as well as standards and testing, are also useful. And EWA’s Mikhail Zinshteyn has a rundown of the most recent NAEP results (reading and math for 12th graders).

Webinars:

A Look Inside the New GED

Global View: Questions to Ask about PISA 2012

Blog posts:

Atlanta Cheating Scandal: New Yorker Magazine Gets Personal

Government Report Suggests Racial Achievement Gap Narrowing

Common Core: Should States Slow Down on Implementing New Assessments?

Nation’s Report Card: When Test Scores Don’t Count, Do Students Really Try?

Nation’s Report Card: Urban Districts Making Long-Range Gains
