Back in 2006, there was an apparently minor glitch in administering the Programme for International Student Assessment, a system of student tests given by the Organisation for Economic Co-operation and Development. Some U.S. reading-test booklets were misprinted and included confusing directions.
The result: The OECD didn’t even report PISA reading scores for the United States. Officials reasoned that the scores wouldn’t be valid and that they shouldn’t publish unreliable results.
Leslie Rutkowski, an assistant professor at the Indiana University School of Education and an expert on statistical modeling, brought up the PISA experience in connection with the problems Indiana experienced with its 2013 ISTEP-Plus exam. Testing was plagued by disruptions traced to the computer servers of CTB/McGraw-Hill, Indiana’s test contractor.
“I feel like that is such a telling anecdote,” Rutkowski said. “PISA is low-stakes: The test doesn’t have consequences for students or for schools. And for something as small as a printing error to invalidate the results … that was something that really resonated with me.”
ISTEP, by contrast, is a high-stakes test – all the more reason there needs to be full confidence in its results. “We’re making decisions about whether a child graduates, whether a teacher keeps her job, whether a school stays open” on the basis of test scores, Rutkowski said. “If the data is questionable in any way, I don’t think we can use it.”
Indiana Superintendent of Public Instruction Glenda Ritz appears to agree. The state Department of Education contracted with the New Hampshire-based National Center for the Improvement of Educational Assessment to determine if ISTEP results are valid.