Indiana’s ILEARN scores have been made public, and the freakout is underway. I guess we should be grateful. A decade ago, business leaders and newspaper editorial writers might have pointed to the scores as evidence that schools were broken. Now the consensus seems to be that the test is broken.
Here’s another possibility. Maybe the problem isn’t with the test. Maybe the problem is what we do with it. Maybe it’s the high stakes, not the testing, that we should reject.
Results for the new ILEARN assessment were released today during a meeting of the State Board of Education. As expected, the rate at which students were found to be proficient was considerably lower than the passing rate on ISTEP, Indiana’s previous test.
Here we go again. Indiana has a new standardized test, the results sound bad, and educators are calling on the state to hold off on using the new scores to impose consequences on schools or teachers.
Today, Gov. Eric Holcomb joined the call for a “pause” in accountability based on the tests. House and Senate leaders concurred, which means it’s almost certain to happen. Results from the new assessment, called ILEARN, are scheduled to be made public at the Sept. 4 State Board of Education meeting.
Most Indiana schools earn A-to-F grades on a formula that gives equal weight to performance and growth on standardized tests. But schools in their first three years of operation – most of which are new charter schools and Indianapolis or Gary “innovation network” schools – can have their grades calculated on growth only, with no consideration of performance. Those schools have an advantage.
As Dylan Peers McCoy of Chalkbeat Indiana pointed out, it means you can’t use the grades to compare schools in a district like IPS. “Of the 11 out of 70 Indianapolis Public Schools campuses that received A marks from the state,” she wrote, “eight were graded based on growth alone.”
So why not grade all schools on growth only, not performance? It seems like that would make a lot of sense. In any given year, schools may not have a lot of control over where their students start out in their math and reading performance. What matters is, do schools help students grow?
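To make the two grading paths concrete, here is a minimal sketch of how the rule described above plays out. The equal weighting and the growth-only option for schools in their first three years come from the state's formula as described; the function name, the 0-100 scoring scale, and the letter-grade cutoffs are my own illustration, not the actual state rule.

```python
def school_grade(performance: float, growth: float, years_open: int) -> str:
    """Illustrative A-F grade.

    Equal weight to performance and growth, except schools in their
    first three years may be graded on growth alone. Scores are assumed
    to be on a 0-100 scale; the letter cutoffs are hypothetical.
    """
    if years_open <= 3:
        score = growth  # growth-only option for new schools
    else:
        score = 0.5 * performance + 0.5 * growth
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if score >= cutoff:
            return letter
    return "F"

# A new school with weak performance but strong growth can still earn an A,
# while an identical established school is pulled down by its performance score.
print(school_grade(performance=55, growth=92, years_open=2))   # new school: A
print(school_grade(performance=55, growth=92, years_open=10))  # established: C
```

Under these assumptions the two schools have identical students and identical growth, yet receive different grades — which is exactly why the grades can't be used to compare schools across a district.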
It’s a lousy week to be an education reporter in Indiana. ISTEP+ test results were released Wednesday by the State Board of Education, so editors are assigning – and readers are expecting – the usual stories. Which schools did best? Which did worst? Which improved, and which didn’t?
Reporters who spend their work lives visiting schools and talking to educators and experts know this is the epitome of a non-news story. They know that years of experience and research tell us that affluent schools will have higher test scores than schools serving mostly poor students. But the stories have to be written.
It’s no surprise that low-poverty schools in the suburbs have the highest passing rates in the Indianapolis metropolitan area. They do every year. And it’s disturbing but not really shocking that barely 5 percent of Indianapolis Public Schools 10th-graders passed their tests. Three of their high schools were about to close; the tests had no consequences for the schools or their students.
That’s not to say test scores are meaningless, or that they should be ignored altogether.
It’s been said that Indiana’s ISTEP testing program is a train wreck. It’s also something like a car crash that you pass on the highway. You know you shouldn’t stare, but you can’t avert your eyes.
Scores from the spring 2017 tests were released Wednesday, and newspapers and digital news sites have already posted stories about how local and state schools fared. Admit it – we’re going to read them, even though we know in advance which schools will do well and which schools won’t.
I’ll leave the scorekeeping and analysis to others.
People of color have a different view of their community schools than do white people. That’s an important take-away from the 2015 PDK/Gallup Poll, released Sunday.
For example, asked to rate the schools in their own community, 51 percent of all poll respondents gave local schools an A or B. But only 23 percent of African-American parents and 31 percent of Hispanic parents gave their local schools an A or B.
Maybe that’s to be expected: Blacks and Hispanics are more likely than whites to live in economically struggling communities with under-resourced schools. But for years, the PDK/Gallup Poll has highlighted the fact that a majority of parents think local schools deserve an A or B – the message being that most parents are satisfied with local public schools. It turns out that’s only partly true.
And African-Americans differ from whites on other topics and issues as well. They are:
- More likely to think test scores are an important measure of school effectiveness.
- Less sympathetic to the “opt-out” movement and less likely to exempt their own children from testing.
- More supportive of having schools teach the Common Core State Standards.
The PDK/Gallup Poll tends to produce similar headlines every year: Americans rate their local schools highly, they favor charter schools and choice but are skeptical of testing and accountability schemes, etc. But this year’s poll added a web-based component that let the pollsters break down some results by race and ethnicity and political party loyalty. That gives a better picture of the public’s attitudes.
Regarding Indiana’s selection of British-owned testing giant Pearson to develop and run the ISTEP+ exam: The timing wasn’t the best, was it?
Last week the Indiana Department of Administration chose Pearson as the contractor for ISTEP+ math and English tests. The two-year contract is worth $38.1 million.
Two days later, a New Jersey blogger reported that Pearson was monitoring social media use by students taking tests it created for the PARCC consortium of states.
A test-security contractor said a girl in New Jersey had posted confidential test information on Twitter. Pearson apparently tracked down who she was and told the state education department, which informed local school officials.
The local superintendent vented about the overreach in what she thought was a private email. But it found its way to the inbox of blogger Bob Braun, who broke the story of Pearson snooping on students.
Pearson insisted the monitoring was necessary for test validity, but a lot of people weren’t buying it.
What if Indiana hadn’t dumped Common Core and fled the PARCC consortium? Would we still be having this brouhaha over how long our students are sitting for standardized tests? Yeah, probably.
Many of us were taken aback when we learned last week that the time it takes to complete the ISTEP+ exam has more than doubled since last year. But longer tests seem to go hand-in-hand with the more rigorous “college and career ready” standards that Indiana and other states are adopting.
Anne Hyslop, who follows testing and accountability issues as a senior policy analyst with Bellwether Education Partners, believes tests are getting longer because they include performance tasks and writing sections that attempt to better reflect whether students are learning the standards.
“In other words, if you want a high-quality test, you need high-quality items, and those may take longer to complete than a multiple choice question,” she said.
Back when Indiana had adopted Common Core and its teachers were preparing to implement the standards, the state was part of PARCC, a consortium of states developing Common Core-aligned tests. And the PARCC exams that will be given this spring aren’t much shorter than the new Indiana ISTEP+.
When the word came out that ISTEP+ was more than doubling in length, some parents and teachers were outraged. A pediatrician told the State Board of Education last week that forcing young children to sit for such lengthy tests amounted to child abuse.
Back in 2006, there was an apparently minor glitch in administering the Programme for International Student Assessment, a system of student tests given by the Organisation for Economic Co-operation and Development. Some U.S. reading-test booklets were misprinted and included confusing directions.
The result: The OECD didn’t even report PISA reading scores for the United States. Officials reasoned the scores wouldn’t be valid and they shouldn’t put out bad results.
Leslie Rutkowski, an assistant professor at the Indiana University School of Education and an expert on statistical modeling, brought up the PISA experience in connection with the problems Indiana experienced with its 2013 ISTEP+ exam. Testing was plagued by disruptions traced to the computer servers of CTB/McGraw-Hill, Indiana’s test contractor.
“I feel like that is such a telling anecdote,” Rutkowski said. “PISA is low-stakes: The test doesn’t have consequences for students or for schools. And for something as small as a printing error to invalidate the results … that was something that really resonated with me.”
ISTEP, by contrast, is a high-stakes test – all the more reason there needs to be full confidence in its results. “We’re making decisions about whether a child graduates, whether a teacher keeps her job, whether a school stays open” on the basis of test scores, Rutkowski said. “If the data is questionable in any way, I don’t think we can use it.”
Indiana Superintendent of Public Instruction Glenda Ritz appears to agree. The state Department of Education contracted with the New Hampshire-based National Center for the Improvement of Educational Assessment to determine if ISTEP results are valid.