The stakes are the problem

Indiana’s ILEARN scores have been made public, and the freakout is underway. I guess we should be grateful. A decade ago, business leaders and newspaper editorial writers might have pointed to the scores as evidence that schools were broken. Now the consensus seems to be that the test is broken.

Here’s another possibility. Maybe the problem isn’t with the test. Maybe the problem is what we do with it. Maybe it’s the high stakes, not the testing, that we should reject.

Results for the new ILEARN assessment were released today during a meeting of the State Board of Education. As expected, the rate at which students were found to be proficient was considerably lower than the passing rate on ISTEP, Indiana’s previous test.

ILEARN results: déjà vu all over again

Here we go again. Indiana has a new standardized test, the results sound bad, and educators are calling on the state to hold off on imposing consequences on schools or teachers based on the new scores.

Today, Gov. Eric Holcomb joined the call for a “pause” in accountability based on the tests. House and Senate leaders concurred, which means it’s almost certain to happen. Results from the new assessment, called ILEARN, are scheduled to be made public at the Sept. 4 State Board of Education meeting.

Why not grade all schools on growth only?

Most Indiana schools earn A-to-F grades on a formula that gives equal weight to performance and growth on standardized tests. But schools in their first three years of operation – most of which are new charter schools and Indianapolis or Gary “innovation network” schools – can have their grades calculated on growth only, with no consideration of performance. Those schools have an advantage.

As Dylan Peers McCoy of Chalkbeat Indiana pointed out, it means you can’t use the grades to compare schools in a district like IPS. “Of the 11 out of 70 Indianapolis Public Schools campuses that received A marks from the state,” she wrote, “eight were graded based on growth alone.”

So why not grade all schools on growth only, not performance? It seems like that would make a lot of sense. In any given year, schools may not have much control over where their students start out in math and reading. What matters is, do schools help students grow?

ISTEP results are a non-story

It’s a lousy week to be an education reporter in Indiana. ISTEP-Plus test results were released Wednesday by the State Board of Education, so editors are assigning – and readers are expecting – the usual stories. Which schools did best? Which did worst? Which improved, and which didn’t?

Reporters who spend their work lives visiting schools and talking to educators and experts know this is the epitome of a non-news story. They know that years of experience and research tell us that affluent schools will have higher test scores than schools serving mostly poor students. But the stories have to be written.

It’s no surprise that low-poverty schools in the suburbs have the highest passing rates in the Indianapolis metropolitan area. They do every year. And it’s disturbing but not really shocking that barely 5 percent of Indianapolis Public Schools 10th-graders passed their tests. Three of the district’s high schools were about to close, and the tests carried no consequences for the schools or their students.

That’s not to say test scores are meaningless, or that they should be ignored altogether.

Don’t look! It’s ISTEP time

It’s been said that Indiana’s ISTEP testing program is a train wreck. It’s also something like a car crash that you pass on the highway. You know you shouldn’t stare, but you can’t avert your eyes.

Scores from the spring 2017 tests were released Wednesday, and newspapers and digital news sites have already posted stories about how local and state schools fared. Admit it – we’re going to read them, even though we know in advance which schools will do well and which schools won’t.

I’ll leave the scorekeeping and analysis to others, but here are a few observations:

PDK/Gallup Poll: Views differ by race

People of color have a different view of their community schools than do white people. That’s an important takeaway from the 2015 PDK/Gallup Poll, released Sunday.

For example, asked to rate the schools in their own community, 51 percent of poll respondents gave local schools an A or B. But only 23 percent of African-American parents and 31 percent of Hispanic parents gave their local schools an A or B.

Maybe that’s to be expected: Blacks and Hispanics are more likely than whites to live in economically struggling communities with under-resourced schools. But for years, the PDK/Gallup Poll has highlighted the fact that a majority of parents think local schools deserve an A or B – the message being that most parents are satisfied with local public schools. It turns out that’s only partly true.

And African-Americans differ from whites on other topics and issues. They are:

  • More likely to think test scores are an important measure of school effectiveness.
  • Less sympathetic to the “opt-out” movement and less likely to exempt their own children from testing.
  • More supportive of having schools teach the Common Core State Standards.

The PDK/Gallup Poll tends to produce similar headlines every year: Americans rate their local schools highly, they favor charter schools and choice but are skeptical of testing and accountability schemes, etc. But this year’s poll added a web-based component that let the pollsters break down some results by race and ethnicity and political party loyalty. That gives a better picture of the public’s attitudes.

Indiana steps into testing privacy mess

Regarding Indiana’s selection of British-owned testing giant Pearson to develop and run the ISTEP+ exam: The timing wasn’t the best, was it?

Last week the Indiana Department of Administration chose Pearson as the contractor for ISTEP+ math and English tests. The two-year contract is worth $38.1 million.

Two days later, a New Jersey blogger reported that Pearson was monitoring social media use by students taking tests it created for the PARCC consortium of states.

A test-security contractor said a girl in New Jersey had posted confidential test information on Twitter. Pearson apparently tracked down who she was and told the state education department, which informed local school officials.

The local superintendent vented about the overreach in what she thought was a private email. But it found its way to the inbox of blogger Bob Braun, who broke the story of Pearson snooping on students.

Pearson insisted the monitoring was necessary for test validity, but a lot of people weren’t buying it.