Examining data for Indiana’s ‘disaggregated groups’

Here’s a question that arguably deserves more attention from education researchers and policy types: Why are some schools better than others at getting students from low-income families to pass tests?

We hear a lot about high-poverty schools that produce better test scores than you’d expect. We pay a lot of attention to no-excuses charter schools and public schools that focus relentlessly on data. But poor kids are scattered throughout all kinds of schools and school districts, urban, rural and suburban. And judging by test scores, some districts do a better job of helping them learn than others.

The Indiana Department of Education recently posted district-by-district and school-by-school passing rates on the ISTEP+ exam for “disaggregated groups” of students: minorities, students who qualify for free or reduced-price lunches, English language learners and special-needs students.

The data are a carry-over from the No Child Left Behind Act, which required schools to hit targets for the percentage of students in each group who passed standardized tests.

The results vary from school to school – a lot. Looking at students who qualify for free and reduced-price lunches, for example, the proportion who passed both the math and English ISTEP+ exams in 2014 ranged from 85.9 percent to 45.2 percent. The state average was 62.3 percent.

Some of the districts with the lowest passing rates for free-and-reduced lunch students are high-poverty urban districts. But some aren't. Some of the districts with the highest passing rates are low-poverty districts with relatively few poor students. But some aren't. It's a mix, with no obvious pattern.

Indy’s Catholic-to-charter school experiment comes to an end

Any other week, the announcement that the Padua Academy and Andrew Academy charter schools in Indianapolis were giving up their charters would have been big education news. Last week, not so much. The story got buried under reports of alleged ISTEP+ cheating at Flanner House Elementary charter school.

It was certainly a big deal when Padua Academy and Andrew Academy opened as charter schools, however. Formerly Catholic schools, they converted to publicly funded charter schools in 2010, a time when Catholic schools were struggling financially.

Rather than close the schools, the Archdiocese of Indianapolis created an independent board, ADI Charter Schools Inc., which got a charter from the Indianapolis mayor’s office to operate the schools – in the same buildings and with many of the same students, but without religious education. “These two schools are the first in the nation to be chartered by an archdiocese through the establishment of an independent board,” the ADI Charter website says in a history of the schools.

They did well academically for a time but have struggled recently. In spring 2014, only 39.7 percent of Padua students and 31.7 percent of Andrew students passed both the math and English ISTEP+ exams.

Thoughts on the Flanner House cheating allegations

If cheating at Indianapolis Flanner House Elementary School was as bad as reports suggest, the question you have to ask is: Why? Why would a teacher, or teachers, bend the rules to boost their students' ISTEP+ scores when they were likely to get caught?

Were they under that much pressure to raise test scores? Were they worried the school might be shut down? Did they think their students were at an unfair disadvantage in a rigged testing game?

We don’t know for sure what happened at Flanner House. Reports by the Indiana Department of Education and Indianapolis Mayor Greg Ballard’s charter-school office suggest there was cheating in 2013 and 2014. But officials at the school pushed back against the allegations.

“I don’t believe there was massive cheating for Grades 3 to 6 here,” school board president Patricia Roe told parents last week, according to the Indianapolis Star.

Flanner House is a charter school where over 90 percent of students qualify for free school lunches and nearly all are African-American. It shocked school-watchers in spring 2013 by recording some of the highest ISTEP+ passing rates in the state, after a fairly mediocre performance in previous years.

That apparently sparked an investigation, and the state education department reported last week that students' test sheets included an unusually high number of wrong-to-right answer changes, suggesting someone was guiding them. Some test booklets, the state said, had answers in more than one handwriting, including sections that appeared to be written by an adult.
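For readers wondering how an erasure audit works in principle, here is a toy sketch in Python. The classroom names and wrong-to-right counts are made up for illustration, and the state's actual methodology is far more sophisticated, but the basic logic is the same: flag groups whose wrong-to-right answer changes sit far above the average.

```python
from statistics import mean, pstdev

# Hypothetical wrong-to-right (WTR) erasure counts per classroom.
# Real audits use per-student answer-change data from the scoring vendor.
wtr_counts = {
    "Room A": 2, "Room B": 3, "Room C": 1, "Room D": 2, "Room E": 2,
    "Room F": 3, "Room G": 1, "Room H": 2, "Room I": 14, "Room J": 2,
}

mu = mean(wtr_counts.values())
sigma = pstdev(wtr_counts.values())

# Flag any classroom more than two standard deviations above the mean --
# an unusually high WTR count is a signal for review, not proof of cheating.
flagged = [room for room, n in wtr_counts.items() if (n - mu) / sigma > 2]
```

A flag like this only starts an inquiry; investigators then look at corroborating evidence, such as the multiple-handwriting booklets the state described.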

Should we accept school segregation by social class?

The Bloomington Herald-Times had an excellent series of stories recently about Fairview Elementary School, a local school that struggles with a high poverty rate and low test scores. The series should raise some hard questions.

For example, why are public elementary schools so thoroughly segregated by class and income in this small college town? Can we do anything about it? Should we try?

To answer the third question with a question: Shouldn’t we at least talk about it? As Richard Kahlenberg of the Century Foundation suggests in a 2012 article, America isn’t doing students any favors with its system of schools that are divided by social class.

The research is clear. Low-income students in middle-class schools are surrounded by: (1) peers who, on average, are more academically engaged and less likely to act out than those in high-poverty schools; (2) a community of parents who are able to be more actively involved in school affairs and know how to hold school officials accountable; and (3) stronger teachers who have higher expectations for students.

Why are local schools divided by class? The usual answer is that we all want neighborhood schools, and there are rich and poor neighborhoods, so schools reflect this reality. But that’s only part of the story.

Sure, affordable housing tends to get clustered in areas where property is cheap. In Bloomington, there are areas where housing is expensive and areas where it is less so.

But our elementary schools for the most part aren't what we think of as neighborhood schools, where kids walk to and from school and the buildings serve as centers for neighborhood activities.

Records request leads to wait

A year ago this week, I filed a public-records request with the Indiana Department of Education. I’m still waiting to see if I’ll get what I asked for.

Kelly Bauder, a state DOE staff attorney, admitted this week that the department has been running behind on responding to a trove of records requests. Two employees who were working on the task left the department, she said. A new legal assistant has been hired and is learning the ropes.

“We’re hoping to get caught back up in the next couple of weeks,” she told me.

My request was for copies of departmental emails from 2012 concerning changes in the state's school grading system. The objective is to tie up a loose end in a story.

Last summer, Associated Press reporter Tom LoBianco disclosed DOE emails showing how former Superintendent of Public Instruction Tony Bennett and his staff scrambled to tweak the system so Christel House Academy, an Indy charter school run by a Bennett political supporter, would get an A instead of a C.

Those emails showed the department decided not to count the performance of Christel House's 9th- and 10th-graders for accountability purposes. That boosted its grade from a C to a B. How did it get to an A? Thanks to Cynthia Roach, director of assessment for Indianapolis Public Schools, we learned the other change: getting rid of a "ceiling" on points awarded to elementary schools for math or English test scores.

But it was never clear when, why and by whom that decision was made.

Math for America president: ‘Value-added’ flawed, but advocates won’t let go

John Ewing wrote a classic 2011 journal article, titled “Mathematical Intimidation,” that lamented the growing use of value-added models to evaluate schools and teachers. Three years later, he sees little evidence that education policy makers understand or care about the flaws in the approach.

Yes, he said, critics of value-added have grown more vocal. But lots of people with power and influence are still wedded to the idea that we can use test scores to identify bad teachers – and either weed them out of the profession or make them improve.

“People just can’t let it go,” Ewing told me this week. “Policy makers bought in, in a big way, and they can’t let go of it.”

John Ewing (Math for America)

Ewing, a mathematician, is president of Math for America, a New York-based organization that promotes mathematics education. He previously spent 14 years as executive director of the American Mathematical Society. Before that, he was an Indiana University math professor for two decades.

“Mathematical Intimidation” was directed at his fellow mathematicians, urging them to stand against policies that make bad use of their discipline. But it’s a concise, easy read. You don’t need to be a mathematician, or even know a lot of math, to follow its clear and persuasive argument.

Ewing wrote that proponents of value-added use the supposed objectivity of the models – they're based on mathematics, after all – to close off discussion of what the goals of education should be. But the models rest on a shaky foundation: the idea that standardized tests in math and English provide a valid and complete measure of what schools and teachers should accomplish.

Does growth model measure up to Indiana law?

There’s a lot to be said for the Indiana Growth Model, the statistical method that Indiana uses to calculate year-to-year student growth on math and English test scores. It’s far from perfect, but it’s a much better measure of how schools are doing than looking at how many students pass the tests.

It may be a challenge, though, to use the model while complying with a state law that says Indiana must measure students’ growth in relation to their proficiency on state standards.

That’s especially true now that Indiana has adopted new “college and career ready” standards and will be giving a new version of the ISTEP+ exams, aligned with the new standards, in the spring of 2015. Is it possible to measure growth in proficiency when you give a test for the first time?

The law in question is House Enrolled Act 1427, adopted in 2013. It says accountability measures “must be based on measurement of individual student academic performance and growth to proficiency” and “may not be based on a measurement of student performance or growth compared with peers.”

But the State Board of Education voted this month to use the growth model in 2015. Indiana’s Center for Education and Career Innovation and the state Department of Education recommended the approval.

CECI and DOE staff cited an analysis by testing expert Damian Betebenner, who helped design the Indiana Growth Model and advises the state. He suggests using a statistical adjustment called “equi-percentile concordance” to correlate the 2014 test with the new, 2015 test. That, he says, will make it possible to keep using the growth model to measure students’ test-score gains.
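In rough terms, equi-percentile concordance matches scores on the two tests by percentile rank: a student at, say, the 60th percentile on the old test is mapped to whatever score sits at the 60th percentile on the new test. Here is a minimal Python illustration with simulated score distributions; the scales are invented, and Betebenner's actual procedure involves smoothing and other refinements this sketch omits.

```python
import numpy as np

# Simulated scale scores -- purely illustrative, not real ISTEP+ data.
rng = np.random.default_rng(0)
old_scores = rng.normal(500, 80, 10_000)  # hypothetical 2014 test scale
new_scores = rng.normal(300, 50, 10_000)  # hypothetical 2015 test scale

def concord(score, old, new):
    """Map a score on the old test to the new-test score at the same percentile."""
    pct = (old < score).mean() * 100        # percentile rank on the old test
    return float(np.percentile(new, pct))   # score at that percentile on the new test
```

The concordance doesn't say whether a student gained or lost ground on a common scale; it only lines up the two distributions so that norm-based growth comparisons remain possible, which is exactly the limitation Betebenner's report flags.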

But his report to the board also says that, with the move to new ISTEP+ exams, it won't be possible to evaluate students' gains or losses on a single test from one year to the next. "Without gains/losses," he writes, "growth must be calculated using norm-based metrics that compare like students as they progress from the ISTEP+ to the Career and College Ready Assessment." (Italics added).