School-choice advocates argue that children will get a better education if they can leave public schools for charter or private schools, especially in urban areas. The Indiana Growth Model tells a different story.
It suggests public schools, overall, are performing better than charter schools or the private schools — most of them religious schools — that receive state vouchers.
The growth model is a statistical tool that measures students’ test-score gains compared with those of students with similar academic histories. It may not be perfect, and critics argue that it shouldn’t be overused. But it’s unquestionably a better measure of school effectiveness than raw standardized test scores or school grades, which have been shown to correlate closely with student demographics.
You can download 2012-13 growth scores for all the schools in the state from the Indiana Department of Education website. Sort and rank them, and what do they show?
- For Indiana’s 1,400-plus public schools, the median score – the value at which half the scores are higher and half are lower – was at the 51st percentile in math and the 50th in English. That’s about what you’d expect: Most Indiana schools are public schools, so naturally the median score will be in the middle.
- For private schools reporting growth scores, median scores were at the 46th percentile in English and only the 40th percentile in math.
- For charter schools, median scores were at the 46th percentile in English and only the 36th percentile in math.
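For readers who want to replicate the tallies, the sort-and-summarize step can be sketched in a few lines of Python. This is a minimal sketch, not the exact procedure used for this post; the column names (`school_type`, `math_growth`) and the CSV layout are assumptions, since the actual IDOE spreadsheet uses its own headers.

```python
import csv
from statistics import median

def median_growth(path, subject_col, type_col="school_type"):
    """Return the median growth percentile for each school type.

    Hypothetical columns: one naming the school type, one holding
    the growth score for the chosen subject.
    """
    scores = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                value = float(row[subject_col])
            except (KeyError, ValueError):
                continue  # skip schools with no reported score
            scores.setdefault(row[type_col], []).append(value)
    return {kind: median(vals) for kind, vals in scores.items()}
```

Schools without a reported score are simply skipped, which matches the fact that not every private school reports growth data.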
To be sure, there are charter schools and private schools with strong growth scores. In fact the scores are all over the map for charter schools, private schools and public schools. And among public schools, there’s no clear pattern to which schools do well and which don’t. Schools with high growth include rural, suburban and urban schools; some are in affluent communities and others serve poor neighborhoods.
But the overall trend is clear: Schools that are part of public school districts do better.
In math, I count only 12 charter schools with above-average growth scores, compared with more than 50 with below-average scores. Charter schools with subpar growth scores include highly regarded Indianapolis charters like KIPP, Andrew J. Brown Academy and Carpe Diem. Christel House Academy, which former state Superintendent Tony Bennett helped get an A in 2012, had growth scores of 25 in math and 23 in English. (Its scores were better last year.)
Private schools also did worse than public schools, as a group, when it came to growth on English tests. And in math, fewer than 80 private schools had better-than-average growth scores, and more than 200 had below-average scores.
Even among Catholic schools, it appears that, in math, more than twice as many had below-average growth scores as had above-average scores.
Public opinion has put a halo on Catholic schools ever since the Coleman Reports from 30-plus years ago concluded they were more effective than public schools. But that idea has been challenged by recent research, including a study by Todd Elder at Michigan State that found Catholic schools are worse than public schools at improving student performance.
It’s certainly true that improving test scores isn’t the only thing we want our schools to do, so the Indiana Growth Model shouldn’t serve as the be-all and end-all of school evaluation. It’s reasonable to wonder if a relentless focus on raising math and English scores can cause schools to neglect other important aspects of learning.
But if nothing else, the growth-model results should undercut the argument that we need charter schools and vouchers so children can “escape” failing public schools. And they should raise questions about the parts of Indiana Gov. Mike Pence’s legislative agenda that seem designed to tilt the playing field in favor of charter schools.
Quite a few public schools – including a number in the maligned Indianapolis Public Schools district — are doing great, as measured by student growth. And quite a few charter and private schools are not.
Notes: Readers are welcome to download the linked spreadsheets and check the calculations. I may have missed or misidentified some charter schools, but probably not enough to affect the overall results. I counted Indiana’s four “turnaround” schools as charter schools because they seem to operate independently as charters do. Some people might quibble with that approach, but – even though their growth scores were quite low – excluding them wouldn’t change the findings.
The communications director of the Indiana Public Charter Schools Association and the director of the Indiana Non-Public Education Association did not respond to emails asking if they could explain the growth-model results.
These results would be front page news…if the results were the other way around.
Why not send this to Mr. Tully of the Indianapolis Star? He almost always touts charters over public schools.
Keep up the good investigating, Steve. Public school parents & students in Indiana are well served by your work. The reformers won’t like it because their movement is about making money not educating kids.
Thanks, Beth. It means a lot to me that you appreciate this.
You’ll have to help me understand… You’re talking about “growth,” or improvement in scores over time, right? So let me ask you: How would the growth scores look for a school that has shown 98/99/100% pass rates and 80%-plus pass-plus rates for a number of years in a row? Would the “growth scores” be low?
Thanks for the question. The growth model measures changes in individual students’ test scores, and it shouldn’t reflect whether a school has a high pass rate or pass-plus rate. I don’t think there’s any reason to expect low growth at a school where most kids pass the tests.
Some people seem to think that students with high scores are at a disadvantage for growth. As I understand the model, that’s not the case. Each student’s growth percentile score is based on comparing his or her improvement with that of other students with the same test-score history. A student who starts with a high score is expected to grow as much as other students who start with the same high score. A student who starts with a low score is expected to grow as much as other students who start with the same low score.
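To make that concrete, here is a toy sketch, assuming the model works roughly as described above: a student’s score gain is ranked only against peers with the same prior score, so a high starting score is no handicap. The real Indiana Growth Model uses more sophisticated statistics than this, and the data here is invented purely for illustration.

```python
def growth_percentile(prior, gain, cohort):
    """Percent of same-prior-score peers whose gain this student beat.

    cohort: list of (prior_score, gain) pairs for all students.
    Returns None if the student has no peers with the same prior score.
    """
    peers = [g for p, g in cohort if p == prior]
    if not peers:
        return None
    beaten = sum(1 for g in peers if g < gain)
    return round(100 * beaten / len(peers))
```

In this sketch, a student who started at 90 and a student who started at 50 are each ranked only within their own peer group, so neither starting point carries a built-in advantage.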
I believe you are suggesting this type of conclusion: “Students who consistently achieve 95+% on tests and have been strongly supported for years have just as much room to improve as someone who gets, perhaps, 80% and is under-supported by the institution, the state and at home.”
Yes, all have room to grow. However, the 82% student may, with different teaching styles, additional assistance and increased resources, be able to fully reach his potential (that is, the highest level to which he can actually perform) and improve his scores by quite a bit. A student who has already been actualized and is already performing to the best of his abilities may have little to no room to grow.
Now, I ask you… Should we compare those two environments and situations? Is it fair to EITHER school to draw improper comparisons? If we score on grades alone, one school looks bad. If we compare on growth alone, the other looks bad.
But the student who consistently achieves 95+ percent is being compared to other students who consistently achieve 95+ percent. There’s no expectation, in the growth model, that she improve her test scores by as much as the student who starts with a lower score. If you look at the numbers, there are a lot of schools with high achievement and high growth and a lot of schools with low achievement and low growth.
Yet you are still suggesting that growth would be expected, no? If two private schools had flat growth and two public schools had large but differing growth, are you saying they would not be and should not be compared across the two types of school? No, of course you aren’t. In fact, the whole premise of the article is comparing the growth of the two types of schools. So if high-performing schools with actualized students have flat growth, then, using the logic presented in these articles and citations, they would not be as strong as schools with high growth. (Or at least people would be allowed to reach that conclusion.)
Am I misunderstanding?
I think you’re conflating students with schools. I don’t see any reason that high-achieving students, or schools, would be disadvantaged by a focus on growth. I personally think schools should support growth for all students, whether high-achieving, low-achieving or in between. I’m not a fan of any over-simplistic method of evaluating schools. But hey, people evaluate/compare/rate schools all the time. If we’re going to do it, it’s much better to look at growth than to look at achievement (average test scores, passing rates), which tells us little about what the school is doing.
Steve: I’m a former IN PS teacher and a strong critic of vouchers and charters. See my “The Great School Voucher Fraud” at arlinc.org. Only 1,400 schools in IN? Really? Please explain. — Edd Doerr (email@example.com)
I think the number is low because there are no growth scores for high schools — no year-to-year testing in the same subjects.
High-scoring students demonstrate high growth too, often gaining more than nine months of achievement in nine months’ time. It’s reasonable to assume that brighter students are capable of as much growth as less capable students, or more. But many at-risk and poor students are bright as well and demonstrate more than a year’s growth in a year’s time.
As noted in comments above, one has to be VERY careful about how growth is defined. If it means that more kids passed a 3rd-grade test this year than last year, the measure is not very helpful: it compares different groups of students rather than how much the same students have progressed.
As a parent, I wanted to know if my child was on pace to achieve the expected amount of growth for her age, grade, and subject, regardless of how other students progressed. I could determine her growth by how much difficulty she had with her homework, by checking her graded papers, and by her grades. It was somewhat interesting to know how my child compared to others but not compelling. Parents don’t need a state-wide data bank to measure their child’s progress, and state, national, and international data can often mislead. There’s no good substitute for parental monitoring of student progress, and it’s not difficult to do.
Thanks, Nancy — very helpful comments.