[Ed. Note: The following, by DCPS parent and education researcher Betsy Wolf, concisely unpacks a recent study by Stanford University’s Center for Research on Education Outcomes (CREDO), which has been funded by pro-charter and ed reform interests. The study compares the performance of charter school students to that of students in traditional public schools. It comes at an interesting moment: test scores nationwide have decreased, covid has affected student groups differently, and DC is pursuing boundary and master facilities plans based in part on school quality, which is itself largely based on test scores. In other words: buckle up for a bumpy 2024 around measures of student progress and attendant propaganda.]
By Betsy Wolf
I read the recent CREDO charter school study so you don’t have to. This is a QED [quasi-experimental design] study in which charter students are matched to students attending “feeder” neighborhood schools based on grade, gender, free lunch status (which varies by state), race, sped [special education] status, ELL [English language learner] status, and “similar” grade 3 test scores.
The charter sample is restricted to students who attended only charters during the study period, and the traditional public school (TPS) sample to students who attended only TPS during the study period. This eliminates students who moved between charter and TPS schools.
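The matched design described above can be sketched in code. To be clear, the categorical covariates come from the study’s description, but the exact-match rule, the ±0.1 SD score caliper, and all names here are my illustrative assumptions, not CREDO’s actual matching procedure, which differs in its specifics:

```python
# Sketch of the study's matching design (assumed details: exact match on
# categorical traits plus a prior score within +/- 0.1 SD; the real
# procedure differs in specifics).
from dataclasses import dataclass

@dataclass
class Student:
    grade: int
    gender: str
    frl: bool            # free/reduced-price lunch
    race: str
    sped: bool
    ell: bool
    grade3_score: float  # prior achievement, in SD units
    sector: str          # "charter" or "tps"

def find_match(charter_student, tps_students, caliper=0.1):
    """Return the 'feeder' TPS student matching on all categorical
    covariates with the nearest grade-3 score, or None (unmatched)."""
    keys = ("grade", "gender", "frl", "race", "sped", "ell")
    candidates = [
        s for s in tps_students
        if all(getattr(s, k) == getattr(charter_student, k) for k in keys)
        and abs(s.grade3_score - charter_student.grade3_score) <= caliper
    ]
    return min(candidates,
               key=lambda s: abs(s.grade3_score - charter_student.grade3_score),
               default=None)
```

Charter students with no qualifying TPS counterpart come back as `None`, which is exactly how a study like this ends up with the 19% unmatched share discussed next.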
The study found matches for 81% of charter students, meaning the study did not examine outcomes for about 19% of charter students.
Relative to the “feeder” TPS, charter schools in the study served fewer students in poverty (a 2 percentage point difference, PPD), fewer English learners (2 PPD), fewer students with disabilities (SWD; 2 PPD), fewer White students (8 PPD), and more Black students (9 PPD).
First, the report tries to rule out a “charter school student advantage.” It examines prior achievement by comparing charter students who were previously in TPS with students who remained in TPS. But this doesn’t examine prior achievement for all charter students (just those who were previously in TPS; we don’t know what percentage this is). The report finds slightly lower prior achievement for the charter in-movers than for the TPS stayers, which might be because mobility itself can lower achievement.
The report indicates that this analysis rules out systematic differences in other unobserved factors (like motivation). But that is precisely the problem with omitted variable bias: there may be systematic differences on variables you don’t observe in your data, like motivation.
Given that charter students opt in rather than being enrolled by default, there are likely systematic differences that are unobserved in the data. That’s why this is a QED study. To be clear, this still appears to be a reasonably well-done QED study, but it can’t make causal claims.
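To make the omitted-variable concern concrete, here is a toy simulation; the numbers are entirely hypothetical, not CREDO’s data. The true charter effect is set to zero, but an unobserved “motivation” trait both drives charter enrollment and raises later scores, so matching on the observed grade-3 score alone still produces a spurious positive “charter effect”:

```python
import bisect
import math
import random

random.seed(1)

# Toy data: motivation is unobserved; the grade-3 score is observed and
# matched on; the TRUE charter effect on the outcome is zero.
N = 20_000
students = []
for _ in range(N):
    motivation = random.gauss(0, 1)                       # unobserved
    grade3 = random.gauss(0, 1)                           # observed
    charter = random.random() < 1 / (1 + math.exp(-motivation))
    # Outcome depends on prior score and motivation, NOT on charter status.
    outcome = 0.5 * grade3 + 0.3 * motivation + random.gauss(0, 1)
    students.append((charter, grade3, outcome))

# "QED" estimate: match each charter student to the TPS student with the
# nearest grade-3 score (within a 0.05 SD caliper).
tps = sorted((g, y) for c, g, y in students if not c)
gaps = []
for c, g, y in students:
    if not c:
        continue
    i = bisect.bisect_left(tps, (g,))
    near = min(tps[max(i - 1, 0):i + 1], key=lambda t: abs(t[0] - g))
    if abs(near[0] - g) <= 0.05:
        gaps.append(y - near[1])

bias = sum(gaps) / len(gaps)
# The estimate comes out positive even though the true effect is zero:
# matching on observables cannot remove bias from an unobserved trait.
print(f"estimated 'charter effect' with zero true effect: {bias:+.3f} SD")
```

The point of the sketch is not that the CREDO estimates are this badly biased, only that a matched design has no way to detect a confounder it never measures.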
Second, the study then looks to see if charter students are outperforming TPS students in grades 3-12 in 31 states (including DC) in years 2014-19.
Overall results for charter students: +0.028 standard deviations (SDs) in reading and +0.011 SDs in math, which are very small effects. Results are also presented as “days of learning,” which makes the effects seem much larger than they actually are.
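For a sense of scale: under a normal distribution of scores, a +0.028 SD shift moves a median student only about one percentile point. The effect sizes below are from the study; everything else is just the standard normal CDF, computed here via the error function:

```python
import math

def sd_to_percentile(effect_sd):
    """Percentile rank of a median student after a shift of
    `effect_sd` standard deviations, assuming normal scores."""
    # Standard normal CDF: Phi(x) = (1 + erf(x / sqrt(2))) / 2.
    return 100 * 0.5 * (1 + math.erf(effect_sd / math.sqrt(2)))

# The study's overall charter effects, as where a median TPS
# student would land after a shift of that size:
for label, effect in [("reading", 0.028), ("math", 0.011)]:
    print(f"{label}: +{effect} SD -> {sd_to_percentile(effect):.1f}th percentile")
# reading: 50th -> about the 51st percentile
# math:    50th -> about the 50th percentile
```

That percentile framing is one reason SD units read as “very small” while a “days of learning” conversion can make the same numbers sound dramatic.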
Results for Black and Hispanic students in poverty were larger: +0.05 to +0.06 SDs. Results for all students in poverty were +0.029 to +0.039 SDs. Results for students with disabilities were negative: -0.024 to -0.022 SDs.
Results for DC were 0.00 SDs in reading and +0.056 SDs in math. Results also show gains for Black students, and losses for White and Hispanic students, in DC charters relative to their TPS counterparts.
And there was a lot of variation in charter effects. Effects were larger in charters run by management organizations than in stand-alone charters. Online charter schools had large negative effects. And effects of established charters were more positive than effects of new charters.
Effects of charters were positive for reading in 18 of 31 states, which is 58% of states (but not DC), and positive for math in 12 of 31 states, which is 39% of states (including DC).
Another way to think about this is that charters did not do better than TPS in about half of states in reading and about 60% of states in math.
But this just tells us whether effects were positive, not whether there was a meaningful improvement.
The study also aggregates student achievement up to the school level and finds that charter schools (for matched students) performed better than their feeders (though by an unclear margin) about 36% of the time; performed the same as their feeders 39-47% of the time; and did worse 17-25% of the time.
So what to make of all this?
On average nationwide, we see very small positive effects for charters, but there is a lot of variation across charters and TPS. And, the study can’t rule out all concerns about unobserved systematic differences in charter versus TPS populations.
The report focuses on student achievement, but there are other outcomes we should care about. Like, what are the effects of opening charters on the whole system? On the students who are either “left behind” or pushed out? On the quality and diversity of the educator workforce?
This was my interpretation of the study results, and it took some time to dig through. There is always the chance that I didn’t fully understand what the authors were trying to convey.
The report also gives a long appendix of praise/shame for individual charter organizations. But again, I would take these with a grain of salt: in DC, the “feeder” schools may not be that comparable, which means the match rates might be quite low for a particular LEA [local education agency].
And this is what one of the authors says on a Wall Street Journal podcast, “So what this is showing is charters really are outperforming their public school peers and in some states very much so and especially among certain types of students, including Black and Hispanic students and those in poverty.”
https://www.wsj.com/podcasts/opinion-potomac-watch/can-data-change-the-debate-on-charter-schools/13123885-7ef9-4124-8eed-634031efa5cb
Yes. That is exactly why Betsy Wolf’s outline of the study is so valuable: It outlines what the study actually shows versus propaganda about the study.