Normally, when somebody hears about an evaluation of an education program, they reasonably assume the evaluation will tell them whether the program is working or not. When reading an evaluation report, policymakers, parents, and educators hope the evaluation will tell them if the program is helping the participating students. These seem like obvious, uncontroversial points.
On Monday, June 4, researchers from N.C. State released “an evaluation of the North Carolina Opportunity Scholarship Program,” North Carolina’s largest private school voucher program. The authors enthusiastically publicized and distributed the report, making sure to provide advance copies to media organizations and pro-voucher advocacy groups. The report has been highlighted by all of the state’s major media outlets, including being the first story greeting visitors to EdNC.org all of last week.
But there’s a problem: the report fails to tell us whether the Opportunity Scholarship program is working. The researchers’ efforts tell us nothing about whether accepting an Opportunity Scholarship will help or harm a student’s education.
The report’s primary flaw is that it has no external validity. That is, the students tested as part of this study are different from the average Opportunity Scholarship student. As a result, there’s no reason to think that the untested Opportunity Scholarship students would similarly outperform their public school counterparts. As the Charlotte Observer’s Ann Doss Helms noted, just over half of the voucher schools that participated in the study were Catholic, while only 10 percent of all schools receiving Opportunity Scholarship vouchers are Catholic. Additionally, the report only looked at students who were recruited and volunteered to take a test. These students are different from the average voucher student.
Because of these differences, you can’t use the report to make claims about the average voucher student or the impact of the voucher program overall. The effects highlighted by the researchers only apply to the 89 Opportunity Scholarship students (in the researchers’ preferred comparison) who volunteered to be tested, representing just 1.6 percent of the 5,624 Opportunity Scholarship students in the 2016-17 school year. The report tells us nothing about the other 98.4 percent of Opportunity Scholarship students.
Unfortunately, one would have to carefully read the report to reach these conclusions. The press release fails to adequately warn readers of the paper’s limitations. One would have to dig into the ninth paragraph of the Charlotte Observer’s story on the report to find a clear description of the report’s shortcomings:
“N.C. State researcher Anna Egalite says the study she and her colleagues conducted provides valuable insights but doesn’t mean the average scholarship recipient is outperforming peers who stayed in public schools.”
Instead of highlighting the report’s fatal limitation (i.e., it doesn’t tell us whether the Opportunity Scholarship program is working or not), the authors’ press release highlights the “large positive impacts associated with voucher usage in North Carolina.” Predictably, voucher advocates have seized upon this conclusion and exaggerated the report’s findings, spreading a false narrative of the program’s effectiveness. Notably, several of the organizations making these misleading claims share funding with the report’s primary funders: the John William Pope Foundation and the Walton Family Foundation.
In addition to the report telling us nothing about the performance of the average Opportunity Scholarship voucher student, there’s also good reason to doubt that the small number of tested voucher students are actually performing as well as the report indicates. The report attempts to compare test results of voucher students against similar non-voucher students. But despite the sophisticated statistical methods of the researchers, the groups still differ in ways that could affect the findings:
- The voucher students were recruited by a disreputable voucher advocacy organization: The participating Opportunity Scholarship students were recruited by Parents for Educational Freedom in North Carolina (PEFNC), a pro-voucher advocacy organization. The report’s authors make no effort to determine the extent to which PEFNC cherry-picked not just specific schools, but also specific students from those schools. It’s worth noting that PEFNC has a track record of dishonest advocacy. When this author was on the staff of the North Carolina General Assembly, PEFNC tried to pass off falsified data on private school tuition costs in an effort to make the Opportunity Scholarship program appear less costly to taxpayers. Additionally, the organization continues to maintain a website on the “unfairness” of charter school funding, even though their claims have been debunked.
- The non-voucher students were likely from lower-income families than the voucher students: To be eligible for an Opportunity Scholarship voucher, students’ families must have an income at or below 246 percent of the federal poverty level. Yet in selecting the control group, the N.C. State researchers only selected public school students from families with incomes up to 185 percent of the federal poverty level. The researchers claim they control for this by comparing students’ prior-year test results, but that assumes that income differences had no impact on student performance in the ensuing school year.
- The non-voucher students were likely in higher-need schools than the public schools the voucher students would have been attending in the absence of the program: Opportunity Scholarship students can come from any school in the state. Yet in selecting the control group, the N.C. State researchers selected public school students from the highest-poverty schools in four districts. The researchers claim they control for this by comparing students’ prior year test results, but that assumes that the effects of attending a high-poverty school had no impact on student performance in the ensuing school year.
- The voucher students and non-voucher students faced different motivations: What were these students told before they entered the exam room? It’s easy to imagine that the voucher students were told to do well, or they might have to change schools. By contrast, it’s difficult to imagine what would motivate the public school students to do their best on a test with no stakes for them.
In addition to the differences in student characteristics and motivation, the report explains that the test the students took is not aligned to North Carolina’s Standard Course of Study. If the test aligns more closely with the private schools’ curricula, that would explain some amount of the differences in test results. As a result, it’s not clear that the observed test score differences can be entirely attributed to participation in the Opportunity Scholarship program.
Regardless, the report’s biggest weakness remains that the results – even if accurately measured – tell us nothing about the program as a whole. Because the report only examines the test results of a small, non-representative sample of students who volunteered to participate, these results don’t tell us whether the average scholarship recipient is outperforming peers who stayed in public schools. The report (though not the press release) makes this flaw clear. To the extent the report tells us anything about student performance of voucher students, it only tells us about the 1.6 percent of voucher students recruited to take part in the study.
In short, this report is not an evaluation in the common understanding of the word. Despite the report’s publicity, it does nothing to tell us whether the Opportunity Scholarship is helping or hurting its students. And the roll-out, coordinated with right-wing advocacy groups, has done more to misinform than to inform the public.
The report’s authors have emphasized that one of their central goals in writing the report was to highlight how the Opportunity Scholarship’s existing policies prevent researchers from conducting an evaluation that would allow policymakers to draw conclusions about the program as a whole. If this were the case, the authors should have written a report that more directly and clearly highlights these shortcomings. They could have provided policymakers with options for strengthening the program’s existing accountability structure – currently the worst in the country – so that a meaningful evaluation could be conducted. For example, they could have recommended requiring Opportunity Scholarship students to take the state’s End-of-Grade assessments. They could have encouraged the General Assembly to limit Opportunity Scholarship enrollment, so that the performance of voucher recipients could be compared against similar public school students.
Unfortunately, few readers walked away from the report with these takeaways, and General Assembly lawmakers are about to conclude yet another legislative session without implementing meaningful evaluation and accountability measures on our voucher programs. Despite the N.C. State report, unfettered expansion of vouchers continues, and policymakers, educators, and parents still don’t know whether the program is working or not.