Did Closing School Buildings Cause Test Scores To Drop? Looking For Evidence.

When the National Assessment of Educational Progress (NAEP) results dropped, the usual scramble ensued, as folks worked to use the data to bolster whatever policy ideas they already supported.

One particular narrative emerged in three parts.

First, the test score drop represented a “terrible loss.” The New York Times fretted that “The Pandemic Erased Two Decades of Progress in Reading and Math.”

Next, some critics laid blame for that erasure on the pandemic closure of school buildings. And for public education critics, that could in turn be blamed on teachers and their unions.

In response to an AP tweet about low NAEP scores, Tom Bevan, president and co-founder of Real Clear Politics tweeted, “Randi Weingarten and all of the members of teachers' unions who fought tooth and nail to keep kids locked out of school are responsible.” Walter Blanks, Jr., the press secretary for the DeVos-founded American Federation for Children, blamed teachers and declared, “Devastating proof of how school closures harmed children should inspire parents to end this interest group’s education stranglehold.” A PR firm has been e-mailing journalists that the leaders of Moms for Liberty are available to “discuss how the unions are responsible for the newly announced 20-year-low in student learning loss.”

There is no question that the pandemic interrupted the usual progress of education, but separating out the actual factors and their impact is the challenge (particularly with so many people angling to “interpret” results in a manner that supports business initiatives). The NAEP people are themselves crystal clear that suggesting a causal relationship between test scores and any single factor is a mistake.

Still, the theory that test scores plummeted because school buildings were closed seems open to checking with some quick sifting of data. Can we find a pattern in test results?

NAEP’s own breakdown of data shows the drops were not consistent for all groups. In reading, rural and city students showed no significant drop in scores, but suburban and town students did. When that is broken down to show students in the top and bottom groups, the top and bottom 25th percentiles of town test takers showed drops that NAEP called not significant, in both reading and math.

Breaking the data down by region of the country, we find no significant change in reading scores in the West. Once again, if we look at the top and bottom quarter, we find even more areas where the reading scores did not decline significantly, including top scores in the South, both groups in the Midwest, and the bottom quartile in the Northeast.

There’s no clear correlation with how long different areas kept school buildings closed. There’s no clear pattern in the subgroups for NAEP results.

State-by-state NAEP results are not yet available. But we can compare results from each state’s own annual testing, looking at 2021 scores vs. 2019 scores (mandated annual testing was suspended in 2020).

The school opening tracker at the burbio website gives us a state-by-state index of what percentage of school buildings were open, virtual, or hybrid. In January of 2021, the site lists just seven states as 80-100% open, with ten states 0-20%. By comparing state test scores from 2021 to 2019 (the last pre-pandemic test round), we should see if any patterns emerge.

The short answer is that they do not.

Arkansas was one of the first states to reopen school buildings. It administers the ACT Aspire test to students in grades 3 through 10. Without diving too deeply into the data, we find that in 2019, 72% of 3rd graders met readiness benchmarks, and in 2021 that percentage had fallen to 61%. In 2019, 60% of 10th graders met the benchmark, and in 2021, 55% met it. When that cohort of 10th graders was tested as 8th graders in 2019, 74% met the benchmark.

Florida has taken great pride in its efforts to keep schools open. Looking at its 3rd grade results, we find that from 2019 to 2021, the percentage of students achieving “Level 3 or above” in English language testing went from 58% to 54% (the 2022 numbers dropped to 53%). Grades 3 through 10 showed a similar drop in English language scores, while math numbers for those grades went down 10%.

California was one of the later states to re-open its school buildings. The state tests students in 3rd through 8th grade, plus 11th grade. In 2019, 51% of those students met or exceeded the standard for English Language, and 40% met or exceeded standards for math. In 2021, those numbers were 49% and 34% respectively.

Washington was another state with a very low level of open buildings, and it did experience a large drop in the percentage of students meeting standards over the 2019-to-2021 span. English dropped from 61% to 52%, while math dropped from 50% to 33%. Yet some of the best-of-the-worst results came from the greater Seattle area, which was Ground Zero for the strictest pandemic responses.

If we look at currently available data, we’d be hard-pressed to find a difference between states related to how long their buildings were closed.

The lack of any strong, clear pattern holds true for the other states. Anyone can do similar research simply by sifting through state testing results. Stephanie Tait, an author and speaker who assembled a similar four-state comparison on Twitter, told me it took her about an hour.

The only visible pattern across various pandemic-era scores is that students with less access to wealth and resources were hardest hit, which is not so much news as simply an extension of what we already knew. Some students get the short end of the educational stick; when the stick gets even shorter, they suffer the most.

It’s possible that when NAEP releases state scores, a pattern may emerge, but given the patchwork of different pandemic responses and conditions within states, it seems unlikely that we will ever be able to link the drop in test scores to a single cause.

It may be as simple as a long period in which students neither practiced nor prepped for a big standardized test. It may be as complicated as different traumas and stresses for every individual student. What is certain is that anyone pointing to a single cause and labeling it the one reason for a drop in test scores is wrong.
