What worked (and didn’t) this year: 10 lessons from education research to take into 2019

It’s hard to keep up with education research. So with the end of the year approaching, we’re here to help.

We’ve synthesized what we learned from research in 2018, focusing on which policies seemed to work and which didn’t. We’re using “what worked” as a shorthand for policies that improved test scores or affected metrics like suspensions, attendance, and high school graduation rates.

(A few important caveats apply: Sometimes, policies affect some measures but not others. And just because a policy works one place doesn’t mean it will succeed elsewhere.)

With that out of the way, here are 10 takeaways from a year of education research.

What worked: Addressing the effects of child poverty

One way to help students in poverty do better in school has nothing to do with schools themselves. That’s the conclusion of a bevy of studies we wrote about this year: improving the conditions of poor children, by just making their families less poor, translates to better outcomes.

Cash benefits, the earned income tax credit, food stamps, and health insurance programs are among the anti-poverty programs connected to increases in student learning or the number of years students persist in school. The benefits were similar to those seen from effective school improvement efforts.

We also looked more closely at some specific programs. For instance, the timing of food stamps affects student learning, with students scoring better on exams a few weeks after the benefits are provided. A program to address lead poisoning through extra health services led to dramatic reductions in suspensions, absences, and crime rates in schools. Children’s health insurance programs caused increases in how long kids stay in school. Universal free lunch programs can reduce suspensions and improve kids’ health.

What worked: Giving students familiar peers and teachers

Research on schools has found that familiarity doesn’t breed contempt — it seems to spur more learning.

One paper found that students scored slightly higher on state tests when they had the same teacher in back-to-back years. Two other studies found that elementary school students scored worse when they had different teachers for core subjects, rather than just one teacher, as is more common. And another paper found that students learned less when their teacher left mid-year, perhaps because of the disruption caused by having a new teacher.

The benefits seem to come from familiar classmates, too. Elementary school students are less likely to be absent when they have the same classmates in consecutive years, recent research has shown.

What worked: Assigning top teachers as mentors to student-teachers

Here’s a common-sense way to improve teacher preparation: ensure prospective teachers are paired with mentor teachers who are themselves effective.

A trio of recent studies was among the first to document that teachers are more effective when they first taught under the supervision of a high-quality teacher. Notably, the studies found that the experience level of the mentor teacher was less important. The benefits were fairly modest, but they’re encouraging for policymakers who have long struggled to find ways to improve teacher prep.

What worked: Giving struggling students extra learning time

Here’s another common-sense result that research bore out this year: struggling students benefit from extra time in school.

In particular, two recent studies in Massachusetts cities found that students benefited from a “spring break academy,” where some kids were given the chance to get intensive test prep with small classes over spring vacation. In one of the studies, students were not only more likely to be proficient on state exams, they were also less likely to be suspended over the rest of the school year. The results are in line with research on intensive small-group tutoring during the school day for students who are behind.

An important caveat, though: the spring break academies were offered only to certain students, and excluded students with behavioral or attendance problems. That raises concerns about whether the approach works for some in part because it leaves other kids behind.

What worked: Performance pay for teachers

A handful of high-profile studies several years ago suggested that teachers simply didn’t respond to the promise of higher pay based on performance. But one major study suggested that merit pay may have merit after all.

The random-assignment study, released by the federal government, compared schools that gave teachers raises based in part on their evaluation scores to those that gave raises to all teachers. Students at the schools with performance-based pay saw slightly higher test scores as a result, and teachers were less likely to leave the school.

What worked: Air conditioning

Teachers, students, or really anyone who has been in a hot room knows the temperature makes it hard to pay attention. A study from earlier this year documented that high temperatures cause high school students to perform worse on the PSAT. The schools and classrooms that appeared unaffected? Those with air conditioning.

What (kind of) didn’t work: School vouchers

In last year’s review, we noted that the latest research suggested that private school vouchers hurt student test scores, but those effects might not last for students who use the voucher for multiple years. This year, though, a couple of studies suggested that the lower test scores caused by attending a private school with a voucher do persist.

A study in Washington, D.C. found that lower test scores in math continued for two years after a student was in the voucher program. (That study also found that the vouchers did improve parents’ perceptions of school safety.) And a revised study in Indiana showed that, contrary to an earlier version, students lost ground in math after four years in a private school.

But things become a bit more complicated when looking at other research on vouchers. A study in Milwaukee showed that voucher recipients were more likely to attend college; in D.C., there was no effect on college attendance.

In Louisiana, studies showed huge initial drops in test scores, which bounced back in some cases, for elementary and middle school students. For high school students, vouchers had no effect or even a modest positive impact on college attendance.

If that sounds like a complicated verdict, that’s because it is. If you’re curious for more, check out our updated overview of voucher research.

What didn’t work: Holding students back a grade

One approach meant to help struggling students catch up may actually leave them further behind, according to research on grade retention in New York City and Louisiana. Specifically, students held back a grade in middle school were much more likely to drop out of high school as a result. Evidence on retention in elementary school is more mixed, but also doesn’t point to clear benefits of the policy.

One complicating factor here is that some states also offer extra summer school for students at risk of being held back, and that may be beneficial to students.

What didn’t work: Tougher teacher evaluations

Ramped-up teacher evaluation systems came without meaningful benefits, but produced a major unintended consequence, according to two studies this year. A study of the Bill & Melinda Gates Foundation’s grants to districts and charter schools to put in place new evaluation systems showed no clear gains for students. (Gates is a funder of Chalkbeat.) A separate study found that the national push for tougher evaluations — and weaker tenure protections — deterred a substantial number of prospective teachers, perhaps due to fears of weaker job security.

Keep in mind that other evaluations of district-specific evaluation changes have pointed to much more encouraging results, suggesting that the jury is still out on evaluation reforms. Another study from this year described how the new teacher evaluations have changed the jobs of principals, both positively and negatively.

What didn’t work: Cutting school spending

Results on “the nation’s report card” — that is, the federal NAEP tests — arrived this year, and scores were basically stagnant. Some pundits saw a “lost decade” of educational progress.

One potential culprit, backed by some research, is the spending cuts schools faced in the wake of the Great Recession. One study found that states that had steeper cuts made less progress on NAEP as a result. That’s in line with other research we looked at this year on whether schools work better for students when they have more money to spend. The short answer is yes.

“By and large, the question of whether money matters is essentially settled,” concluded one major review of the research.

Want even more of what works? Here’s last year’s version of this retrospective.