The GiveWell Blog

Philanthropy Action points to more evidence on education interventions

Board member Tim Ogden writes,

Mathematica Policy Research has conducted a multi-year randomized controlled trial of sixteen educational software programs (covering both reading and math) aimed at elementary and middle school students. The products selected were generally those that had at least some evidence of positive impact … the educational software didn’t make much difference.

The second-year study included 3,280 students in 77 schools across 23 districts (page xvi; details on sample sizes on pages 4 and 9) in first, fourth, and sixth grade (page 70), and randomly assigned classrooms (page 65) to incorporate or not incorporate one of ten software programs (see page 70). Effects on test scores (details of tests on page xviii) had not been statistically significant for any grade in year 1 (pages xviii-xx); second-year effects were not statistically significant for first- and fourth-graders, and were mixed (better in one case, worse in another) for sixth-graders (page xx).

The results are consistent with the fairly substantial body of evidence that developed-world education is an area in which it is extremely difficult to achieve significant results (including research discussed in recent blog posts here and here, as well as other examples of failed programs we've discussed).

Note that the second-year study was released a couple of months ago, though we learned of it via Mr. Ogden's recent blog post. Also note that we haven't thoroughly examined it: it does not point to a new promising approach, but rather adds more evidence to a theme we've noted many times.

Mr. Ogden also discusses research on education in the developing world, about which we’ll have more to say later.