The GiveWell Blog

Truth please


The most common type of evidence non-profits have offered us in defense of their programs is the independently produced evaluation. In my experience, though, these evaluations sometimes read more like propaganda than like objective assessments of a program’s impact. I came across one glaring example of this phenomenon when reviewing the Vocational Foundation’s application for a Clear Fund grant in Cause 5.

This report by Public/Private Ventures is generally quite positive about VFI, but the thing that caught my eye was the following statement on page 8: “Of the 87 percent of enrollees who complete skills training, 78 percent are placed in jobs. The placement rate is well above the New York City average of 39.6 percent for youth employment programs.*” VFI is twice as good as average – that sounds great.

But is it? The problem is that pesky asterisk. Follow it to the bottom of the page and read the tiny type, and you’ll find that the average placement rate for long-term employment and training programs – the ones comparable to VFI’s – is actually 64%. It’s not clear whether this is the percentage of graduates who are placed – in which case VFI is noticeably higher, at 78%, though still nowhere near twice as good – or the percentage of enrollees who are placed, in which case VFI rings in at about 68% (87% × 78%), almost exactly average. Either way, it looks like P/PV is highlighting a number that it knows exaggerates the picture and burying the more accurate story in a footnote.

The rest of the paper goes on to describe VFI’s method – the student selection process, curriculum, and individualized “case management” – in an attempt to explain why that method succeeds, all while taking its success for granted. It makes no attempt to give evidence for particular practices; the assumption underlying the entire paper seems to be that VFI is successful, and therefore everything it does must be “what works.”

Of course, we’ve seen some great studies too – ones that work hard to assess whether a program actually has an impact, and don’t fall into the trap of simply extolling its virtues.

So, if you’re evaluating, don’t sell. Just give us the facts. And, if you’re reading a report, remember to read with a critical eye.