The GiveWell Blog

Invest in Kids

As part of our research into United States causes, we’ve been looking at Invest in Kids, an organization focused on implementing evidence-based programs in Colorado, and we recently had the chance to speak with Lisa Merlino, Invest in Kids’ Executive Director (edited transcript of our conversation (DOC)).

While our research is still in progress, we want to highlight some of the things we really like about Invest in Kids:

  • Founding story. Invest in Kids was started in the late 1990s by a group of Coloradans, mostly lawyers, who wanted to start an organization to help children in need. They considered their options and spoke with experts to identify programs with strong track records. Ultimately, they were convinced by the Nurse-Family Partnership’s strong evidence of effectiveness and decided to start an organization focused on implementing that evidence-based program. At that time, David Olds, NFP’s founder, was conducting the third randomized controlled trial of NFP’s model, and the NFP National Service Office (the NFP charity that GiveWell recommends) did not yet exist.
  • Ongoing program selection. After implementing NFP, Invest in Kids began looking for other evidence-based programs to implement. In 2003, they settled on the Incredible Years, another program that has been subject to rigorous evaluation. More recently, they participated in a clinical trial of the Good Behavior Game. According to Ms. Merlino, “This research trial was completed and although changes in child behavior trended in a positive direction, the preliminary data shows outcomes were not statistically significant for the children who received the intervention. Therefore, Invest in Kids has decided not to replicate the program at this time. However, anecdotally we heard powerful stories of improvement in teachers and children so we remain hopeful about the positive outcomes that may be seen from this intervention. We continue to await additional results from this and other trials around the country.”
  • Monitoring and evaluation. Ms. Merlino told us that Invest in Kids conducts ongoing monitoring of the programs it implements to assess whether the outcomes those programs achieve are in line with expectations based on the research. Invest in Kids has sent us these monitoring reports, but we haven’t yet had a chance to review them.

While our analysis of Invest in Kids is ongoing, we’re excited about them. Their general approach of looking to scale up what works should, in our view, serve as a model for other non-profit organizations. We’re looking forward to learning more about them over the next few months.

The Money for Good study

The Money for Good study’s headline finding is that “few donors do research before they give, and those that do look to the nonprofit itself to provide simple information about efficiency and effectiveness.”

That conclusion syncs up with our own experience talking to donors, but we aren’t discouraged by the results. That’s because where the Money for Good study answered the question “how do most donors behave?”, we’re interested in answering a different one: is there a market for giving based on evidence of impact, and how big is that market?

Hope Consulting shared their raw survey data with us, and we’ve done a rough estimate of the size of the potential “GiveWell market” by extrapolating the percentages in the survey to the size of the overall giving market. We estimate:

  • $4.1 billion from donors who report having done research to compare and evaluate multiple organizations (as opposed to researching a single organization or researching how much to give).
  • $3.8 billion if we further narrow the above set by looking at which factors are important to them, eliminating any donors who rank what we consider “factors irrelevant to impact” (e.g., “ability to get involved with the organization” or “public recognition of my donation”) higher than what we consider “factors relevant to impact” (e.g., “organizational effectiveness”).
  • $554 million from donors who both did research to compare organizations (i.e., fit in the first group above) and reported that the “amount of good the organization accomplishes” was the most important piece of information sought in their research.
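
To make the method concrete, here is a minimal sketch of the extrapolation in Python. The segment shares and the overall-giving figure below are illustrative placeholders rather than Hope Consulting’s actual numbers; only the arithmetic (each segment’s share of surveyed giving multiplied by the size of the overall giving market) reflects how we produced the estimates above.

    # Rough sketch of the extrapolation described above.
    # The shares and the total-giving figure are illustrative placeholders,
    # not Hope Consulting's survey results; only the method matches the post.

    TOTAL_GIVING_MARKET = 220e9  # assumed size of the overall giving market, in dollars

    # Hypothetical shares of surveyed giving in each successively narrower segment,
    # weighted by dollars given rather than by number of donors.
    segment_shares = {
        "researched to compare multiple organizations": 0.019,
        "... and ranked impact-relevant factors above irrelevant ones": 0.017,
        "... and named 'amount of good accomplished' the top information sought": 0.0025,
    }

    for label, share in segment_shares.items():
        estimate = share * TOTAL_GIVING_MARKET
        print(f"{label}: ${estimate / 1e9:.1f} billion")

Swapping in the actual survey shares and market total would reproduce the dollar figures listed above.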

We still don’t have a great sense of the potential market size for GiveWell-style research, but it certainly hasn’t been established that the market is small.

Ultimately, we think it’s important to take the study’s conclusions with a grain of salt. If you polled all TV watchers on what they want, you’d conclude that only a very small percentage want something like The Wire, yet that show wasn’t exactly a failure. In fact, for most successful businesses I can think of, most people still aren’t customers.

Our goal isn’t to create a product that the majority of people like; it’s to create a product that some minority market loves. From what we’re seeing now, it’s still possible that the minority of donors interested in impact-focused research is quite large.

Slow spending

The Chronicle of Philanthropy and NPR note that charities don’t seem to have spent large percentages of the funds raised for Haiti to date. Here we (a) lay out the numbers, using the Chronicle of Philanthropy’s helpful public survey data, and (b) discuss what it means for donors that most of the money raised seems to be reserved for long-term, rather than immediate, relief.

The numbers

The Chronicle of Philanthropy’s survey data gives a total of over $1.6 billion raised, and seems to include nearly all of the “big name” charities working in disaster relief. We have collected the data, for all charities that provided comparable “raised” and “spent” figures (i.e., either both worldwide or both non-worldwide), into this Excel file.

The chart below summarizes this data by sorting the charities in order of how much they’ve raised. Each bar represents a listed charity; the total length of the bar corresponds to funds raised, while the blue part corresponds to funds spent.

Notes:

  • 38 of the 48 charities have spent under 75% of the money they’ve raised; 29 of the 48 have spent less than half the money they’ve raised; and 22 of the 48 have spent less than a third.
  • The Mennonite Central Committee reports spending far more than it has raised, but its numbers are confusing (it is the only listed charity whose “worldwide” money raised figure is lower than the other figure it provides) and there is no summary of how it’s spent the funds.
  • The Entertainment Industry Foundation reports raising and spending exactly the same amount ($66 million). There is no summary of how it’s spent the funds.
  • Only two other charities, Population Services International and Fonkoze, report spending over 90% of what they’ve raised. Both cases involve relatively small amounts (around $2 million for Fonkoze and $211,000 for Population Services International). Update: this Fonkoze figure is for Fonkoze USA, not for Fonkoze as a whole.

Overall, about 38% of the ~$1.6 billion raised has been spent. In fact, the amount spent – around $627 million – is not much greater than the amount ($560 million) that was raised in the first 9 days after the earthquake hit.
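
For readers who want to reproduce these tallies from the Excel file, the sketch below shows the arithmetic in Python. The charity names and dollar figures are placeholders rather than the Chronicle’s actual survey responses; the thresholds and the overall spent-to-raised ratio are computed the same way as the figures above.

    # Sketch of the tallies above, using placeholder figures rather than the actual
    # Chronicle of Philanthropy survey data (which is in the linked Excel file).
    # Each entry is (charity, dollars raised for Haiti, dollars spent so far).
    charities = [
        ("Charity A", 486_000_000, 148_000_000),  # illustrative numbers only
        ("Charity B", 66_000_000, 66_000_000),
        ("Charity C", 2_000_000, 1_900_000),
    ]

    total_raised = sum(raised for _, raised, _ in charities)
    total_spent = sum(spent for _, _, spent in charities)

    under_75_pct = sum(1 for _, raised, spent in charities if spent < 0.75 * raised)
    under_half = sum(1 for _, raised, spent in charities if spent < 0.50 * raised)
    under_third = sum(1 for _, raised, spent in charities if spent < raised / 3)

    print(f"{under_75_pct} of {len(charities)} charities have spent under 75% of what they raised")
    print(f"{under_half} have spent less than half; {under_third} less than a third")
    print(f"Overall, {total_spent / total_raised:.0%} of ${total_raised / 1e9:.2f} billion raised has been spent")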

Why this matters: “speedy relief” vs. longer-term relief and recovery

We don’t believe that spending money slowly indicates irresponsibility. Shortly after the earthquake hit, we expressed doubts about whether there was “room for more funding”, and the Chronicle’s coverage implies that at this point there largely isn’t.

However, we do feel it’s important for donors to note how much of their donations is likely paying for longer-term, as opposed to immediate, relief, because this has implications for what one should look for in a disaster relief charity in the future.

Immediately after the earthquake hit, many (including us) were stressing the importance of a charity’s existing capacity on the ground and its ability to respond quickly and efficiently. When we think about longer-term relief, though, we wish to focus less on capacity/speed and more on the things we usually focus on:

  • Is the organization clear about where the money is going?
  • Does the organization formally assess whether and to what extent its work is succeeding? (For disaster relief, in particular, we’d hope to see evidence that the organization is actively getting and acting on feedback from beneficiaries.)
  • Does the organization focus on activities that help as many people as possible, as much as possible, for as little money as possible?

In assessing disaster relief organizations, we plan on focusing on what they’ve accomplished over the longer term, because that’s what donations in these situations are most likely to be paying for.

Alliance for Effective Social Investing survey

We are members of the Alliance for Effective Social Investing, and the Alliance is currently determining whether/how to accept new members. We are passing along the following message on the Alliance’s behalf. If you’re potentially interested in membership, please read the message below.

The Alliance for Effective Social Investing is collecting feedback from current members and other stakeholders in effective social investing to better define the Alliance’s strategic priorities and membership policy. Please take a few minutes to share your thoughts and help shape the way ahead for the Alliance.

All you need to do is follow this link (www.zoomerang.com/Survey/WEB22AWS3WLCWH), which will take you right to the survey. Please take 10-15 minutes to fill it in by Friday, July 23. The findings will be discussed at the upcoming Alliance meeting at the end of July and acted on by current members.

Thank you for your feedback! We appreciate your interest and support. Feel free to invite others to participate in the survey as well and contact us should you have any questions or suggestions.

Best regards,

The Alliance for Effective Social Investing

Unitus and room for more funding

It seems like no one is sure why Unitus is closing its doors. That said – what can we learn from this situation if, as stated, Unitus is closing down because it has accomplished its mission and no longer needs to exist?

“We have always thought of Unitus as a project, and that when we completed the project, we would have the integrity to say we were done,” says Joseph Grenny, one of the seven founders, and the chair of Unitus’s board. (From the Chronicle of Philanthropy’s report on Unitus)

From a donor perspective, what this quote is describing is the issue of room for more funding, which we’ve discussed at length (see our page and blog post series on this issue). No matter how successful a program is, there are limits to how much it can be productively expanded.

In December, we argued that room for more funding is a key question few others are asking charities. The Unitus case – if they did in fact shut down because they “completed the project” – lends support to our argument.

We can’t find evidence that Unitus itself gave any indication that it had limited use for additional funds. According to the Puget Sound Business Journal, Unitus was “interviewing potential candidates for its vacant top fundraising post as recently as a month ago.”

Donors shouldn’t rely on a charity to tell them when it’s running out of room for more funding. We believe that the issue of “room for more funding” is one of the most under-recognized issues in the field of charity evaluation, especially for individual donors.

Against Promise Neighborhoods

We are in favor of scaling up proven programs, but against the Promise Neighborhoods initiative.

As far as we know, the only evidence that the Harlem Children’s Zone (or any similar approach) has been effective is the relatively recent study showing impressive effects on test scores at its charter schools. We discussed this study in a four-part series and concluded that:

  • The effect demonstrated was extremely impressive and unusual.
  • There are serious questions about how “real” the effect is (to what extent did it come from narrow “teaching to the test”?), how likely it is to be sustained as opposed to temporary, and how significant it is in terms of likely effects on actual life outcomes.
  • These questions aside, there are also major questions about just what aspect(s) of Harlem Children’s Zone are crucial and whether they can be replicated at all, let alone at a reasonable cost.

Given this situation, we don’t feel it’s time to attempt a replication in 20 communities at once, at a cost that seems likely to stretch into the billions if and when these replications are fully carried out.

We’re not just concerned about misspending money. We’re concerned about overreacting to evidence, overpromising results, and thus damaging the credibility of future proposals along these lines. We’re concerned that the funds will be allocated, the Promise Neighborhoods will be rolled out, and 10 years from now we’ll check back and see no narrowing of the achievement gap.

We hope that someday, there will be a truly replicable program with an extremely strong case that it can put a significant dent in the achievement gap. If and when that day comes, a failed Promise Neighborhoods scale-up – and any other oversold programs – will come back to haunt us.

We feel it is appropriate to pursue some replication of, and experimentation with, the Harlem Children’s Zone model. We feel a rollout of this magnitude would be a mistake.