The GiveWell Blog

December 2014 update on GiveWell’s funding needs

This post provides an update on GiveWell’s operating budget and funding needs. It is aimed at close followers of GiveWell, particularly those who have a high degree of trust in and alignment with us and are primarily seeking to make the highest-impact gift according to our (admittedly biased) opinion.

Our opinion is that for such people (as opposed to the bulk of our donors, who we feel place more emphasis on neutral recommendations, evidence bases, etc.), direct, predictable support of GiveWell represents the highest-impact giving opportunity.

Below, we provide more details on our current funding situation. For more background on our philosophy on fundraising, see our October 2013 post.

What are our current projected revenues and expenses?

We currently project 2015 revenues of $2.16 million and expenses of $2.33 million. We held $1.73 million in reserves at the end of October (the last month for which we closed our books before updating our budget forecast). Because we receive a large portion of our annual funding in December (approximately 40%), this tends to be the time of year when our reserves are lowest.

Our budget file (.xlsx) provides additional detail.

Our projected revenues include donations we expect to recur in the future (because donors have either explicitly told us that they would give again or because they have given consistently enough in the past that we expect their donations to recur) as well as some expectation that (a) some portion of lower probability donations recur and (b) organic growth in unrestricted revenues continues.

Our projected expenses include our best guesses about the number of staff we plan to add (erring on the conservative side — we’d prefer to project an additional hire we don’t make than lack the funds to hire someone outstanding) and the salaries we anticipate paying them.

How has our fundraising and budget situation changed over the past year?

Our staff has grown substantially since the end of 2013. We made 7 additional hires in 2014, bringing our total staff size to 18. We also work with 7 conversation notes writers who produce high-quality summaries of conversations we have with experts.

We anticipate that our staff will continue to grow in 2015. We made several offers to summer research analysts, two of whom (so far) have accepted, and we project some additional hiring. In 2014, we also increased salaries for all staff who had been with us at least a year, commensurate with their additional experience and contributions and GiveWell’s increased influence.

Some of these new staff have supported the Open Philanthropy Project, but most have worked primarily on our traditional work focused on top charities. Thanks to the staff we have brought on over the past 18 months, we were able to produce in-depth reviews of significantly more charities than we had in the recent past, even as much of our senior staff’s time went to continuing our progress on the Open Philanthropy Project.

Currently, we estimate that approximately 50% of our unrestricted funding supports the Open Philanthropy Project and 50% supports our traditional, top charities work. In addition, Good Ventures made a $675,000 grant in November 2013 to support research-related expenses for the Open Philanthropy Project.

At what point would we consider our funding gap closed?

Once we hit the threshold set by our excess assets policy, we would regrant any further funds given to GiveWell to our recommended charities.

Using a conservative revenue projection (which we believe is appropriate when considering granting out funds), we project that our 12-month-forward expenses as of November 2015 (i.e., expenses we would incur from November 2015 to October 2016) will exceed the reserves we project holding by $1.76 million. Therefore, we would require $1.76 million in additional funding before we would begin to grant out funds.

Under a different, less conservative revenue projection, we project 12-month-forward expenses as of November 2015 that are approximately $1.16 million higher than the reserves we project holding. Were we to receive $1.16 million more than we currently project, we would likely no longer encourage additional donors to give to us as strongly as we do today (e.g., via blog posts like this).

What will we do if we raise more or less funding than we anticipate?

If we raise more funding than we anticipate, we would likely reduce the staff time we put into fundraising. This is currently quite low but accounts for, on average, approximately 5 hours per month each from Elie, Holden, and Natalie, time that would otherwise be devoted to research. We currently plan to maintain this level of time commitment to fundraising and are optimistic that posts like this enable us to raise the funding we need without devoting more time to fundraising.

If we raise less funding than we anticipate, Elie and Holden would spend more time on fundraising. If this step didn’t succeed in raising the funding we need, we would consider the following options (likely in this order): (a) slowing or halting planned staff expansion, (b) requesting additional funding from Good Ventures, and (c) laying off staff. Note that we believe these scenarios are extremely unlikely given our current situation, but we require continued, growing support to ensure that we avoid them.

Are GiveWell’s projected operating expenses reasonable or excessive in light of its impact?

We anticipate 2014 money moved to top charities of approximately $25 million and project expenses in 2015 of $2.27 million (roughly 9% of money moved). We previously wrote that we believe expenses that are 15% of money moved are well within the range of normal.

Good Ventures also directed an additional $8.4 million to funding opportunities identified by the Open Philanthropy Project. In 2014, we project spending approximately $300,000 of the Good Ventures research grant mentioned above.

What is our recommendation?

For donors who have a high degree of trust in and alignment with GiveWell, we recommend unrestricted gifts to GiveWell.

For donors who want to support our work because they value it but are otherwise primarily interested in supporting charities based on neutral recommendations, strong evidence, etc., we recommend giving 10% of their donation to GiveWell.

You can do this by sending us a check and filling out our check donation form (details on our donate page) with the allocation for your donation. If you’d like to give 90% of your gift to GiveWell for regranting to top charities and 10% to GiveWell unrestricted, you can do so via this page: select the “grants to recommended charities (90%) and unrestricted (10%)” option.

Request for input

We’re planning to redesign our website in early 2015. We last worked on our website in 2009, and it’s time to refresh it.

Please let us know if you have any suggestions you’d like us to consider. In particular:

  • Is there functionality you wish our website had?
  • Is there information you wish were easier to find?
  • Is there a message you think we should be more actively communicating?

If you have thoughts, please share them either via blog comment or by emailing us at info@givewell.org.

Update on the Ebola outbreak

Over the past couple of months, we’ve put some time into trying to understand the Ebola containment effort and whether it represents a strong giving opportunity. Our process has included conversations and correspondence with about 15 people including major private donors to the containment effort and representatives of the UN, CDC, WHO, and Doctors without Borders. We have also kept informed on the outbreak by reading updates from the UN, CDC, WHO, and University of Pittsburgh Medical Center. (More on our process)

Yesterday, we published a writeup summarizing the status of the outbreak and control effort, the picture of funding provided and needed, and our view on the cost-effectiveness of donations.

The key points, in our view, are:

  • It has been very difficult – we think unnecessarily so – to get a picture of the funding needs and the likely impact of additional donations. It’s inevitable that a situation like this one will be difficult to understand and follow as it’s unfolding, but we’ve struggled to find even fairly basic information. We haven’t been able to find any consolidated estimate of how much total funding is needed for core activities and how much of that funding has come in so far. The closest we’ve found has been an appeal specific to UN partner agencies and major NGOs; this appeal doesn’t appear to account for the (significant) activities undertaken and funds provided directly by donor governments. (Note that Center for Global Development scholars have expressed similar sentiments about the lack of reliable information on donor contributions, in a post titled How Much Is Actually Being Spent on Ebola?) Furthermore, the description of planned activities for the UN partner agencies and major NGOs is very broad and high-level. We’ve found very little information on how additional funds would be spent. More
  • Substantial funding has come forward, particularly from governments. As of November 28, over $2 billion in funding for immediate relief efforts had been tracked, and more recently the US Congress approved over $5 billion in funding “to contain and end the Ebola outbreak at its source in Africa, enhance domestic preparedness, speed the procurement and testing of vaccines and therapeutics, and accelerate global capability to prevent the spread of future infectious diseases.” (It appears that about half of this will be spent domestically.) We believe that this outbreak may have been less compelling for individual donors than disasters such as the 2010 Haiti earthquake and 2011 Japan earthquake/tsunami; on the other hand, it has attracted significant funding from governments, perhaps in part because of fears that the outbreak might spread beyond Africa. More
  • There has been significant progress toward stopping the outbreak, though there are still areas with intense transmission. It’s hard to have high confidence in data about the containment effort, but broadly speaking, it looks as though the effort has ramped up significantly; that it has largely (though not completely) met its goals regarding safe burials and case isolation; and that the overall number of cases in Liberia (where the outbreak has affected the most people) has declined substantially and come in very far below projections made at the beginning of the outbreak. The situation appears to be worst in Sierra Leone, where there are still major Ebola “hotspots,” and it remains possible that the course of the outbreak could change rapidly. More
  • On the margin, we don’t expect additional donations from our audience to be critical to stopping the outbreak. We see the current funding gap as hindering how far into the future the Ebola response team can plan (e.g. contracts for some employees lasting 2 months instead of 6) or leading to the use of lower-quality equipment (e.g. ambulances with a dial-up modem instead of a VSat) rather than as, for example, preventing work in some areas altogether. More funding may be needed in the future, but we are optimistic that government donors will come forward with the bulk of such funding. More
  • We aren’t recommending additional donations to the containment effort in place of donations to our top charities, but we do feel the containment effort has been an outstanding use of funding. We estimate that the several billion dollar effort, taken as a whole, could easily have saved lives as cost-effectively as our top charities – something that we don’t believe is usually the case in a disaster. We doubt that additional donations from our audience – beyond what has already been committed and will likely be committed in the future from others – would have comparable cost-effectiveness. We see the commitments governments have provided as outstanding uses of funds and hope they deliver on their commitments and close any remaining funding gap. More
  • We remain interested in opportunities to strengthen disease surveillance over a longer time frame, to prevent the spread of this outbreak and help contain future outbreaks at earlier stages. We previously argued: “The best opportunities to prevent or contain the epidemic were probably before it was widely recognized as a crisis (and perhaps before Ebola had broken out at all – more funding for preventive surveillance could have made a big difference).” In our conversations around this outbreak, we have heard that there may be opportunities to rebuild health systems with stronger general surveillance capacity, and we are interested in this as part of our work on biosecurity (a likely priority of our Open Philanthropy Project work on global catastrophic risks).

As is usually the case in a vivid and widely publicized disaster, a large amount of funding has come forward, and it’s been hard to understand the developing situation and the role of additional donations – two factors that generally make us unlikely to recommend giving. In this case, the size of the threat and the potential difference to be made by a containment effort were unusually large, and we believe that the funding that went to this effort has generally been money unusually well spent.

Should we expect an ongoing study to meet its “goal”?

One of our new “standout” charities, Development Media International (DMI), is in the midst of a randomized controlled trial. So far, all we have from the trial is information about self-reported behavior change, and we’ve tried to use that information to estimate how many lives the program will likely save (for purposes of our cost-effectiveness analysis). We estimate that the measured behavior changes should equate to about a 3.5% reduction in child mortality. However, DMI is hoping for a 19% reduction, and by our estimate, if the reduction falls short of 10-14%, the study will likely fail to find a statistically significant impact. What should we put more credence in: GiveWell’s projection based on available data about behavior change, or DMI’s projection?

Ordinarily, I’d be happy to consider the GiveWell estimate a best guess. I’m used to charities’ estimates turning out to be optimistic, and DMI’s estimate is based on a general model rather than on the actual data we have about its impact on behavior.

However, I find myself very uncomfortable predicting a figure of 3.5% when the people carrying out a study – and paying the considerable expenses associated with it – are expecting 10-20%. I’m uncomfortable with this discrepancy for two reasons:

  • It’s a little hard to imagine that an organization would go to this level of expense – and reputational risk – if it weren’t fairly confident of achieving strong results. Most predictions and projections charities put out are, in a sense, “cheap talk,” by which I mean it costs a charity little to make strong claims. However, in this case DMI is conducting a study costing millions of dollars*, and by being public about the study, it faces a significant public relations risk if the results are disappointing (as our projection implies they will be).
  • I also struggle to think of examples of studies like this one – large, expensive, publicized studies focused on developing-world health or economic empowerment – that have turned out to be “disappointing” from the perspective of people carrying out (and/or paying for) the study. Though I do know of a fair number of studies showing “no impact” for an intervention, I believe they’ve generally been academic studies looking at very common/popular interventions (e.g. improved cookstoves, microlending). These “no impact” results were noteworthy in themselves, and didn’t necessarily reflect poorly on the people conducting or paying for the studies. I have a much harder time thinking of cases in which a major developing-world study found results that I’d consider disappointing or embarrassing for those carrying out or funding the study. The only one that comes to mind is the DEVTA trial on vitamin A and deworming.

I haven’t taken the time to systematically examine the intuition that “developing-world studies rarely find results that are disappointing/embarrassing for those carrying out the study.” It’s possible that the intuition is false; it’s also possible that it’s an artifact of the sort of publication bias that won’t affect DMI’s study, since the DMI study’s existence and hypothesis are already public. Finally, it seems worth noting that I don’t have the same intuition about clinical trials: indeed, failed clinical trials are frequent (especially in the relatively expensive Phase II).

With that said, if my intuition is correct, there are a couple of distinct possible explanations:

  1. Perhaps, in developing-world settings, it is often possible to have a good sense for whether an intervention will work before deciding to run a formal study on it. Accordingly, perhaps expensive studies rarely occur unless people have a fairly good sense for what they’re going to find.
  2. Perhaps publication-bias-type issues remain important in developing-world randomized studies. In other fields, I’ve seen worrying suggestive evidence that researchers “find what they want to find” even in the presence of seemingly strong safeguards against publication bias. (Example.) Even with a study’s hypothesis publicly declared, we believe there will still be some flexibility in terms of precisely how the researchers define outcomes and conduct their analysis. This idea is something that continues to worry me when it comes to relying too heavily on randomized studies; I am not convinced that the ecosystem and anti-publication-bias measures around these studies are enough to make them truly reliable indicators of a program’s impact.

Even with #2 noted as a concern, the bottom line is that I see a strong probability that DMI’s results will be closer to what it is projecting than to what we are projecting, and conditional on this, I see a relatively strong probability that this result will reflect legitimate impact as opposed to publication bias. Overall, I’d estimate a 50% chance that DMI’s measured impact on mortality falls in the range of 10-20%; if I imagine a 50% chance of a 15% measured impact and a 50% chance of a 3.5% measured impact (the latter is what we are currently projecting), that comes out to about a 9% expected measured impact, or ~2.5x what we’re currently projecting.
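The expected-value arithmetic above can be reproduced in a short sketch. The only input I introduce is taking 15% as the rough midpoint of the 10-20% range, as the paragraph itself does:

```python
# Expected measured impact on child mortality, combining the two scenarios
# described above: a 50% chance DMI's result lands near its own projection
# (taken as 15%, the rough midpoint of 10-20%) and a 50% chance it matches
# GiveWell's 3.5% projection from the behavior-change data.
p_high = 0.5
impact_high = 15.0   # percent reduction in child mortality
impact_low = 3.5     # percent reduction in child mortality

expected_impact = p_high * impact_high + (1 - p_high) * impact_low
print(round(expected_impact, 2))               # 9.25 ("about a 9% expected measured impact")
print(round(expected_impact / impact_low, 1))  # 2.6 ("~2.5x what we're currently projecting")
```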

In either case, I’ll want our cost-effectiveness estimate to include a “replicability adjustment” assigning only a 30-50% probability that the result would hold up upon further scrutiny and replication (this adjustment would account for my reservations about randomized studies in general, noted under #2 above). Our current cost-effectiveness estimate assigns a 50% probability. Overall, then, it could be argued that DMI’s estimated cost-effectiveness with the information we have today should – based on my expectations – be 1.5-2.5x what our review projects. That implies a “cost per life saved” of ~$2000-$3300, or about 1-1.7x as strong as what we estimate for AMF. It is important to note that this estimate would be introducing parameters with a particular sort of speculativeness and uncertainty, relative to most of the parameters in our cost-effectiveness calculations, so it’s highly debatable how this “cost per life saved” figure should be interpreted alongside our other published estimates.
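As a rough sanity check on the figures in that paragraph, here is a hedged sketch of the arithmetic. The baseline cost-per-life figures (~$5,000 for DMI's current estimate and ~$3,400 for AMF) are assumptions back-solved from the ranges in the text, not numbers the post states directly:

```python
# Hedged sketch of the cost-effectiveness adjustment described above.
# ASSUMPTIONS (inferred from the post's ranges, not stated directly):
# a current DMI estimate of ~$5,000 per life saved and an AMF estimate
# of ~$3,400 per life saved.
multiplier_low, multiplier_high = 1.5, 2.5  # the post's overall adjustment range
dmi_baseline = 5000.0  # assumed current DMI cost per life saved ($)
amf_cost = 3400.0      # assumed AMF cost per life saved ($)

# A k-fold improvement in cost-effectiveness divides cost per life by k.
cost_best = dmi_baseline / multiplier_high   # 2000.0  -> "~$2000"
cost_worst = dmi_baseline / multiplier_low   # ~3333.3 -> "~$3300"

# Relative strength vs. AMF: AMF's cost per life divided by DMI's adjusted cost.
print(amf_cost / cost_worst)  # ~1.02 -> "about 1x as strong"
print(amf_cost / cost_best)   # 1.7   -> "1.7x as strong"
```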

DMI has far less of a track record than our top charities this year. In my view, slightly better estimated cost-effectiveness – using extremely speculative reasoning (so much so that we decided not to include it in our official cost-effectiveness estimate for DMI) – is not enough to make up for that. Furthermore, we should know fairly soon (hopefully by late 2015) what the study’s actual results are; given that situation, I think it makes sense to wait rather than give now based on speculation about what the study will find. But I do have mixed feelings on the matter. People who are particularly intent on cost-effectiveness estimates, and agree with my basic reasoning about what we should expect from prominent randomized studies, should consider supporting DMI this year.

*The link provided discusses DMI’s overall expenses. Its main activity over the time period discussed at the link has been carrying out this study.

We’re happy to talk to you!

If you’re currently trying to figure out where you’ll give this year, and you think it might be helpful to talk to us about your decision, feel free to contact us at info@givewell.org.

We work hard to put all the relevant information about our recommendations on our website, but we know it can sometimes be hard to fully digest, so don’t hesitate to reach out to us and talk. Talking to donors about their plans for allocating their giving is something we do regularly and we’d be happy to do more of it.

We’re not sure how big a response we’ll have to this and we have limited staff (especially over the next couple of weeks), so if you email us please let us know how much you’re thinking about donating. If necessary, we’ll prioritize larger givers, though we hope to be able to speak with everyone who gets in touch.

Donor coordination and the “giver’s dilemma” – part II

We recently wrote about three questions we faced that relate to donor coordination. This post is a continuation of that topic and may only appeal to donors who are particularly interested in this issue.

Over the past two weeks, we’ve been discussing the question: how should we allocate funds that donors give us for regranting to our top charities?

We plan to allocate these funds according to the recommended allocation explained in our previous post up to the point where one of our top charities reaches the maximum target from individuals we have set (explained here).

The question we face is: what should we do if we move more money than we expect, and one or more of our recommended charities reaches our maximum target from individuals before the end of giving season? In that case, how should we allocate funds given to us for regranting?

We see three options:

  1. Give according to our recommended allocation even after charities hit their max targets. This option maximizes some donors’ agency. For example, if Alice (hypothetical donor) gave $10,000 to the Against Malaria Foundation (AMF) in early December, we would not take Alice’s gift into account when deciding how to allocate funds given to us for regranting. Were AMF to meet its maximum target from individuals, we would still allocate funds there, and Alice’s gift would cause AMF to receive $10,000 more than it would have had she not donated. However, were we to keep giving to charities beyond the maximum we believe they can effectively use, we would be allocating funds suboptimally. Many of the donors who give to us for regranting do so because they want us to use our judgment about where additional funds will do the most good. If we mechanically followed our predetermined allocation, we would not be following these donors’ wishes in allocating their funds. We also believe it is likely that many donors who give to a specific charity would not want us to allocate funds beyond the maximum targets we have set for organizations. To continue the example, our guess is that Alice would often say, “I would rather donate to one of GiveWell’s other top charities once AMF has closed its funding gap.”
  2. Give where we think it’s needed most. In the event that one or more of our top charities passes its maximum target, we would reallocate funds to the charities with remaining funding gaps. For example, if AMF were to receive $5 million before the end of giving season, we might choose to reallocate the funds given to us for regranting to the Schistosomiasis Control Initiative (SCI), the Deworm the World Initiative (DtWI) and GiveDirectly up to their maximum targets from individuals. The benefits here are clear: we would direct funds to charities that we believed to have more pressing gaps. The potential problem is that our doing so would arguably take agency away from other donors. For example, if Barbara gave $10,000 to AMF in early December, and AMF reaches its $5 million maximum target from individuals in late December, we might choose to allocate funds we hold to charities other than AMF. This means that Barbara’s gift did not effectively cause AMF to have $10,000 more; instead, it effectively caused the other charities to which we allocated funds to have $10,000 more. If Barbara wanted to support AMF and only AMF, our decision to reallocate the pool of funds over which we have discretion removed her ability to cause the charity of her choice to receive additional funding.
  3. Compromise with donors who request it. In our previous post on this topic, we wrote about the compromise we reached with one donor who was planning to give to SCI. He believed SCI’s funding gap was larger than we did, and planned to give $1 million to SCI, so we compromised by meeting in the middle: we increased our target by half the size of his donation ($500,000). In this option, we would reach a similar arrangement with donors who have given or plan to give to our top charities, disagree with us about the charities’ maximum targets from individuals, and therefore want the charities to receive more funding even if it would cause them to go past the maximum targets we have set. We would do the same thing with these donors as we did with the SCI donor by adjusting our maximum targets by half the size of their donations.

Our tentative plan is to take option #3. Specifically:

  • As a general rule, we are not planning to allocate funds to charities such that they would receive more than our maximum targets.
  • Any donors who disagree with us about our targets should email us at info@givewell.org to let us know. If they tell us the size of their donation (and we can verify it), we will increase our targets by 50% of the size of their donation for purposes of allocating funds earmarked for regranting. This post is our announcement of this fact. We aren’t planning to announce this elsewhere.

We believe that this plan will result in relatively few donors emailing us, and that GiveWell will largely reallocate funds given to us for regranting according to our best judgment if/when one or more of our top charities hits its maximum target from individuals. We have a few reasons for believing this is the right course of action:

  • I corresponded with five donors who are long-time supporters of GiveWell top charities, none of whom give to GiveWell for reallocation. They encouraged us to follow our intended path and were not particularly concerned about the problem of removing donor agency (generally speaking; there was some disagreement). In particular, one noted that, over time, all donations are eventually fungible: if donors give more to a charity than it can effectively use in year 1, we are more likely to recommend that it receive less money in year 2. (This is what happened with AMF, which we first recommended in 2011; by 2013, we said it had limited room for more funding.)
  • Our impression from conversations with many people is that very few donors are concerned with issues of donor agency. We think that most people who land on our website and make a donation would rather we use our judgment to allocate funds optimally than allocate funds suboptimally to maximize the agency of their decision to support a specific charity.

Generally, we see a real distinction between (a) setting targets for charities based on incentives, how strong we think the charities are, etc. and (b) setting targets for charities based on how much we think they can productively absorb. When donors give differently from our recommended allocation, by default we interpret this as a disagreement with us on (a) but not necessarily on (b). As such, we are hesitant to “offset” donations on the basis of (a), but we are much less hesitant to do so on the basis of (b). We set “room for more funding” targets taking into account funds available from other sources, and this means that in some long-run sense we’re always creating the possibility of “offsetting” others’ donations.

It’s possible that we’ll change our mind based on reactions to this post.

Additional questions and answers about our plans:

How will we allocate funds in the event that a charity hits its maximum target?

We’re not yet sure. It’s possible that we will allocate surplus funds to GiveDirectly (which has the highest maximum target from individuals and is unlikely to reach it), but it’s also possible that we ultimately decide to allocate at least some funds to AMF (whose maximum target is more of a “soft” and approximate cap) or to standout charities. We have found that we need to go through a period of intense debate and reflection before we decide how to allocate funds in a given situation; that’s not something we’ve done yet for the hypothetical situation outlined here.

When do we plan to allocate funds given to us for regranting?

We are currently focused on funds that come in during giving season. Our accountant will close the books for December in mid-January, and we will grant funds out in mid-February.

How will we allocate funds that have already been donated for reallocation?

Some donors gave for regranting before we clarified the language on our donate page, and may have intended that their gift follow the allocation stated on our donate page. We plan to email these donors to tell them our plans and ask them whether they’d prefer that we allocate their funds (i.e., funds we have already received) according to our target allocation rather than according to the plan laid out in this post.

Are we concerned that this policy will incentivize donors to try to circumvent this plan?

It is possible that this policy will cause some donors to (a) wait to give until after we’ve allocated the bulk of the funds we hold or (b) not report their gifts to us.

We think this will happen to some extent but expect it to be relatively rare. Our targets are based not only on funds given by GiveWell-influenced donors but also on all funding an organization receives, so eventually we will learn about these gifts and incorporate them into our targets.