This is the second post (of six) we’re planning to make focused on our self-evaluation and future plans.
This post reviews and evaluates last year’s progress on our traditional work of finding and recommending evidence-based, thoroughly vetted charities that serve the global poor. It has two parts. First, we look back at the plans we laid out in early 2014 and compare our progress against them, providing details on some of the most significant accomplishments and shortcomings of the year. Then, we reflect on the quality of our traditional work and critically evaluate some of our major strategic decisions. In our next post in this series, we will cover our plans for GiveWell’s traditional work in 2015.
Overall, we feel that 2014 was an excellent year for GiveWell’s traditional work.
At the beginning of 2014, we laid out our most ambitious research goals yet, including publishing updates on all recommended charities, reviewing several new charities, and completing new intervention reports. We expected to be able to complete a higher volume of work than ever before while also reducing senior staff time (note 1) devoted to GiveWell’s traditional work by continuing to hire, train, and develop non-senior staff. We feel that we broadly met those goals while maintaining the overall quality of our research.
The impact of GiveWell’s traditional work continues to steadily increase, as we moved about $28 million to recommended charities in 2014 (more details forthcoming in the final post of this series, which focuses on our metrics).
We believe that our 2014 recommended charities list was high quality. A notable development was that we included four new “standout” organizations on our recommended charities list. We believe that some of these organizations may become top charities in the future.
A few areas in which we fell short in 2014 were:
- We published fewer new intervention reports than planned. Completing intervention reports was more difficult than we had expected at the beginning of 2014; we hope to improve our process for producing these reports in 2015.
- We finalized the details of our top charity recommendations in late November 2014, after we had already made a recommendation to Good Ventures about how to allocate its giving among the recommended charities. If we had completed our analysis earlier, we might have recommended a different allocation to Good Ventures. We see the harm here as minimal, since we ultimately adjusted our public targets to account for grants from Good Ventures; however, in the future we should try to avoid a recurrence of this issue, perhaps by announcing our recommendations to Good Ventures and the public at the same time.
- We did not follow the ideal process for ensuring that our cost-effectiveness analyses were robust, accurate, and easily understandable, which led us to finalize these analyses very late in the year.
At the beginning of 2014, we set the following goals for GiveWell’s traditional work:
- Continue to build capacity for conducting “top charities”-related research work, and reduce senior staff time devoted to this work by training other staff to take over senior staff’s responsibilities
- Publish updates on previously recommended charities
- Conduct reviews for several new potential recommended charities
- Maintain our “open-door policy” for allowing charities to apply for a recommendation
- Publish four intervention reports that were near completion (maternal and neonatal tetanus elimination, salt iodization, Vitamin A supplementation, and polio)
- Publish 5-10 new intervention reports on nutrition programs, behavior change programs, and other programs
- Fund experimental work that may lead to more recommended charities in the future (e.g., providing early funding to promising charities such as New Incentives or funding replications for promising interventions)
- Conduct other miscellaneous research (e.g., produce a cost-effectiveness estimate for Dispensers for Safe Water (DSW), review the midline of Development Media International (DMI)’s randomized controlled trial (RCT), consider evaluating a mega-charity, etc.)
We feel that we broadly achieved these goals in 2014. In summary, we:
- Substantially increased our capacity and completed more research than in previous years (more details below).
- Accomplished our goal of publishing updates on four previously recommended charities (Against Malaria Foundation (AMF), Deworm the World Initiative (DtWI), GiveDirectly, and Schistosomiasis Control Initiative (SCI)).
- Completed four new charity reviews (note 2): DMI, Iodine Global Network (IGN), Global Alliance for Improved Nutrition’s Universal Salt Iodization program (GAIN), and Living Goods.
- Maintained our “open-door policy” for allowing charities to apply for a recommendation. Living Goods reached out to us and took advantage of this policy, and it ultimately became a “standout” organization.
- Published two intervention reports (salt iodization and vitamin A supplementation), refreshed our existing intervention reports (most notably by reviewing a new deworming study), and completed other intervention report-style work, analyzing the Living Goods and DMI randomized controlled trials.
- Took initial steps on some experimental work that may lead to new recommended charities. We 1) recommended a grant to New Incentives, a new conditional cash transfer charity, and 2) provided support to IDinsight and Evidence Action.
- Did some, but not all, of the other miscellaneous research projects that we planned to do at the beginning of 2014. For example, we carefully reviewed DMI’s midline study and made progress on (but did not publish) a cost-effectiveness analysis of DSW, but we did not publish reviews of other behavior change or nutrition organizations because we ultimately prioritized other organizations.
- Conducted some other research that we had not planned to do, such as our investigation into whether funding the Ebola response was a promising giving opportunity.
One area in which we fell short of our expectations was publishing new intervention reports, largely because completing these reports was more difficult than we had anticipated.
More details on some of our major achievements and shortcomings are below.
GiveWell’s traditional work produced more total research ‘output’ in 2014 than in any previous year, while also using less senior staff capacity. A rough measure of total research output is the number of charity updates, charity reviews, intervention reports, and other major research projects completed during the year. In 2014, we completed four charity updates, four new charity reviews, two intervention reports, and some work on “seeding” new top charities. For comparison, in 2013 we completed three charity updates (the AMF and SCI updates were substantial and required a significant amount of senior staff time), one new charity review (Deworm the World Initiative), and one intervention report (water quality). (More details on the work we did in 2013 are in our 2013 self-evaluation.) We consider the increase in research output in 2014 to be a major achievement.
Non-senior staff continued to be trained to take on additional responsibilities, and our staff continued to steadily expand. Examples of greater responsibilities shared by non-senior staff and reductions in senior staff time spent on GiveWell’s traditional work include:
- All four new charity reviews (DMI, IGN, GAIN, and Living Goods) were led by non-senior staff. Our first new charity review led by non-senior staff was the DtWI review in 2013. We would not have had the capacity to do four new charity reviews in a year if not for our expanded non-senior staff capacity.
- Holden and Alexander spent very little time on traditional work. In particular, Holden substantially reduced the amount of time that he spent writing blog posts for GiveWell’s traditional work. Elie continued to spend most of his time on traditional work but also passed off some of his responsibilities to other staff.
- Non-senior staff took on increased management responsibility. For example, Natalie Crispin managed other staff on our updated review of SCI, our new review of Living Goods, and other work. Other staff helped manage new Research Analysts, Summer Research Analysts, and Conversation Notes Writers.
- All three site visits to recommended charities were conducted without senior staff.
- All intervention report work was primarily conducted by Jake Marcus.
We see reducing senior staff time spent on GiveWell’s traditional work as a major success because a) making the organization less dependent on a few individuals improves the sustainability of the organization and b) we have historically primarily been constrained by senior staff capacity, so freeing up senior staff capacity should enable us to make progress on goals such as the Open Philanthropy Project.
We have also substantially improved our capacity by hiring and training Conversation Notes Writers. GiveWell has published about 150 conversation notes per year for the last two years (see them on our conversations page). In 2013 and early 2014, Research Analysts spent a substantial amount of their time writing conversation notes. In 2014, we hired Conversation Notes Writers to handle this responsibility. We now have eight Conversation Notes Writers, and Research Analysts generally spend very little of their time on conversation notes.
Finally, although increased capacity has already allowed us to accomplish more than we had previously, we believe that many of the largest benefits will come in the future. Staff members have consistently contributed significantly more as their tenure at GiveWell has grown. We currently have only five staff members who have been at GiveWell for more than two years.
New charity reviews
We feel that our new charity reviews were a major success of our research work for the year. We believe that adding the four “standout” charities to our list of recommended charities was valuable because (roughly in order of importance):
- These organizations seem to be very promising giving opportunities; some of them may become top charities in the future.
- If our money moved continues to grow, it will be important to have as much “room for more money moved” as possible. Even if current standout charities never become as strong (in isolation) as our current top charities, they may become the best options available when room for more funding is taken into account.
- The “standout” charities represent the organizations that we felt, on preliminary review, had the best chance of being significantly stronger giving opportunities than our current top charities. This time around, further review concluded that they were not as strong as our current top charities, but we feel it is important to continue engaging in these sorts of investigations and evaluating the best possible challenges to our current list.
- The “standout” designation and associated changes to our review process improve the incentives for potentially promising charities to apply for a GiveWell recommendation, which makes us more likely to be able to find the best giving opportunities. In particular, in 2014 we provided participation grants to promising charities that allowed us to review them publicly, directed some funding to the “standout” organizations by adding them to our list of recommended charities, and conferred some status on these organizations by giving them a GiveWell recommendation. These factors improve the cost-benefit analysis for a charity considering applying for a GiveWell recommendation, which we hope leads to more promising charities applying over time. Consistent with this, we saw increased interest from charities in engaging in our process in 2014 and expect this to continue as our money moved and influence grow.
- Adding more charities to our recommended list provides donors with more options. If donors have different values from us or different fundamental beliefs about which types of organizations are likely to be most effective, then we could be providing a valuable service by doing research on a wider set of donation options.
Intervention reports
We published fewer intervention reports than we had hoped to at the beginning of 2014. We completed intervention reports for salt iodization and vitamin A supplementation, but we have not yet published the other two reports that we had said were near completion (polio and maternal and neonatal tetanus elimination), and we did not publish any of the 5-10 new reports we had hoped to. That said, our goal of publishing 5-10 new intervention reports was arbitrary and, upon further reflection, unrealistic given the amount of time that it has typically taken us to complete intervention reports in the past.
We did not accomplish as much as we expected on this front primarily because completing these reports was much more difficult and time-consuming than we had anticipated. As of the beginning of the year, we had completed only three intervention reports that match our current standards of thoroughness, and senior staff had led the completion of each. This year, we tried to complete intervention reports with far less involvement from senior staff, and this proved challenging. The number of questions we could ask about a given intervention is essentially unlimited, and making the right decisions about which to focus on (and at what level of thoroughness) is key; with less involvement from senior staff, it was more difficult to ensure that time spent investigating and writing up questions was allocated to the right questions at the right level of detail. In particular, we had cases in which an intervention report appeared close to completion, but late-stage reviews and peer feedback raised many more questions.
Improving our process for doing intervention reports is one of our primary goals for 2015 (more on our goals in a forthcoming blog post). Additionally, the main staff member who worked on intervention reports (Jake Marcus) also worked on other evidence reviews, such as reviewing a new, promising study on deworming, an early, unpublished draft of the Living Goods study, and the midline of the DMI study. He also spent some of his time investigating donating to the Ebola response as a giving opportunity.
Late completion of top charity recommendations
We finalized the details of our top charity recommendations later in the year than would have been ideal. In late November, we were still clarifying facts and debating some key issues related to our recommendations, such as SCI’s room for more funding and estimated cost-effectiveness and AMF’s room for more funding.
This is problematic because we made our recommendation to Good Ventures about how to allocate its giving in mid-November. We had agreed with Good Ventures that it should aim to announce its giving plans at the same time that we released our recommendations to the public in order to avoid potential fungibility concerns. To meet this deadline, we sought to finalize our recommendation to Good Ventures a couple of weeks before our public recommendations were released.
If we had fully completed our analysis before making a recommendation to Good Ventures, we likely would have recommended relatively more to AMF and relatively less to GiveDirectly. (For more details on how Good Ventures allocated its giving and our recommended allocation to donors, see our 2014 recommendations announcement post.)
In the end, we adjusted the public targets we announced based on the grants Good Ventures had committed to, so we don’t see a major issue here. However, in the future we should try to avoid a recurrence of this issue.
In the past, we have tried more than once to finalize our recommendations well in advance of giving season. At this point, we’re not sure that goal is realistic: we want our giving-season recommendations to take advantage of the most recent possible information and ideas, and it’s unlikely that we’d be comfortable with finalizing our recommendations before the date that we have to do so. An alternative way to avoid the issue described above might be to announce our recommendations to Good Ventures and the public at the same time.
Issues with cost-effectiveness analysis
We did not follow the ideal process for reviewing and internally critiquing our cost-effectiveness analyses, which led us to finalize them later in the year than would have been ideal. In particular:
- There was little senior-level review of the details of some of our key cost-effectiveness analyses (e.g., the cost-effectiveness analyses for SCI and DMI) until late in our research process.
- We did not ensure that multiple staff members understood the most important parameters and assumptions in all cost-effectiveness analyses until late in the research process. For example, the proportion of deworming pills given to children as part of SCI’s campaigns was a relatively important parameter in our cost-effectiveness analysis for SCI, but at the end of the year we did not have as much confidence in our understanding of this parameter as we could have.
- The cost-effectiveness analyses were often complicated and somewhat opaque, which made it difficult for staff members to use the analyses as an input to their thinking about what GiveWell’s recommendation should be.
After putting in additional work on the cost-effectiveness analyses late in the research process, we ultimately felt that they were acceptable, but we plan to improve these analyses in the future (more details in the next post in this series).
Quality of recommended charities list
The quality of our top charities list (measured roughly in terms of expected impact) improved in 2014 relative to 2013 because AMF had room for more funding, a new study increased our estimate of the impact of deworming programs, and GiveDirectly had a stronger track record after another year of successfully distributing unconditional cash transfers at scale.
Additionally, we added four “standout” organizations to our recommended charities list, which we felt improved the quality of our recommendations for the reasons mentioned above.
We feel that we maintained the high quality of our research in 2014. Though evaluating the quality of our research is difficult and involves many subjective judgments, we feel we have maintained our research quality because:
- Our major research reports (charity reviews, intervention reports, etc.) lay out our reasoning explicitly and back up factual claims with footnotes showing the supporting evidence. These standards force all researchers to produce reports that can be easily vetted by other staff and the public. All reports receive several levels of critical review before they are published. For example, each charity review and intervention report is reviewed by at least one staff member who did not write the report and by that staff member’s manager. For intervention reports, we generally solicit feedback on the quality of the reports from experts in the relevant fields (see, e.g., our water quality report).
- We feel that we have a very strong understanding of our recommended charities’ activities. In general, we feel that the quality of the “What does [the charity] do?” and “Does it work?” sections of our charity reviews is as high as or higher than it has ever been. For example, our understanding of the activities of SCI (a top charity) is much stronger now than it was in the past, because we had greater capacity to deepen our investigation.
However, we believe that there is still room to improve the quality of our research. In particular, we think that the “What do you get for your dollar?” (cost-effectiveness) sections of our charity reviews could be substantially improved, as could the “Room for more funds?” sections. More details on this are in the next blog post in this series.
Does our impact justify the size of our staff?
In 2014, we moved about $28 million to our recommended charities. Excluding Good Ventures’ giving, we moved approximately $12.7 million to our recommended charities. (More details on our 2014 money moved will be in our forthcoming 2014 metrics blog post.) We currently project total GiveWell/Open Philanthropy Project expenses of about $2.3 million for 2015 (more). We estimate that about half of those expenses are attributable to GiveWell’s traditional work. We previously wrote that we believe that expenses that are 15% of money moved are well within the range of normal, so we feel comfortable with the relative size of our operating expenses at this point.
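As a rough sketch of the arithmetic behind this comfort level (using the approximate figures above; note that, as in the paragraph above, this compares projected 2015 expenses against 2014 money moved, and the 50% allocation of expenses to traditional work is itself a rough estimate):

```python
# Rough expense-to-money-moved check, using the post's approximate figures.
total_money_moved = 28_000_000   # 2014 money moved to recommended charities
non_gv_money_moved = 12_700_000  # money moved excluding Good Ventures
projected_expenses = 2_300_000   # projected 2015 GiveWell/Open Phil expenses
traditional_share = 0.5          # rough share attributable to traditional work

traditional_expenses = projected_expenses * traditional_share  # $1.15 million

# Both ratios fall well under the 15% benchmark discussed above.
print(f"vs. total money moved: {traditional_expenses / total_money_moved:.1%}")   # ~4%
print(f"vs. non-Good Ventures: {traditional_expenses / non_gv_money_moved:.1%}")  # ~9%
```

Even against the more conservative figure that excludes Good Ventures, traditional-work expenses come in around 9% of money moved.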
As noted above, we have substantially increased our capacity for GiveWell’s traditional work after many years of struggling to do this. However, we feel that it is worth critically evaluating how much value is being added by our additional capacity and how much further we should expand our staff, if at all.
An important factor in our thinking about the ideal size of GiveWell staff is that we now see more potential than we had previously for some staff to transition to working for the Open Philanthropy Project.
To analyze the costs and benefits of different staff sizes, we can imagine three scenarios for future GiveWell staff:
- Expansion: increasing the size of GiveWell’s staff would allow us to review as many or more new charities each year, eventually allocate more staff to the Open Philanthropy Project, potentially improve our work of “seeding” potential future top charities, and potentially improve our future outreach efforts.
- Status quo: if we kept the size of GiveWell staff the same as it is now, we would likely dedicate most staff to maintaining our current level of research. Under this scenario, we would likely halt the transition of staff to the Open Philanthropy Project, not do substantial work to improve future outreach efforts, and do relatively little to seed potential future top charities.
- Contraction: in this scenario, we would reduce GiveWell’s staff to the minimum number needed to maintain our recommendations. A smaller staff would likely be able to publish updates on our past top charities while conducting about one new charity review per year. Under this scenario, we would be relatively unlikely to find promising new giving opportunities, so we would effectively be betting that we had already largely found the best giving opportunities.
The main arguments we see in favor of expansion are:
- If our money moved continues to grow, we will likely need more “room for more money moved.” To increase “room for more money moved” and ensure that we are recommending high-quality giving opportunities, we will likely need to do research on new charities and do more work to seed potential future top charities.
- The Open Philanthropy Project is early in its process of finding promising new giving opportunities and is severely capacity-constrained. Increasing the size of GiveWell’s staff will likely lead to more capacity for the Open Philanthropy Project.
- GiveWell would need more staff in order to do more work on seeding potential future top charities and to do more outreach while maintaining its current level of research. These activities could be highly valuable.
- Hiring operates on a long time scale; there are long lags between a) advertising a position, b) hiring, and c) the new staff member reaching their full potential. Highly experienced hires are very versatile and valuable; the benefits of making such hires are robust across many potential future paths for GiveWell and the Open Philanthropy Project.
- The worst case scenario for overexpansion is that some amount of money is used inefficiently on staff and that GiveWell must contract later, while the worst case scenario for underexpansion is that GiveWell and the Open Philanthropy Project are unable to capitalize on a vastly larger future opportunity for impact.
The main arguments we see in favor of maintaining the status quo or contracting are:
- GiveWell’s “impact per dollar” would likely be higher in the short term in the status quo or contraction scenarios because we could maintain our current top charities list while spending less on our operations. GiveWell has not found many new top charities in the recent past, so we may not be sacrificing much impact by contracting. However, the legitimacy of GiveWell’s top charities list may degrade over time if the set of plausible candidates for top charities grows relative to the set of charities we have considered.
- To some extent, there are diminishing returns to additional hiring because a growing staff requires more overhead- and human resources-related work.
Ultimately, we feel that the arguments in favor of expansion are significantly stronger than those for maintaining the status quo or contracting. However, we are still unsure of how much larger GiveWell’s staff should become in the longer term. The ideal future size depends on many factors, such as whether our research process has been identifying new top charities, the size of the “pipeline” of potential new top charities and priority programs (which we plan to discuss in the next post in this series), how many existing GiveWell staff ultimately work for the Open Philanthropy Project, and the size and success of our outreach operation. We plan to continue revisiting this question periodically.
Allocation of resources to research vs. outreach
As in previous years, we did not set a goal to do more outreach in 2014; we maintained our outreach at levels similar to what we had done in the past. Our approach to outreach has been to prioritize the highest return-on-investment activities while not making outreach a major priority. That said, the resources we devote to outreach are not insignificant. For example, Co-Executive Director Elie Hassenfeld spent more than 10% of his time on outreach in 2014. More details on how we think about prioritizing outreach are available in this blog post.
Note 1: In this post, senior staff refers to Elie, Holden, and Alexander. Many staff took on additional responsibilities throughout 2014, so this refers to senior staff as of January 2014, not as of today.
Note 2: These were not necessarily the charities that we had expected to review at the beginning of 2014. At that time, we believed that we might complete reviews for ICCIDD (now named IGN), Centre for Neglected Tropical Diseases (CNTD), Nothing But Nets, UNICEF Maternal and Neonatal Tetanus Elimination Initiative (MNT), Measles and Rubella Initiative, and Menafrivac. Of those charities, we completed a review for IGN and made substantial progress on forthcoming reviews for CNTD and UNICEF MNT. Nothing But Nets declined to participate in our process. We ultimately prioritized different charity reviews because we learned new information. For example, Living Goods contacted us to share early results from its RCT, and DMI found promising midline results from its own RCT.