The GiveWell Blog

Evaluating GiveWell by finding the best charity

One of the challenges of running GiveWell is evaluating the quality of our own research. There are no accepted standards for what charity recommendations should be or how they should be assessed.

Over the last year, we’ve been developing a method for self-evaluation based on structured writeups from people external to GiveWell (i.e., people who are not donors, Board or staff). Some of these people are early-stage volunteers with GiveWell; asking them to review us in this way gives us both valuable feedback and a window into how they think. Other reviewers have been recruited by us because their opinions seem particularly relevant due to their backgrounds (for example, Laura Freschi of Aid Watch). The goal here isn’t to produce a quantitative “score” for our research, but rather to provide some in-depth and credible outside perspectives that our audience (and we) can react to as we choose.

In most cases, the reviewer is asked to answer a very specific set of questions about a specific subset of our content: our overview of a particular issue such as developing-world education or HIV/AIDS, our heuristics for identifying promising charities, the fairness of how we’ve applied these heuristics, or our writeups on particular charities. Put these four areas together and you have our entire process for recommending charities.

However, we’ve also been experimenting with a broader approach that asks the reviewer to evaluate the whole of GiveWell’s value at once: finding the best charity to donate to within a set period of time, with no requirement that they use GiveWell to do so. We provide the reviewer with a list of all online resources we know of, including but not limited to GiveWell, and encourage the reviewer to do whatever it takes to get the best answer (including searching independently for information and calling charities directly).

We’ve run this “Finding the Best Charity” project a couple of times with volunteers, but the most recent round is particularly interesting for a few reasons:

  • The reviewer was Tobias Pfutze, an Assistant Professor of Economics who co-authored, with William Easterly, a paper rating official aid agencies.
  • We took the project out of the hypothetical and into the real by offering a $2,500 donation to the charity of Prof. Pfutze’s choice, in exchange for the time (about 10 hours) he spent on the review. This donation was funded by Dario Amodei. Thus, this project literally represented Prof. Pfutze’s best attempt at allocating $2,500 to accomplish good.
  • Prof. Pfutze chose a charity, Living Goods, that GiveWell does not recommend – though he did find this charity through GiveWell’s website.

Prof. Pfutze’s submission is available here. It includes a paraphrased transcript of a follow-up conversation we had with him, exploring our areas of agreement and disagreement.

Overall we found this project very interesting and valuable. On one hand, Prof. Pfutze was clear that he “found Givewell to be by far the most helpful website,” that he agreed with the bulk of our criteria and conclusions, and that he found our recommended charities to have strong cases behind them. On the other, he came to a substantially different conclusion from us.

It appears that he placed a substantially higher weight on the “upside” of the donation – what the project would accomplish if it went as well as possible – and in particular, on the project’s model for achieving sustainability. He found Living Goods promising because it both (a) is aiming for a highly sustainable, lasting impact and (b) gains some credibility from its ongoing evaluation in partnership with Poverty Action Lab.

We remain in disagreement with this conclusion, but think his position is defensible, well-argued and thought-provoking.

We encourage interested readers to check out his full submission including the follow-up dialogue with GiveWell. We’ve found this “Finding the Best Charity” assignment to be valuable and interesting, and we are hoping to use this same format to get more feedback in the future.

GiveWell’s plan for 2011: Specifics of research

This is the fifth post (of five) focused on our self-evaluation and future plans.

A previous post outlined our top-level priorities for 2011. The most important priority is finding more top charities. This post lists our potential tactics for finding top charities; we are particularly interested in feedback on this topic.

These tactics are listed in order from “closest to our existing methodology; most likely to succeed” to “furthest departure from our existing methodology; most likely to take a lot of time before we can identify outstanding organizations.”

Tactic 1: deep investigations of charities with distinction

We have a list of charities that have some form of distinction.

We haven’t found sufficient information on these charities’ websites to recommend them, and based on past experience, this makes us pessimistic. However, we have begun prioritizing the charities by how promising they seem; we will be contacting the most promising ones, interviewing staff, and thinking about the minimum information we would need to confidently recommend them. More so than in the past, we can now point to significant impact of our recommendations on donations, so we expect better access to these charities than we have had before.

Tactic 2: investigating “low burden of proof” sub-causes in international aid

In the past, we’ve looked for direct evidence of charities’ impact on improving lives. As we’ve gotten more context and experience with international aid, however, a couple of causes have stood out to us as particularly promising, to the point where we may be able to be confident in a charity without the sort of impact assessment we’ve sought in the past.

  • Orphans and vulnerable children: some charities provide homes, shelters, and other basic services for children who otherwise might be homeless, sleeping on the street, or even taken in by those who exploit them. By speaking with the right people, we may gain an understanding of where and when these sorts of organizations need to expand, resulting in safe homes or shelters for children who would not otherwise have them.
  • Water: if we found a charity that was demonstrably improving access to clean water, in a way that (a) benefited communities with very poor previous access to water and (b) lasted over time (we are very concerned about wells being put in and then falling into disrepair), we might recommend such an organization without direct evidence of improved health outcomes.

Tactic 3: investigating other promising causes

As mentioned previously, we’re experimenting with a method for quickly getting a high-level picture of a charitable cause and how likely it seems that we could find top charities in this area. By investigating particularly promising causes, such as disease research funding and catastrophic risk mitigation (including but not limited to global warming), we might be able to find more outstanding opportunities for donors.

We will certainly be pursuing this tactic, but feel it is less likely to generate top charities in 2011 than the tactics above.

Tactic 4: project funding

We have always aimed to find great organizations and recommend unrestricted donations to them, rather than funding particular projects. This is partly because we think traditional donation restricting is unreliable; partly because we think project-based funding adds harmful complications (particularly the fact that the donor’s and charity’s goals aren’t fully aligned); and partly because, in the past, we have had so little sense of how much money (if any) our top-rated charities could expect to raise.

But if we can’t find more charities that focus – at the overall organizational level – on proven, cost-effective, scalable programs, we will open the doors to large organizations offering promising projects, and potentially recommend that donors give to these organizations with specific designations (“Use this donation for project X”).

If we go down this path, it will become essential to have concrete expectations for what will be implemented – and what will be measured and reported – at different levels of funding. (Projects also ought to be based, to the maximum extent possible, on programs that have worked in the past.) The fact that we now have a track record of moving money to top-rated charities makes this option more feasible than it was before.

We’d like to avoid project-based funding, and even if we do implement it, we’ll be keeping an eye out for organizations that we can recommend for unrestricted funding. The latter will always take precedence.

We think this tactic is promising in the long run, but unlikely to generate “gold medal” opportunities in the short run because of the difficulties we’ve had (and expect to have) communicating with grantwriters.

GiveWell’s plan for 2011: Top-level priorities

This is the fourth post (of five) we’re planning to make focused on our self-evaluation and future plans.

In previous posts, we discussed the progress we’ve made, where we stand, and how we can improve in core areas. This post focuses on the last of these, laying out our top-level strategic choices for the next year.

Broadly, we see the key aspects of GiveWell – the areas in which we can improve – as

  • Research vetting: subjecting our existing research to strong, critical scrutiny from people with substantial relevant experience and credentials.
  • Research maintenance and systemization: keeping our research up-to-date and high-quality, while training junior staff to maintain it.
  • Research expansion: actively seeking more charities to recommend.
  • Marketing: increasing the number of “customers” we reach and the money we move.
  • Fundraising/operating: maintaining the organization.

(These are broadly similar to last year’s areas for improvement.)

Our top priorities for this year are:

  1. Research expansion. As discussed previously, we have an urgent need to find more top charities so that we can productively move more money. It would be a major problem for GiveWell if we essentially had more demand for our research (i.e., donors interested in following our recommendations) than supply (i.e., charities able to absorb this funding effectively). We aim to find at least $3 million in room for more money moved, i.e., gold-medal charities that can collectively absorb at least $3 million very effectively. Finding more top charities could be very challenging and require large allocations of time from both junior and senior staff.
  2. Fundraising – as discussed in the overview of our financial situation, we have expanded our staff while seeing a couple of large sources of revenue fall off. Raising more money in 2011 will be necessary in order to maintain our operations at optimal size. We aim to raise our annual revenue from around $200,000 to around $400,000 (if we do not make it all the way to $400,000, we will cut some staff going into 2012, as outlined in our overview of our financial situation). I (Holden Karnofsky) will be taking primary responsibility for this task, and expect it to take a fair amount of time.

Other key priorities are:

  1. Research maintenance and systemization – all our research will need to be updated this year (as it is every year). We expect this work to be done primarily by junior staff.
  2. Research vetting – we believe there is substantial room for improvement in our testimonials and external reviews of our research. We expect to make improvements with relatively little investment of our time, since our process for getting testimonials and reviews is already in place.
  3. Marketing – we expect to pursue most of the ideas we listed previously for expanding our reach, with moderate time cost.
  4. High-level research of new causes – we’re experimenting with a method for getting a high-level picture of a charitable cause, without getting to the most time-consuming step of evaluating specific charities. The idea is to take a given cause – for example, global warming mitigation – and get a basic sense of what information is available, what cost-effectiveness estimates say about what can be accomplished for how much, what the key questions are for organizations, and how likely it seems that we could find top charities in this area. We hope to do high-level research on a few particularly promising causes, laying the groundwork to find more top charities in the future.

Self-evaluation: GiveWell as a project

This is the third post (of five) we’re planning to make focused on our self-evaluation and future plans.

This post answers a set of critical questions for GiveWell stakeholders. The questions are taken from our 2010 list of questions. For each question, we discuss

  • Where we stood as of our previous self-evaluation and plan a year ago.
  • Progress since then.
  • Where we stand (compared to where we eventually hope to be).
  • What we can do to improve from here.

Is GiveWell’s research process “robust,” i.e., can it be continued and maintained without relying on the co-Founders?

Where we stood as of Feb 2010

We had one full-time employee other than the co-Founders (Natalie Stone). We felt that she was capable of maintaining our international aid report with little oversight, while covering other causes would require heavy involvement from co-Founders.

Progress since Feb 2010

  • Natalie Stone has now been with GiveWell for about 18 months. She has not only worked on maintaining and expanding the international aid report, but also done the bulk of the work for a new cause – microfinance – and taken on miscellaneous duties such as responding to emails sent to info@givewell.org.
  • We have also made another full-time hire, Simon Knutsson, who started as a volunteer, became a 30-40-hour-per-week contractor in March of 2010 and joined as a full-time employee in January 2011. He has done the bulk of the work for the U.S. Equality of Opportunity report.
  • We have systemized the process of keeping our research up to date. This process can now be done partly by junior staff and partly by volunteers, with very little participation needed from the co-Founders.
  • Because we now have useful work for volunteers to do, we’re able to use volunteer work as a way to evaluate potential hires, leading to two more new hires:
    • We hired one volunteer (Stephanie Wykstra) part-time; she has been working about 20 hours per week for us and plans to work full-time over the summer. She did the bulk of the work for our disaster relief report.
    • Another former volunteer (Alexander Berger) plans to join us full-time after he graduates from college.

All in all, by this fall we expect to have 3.5 employees apart from the co-Founders. All of them will focus primarily on maintaining and expanding our existing research (in particular, finding and evaluating especially promising charities), a top priority for us, but all will also experiment with and be available for other work, including research on new causes.

Where we stand

We have a promising set of junior staff and lots of valuable work for them to do in keeping our research up to date and thorough. These staff may eventually be able to run the organization entirely, but we don’t feel they’re at that point yet.

What we can do to improve

We plan on continuing to expand the scope of the work we assign to junior staff, as well as continuing to work with volunteers who may eventually become staff. This will take a good deal of time in terms of training/management overhead.

Does GiveWell present its research in a way that is likely to be persuasive and impactful (i.e., is GiveWell succeeding at “packaging” its research)?

Where we stood as of Feb 2010

We felt that our research was presented with sufficient clarity and good organization, though without emotional persuasiveness. We sought to add more external evidence of credibility and to better integrate the blog and website.

Progress since Feb 2010

As discussed in the previous post, we have added some external evidence of credibility to our website, and established a basic process for adding more. We have not made other progress in this area; we do not consider improving the emotional persuasiveness of our work to be a priority.

Where we stand

We’re currently satisfied with the presentation of our content and don’t plan on emphasizing this goal in the near future.

What we can do to improve

At some point, more attention to the presentation of our research may be useful for broadening our audience. Right now, though, our impact appears to be growing rapidly just from our niche audience, and continuing to serve this audience well presents major challenges that are higher priorities.

Does GiveWell reach a lot of potential customers (i.e., is GiveWell succeeding at “marketing” its research)?

Where we stood as of Feb 2010; progress since Feb 2010; where we stand

As detailed previously,

  • We tracked over $1.5 million in donations to top charities in 2010, compared to just over $1 million in 2009.
  • Our website traffic nearly doubled from 2009 to 2010, and donations through the website nearly tripled. Our overall increase in money moved appears to be driven mostly by (a) a gain of $200,000 in six-figure donations and (b) new donors, largely acquired via search engine traffic and the outreach of Peter Singer.
  • Our growth in online donations to recommended charities was significantly faster than that of the more established online donor resources (Charity Navigator and GuideStar); our total online donations remain lower than these resources’, but are now in the same ballpark.

We have also formed a content partnership with GuideStar.

The amount of money we’re influencing is now quite significant to our top charities. As discussed previously, we now have significant concerns about “room for more money moved.”

Ultimately, we have a loose goal of reaching the point where our “money moved” is equal to, at a minimum, 9x our operating expenses. We see our operating expenses as being between $400,000 and $500,000 in a steady state (details forthcoming), so this would require roughly another 150-200% growth in our money moved.
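
For readers who want to see that arithmetic spelled out, here is a back-of-envelope sketch in Python (the ~$1.5 million starting figure is our tracked 2010 money moved; the expense range is the estimate above):

    # Back-of-envelope check of the "9x operating expenses" goal, using the
    # figures in this post. Assumes ~$1.5 million in 2010 money moved.
    money_moved_2010 = 1_500_000
    for expenses in (400_000, 500_000):   # estimated steady-state operating expenses
        target = 9 * expenses             # money moved needed at this expense level
        growth_needed = (target / money_moved_2010 - 1) * 100
        print(f"Expenses ${expenses:,}: target ${target:,}, ~{growth_needed:.0f}% further growth")
    # Prints roughly 140% growth at $400k of expenses and 200% at $500k,
    # in line with the 150-200% range described above.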

What we can do to improve

We have many ideas to continue to improve our reach. At this point we feel the most promising are:

  • Have more discussions with, and surveys of, those who use our research to decide where to give, in order to understand our audience, deepen our relationships, and ultimately find more such donors. A professional fundraiser could help significantly with this.
  • Hold fundraising events in order to make it easier for our existing supporters to bring in new supporters. We held a small event in December of 2010 focusing on VillageReach; the cost of the event was about $3,000 and it resulted in a total of $17,800 in donations to VillageReach and $7,500 in donations to GiveWell.
  • Public relations / press outreach projects aimed at earning media coverage.
  • More active pursuit of people we consider highly influential among our target audience (the better we understand our audience, the better our ability to seek out such people).
  • Website improvements, including
    • Improving our search engine optimization. (Most of our gains in traffic in 2010 came via search.)
    • Minimizing the processing fees on donations.
    • Better encouraging donors to share their actions and recommendations with friends (using email and social media).
    • Offering donors an easy way to get reminded to donate in December. (In 2010 over 25% of the money donated through our website was on the very last day of the year.)
    • Improving our ability to track website users and understand how they found us.

Is GiveWell a healthy organization with an active Board, staff in appropriate roles, appropriate policies and procedures, etc.?

Where we stood as of Feb 2010

We were happy with our Board and other aspects of our organization.

Progress since Feb 2010

We remain happy with our Board and policies and procedures. However, we now have a significant need to raise more operating funds. In a nutshell,

  • Our improved ability to recruit and employ junior staff has resulted in increased expenses, both present and future.
  • We have lost two major sources of revenue (a major donor changed careers and the Hewlett Foundation lowered the size of its support).

Details will be forthcoming in a financial analysis document.

Where we stand

We have a significant need for more operating support and intend to make this a major priority for 2011.

What we can do to improve

We aim to raise operating support by

  • Applying for support from major institutional funders.
  • Having discussions with people who have a substantial history of being close to our project, including Board members and major “customer donors” (i.e., donors who have used our research to decide where to give), about the possibility of their supporting GiveWell directly.

We plan to continue to avoid soliciting funds from the public at large. We wish to avoid “competing with” our recommended charities for funding, and feel our credibility would be undermined if we were asking for money ourselves.

What is GiveWell’s overall impact, particularly in terms of donations influenced? Does it justify the expense of running GiveWell?

Where we stood as of Feb 2010

In 2009, we tracked just over $1 million in donations to our top charities, though over 30% of this $1 million came from pledges and restricted donations that had been made in 2008. We had a reasonably strong presence in our sector as members of the Alliance for Effective Social Investing and GuideStar Exchange Advisory Board.

Progress since Feb 2010

As discussed above, our website traffic and money moved have grown significantly and we now provide significant funding to our top-rated organizations.

We remain members of the Alliance for Effective Social Investing and GuideStar Exchange Advisory Board; we have also joined two other collaborative groups in our sector (Markets for Good and Market Development Working Group of the Social Impact Exchange) and begun a content partnership with GuideStar.

Where we stand

At this point we can point to significant impact for our top charities; the impact they, in turn, have with our additional funds remains to be seen, but we are well-positioned to evaluate it over time.

That said, our “money moved” is still not at the point we would eventually like to see, and we still do not know enough about our impact because we have little information on where donors would have given without our research. We are currently investigating the latter.

What we can do to improve

  • Survey donors on where they would have given if not for our research, to get a better sense of our overall impact. (Currently in progress.)
  • Pursue the previously outlined strategies for increasing both our “money moved” and “room for more money moved” (both are necessary).
  • Pursue long-term sustainability of the organization – in particular, continue developing junior staff and raise the operating funding needed to maintain these staff.

Self-evaluation: GiveWell as a donor resource

This is the second post (of five) we’re planning to make focused on our self-evaluation and future plans.

This post answers a set of critical questions about the state of GiveWell as a donor resource. The questions are taken from our 2010 list of questions, with one change: “Has GiveWell covered enough areas to be useful?” is replaced by “How much funding can GiveWell’s top-rated charities effectively absorb?” for the reasons outlined in our previous post.

For each question, we discuss

  • Where we stood as of our previous self-evaluation and plan a year ago.
  • Progress since then.
  • Where we stand (compared to where we eventually hope to be).
  • What we can do to improve from here.

Does GiveWell provide quality research that highlights truly outstanding charities in the areas it has covered?

Where we stood as of Feb 2010

  • Internally, we were satisfied with the quality of our research as compared to other options for donors.
  • We planned to conduct one or more site visits to the developing world, in order to inform our work with some direct observation.
  • We felt a need for more substantial external checks on our research, and planned to subject it to strong, critical scrutiny from people with substantial relevant experience and credentials.

Progress since Feb 2010

  • We have made substantial progress on seeing charities’ work up close.
  • We have made substantial progress on subjecting our research to strong, critical scrutiny from people with substantial relevant experience and credentials.
    • We have formalized the process for keeping our research up to date, which is crucial for soliciting formal feedback. (Last year, we had to update our year-old report before any work on external feedback was feasible.)
    • We have formalized the process for providing feedback on the different aspects of our research, and have gotten at least one submission for each category of assignment – see our page on external reviews.
  • We have also improved the quality of our research in other ways.
    • We have systemized the process of keeping our research up to date, and updated our international aid report in mid-2010. This process can now be done partly by junior staff and partly by volunteers, with very little participation needed from the co-Founders.
    • We have intensified our focus on room for more funding and gotten particularly concrete answers on this question from two of our top charities: VillageReach and Nurse-Family Partnership.

Where we stand

We feel that our current research is high-quality and up-to-date and that the first set of external reviews reflects this. We are not fully satisfied with the number and credibility of these external reviews and hope to secure more of them.

What we can do to improve

As discussed above, we hope to secure more external reviews of our research, particularly from people with clearly relevant credentials and experience.

Is it practical for donors to evaluate and use GiveWell’s research in the areas it has covered?

Where we stood as of Feb 2010

We were satisfied with the organization of our website, but felt that donors’ options for assessing our credibility were insufficient. We wrote, “We feel that we should have a single, easy-to-find roundup of available information on the credibility of our research.”

Progress since Feb 2010

We have created a basic process and template for both external reviews and testimonials, and basic evidence of our credibility is now easily available on our website.

Where we stand

While we have created the basic process and template for both reviews and testimonials, and have some basic evidence of our credibility now easily available, we feel that the content of both pages could be much stronger.

What we can do to improve

Over the next year we intend to secure the most impressive testimonials we can and to significantly increase the number of external reviews by people with clearly relevant credentials. We predict a substantial improvement in both of these areas compared to the content that is there now.

How much funding can GiveWell’s top-rated charities effectively absorb?

Where we stood as of Feb 2010

Progress since Feb 2010

Where we stand

Our gold medal charities currently have relatively little room for more funding. VillageReach had a $4 million projected need over six years as of December; its need now is probably under $3 million (we will soon be discussing this with them and posting another update). Nurse-Family Partnership has a long-term, but not short-term, need for more funding.

We do not have very good information on our silver medal charities’ room for more funding, though this group includes some large charities (the Stop TB Partnership, KIPP) that can likely absorb tens of millions of dollars of funding.

(Read about our definitions of “gold medal” and “silver medal” charities.)

What we can do to improve

  • Push for clearer information on room for more funding from our silver medal charities.
  • Find more charities to which we are comfortable giving the gold medal distinction. This is a top priority for the coming year.

Stats on GiveWell’s money moved and web traffic

This post provides summary metrics we look at to gauge GiveWell’s influence and growth. Our full annual review and plan will follow over the next few days.

Summary

  • We tracked over $1.5 million in donations to top charities in 2010, compared to just over $1 million in 2009.
  • Our website traffic nearly doubled from 2009 to 2010, and donations through the website nearly tripled. Our overall increase in money moved appears to be driven mostly by (a) a gain of $200,000 in six-figure donations and (b) new donors, largely acquired via search engine traffic and the outreach of Peter Singer.
  • Our growth in online donations to recommended charities was significantly faster than that of the more established online donor resources (Charity Navigator and GuideStar); our total online donations remain lower than these resources’, but are now in the same ballpark.

Total money moved

One of our primary metrics is “money moved”: donations to our top charities that we can confidently identify as being made on the strength of our recommendation. We are generally very conservative in identifying “money moved”; full details of how we track it are at the end of this post.

The tables below show GiveWell’s 2010 money moved by (a) organization and (b) size of donation. They show a total of just over $1.5 million, the vast majority of which went to our top-rated charity, VillageReach.

Note: these figures do not match the total money moved figure above exactly, because the table of donations by size does not include donations made directly to our recommended charities that the charities informed us about and for which we do not know the individual donation sizes.

Donations through the website

While the aggregate money moved figure (which includes all money donated to charities due to GiveWell’s recommendation) is ultimately the more meaningful measure of GiveWell’s impact, we believe that donations by website visitors whom GiveWell staff don’t know personally are a better measure of GiveWell’s progress: these donations (a) represent use of the GiveWell tool as we ultimately envision it (i.e., retail donors coming to the site and using it to give) and (b) are less susceptible to the large, one-off circumstances that cause big swings in the aggregate figure.

The following charts show monthly donations through the website for 2008-2010 (we also include January 2011 data). We’ve attempted to strip out donors we have had personal contact with; we aren’t able to see the source of all donations, so some such donations could be included, but we were able to see the source of – and thus strip out – the bulk of the donations to which this applies.

The following two charts show data just for December, the month in which we see significantly more donation activity.

Comparison to Charity Navigator and GuideStar

As shown above, donations through our website were substantially higher in 2010 than in 2009. We used the public Network for Good tickers for Charity Navigator and GuideStar to compare our growth (and level) to theirs, to see how much of our growth can be attributed to GiveWell’s improvement in particular vs. greater interest in online charity evaluators and online giving in general. (We have confirmed with Charity Navigator that we are looking at the right data; we are still waiting for confirmation from GuideStar, but the numbers we got from its ticker match almost exactly the independent numbers it sent us for 2010.)

It’s also worth noting the levels of (not just changes in) these figures. Though Charity Navigator had about 30x as many donations as GiveWell in 2010, and GuideStar about 9x as many, the average donation for GiveWell was over 4x as large (~$450 for GiveWell vs. closer to $100 for each of the other two). Net result: total online donations for GiveWell were about 13% of those for Charity Navigator and 42% of those for GuideStar.

There are many possible interpretations of these numbers. One could argue that GuideStar’s and Charity Navigator’s numbers understate those sites’ actual impact more than GiveWell’s numbers understate ours, because people are more likely to use those sites as only one step in the process and end up giving via the charity’s own website. One could also argue the reverse: available information implies that more people are using GiveWell’s research to make large donations, and people often prefer to make such donations using donor-advised funds and checks rather than online.

One thing we do feel is the case, however, is that a donation given through GiveWell likely represents a much bigger impact – in terms of changing a donor’s actions from what they would have done otherwise – than a donation given through GuideStar or Charity Navigator. GiveWell recommends only 10 charities, while Charity Navigator lists over 1600 charities with its highest four-star rating and GuideStar allows donors to give to just about any charity they’d like. GiveWell is designed to make a strong recommendation of where to give; the others are designed to help people make one final check on the charity they had already planned on supporting. So while donations through the website remain lower for GiveWell than for Charity Navigator and GuideStar, they are now in the same ballpark, and there is an argument that the overall impact of GiveWell is at least as high.

What changed between 2009 and 2010?

  • Six-figure gifts: two individual donors, both of whom have been following GiveWell for a long time, each made six-figure gifts in 2010. The two donations totaled around $460,000. These donors had both refrained from large gifts to recommended charities in the past; in 2009, their giving to our top charities totaled $10,000. On the other hand, we did not repeat the 2009 economic empowerment grant (funded by an anonymous donor). So we gained a net of about $200,000 on six-figure gifts.
  • New donors: $387,585 came from donors who hadn’t given before, while we lost $118,793 via donors who had given in 2009 and did not repeat in 2010. So the net from donor turnover was about +$270,000. This figure is about the same size as, though conceptually different from, the increase we saw in donations through the website.

We break down the 2009-2010 changes in more detail at the bottom of this post.

Web traffic

Our web traffic roughly doubled in 2010 vs. 2009. The following shows web traffic by source.

Two notes:

  • The largest driver of growth in web traffic in 2010 was increased organic (i.e., non-AdWords) search traffic.
  • The charts include data from January 2011, and the recent dips are a function of normal seasonality — i.e., we have more traffic during December’s giving season.

Money moved vs operating expenses over time

The following chart shows GiveWell’s total money moved each year (2007-2010) relative to our operating expenses. A major question for GiveWell is whether the value of our research is worth the cost of producing it. Money moved has continued to grow significantly relative to operating expenses.

Note that the above chart reports lower 2009 and higher 2008 money moved figures than our 2010-vs-2009 comparison above and our previous report. This is due to the way we reported data in 2008 and 2009. We spent a significant portion of 2008 soliciting funds for organizations we intended to recommend in the future. We ultimately made these recommendations in mid-2009. The funds were actually donated in 2009 but were committed in 2008. We believe that counting the funds when they were committed provides the most accurate picture of changes in GiveWell’s influence relative to operating expenses over time (though counting the funds when they were given makes it easier for us to track what changed between 2009 and 2010).

What we count as “money moved”:

  • Donations made to top charities directly through our website. Though these donations go directly to top charities, we are able to track them and verify that they went through our website. (Example: VillageReach donate page)
  • Donations that our recommended charities report back to us as being attributable to GiveWell (we have a high standard for this – we count only cases where (a) the donor explicitly stated that their donation was on the strength of GiveWell’s recommendation or (b) the donor gave to Nurse-Family Partnership and stated that they heard about it from a Nicholas Kristof column; Mr. Kristof has informed us that he included NFP in the column on our recommendation).
  • Donations that donors report to us (informally or using our donation report form) as donations that they made on the strength of our recommendation. We cross-reference our data with recommended charities’ data, when necessary, to eliminate double-counting.
  • Donations made directly to GiveWell and earmarked for re-granting. We count donations made and restricted in year X, and then granted in year Y, as “money moved” for year X, not year Y.
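
To make the bookkeeping above concrete, here is a minimal, hypothetical sketch of how records like these could be tallied. This is not our actual tracking system; the record fields, channels, and amounts are invented for illustration. It shows two of the rules described above: cross-referencing reports to eliminate double-counting, and crediting earmarked re-granting donations to the year they were committed rather than the year they were granted.

    # Hypothetical sketch only; field names, channels, and amounts are invented
    # for illustration and do not describe GiveWell's actual tracking system.
    from collections import defaultdict

    donations = [
        {"id": "d1", "channel": "website",        "amount": 500,    "year": 2010},
        {"id": "d2", "channel": "charity_report", "amount": 10_000, "year": 2010},
        {"id": "d2", "channel": "donor_report",   "amount": 10_000, "year": 2010},  # same gift reported twice
        {"id": "d3", "channel": "regrant",        "amount": 25_000,
         "year_committed": 2010, "year_granted": 2011},
    ]

    money_moved = defaultdict(int)
    seen = set()
    for d in donations:
        if d["id"] in seen:        # cross-reference donor and charity reports to avoid double-counting
            continue
        seen.add(d["id"])
        # Earmarked re-granting donations count toward the year committed, not the year granted.
        year = d.get("year_committed", d.get("year"))
        money_moved[year] += d["amount"]

    print(dict(money_moved))       # {2010: 35500}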

More details on what changed between 2009 and 2010:

  • Two individual donors, both of whom have been following GiveWell for a long time, each made six-figure gifts in 2010. The two donations totaled around $460,000. These donors had both refrained from large gifts to recommended charities in the past; in 2009, their giving to our top charities totaled $10,000. On the other hand, we did not repeat the 2009 economic empowerment grant (funded by an anonymous donor). So we gained a net of about $200,000 on six-figure gifts.
  • $387,585 came from donors who hadn’t given before, while we lost $118,793 via donors who had given in 2009 and did not repeat in 2010. So the net from donor turnover was about +$270,000. Some additional context on these numbers:
    • 13 new large donors, giving between $8k and $30k each, accounted for $163,449 of the $387,585 in gains. The rest came from nearly 500 smaller donors ($6k and under).
    • Of the $387,585 in new donations, we can (using a combination of web analytics and optional “Where did you hear about us?” surveys) attribute $56,672 to organic search (i.e., people searching for things like “best charities” – not for GiveWell itself – and not including Google AdWords); $30,903 to the outreach of Peter Singer; $19,691 to “word of mouth”; and $18,517 to Google AdWords.
    • We do not know the source of the other ~$195k in new donations. We can guess at what the distribution looks like using our survey data (see the rough sketch following this list). Of the 2010 users who responded to surveys about where they heard about us, 13% found us via search, 23% found us through media, 25% found us through word of mouth or links from other sites, and 38% found us through the outreach of Peter Singer.
    • Of the ~$118k in lost donations, $73k can be attributed to donors who simply gave early in 2011 instead of late in 2010. The remaining ~$45k comes from donations of $5,000 and under, and the fact that we lost this many small donors is a source of some concern.
  • Donors who gave in both 2009 and 2010 gave less in 2010 ($354,900, vs. $411,787 in 2009), for a net loss of about $57,000. Three very large donors lowered their donations by a total of over $100k, for reasons that we believe to be related to financial circumstances, while another donor increased his by $30k; the rest of the changes netted out to a slight gain.
  • Donations to GiveWell, earmarked for regranting, fell by a net of $38,326. About $46,000 of this loss represented donors who switched to giving to our top charities, or to GiveWell unrestricted, instead of giving to GiveWell restricted. There was another ~$29,000 in losses ($9,000 from one donor, the rest largely from people who had donated immediately after our 2007 launch and not returned), offset by ~$36,000 in gains, ~$34,000 of which came from 3 donors.
  • We had $238,988 in donations that we weren’t able to attribute to specific donors (donations that went through Network for Good plus donations reported to us by charities as coming on the strength of GiveWell’s recommendation), up from $56,200 in 2009, for a net gain of about $180,000 in this “mystery” category.
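
To illustrate the guess mentioned above, here is a rough sketch that simply applies the survey percentages quoted earlier to the ~$195k of new donations whose source we don’t know. It assumes survey respondents are representative of these donors, which may well not hold, so the numbers are illustrative only:

    # Illustrative only: splits the ~$195k of unattributed new donations by the
    # survey percentages above, assuming respondents are representative.
    unknown_new_donations = 195_000
    survey_shares = {
        "search": 0.13,
        "media": 0.23,
        "word of mouth / links from other sites": 0.25,
        "outreach of Peter Singer": 0.38,
    }
    for source, share in survey_shares.items():
        print(f"{source}: ~${unknown_new_donations * share:,.0f}")
    # e.g., this would attribute roughly $74k of the unattributed donations to
    # Peter Singer's outreach and roughly $25k to search.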