In May, we emailed all the charities we have analyzed in depth, asking them to fill out a brief survey about their experiences with our process.
We asked:
- How much time did you spend on our process?
- How clear were you at the outset about what our process would entail?
- Do you think, looking back, that our process was reasonable?
Note that in our process, we only contact and publish in-depth reviews of organizations we find promising, and this survey only went to those we found promising enough to take that step. The survey was anonymous (unless the responder chose to disclose their identity). We emailed 39 organizations; 12 responded. This was the first time we’ve conducted a survey like this. We intend to conduct these surveys more regularly in the future and hope that doing so leads to more responses.
We will use the results of this (and future) surveys to help us improve our process. The data will also help us to give the organizations we contact better estimates of how much time our process will require.
Unsurprisingly, organizations that receive high ratings in our process responded more positively to our survey. Below, we separate results from highly rated organizations (i.e., Gold- or Silver-rated charities) from others. (In the analysis below, we’ve assumed that organizations that submitted a survey but did not disclose their identities were not highly rated.)
Finally, we would guess that the time figures below slightly overstate the average time required for an organization we would contact in the future. The non-highly-rated organizations that responded were a select group (~7 of 30), presumably particularly motivated to respond because they spent more time than usual on our process.
Results
Of the 12 respondents, all but 1 gave us permission to share their responses in aggregated form, so all averages below exclude that organization.
Table 1. Time spent on our process (hours)
Rating | Average | Median | Min | Max |
---|---|---|---|---|
Gold/Silver | 32 | 12 | 10 | 100 |
Not Gold/Silver | 43 | 32 | 1 | 120 |
All | 38 | 13 | 1 | 120 |
Table 2. Average time spent, by part of GiveWell’s process
Rating | On the phone with GiveWell | Creating new materials for GiveWell | Gathering/Sending materials for/to GiveWell | Reviewing materials (created by GiveWell) for accuracy |
---|---|---|---|---|
Gold/Silver | 29% | 36% | 14% | 18% |
Not Gold/Silver | 23% | 39% | 28% | 9% |
All | 26% | 37% | 21% | 13% |
Table 3. Average time spent, by role at the organization
Rating | Executive Director | Development staff | Program staff | Other |
---|---|---|---|---|
Gold/Silver | 51% | 32% | 16% | 1% |
Not Gold/Silver | 14% | 33% | 38% | 14% |
All | 31% | 33% | 28% | 8% |
Table 4. Number of responses to “Overall, did you feel the time spent was: reasonable, somewhat reasonable, or unreasonable?”
Rating | Reasonable | Somewhat reasonable | Unreasonable |
---|---|---|---|
Gold/Silver | 4 | 1 | 0 |
Not Gold/Silver | 1 | 3 | 2 |
Table 5. Number of responses to “How well did you understand the criteria / steps in the process / time required to assess your organization?” (Each cell shows “very well + reasonably well” / “not particularly well + not well at all.”)
Rating | Criteria | Process | Time required |
---|---|---|---|
Gold/Silver | 4 / 1 | 4 / 1 | 4 / 1 |
Not Gold/Silver | 3 / 3 | 3 / 3 | 1 / 5 |
Feedback from charities
One organization submitted feedback critical of our process:
- “Process is very flawed. Information is provided, then goal posts are moved. Givewell not in a position to analyze or understand information that is provided to it. Dealt with different people internally at Givewell, and have very little faith in process from Givewell’s end. Appeared new people did not know of earlier info provided – seemed to be ticking or crossing boxes, rather then engaging with the material. End result is that Givewell’s assesment process seems to be more of an assesment of Givewell’s ability to assess, than a real snapshot of organisations it is dealing with. Ie internal limitations of Givewell are the core factor in deciding ratings – I don’t think this is reflected in the way Givewell positions itself.”
We are not surprised to hear that an organization perceived the “goal posts” as moving in our process. We tailor our analysis to the specific activities of each organization, which often means learning what types of information the organization does and does not have available while our process is underway. And, as we find more outstanding organizations, the standard an organization must meet to qualify for our highest ratings does move. (We’ve written about this before here.)
The above organization submitted this feedback anonymously. If the representative who submitted it is reading our blog and is willing to speak further, please contact us at info@givewell.org. We’d appreciate the opportunity to learn more about the comment that it “appeared new people did not know of earlier info provided – seemed to be ticking or crossing boxes, rather then engaging with the material.”
A note on the organizations we contacted: We chose not to email the organizations we only had contact with during our first grant-focused process in 2007 because (a) the grant process we used then is very different from the process we use now and (b) it was so long ago that we thought it unlikely they’d recall relevant information.