# The GiveWell Blog

Our goal with hosting quarterly open threads is to give blog readers an opportunity to publicly raise comments or questions about GiveWell or related topics (in the comments section below). As always, you’re also welcome to email us at info@givewell.org or to request a call with GiveWell staff if you have feedback or questions you’d prefer to discuss privately. We’ll try to respond promptly to questions or comments.

You can view previous open threads here.

• Ethan Kennerly on June 17, 2022 at 8:51 pm said:

Since 2020, I have learned from the meticulous cost-effectiveness spreadsheets for each top charity.

In 2022, an estimate of risk would inform my allocation of donations. For example, I would learn from an estimate of the risk and uncertainty in AMF's cost to save a life. The "AMF Cost to Save a Life Apr 2022" figures range from about $4,000 to $9,000.

1. Is that an average estimate?
2. Do you know of a pessimistic estimate?
3. Do you know of a technique to derive a pessimistic estimate?

At the moment, a naïve notion might be to copy the spreadsheet and refer to the column with the highest estimate, $9,050 in Chad, then naïvely recalculate each risky or uncertain row with an extra margin. A naïve example of an extra margin might be 5% worse, meaning the cell value would be multiplied by 1.05 or divided by 1.05, depending on which operation raises the cost to save a life. Obviously this fiddling is unscientific and below the high standard GiveWell has for auditing an intervention. Yet, with my ignorance and limited free time, I would still learn from some way of estimating risk and uncertainty. Of course, GiveWell has already built some risk and uncertainty into its estimate, yet at first glance it is unclear what range of expected values would be plausible in a pessimistic scenario.
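The margin idea above can be sketched in a few lines of Python. Everything here is illustrative: the parameter names, values, and directions are hypothetical placeholders, not inputs from GiveWell's actual spreadsheet.

```python
# Sketch of the naive "pessimistic margin" adjustment described above.
# All parameter names and figures are illustrative placeholders,
# not values from GiveWell's cost-effectiveness analysis.

def apply_margin(value: float, margin: float, worse_if_higher: bool) -> float:
    """Scale an input by (1 + margin) in whichever direction raises
    the final cost-per-life-saved estimate."""
    factor = 1 + margin
    return value * factor if worse_if_higher else value / factor

# Hypothetical inputs: (value, True if a larger value raises the cost estimate)
inputs = {
    "cost_per_net": (5.0, True),          # pricier nets -> higher cost per life
    "nets_used_rate": (0.80, False),      # lower usage -> higher cost per life
    "deaths_averted_per_1k": (1.5, False),
}

# Apply a uniform 5% pessimistic margin to every uncertain input.
pessimistic = {
    name: apply_margin(value, 0.05, worse_if_higher)
    for name, (value, worse_if_higher) in inputs.items()
}

for name, value in pessimistic.items():
    print(f"{name}: {value:.4f}")
```

A uniform 5% margin is of course arbitrary; in practice each input's margin would need to reflect how uncertain that input actually is.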

• Miranda Kaplan on June 28, 2022 at 2:53 pm said:

Hi, Ethan,

The cost-per-life-saved estimates we provide for our top charities in the spreadsheet you link to are our best guesses for each program in each location. They're not averages, although many of the inputs that go into our cost-effectiveness analyses are averages (e.g., in the AMF tab you link to, we use 10 as the average number of years between net distribution and when long-term benefits start to accrue). We don't provide a pessimistic estimate of cost-effectiveness for our top charities; we just get as close as possible, given a number of uncertain inputs, to what we think the actual cost per life saved is. At times, we do conduct sensitivity analyses for parameters that are particularly uncertain, or that will have a big impact on our bottom line, to figure out the most extreme value we could use while still getting a result that meets our cost-effectiveness bar.
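The sensitivity check described above — finding the most extreme value of one parameter that still meets a cost-effectiveness bar — can be illustrated with a toy model. The model, the usage-rate parameter, and the $9,000 bar below are hypothetical stand-ins chosen for the example, not GiveWell's actual model or threshold.

```python
# Toy sensitivity analysis: bisect for the most extreme value of one
# uncertain parameter at which a toy cost-per-life-saved model still
# meets a cost-effectiveness bar. Model and numbers are illustrative.

def cost_per_life_saved(usage_rate: float) -> float:
    """Toy model: fixed total cost, lives saved proportional to net usage."""
    total_cost = 1_000_000.0
    lives_saved_at_full_usage = 250.0
    return total_cost / (lives_saved_at_full_usage * usage_rate)

def most_extreme_usage_rate(bar: float, lo: float, hi: float,
                            tol: float = 1e-6) -> float:
    """Find (by bisection) the lowest usage rate whose cost estimate is
    still at or below the bar. Assumes cost falls as usage rises."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if cost_per_life_saved(mid) <= bar:
            hi = mid   # still meets the bar; try a more pessimistic value
        else:
            lo = mid   # too pessimistic; back off
    return hi

threshold = most_extreme_usage_rate(bar=9000.0, lo=0.01, hi=1.0)
print(f"Usage rate can fall to ~{threshold:.3f} before missing the bar")
```

In this toy model the cost works out to $4,000 / usage_rate, so the bar of $9,000 is crossed when usage falls below 4/9 ≈ 0.444. A real analysis would repeat this kind of check for each uncertain parameter separately.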

Coming up with a pessimistic estimate for each location in our analysis for, e.g., AMF, wouldn't be a very straightforward process. It would require figuring out which individual parameters in the analysis need to be adjusted upward or downward, and by how much. Starting with only the highest cost per life saved, for Chad, and adjusting from there would probably result in a significant overestimate for other locations. There are also some hard-coded inputs in our public cost-effectiveness analysis that would prevent it from updating correctly if you made a copy and adjusted key parameters.

We have at times included pessimistic or optimistic scenarios in our cost-effectiveness analyses for grants to other programs. For example, in this analysis of Evidence Action's program to support maternal syphilis screening and treatment in Liberia, we've included "pessimistic," "optimistic," and "best guess" figures (those highlighted in blue) for some of the values in the "Assumptions" tab. The parameters here aren't the same as those used in our top-charity models; this is just to illustrate how we came up with a pessimistic scenario in another instance.