In previous posts, I have:
- Laid out the view that in general, further economic development and general human empowerment are likely to be substantially net positive, and are likely to lead to improvement on many dimensions in unexpected ways.
- Listed possible global catastrophic risks that provide a potential counterpoint to this view, while also noting “global upside possibilities” in which progress could lead to a future that is far brighter than the present.
This post attempts to lay out my reasons for thinking that speeding the pace of global development and empowerment should be thought of as increasing humanity’s odds of an extremely bright future, relative to its odds of a future that is worse than the present. Note that
- I focus here on slightly to moderately speeding or slowing the pace of global development and empowerment relative to what it is today; this takes for granted that we can expect to see substantial development and empowerment in our future, and simply asks whether it is desirable that this development/empowerment happen more quickly or more slowly.
- I focus on the odds of an extremely bright future relative to the odds of a future that is worse than the present. This means that I’m not only considering the contribution of empowerment and development to catastrophic risk; I’m also considering their contribution to “global upside possibilities.”
1. Some catastrophic risks seem clearly reduced, and not exacerbated, by technological/economic progress. These include “non-anthropogenic” risks, such as asteroids, supervolcanoes, and non-engineered pandemics. Development may give us better tools for anticipating and responding to these risks, and is unlikely to make them worse. In addition, risks like #4 and #5 from the previous post on this topic – which involve risks of slowing growth due to shortage of a particular resource, or a slowdown in innovation – seem clearly mitigated by a faster pace of development.
2. Even for the catastrophic risks that seem exacerbated by development, I believe that faster development is likely safer than slower development (or, at worst, the net effect is highly ambiguous). This belief is based on the previously articulated concept of “global upside possibilities” – the belief that sufficient development may make the world not only better, but less at risk for major disruption by global catastrophe. If one accepts this view, it follows that faster overall development would mean less time between (a) the emergence of a given danger and (b) other developments that dramatically reduce risks. For example, faster development may bring the day closer when a highly dangerous synthetic pandemic can be designed, but it will also bring the day closer when we have the technologies and resources to manage such a risk (as well as potentially speeding the improvement of decision-making abilities and mental health worldwide, improving the capabilities of those who would mitigate such a risk and reducing the number of people who would contribute to it). Likewise, faster development may lead to higher carbon emissions, but is also likely to lead to better progress on alternative energy sources, more resources for adaptation mechanisms (much of the impact of climate change depends on these resources), and generally an environment more favorable to investing in climate change prevention.
There are certainly limitations to this reasoning. For one thing, it addresses “general” economic/technological development; the point remains that empowering people and developing technologies that are particularly likely to exacerbate risks can increase net risk, and that for any given risk there are particular kinds of growth that are more and less problematic in terms of that risk. (For example, the ideal scenario for dealing with climate change is one in which we see strong growth but also reduce carbon emissions.)
In addition, if there is a particular risk that has been clearly identified before it becomes technologically possible, and there is a promising plan for averting such a risk, it could be safer to experience slower development while the promising plan is executed. However, I know of no compelling examples of such dynamics today. (And in general, it is likely to be much easier to design a plan for responding to a risk when the risk is real and concrete rather than hypothetical.)
3. I believe that a large proportion of the risk of global catastrophe comes from the category of “risks that remain unarticulated and unimagined.” I don’t believe the list we made previously – or any list that can be constructed with today’s available information – is close to comprehensive: I expect that many of the most threatening risks are simply outside what we are able to anticipate today.
I would guess that some such risks become nearer as economic/technological development progresses, while some do not. But in all cases, I believe that economic/technological development is likely to improve our resources for anticipating, preventing and adapting to global catastrophes, and that for the reasons articulated above, faster development is more likely to reduce the lag between the emergence of risks and responses to them (including “global upside possibilities” that dramatically reduce risks).
4. A key part of my view is the belief that there are few outstanding cases in which it is clear that very particular actions need to be taken to avert particular risks. If there were a more compelling set of cases in which the right course of action were known, I would be more likely to believe that “slowing development until the right course of action can play out reduces risks, and generically speeding development increases them.” But as it is, I don’t see such clear-cut cases. The cases in which the necessary actions are clearest to me are those of asteroids (which I think is a clear-cut case in which development reduces risks) and climate change (which I see as highly ambiguous regarding the question of whether faster development is desirable, as discussed above). Thus, I don’t see a strong case for safety benefits to slower development.
I remain highly open to the possibility that particular risks represent excellent giving opportunities, and that focusing on them may do more good than simply focusing on increasing development and empowerment. But I am not aware of what I consider a strong case for believing that development in general increases the odds of a badly disrupted future relative to an extremely bright one, and I believe there are strong reasons to believe that development improves our prospects on net.
Comments
I think it’s true that effects of technological change now on *known* disasters like global warming and nuclear war are either ambiguous or positive. It’s also true that technology reduces non-anthropogenic risks.
But I think you are too optimistic in saying that faster TC is likely safer. For example, if we assume that people alive today are the prime consideration in determining “safety,” then this seems to imply that TC will likely reduce overall global catastrophic risk during our lifetimes. I’m not at all confident of that. Are you?
Also, while no doubt unintended, it appears that GiveWell’s recommendations are unusually good at *slowing* technological change: you transfer money to economies where no research is taking place, from first world economies performing most research. I don’t know a better legal method of increasing development while slowing TC.
The most important dimension of risk reduction is empowering people at individual, family and societal levels with needed inputs of information, education and practical training to develop SAFETY CULTURE and resilience. This has to be further strengthened with technological development. Risk reduction is everyone’s responsibility. Technological change, development and disaster risk reduction are interwoven. These have to be taken care of with inclusive planning. -COLONEL NAGAR M VERMA, DIRECTOR GENERAL, SARITSA FOUNDATION.
Alexander, I don’t follow your second paragraph, could you elaborate? Note that I haven’t claimed that development will reduce risk during our lifetimes; I’ve distinguished between discussing “development relative to no development” and “faster development relative to slower development, given a relatively narrow range of development speed” and explicitly focused on the latter. It can simultaneously be the case that we would be safer if development halted now, and that faster development is safer than slower development.
Holden, do you have any thoughts about how economic development interacts with resource scarcity? It seems like bringing up the developing world to developed standards would entail both an enormous increase in human welfare, as well as an enormous drain on resources. Are you assuming that more people at a higher level of development would be able to harness more total resources? Or that any increased costs are more than offset by the humanitarian good? Do you have any concerns that the resulting scarcity could put economic or political pressure on an already unstable situation?
It seems like furthering technology in the developed world has been the primary driver behind increased efficiency in the use of existing resources. Look at how cellphone technology enabled the developing world to skip a huge amount of infrastructure, for instance. Pushing ahead technological progress in the developed world could thus have massive spillover effects to the developing world, and make more development possible on a limited budget.
William: We agree that increasing the standard of living in low income countries would likely entail net increases in resource consumption; we believe that the humanitarian benefits significantly outweigh any downsides to that increase. It strikes us as possible but unlikely that additional scarcity due to accelerated economic development would have sufficient magnitude to cause additional instability.
I generally agree that improved technology has played a crucial role in increasing the efficiency of resource use, and that further technology development could likely further efficiency.
My thought is that any spending inside an economy stimulates more spending in that economy, at least if you’re not running at full employment.
http://en.wikipedia.org/wiki/Keynesian_multiplier
Also, it’s mostly the first world that is conducting scientific research, and certainly almost no research is happening inside low-income countries:
http://arxiv.org/ftp/arxiv/papers/0911/0911.1042.pdf
So anybody transferring their money to Africa through GiveDirectly, say, also gives the multiplier effects of spending that money to Africa. At least some of those effects in a first-world economy would probably aid research. In Africa, probably not. Controlling where those effects go is pretty hard, but one of the few ways I can think of is ejecting your money from the national economy, since national economies are fairly self-contained.
Now if your goal was simply to not have any multiplier effects, you could I suppose save your money indefinitely, or burn it. But donating to a poor country has the additional effect of furthering development in that country.
So if you viewed technological change as largely bad but human development as good–which I guess is what primitivists think–then this might be pretty close to ideal.
I’m not sure how to calculate how big these effects are. Maybe a rule of thumb is that ripples are evenly distributed across sectors according to their percentage of GDP. You would also have to know how big the multiplier was. Maybe such a calculation would reassure your donors, or satisfy your curiosity, that opportunity costs from lost ripple effects on domestic research were not high.
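The calculation sketched above can be made concrete. This is a hypothetical back-of-the-envelope illustration only, not GiveWell's analysis: the marginal propensity to consume (MPC) of 0.8 and the 3% research share of GDP are assumed figures chosen for the example, and the "ripples spread in proportion to GDP share" rule is the commenter's own rough heuristic.

```python
# Hypothetical back-of-the-envelope sketch of the multiplier argument.
# With marginal propensity to consume c, an injection S generates total
# spending of S + c*S + c^2*S + ... = S / (1 - c)  (the Keynesian multiplier).

def total_spending(injection: float, mpc: float) -> float:
    """Total spending generated by an injection, assuming a constant MPC."""
    if not 0.0 <= mpc < 1.0:
        raise ValueError("MPC must be in [0, 1)")
    return injection / (1.0 - mpc)

def research_ripple(injection: float, mpc: float, research_share: float) -> float:
    """Induced (ripple) spending attributed to research, using the rule of
    thumb that ripples spread across sectors in proportion to GDP share."""
    ripples = total_spending(injection, mpc) - injection  # induced spending only
    return ripples * research_share

# Assumed figures: MPC of 0.8, research at 3% of GDP. A $1,000 donation kept
# in a first-world economy would then induce $4,000 of ripple spending, of
# which roughly $120 would land in research.
print(research_ripple(1000.0, 0.8, 0.03))  # ≈ 120.0
```

On these assumptions the forgone research spending from redirecting $1,000 abroad is on the order of $100, which is the kind of figure the suggested calculation would produce.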
From my own perspective it would be interesting to know how to encourage some technologies and discourage others. But this is a different goal.
Regarding your last statement on faster and slower development: if there is a narrow risky period that can merely be delayed and not eliminated, time discounting will presumably favor delay. So if there is some one-year period where the world has a 50% chance of blowing up, other things being equal I would like that period to happen when I am 41 rather than 40.
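The discounting point above can be written out as a small worked example. This is a hypothetical sketch of the commenter's reasoning, not a claim from the post; the 3% annual discount rate is an assumed figure.

```python
# Hypothetical sketch: delaying a fixed risky period by one year lowers its
# present discounted cost by a factor of 1/(1+r), even though the risk
# itself is unchanged.

def discounted_risk_cost(prob: float, loss: float, years_away: float, rate: float) -> float:
    """Present value of the expected loss from a risky period `years_away` years out."""
    return prob * loss / (1.0 + rate) ** years_away

# Assumed 3% discount rate. A 50% chance of total loss (normalized to 1.0)
# this year versus the same risk delayed by one year:
now = discounted_risk_cost(0.5, 1.0, 0, 0.03)      # risk hits at age 40
delayed = discounted_risk_cost(0.5, 1.0, 1, 0.03)  # same risk at age 41
print(now, delayed)  # the delayed risk has the lower present cost
```

Under pure time discounting the delayed period is always preferred, which is the commenter's point: delay has value even when the risk cannot be eliminated.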
Alexander Gabriel:
“This belief is based on the previously articulated concept of “global upside possibilities” – the belief that sufficient development may make the world not only better, but less at risk for major disruption by global catastrophe. If one accepts this view, it follows that faster overall development would mean less time between (a) the emergence of a given danger and (b) other developments that dramatically reduce risks.”
Could you outline how/why you expect the likely risks from development that aren’t climate change (#1 and #7 from your previous list) to be effectively mitigated? Looking at the only previous potential catastrophic risk in this category, nuclear weapons, it doesn’t look like there has been a long-term decrease in risk:
http://s1.ibtimes.com/sites/www.ibtimes.com/files/styles/v2_article_large/public/2013/01/15/doomsday-clock-graph.png
Alex, the Doomsday Clock you point to implies (via Wikipedia) that the danger of nuclear global disaster was the same in 1980 (“President Jimmy Carter pulls the United States from the 1980 Summer Olympic Games in Moscow and considers ways in which the United States could win a nuclear war”) and in 2002 (before climate change was added to the factors affecting the clock). That isn’t plausible to me, and I’m inclined to discount the Clock as an indicator because of it. (Note that our shallow writeup on nuclear security cites a different report coming to the conclusion that we are in fact safer on this front today.) I believe that the risk of global nuclear catastrophe has fallen significantly and that general empowerment/development deserves a good deal of credit for this.
Regarding risks #1 and #7 from the previous list – two “global upside possibilities” that could substantially reduce these risks would be colonization of other planets and major widespread improvements in intelligence and altruism. That’s a reason to believe that faster development is safer (which is distinct from believing that some development is safer than none).