
Astronomical Waste: The Opportunity Cost of Delayed Technological Development

Published online by Cambridge University Press:  26 January 2009

Nick Bostrom
Affiliation:
University of Oxford, nick@nickbostrom.com

Abstract

With very advanced technology, a very large population of people living happy lives could be sustained in the accessible region of the universe. For every year that development of such technologies and colonization of the universe is delayed, there is therefore a corresponding opportunity cost: a potential good, lives worth living, is not being realized. Given some plausible assumptions, this cost is extremely large. However, the lesson for standard utilitarians is not that we ought to maximize the pace of technological development, but rather that we ought to maximize its safety, i.e. the probability that colonization will eventually occur. This goal has such high utility that standard utilitarians ought to focus all their efforts on it. Utilitarians of a ‘person-affecting’ stripe should accept a modified version of this conclusion. Some mixed ethical views, which combine utilitarian considerations with other criteria, will also be committed to a similar bottom line.

Copyright
Copyright © Cambridge University Press 2003


References

1 Cirkovic, M., ‘Cosmological Forecast and its Practical Significance’, Journal of Evolution and Technology, xii (2002), http://www.jetpress.org/volume12/CosmologicalForecast.pdf

2 Drexler, K. E., Nanosystems: Molecular Machinery, Manufacturing, and Computation, New York, John Wiley &amp; Sons, Inc., 1992.

3 Bradbury, R. J., ‘Matrioshka Brains’, Manuscript, 2002, http://www.aeiveos.com/~bradbury/MatrioshkaBrains/MatrioshkaBrains.html

4 Bostrom, N., ‘How Long Before Superintelligence?’, International Journal of Futures Studies, ii (1998); Kurzweil, R., The Age of Spiritual Machines: When Computers Exceed Human Intelligence, New York, 1999. The lower estimate is in Moravec, H., Robot: Mere Machine to Transcendent Mind, Oxford, 1999.

5 Bostrom, N., ‘Are You Living in a Simulation?’, Philosophical Quarterly, liii (2003). See also http://www.simulation-argument.com

6 The Virgo Supercluster contains only a small part of the colonizable resources in the universe, but it is sufficiently big to make the point. The bigger the region we consider, the less certain we can be that significant parts of it will not have been colonized by a civilization of non-terrestrial origin by the time we could get there.

7 Utilitarians commonly regard time-discounting as inappropriate in evaluating moral goods (see e.g. Brandt, R. B., Morality, Utilitarianism, and Rights, Cambridge, 1992, pp. 23 f.). However, it is not clear that utilitarians can avoid compromising on this principle in view of the possibility that our actions could conceivably have consequences for an infinite number of persons (a possibility that we set aside for the purposes of this paper).

8 Bostrom, N., ‘Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards’, Journal of Evolution and Technology, ix (2002), http://www.jetpress.org/volume9/risks.html

9 This formulation of the position is not necessarily the best possible one, but it is simple and will serve for the purposes of this paper.

10 Or whatever the population is likely to be at the time when doomsday would occur.

11 See e.g. Vinge, V., ‘The Coming Technological Singularity’, Whole Earth Review, winter issue (1993).

12 Freitas, R. A. Jr, Nanomedicine, vol. 1, Georgetown, Landes Bioscience, 1999.

13 E.g. Moravec, Kurzweil, and Vinge; Drexler, E., Engines of Creation, New York, 1986.

14 On the person-affecting view, the relevant factor is not the temporal location of a person per se. In principle, effects on the well-being of a past or a future person could also be an appropriate target for moral concern. In practice, however, the effect on the well-being of past persons is likely to be relatively small or even zero (depending on which conception of well-being one adopts). Moreover, any action that has a significant effect on the future is very likely to be such that competing alternative actions would lead to separate sets of possible persons coming into existence, for example as a result of different sperm fertilizing different eggs; there would thus typically be no future persons whose level of well-being is affected by our current actions. Our actions might affect which future persons there will be, but on the person-affecting view, such results provide no general moral reasons for action. One exception is fetuses and fertilized eggs, which are not yet persons but may become persons in the future, and whose lives our current actions may influence.

15 I'm grateful for comments from John Broome, Milan Cirkovic, Roger Crisp, Robin Hanson, and James Lenman, and for the financial support of a British Academy Postdoctoral Award.