Existential risk

Far future and global catastrophic risks

Nick Bostrom

Nick Bostrom (born Niklas Boström, 10 March 1973)[1] is a Swedish philosopher at St Cross College, University of Oxford, known for his work on existential risk, the anthropic principle, human enhancement ethics, superintelligence risks, the reversal test, and consequentialism. He holds a PhD from the London School of Economics (2000). He is the founding director of both the Future of Humanity Institute[2] and the Oxford Martin Programme on the Impacts of Future Technology, part of the Oxford Martin School at Oxford University.[3]

Early life and education

Bostrom was born in 1973 in Helsingborg, Sweden.[5] He pursued postgraduate studies in theoretical physics and philosophy at Stockholm University, and in computational neuroscience at King’s College London.

Anthropic reasoning

Bostrom has published numerous articles on anthropic reasoning, as well as the book Anthropic Bias: Observation Selection Effects in Science and Philosophy.

Human extinction

Human extinction is the hypothesized end of the human species. Various scenarios have been discussed in science, popular culture, and religion (see End time). The scope of this article is existential risks. Humans are very widespread on the Earth and live in communities that, while interconnected, are capable of some level of basic survival in isolation. Therefore, pandemics and deliberate killing aside, to achieve human extinction the entire planet would have to be rendered uninhabitable, with no opportunity for humans to establish a foothold beyond Earth. This would typically occur during a mass extinction event, a precedent for which exists in the Permian–Triassic extinction event, among other examples.

Severe forms of known or recorded disasters

U.S. officials assess that an engineered pathogen capable of "wiping out all of humanity" if left unchecked is technically feasible, and that the technical obstacles are "trivial".

Potential global catastrophic risk focus areas

Updated Sept. 12, 2014 to change “GiveWell Labs” to “Open Philanthropy Project,” in line with our August 2014 announcement. Throughout the post, “we” refers to GiveWell and Good Ventures, who work as partners on the Open Philanthropy Project.

This post draws substantially on our recent updates on our investigation of policy-oriented philanthropy, including using much of the same language. As part of our work on the Open Philanthropy Project, we’ve been exploring the possibility of getting involved in efforts to ameliorate potential global catastrophic risks (GCRs), by which we mean risks that could be bad enough to change the very long-term trajectory of humanity in a less favorable direction (ranging, for example, from a dramatic slowdown in the improvement of global standards of living to the end of industrial civilization or human extinction).

Why global catastrophic risks? Importance. By definition, if a global catastrophe were to occur, the impact would be devastating.

Global catastrophic risk

A global catastrophic risk is a hypothetical future event with the potential to inflict serious damage to human well-being on a global scale.[2] Some such events could destroy or cripple modern civilization. Other, even more severe, scenarios threaten permanent human extinction.[3] These are referred to as existential risks. Natural disasters, such as supervolcanoes and asteroids, pose such risks if sufficiently powerful. Human-caused, or anthropogenic, events could also threaten the survival of intelligent life on Earth; such events could include catastrophic global warming,[4] nuclear war, or bioterrorism. The Future of Humanity Institute believes that human extinction is more likely to result from anthropogenic causes than from natural causes.[5][6]

Potential Global Catastrophic Risk Focus Areas

As we've written previously, Good Ventures is working with GiveWell to research potential focus areas. This joint venture, known as the Open Philanthropy Project (previously known as GiveWell Labs), is currently our primary strategy for learning and the main input into our grant decisions. We're cross-posting blog entries by GiveWell staff related to the Project to give readers of the Good Ventures blog a fuller picture of our progress. Throughout the post, “we” refers to GiveWell and Good Ventures, who work as partners on the Open Philanthropy Project. This post draws substantially on our recent updates on our investigation of policy-oriented philanthropy, including using much of the same language.

In our annual plan for 2014, we set a stretch goal of making substantial commitments to causes within global catastrophic risks by the end of this calendar year.