We're Underestimating the Risk of Human Extinction
Unthinkable as it may be, humanity, every last person, could someday be wiped from the face of the Earth. We have learned to worry about asteroids and supervolcanoes, but the more likely scenario, according to Nick Bostrom, a professor of philosophy at Oxford, is that we humans will destroy ourselves. Bostrom, who directs Oxford's Future of Humanity Institute, has argued over the course of several papers that human extinction risks are poorly understood and, worse still, severely underestimated by society. Some of these existential risks are fairly well known, especially the natural ones. But others are obscure or even exotic. Most worrying to Bostrom is the subset of existential risks that arise from human technology, a subset he expects to grow in number and potency over the next century. Despite his concerns about the risks that technological progress poses to humans, Bostrom is no Luddite. Of course, there are also existential risks that are not extinction risks. How so?


Science and Nonduality In this article, standup philosopher Tim Freke articulates the nature of 'paralogical' thinking, which is the foundation of the philosophy and practices he shares to guide people to a 'deep awake' state. The need for paralogical thinking arises from an important insight: life is profoundly paradoxical. I've already mentioned in passing the paradox that on the surface of life we live in a world of separate things, yet at the depths all is one. At first, such spiritual paradoxes can sound like mystical mumbo jumbo. So I want to ground our discussion of paradox in the empirical discoveries of hard-nosed science, before using paralogical thinking to cast new light on the insights of spirituality.

9 Ways Humanity Could Bring About Its Own Destruction It all started with David Chalmers's various thought experiments concerning qualia and something called philosophical zombies: a p-zombie is essentially a creature that in all ways appears to be conscious, but isn't really conscious. Chalmers proposed a thought experiment in which the inputs and outputs of a growing number of neurons in a person's skull are gradually routed into a growing collection of artificial neurons. The person is given a switch that allows them to toggle back and forth between their natural neurons and the synthetic ones. The person can then try switching at any level of replacement: less than 1 percent, 10 percent, 20 percent, 50, 70, 90, whatever.

#5: Stephen Hawking's Warning: Abandon Earth—Or Face Extinction by Big Think Editors Let's face it: the planet is heating up, Earth's population is expanding at an exponential rate, and the natural resources vital to our survival are running out faster than we can replace them with sustainable alternatives. Even if the human race manages not to push itself to the brink of nuclear extinction, it is still a foregone conclusion that our aging sun will expand and swallow the Earth in roughly 7.6 billion years. So, according to famed theoretical physicist Stephen Hawking, it's time to free ourselves from Mother Earth. "I believe that the long-term future of the human race must be in space," Hawking tells Big Think.

10 Mind-Blowing Theories That Will Change Your Perception Of The World by Anna LeMind Reality is not as obvious and simple as we like to think. Some of the things that we accept as true at face value are notoriously wrong.

Transhumanism: The Most Dangerous Idea? "What ideas, if embraced, would pose the greatest threat to the welfare of humanity?" That question was posed to eight prominent policy intellectuals by the editors of Foreign Policy in its September/October issue (not yet available online). One of the eight savants consulted was Francis Fukuyama, professor of international political economy at Johns Hopkins School of Advanced International Studies, author of Our Posthuman Future: Consequences of the Biotechnology Revolution, and a member of the President's Council on Bioethics. His choice for the world's most dangerous idea? Transhumanism.

Humanity Is Getting Verrrrrrry Close to Extinction If you were to take a comparative look back at our planet during the 1950s from some sort of cosmic time-traveling orbiter cube, you would probably first notice that millions of pieces of space trash had disappeared from orbit. The moon would appear six and a half feet closer to Earth, and the continents of Europe and North America would be four feet closer together.

The Feminist Theory of Simone de Beauvoir Explained with 8-Bit Video Games (and More) Simone de Beauvoir, existentialist philosopher, feminist theorist, author of The Second Sex, whose birthday we celebrate today. Metroid, an action-adventure video game designed for the Nintendo in 1986. At first glance, they're not an obvious pairing. But in 8-Bit Philosophy, a web series that explains philosophical concepts by way of vintage video games, things kind of hang together.

Bostrom Responds to Fukuyama's Assertion that Transhumanism is World's Most Dangerous Idea Nick Bostrom (Sept 10, 2004) "What idea, if embraced, would pose the greatest threat to the welfare of humanity?" This was the question posed by the editors of Foreign Policy in the September/October issue to eight prominent policy intellectuals, among them Francis Fukuyama, professor of international political economy at Johns Hopkins School of Advanced International Studies, and member of the President's Council on Bioethics.

Possible Ways Humans Could Go Extinct In The Near Future One day, there will come a time when humans no longer walk the planet. A morbid thought, I know, but a realistic one. We face extinction at some point, and every day we live, we race faster toward that inevitable end. Scientists have theorized about how and when it will happen, and here is a compendium of some of the more likely theories.