Most of us are content to just worry about the future of humanity in our spare time, but there's an entire group of academics at Oxford University in England who make that their professional mission.
Each member of the Future of Humanity Institute has his own focus. Some are concerned with climate change and its impact on humanity; others with the future of human cognition. Department head Nick Bostrom, whose paper "Existential Risk Prevention as Global Priority" has just been published, has a long history of worrying about our future as a species. Bostrom posits that humanity is the greatest threat to humanity's survival.
Bostrom's paper is concerned with a particular time-scale: Can humanity survive the next century? This rules out some of the more unlikely natural scenarios that could snuff out humans in the more distant future: supervolcanoes, asteroid impacts, gamma-ray bursts and the like. The chances of one of those happening within such a narrow timeframe are, according to the paper, extremely small. Further, most other natural disasters, such as a pandemic, are unlikely to kill all humans; we as a species have survived many pandemics and will likely do so again.
According to Bostrom, the types of civilization-ending disasters we may unleash upon ourselves include nuclear holocausts, badly programmed superintelligent entities and, my personal favorite, "we are living in a simulation and it gets shut down." (As an aside, how the hell do you prepare for that eventuality?) Additionally, humans face four different categories of existential risk:
Extinction: we off ourselves before we reach technological maturity
Stagnation: we stay mired in our current technological and intellectual backwater
Flawed realization: we advance technologically...in a way that isn't sustainable
Subsequent ruination: we reach sustainable technological maturity and then eff it all up anyway
More pointedly, Bostrom's paper is a renewal of a call-to-arms he issued a decade ago imploring people to wake up to the possibility that we will kill ourselves with technology. These days, he's not so much concerned with the how -- existential death by grey goo vs existential death by sentient robots is still existential death. He's most concerned that there's nobody out there really doing anything about this problem. That's understandable, of course. Existential threats are nebulous concepts, and even the threat of nuclear winter was not enough to terrify certain governments into, you know, not building thermonuclear weapons.
Bostrom, of course, can't help but come off a bit high-handed when he laments that there are far too many papers in the academic literature on "less-important topics," such as "dung beetles" and "snowboarding." This is amusingly wrong-headed. If I've learned anything at all during my stint on this planet, it is that if a topic exists, somebody will make it their life's work to ask questions about it.
Like every SF fan out there, I love a good dystopian narrative, disaster scenario, and potential civilization-ending cataclysm. It makes for interesting drama and the exploration of the ethical implications of certain actions. If Bostrom and his team wish to kick-start a juggernaut of end-times scientific research by hiring smart young research assistants to help solve some of these problems, then I applaud their efforts and look forward to reading their scenarios. Let's hope that next time they can leave the pointless criticism of other legitimate avenues of research out of it.
1. Loss of the Earth's magnetic field would expose the planet to the sun's radiation, killing much of its life.
2. Humanity's industrial revolution will continue to drive global warming across the Earth and deplete the resources necessary for life.
3. World population will escalate, pushing industrialization further and using up the world's resources.
4. As necessary resources are depleted, wars over them will spread across the world.
5. With rising population growth comes a rise in the plagues that afflict humanity.
6. And there is always a rogue country with a nuclear bomb to be tossed about, and the country affected will respond with conventional or nuclear means a thousandfold or more (for example, the USA's response to the twin towers).
7. Industrial pollution grows along with the industrial engine and the population.
All these things are happening now and together threaten the Earth within 50 years.
The overall story is probably much, much grimmer than I know or have described.
If we on Earth are going to die, die a good and decent person, and do not allow the madness around you to corrupt you.
It's all worldview and eschatology.
Why worry about something that doesn't fit your worldview?
"we are living in a simulation and it gets shut down." (As an aside, how the hell do you prepare for that eventuality?)
We address the programmer through a built-in interface, without letting on that we intend to take over his/her world and enslave the populace.
um... maybe we could also define some limits for our own interactive games.
OK, comment #2:
If it costs money to study something and there are no backers, any study will be superficial. It's easy to say "yeah, we should have had a plan" after everybody's extinct, but without proof that it's actually going to happen, nobody's going to take it seriously (global warming excepted, of course).
The other side of that argument is that maybe the government or a university puts up a fund to study doom. There are so many ways to oblivion, and the time scale is usually longer than a lifetime, so the money gets spent on make-work studies just for the income. If the problem gets solved, the money stops.
This implies that a dedicated oversight committee should contract the studies and report the results to someone who cares (like the FDA or the Department of the Interior). Presumably there's an agency SOMEWHERE that will implement changes to avoid a worst-case scenario.
The holes in the argument are big enough to put a tanker truck through.
1. Technology is moving so fast that cures pop up and are often replaced by a better therapy before they hit the market.
2. 2011 was the most peaceful year in human history.
3. The internet is crippling regimes' ability to control their populations' information, and is largely responsible for the Arab Spring. There will likely be lasting peace and democracy in the Middle East in 20 years.
4. Optimists live 8% longer than pessimists.
5. (Afterthought) Westernization drops birth rates, and once funds shift from war to the industrialization of third-world countries, the population will level off at a lower level.
I would just like to point out that some of the comments have listed the growing population as a chief obstacle to our survival while also listing depopulating events like wars.
We have thousands of years of history to teach us that some of these problems, like war, famine, and climate change, take care of themselves by shrinking the population.
It's only a problem if you foolishly wish the future to look like the past. Yes, that'd be swell, but it's not going to happen.
We're all going to die. Eeeeeek!
And the really greatest threat? Humanity itself.