
December 06, 2007

Comments


Tom Craver

Well, they left out the top 3 risks faced by just about everyone in the world:

#1 cardiovascular disease
#2 cancer
#3 age-related disease

I ranked risks based on:
- danger: threat to the world,
- probability: likelihood of the threat being realized in the next ~50 years,
- urgency: how soon and how vigorously we'd need to act to prevent it.
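The three criteria above amount to a simple multi-criteria score. A minimal sketch in Python, with illustrative risk names and 1-10 scores that are assumptions for demonstration, not the commenter's actual numbers:

```python
def risk_score(danger, probability, urgency):
    """Combine the three factors into a single score (simple product)."""
    return danger * probability * urgency

# Hypothetical 1-10 scores: (danger, probability, urgency)
risks = {
    "cardiovascular disease": (6, 10, 8),
    "energy price increases": (7, 7, 9),
    "climate chaos": (8, 6, 7),
}

# Rank from highest combined score to lowest.
ranked = sorted(risks, key=lambda r: risk_score(*risks[r]), reverse=True)
print(ranked)
```

Dropping the urgency factor from the product (as the commenter does later for the 50-year view) would reorder the list accordingly.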

#4 Energy price increases. Taken to roughly mean "approaching and going over the oil peak". Worst cases seem to be major wars and economic collapse, which are fairly likely if we do nothing more to prepare. We have many ways to avoid that, but need to act quickly and vigorously.

#5 Iran/Pakistan nukes - Megadeath range, mainly due to the war their use would trigger. Fairly high chance Pakistan's bombs will get used - not just because radical Islamists may get them, but because of factionalism among the radicals. The opportunity to do something sensible there is slipping away fast.

#6 Climate chaos - the worst impact would likely be a megadeath famine. Somewhat likely to get that bad. Getting started soon is important due to difficulty, but doing the wrong things fast won't help.

#7 Extremism / radicalism - worst case, radical Islam unites a large fraction of Muslim nations, which might trigger a major war. Fairly small chance, but the opportunity to change policies and prevent it is slipping away fast.

#8, #9, #10 Pandemic, water shortages, environmental degradation.

If I leave out the immediacy factor - i.e., looking only at how dangerous things are likely to get over the next half century - Global Nuclear War (which I added) and AI rank high, simply because both are extinction-level risks that aren't highly improbable.

Nanotech came in fairly high under that approach due to the potential for gigadeaths, but still below energy and Pakistani nukes, because there's a good chance that the benefits and defensive capabilities of nanotech will counterbalance both the severity and the likelihood of the dangers.

The comments to this entry are closed.