Next week I'll be traveling to Oxford, U.K., to make a presentation titled "Small Machines, Big Choices: The Looming Impacts of Molecular Manufacturing" at a Conference on Global Catastrophic Risks.
Yesterday, I was interviewed via phone by a producer for the Discovery Channel regarding an on-camera appearance in a program they are doing about "Doomsday Scenarios."
What's behind this interest in nano catastrophes?
Perhaps it's the same morbid human curiosity that leads people to rubberneck at traffic accidents, to watch horror movies, and to read true-crime books. But perhaps it's more than that. It may result from a growing recognition that change is coming fast, and that our ability to control the powerful potential of emerging technologies could be fatally compromised.
On the other hand, the idea of science and technology (read human curiosity) racing out of control is as old as the myths of Prometheus, Icarus, and Frankenstein.
Typically with nanotech, the primary worry is tiny voracious replicators, aka "grey goo." What's interesting is how this particular scare has hung on even though it was largely put aside years ago in favor of more imminent and potentially more dangerous risks.
When I initially took the call from the Discovery Channel producer, he said his main intent was to get information on the threat of grey goo, but by the time our discussion concluded he agreed it makes more sense to focus on other issues, especially the weaponization of advanced nanotechnology, and the possibility that molecular manufacturing might be used for planetary engineering projects that have disastrous unintended consequences.
I'll keep you updated as that conversation progresses.
A bit off topic, but perhaps related to fast change and risks: have you been reading IEEE Spectrum's articles on the Singularity?
I was a bit put off by the snide "rapture of the nerds" attitude - it reminded me strongly of the early attitudes toward nanotech.
I figure an AI singularity is more likely to kill me than "rapture" me. I've got a better chance at some amount of life extension from bio-medical scientists slowly shedding their historical fear of ridicule for thinking seriously about helping people live longer, and MAYBE from life extension research accelerating once that attitude is well and truly dead.
Another interesting point was raised by Robin Hanson: "Even so extraordinary an innovation as radical nanotechnology would do no more than dramatically lower the cost of capital for manufacturing, which now makes up less than 10 percent of U.S. GDP."
He meant to imply that AI will be far more transformative than nanotech, which may ultimately prove true. But he also seems to imply that nanotech could accelerate the economy by only around 10%, which I think is flat-out wrong, even if total production did not increase. Kill off that "10%" of primary production, and most of the rest of the economy would collapse like dominoes.
Posted by: Tom Craver | July 06, 2008 at 07:14 PM
The cited paper presupposes a high degree of surveillance everywhere that never gives false positives and cannot be spoofed. *shrug* Your pardon, but it strikes me as a fantasy. If it helps you sleep better, I hope I didn't stir up nightmares.
Posted by: John B | July 17, 2008 at 07:15 AM