
July 02, 2008

Comments

You can follow this conversation by subscribing to the comment feed for this post.

Tom Craver

A bit off topic, but perhaps related to fast change and risks: have you been reading IEEE Spectrum's articles on the Singularity?

I was a bit put off by the snide "rapture of the nerds" attitude; it reminded me strongly of the early attitudes toward nanotech.

I figure an AI singularity is more likely to kill me than "rapture" me. I've got a better chance at some amount of life extension from the slow transition of biomedical scientists away from their historical fear of ridicule for thinking seriously about helping people live longer, and MAYBE from life extension research eventually accelerating once that attitude is well and truly dead.

Another interesting point was raised by Robin Hanson: "Even so extraordinary an innovation as radical nanotechnology would do no more than dramatically lower the cost of capital for manufacturing, which now makes up less than 10 percent of U.S. GDP."

He meant to imply that AI will ultimately be far more transformative than nanotech, which would be true. But he also seems to imply that nanotech might accelerate the economy by only around 10%, which I think is flat-out wrong, even if the total amount of production did not increase. Kill off that "10%" of primary production, and you'd see most of the rest of the economy collapse like dominoes.

John B

The cited paper presupposes a high degree of surveillance everywhere, one that never gives false positives and cannot be spoofed. *shrug* Your pardon, but it strikes me as a fantasy. If it helps you sleep better, I hope I didn't stir up nightmares.

The comments to this entry are closed.