

July 18, 2005



Tom Craver

We could get everything we hope for AND destroy ourselves...

Or get everything we hope for, and find that using it means the end of humanity as we would prefer it to be.

Or we get everything we hope for, but MUST use it in ways we'd prefer not to (e.g. becoming robots or uploads would be objectionable to most) in order to contain the dangers.


"Heaven, Hell or Houston" - a ZZ Top song; check it out.

As for the above: humanity can't survive for long in a purely agricultural existence, since that will develop into industrialism at least every couple of thousand years now. Humanity makes ever more technology because each piece of technology is an incomplete answer, so we end up on this (roughly) exponential curve, which reaches the point where only a transhuman can take in all the knowledge. A part of humanity either chooses to be a scientific society, or we perish; the choice is ours!


"Finally, there are those who have sufficient faith in human cussedness that they think we will be able to control our futures rather than be the pawns of technology — the "prevail" scenario."

So it is implied that the other two scenarios turn us into pawns? That we won't have any control left?

I don't understand this whole prevail scenario. We are obviously giving more and more control to our machines as it is. We've already lost a lot of control, and everything is going just fine.

Me personally, I have no problem with a superior AI running our world for us. We obviously can't do it ourselves.

I've written a Singularity FAQ for Dummies, and put it on my new blogspot. Besides answering questions that any newcomers might have, it also shows how I feel about the whole thing.



We can choose our future. Not necessarily that we will, but the option is there. In the various future environments that humans, or other sentient beings we give birth to, choose to live in, varying levels of safeguards will be needed against dangerous technologies.

In more inert environments, like a simple farming community, the minimum time to construct MM or AGI, or whatever the easiest dangerous technology is, will be much longer than the time required to effect these if you choose to ride the crest of the singularity wave. Your actions will be much more constrained if you choose the latter, as a few seconds unsupervised with post-singularity resources could allow you to cause serious damage to the universe.

But for individuals to decide at what level of reality they wish to exist, a certain level of public education will need to be mandatory for all individuals, even children born into Amish communities.

The comments to this entry are closed.