

October 03, 2005



Janessa Ravenwood

Very nicely put! And exceedingly amusing - think I'll read the whole thing.

Brian Atkins

Unfortunately the final sentence shows the poster doesn't quite grok fully the vastness and current incomprehensibility of what a full blown Singularity is. What he describes is a tiny warmup.

Janessa Ravenwood

Well, yes, but I appreciate the general sentiment. Though I see that his numbers are also challenged on the thread.

Brian Wang

15 years ago: 1990.
In 1990 Tim Berners-Lee at CERN proposed a 'hypertext' system, the start of the Web as we know it today. The Internet itself had already been commonly used for email at universities for two decades. Gopher was introduced in 1990. HyperCard had been out on the Mac since 1987.
1200 bps modems had been out since 1985.
Bulletin board systems were around and fairly popular (9,000 FidoNet nodes in 1990).
Microsoft released Windows 3.0, a completely new version of Microsoft Windows. The Intel 486 had been out for 1 year.
There were 10MB, 20MB, 40MB and 80MB hard drives. (That 486 box cost about $2000-4000.)
(CD-ROM drives and the WWW itself arrived in 1991.)

Having lived through it...it's kind of surprising how little has changed. We're still using Microsoft Word, etc. I kind of thought that a 1000-10,000X boost in speed and capacity would have done more.
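For what it's worth, the 1000-10,000X figure is roughly what steady Moore's-law compounding predicts over that span; a minimal back-of-the-envelope sketch, assuming doubling periods between 15 and 24 months (the exact period is an assumption, not a measurement):

```python
# Compound improvement factor after `years` of steady doubling
# every `doubling_period` years (Moore's-law-style growth).
def moores_law_factor(years, doubling_period):
    return 2 ** (years / doubling_period)

# The 15 years from 1990 to 2005, for a few assumed doubling periods:
for period in (1.25, 1.5, 2.0):
    factor = moores_law_factor(15, period)
    print(f"doubling every {period} years -> ~{factor:,.0f}x over 15 years")
```

An 18-month doubling gives about 1,024x and a 15-month doubling about 4,096x, so the 1000-10,000X range corresponds to the faster end of historical doubling estimates.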

More user friendliness, wider spread, more business productivity. Real gains in societal interaction from the Internet.

More affordable luxury. More people can afford big screen televisions.

Speed by itself is often less useful or dramatic than a revolutionary interface or revolutionary transportation. That is, until you pass some quantitative threshold that enables revolutionary qualitative changes, e.g. if we were to nail human-to-computer voice communication or the mind/machine communication noted in a recent article on this site.

Brian Wang

For the Singularity and strong AI, Kurzweil's approach (similar to Aubrey de Grey's approach to conquering aging) is to use brute-force engineering to side-step time-consuming science. Get to AI by reverse engineering the whole brain. Mimic all of the neurons. There is still plenty of science involved, but the problems are decomposed into smaller bits.

This seems to be a fundamental difference between those putting forth rapid technological advance predictions and some of the moderate predictors. We do not have to figure it all out before we undertake an audacious-scale project. We can make it work by getting certain steps to function properly and then repeating them.

I think we will not know for sure whether the proposed shortcuts can fully reverse engineer the brain. The attempts look like a promising direction, and one that should deliver useful capability and insightful science.

Incremental science is useful, but taking a more disruptive approach can yield more rapid advances in capability: more risk-taking and willingness to fail, entrepreneurial approaches to science and research.

An error from the '60s was when McNamara (Kennedy's secretary of defense) stopped the X projects for planes and rockets, because they made for unpredictable advances in capability...it was messing with his arms control talks.

We have Moore's law but we can do better. We just have to try more disruptive plans and approaches. What we have now (tech wise) is not good enough.

Space, Singularity, immortality ...will take forever with approaches that are not disruptive. Someone else with a disruptive plan will get there first.

Tom Craver

In 1985, I joined a company that aimed at a "dream personal computer" - CD-ROM storage, realistic images and animation, full motion video with audio, interactive 3D graphics. It took only about 10 years for all that to become commonplace. So even if the article's numbers were off by about 5 years, the core of the message was right.

However, we appear to be at the top end of the semiconductor S-curve, as chips are hitting a thermal wall. We will squeeze out another 5-10 years of performance improvements via multiprocessing, larger chip areas, and other tricks - but multiprocessing still hits thermal limits (or economic chip-area limits) eventually, and shrinking the circuits is yielding less and less benefit. A new technology (MNT??) will likely come along to re-accelerate growth, but maybe only after a period of more modest or even flat improvement.

I worry that the motivator for developing the next surge of technology might be war. I keep catching glimpses of a dense black pyroclastic cloud of trouble hanging over our near future - 2010-2030. I really hope that cloud dissipates before we get there - I'd love to be proven wrong.

The comments to this entry are closed.