A thread on Slashdot re Kurzweil's The Singularity is Near asks us to...
Rewind your brain 15 years and imagine what you'd think if I told you:
- Your computer will be roughly 1,000 times faster than what you're using today. You will probably have more than 4,000 times the memory, and a fast hard drive that stores over 100,000 times as much as that floppy you're using. You can buy these supercomputers for less than $500 at Wal-Mart.
- That computer will be hooked into a self-directed network that was designed by the Department of Defense and various universities -- along with nearly 400,000,000 other machines. Your connection to this network will be 10,000 times faster than the 300 baud modem you're using. In fact, it will be fast enough to download high-quality sound and video files in better than realtime.
- There will be a good chance that your computer's operating system will have been written by a global team of volunteers, some of them paid by their employers to implement specific parts. Free copies of this system will be available for download over the hyperfast network. You will have free access to the tools required to make your own changes, should you want to.
- You will use this mind-bendingly powerful system to view corporate-sponsored, community-driven message boards where people will bitch about having to drive cars that are almost unimaginably luxurious compared to what you have today.
Remember: in some fields, the singularity has already happened.
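As a quick back-of-the-envelope check, here is a minimal Python sketch of what those multipliers work out to in absolute terms, assuming the floppy in question is a standard 1.44 MB disk (the only figure not named in the post):

```python
# Expand the quoted multipliers into absolute figures.
floppy_mb = 1.44     # assumed: a standard 3.5" floppy
modem_baud = 300     # the "300 baud modem" named in the post

disk_gb = floppy_mb * 100_000 / 1024    # "100,000 times as much as that floppy"
link_mbps = modem_baud * 10_000 / 1e6   # "10,000 times faster" connection (baud ~ bps here)

print(f"implied hard drive: ~{disk_gb:.0f} GB")      # ~141 GB
print(f"implied connection: ~{link_mbps:.0f} Mbps")  # ~3 Mbps
```

Roughly 140 GB of disk and a 3 Mbps connection -- plausible figures for a cheap 2005 machine, provided you accept the 300 baud / floppy-only starting point.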
Nicely put!
Mike Treder
Very nicely put! And exceedingly amusing - think I'll read the whole thing.
Posted by: Janessa Ravenwood | October 03, 2005 at 09:40 PM
Unfortunately, the final sentence shows the poster doesn't quite grok the vastness and current incomprehensibility of what a full-blown Singularity is. What he describes is a tiny warmup.
Posted by: Brian Atkins | October 03, 2005 at 09:56 PM
Well, yes, but I appreciate the general sentiment. Though I see that his numbers are also challenged on the thread.
Posted by: Janessa Ravenwood | October 03, 2005 at 10:04 PM
15 yrs ago. 1990.
In 1990, Tim Berners-Lee at CERN proposed a 'hypertext' system -- the first step toward the Web as we know it today. The Internet had already been commonly used for email at universities for two decades. Gopher was introduced in 1990. HyperCard had been out on the Mac since 1987.
1200 bps modems had been out since 1985.
Bulletin board systems were around and fairly popular (about 9,000 FidoNet nodes in 1990).
Microsoft released Windows 3.0, a completely new version of Microsoft Windows. The Intel 486 had been out for a year.
There were 10 MB, 20 MB, 40 MB, and 80 MB hard drives. (That 486 box cost about $2,000-$4,000.)
(CD-ROM and the WWW came out in 1991.)
Having lived through it, it's kind of surprising how little has changed. We're still using Microsoft Word, etc. I kind of thought that a 1,000-10,000X boost in speed and capacity would have done more.
More user-friendliness, wider spread, more business productivity. Real gains in societal interaction from the Internet.
More affordable luxury. More people can afford big-screen televisions.
Speed by itself is often less useful or dramatic than a revolutionary interface or revolutionary transportation. That is, until you pass some quantitative change that enables revolutionary qualitative changes -- e.g., if we were to nail human-to-computer voice communication or the mind/machine communication noted in a recent article on this site.
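To make the comparison concrete, here is a minimal sketch of the actual multipliers from the 1990 figures above to a typical 2005 consumer PC. The 2005 numbers (and the 4 MB RAM and 25 MHz clock for the 1990 box) are assumptions for illustration, not measurements:

```python
# Rough 1990 -> 2005 multipliers; 2005 figures are assumed typical values.
baseline_1990 = {"clock_mhz": 25, "ram_mb": 4, "disk_mb": 80, "modem_bps": 1200}
assumed_2005 = {"clock_mhz": 3000, "ram_mb": 512, "disk_mb": 160_000, "modem_bps": 3_000_000}

for key in baseline_1990:
    print(f"{key}: ~{assumed_2005[key] / baseline_1990[key]:,.0f}x")
# clock ~120x, RAM ~128x, disk ~2,000x, bandwidth ~2,500x.
# How these stack up against the post's 1,000x / 4,000x / 10,000x claims
# depends heavily on the baseline chosen (and clock speed alone
# understates real CPU gains).
```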
Posted by: Brian Wang | October 04, 2005 at 10:03 AM
For the Singularity and strong AI, Kurzweil's approach (similar to Aubrey de Grey's approach to conquering aging) is to use brute-force engineering to side-step time-consuming science. Get to AI by reverse-engineering the whole brain. Mimic all of the neurons. There is still plenty of science involved, but the problems are decomposed into smaller bits.
This seems to be a fundamental difference between those putting forth predictions of rapid technological advance and some of the more moderate predictors. We do not have to figure it all out before we undertake an audaciously scaled project. We can make it work by getting certain steps to function properly and then repeating them.
I think we will not know for sure whether the proposed shortcuts can fully and successfully reverse-engineer the brain. But the attempts look like a promising direction, and one that should deliver useful capability and insightful science.
Incremental science is useful, but taking a more disruptive approach can produce more rapid advances in capability. More risk-taking and willingness to fail. Entrepreneurial approaches to science and research.
An error from the '60s was when McNamara (Kennedy's secretary of defense) stopped the X projects for planes and rockets, because they made for unpredictable advances in capability...it was messing with his arms control talks.
We have Moore's Law, but we can do better. We just have to try more disruptive plans and approaches. What we have now (tech-wise) is not good enough.
Space, the Singularity, immortality...will take forever with approaches that are not disruptive. Someone else with a disruptive plan will get there first.
Posted by: Brian Wang | October 04, 2005 at 11:53 AM
In 1985, I joined a company that aimed at a "dream personal computer" - CD-ROM storage, realistic images and animation, full motion video with audio, interactive 3D graphics. It took only about 10 years for all that to become commonplace. So even if the article's numbers were off by about 5 years, the core of the message was right.
However, we appear to be at the top end of the semiconductor S-curve, as chips are hitting a thermal wall. We will squeeze out another 5-10 years of performance improvements via multiprocessing, big chip areas, and other tricks -- but multiprocessing still hits thermal limits (or economic chip-area limits) eventually, and shrinking the circuits is yielding less and less benefit. A new technology (MNT??) will likely come along to re-accelerate growth, but maybe only after a period of more modest or even flat improvement.
I worry that the motivator for developing the next surge of technology might be War. I keep catching glimpses of a dense black pyroclastic cloud of trouble hanging over our near future -- 2010-2030. I really hope that cloud dissipates before we get there; I'd love to be proven wrong.
Posted by: Tom Craver | October 04, 2005 at 06:53 PM