1:05 PM
We're back from lunch and Max More is the first speaker of the afternoon. He's discussing the challenges humans face in making good group decisions and avoiding serious dangers. Max is now proposing that the "Proactionary Principle" can be useful in overcoming our built-in biases and limitations. (Recently the Extropy Institute, which Max co-founded back in the 1980s, dissolved. Now he is spending his energy promoting the ProP, as he calls it.)
1:25 PM
Christine Peterson of the Foresight Nanotech Institute is the next presenter. She says that to protect humanity's values as we move through a technological singularity, greater intelligence is not the only answer. She lists various methods for managing power, securing property rights, and ensuring "secure operating systems." She is describing "smart contracts" that supposedly could lead to "automated mutual defense."
This was a good talk by Christine with a lot of interesting ideas. I wish she had had more time and that there had been some audience interaction with her. Unfortunately, she finished her slot with some slightly distasteful advertising for Foresight (nobody else has done this kind of self-promotion).
1:36 PM
Taking the stage now is John Smart, president of the Acceleration Studies Foundation. He says he has 90 slides.
John says "intelligence is the driver of accelerating change." He is doing an overview of historical descriptions of accelerating change. You can tell that he is a teacher, because he's packing a lot of well-researched information into a short amount of time; so much information, however, that I can't possibly report it all here. But if you've seen him speak before, or if you've read his website, then you'll have a very good idea of what he's saying.
I love John's material, but he's got way too much for this limited time and format. This is, of course, one of the major weaknesses of an event like this one.
Yipes! He's just moved into the second of three parts of his talk, and he's already used 19 of his allotted 20 minutes... obviously, he had to cut it short and wrap up.
2:10 PM
Eliezer Yudkowsky, Research Fellow of the Singularity Institute for Artificial Intelligence (the main group that worked to make this event happen) is speaking now. He's starting his talk with a definition and description of what he calls the "intelligence explosion."
What is intelligence?
- "Book smarts" versus cognition.
- The most powerful force in the known universe.
- Not a complete mystery.
"You cannot trust your sense of how ridiculous things sound." -- A great quote from Eliezer. Interesting, funny, and insightful, he's always an entertaining speaker. I can't possibly keep up, so you may want to check out Dan Farber's review.
He finished with a typically beautiful, philosophical, even lyrical vision of the future of the human species.
Tags: nanotechnology nanotech nano science technology ethics weblog blog