2:50 PM
Bill McKibben, author of Enough, is speaking from a remote location via telepresence, so to speak. His image is projected on a small screen behind the other speakers. It's life-size, which I guess is supposed to make it seem as if he's here, but the problem is that many of us can't see the image very well. He's also not using a slide show, so basically we're just listening to a disembodied voice. To make matters worse, he is reading his talk, which tends to be harder to take in than extemporaneous speaking.
Bill's remarks are focused on his objection to "the immortalists." He says that instead of aiming to live forever, we should aim to live. That's a remarkably shallow platitude, in my view, as I see nothing wrong with vigorously pursuing both aims.
Now he is reporting that humans, as a whole, are less happy today than we were 50 years ago. He claims that the answer of technoprogressives is always "more is better." But I think that's (a) a straw man, and (b) a confusion of cause and correlation.
It's a straw man because virtually all of the futurist thinkers I know (and I know most of the leading ones) are just as interested in living now as they are in living forever, and just as connected to human interests and values as they are to technological goodies.
The confusion between cause and correlation comes from the suggestion that people as a whole are less happy today because of advanced technology. There seems to be no real evidence for this.
3:15 PM
OK, McKibben is done, and now Ray Kurzweil is back at the podium to give his thoughts on and responses to all the earlier speakers. I'm not going to try to keep up. He's covering a lot of points in a short time. Happily, you can look forward to a Singularity Summit podcast, which is supposed to be coming in a few days, and then a streaming video some time after that.
3:42 PM
The final segment of today's event is a panel discussion including all of the speakers (except Eric Drexler, who is no longer here) and an audience Q&A. The panel is being moderated by Steve Jurvetson and Peter Thiel.
Steve began by asking the panel: Could "uploading" a human mind be expected "to work" if it were attempted before we have a thorough understanding of how the brain works?
Ray's answer was a little hard to understand, but I think he said, first, that uploading is not really necessary for the technological singularity to occur, and second, a premature upload probably wouldn't be successful. But he emphasized that uploading is "a side issue."
Eliezer talked about the slow rate of change provided by evolution versus the rapid rate possible through design and development.
A note here about the problem with a huge panel (10 speakers) at the end of a long day: it tends to be unfocused and not very satisfying. I prefer to allow audience Q&A and/or speaker discussion periodically throughout an event rather than saving it all up for the end.
Peter Thiel asked: By what year will we have strong AI, and what is the probability that it will lead to a better world?
Ray says ~2029, and 90%. Wow, that's high!
Doug Hofstadter says human-level computer intelligence by ~2100, and who knows?
Nick says maybe 2040, and perhaps 50%.
Sebastian says 2101, and good in hindsight [laughter].
Max says it all depends on what we do.
John says around 2060, but it could be much sooner if our burgeoning computer networks -- mainly spurred by Google -- awaken spontaneously.
Audience question: Will strong AI emerge or be designed?
Eliezer says the question is too simple and anyway we don't know enough yet to provide a meaningful answer.
Ray says it's too complex to emerge spontaneously and probably will be designed.
Tyler Emerson asked Bill McKibben: How can we avoid existential risks?
He said: slow down, say enough is enough, relinquish some technologies. He added that he thinks the two most important inventions of the last few centuries are the concept of a protected wilderness and the concept of nonviolent resistance [applause, including from me].
In response to another audience question, Cory referred to the technological singularity as "the Rapture of the Nerds" (crediting Ken MacLeod), and got loud applause.
Somehow while I wasn't paying attention, the discussion turned to whether we will eventually have a world government. The consensus seemed to be Yes, although as to whether or not that will be a good thing, there was less agreement.
More discussion ensued on various topics until, finally, the event concluded at 5:35 PM. Whew, it was a long day!
Tags: nanotechnology nanotech nano science technology ethics weblog
You Said:
"The confusion between cause and correlation comes from the suggestion that people as a whole are less happy today because of advanced technology. There seems to be no real evidence for this."
You weren't listening very well to what Bill McKibben said. He did not state or imply that advanced technology has caused greater unhappiness. His main point was not about technology per se, but about community vs. hyper-individualism (which underlies US culture especially).
His primary point, as I heard it -- though Ray seemed to completely not get this at all -- is that more than just talking about technological progress, we need to ask and continue to ask the more fundamental question of "to what end?".
Implicit in Ray's vision is the assumption, which he nowhere discusses explicitly, that more technology is inevitably better, and that happiness and human flourishing will be the inevitable consequence.
Bill McKibben asked us to be a little more reflective and to ask whether this is necessarily the case, and how we might explain that so few people now are genuinely happy in this most prosperous of all ages.
Posted by: Joseph Knech | May 14, 2006 at 03:03 AM
Joseph,
I think people like Bill McKibben are WAY off base in their criticisms. As long as I can go to, say, a restaurant, see old people, and think, "I am likely to become like that," then technological progress is going way too slowly, especially in biotechnology. Bill has valid points on the issue of "community vs. anarchy." However, what he has to realize is that any concept of community, as with any form of interpersonal relationship, is based on free association. We associate with the people we like to be with and do not associate with those we don't feel we have anything in common with. The key here is that all fulfilling interpersonal relationships must be VOLUNTARY. Bill McKibben seems uncomfortable with this notion.
In any case, having a finite lifespan, with age and decay thrown in, limits our ability to live the lives we want and to have the associations with others that we desire. It is a real impediment to free living.
I say let's cure aging, secure for ourselves the open, limitless futures that we deserve, and then we can debate social associations to our hearts' content.
Posted by: Kurt | May 14, 2006 at 10:52 AM
Re: BMcKibben & Happiness. Unhappiness can be a powerful and constructive force. Happiness can be a dangerous narcotic - lulling us into sleep - and sleep can be another form of denial. Would you say that relief of physical pain and suffering is relative? Technological change is a means to evolve toward some mechanical like end state. Can it be anything more than that? Should it be more than that? I believe it can - but only if we make it so. For that to happen we need to be awake!
Posted by: James | May 16, 2006 at 09:31 PM
There are lots of positive emotions: happiness, joy, excitement, contentment; they all seem to be lumped under "happiness." Some of them are lulling, and some of them are energizing.
Yes, unhappiness can be very powerful and get a lot done. Just ask any torturer. Conversely, I don't get up in the morning because of fear of physical pain.
In middle-class 21st century America, I don't expect to feel noteworthy pain on any given day. Likewise, if I expected that I was not going to experience unhappiness tomorrow, that would not reduce my motivation. Maybe some days I'd seek joy; others, fulfillment; and some days I might choose to wallow in contentment.
I don't see any of those positive emotions as addictive. In fact, last I heard, addiction is promoted by a diminished capacity to feel pleasure.
Chris
Posted by: Chris Phoenix, CRN | May 16, 2006 at 10:36 PM
Technologically induced happy brain states are fine with me. They may be psychologically addictive, like cannabis, but as long as they don't mimic the physiological properties of, say, heroin, I think they will be a positive influence on our quality of life. Maybe people who play the videogame-narcotic of the future should have to pass a cursory psych test and obtain a driver's-license-like document.
Posted by: Phillip Huggan | May 17, 2006 at 08:11 AM
Any details or links for the podcast/videocast? I haven't been able to find anything about it.
Thanks!
Posted by: Andrew | May 17, 2006 at 02:12 PM
Regarding Bill McKibben's comment to "slow down," one has to wonder how he could enforce such an idea, noble and inviting as it might be.
Personally, I think that our ultimate "protection" from future dystopias would be for as many of us as possible to participate in the process, thus providing the greatest diversity of ideas and actions.
Posted by: Michael Rudolf | May 23, 2006 at 04:46 PM