Rocky Rawstern, the esteemed editor of Nanotechnology Now, points us to a fine series of columns written and published in late 2004 by Brad Allenby. Titled "Free Will and the Anthropogenic Earth" -- by which Allenby means the "human-generated" earth -- the whole three-part series is worth reading in full.
Here is an extended excerpt with special relevance to CRN's work:
Jalalu’ddin Rumi, the 13th-century Persian poet, made the pertinent observation that “there is a disputation [that will continue] till mankind are raised from the dead between the Necessitarians and the partisans of Free Will.” The observant reader will note that the specified event has not occurred, and, indeed, there remains no closure on whether free will exists or what it is. And a brief column on environmental issues is unlikely to resolve this issue -- so why on earth bother?
Very simply, free will matters in large part because in most cultures -- including the dominant Eurocentric globalized culture -- ethical responsibility accompanies decisions made where free will exists, and does not accompany actions that do not arise from free will. Thus, moral responsibility is generally not imposed where actions are taken under duress, or by individuals who are incapable for some reason, such as mental illness, of exercising free will. . .
Microethical systems -- ethics at the level of the individual as a member of a particular culture or profession -- are not free of disagreement and complexity, but at least they are well-tilled ground. Macroethics, however -- the ethics of emergent behavior of technology systems, societies, or intercultural relations that a profoundly multidisciplinary world calls forth -- is an area that has yet to receive adequate attention. This requires not just new ethical formulations, but new institutional roles -- if the individual engineer is not ethically responsible for the emergent behavior of the Net, then who is?
These questions become particularly important in light of the rapid evolution of new foundational technologies, particularly those known as NBIC -- nanotechnology, biotechnology, information and communications technology, and cognitive sciences. These technologies almost certainly will transform human beings, their cultures, and the anthropogenic earth and its systems. . .
[T]raditional conceptualizations of free will and accompanying ethical responsibility must be expanded in two important ways. First, they must comprehend not just individual moral judgments, but be able to guide uncertain and highly complex actions affecting complicated integrated human/natural systems. Second, they must begin to focus less on the ethical implications of individual actions resulting in predictable outcomes, and more on process, on continued dialogs with complex systems with even the desired outcomes (the earth systems engineering and management (ESEM) design objectives and constraints) contingent and changing over time, as the ideologies and political positions of stakeholders and affected entities shift.
This leads to a final question: how do we get from where we are to where we want to be -- a macroethical system capable of guiding actions under such unprecedented conditions?
We may begin by rejecting the common approach of simply projecting individual ethical responsibility to the scale of emergent behaviors of ESEM systems. It is simply untenable to make individual scientists or engineers responsible for the behavior of systems to which they may have contributed, but which in many cases are self-organizing and demonstrate behaviors which are unpredictable and become apparent only over significant time periods. The designer of a chip which goes into a router which goes into the Internet cannot be held personally ethically responsible for how the Net may affect social structures thirty years from now; nor can she, using new 65-nanometer chip technology, be held responsible for the “social effects of nanotechnology.”
But the individual scientist, engineer or environmentalist can, it seems to me, be charged with a fundamental responsibility to ensure that a process is established by which technical communities, and society at large, can dialog with complex technological systems. . . The nature of such a dialog, which must be highly multidisciplinary and multicultural, is itself a reason why individuals cannot carry such a burden in a substantive sense, for no single individual has the requisite knowledge, and very few have the ability to suspend their own ontologies, as such a dialog requires.
Individuals of all kinds, from engineers to scientists to environmentalists, can, however, certainly be charged with ethical responsibility for supporting the procedural process. The dialog itself will have to rest with an institutional host -- one that combines technical knowledge with a broad, transparent, and open process, that is sensitive to its own agendas and ontologies, and that can be explicit about them without imposing them on the dialog.