- Prosperity vs. Social Responsibility
- Ecology vs. Technology
- Progress vs. Safety
These are often conceived as trade-offs, but at CRN we reject that notion. We're not the only ones, of course. Our recent post on Biotech vs. Nanotech and the excellent comments and discussion that followed make that clear. It is quite possible to be technically progressive, environmentally sensitive, and socially responsible, all without contradiction.
Brad Allenby is professor of civil and environmental engineering at Arizona State University, a fellow at the University of Virginia's Darden Graduate School of Business, and previously was AT&T's vice president of environment, health, and safety.
For many environmentalists, continuing evolution of technological and economic systems is seen in highly negative and even apocalyptic terms, with predictions of increasing loss of biodiversity, increasing human appropriation of land and other resources, increasing impacts on the dynamics of fundamental natural systems. . . In Europe, the Precautionary Principle with its deep skepticism regarding technological evolution is increasingly embedded in public policy. Behind all these positions is an assumption that some technologies must, and can, be stopped or at least impeded for the foreseeable future.
Obviously, how and whether technology should be restricted, how and by whom its costs and benefits should be weighed, and who has the ethical responsibility for technological systems, are complex and contentious questions. . .
Dr. Allenby, who has a Ph.D. in Environmental Sciences as well as a J.D. from the University of Virginia Law School, then considers the issue of decision-making -- and the power to make decisions -- in the context of human history.
It seems a difficult but historically unarguable conclusion that the will to power is fundamental to being human. The implications are severe. For one thing, the necessary governance scale for policy in areas such as the environmental or social implications of NBIC [Nanotech, Biotech, Infotech, and Cognitive Science] is obviously global, not regional. Europe may not like genetically modified organisms, and the United States may ban stem cell technology -- but that only passes the torch to others.
Cultural competition is only one manifestation of the will to power, which operates across the species as a whole at all scales. More profoundly, the will to power likely has driven human evolution biologically, and now drives it culturally and technologically: the idea that we are the end point of evolution, or that we have stopped evolving (indeed, that we are not evolving more rapidly than we ever have before) seems a desperate fantasy, almost willful blindness. Idealistic arguments, whether good or bad, are mere utopian wistfulness if they do not understand, and accommodate, this perhaps most important and fundamental dimension of our species -- the will to power.
As the CRN Global Task Force on Implications and Policy begins work, these insights could prove quite valuable.
The implications of this analysis of socially distributed cognition for industrial ecologists, environmental researchers, and policymakers are obvious. To begin with, many of the issues and systems such communities engage with are clearly complex: non-linear, rapidly changing, reflexive, unpredictable, data-intensive, and characterized by emergent properties.
Moreover, the institutional and governance structures that are frequently encountered, from private firms to NGOs to governments, are also complex. Under such circumstances, it is highly unlikely that single individuals, no matter how well placed, either completely understand, or control, systems response. This is not to argue that powerful individuals, such as CEOs or politicians, cannot have substantial impacts on systems performance (and thus bear ethical responsibility for such behavior), but it does suggest that a more nuanced approach to systems behavior and learning might pay dividends.
If Dr. Allenby is correct that the "necessary governance scale for policy in areas such as the environmental or social implications" of nanotechnology and other emerging technologies is global rather than regional, then crafting effective policy recommendations will indeed require a "nuanced approach." His observations, however, indicate that this will be a daunting challenge.
If one wanted to encourage the evolution of environmentally sensitive nanotechnology, one could try to educate each nanotech researcher on relevant environmental and ethical issues, a daunting task. Or, one could try to define the relevant distributed cognitive system, and create artifacts and processes, such as regulatory pathways, that accomplished the same goal from a systems perspective -- not easy, but doable.
Manipulation of distributed cognition systems is not a substitute for decisions about ultimate ends. These are still chosen by humans exercising intentionality (including failure to agree on goals, which is of course still a decision, albeit by default). But understanding, using, and participating in distributed cognition systems is frequently far more effective operationally than reliance on a false construct of solipsistic individual cognition.