There's an interesting article over at IEEE Spectrum (hat tip to CRN Task Force member Brian Wang for spotting it). Charles Perrow, author of Normal Accidents, begins by recapping his argument that "any organizational solution to a human problem that is complex enough to be interesting will necessarily be imperfect." This means that failures, even large failures, are inevitable; rather than seeking to blame the person or organization closest to the failure, we should simply accept that failures will happen sometimes, and weigh the benefits of "complex, tightly coupled systems" against the costs of their inevitable failures.
Examples given of complex, tightly coupled systems include air travel, electric power distribution, petroleum refining, and nuclear reactors. Will we have to add molecular manufacturing infrastructure to this list?
The answer appears to be no--with one very important exception.
The first piece of good news is that technically speaking, individual molecular manufacturing systems (nanofactories) need not be complex. They may be very complicated, and certainly will contain vast numbers of parts. But controlling them may be no more difficult than playing a DVD--an operation which requires a large fraction of Avogadro's number of transistor operations. (That is, if each transistor operation were a single atom, they'd add up to a significant fraction of a gram.)
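The DVD comparison can be sanity-checked with a quick back-of-envelope calculation. The figures below are illustrative assumptions, not from the original claim: a roughly 1 GHz processor, on the order of 10^8 transistors switching per cycle, and a two-hour movie.

```python
# Back-of-envelope check on the DVD claim. All hardware figures here are
# assumed round numbers for illustration, not measured values.
AVOGADRO = 6.022e23          # atoms per mole
CLOCK_HZ = 1e9               # assumed ~1 GHz clock rate
TRANSISTORS_PER_CYCLE = 1e8  # assumed transistors switching each cycle
MOVIE_SECONDS = 2 * 3600     # a two-hour film

ops = CLOCK_HZ * TRANSISTORS_PER_CYCLE * MOVIE_SECONDS
fraction = ops / AVOGADRO

# If each transistor operation were a single carbon atom (12 g/mol),
# the operations performed would weigh:
grams = (ops / AVOGADRO) * 12

print(f"transistor operations: {ops:.1e}")
print(f"fraction of Avogadro's number: {fraction:.4f}")
print(f"mass if each op were a carbon atom: {grams:.4f} g")
```

Under these assumptions the total comes to about 10^20-10^21 operations and a few hundredths of a gram of carbon; different assumed clock rates and transistor counts shift the result by an order of magnitude or two, but the point stands that ordinary consumer electronics already orchestrate atom-scale numbers of events reliably.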
The second piece of good news is that a network of nanofactories need not be tightly coupled. Each nanofactory could run basically independently, on local feedstock and energy. If they all ran the same software, there'd be some risk of a Y2K bug or a computer virus taking them out. But there's no technical reason why they would have to be interdependent.
The bad news is that a security infrastructure, to prevent people from building undesirable products, probably would have to be a complex, tightly coupled system.