The ultimate goal of molecular manufacturing is to build useful products in useful amounts.
In order to build a "useful" amount of almost anything molecule by molecule, the process will have to be fully automated. This means that machines will have to do whatever sensing and actions need to be done in order to form the product.
All of the operations will have to be very reliable. How reliable is "very"? Well, for comparison, I've been told that the ribosomes (protein-making machinery) in the human body contain on average about one error apiece. Every ribosome is imperfect in a different way, and most of them work well enough despite these flaws.
The fault-tolerance of ribosomes is kind of impressive, but today's machinery is reasonably fault-tolerant as well. If you disconnected a random wire or unscrewed a random fastener in your car, it would probably still work well enough.
We don't know exactly how low the error rate needs to be in order to make machines that work well enough. But let's pick a number - say we want errors in fewer than one in a million of the automated bond-forming operations. This is probably the right target, give or take two orders of magnitude. For a million-atom machine, an error rate two orders of magnitude worse would mean about a hundred errors per machine, which is likely too many; two orders of magnitude better would mean that about 99% of the machines came out perfect.
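To make that concrete, here is a rough back-of-the-envelope sketch, assuming each bond-forming operation fails independently with probability p, so a machine built from N operations comes out error-free with probability (1-p)^N, which is about exp(-pN):

```python
import math

def fraction_error_free(p, n_operations):
    """Probability that all n_operations succeed, assuming independent
    errors with per-operation probability p: (1 - p)^n."""
    return math.exp(n_operations * math.log1p(-p))

N = 1_000_000  # roughly one bond-forming operation per atom of a million-atom machine
for p in (1e-4, 1e-6, 1e-8):
    print(f"error rate {p:.0e}: {fraction_error_free(p, N):.4%} of machines are perfect")
```

At one error in ten thousand operations, essentially no million-atom machine comes out perfect; at one in a million, about a third do; at one in a hundred million, about 99% do.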
In today's "wet chemistry," a yield of 99% is excellent. An error rate of one in a million is four orders of magnitude beyond that. Clearly, a different approach would be needed to achieve such a radically low error rate.
Recently, Svein Ove objected to my characterizing the difference between 99% and 99.99% as less than 1%, since it's also two orders of magnitude. In my response, I argued that the difference would indeed be huge in a purification context, but it is less significant in the context of finding designs. In this post, talking about error rates, we're getting pretty close to talking about purity - but there are some important differences.
For example, if error rates depend on reaction rates at individually addressed sites, then we can stop talking about how to prevent errors and talk instead about how to speed up the desired reactions. Mechanically guided chemistry can speed up reaction rates at the desired site by many orders of magnitude, both by increasing the effective concentration (as Drexler has recently discussed) and by lowering energy barriers. Each order of magnitude of speedup corresponds directly to an order of magnitude less error. (There are exceptions, such as undesired reactions happening between the desired reactants; but these depend on the chemistry, and can be made extremely rare in some chemistries.)
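A toy competing-rates picture shows why, assuming the only failure mode is an unwanted reaction competing with the desired one at the addressed site: the per-operation error fraction is roughly k_bad / (k_good + k_bad), so every factor of ten added to the desired rate removes a factor of ten from the error fraction.

```python
def error_fraction(k_good, k_bad):
    """Fraction of operations that go wrong when the desired reaction
    (rate k_good) competes with an unwanted one (rate k_bad)."""
    return k_bad / (k_good + k_bad)

k_bad = 1.0                      # arbitrary rate units for the unwanted side reaction
for speedup in (1e2, 1e4, 1e6):  # assumed mechanically guided speedup of the desired reaction
    print(f"speedup {speedup:.0e}: error fraction ~ {error_fraction(speedup * k_bad, k_bad):.1e}")
```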
Biology uses all sorts of error detection and correction machinery, which adds significantly to its complexity. Engineered error detection and correction can be somewhat simpler. Since biology relies largely on diffusion and random processes, its error correction depends on the probability that two molecules will bump into each other in just the right way to detect whatever needs detecting. In an engineered system with relatively rigid machines, it is possible to sense a condition in a single deterministic operation, correct it in another operation, and know with a very high degree of confidence that these operations have taken place in a timely fashion. Even if the correction operation is only 99% effective, it needs to be repeated only three times to achieve 99.9999% effectiveness. (Again, there are assumptions here, such as that sensing never causes new damage.)
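The arithmetic behind that last claim, assuming each correction pass acts independently and never introduces new damage:

```python
def residual_failure(effectiveness, passes):
    """Chance an error survives `passes` independent correction attempts,
    each of which fixes it with probability `effectiveness`."""
    return (1.0 - effectiveness) ** passes

for passes in (1, 2, 3):
    fail = residual_failure(0.99, passes)
    print(f"{passes} pass(es): {1 - fail:.6%} effective (residual error {fail:.0e})")
```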
There is a more fundamental reason why error rates will suddenly fall from non-useful to useful ranges (for any given approach). A practical molecular manufacturing system will require machines building machines that build more machines, and so on for many generations, in order to produce enough machines to be useful. The threshold is sharp: if each machine produces, on average, fewer than one working copy of itself, the population dwindles and this multi-generational approach cannot work. Reduce the error rate by merely half, and suddenly the number of machines in each generation can grow exponentially. So, if the magic number is 99.9999%, it will probably be a lot harder to get from 99.99% to 99.9998% than from 99.9998% to 99.9999%.
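Here is a rough sketch of that threshold, with numbers (ten copies built per machine, a million operations per copy) assumed purely for illustration: halving the per-operation error rate flips the population from dying out to growing exponentially.

```python
import math

def population(per_op_error, ops_per_machine, copies_per_machine, generations):
    """Expected population after `generations`, starting from one working machine.
    Each working machine builds `copies_per_machine` copies; a copy works only if
    all `ops_per_machine` operations succeed."""
    p_working = math.exp(ops_per_machine * math.log1p(-per_op_error))
    return (copies_per_machine * p_working) ** generations

OPS, COPIES = 1_000_000, 10  # assumed for illustration
for err in (4e-6, 2e-6):     # halving the per-operation error rate
    print(f"error {err:.0e}: population after 20 generations ~ "
          f"{population(err, OPS, COPIES, 20):.2e}")
```

With these assumed numbers, an error rate of 4e-6 per operation means each machine yields fewer than one working copy on average, so the line dies out; 2e-6 yields more than one, and twenty generations gives hundreds of machines.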
The other question that I'm sure some readers will be asking is whether such low error rates are possible at all. Our experience with chemistry, as well as our experience with mechanical devices, gives us the intuition that such low error rates are not possible, but transistors supply the opposite intuition: the transistors in your computer do the right thing more than 99.999999999999999999% of the time.
You may wonder: what about physical limits? Well, even at room temperature, thermal noise will not break most common organic bonds quickly enough to worry about. The theoretical error rate imposed by entropy, Heisenberg uncertainty, and so on is vanishingly small. Thermodynamics says that every process is imperfect, but it also says that externally supplied energy can be used to improve local order. There is nothing in physics that poses any sort of problem for a mere 99.99999999% correctness rate at room temperature - which is surely better than we need.
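To put a rough number on the thermal-noise point: the Boltzmann factor exp(-E/RT) for a typical carbon-carbon bond (around 350 kJ/mol) at room temperature is astronomically small. Real bond-breaking kinetics also involve attempt frequencies and barrier shapes, so this is only an order-of-magnitude illustration.

```python
import math

R = 8.314          # gas constant, J/(mol*K)
T = 298.0          # room temperature, K
E_CC_BOND = 350e3  # rough C-C bond dissociation energy, J/mol

# Fraction of thermal fluctuations energetic enough to break the bond outright.
boltzmann_factor = math.exp(-E_CC_BOND / (R * T))
print(f"exp(-E/RT) ~ 10^{math.log10(boltzmann_factor):.0f}")
```

Even multiplied by a typical vibrational attempt frequency over the working lifetime of a device, a factor of roughly 10^-61 leaves spontaneous thermal bond-breaking dozens of orders of magnitude below the one-in-a-million error budget.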
Chris Phoenix