Thirty years ago, a robotics and AI researcher named James Albus published a book called People's Capitalism. Most of the book, as the title suggests, is about economic reform. I won't comment on the economic ideas in the book. But Chapter 5 is very relevant to molecular manufacturing predictions.
Chapter 5 explains the technical advances that both require and allow (according to the author) the new economic structure he proposes. It seems that fully automated factories were about to be developed, using computer-controlled robots to do manufacturing operations.
When automatic factories begin to manufacture automatic factories, cost reductions will propagate exponentially from generation to generation. The introduction of computers into the manufacturing process thus has the potential for increasing productivity on a scale never before conceivable. Eventually the cost of finished manufactured goods may fall to only slightly above the cost of unprocessed raw materials.
Albus projected some of the possible consequences. His writing on this point sounds a lot like CRN's:
Robot technology, like computer technology, has military as well as economic implications. Any country that develops the capacity to run its factories around the clock seven days per week with only a few human workers will have a tremendous advantage both economically and militarily. If nothing else, this capability would allow military weapons to be produced in virtually unlimited quantities at extremely low costs. But, even assuming that such plants were never used for military production, the country that possessed such a large surplus of efficient production facilities could easily dominate the world economically simply by selling manufacturing capacity at rates far below what countries using less efficient methods could hope to match. .... Whether this event results in unprecedented benefits or economic chaos depends largely on whether we can devise satisfactory answers to the questions: “Who owns these machines? Who controls them, and who gets the wealth they create?”
So what happened? Why didn't automated factories change the world? Why aren't factories fully automated, even now? And why do we expect that general-purpose nanofactories will be easier to develop?
Several factors make nanofactories different from large-scale robot factories. But first, note that automation in factories has in fact brought down the cost of goods quite substantially. Had advances in transportation not allowed inexpensive overseas labor to outcompete robots, we would probably be seeing even more robot use.
A large factory is limited in its speed; large machines can only work so fast. If the first automated factory-building factory costs a billion dollars, and it can build a new factory in as little as a year, then it will take ten years before 1,000 factories exist, and each factory will still have a million dollars in amortized capital cost--plus the substantial cost of raw materials. Meanwhile, advances in manufacturing technology will require continual redesign to avoid obsolescence.
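The doubling arithmetic in that paragraph can be checked directly. A minimal sketch, assuming (as the text does) a $1 billion first factory and a population that doubles once per year:

```python
# Macro-factory amortization sketch. Assumption taken from the text above:
# each factory can build one copy of itself per year, so the factory
# population doubles annually, starting from a single $1 billion plant.

INITIAL_COST = 1_000_000_000  # cost of the first factory-building factory
YEARS = 10

factories = 1
for _ in range(YEARS):
    factories *= 2  # every existing factory builds one new factory per year

amortized = INITIAL_COST / factories
print(factories)   # 1024 factories after ten years
print(amortized)   # 976562.5 -- still roughly a million dollars per factory
```

So even after a decade of self-replication, each factory carries close to a million dollars of amortized capital cost, before raw materials and redesign are counted.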
Today's machines require a wide array of materials, formed by an even broader array of processes. I'd guess that at least a million different operations, from chemical purification to injection molding to grinding to measuring, are involved in making a large modern factory. Each of those operations would have to be researched and developed in order to automate it. It's no surprise that progress has been incremental.
By contrast, a molecular manufacturing system will use extremely small machines that can work very fast. Basic scaling laws, as well as comparisons with biology and preliminary engineering studies, indicate that a nanofactory should be able to manufacture its own mass in something like an hour. Within a month, the cost of developing a nanofactory could be amortized over not thousands, but billions of factories; furthermore, nanofactory manufacturing capacity would not be scarce. That changes the economics of production more fundamentally than a mere order of magnitude decrease in cost over several years.
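The nanofactory case is the same doubling arithmetic run at a different time constant. A rough sketch, assuming the one-hour self-replication time cited above (an engineering estimate from scaling studies, not a measured figure):

```python
import math

# Hourly doubling: assume each nanofactory builds its own mass in ~1 hour,
# so the population doubles every hour, starting from a single unit.
doublings_to_a_billion = math.ceil(math.log2(1_000_000_000))
print(doublings_to_a_billion)  # 30 doublings, i.e. about 30 hours

# A 30-day month allows 720 hourly doublings, so a billion nanofactories
# is reached within the first couple of days, with enormous margin.
```

This is why amortizing development cost over billions of units "within a month" is not an aggressive claim given the one-hour replication assumption; the exponential does almost all of the work.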
The high performance of nanoscale machines (again, due to scaling laws) implies that there will be more demand for nanofactory-built products. It also implies that machines can be over-engineered, making designs less dependent on exact material choices and reducing the number of different materials needed.
Reduction in materials needed implies a reduction in the number of processes needed to make those materials. From the other direction, the discrete and uniform nature of atoms makes control much easier; two parts built by identical operation sequences will be perfectly identical (except for transient variations from thermal noise, and very rare manufacturing errors that will be detectable with limited sensing). The discrete nature of atoms and their bonds also means that properly-designed parts will not wear or require lubrication. All these factors simplify the design task.
It has been speculated that a minimal engineered bacterium might require fewer than 200 genes. This is for a system that not only manufactures, but metabolizes and self-repairs. It is likely impossible to build a macro-scale hands-free manufacturing system with 200, or even 2,000, parts. But there are fewer than 200 elements in the Periodic Table, and a molecular manufacturing system will be able to take advantage of that fact.
Design of a nanofactory would be extremely difficult today; it might cost on the order of a billion dollars. However, improvements in computers will make it far easier, quite rapidly. I can simulate reactions and components on my laptop today that would have required a small supercomputer a decade ago. Advances in general knowledge of chemistry will continue to improve the models. Rapidly improving tools, driven by other industries, will also make the job easier.
Today, it would be possible to build a fully robotic factory, but it would not be economically rewarding. In just a few years, it will be possible to build a nanofactory--and it will be very rewarding. Someone will do it.
That's a good point about the variety of production processes required to scale a bulky factory vs. the small library of mechanosynthetic steps and feedstock prep. required to scale diamond MNT.
My analysis of bulky replicators stalled when I hit Chemical Vapour Deposition (required for CNTs and diamonds among other products). CVD needs a high-melting temp metal like molybdenum alloyed in its reactor walls. One million CVD reactors would exhaust the world's annual production of the metal. A 4km^2 footprint of CVD reactors only gets you an annual diamond production totalling a cube of diamond 10 meters high.
It is not certain molybdenum can even be extracted from seawater. So some variation of Freitas's NASA Lunar Replicators would be needed... Basically, of all the different productivity boosting technologies on the horizon, diamond mechanosynthesis seems to be the one with the fewest technical hurdles.
Posted by: Phillip Huggan | May 19, 2006 at 11:45 AM
I have always found it odd that so many people insist MNT is not needed for various applications, that more conventional tech like CVD can do anything MNT can do, or that bulk carbon nanotubes can easily replace modern ICs. Researchers resort to using AFMs to manipulate nanotubes one at a time to make circuits, but no, we don't need MNT. It sure seems that having millions of tiny manipulator arms working in parallel in a nanofactory would help there. Then there is the hope that you could have a pre-patterned wafer to grow them on in the correct way. This raises the question: how is that any better than conventional photolithography?
Their claims of what can be done with bulk processes, if subjected to the same scrutiny as MNT, may have just as many holes in them. Upon further analysis it may be that the mainstream nanotech scientists are making promises that, as may be proven scientifically, can only be kept by Drexlerian nanotech. Nobody knows.
Posted by: NanoEnthusiast | May 19, 2006 at 08:57 PM
NanoEnthusiast, "nanotechnology" used to mean Drexler's version. Then it got funded... but what was funded was near-term stuff. And all those researchers, and funders, had to justify the funding. It's no surprise that some of the claims of potential outcomes are... overblown. And it's no surprise that a lot of them sound like what MM can do.
In theory, in some cases, growing or depositing buckytubes on a wafer, according to a pattern with a certain resolution, can actually be better than making traditional circuits with that resolution. Crossed buckytubes can be a wire, a switch, and a diode, all in the space of a molecule. So you can (in theory!) pack in crossbar memory elements more tightly than if you had to put a transistor, a capacitor, and several wires at each junction. This probably generalizes.
Nanoscale technologies are certainly useful, and for some applications they may actually give MM a run for its money. But ... wait a minute, I feel a blog post coming on.
Chris
Posted by: Chris Phoenix, CRN | May 20, 2006 at 10:48 AM
OK, here's the blog post.
Chris
Posted by: Chris Phoenix, CRN | May 20, 2006 at 11:13 AM
Less than 200 elements, true. How many chemical bonds are there amongst those 200 elements, however?
And yet even hydrogenated diamondoid reproduction - requiring making two types of bonds, carbon-carbon and carbon-hydrogen, and breaking 3 types of bonds, carbon-carbon single and triple bonds and carbon-hydrogen (plus the unspecified actions needed to use the 'vitamin') - requires a LOT more than 2 steps, at least per Merkle's "Hydrocarbon Metabolism" paper. Is there a better reference out there to a step-by-step process potentially leading to mechanochemical self-replication?
In short, I question your assumption that a full-element nanofactory has any less complexity than a modern automated factory. In fact, I'd be hard-pressed to say that the complexity of a nanofactory isn't greater than your WAG of "at least a million different operations" in a modern automated factory.
Sincerely,
John B
Posted by: John B | May 23, 2006 at 08:34 AM
I think the stiffness of diamond gave some false hopes about how easy a substrate carbon would be to work with. Now we know that deposited carbon dimers, creating an sp3 diamond geometry, have the potential to reconstruct to lower-energy sp2 configurations, and Freitas's most recent paper has taken that into consideration.
Posted by: Phillip Huggan | May 23, 2006 at 10:27 AM
I read on R. Jones's blog that Moriarty's group is getting $1.7 million to do diamond surface chemistry experiments that are MNT relevant. Our present tools are still clumsy, and IMO there aren't any killer product apps along the way. The lowest diamondoid-mass product may be a diamondoid computer, and that still requires scale-up and molecular computer R&D. So someone has to be willing to sink tens of billions of dollars (at present high-end SPM prices) into a decades-long investment.
I don't see a complicated reaction pathway being a hindrance. Chemical reactions happen fast; the price of the feedstock seems like the only barrier here.
Posted by: Phillip Huggan | May 23, 2006 at 10:37 AM
Phillip Huggan - is the paper by Freitas you reference the one found at http://www.molecularassembler.com/Papers/DMSToolbuildProvPat.htm , or is it some other one? If another, could you please point me towards it?
While I agree chemical reactions "happen fast", they still take a finite period of time. My comment above was not in reaction to the time needed but rather (Chris? Mike? The blog post was unsigned) whoever wrote the original blog's dismissal of the complexity of a full-element nanofactory's construction.
I'd LOVE to see a detailed design of a fully crosslinking CHONS nanofac, for instance - but so far as I know only CH and Si have had any detailed work done on them. (I honestly would be surprised if anyone'd gotten very far with the details of a mechanosynthetic CHONS reaction set, due to the huge number of bond types and odd interactions...)
-John
Posted by: John B | May 23, 2006 at 11:04 AM
A point of clarification on Phillip Huggan's comment above:
The EPSRC IDEAS Factory scheme has earmarked funding of ~£1.4M for the 'matter compilation' topic. This money will eventually be distributed amongst a consortium of groups, subject to the associated research proposal meeting appropriate quality criteria. The Nottingham group has not been awarded $1.7M for MNT research.
Best wishes,
Philip
Posted by: Philip Moriarty | May 23, 2006 at 01:26 PM
John B: http://www.molecularassembler.com/Papers/JCTNPengFeb06.pdf
Posted by: Phillip Huggan | May 23, 2006 at 02:44 PM
Yes, the number of potential bond types is quite a bit greater than the number of atoms. In fact, it's immense. But the question is, how many different materials do we need, and how many bond types do those materials require to construct?
The answer will differ by probably a couple of orders of magnitude depending on whether you're looking at a Freitas/Merkle type assembler, built out of diamond, or a Drexler-style "less diamondlike diamondoid" design.
Freitas thinks it'll only take a few reactions to build diamond itself. (I don't know if that includes recharging the tool tips.) Drexler, I think, plans to have automated parameter-tuning to develop the wide library of reactions needed to build e.g. his planetary gear.
Even for Drexler-style, it might take only a few tool tips, with trajectories modifiable in software. With re-usable hardware and semi-automated design, the design difficulty and operating complexity of even a Drexler-level capability could be pretty tractable.
(Note that there are at least two different "Drexler-style"s, the other being the biopolymer bootstrapping pathway: "Ribosome Mark II.")
Chris
Posted by: Chris Phoenix | May 23, 2006 at 11:26 PM