
May 28, 2009

Comments

You can follow this conversation by subscribing to the comment feed for this post.

Paul S

One other note on nanotech and war. Actually this applies to all molecular manufacturing applications, but not with the same potential for disaster.

In true molecular manufacturing, "things" become almost trivial. Any given thing can be built/grown with an assembler of suitable size, as long as you have the code for it. In that environment, the critical element is the code: knowing how to build something becomes functionally equivalent to owning that thing.

But data is way more portable than stuff. These days, someone who wants to, say, steal a shipment of surface-to-air missiles has to actually carry off the missiles. This has all kinds of logistical implications: the thieves have to carry and conceal all that hardware from wherever it's stored to wherever they need it. And they're limited to however many missiles they can carry.

In the molecular manufacturing future, anyone who can hack the appropriate database can quickly begin manufacturing missiles as needed. The code can be copied and recopied, making recovery almost impossible.

Unless safeguards can be built into the code itself, data security will become far more important and volatile than physical security is today.

jim moore

Paul,
You don't just need computer code to build the device; you also need software to run most devices. You could make the software that runs the device call "home" to check whether you are allowed to use the device. Or you could design devices so that every copy needs slightly different software in order to run. Or software that is keyed to work for one person only. (A rough sketch of these ideas follows below.)

Still non-physical, but the software programs that you would need to steal to make and use the device are too complicated for most organizations (let alone individuals) to understand in detail. So the question becomes: can you trust the device you stole to do what you want it to do, and only what you want it to do?
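A rough sketch, in Python, of the "call home" and per-device keying ideas Jim describes above. The server URL, device ID, embedded key, and message format are all hypothetical, chosen only to show what such a check might look like in code; this is not any real vendor's protocol.

```python
import hashlib
import hmac
import json
from urllib import request

AUTH_SERVER = "https://example.com/authorize"    # hypothetical vendor licensing server
DEVICE_ID = "assembler-00427"                    # unique to this manufactured copy
DEVICE_KEY = b"secret-embedded-in-this-device"   # a different secret in every copy

def sign(payload: bytes) -> str:
    # Per-device signature: the same request replayed from a different copy,
    # which carries a different embedded key, will not verify on the server.
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()

def call_home(blueprint_digest: str) -> bool:
    # Ask the vendor's server whether this particular device may run this build.
    body = json.dumps({"device": DEVICE_ID, "blueprint": blueprint_digest}).encode()
    req = request.Request(
        AUTH_SERVER,
        data=body,
        headers={"Content-Type": "application/json", "X-Signature": sign(body)},
    )
    with request.urlopen(req, timeout=10) as resp:
        return json.load(resp).get("allowed", False)

def build(blueprint: bytes) -> None:
    # Refuse to hand the blueprint to the (hypothetical) assembler hardware
    # unless the vendor's server says this device is allowed to build it.
    if not call_home(hashlib.sha256(blueprint).hexdigest()):
        raise PermissionError("build not authorized for this device")
    # ... drive the assembler here ...
```

Of course, anyone who can patch build() to skip the check defeats all of this, which is part of Paul's point about the arms race shifting to the data.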

Paul S

Jim,

The measures you list are examples of the kind of data security that I believe will be necessary in the molecular manufacturing age. Though I wouldn't make a biometric security system dependent on just one person, or even a few people. That would make your manufacturing process far too vulnerable.

And of course all such methods can be circumvented - which was part of my point: once molecular manufacturing becomes a robust technology, the security arms race will largely shift from protecting "stuff" to protecting data.

I don't think you'll be able to rely on complexity, though. If the organization that developed the code has the skills to understand it, then any other organization can have those skills. After all, skills are just a matter of training. Microsoft relies on copyright protection and carefully guarded source code to protect their creations. They never assume that a rival organization couldn't understand their code, because that's a very bad bet.

Tom Craver

One point - it may be possible to write code so obscurely that it'd generally be easier to rewrite it than to understand it well enough to modify it.
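A toy illustration of Tom's point, again in Python and tied to no real system: the same one-line membership check written plainly and then deliberately obscured. The two functions behave identically, but a reader handed only the second would likely find it quicker to rewrite the check than to untangle it.

```python
def is_authorized(user_id: str, allowed: set[str]) -> bool:
    # Plain version: the intent is obvious at a glance.
    return user_id in allowed

def f(a, b):
    # Obscured version of the same check: single-letter names, a
    # character-by-character XOR, and a throwaway list hide a one-line idea.
    return bool([c for c in b
                 if len(c) == len(a)
                 and all(not (ord(x) ^ ord(y)) for x, y in zip(a, c))])
```

Whether that trick scales to a whole codebase is exactly what Dan questions below.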

Dan S.

Tom,
No, it is not possible, not for large software projects. Otherwise, the developers would stop understanding the code too. That would cause all sorts of untraceable bugs and the eventual collapse of the whole project.

The comments to this entry are closed.