

May 15, 2009


Listed below are links to weblogs that reference What I've Been Up To: Risk Analysis:




I have never personally heard of a scientist being convinced that his own personal research is too dangerous to continue. For heaven's sake, Robert Oppenheimer pursued the fission chain reaction, even though no one (not even him) was sure what would happen, and some candidate scenarios included the literal destruction of the planet.

I think you will have a hard time convincing any scientist to take a public stand even against another scientist's work. There are two main reasons for this, one personal and one theoretical.

On the personal level, scientists are very reluctant to take a professional position on a question that isn't solidly within their expertise. In practical terms, this usually means that such a scientist would want at least to talk personally with the researchers involved in problematic research, and ideally he would want to have been involved in such research himself. Without that level of involvement in the issues, most scientists would consider themselves unqualified to render a concrete professional opinion on whether or not such research should be performed.

This ties into the theoretical issue. On the broader stage, saying that research should not proceed is functionally equivalent to saying that there are questions that should not be asked. There is probably no idea in the world that is more anathema to a scientist than this.

That's why Oppenheimer forged ahead, and that's why scientists are so reluctant to make such proposals. Before a scientist would say such a thing, he or she would want to have the most thorough grounding possible in the details of that research. I suspect that the experts you've talked with are shying away from professional opposition for these reasons more than any other.

the Oakster1

"I gave you your chance to play cop . . . and you blew it!" - Lt. Moe Tilden (Cop Land character)

Tom Mazanec

Look at the fuss over the LHC.


Hey Paul, opera fan?

jim moore

Hey Chris,

In case you did not know, a science fiction writer you know spilled the beans on the nature of your concern. I will try to be vague in my response.

I agree with you: this research is potentially very disruptive to the preexisting ecological relationships between this bacterium and the rest of the ecosystem. And the microbe in question can survive in both aerobic and anaerobic conditions; it lives in the gut of most mammals and birds, and in soil and water.

The biggest problem I see is that we know so little about the details of the ecology of microbes. It is very difficult to do ecological research on microbes in their natural environment.

Perhaps the way forward is to ask the researcher to buy catastrophic risk insurance for the experiments he would like to do. That way he would be working with outside experts to assess the nature of the threat, and he could reduce the cost of the insurance by adding protocols that decrease the danger.

Chris Phoenix

Sigh... in the same email where I told him about it, I said this:

"In this case, I think it's important that scientists self-regulate.
Creating [the microbe in question] will be pretty easy to do in a few more
years, and [a scientist] has already published an article recommending
it. If scientists don't change their collective mind before
environmentalists get hold of the issue, things will turn adversarial
- and the research will proceed. We can't afford that."

Fortunately, he stuck the mention of the research at the end of a post on a different topic. None of the comments have referenced it.

IMO, it would be appropriate (though not sufficient) to require any researcher creating such microbes to feed the parent strain to his children. Just to test whether he's _absolutely_ sure they won't grow in the wild.

The good news is that the current project will not produce a strain that's fully [what they're trying to build]. It's likely on the favorable (mundane) side of the overgrowth tipping point. So we may have more time than I thought.


Tom Craver

Perhaps we should have "science insurance", instead of the "Precautionary Principle".

"You want to do a nuclear fission chain reaction experiment - but you don't think it will *really* destroy the world? OK, we set the potential damages at $1 x 10^16, and an independent panel of experts reviewing your research sets the probability that you are incorrect at around 1 x 10^-8 - so pay $100 million up front."

"No refunds if you're correct - the money will be spent on amelioration (e.g. efforts to get humanity off the planet)."

"Your biological experiment would only cause $1 x 10^12 in damages? We agree with your projection of 99.9999% certainty it won't. So pay $1 million and go ahead."

"What's that? You say you can add some controls that will reduce the chance from 1 in a million to 1 in a hundred million? OK, if you can do that, the cost drops to $20,000. Why not $10,000? You should have come up with better safety controls before bringing your research to the insurance board - now we have to review your proposal again."

There should also be rewards for reducing risks.

"You want to set up an asteroid monitoring program that reduces the chance of an asteroid destroying all human life on Earth by 90%, by giving us 20 years warning to do something about it? OK, we calculate that to be worth $1 x 10^9 a year, except that 90% of that value is already provided by various efforts we're funding to get humanity off the planet and in a position to do something about asteroids. So you get $100M a year."
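The arithmetic behind these hypothetical quotes is just expected loss: premium = estimated damages times the panel's probability that the researcher's safety case is wrong. A minimal sketch, using the figures from the comment above (the `premium` function and its name are mine, not part of any actual proposal):

```python
import math

def premium(damages: float, probability: float) -> float:
    """Expected-loss premium for a risky experiment, in dollars."""
    return damages * probability

# Fission chain reaction: $1e16 potential damages, 1e-8 probability
# that the researcher is wrong -> $100 million up front.
assert math.isclose(premium(1e16, 1e-8), 1e8)

# Biological experiment: $1e12 damages, 99.9999% certainty it won't
# happen, i.e. a 1e-6 probability that it will -> $1 million.
assert math.isclose(premium(1e12, 1e-6), 1e6)

# Better controls cut the probability to 1e-8 -> $10,000 expected loss
# (the board charges $20,000 in the comment to cover the re-review).
assert math.isclose(premium(1e12, 1e-8), 1e4)

# Reward side: $1e9/year of asteroid-risk reduction, 90% of which is
# already provided by other funded efforts -> $100M a year.
assert math.isclose(1e9 * (1 - 0.9), 1e8)
```

The same multiplication runs both directions: risky experiments pay their expected damages in, and risk-reducing programs are paid their marginal expected value out.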

The comments to this entry are closed.