On 1/25/08, Dr Skip wrote:
> In other words, no matter how bad an idea, there will be
> money and a place that will be willing to 'risk it' and foist it upon the world
> with whatever longer consequences there may be.

People thought the same thing about robots and machines as we started to move into the electronic age. At the time it was considered inevitable that the human brain would soon be eclipsed by an electronic one. We didn't have enough information then to know how far away that was, or how near.

Right now we can imagine all sorts of bad scenarios given just this first tiny foray into biological engineering. Yes, we don't know how the cells work down to the last piece, and to some degree we are playing with fire. But while some of the first fires must have gotten out of control and killed many people, perhaps whole populations, we got through that. The first atomic bomb didn't ignite the atmosphere. There are several places in the past where the risk wasn't known or understood, and the benefit has been great.

That's not to say the ends justify the means, but that even if human self-destruction is possible using these methods (and it's not at ALL clear that it's even possible, never mind probable), it can still be considered an unlikely outcome. While some restraint should be shown, I think the fears are greatly overwhelming the discussion here. It seems like any discussion of the good and useful end products that could result would simply devolve into "It's too dangerous."

-Adam

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist