by Flemming Funch
Sheldon Pacotti writes in Salon about whether or not we're doomed, given the fairly inevitable direction of technology towards much more pervasive ways of keeping an eye on all of us, and scary possibilities for destruction beyond anything we've seen before. He particularly speaks to the discussion of whether governments should try to stop certain kinds of knowledge from being generally available.

"The computer-networked, digital world poses enormous threats to humanity that no government, no matter how totalitarian, can stop. A fully open society is our best chance for survival."

Yeah, I agree. There's really no way of stopping it, so we need to expand our collective ability to solve problems, our collective intelligence, at least as fast as new technologies are developed. The author talks about various sectors of society where governments might think they ought to hold on to all the knowledge. Like surveillance. If there are cameras everywhere, do we trust government agencies to decide what to do with what they see? No, of course not. If there has to be surveillance, the only safe thing is if it is easily available to all of us.

"If we must submit to a surveillance society, I think it is clear that an open network, in which no group, agency, or individual is privileged over any other, would lead to a society with a superior character than one in which the citizens remain separate from and observed by the government. Better for us all to be able to watch one another than for the 'authorities' to monopolize this power and leave us with only the fear."

He then goes on to talk about technologies like nano-tech or genetic engineering, which quite likely might allow individuals or small groups to produce results that could kill every last one of us on the planet. How do we guard against that? Stop that kind of research? Bill Joy suggested something like that. But, no, that ain't gonna happen.
There will always be some groups, military, religious, terrorist, or whatever, who will want to do it anyway. So the question is what is most secure: share information widely, or try to keep it secret? That's the same argument that applies to computer security. Microsoft advocates keeping all security problems really quiet, and everything will be fine. The Open Source world advocates putting everything out in the open, so that thousands or millions of people can all help find the holes and plug them. Possibly the same thing applies to other technologies.

"What happens, in a police bureaucracy, if someone releases a nanotech plague into the environment? If the police can suppress information on the structure of the nanobots, then only a handful of government bureaus and hand-picked researchers may be allowed to work on a cure. Millions could die waiting for the bureaucracy to solve the problem. On the other hand, if the molecular structure of the pest is published worldwide, anyone with the expertise could help design defensive technology."

I'd lean in that direction. But then again, would that apply to, for example, nuclear weapons? If we all knew how to make hydrogen bombs, would some of us figure out how to make an antidote against them?