
2012/06/24

Born this way


We spend our whole lives trying to understand human nature, but it could be that all we need to know we learned in childhood. I recall a little troublemaker named Johnny, a budding sociopath who lived down the street from us. He was constantly committing acts of minor mayhem and ducking the blame for them, and his mother was famous in the neighborhood for the many times she had to track him down or apologize to the neighbors for the trouble he caused.
One day my sister and I were playing with him in his backyard, in a small thicket that looked out over a ravine. Johnny pulled out a magnifying glass and proudly bet us that he could use it to burn an ant that was crawling across a leaf. We believed him and asked him not to try, but he did anyway. Within a couple of minutes, his attempt at a controlled burn of the ant and its leaf had set the surrounding grass alight. The fire was quickly out of control, and Johnny’s first thought was not the water hose but his mother’s wrath. He took off into the ravine and left us to deal with the blaze. We were afraid of taking the rap too, and could have fled, but instead we ran for the water hose.
Johnny’s mother came out, and we were terrified that she would tell our parents we had started a fire in her yard, but she just rolled her eyes, thanked us, and asked drolly, “Where is he?”
It’s important to remember that people like Johnny exist in this world, even if they grew up in good neighborhoods, went to good schools, and had decent parents. When humanity is tempted to risk the hazards of a dangerous technology, we always overestimate our ability to design systems to contain the danger, and we forget how often people fail to do the right thing in a perilous situation. The world has enough Johnnys to lay waste to the best-regulated plans, and this will always be the case, no matter how strong our regulatory frameworks or how wonderfully we teach ethics in schools. Even though we sometimes succeed in governing ourselves well, there are certain risks we should not take. Perhaps we can take a chance on risks that are low impact; civilization can survive the occasional investment bubble or chemical spill. But risks with widespread, devastating impact, whether low probability or high, are the ones we have to draw hard lines around.
Within a couple of decades of the frightening early experiments with nuclear weapons (1945-1965), certain taboos began to form. They fell short of eliminating the weapons, but they at least settled into a global consensus that it was unthinkable to make first use of a nuclear weapon in a conflict or to conduct atmospheric testing. Then came the comprehensive test ban treaty.
With two major nuclear power plant accidents now having occurred within twenty-five years (Chernobyl 1986, Fukushima 2011), many people see an emerging taboo on nuclear power. Much of the public has the impression that these were serious accidents that were brought under control without doing too much damage, but the truth is that both of them came close to being civilization-ending nightmares. How many “final warnings” do we need? These accidents woke humanity up to the fact that a damaged nuclear reactor or spent fuel pool can be just as apocalyptic as a nuclear war. A second explosion at Chernobyl was narrowly averted, but if it had occurred, Western Europe would have been rendered uninhabitable along with much of the Soviet Union. If the spent fuel pool of Fukushima Daiichi Unit Four had collapsed, all of the Northern Hemisphere could have become an exclusion zone. And this threat has not been resolved, so it could happen yet.
Another reason to push for this taboo is that when disasters happen, they are worsened when the thousands of people who have access to vital information fail to do the right thing.
It was revealed months afterward that while the Fukushima Daiichi catastrophe was unfolding, US military flights were collecting information about the fallout patterns blowing to the northwest of the power plant. It was already scandal enough that the Japanese authorities claimed not to have data from their own systems, but it then emerged that they had had access to the American data immediately. The US gave it to them with permission to release it to the public. The Japanese authorities sat on it instead. Residents of those areas were neither informed nor evacuated for weeks, and in fact many people were evacuated into areas with fallout levels higher than the places they had fled.
Much of the commentary on this scandal was outrage that hundreds of people within the Japanese government and nuclear agencies failed to go to the media with the information. But it is also telling that none of the Americans who were privy to the information felt obliged to make it public. American authorities simply handed it over to the Japanese and respectfully left it to them to decide what to do with it. Japan's allies have generally been far too respectful of Japan's autonomy throughout this fiasco (it calls to mind the duty captured in the saying "friends don't let friends drive drunk"). But the motivations to protect the nuclear industry are international in scope. No one on the Japanese or American side did the humane thing and leaked the information to the media, or even to the numerous blogs that were becoming the only source of reliable information. Failures like this are reason enough to say humanity lacks the moral and spiritual capacity to manage nuclear energy.
