
The Long Shadow Of The Manhattan Project Part II: Ethics In The Atomic Age


Last week, we explored the Manhattan Project’s scientific legacy: new opportunities for energy, medicine, and spaceflight, and the beginnings of massive, government-funded science. But hundreds of thousands have suffered directly from the fallout of that ingenious, destructive science. With a modern world still struggling to be a responsible custodian of atomic forces, this week we examine the ethical legacy of the Manhattan Project.

The Original Scientists: Responsibility, Relief, and Regret

The details of the destruction of Hiroshima and Nagasaki are stomach-turning: people incinerated, leaving shadows burned into concrete; victims fleeing, staggering for miles, vomiting, skin hanging off faces and hands. And later, the second wave: leukemia, tumors, cataracts. Was this horrific cost in human life worth it, even to end a war? Many thought so, including scientists who helped build the bomb.

University of Pennsylvania physicist Gino Segre remembers that his uncle, Manhattan Project scientist Emilio Segre, was convinced that making the atomic bomb was justified.

“He was pretty clear about it, that he felt it was the thing to do,” Segre says. The senior Segre and many of his fellow Manhattan Project scientists “were all refugees from Europe, and very eager to participate in the war effort. As far as I know, no one had any hesitations; whatever hesitations they had came later.”

Even when scientists felt justified in their work, the realities of war could create an ethical tangle, as Harvard University theoretical physicist Roy Glauber found out.

Glauber had just completed his sophomore year of college when he was recruited to work on the bomb. “I did have qualms,” Glauber says. “But [the Manhattan Project work] was against the background of enormous bombing raids in Europe, several vast raids in Japan; raids ultimately as destructive as what we had in mind for the atomic bomb.”

The figures do show that the Allies hardly needed the Manhattan Project to wreak havoc. The bombs dropped on Hiroshima and Nagasaki are thought to have killed around 185,000 people; the tally of deaths from the firebombing of Dresden is still disputed, but is thought to lie somewhere between 25,000 and 100,000. The firebombing raids on Tokyo are often cited as causing approximately 100,000 deaths (though these numbers, too, are disputed).

Was dropping atomic bombs on Japan really necessary to end the war? The debate may never end. The earliest version of the story told in the West was that the bombings of Hiroshima and Nagasaki were the final straw for Japan, averting what would have been a costly ground invasion by the Allies. In the 1960s, another group of historians began arguing that Japan was already about to surrender before the bombs were dropped, and that the mushroom clouds were instead meant by U.S. President Harry S. Truman to intimidate America’s allies of convenience in the U.S.S.R. Tsuyoshi Hasegawa, a University of California, Santa Barbara historian, offers a third explanation: It was the Soviets’ declaration of war that prompted Japan to bend, and surrendering to America was the best way for the nation to hold onto its lands and maintain the position of the Imperial family.

Whether or not the bomb was a deciding factor in the end of World War II, the sheer unprecedented power of the bomb planted doubts in minds at the highest levels of the Manhattan Project. J. Robert Oppenheimer, for one, said that Hiroshima and Nagasaki did not weigh on his conscience personally, but that he did sense a sea change in science, and the world at large. “In some sort of crude sense which no vulgarity, no humor, no overstatements can quite extinguish,” Oppenheimer said in a 1947 lecture, “the physicists have known sin; and this is a knowledge which they cannot lose.”

Fallout Today: An Environmental Assessment

The story of the Atomic Age is written in the very air. Atmospheric atomic bomb testing from 1955 onward roughly doubled the amount of carbon-14, a radioactive isotope of carbon, in the atmosphere by the early 1960s. Every tree that was alive in 1954—from the pine trees in Finland to the kapok trees growing on the banks of the Amazon—carries a radioactive “spike” of this isotope, a biological mark on the calendar commemorating the nuclear bomb. Doctors have found similar traces of bomb-derived carbon-14 in people who were alive during the nuclear testing era. Bomb-derived carbon-14 can also help expose ivory poaching, by dating horns and tusks to show whether they came from animals killed after recent endangered species protections took effect. But even chemistry has a short memory. With atmospheric testing largely halted after the 1963 Partial Test Ban Treaty, carbon-14 levels are drifting back toward the pre-atomic baseline.
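To make that fading signal concrete, here is a minimal back-of-the-envelope sketch of how bomb-pulse dating works in principle. The numbers are assumptions, not figures from this article: it takes the carbon-14 excess to have peaked at roughly 100 percent above the pre-bomb baseline around 1964 and to have relaxed back toward baseline roughly exponentially, with an assumed e-folding time of about 16 years. Real forensic work, such as dating ivory, relies on measured atmospheric calibration curves rather than a toy formula like this.

```python
import math

# Toy model of the atmospheric "bomb pulse" of carbon-14.
# Assumptions (not from the article): the excess peaked at roughly +100%
# of the pre-bomb baseline around 1964 and has relaxed back toward baseline
# roughly exponentially, with an assumed e-folding time of about 16 years
# as the extra carbon mixes into the oceans and biosphere.
PEAK_YEAR = 1964
PEAK_EXCESS = 1.0        # +100% above the pre-atomic baseline
E_FOLDING_TIME = 16.0    # years (assumed; real curves are measured)

def bomb_c14_excess(year):
    """Estimated fractional carbon-14 excess above baseline for a given year."""
    if year < PEAK_YEAR:
        raise ValueError("toy model only covers the decline after the peak")
    return PEAK_EXCESS * math.exp(-(year - PEAK_YEAR) / E_FOLDING_TIME)

def estimate_formation_year(measured_excess):
    """Invert the toy curve: given a measured excess (e.g., in a tusk),
    estimate the year the tissue formed (post-peak branch only)."""
    return PEAK_YEAR + E_FOLDING_TIME * math.log(PEAK_EXCESS / measured_excess)

if __name__ == "__main__":
    for year in (1964, 1980, 2000, 2020):
        print(year, f"{bomb_c14_excess(year):+.0%} above baseline")
    # A tusk measured at ~10% excess would date, in this toy model, to around 2001.
    print("estimated formation year:", round(estimate_formation_year(0.10)))
```

Running the sketch shows the excess dropping from roughly a third above baseline in 1980 to only a few percent today, which is why the window for this kind of forensic dating is slowly closing.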

Many areas touched by the bomb are not as cordoned off as you might expect. Hiroshima and Nagasaki are thriving population centers today, and radiation levels there are now on par with normal background levels elsewhere in the world. Scientists have yet to find large-scale disease effects directly attributable to radiation in the children of atomic bomb survivors at those two sites.

The sites of nuclear power plant accidents are not recovering as quickly. While people can live in Hiroshima and Nagasaki today, Ukrainian officials still restrict access to a 1,000-square-mile exclusion zone around the Chernobyl plant, where disaster struck in 1986. Part of the difference is quantity: Fat Man and Little Boy combined contained just 70 kilograms (154 pounds) of nuclear material; the Chernobyl reactor damaged in the accident held about 160 tons of fuel.

Bikini Atoll, where 23 nuclear weapon tests were conducted over 12 years in the 1940s and 50s, is still considered unsafe as well. In the 1990s, the International Atomic Energy Agency recommended that the islands not be resettled, since eating native plants and other locally produced food would expose a person to an effective radiation dose of about 15 millisieverts per year—five times the normal background dose most people receive, but still below the maximum annual dose of 50 millisieverts allowed for U.S. radiation workers. But levels of radioactive cesium-137 are falling faster than expected, thanks in part to remediation efforts, and it may eventually be possible for the atoll’s original inhabitants to return home.
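For readers keeping track of the units, here is a quick sanity check on that comparison. The background figure of about 3 millisieverts per year is an assumption (it is what the “five times normal background” framing implies, and is close to commonly cited averages for natural background exposure); the 15 and 50 millisievert figures come from the paragraph above.

```python
# Rough dose comparison implied by the paragraph above.
BIKINI_LOCAL_DIET_MSV_PER_YEAR = 15.0   # IAEA estimate cited in the article
BACKGROUND_MSV_PER_YEAR = 3.0           # assumed typical natural background
US_WORKER_LIMIT_MSV_PER_YEAR = 50.0     # annual occupational limit cited above

ratio_to_background = BIKINI_LOCAL_DIET_MSV_PER_YEAR / BACKGROUND_MSV_PER_YEAR
fraction_of_worker_limit = BIKINI_LOCAL_DIET_MSV_PER_YEAR / US_WORKER_LIMIT_MSV_PER_YEAR

print(f"{ratio_to_background:.0f}x typical background")        # 5x
print(f"{fraction_of_worker_limit:.0%} of the worker limit")   # 30%
```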

“Conditions have really changed on Bikini,” Terry Hamilton, a researcher at Lawrence Livermore National Laboratory, told Outside Magazine recently. “They are improving at an accelerated rate. By using the combined option of removing soil and adding potassium, we can get very close to the 15 millirem standard. That has been true for roughly the past 10 years.”

Where mankind is failing, says Edwin Lyman, a nuclear policy expert at the Union of Concerned Scientists, is in creating a plan to handle the radioactive waste from nuclear power plants and weapons manufacturing. There are tens of millions of gallons of liquid radioactive waste that governments are still trying to stabilize, and the storage pools at plants are holding more spent fuel than they were originally designed to hold, Lyman says. There’s a good chance nuclear waste will outlive humanity—meaning, among other things, we’ll need to find a reasonably permanent, non-linguistic way to warn our potential successors 200,000 years from now.

Ultimately, there is no safety guarantee for nuclear power, according to Lyman. “What level of nuclear risk are you willing to accept—and who gets to make that decision?” he asks. “The public doesn’t really get to weigh in; the decisions are really made by a small group of regulators, industry people, and in some cases Congress.”

The Ethics of Science in Modern Warfare

As science advances and continues its complicated relationship with government, the potential for ethically murky projects remains. Nuclear fission is hardly the only Pandora’s box that scientists have opened: While no nuclear weapons have been used in warfare since Hiroshima and Nagasaki, mankind is still inventing new and ingenious ways to kill people, and scientists are often called into this service, either directly or when their original work is adapted for military ends. Cyberwarfare, for example, can have consequences beyond the digital realm; hackers may be able to shut down hospital computer systems, power grids, and civil services. Biological weapons can slip invisibly through population centers.

University of California, Santa Barbara physicist Andy Howell says he can sympathize with the scientists working on the Manhattan Project, since they faced an existential threat with an uncertain outcome. But at the very least, he says, the U.S. might have staged a demonstration of the weapon and invited the world to watch.

“I am not so presumptuous to think I’d ever have been asked to work on the Manhattan Project,” says Howell.  “But if I had, I am sure I would have regretted it. Once the result is in the hands of politicians and generals, the scientists have lost control. Then the weapons can be used against anyone, not just the arguably justified first intended purpose.”

Segre says if he were in his uncle’s place back in 1942, he probably would have made the same choice to work on the project. But “if they had asked me to work on weapons during the Vietnam War, I would have said no,” Segre says. “I don’t think it was a just war. More recently, if you had asked me to work on weapons in the Iraq War, weapons of potential mass destruction, I think I would have said no. I’m glad nobody asked me.”

