Powerful weapon of mass destruction that harnesses the energy released by nuclear fission (the splitting of an atom's nucleus). The result is an explosion of tremendous force.
In December 1938, German chemist Otto Hahn, working with Fritz Strassmann, became the first to demonstrate nuclear fission. In his experiments, Hahn bombarded uranium atoms with slow-moving neutrons, causing the uranium nuclei to split in two and release additional neutrons. Physicists later realized that the fission of one atom could trigger the fission of surrounding atoms, producing a self-sustaining chain reaction.
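As a concrete illustration, one frequently cited fission channel can be written as follows (the barium and krypton fragments shown are only one of many possible pairs, and the energy figure is an approximation):

\[ n + {}^{235}\mathrm{U} \;\longrightarrow\; {}^{141}\mathrm{Ba} + {}^{92}\mathrm{Kr} + 3n + \approx 200\ \mathrm{MeV} \]

Because each fission consumes one neutron but releases two or three, a sufficiently large mass of uranium allows each split nucleus to trigger more than one further split, so the reaction grows exponentially.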
Pioneering theoretical physicists such as Albert Einstein, Niels Bohr, and Robert Oppenheimer realized the enormous possibilities of this research and quickly alerted the British and American governments to the German discoveries. These physicists understood that, pound for pound, nuclear fission releases millions of times more energy than a chemical explosive such as dynamite; a country able to harness such energy could produce an unimaginably powerful weapon. Many of these scientists were also European refugees who had fled to the United States because of the anti-Semitic policies of Nazi Germany. Terrified of the prospect of the Nazis possessing a workable atomic bomb, they were determined to develop one first.
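A rough back-of-the-envelope calculation, using standard textbook figures, supports that comparison. One fission releases about \(200\ \mathrm{MeV} \approx 3.2\times10^{-11}\ \mathrm{J}\), and a kilogram of U-235 contains roughly \(2.6\times10^{24}\) atoms, so complete fission yields

\[ E \approx (3.2\times10^{-11}\ \mathrm{J}) \times (2.6\times10^{24}) \approx 8\times10^{13}\ \mathrm{J/kg}, \]

versus roughly \(4\times10^{6}\ \mathrm{J/kg}\) for dynamite or TNT, a ratio on the order of twenty million to one.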
By 1941, with Great Britain struggling to fight off the Nazis and prospects for Allied victory dim, the development and production of the atomic bomb was turned over to the Americans. President Franklin D. Roosevelt approved a feasibility study in October 1941, and the project to build the bomb—code-named the Manhattan Project—was finally launched in December of that year. The scientists of the Manhattan Project produced highly enriched uranium, rich in the fissile isotope U-235, for use in a working nuclear bomb. They also tested predictions about the suitability of the newly discovered element plutonium for use in an atomic bomb.
The motivation behind the American and British work on the atomic bomb was to develop it before the Nazis did, and in that race the Allies prevailed. In the end, however, Germany surrendered before the Allies had successfully tested their nuclear device. That first test, code-named Trinity, took place in the New Mexico desert on July 16, 1945. Originally intended for use against Germany, the bomb was ultimately used against Japan.
The first atomic weapon ever used against a live target was the enriched-uranium bomb named “Little Boy.” It was dropped on Hiroshima, Japan, on August 6, 1945, from a B-29 bomber named Enola Gay. The second (and, so far, last) atomic bomb dropped in anger was “Fat Man,” which used plutonium as its fissile material. Three days after Little Boy destroyed Hiroshima, the bomber Bockscar delivered Fat Man to the city of Nagasaki. Each atomic explosion incinerated everything within a half-mile radius, set off fires in the area immediately outside that zone, and produced radioactive fallout. Together, the bombs killed about 150,000 people immediately, and many thousands more died of wounds and radiation sickness in the following weeks and months. The second atomic attack convinced the reluctant Japanese leadership to surrender, bringing an end to World War II.
Although the atomic bomb ended World War II, it did not, as the scientists who created it had hoped, demonstrate the necessity of ending all war. Instead, the dawn of the nuclear era rewrote the rules of warfare. Firebombing, the deadliest form of bombardment before the atomic bomb, had been used extensively against civilians during World War II, but never before had a single weapon killed tens of thousands of people at once. The United States had unleashed an unprecedented destructive force, one that made it the most powerful nation in the world and caused its friends, rivals, and enemies alike to recalculate the possibilities for future relations.
The atomic bomb also created new kinds of conflict with a new level of consequences—the potential destruction of all or part of a civilization. This understanding shaped the Cold War that immediately followed World War II. The development of the atomic bomb also helped spur an arms race between the United States and the Soviet Union, the two superpowers to emerge from the war. America continued to refine its nuclear technology, and the Soviets tested their first atomic bomb in August 1949.
The development of the Soviet bomb changed the rules of international conflict yet again. All countries, whether allied with or opposed to one of the superpowers, were now indirectly threatened with annihilation, and none more so than the superpowers themselves. This realization led to the doctrine of mutually assured destruction (MAD): if one superpower launched a nuclear attack on the other, the victim would retain the ability to deliver a devastating counterattack, so neither side could strike first without destroying itself. During the Cuban Missile Crisis of 1962, the two superpowers came dangerously close to putting that doctrine to the test. The close call affirmed the atomic bomb’s position as the ultimate weapon of deterrence.
In an attempt to slow the arms race during the 1970s, the United States and the Soviet Union entered into the Strategic Arms Limitation Talks (SALT I and SALT II). These bilateral talks produced agreements that set limits on the manufacture of atomic weapons. During the 1980s, the concept of a “nuclear freeze”—a mutual halt to atomic bomb production—became popular among antiwar advocates in the United States. President Ronald Reagan, however, initially favored developing the capability to intercept nuclear attacks through the Strategic Defense Initiative, popularly known as Star Wars. Later, Reagan also entered into arms reduction negotiations, opening the Strategic Arms Reduction Talks (START), which eventually produced the START I and START II treaties.
Today, the United States and Russia are two of nine countries believed to possess nuclear weapons. The other declared nuclear powers are Great Britain, France, China, India, Pakistan, and, most recently, North Korea. Israel, although widely believed to possess nuclear weapons, refuses to declare itself a nuclear power. Iran and Libya are also widely suspected of seeking nuclear capability.
By contrast, some countries have given up their nuclear arsenals entirely. Belarus, Ukraine, and Kazakhstan, for example, all rid themselves of the warheads left on their soil after the collapse of the Soviet Union in 1991. South Africa also developed nuclear weapons but later voluntarily dismantled them and signed the Nuclear Non-Proliferation Treaty.
The renunciation of nuclear arms arises partly from practical considerations and partly from doubts about the morality of the bomb, doubts that began to surface immediately after the attack on Hiroshima. From a practical standpoint, nuclear weapons are expensive to maintain, and aging warheads pose a hazard for countries that lack the means to keep them secure. The United States addressed such concerns after the fall of the Soviet Union by offering former Soviet republics financial incentives to dismantle their nuclear arsenals.
Moral reservations about the atomic bomb grew out of President Truman’s controversial decision to use it against Japan. That decision has been debated both on its own merits and in light of its implications for future warfare. Some people praised Truman’s decision because it ended the war quickly and averted a land invasion that many military experts estimated would have cost more than one million American and Japanese lives. Others criticized Truman for striking a city full of civilians rather than a purely military target. People still debate the ethics of using a weapon that can not only kill tens or hundreds of thousands of people instantly but also harm the next generation through radiation-induced birth defects. These debates are likely to continue as long as nuclear weapons remain in the world’s arsenals.