The Arms Buildup, the Space Race, and Technological Advancement

Harnessing years of discoveries in nuclear physics, the work of hundreds of world-class scientists, and $2 billion in research funds, the Manhattan Project created atomic weapons during World War II. The first nuclear explosive device, “Trinity,” was detonated in the New Mexico desert on July 16, 1945, with a destructive power equivalent to roughly 20,000 tons of TNT. Choking back tears, physicist J. Robert Oppenheimer would remember the experience by quoting Hindu scripture: “Now I am become Death, the destroyer of worlds.” The director of the Trinity test was plainer: “Now, we’re all sons of bitches.”

The world soon saw what nuclear weapons could do. In August 1945, two bombs leveled the Japanese cities of Hiroshima and Nagasaki and killed perhaps 180,000 people. The world was never the same.

The Soviets accelerated their research in the wake of Hiroshima and Nagasaki, aided in no small part by spies such as Klaus Fuchs, who had stolen nuclear secrets from the Manhattan Project. Soviet scientists successfully tested an atomic bomb on August 29, 1949, years before American officials had estimated they would. This unexpectedly quick Soviet success not only caught the United States off guard but also caused alarm across the Western world and propelled a nuclear “arms race” between the United States and the USSR.

The United States detonated the first thermonuclear weapon, or hydrogen bomb (which used fusion reactions of theoretically limitless power), on November 1, 1952. The blast measured over 10 megatons and generated an inferno five miles wide with a mushroom cloud 25 miles high and 100 miles across. The irradiated debris—fallout—from the blast circled the Earth, occasioning international alarm about the effects of nuclear testing on human health and the environment. The test only hastened the arms race, with each side developing increasingly advanced warheads and delivery systems. The USSR successfully tested a hydrogen bomb in 1953, and soon thereafter Eisenhower announced a policy of “massive retaliation.” The US would henceforth respond to threats or acts of aggression with perhaps its entire nuclear might. Both sides, then, would theoretically be deterred from starting a war through the logic of “mutually assured destruction” (MAD). Oppenheimer likened the state of “nuclear deterrence” between the US and the USSR to “two scorpions in a bottle, each capable of killing the other,” but only by risking their own lives.

The mushroom cloud of an atomic explosion rising above the clouds.
In response to the Soviet Union’s test of a pseudo-hydrogen bomb in 1953, the United States conducted Castle Bravo, the first U.S. test of a dry-fuel hydrogen bomb. Detonated on March 1, 1954, it was the most powerful nuclear device ever tested by the U.S. But the effects were more gruesome than expected, causing nuclear fallout and radiation poisoning on nearby Pacific islands. Photograph, March 1, 1954. Wikimedia.

Fears of nuclear war produced a veritable atomic culture. Films such as Godzilla, On the Beach, Fail-Safe, and Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb plumbed the depths of American anxieties with plots featuring radioactive monsters, nuclear accidents, and doomsday scenarios. Anti-nuclear protests in the United States and abroad warned against the perils of nuclear testing and highlighted the likelihood that a thermonuclear war would unleash a global environmental catastrophe. Yet at the same time, peaceful nuclear technologies, such as fission- and fusion-based energy, seemed to herald a utopia of power that would be clean, safe, and “too cheap to meter.” In 1953, Eisenhower proclaimed at the UN that the US would share the knowledge and means for other countries to use atomic power. Henceforth, “the miraculous inventiveness of man shall not be dedicated to his death, but consecrated to his life.” The “Atoms for Peace” speech brought about the establishment of the International Atomic Energy Agency (IAEA), along with worldwide investment in this new economic sector.

As Germany fell at the close of World War II, the United States and the Soviet Union each sought to acquire elements of the Nazis’ V-2 superweapon program. A devastating rocket that had terrorized England, the V-2 was capable of delivering its explosive payload up to a distance of nearly 600 miles, and both nations sought to capture the scientists, designs, and manufacturing equipment that made it work. A former top German rocket scientist, Wernher von Braun, became the leader of the American space program; the Soviet Union’s program was secretly managed by former prisoner Sergei Korolev. After the end of the war, American and Soviet rocket engineering teams worked to adapt German technology in order to create an intercontinental ballistic missile (ICBM). The Soviets achieved success first. They even used the same launch vehicle on October 4, 1957, to send Sputnik 1, the world’s first human-made satellite, into orbit. It was a decisive Soviet propaganda victory.

In response, the US government rushed to perfect its own ICBM technology and launch its own satellites and astronauts into space. In 1958, the National Aeronautics and Space Administration (NASA) was created as a successor to the National Advisory Committee for Aeronautics (NACA). Initial American attempts to launch a satellite into orbit using the Vanguard rocket suffered spectacular failures, heightening fears of Soviet domination in space. While the American space program floundered, on September 13, 1959, the Soviet Union’s Luna 2 probe became the first human-made object to touch the moon. The “race for survival,” as it was called by the New York Times, reached a new level. The Soviet Union successfully launched a pair of dogs (Belka and Strelka) into orbit and returned them to Earth while the American Mercury program languished behind schedule. Despite countless failures and one massive accident that killed nearly one hundred Soviet military and rocket engineers, Soviet cosmonaut Yuri Gagarin was launched into orbit on April 12, 1961. Astronaut Alan Shepard accomplished a suborbital flight in the Freedom 7 capsule on May 5. President John F. Kennedy would use America’s losses in the “space race” to bolster funding for a moon landing.

While outer space captivated the world’s imagination, the Cold War still captured its anxieties. The ever-escalating arms race continued to foster panic. In the early 1950s, the Federal Civil Defense Administration (FCDA) began preparing citizens for the worst. Schoolchildren were instructed, via a film featuring Bert the Turtle, to “duck and cover” beneath their desks in the event of a thermonuclear war.

Although it took a back seat to space travel and nuclear weapons, the advent of modern computing was yet another major Cold War scientific innovation, the effects of which were only just beginning to be understood. In 1958, following the humiliation of the Sputnik launches, Eisenhower authorized the creation of the Advanced Research Projects Agency (ARPA), housed within the Department of Defense (and later renamed DARPA). As a secretive military research and development operation, ARPA was tasked with funding and otherwise overseeing the production of sensitive new technologies. Soon, in cooperation with university-based computer engineers, ARPA would develop the world’s first system of network “packet switching,” and computer networks would begin connecting to one another.

License

United States History II Copyright © by Lumen Learning is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
