Survey of the Modern United States’ Virtual Nuclear Weapons Testing Program

In 1992, the United States conducted its last explosive nuclear weapons test and entered the testing moratorium that preceded negotiation of the Comprehensive Nuclear Test Ban Treaty. Since that time, the United States has not performed another nuclear weapons test in the atmosphere, underwater, or underground; however, neither the moratorium nor the treaty has prevented the proliferation of nuclear materials, and non-explosive research into nuclear physics remains permitted. Because of this, the United States has pursued methods of testing nuclear weapons that do not require explosive tests. Modernizing the United States’ nuclear arsenal has meant adopting computational models to simulate virtual nuclear weapons, their effects, and the maintenance and updating of the arsenal itself. Here we give an overview of the nuclear laboratories and the computational technology that secure and maintain the nuclear arsenal, discuss the computational methods for simulating the effects of nuclear weapons, introduce the Stockpile Stewardship Program, summarize experiments performed in support of the stockpile, and review the mixed legacy of virtual testing and its future with regard to the U.S. nuclear weapons complex.

Three primary laboratories oversee the nuclear testing facilities in the United States. Operating under the National Nuclear Security Administration (NNSA), the Lawrence Livermore, Los Alamos, and Sandia National Laboratories maintain and secure the United States’ nuclear arsenal. Each lab participates in the Advanced Simulation and Computing (ASC) program sponsored by the NNSA to advance the nuclear arsenal’s safety and capabilities. Using high-performance computers, the goal of the collective ASC program is to improve confidence in nuclear weapon predictions through simulation, to develop quantifiable bounds on the uncertainty of computational results, and to further increase predictive capability by combining simulation with experimental activities at nuclear testing facilities. Developing these models is of the utmost importance for the United States, which has chosen never again to conduct an explosive nuclear weapons test.
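
To make the idea of “quantifiable bounds on the uncertainty of computational results” concrete, the following is a minimal sketch of Monte Carlo uncertainty propagation in Python. The toy_model function and the input distributions are invented purely for illustration and have no connection to the labs’ actual simulation codes.

    # Minimal Monte Carlo uncertainty-propagation sketch (illustrative only).
    import random
    import statistics

    def toy_model(density, temperature):
        # Hypothetical stand-in for an expensive physics simulation.
        return density * temperature ** 0.5

    outputs = []
    for _ in range(10_000):
        rho = random.gauss(1.00, 0.05)    # assumed ~5% uncertainty in density
        temp = random.gauss(300.0, 10.0)  # assumed spread in temperature
        outputs.append(toy_model(rho, temp))

    mean = statistics.mean(outputs)
    sigma = statistics.stdev(outputs)
    print(f"predicted output: {mean:.2f} +/- {sigma:.2f} (1-sigma bound)")

Repeatedly sampling the uncertain inputs and reporting the spread of the outputs is what turns a single simulation result into a prediction with a defensible error bar.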

To achieve these results, multiple supercomputers are utilized within the labs: two are housed at Los Alamos, two at Sandia, and one at Lawrence Livermore. These computers can perform over one thousand trillion operations per second, providing immense computational power for simulating complex physical systems such as a nuclear explosion. We give a short overview of the primary computing system at each lab before discussing in depth the past, present, and future of the Stockpile Stewardship Program.
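
For scale, “one thousand trillion” operations per second corresponds to the petascale class of machines; counting floating-point operations (the usual benchmark), the arithmetic is simply

    1{,}000 \times 10^{12}\ \text{operations/s} = 10^{15}\ \text{operations/s} = 1\ \text{petaflop/s}.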

Lawrence Livermore

Sierra is the upgraded supercomputer used at the Lawrence Livermore National Laboratory. It replaces the Sequoia computer previously maintained there and is projected to provide up to six times the performance of its predecessor. Its funding was provided through the NNSA, and it will be used to help fulfill the Stockpile Stewardship Program’s mission of averting a return to underground nuclear testing.

Sierra will support the NNSA’s ASC program. It is planned to provide computational results in several key scientific areas:

  • Materials modeling
  • Turbulent flow and instabilities
  • Laser-plasma calculations

These areas complement Sierra’s primary task of simulating the capabilities of the current nuclear arsenal.

Sandia

Sandia’s participation in the ASC program is extensive. Its main contributions revolve around physics and engineering models and computational systems. Its models support research into the U.S. nuclear stockpile by describing the multitude of physical processes that arise during a nuclear explosion.

Like the other labs, Sandia applies predictive, science-based models to obtain its results. In particular, Sandia develops the following physical models:

  • Material strength and damage
  • Radiation and electrical responses
  • Aerodynamics and vibrations
  • Thermal and fluid responses

Each of these models is shared with the other ASC-sponsored labs for use in testing and experimentation.

Los Alamos

The Los Alamos National Laboratory houses two supercomputers: Cielo and Trinity. Both provide required support for the Stockpile Stewardship Program. Cielo was developed as a joint effort between Los Alamos and Sandia and, like the other supercomputers, is used by all three labs partnered under the ASC program. It has performed many notable simulations, including one modeling the mitigation of an asteroid threat. Trinity is a newer supercomputer designed to perform similar functions.

Stockpile Stewardship Program

Clearly, each of the ASC labs possesses ample computational strength to simulate the effects of nuclear explosions and to provide the safety and security capabilities required for maintaining the aging U.S. nuclear arsenal. It is because of this aging arsenal and the need for continued testing that, in 1995, the United States created the Stockpile Stewardship Program (SSP). Its mission is to make scientific and technological advancements that allow the nuclear weapons arsenal to be assessed without relying on explosive nuclear testing.

This program has a number of specific missions regarding the safety and security of the arsenal. It covers the broad range of weapon life extension, predictive modeling, plutonium science, high-energy-density science, infrastructure improvements, and high-explosive science. All of these efforts rely on the national labs’ supercomputers to model each scenario properly and to provide results without actual nuclear testing. Additionally, a number of facilities exist to support these experiments.

These facilities are located at national laboratories or security sites around the country. We list some of the key facilities along with a brief description of their responsibilities. First, the Z Machine provides an investigative platform on which scientists can further their understanding of the properties of materials, plasma, and radiation. Omega provides a similar platform, but also performs fusion experiments and is accessible to universities. The Dual-Axis Radiographic Hydrodynamic Test Facility (DARHT) uses two large X-ray machines to record the interiors of materials in three dimensions while experiments subject those materials to hydrodynamic shock, simulating the implosion process of a nuclear weapon. The Big Explosive Experiment Facility (BEEF) performs experiments on materials as they are driven together by high-explosive detonations. Many such experimental facilities exist, and, combined with the computational power of the national labs’ supercomputers, they make it possible to maintain the aging nuclear arsenal of the United States.

With the future of the United States’ arsenal in mind, it is important to recognize the faults that still exist within the current program. While the maintenance and simulation of the nuclear arsenal seek to examine and model every detail of the nuclear process, and are doing so successfully in many cases, the management of the missiles and nuclear materials themselves has not always kept pace. The weapons’ launch systems have continued to rely on floppy disks even as supercomputers simulate entire nuclear explosions, and similar disparities exist in several other areas of the United States’ arsenal. Additionally, many other countries have been less attentive about adhering to the Comprehensive Nuclear Test Ban Treaty. By refusing to fully adhere to the treaty, those countries can continue to develop new nuclear weapons and test them in much the same way the United States did before it halted testing in the 1990s.

In general, the United States’ virtual weapons testing program is extensive. Simulating every process of a nuclear detonation is an understandably complex and difficult problem to solve. Avoiding the environmental damage caused by classic nuclear testing is imperative, though, so these virtual facilities represent an important and continuing step forward in the development and testing of nuclear weapons for the United States.

Enrico Fermi

Author: Matthew Rhea

Pre-War

Before his involvement with the Manhattan Project, Enrico Fermi spent his early years studying physics at the University of Pisa in Italy. It was during this period, and in his subsequent professorship beginning in 1927 at the University of Rome, that he developed a deep interest in nuclear and particle physics.

Before the professorship, he discovered the statistical laws now known as Fermi statistics. These laws govern the behavior of particles subject to the Pauli exclusion principle, which states that no two identical fermions can occupy the same quantum state in the same system simultaneously. He remained a professor in Rome until 1938 when, after receiving the Nobel Prize in Physics “for his demonstrations of the existence of new radioactive elements produced by neutron irradiation, and for his related discovery of nuclear reactions brought about by slow neutrons” [1], Fermi left Italy for America.
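
For reference, a standard statement of Fermi statistics (the Fermi-Dirac distribution) gives the mean occupation of a single-particle state of energy \epsilon at temperature T and chemical potential \mu as

    \bar{n}(\epsilon) = \frac{1}{e^{(\epsilon - \mu)/k_B T} + 1},

which never exceeds one, in keeping with the exclusion principle.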

Fermi first arrived in New York, where he became a professor at Columbia University in 1939. There he continued his work on nuclear fission with support from the Uranium Committee and, later, its successor, the National Defense Research Committee, both agencies of the United States government. With help from the Columbia University football team, Fermi worked on building chain-reaction piles of graphite and uranium. The piles were named quite literally: the scientists and football players stacked alternating layers of uranium and graphite. The construction of the piles is discussed further below with the most prominent of them, Chicago Pile-1.

During the War

At the peak of World War II in 1942, Fermi moved to Chicago to work within the Chicago Metallurgical Laboratory. There he again recruited football players to help build Chicago Pile-1, which achieved the first self-sustaining, controlled nuclear reaction on December 2, 1942. Layers of graphite bricks were constructed on top of a wood frame. Many of the graphite bricks had holes drilled into them to make space for uranium to be placed inside, and each uranium-filled brick had neighboring “dead uranium” [2] bricks. To control the nuclear reaction, some of the graphite bricks had much longer channels drilled through them (upwards of fourteen feet) to hold cadmium, which, when removed from the pile, would allow the reaction to go critical.
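
A simple way to see the role of the cadmium, using the textbook neutron multiplication factor k rather than anything specific to the pile’s actual design: each generation of fissions multiplies the neutron population by k, so after n generations

    N_n = k^n N_0 ,

and the reaction dies out for k < 1 (subcritical), sustains itself for k = 1 (critical), and grows for k > 1 (supercritical). Cadmium absorbs neutrons and lowers k; withdrawing it from the pile raises k toward, and just past, one, allowing the chain reaction to sustain itself.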

Chicago Pile-1 served as the model for subsequent reactors such as the X-10 Graphite Reactor and the B Reactor at Hanford. Following its success, Fermi was recruited by J. Robert Oppenheimer for the Manhattan Project in 1944 and became an associate director of the laboratory at Los Alamos. He headed F Division, overseeing much of the theoretical and nuclear physics at the lab. The division contained four branches, F-1, F-2, F-3, and F-4, and Fermi had a part in all of them: F-1 worked on the thermonuclear (“Super”) bomb, F-2 ran the aqueous homogeneous research reactor, F-3 handled Super experimentation, and F-4 studied nuclear fission [3].

Post War

After the war and his work at Los Alamos, Fermi became a professor at the Institute for Nuclear Studies of the University of Chicago in 1946, a position he held until his death in 1954. During his time there he focused on high-energy physics.

Citations

[1] https://www.nobelprize.org/nobel_prizes/physics/laureates/1938/

[2] https://www.atomicheritage.org/history/chicago-pile-1

[3] https://en.wikipedia.org/wiki/Enrico_Fermi#Manhattan_Project

[4] https://en.wikipedia.org/wiki/Pauli_exclusion_principle

[5] https://www.atomicheritage.org/article/manhattan-project-spotlight-enrico-fermi

[6] https://www.nobelprize.org/nobel_prizes/physics/laureates/1938/fermi-bio.html