A Survey of the United States’ Modern Virtual Nuclear Weapons Testing Program

In 1992 the United States conducted its last explosive nuclear weapons test and entered a testing moratorium, a policy later formalized by its 1996 signing of the Comprehensive Nuclear-Test-Ban Treaty. Since then, the United States has not performed a nuclear weapons test in the atmosphere, underwater, or underground; the treaty, however, does not prohibit non-explosive experiments or research into nuclear physics. The United States has therefore pursued methods of testing nuclear weapons that do not require explosive detonations. Modernizing the U.S. nuclear arsenal has meant adopting computational models to simulate virtual nuclear weapons, their effects, and the maintenance and updating of the arsenal itself. Here we survey the nuclear laboratories and the computational technology that secure and maintain the nuclear arsenal, discuss the computational methods used to simulate the effects of nuclear weapons, introduce the Stockpile Stewardship Program, summarize experiments performed in support of the stockpile, and review the mixed legacy of virtual testing and its future within the U.S. nuclear weapons complex.

Three primary laboratories oversee the nuclear testing facilities in the United States. Operating under the National Nuclear Security Administration (NNSA), the Lawrence Livermore, Los Alamos, and Sandia national laboratories maintain and secure the United States’ nuclear arsenal. Each lab manages an Advanced Simulation and Computing (ASC) program sponsored by the NNSA to participate in, and advance, the nuclear arsenal’s safety and capabilities. Using high-performance computers, the collective ASC program aims to improve confidence in nuclear weapon predictions through simulation, to develop quantifiable bounds on the uncertainty of computational results, and to further increase predictive capability by combining simulation with experimental activities at the nuclear testing facilities. Developing these models is of the utmost importance for the United States, which has chosen never to conduct an explosive nuclear weapons test again.
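To make the idea of quantifiable uncertainty bounds concrete, the sketch below propagates uncertain inputs through a toy surrogate model with Monte Carlo sampling and reports a 95% interval on the output. The model, parameter values, and input distributions are hypothetical placeholders chosen only for illustration; they are not an actual ASC code or real weapons data.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def surrogate_yield(density, temperature):
    """Hypothetical stand-in for an expensive physics simulation.

    Real ASC codes solve coupled multiphysics problems; this toy
    algebraic model just keeps the example self-contained.
    """
    return 0.5 * density**1.5 * np.sqrt(temperature)

# Assumed input uncertainties (illustrative values only).
n_samples = 100_000
density = rng.normal(loc=18.0, scale=0.2, size=n_samples)       # g/cm^3
temperature = rng.normal(loc=300.0, scale=5.0, size=n_samples)  # K

outputs = surrogate_yield(density, temperature)

# Quantifiable bounds: a 95% interval from the sampled output distribution.
lo, hi = np.percentile(outputs, [2.5, 97.5])
print(f"mean = {outputs.mean():.2f}, 95% interval = [{lo:.2f}, {hi:.2f}]")
```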

To achieve these results, several supercomputers are operated within the labs: two at Los Alamos, two at Sandia, and one at Lawrence Livermore. These machines can perform over one thousand trillion (10^15) operations per second, i.e., beyond a petaFLOPS, providing the immense computational power required to simulate physical systems as complex as a nuclear explosion. We give a short overview of each lab’s primary computing system before discussing in depth the past, present, and future of the Stockpile Stewardship Program.
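To see why petascale machines are needed, here is a back-of-envelope estimate of the cost of a large three-dimensional simulation. Every figure in it (mesh size, per-cell cost, timestep count, sustained efficiency) is assumed purely for illustration.

```python
# Back-of-envelope cost of a 3D simulation (all figures assumed for illustration).
cells = 1000**3            # a billion-cell 3D mesh
flops_per_cell_step = 1e4  # assumed cost of one cell update (physics + solvers)
timesteps = 1e5            # assumed number of timesteps

total_flops = cells * flops_per_cell_step * timesteps

machine_flops = 1e15  # ~1 petaFLOPS: "one thousand trillion" operations/second
efficiency = 0.1      # assumed fraction of peak actually sustained

seconds = total_flops / (machine_flops * efficiency)
print(f"total work: {total_flops:.1e} FLOPs")
print(f"wall-clock at 10% of 1 PFLOPS: {seconds / 3600:.1f} hours")
```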

Lawrence Livermore

Sierra is the upgraded supercomputer at Lawrence Livermore National Laboratory, replacing the Sequoia machine previously maintained there. Sierra is projected to deliver up to six times the performance of its predecessor. It was funded through the NNSA and will be used to help fulfill the Stockpile Stewardship Program’s mission of averting a return to underground nuclear testing.

This computer will support the NNSA’s ASC program and, in addition to simulating the current nuclear arsenal’s capabilities, is planned to provide computational results in several key scientific areas (a brief illustrative sketch follows the list):

  • Materials modeling
  • Turbulent flow and instabilities
  • Laser-plasma calculations

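As a flavor of the turbulent flow and instabilities area, the sketch below evaluates the classical linear growth rate of the Rayleigh-Taylor instability, gamma = sqrt(A·g·k), where A is the Atwood number of the two fluids and k the perturbation wavenumber. The formula is a standard textbook result; the densities and wavelength chosen are arbitrary, and real simulations resolve far richer nonlinear behavior.

```python
import math

def rayleigh_taylor_growth_rate(rho_heavy, rho_light, wavelength, g=9.81):
    """Linear (inviscid, incompressible) Rayleigh-Taylor growth rate.

    Perturbations grow as exp(gamma * t) with gamma = sqrt(A * g * k),
    where A is the Atwood number and k the perturbation wavenumber.
    """
    atwood = (rho_heavy - rho_light) / (rho_heavy + rho_light)
    k = 2.0 * math.pi / wavelength
    return math.sqrt(atwood * g * k)

# Arbitrary illustrative values: dense fluid over light fluid, 1 cm perturbation.
gamma = rayleigh_taylor_growth_rate(rho_heavy=1000.0, rho_light=1.0,
                                    wavelength=0.01)
print(f"growth rate: {gamma:.1f} 1/s (e-folding time {1000 / gamma:.2f} ms)")
```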

Sandia

Sandia’s participation in the ASC program is extensive. Its main contributions center on physics and engineering models and on computational systems. These models support extensive research into the U.S. nuclear stockpile by describing the multitude of physical processes that arise during a nuclear explosion.

Like the other labs, Sandia applies predictive, science-based models to attain its results. In particular, Sandia develops the following physical models:

  • Material strength and damage
  • Radiation and electrical responses
  • Aerodynamics and vibrations
  • Thermal and fluid responses

Each of these models is shared with the other ASC-sponsored labs for use in testing and experimentation; a minimal sketch of the thermal-response category follows.
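The sketch below solves the one-dimensional heat equation with an explicit finite-difference scheme, as a minimal illustration of what a thermal-response calculation involves. The material properties, grid, and boundary conditions are assumed for illustration; Sandia’s production engineering codes are of course far more elaborate.

```python
import numpy as np

# Assumed material and grid parameters (illustrative only).
alpha = 1e-4          # thermal diffusivity, m^2/s
length, nx = 0.1, 51  # 10 cm rod, 51 grid points
dx = length / (nx - 1)
dt = 0.4 * dx**2 / alpha  # satisfies explicit stability limit dt <= dx^2/(2*alpha)

# Initial condition: rod at 300 K with one end held at 1000 K.
T = np.full(nx, 300.0)
T[0] = 1000.0

# March the explicit FTCS scheme for dT/dt = alpha * d2T/dx2.
for _ in range(2000):
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    T[-1] = T[-2]  # insulated far end (zero-gradient boundary)

print(f"midpoint temperature after {2000 * dt:.1f} s: {T[nx // 2]:.1f} K")
```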

Los Alamos

The Los Alamos National Laboratory houses two supercomputers, Cielo and Trinity, both of which provide required support for the Stockpile Stewardship Program. Cielo was developed as a joint effort between Los Alamos and Sandia and, like the other supercomputers, is used by all three labs partnered under the ASC program. It has also performed many simulations beyond the weapons mission, including one investigating asteroid mitigation. Trinity is a more recent supercomputer designed to perform similar functions.

Stockpile Stewardship Program

Clearly, each of the ASC labs possesses ample computational strength to simulate the effects of nuclear explosions and to provide the safety and security capabilities required for maintaining the aging U.S. nuclear arsenal. It is because of this aging arsenal and the need for continued testing that the United States created the Stockpile Stewardship Program (SSP) in 1995. Its mission is to make the scientific and technological advances needed to assess the nuclear weapons arsenal without relying on explosive nuclear testing.

The program has a number of specific missions regarding the safety and security of the arsenal, covering the broad range of weapon life extension, predictive modeling, plutonium science, high-energy-density science, infrastructure improvements, and high-explosive science. All of these efforts rely on the national labs’ supercomputers to model each scenario properly and provide results without actual nuclear testing. A number of facilities exist to support these experiments.

Each facility is housed at a national laboratory or security site within the country. We list some of the key facilities along with a brief description of their responsibilities:

  • The Z machine provides an investigative platform on which scientists can deepen their understanding of the properties of materials, plasma, and radiation.
  • Omega provides a similar platform, also performs fusion experiments, and is accessible to universities.
  • The Dual-Axis Radiographic Hydrodynamic Test facility (DARHT) uses two large X-ray machines to record the interiors of three-dimensional objects; in its experiments, these objects are subjected to hydrodynamic shock to simulate the implosion process of a nuclear bomb.
  • The Big Explosive Experiment Facility (BEEF) performs experiments on materials as they are driven together by high-explosive detonations.

Many such experimental facilities exist, and they draw on the computational power of the national labs’ supercomputers to help maintain the United States’ aging nuclear arsenal.
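To indicate the kind of measurement a radiographic facility such as DARHT makes, transmitted X-ray intensity through matter follows the Beer-Lambert law, I = I0·exp(-μρx). The short sketch below, with assumed attenuation values rather than real facility parameters, shows how transmission falls with the thickness of a dense material.

```python
import math

def transmission(mass_atten_coeff, density, thickness):
    """Beer-Lambert transmitted fraction I/I0 = exp(-mu * rho * x).

    mass_atten_coeff: mass attenuation coefficient mu, cm^2/g
    density:          material density rho, g/cm^3
    thickness:        path length x, cm
    """
    return math.exp(-mass_atten_coeff * density * thickness)

# Illustrative (assumed) values for a high-energy X-ray through dense metal.
mu, rho = 0.05, 18.0  # cm^2/g and g/cm^3, placeholders only
for x_cm in (1.0, 2.0, 5.0):
    print(f"{x_cm:4.1f} cm -> transmitted fraction {transmission(mu, rho, x_cm):.3f}")
```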

With the future of the United States’ arsenal in mind, it is important to recognize the faults that remain within the current program. While the maintenance and simulation efforts seek to examine and model every detail of the nuclear process, and in many cases do so successfully, the management of the missiles and nuclear material themselves has not always kept pace. Deploying the weapons has long depended on floppy disks to activate the arsenal, even as supercomputers simulate entire nuclear explosions; similar disparities exist in several areas of the U.S. arsenal. Additionally, many other countries have been less attentive in adhering to the Comprehensive Nuclear-Test-Ban Treaty. By refusing to adhere fully to the treaty, those countries can continue to develop new nuclear weapons and test them in much the way the United States did before it halted testing in the 1990s.

Overall, the United States’ virtual weapons testing program is reasonably extensive. With the goal of simulating every process of a nuclear detonation, it tackles an understandably complex and difficult problem. Avoiding the environmental damage caused by explosive nuclear testing is imperative, however, so these virtual facilities represent an important and continuing step forward in the development and testing of nuclear weapons for the United States.

 

