Senior leaders in the US Department of Defense, as well as nuclear strategists and academics, have argued that the advent of nuclear weapons is associated with a dramatic decrease in wartime fatalities. This assessment is often supported by an evolving series of figures that show a marked drop in wartime fatalities as a percentage of world population after 1945 to levels well below those of the prior centuries. The goal of this report is not to ascertain whether nuclear weapons are associated with or have led to a decrease in wartime fatalities, but rather to critique the supporting statistical evidence. We assess these wartime fatality figures and find that they are both irreproducible and misleading. We perform a more rigorous and traceable analysis and discover that post-1945 wartime fatalities as a percentage of world population are consistent with those of many other historical periods.
First published in Statistics and Public Policy, Volume 9, Issue 1
The United States has been a party to numerous treaties on nuclear weapons, dating back to the 1960s. These treaties fall into two general categories: treaties that constrain activities (e.g., nuclear testing, placing nuclear weapons in outer space, and nuclear proliferation) and treaties that constrain the number and nature of weapons that the parties can possess. All nine treaties limiting the size and nature of nuclear arsenals (including one treaty that limited missile defense) have been bilateral agreements between the United States and Russia (or the Soviet Union before 1992). During negotiations for strategic arms-control agreements, the key US objectives have been to sustain stable strategic nuclear deterrence and to reduce unnecessary and costly arms races. This report describes all nine of these treaties, with particular focus on the New Strategic Arms Reduction Treaty (New START)—the only such treaty that is still in effect. Further, this study analyzes how well these treaties kept up with emerging technology and the security environment of their times, and how well they met the goals just listed. This report then draws lessons from earlier treaties and developments of the last decade to provide considerations for the United States to account for when negotiating whatever treaty follows New START. Finally, many earlier arms-control treaties between the United States and Russia took from two and a half to seven years to negotiate, exclusive of preparatory work to initiate negotiations. The expiration date for New START is February 2026, so the time to begin thinking about arms control beyond New START is now.
While careful analysis of the likelihood and consequences of the failure of nuclear deterrence is not usually undertaken in formulating national security strategy, general perception of the risk of nuclear war has a strong influence on the broad directions of national policy. For example, arguments for both national missile defenses and deep reductions in nuclear forces depend in no small part on judgments that deterrence is unreliable. However, such judgments are usually based on intuition, rather than on a synthesis of the most appropriate analytic methods that can be brought to bear. This work attempts to establish a methodological basis for more rigorously addressing the question: What is the risk of nuclear war? Our goals are to clarify the extent to which this is a researchable question and to explore promising analytic approaches. We focus on four complementary approaches to likelihood assessment: historical case study, elicitation of expert knowledge, probabilistic risk assessment, and the application of complex systems theory. We also evaluate the state of knowledge for assessing both the physical and intangible consequences of nuclear weapons use. Finally, we address the challenge of integrating knowledge derived from such disparate approaches.
What are the risks of nuclear war in all its potential manifestations? This is not an easy question to answer, and I do not propose to answer it here. Rather, the more tractable question is whether the process of studying it could yield policy-relevant insights even if it is unlikely to lead to a precise determination of the actual risks of nuclear weapons use. In this chapter, I summarize the current state of analysis regarding the likelihood of nuclear war, focusing on The Lugar Survey on Proliferation Threats and Responses, the Bulletin of the Atomic Scientists’ Doomsday Clock, and a sampling of analysts’ estimations of the likelihood of interstate nuclear war and nuclear terrorism. These estimations differ widely and are all of questionable validity because they are either fundamentally intuitive or based on very simple—even simplistic—analyses. Can we improve on this state of analysis by using more structured and more comprehensive approaches to provide a sounder basis for policies that will inevitably be based on imperfect analyses of the likelihood of nuclear war?
Case studies are useful in analyzing infrequent events because they can assess “close calls” in which such events could have occurred, as well as those instances in which they actually occurred. Nuclear weapons have been used twice, but there have been many more close calls. This chapter outlines an agenda for using case studies to assess the risks of nuclear weapons use. First, it identifies twelve cases in which leaders used, seriously contemplated using, or might have considered using nuclear weapons. Second, it notes thirteen cases of close calls of accidental or unauthorized detonation of a nuclear weapon. Third, it assesses three possible paths toward the use of nuclear weapons by non-state actors, none of which as yet has had any known close-call incidents. The chapter then briefly assesses how the historical risks of nuclear weapons use might change as the world evolves toward a larger number of nuclear weapons states. Finally, the chapter develops policy-relevant questions on the risks of nuclear weapons use that can be addressed through case studies, including the behavior of new nuclear weapons states, the likelihood of nuclear weapons use by field commanders versus that by national command authorities, the safety trade-offs of dispersed versus centralized nuclear weapons sites, and the differences between contemporaneous and historical evaluations of nuclear risks. These contributions are unlikely to lead to clear point estimates of nuclear risks, but they may help identify which paths toward possible nuclear weapons use deserve more attention and how risks on these paths can be reduced.
Jane M. Booker
Every decision and problem solution involves the use of knowledge gained from the experiences and thought processes of humans. Even for data-rich problems, humans influence how data are gathered, interpreted, modeled, and analyzed. For data-poor problems, such as those assessing risks of never-seen, rare, or one-of-a-kind events, knowledge from experts may be the sole available source of information. Assessing the risk of nuclear deterrence failure is an ill-posed problem that falls into the data-poor category. As a result, experts are needed (1) to supply the information and knowledge for the risk assessment and (2) to define and structure the deterrence problem. These two uses of elicited expert knowledge are discussed. For both, formal elicitation methods for bias minimization are recommended and briefly described. Formal elicitation also involves planning and the use of methods for obtaining the best-quality information from the experts’ thinking and problem solving. This formalism includes the characterization of uncertainties, which are prevalent in the deterrence problem, and the analysis of the elicited information, which is necessary for assessing the likelihood and consequence constituents of risk.
Martin E. Hellman
Probabilistic risk assessment (PRA) can provide a quantitative estimate of risk for catastrophes that have not yet occurred by analyzing sequences of events that can lead to such a catastrophe—in our case, a major nuclear war. PRA is also useful for reducing that risk by identifying potential paths to nuclear weapons use that otherwise might escape attention. While PRA has been embraced in nuclear power, spaceflight, and other engineering fields, there are significant challenges to transferring that experience to the risk of nuclear deterrence failing. In-depth PRA of nuclear deterrence holds promise but requires significant further research. Fortunately, a simple approach can be used to show that the risk of nuclear deterrence failing currently appears to be on the order of 1 percent per year. It is hoped that this surprising result will cause society to invest in the larger efforts required for in-depth analysis, both to estimate and to reduce the risk of a major nuclear war.
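To give a sense of what an annual failure rate of this order implies, the following sketch compounds an assumed constant 1-percent-per-year probability over multi-decade horizons. This is an illustration of the arithmetic only, not the chapter's actual model, and the constant-rate assumption is itself a simplification.

```python
# Illustrative only: if the annual probability of nuclear deterrence
# failing is p, and that probability is assumed constant and independent
# year to year, the cumulative probability over n years is
#   P(failure within n years) = 1 - (1 - p)**n
p_annual = 0.01  # assumed annual failure probability (~1 percent)

for years in (10, 25, 50):
    p_cumulative = 1 - (1 - p_annual) ** years
    print(f"{years:2d} years: {p_cumulative:.1%}")
```

Even a seemingly small annual rate accumulates to a substantial probability over a lifetime, which is the intuition behind the chapter's call for deeper analysis.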
Edward T. Toton and James Scouras
Even if the US Cold War nuclear deterrence system could be regarded as a triumphant success because no nuclear war occurred between the United States and the Soviet Union, the strategic nuclear deterrence system of today must contend with a geopolitical landscape far more complicated than that of the Cold War. Seven acknowledged nuclear powers exist, and others, including transnational organizations, are attempting to join their ranks. Multiple nuclear states, nuclear capabilities that vary widely in technological sophistication, and different levels of stockpiles and security implementations all suggest that the nuclear deterrence landscape is far more uncertain in its risk of failure than at any other time in history. These components also suggest that the nuclear deterrence system has features that are consistent with the formal definition of complex systems; therefore, complex systems theory is most appropriate for addressing fundamental questions of risk. We explore these features and discuss failures from the points of view of accidents and human error or missteps, drawing on treatments of complex systems in general and the Cuban missile crisis in particular. We suggest how fundamental research in complex systems theory can contribute to assessing the risk of failure of nuclear deterrence. Whether formal modeling of nuclear deterrence systems can provide practical utility in a multipolar nuclear world has yet to be determined. We suggest construction of simplified mathematical models as a first step in grappling with the complexities of systems of nuclear deterrence. We propose that assessment of the risk of failure of nuclear deterrence associated with the close calls in the Cuban missile crisis would be a practical test of preliminary understanding of this complex problem.
Michael J. Frankel, James Scouras, and George W. Ullrich
The considerable body of knowledge on the consequences of nuclear weapons use—accumulated through an extensive, sustained, and costly national investment in both testing and analysis over two-thirds of a century—underlies all operational and policy decisions related to US nuclear planning. We find that even when consideration is restricted to the physical consequences of nuclear weapons use, where our knowledge base on effects of primary importance to military planners is substantial, there remain very large uncertainties. These uncertainties exist in no small part because many facets of the issue, such as the effects on the infrastructures that sustain society, have not been adequately investigated. Other significant uncertainties in physical consequences remain because important phenomena were uncovered late in the nuclear test program, have been inadequately studied, are inherently difficult to model, or are the result of new weapon developments. Nonphysical consequences, such as social, psychological, political, and full economic effects, are even more difficult to quantify and have never been on any funding agency’s radar screen. As a result, the physical consequences of a nuclear conflict tend to have been underestimated, and a full-spectrum all-effects assessment is not within anyone’s grasp now or in the foreseeable future. The continuing brain drain of nuclear scientists and the general failure to recognize the post–Cold War importance of accurate and comprehensive nuclear consequence assessments, especially for scenarios of increasing concern at the lower end of the scale of catastrophe, do not bode well for improving this situation. This paper outlines the current state of our knowledge base and presents recommendations for strengthening it.
Analyses of the effects of nuclear weapons have traditionally focused on the physical destruction they produce, especially their human toll, devastation of cities, and damage to the environment. To the extent that nonphysical effects are taken into account, strategists have emphasized the influence of nuclear weapons on national decision-making, particularly whether a limited strike would escalate to an all-out nuclear exchange. Yet, the range of nonphysical weapon effects is much broader, encompassing social, psychological, political, and economic impacts that would reverberate long after a nuclear attack. In a limited-use scenario, these ramifications—designated as “intangible” effects in this analysis—may greatly surpass the physical damage incurred, just as the cost and scope of the response to September 11 dwarfed the direct effects of the attacks. Moreover, unlike physical phenomena, many of these intangible effects are the result of human decisions and are thus theoretically controllable. Given that the limited use of nuclear weapons is probably more likely than a massive nuclear war, there is a pressing need to understand these intangible effects and identify practical steps to minimize them.
Jane M. Booker
For multifaceted problems such as assessing the risk of nuclear deterrence failure, data, information, and knowledge can emerge from many different sources involving diverse subject areas and in myriad qualitative or quantitative forms. Often the amounts of data, information, and knowledge are limited, apply to rare events or events that have never occurred, or both, necessitating the combined use of all sources. For example, sources include historical data on past events; expertise from authorities in different subject areas; and knowledge about past and current cultures, human behaviors, sociology, politics of people and states, as well as the theory or rules governing politics. Regardless of source and form, available knowledge has uncertainty attached. Some uncertainties can be significant, and the uncertainties themselves can be of different types. Depending on the type of uncertainty, quantification may not be feasible or the appropriate mathematical theory for it may be difficult to apply. Nonetheless, decision- and policy-makers need a final or top-level answer about nuclear deterrence failure accompanied by an understandable uncertainty. Knowledge integration methods address these needs and provide ways to tackle other difficulties encountered when combining all available data, information, and knowledge and their associated uncertainties to produce an assessment of risk. Some of the integration principles and methods are described in this chapter, especially those related to the challenges in assessing the risk of nuclear deterrence failure—a problem of significant uncertainties and poor data, information, and knowledge.
Motivated by the importance of the perceived risk of nuclear deterrence failure in national security policy formulation, we began our study by asking whether more structured analytic approaches could improve on the highly intuitive manner by which the risk of deterrence failure has generally been assessed. For the likelihood dimension of risk, each of the approaches included in this book—case study, elicitation of expert judgment, probabilistic risk assessment, and application of complex systems theory—has something unique to offer. However, none of these approaches can do the job by itself. Rather, we have reinforced the notion that multiple disciplines can each shed limited light on the question. We must extract from each of them whichever valuable insights they offer and do our best to synthesize these insights, using the art and science of knowledge integration, into a policy-relevant assessment. However daunting this task, discernible research paths hold significant promise. As for the physical consequences of nuclear use, it is clear that our knowledge base, derived primarily from concern about the military effectiveness of nuclear weapons, is inadequate to assess the potential consequences from the broader array of nuclear uses that now appear possible or from intangible consequences that could exceed even the physical consequences. This lack of knowledge is easier to address from an analytic perspective but requires an adequately funded research program. The dim prospects of such a program are yet another consequence of the complacency induced by our intuitive sense that nuclear weapon risks have largely abated.
Testing will remain a key tool for those managing health care and making health policy for the current coronavirus pandemic, and testing will probably be an important tool in future pandemics. Because of test errors, the observed fraction of positive tests, the surface positivity, is generally different from the underlying incidence rate of the disease. We model, using both analytical and simulation tools, the process of testing to address (1) how to go from positivity to a point estimate of the incidence rate; (2) how to compute a reasonable range of possible incidence rates, given the models and data; (3) how to compare different levels of positivity in light of test errors, particularly false negatives; and (4) how to compute the risk (defined as the probability of including at least one infected individual) for groups of different sizes, given the estimate of the incidence rate. Our approach is based on modeling the process generating test data in which the true state of the world (incidence rate, probability of a false negative test, and probability of a false positive test) is known. This allows us to compare analytical predictions with a known situation, thus providing confidence when the tools are used when the true state of the world is not known.
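The first step described above—going from observed positivity to a point estimate of incidence—can be sketched with a standard test-error correction (the Rogan–Gladen estimator). This is a minimal illustration under assumed sensitivity and specificity values, not necessarily the report's exact model; the function name and example numbers are hypothetical.

```python
# Sketch of a standard positivity-to-incidence correction (Rogan-Gladen);
# the report's own model may differ. With test sensitivity `sens` and
# specificity `spec`, the expected positivity given true incidence p is
#   positivity = p * sens + (1 - p) * (1 - spec)
# Inverting that relation gives a point estimate for incidence:
def incidence_from_positivity(positivity, sens, spec):
    """Point estimate of the incidence rate from observed surface positivity."""
    est = (positivity + spec - 1) / (sens + spec - 1)
    return min(max(est, 0.0), 1.0)  # clamp to the valid range [0, 1]

# Hypothetical example: 8% positivity, 85% sensitivity, 98% specificity
p_hat = incidence_from_positivity(0.08, sens=0.85, spec=0.98)
```

Note that false positives push the estimate below the raw positivity while false negatives push it above, which is why the corrected incidence can differ substantially from the surface positivity.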
The US government and its social media partners are bolstering their defenses against foreign election interference and campaigns to corrode democratic governance. Those efforts are vital but inadequate for the emerging security environment. The United States should also account for the risk that in intense regional crises, adversaries will use information operations (IOs) to coerce US and allied behavior. In particular, opponents will seek to convince US and allied policymakers that unless they back down, their nations will suffer punishment that dwarfs any gains they hope to achieve. If adversaries cannot prevail through IOs alone, they may fulfill their threats and launch increasingly destructive cyberattacks, paired with warnings that further punishment will follow until the US and its allies capitulate.
The US military is rapidly improving its ability to conduct coercive operations against US opponents. Yet, the federal government has barely begun to develop strategies and capabilities to defeat equivalent campaigns against us. This study examines the vulnerabilities of the US public and policymaking process to coercive IOs and analyzes Chinese and Russian technologies to exploit these vulnerabilities with unprecedented effectiveness. The study also proposes options to defeat (and, ideally, help deter) future coercive campaigns, in ways that uphold the Constitution and leverage progress already underway against electoral interference and the corrosion of democratic governance.
Situational awareness (SA) during disaster response is critical as it enables the response community to rapidly and efficiently assist those in urgent need during the time-sensitive, acute phase of a disaster. New technologies can drastically improve the effectiveness of response operations: satellite imagery to quickly map the destructive path of a hurricane, social media tracking to identify communities of increased need, and computer modeling to predict the route of a wildfire to inform evacuations. The US government has prioritized implementation of artificial intelligence (AI) systems throughout the federal agencies, including those technologies that may assist in disaster response. In this report, we contribute a technological road map for delivering to the response community near- and more distant-future AI-enabled technologies that could aid in SA during disasters. By exploring current and historical technology trends, successes, and difficulties, we envision the benefits and vulnerabilities that such new technologies could bring to disaster response. Given the complexities associated with both disasters and AI-enabled technologies, an integrated approach to development will be necessary to ensure that new technologies are both science driven and operationally feasible.
The United States should establish a National Cyber Defense Center (NCDC) in the Office of the National Cyber Director to proactively address cyber threats to US interests. The NCDC would plan and coordinate US governmental efforts across four areas: cyber deterrence, active cyber defense, offensive cyber operations in support of defense, and incident management. It would also work closely with the private sector, state and local governments, and US allies. Such a proactive and comprehensive approach is needed to deal with cyber adversaries who are exploiting seams within the US government and between the US government and the private sector.
This report addresses the questions of whether the United States should resume nuclear testing and, if not, whether it should better prepare to do so in the future. Our goal is to provide a comprehensive and balanced consideration of all significant arguments that inform these questions. To place these arguments in the proper context, we briefly recount US nuclear testing history, describing alternative objectives for nuclear tests and providing a taxonomical retrospective of significant surprises encountered—in the nuclear environment, in vulnerabilities of military systems, and in weapon performance and safety. We review as well the critical role played by Science-Based Stockpile Stewardship in lieu of testing and the concerns of its critics. We also describe the current state of nuclear test readiness and assess whether the United States can presently meet its readiness obligations. After considering all significant technical and policy arguments and counterarguments, both for and against test resumption, we conclude that under present circumstances, the United States should not resume nuclear testing because of the lack of a compelling national security need combined with potentially significant negative geopolitical consequences for nuclear proliferation and reignition of a nuclear arms race. However, we identify a series of future technical and political developments whose occurrence would require revisiting our decision calculus. We end the report with recommendations to improve test readiness and, as a final thought, place the issue of whether or not to resume nuclear testing in the context of conflicting far- and near-term US national security goals.
Even though vaccines for coronavirus are increasingly available, it will be many months before sufficient herd immunity is achieved. Thus, testing remains a key tool for those managing health care and making policy decisions. Test errors, both false positive tests and false negative tests, mean that the surface positivity (the observed fraction of tests that are positive) does not accurately represent the incidence rate (the unobserved fraction of individuals infected with coronavirus). In this report, directed to individuals tasked with providing analytical advice to policymakers, we describe a method for translating from the surface positivity to a point estimate for the incidence rate, then to an appropriate range of values for the incidence rate, and finally to the risk (defined as the probability of including one infected individual) associated with groups of different sizes. The method is summarized in four equations that can be implemented in a spreadsheet or using a handheld calculator. We discuss limitations of the method and provide an appendix describing the underlying mathematical models.
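The final step in the method described above—translating an incidence estimate into a group risk—follows from a standard independence assumption. The sketch below illustrates that step only; the function name and example values are hypothetical, and the report's full method includes additional equations not shown here.

```python
# Sketch of the group-risk step: if the incidence rate is p and infections
# are assumed independent across individuals, the probability that a group
# of n people includes at least one infected individual is
#   risk = 1 - (1 - p)**n
def group_risk(p, n):
    """Probability that a group of n contains at least one infected person."""
    return 1 - (1 - p) ** n

# Hypothetical example: 2% incidence, gathering of 25 people
risk = group_risk(0.02, 25)
```

This is the kind of calculation the report notes can be done in a spreadsheet or on a handheld calculator: even modest incidence rates imply a high probability of at least one infection in moderately sized groups.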
As part of an overall examination of nuclear weapons in post–Cold War crises, this case study examines the role of nuclear weapons in tensions between the United States and Russia during the invasion of Ukraine’s Crimean Peninsula. The initial success of Russian efforts to coerce Ukraine away from association with the European Union (EU) triggered a popular protest movement that led to the removal of Ukrainian president Viktor Yanukovych from office. Faced with a new, pro-Western government in Kiev, Russia immediately moved to invade the Crimean Peninsula. As international tensions rose, both the United States and the EU sought to maintain Ukraine’s territorial integrity through diplomatic and economic means. However, Russia did not accept US nonintervention as a given and sought to deter action to reverse the invasion by brandishing its nonstrategic nuclear arsenal through nuclear messaging, allusions to nuclear first-use policies, drawing of nuclear red lines, and the maneuver of dual-use platforms onto the occupied Crimean Peninsula. By examining the roles that nuclear weapons and their characteristics played throughout the crisis, this case study points to potentially important variables for consideration in future academic studies, and sounds the warning for policymakers on how Russia might leverage its nonstrategic nuclear arsenal in future confrontations.
The Johns Hopkins University Applied Physics Laboratory commissioned “Measure Twice, Cut Once: Assessing Some China–US Technology Connections,” a series of papers from experts in specific technology areas to explore the advisability and potential consequences of decoupling.
In each of these areas, the authors have explored the feasibility and desirability of increased technological separation and offered their thoughts on a possible path forward. The authors all recognize the real risks presented by aggressive, and frequently illegal, Chinese attempts to achieve superiority in critical technologies. However, the project also represents a reality check regarding the feasibility and potential downsides of broadly severing technology ties with China.
The project was led by former Secretary of the Navy Richard Danzig, initially in partnership with Avril Haines, former Deputy National Security Advisor. This compilation of papers was authored by experts from across the nation, and the views of the authors are their own.
In the information age, the Chinese People’s Liberation Army (PLA) believes that success in combat will be realized by winning a struggle for information superiority in the operational battlespace. China’s informationized warfare strategy and information-centric operational concepts are central to how the PLA will generate combat power. These South China Sea (SCS) military capability (MILCAP) studies provide a survey of military technologies and systems on Chinese-claimed island-reefs in the disputed Spratly Islands. The relative compactness of China’s SCS outposts makes them an attractive case study of PLA military capabilities. Each island-reef and its associated military base facilities may be captured in a single commercial satellite image. An examination of capabilities on China’s island-reefs reveals the PLA’s informationized warfare strategy and the military’s designs on generating what the Chinese call “information power.” The SCS MILCAP series is organized around different categories of information power capabilities, from reconnaissance to communications to hardened infrastructure. Kinetic effects will remain an important component of PLA operational design. However, any challenger to Chinese military capabilities in the SCS must first account for and target the very core of the PLA’s informationized warfare strategy—its information power.
What if North Korea were to actually use one or more nuclear weapons? How should the United States respond? The singularly important US prewar objective is to deter nuclear war, but once nuclear weapons have been unleashed, this objective will immediately become moot. US post-nuclear-attack imperatives will likely include (1) physically preventing further use of nuclear weapons by North Korea; (2) cognitively dissuading further North Korean nuclear use; (3) convincing other adversaries that nuclear use is a horrendous idea; (4) allaying allies’ concerns about extended deterrence; (5) satisfying domestic political demands; (6) conforming to international law; and (7) last, and quite possibly least, restoring the nuclear taboo. We address each of these imperatives in turn. Our goal is not to determine the “correct” response to North Korean nuclear first use but rather to identify the principal considerations involved in each of these imperatives. Fulfilling all these diverse imperatives in any particular scenario is highly improbable, so we also briefly address the relative priorities among several of them. We conclude with a discussion of the roles of the research and analysis community, the public, and political and military elites who may find themselves in positions of advising the president in a future nuclear crisis.
Nuclear war is clearly a global catastrophic risk, but it is not an existential risk as is sometimes carelessly claimed. Unfortunately, the consequence and likelihood components of the risk of nuclear war are both highly uncertain. In particular, for nuclear wars that include targeting of multiple cities, nuclear winter may result in more fatalities across the globe than the better-understood effects of blast, prompt radiation, and fallout. Electromagnetic pulse effects, which could range from minor electrical disturbances to the complete collapse of the electric grid, are similarly highly uncertain. Nuclear war likelihood assessments are largely based on intuition, and they span the spectrum from zero to certainty. Notwithstanding these profound uncertainties, we must manage the risk of nuclear war with the knowledge we have. Benefit-cost analysis and other structured analytic methods applied to evaluate risk mitigation measures must acknowledge that we often do not even know whether many proposed approaches (e.g., reducing nuclear arsenals) will have a net positive or negative effect. Multidisciplinary studies are needed to better understand the consequences and likelihood of nuclear war and the complex relationship between these two components of risk, and to predict both the direction and magnitude of risk mitigation approaches.
First published in Journal of Benefit-Cost Analysis, Volume 10, Issue 2
The objective of this workshop, funded by the Defense Threat Reduction Agency (DTRA) through the Project on Advanced Systems and Concepts for Countering Weapons of Mass Destruction, was to address issues associated with responding to the first use of nuclear weapons by North Korea, with an emphasis on restoring the taboo against nuclear use. The Johns Hopkins University Applied Physics Laboratory conducted the workshop on April 23–24, 2019, to bring together thought leaders from a variety of fields, including norm theory and practice, nuclear strategy, and Northeast Asia. Workshop participants considered scenarios involving North Korean nuclear first use and developed and analyzed options for responding to that first use. The workshop concluded with discussions of key questions. Through workshop contributions and post-workshop deliberations, we developed recommendations for DTRA, US Strategic Command, the intelligence community, and the Office of the Secretary of Defense. Primary among these recommendations is that all these organizations take restoration of the nuclear taboo seriously as a US objective after an adversary’s nuclear first use and that they conduct appropriate analyses and planning in advance to provide the president with effective nonnuclear retaliatory options that could reduce the severity and duration of damage to the taboo.
This study assesses game theory’s potential to contribute to understanding the North Korean nuclear crisis. Previous APL work suggests that game theory provides a useful framework for formulating and analyzing multilateral nuclear stability issues; however, it seldom provides unique policy-relevant insights. For it to do so, game theorists must work closely with policy and subject matter experts. Thus, APL invited game theorists and nuclear and regional experts to (1) discuss the strengths and limitations of game theory applied to the North Korean nuclear crisis; (2) address the policy community’s skepticism about game theoretic analyses; and (3) explore mechanisms for collaboration. We found that game theory, when correctly applied, is a rigorous framework for understanding interactions in international conflicts; however, its predictive capability is limited. Misguided expectations have led to skepticism about its utility. While collaboration between these communities might increase understanding of nuclear crisis dynamics, obstacles include resolving motivational and communication issues, regulating expectations, and avoiding improper applications of game theory.
We establish that the US system for nuclear deterrence is a complex system in the formal sense, that nuclear deterrence must be regarded as a system-level function, and that, as a consequence, system-level failures not obviously connected to any component failure are possible. Such failures are emergent properties, not predictable from an understanding of each component and its interactions, and may be candidates for Taleb’s black swan events. To understand the potential risk of failure of the US nuclear deterrence system, both on its own and in the larger context of multiple state actors, it is necessary to understand the potential interactions of components and command authority. For the analyst, this means constructing models that attempt to capture the nonlinearities of those interactions, whose existence is increasingly apparent.
The binomial distribution is widely used across many different disciplines. In cases where data can be represented with a binomial distribution, an estimate for the binomial distribution parameter (the probability of success) is often produced. However, uncertainty surrounding this estimate is only sometimes reported, partly due to the opacity of the various methods available for determining this uncertainty. Failing to appropriately analyze uncertainty can lead to erroneous, or at least incomplete, conclusions. Here, we explore both Bayesian and frequentist methods for quantifying uncertainty in the binomial distribution parameter and discuss each method's advantages and limitations. Our work is motivated by nuclear crisis outcome data. While nuclear crises have been studied to determine the likelihood of the nuclear-superior, compared to the nuclear-inferior, state winning in a crisis, there is great uncertainty in these estimates for the probability of winning. We demonstrate methods that appropriately quantify such uncertainty and use nuclear crisis outcome data to illustrate applications of the methods we present, as well as to demonstrate insights that can be provided by explicitly analyzing uncertainty.
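The frequentist and Bayesian approaches the abstract contrasts can be illustrated with a short, self-contained sketch. The code below is not the paper's analysis; it shows two standard methods for the same task: a frequentist Wilson score interval and an equal-tailed Bayesian credible interval under a uniform Beta(1, 1) prior (approximated by Monte Carlo sampling from the Beta posterior). The crisis counts used at the end are purely hypothetical placeholders, not the paper's data.

```python
import math
import random

def wilson_interval(k, n, z=1.96):
    """Frequentist Wilson score interval for a binomial proportion
    (k successes in n trials, z = normal quantile for the confidence level)."""
    p_hat = k / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2))
    return center - half, center + half

def beta_credible_interval(k, n, level=0.95, draws=100_000, seed=0):
    """Bayesian equal-tailed credible interval under a Beta(1, 1) (uniform) prior.
    The posterior is Beta(k + 1, n - k + 1); we approximate its quantiles by
    Monte Carlo sampling, which keeps the example dependency-free."""
    rng = random.Random(seed)
    samples = sorted(rng.betavariate(k + 1, n - k + 1) for _ in range(draws))
    lo = samples[int((1 - level) / 2 * draws)]
    hi = samples[int((1 + level) / 2 * draws)]
    return lo, hi

# Hypothetical counts for illustration only: 10 of 14 crises "won"
# by the nuclear-superior state.
k, n = 10, 14
print("point estimate:", k / n)
print("Wilson 95% CI:", wilson_interval(k, n))
print("Beta(1,1) 95% credible interval:", beta_credible_interval(k, n))
```

With small n such as this, both intervals are wide, which is the abstract's point: the point estimate alone badly overstates what the data support.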
The United States has a nuclear triad consisting of ballistic missile submarines, land-based intercontinental ballistic missiles, B-52 bombers, and B-2 bombers. At one time, it also had thousands of nonstrategic nuclear weapons (NSNWs) that were not covered by any treaties until the Intermediate-Range Nuclear Forces (INF) Treaty banned several types of US and Soviet weapons in 1987. Today, US NSNWs are limited to unguided bombs on non-stealthy short-range fighters at several bases in NATO countries. Russia, by contrast, has a much larger inventory of NSNWs and is modernizing them. China also has NSNWs, and North Korea either already poses, or soon will pose, a nuclear threat in the western Pacific. This growing asymmetry in NSNWs may pose a threat to NATO and to US allies in the western Pacific. The United States needs to devote more attention to this situation, considering improvements to its NSNWs along with other measures that might help mitigate these asymmetries, such as improved defenses against small nuclear attacks. The United States also needs to consider options for modifications to the INF Treaty in lieu of complete withdrawal.
During World War II, international threats and national goals were clear. That clarity continued, albeit to a lesser degree, throughout the Cold War and into the new century, with the United States as the world’s preeminent superpower and leader in defense, technology, and economic might. Today’s world is a different place, and the need for a clear picture of it is critical. This paper helps to crystallize that picture by first identifying premises that served processes, institutions, and strategies from World War II through the Cold War, seeking to comprehend our inherited predispositions as predicate for rethinking them. It then identifies changes that undermine these premises. To forge new premises, the authors specify foundational American strengths that must be protected and expanded amid and despite these changes. Finally, the authors suggest premises for a new age of strategic thought. This paper does not recommend a new national security strategy. Instead, it serves as a necessary preface to such a strategy by articulating how our national strengths and weaknesses must be understood as foundations for American security and by showing how the premises that have guided us from World War II to the present must be modified for the future.
Power companies and US government agencies have an unprecedented opportunity to strengthen preparedness for cyber and physical attacks on the electric grid. In December 2015, Congress authorized the secretary of energy to issue emergency orders to grid operators to protect and restore grid reliability in grid security emergencies. These orders could help sustain electric service to military bases and other vital facilities. However, unless the electric industry and the Department of Energy partner to develop emergency orders before attacks occur, they will miss significant opportunities to help deter and (if necessary) defeat such attacks.
This report examines design requirements for emergency orders. It analyzes decision criteria that the president might use to determine that a grid security emergency exists, which is a prerequisite for issuing emergency orders. The report assesses possible orders for three phases of grid security emergencies: when attacks are imminent; when attacks are under way; and when utilities begin to restore power, potentially while facing follow-on attacks. It identifies recommendations to strengthen emergency communications plans and capabilities. It concludes by identifying areas for further analysis, including measures to bolster cross-sector resilience between the grid and the other infrastructure systems and sectors on which it depends.
Scenarios that describe cyber attacks on the electric grid consistently predict significant disruptions to the economy and citizens’ quality of life. Most offer anecdotal support for the grid’s vulnerability to such an attack and assume the existence of an adversary with the means and intent to launch the attack. An estimate of risk, however, also requires knowledge of the probability that an attack of the required caliber can be successfully executed. Quantifying the probability of success for a large-scale cyber attack is hard because of the lack of precedent and the changing nature of threats and vulnerabilities. This report uses the grid cyber attack scenario outlined in the Lloyd’s of London and the University of Cambridge Centre for Risk Studies 2015 report, Business Blackout, to demonstrate how a probabilistic assessment could be used to quantify the likelihood that the scenario could occur. The analysis is subject to the limitations inherent in any probabilistic risk assessment; however, it serves to highlight some interesting phenomena that deserve further investigation, such as the importance of some individual power plants in influencing the adversary’s probability of success. In addition, it describes feasible data collection that would materially increase the validity of such an analysis.
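The kind of probabilistic assessment the abstract describes can be sketched as a simple Monte Carlo model: the attack succeeds only if every sequential stage succeeds and enough individual plants are then compromised. All stage probabilities, plant counts, and thresholds below are assumed for illustration; they are not taken from the Business Blackout scenario or the report's analysis.

```python
import random

# Illustrative, assumed stage-success probabilities (not from Business Blackout).
STAGES = {
    "spearphish_foothold": 0.6,
    "pivot_to_control_network": 0.4,
    "implant_malware": 0.5,
}
P_PLANT_TRIP = 0.3   # assumed per-plant probability the implant trips a generator
N_PLANTS = 50        # assumed number of plants targeted
M_REQUIRED = 10      # assumed number of plants that must trip to cause the blackout

def simulate_once(rng):
    """One trial of the end-to-end attack: every sequential stage must
    succeed, then at least M_REQUIRED of N_PLANTS plants must trip."""
    for p in STAGES.values():
        if rng.random() > p:
            return False
    trips = sum(rng.random() < P_PLANT_TRIP for _ in range(N_PLANTS))
    return trips >= M_REQUIRED

def attack_success_probability(trials=100_000, seed=0):
    rng = random.Random(seed)
    return sum(simulate_once(rng) for _ in range(trials)) / trials

print("estimated P(scenario succeeds):", attack_success_probability())
```

A structure like this also surfaces the phenomenon the report highlights: sensitivity analysis on individual plant parameters shows that a few plants can dominate the adversary's overall probability of success.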
There is a need in the scientific, technology, and financial communities for economic forecast models that improve the ability to estimate new or immature technology developments. Engineering design or conceptual technical requirements with which to drive parametric estimates or translate analogous system costs are often unavailable in early life-cycle stages of technology development. The limited availability of comparable systems, design or performance parameters, and other objective bases makes it challenging to produce even rough-order-of-magnitude cost and schedule models. Often compounding the limited availability of information is the proprietary or protected nature of technology research and development efforts and related intellectual property. Consequently, executives, program managers, budget analysts, and other decision-makers must often rely on historical information from related yet often very dissimilar systems or the subjective opinion or “best guess” of subject-matter experts. This paper first investigates available industry modeling concepts, frameworks, models, and tools. A representative project data set is identified and selected for cost and schedule modeling, leveraging macro-parameters generally known or available in early technology development stages. Several model forms are then created and evaluated based on key performance criteria.
The cyber attack on Sony Pictures Entertainment in late 2014 began as a public embarrassment for an American company and ultimately led to a highly visible response from national leaders after the purported criminals threatened 9/11-style attacks on movie theaters showing the film. The cyber attack was triggered by the imminent release of The Interview, a comedy by Sony Pictures Entertainment in which an American talk show host and his producer are recruited by the Central Intelligence Agency to travel to North Korea and assassinate Kim Jong-un, the country's supreme leader. The cyber attack was discussed everywhere: from supermarket tabloids, delighting in gossip-rich leaked emails, to official statements by leaders in the US government, including President Obama.
The events surrounding the attack and attribution provide insight into the effects of government and private-sector actions on the perception of a cyber event among the public, the effect of attribution on the behavior of the attackers, and possible motives for North Korea's high-profile cyber actions. The incident also illuminates the role of multi-domain deterrence to respond to attacks in the cyber domain.
The United States has a nuclear triad that consists of ballistic missile submarines (SSBNs), land-based intercontinental ballistic missiles (ICBMs), B-52 bombers, and B-2 bombers. The non-stealthy B-52 relies entirely on the AGM-86 Air-Launched Cruise Missile (ALCM) in the nuclear role, whereas the B-2 penetrates enemy airspace to drop unguided bombs. The current SSBNs, ICBMs, ALCMs, and B61 bombs will all reach end of life between the early 2020s (for the B61 bomb) and the early 2040s, whereas the B-52 should last until at least 2045 and the B-2 should last until at least 2050. Programs are well under way for a new SSBN, a new bomber, and the B61-12 guided bomb, whereas programs have just started for a new ICBM and for the Long-Range Standoff (LRSO) cruise missile that is planned to replace the AGM-86. Among these programs, the LRSO is the most controversial and (probably) the one at most risk of cancellation. Analyses presented here suggest that LRSO is critical to the future of the triad and should not be terminated or delayed.
The world has changed greatly since the last Nuclear Posture Review (NPR) was formulated only some seven years ago, and US nuclear policy must be responsive to these changes. In particular, the 2010 NPR assessed that Russia and the United States are “no longer adversaries” and their “prospects for military confrontation have declined dramatically.” This assessment has been directly contradicted in the intervening years by Russia’s steady stream of nuclear saber rattling, its naked aggression in Ukraine, and its palpably bellicose willingness to project its military might beyond Europe. Moreover, large asymmetries in nonstrategic nuclear capabilities, coupled with Russia’s escalate-to-deescalate doctrine and earlier abandonment of its commitment to a no-first-use nuclear posture, suggest that Russia views nuclear weapons as useful instruments of intimidation and warfighting. We argue that Russian first use of nuclear weapons in Europe should be addressed as a high-priority nuclear threat in the Trump administration’s NPR. We address the roles of allied nonstrategic nuclear weapons in Europe; the challenges posed by asymmetries in numbers, systems, and doctrine; and NATO’s potential response options. Looking forward, we anticipate key nuclear policy decisions the Trump administration must face, and suggest that the issue of nonstrategic nuclear weapons, heretofore treated as a nearly irrelevant epicycle orbiting the greater strategic nuclear issues at play, can no longer be neglected.