Covers critical infrastructure protection, providing a rigorous treatment of risk, resilience, complex adaptive systems, and sector dependence.

Wide in scope, this classroom-tested book is the only one to emphasize a scientific approach to protecting the key infrastructure components of a nation. It analyzes the complex network of entities that make up a nation's infrastructure, and identifies vulnerabilities and risks in various sectors by combining network science, complexity theory, risk analysis, and modeling and simulation. This approach reduces the complex problem of protecting water supplies, energy pipelines, telecommunication stations, the power grid, and Internet and Web networks to the much simpler problem of protecting a few critical nodes.

The new third edition of Critical Infrastructure Protection in Homeland Security: Defending a Networked Nation incorporates a broader selection of ideas and sectors than the previous edition. Divided into three parts, the first looks at the historical origins of homeland security and critical infrastructure and emphasizes current policy. The second examines theory and foundations, highlighting risk and resilience in the context of complexity theory, network science, and the prevailing theories of catastrophe. The last part covers the individual sectors, including communications, the Internet, cyber threats, information technology, social networks, SCADA, water and water treatment, energy, and more.

* Covers theories of catastrophe, details of how sectors work, and how to deal with the enormity and complexity of the critical infrastructure protection problem
* Places great emphasis on computer security and whole-community response
* Includes PowerPoint slides for use by lecturers, as well as an instructor's guide with answers to exercises
* Offers five robust appendices that augment the non-mathematical chapters with more rigorous explanations and mathematics

Critical Infrastructure Protection in Homeland Security, Third Edition is an important book for upper-division undergraduates and first-year graduate students in political science, history, public administration, and computer technology. It will also be of great interest to professional security experts and policymakers.
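As a rough illustration of the network-science idea described above (reducing sector protection to finding a few critical nodes), the sketch below ranks nodes of a toy infrastructure network by betweenness centrality. The node names, the network itself, and the use of the networkx library are illustrative assumptions only; this is not the book's MBRA tool or its risk model.

# Minimal sketch, assuming a hypothetical water-distribution-style network.
import networkx as nx

# Made-up nodes (stations, plants, junctions) connected by pipeline edges.
edges = [
    ("reservoir", "treatment"), ("treatment", "pump_A"),
    ("treatment", "pump_B"), ("pump_A", "junction_1"),
    ("pump_B", "junction_1"), ("junction_1", "city_east"),
    ("junction_1", "city_west"),
]
G = nx.Graph(edges)

# Betweenness centrality is one simple proxy for "criticality": nodes that
# sit on many shortest paths are the ones whose loss fragments the network.
ranked = sorted(nx.betweenness_centrality(G).items(),
                key=lambda kv: kv[1], reverse=True)

for node, score in ranked[:3]:
    print(f"{node}: {score:.2f}")

Protecting the top-ranked nodes first is the intuition behind the "few critical nodes" claim; the book develops far richer measures (spectral radius, fault trees, resource allocation) in the chapters and appendices listed below.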
Page count: 1399
Publication year: 2019
COVER
FOREWORD BY SEN. MARK WARNER
1 NEW RULES
2 COMBATING MISINFORMATION AND DISINFORMATION
3 HARDEN NETWORKS, WEAPONS SYSTEMS, AND IOT (INTERNET OF THINGS)
4 REALIGN DEFENSE SPENDING
5 PRESIDENTIAL/GOVERNMENT LEADERSHIP
FOREWORD BY PROF. ANDREW ODLYZKO
1 INTRODUCTION
2 THE TECHNOLOGISTS' SKEWED VIEW OF THE WORLD
3 THE STATE OF THE WORLD
4 THREATS
5 HUMANSPACE VERSUS CYBERSPACE
6 PLUSES AND MINUSES OF NATURAL STUPIDITY
7 SMART AND STUPID CRIMINALS
8 THE CYBERCRIME ECOSYSTEM
9 BLACK SWANS VERSUS LONG TAILS
10 NEGLECT OF OBVIOUS SECURITY MEASURES
11 SURVEILLANCE CAPITALISM AND LOSS OF PRIVACY
12 THE DECEPTIVELY TRANSPARENT BUT OPAQUE WORLD
13 THE VIRTUES OF MESSINESS
14 SPEED, REACH, AND COST FOR OFFENSE AND DEFENSE
15 THE INCREASINGLY AMBIGUOUS NOTION OF SECURITY
16 CONCLUSIONS
ACKNOWLEDGMENTS
REFERENCES
PREFACE
HOW TO USE THIS BOOK
ABOUT THE COMPANION WEBSITE
1 ORIGINS OF CRITICAL INFRASTRUCTURE PROTECTION
1.1 RECOGNITION
1.2 NATURAL DISASTER RECOVERY
1.3 DEFINITIONAL PHASE
1.4 PUBLIC–PRIVATE COOPERATION
1.5 FEDERALISM: WHOLE OF GOVERNMENT
1.6 RISE OF THE FRAMEWORK
1.7 IMPLEMENTING A RISK STRATEGY
1.8 ANALYSIS
1.9 EXERCISES
1.10 DISCUSSIONS
REFERENCES
2 RISK STRATEGIES
2.1 EXPECTED UTILITY THEORY
2.2 PRA AND FAULT TREES
2.3 MBRA AND RESOURCE ALLOCATION
2.4 CYBER KILL CHAINS ARE FAULT TREES
2.5 PRA IN THE SUPPLY CHAIN
2.6 PROTECTION VERSUS RESPONSE
2.7 THREAT IS AN OUTPUT
2.8 BAYESIAN BELIEF NETWORKS
2.9 RISK OF A NATURAL DISASTER
2.10 EARTHQUAKES
2.11 BLACK SWANS AND RISK
2.12 BLACK SWAN FLOODS
2.13 ARE NATURAL DISASTERS GETTING WORSE?
2.14 BLACK SWAN AL QAEDA ATTACKS
2.15 BLACK SWAN PANDEMIC
2.16 RISK AND RESILIENCE
2.17 EXERCISES
2.18 DISCUSSIONS
REFERENCES
3 THEORIES OF CATASTROPHE
3.1 NORMAL ACCIDENT THEORY (NAT)
3.2 BLOCKS AND SPRINGS
3.3 BAK'S PUNCTUATED EQUILIBRIUM THEORY
3.4 TRAGEDY OF THE COMMONS (TOC)
3.5 THE US ELECTRIC POWER GRID
3.6 PARADOX OF ENRICHMENT (POE)
3.7 COMPETITIVE EXCLUSION PRINCIPLE (CEP)
3.8 PARADOX OF REDUNDANCY (POR)
3.9 RESILIENCE OF COMPLEX INFRASTRUCTURE SYSTEMS
3.10 EMERGENCE
3.11 EXERCISES
3.12 DISCUSSIONS
REFERENCES
4 COMPLEX CIKR SYSTEMS
4.1 CIKR AS NETWORKS
4.2 CASCADING CIKR SYSTEMS
4.3 NETWORK FLOW RISK AND RESILIENCE
4.4 PARADOX OF REDUNDANCY
4.5 NETWORK RISK
4.6 THE FRAGILITY FRAMEWORK
4.7 EXERCISES
4.8 DISCUSSIONS
REFERENCES
5 COMMUNICATIONS
5.1 EARLY YEARS
5.2 REGULATORY STRUCTURE
5.3 THE ARCHITECTURE OF THE COMMUNICATIONS SECTOR
5.4 RISK AND RESILIENCE ANALYSIS
5.5 CELLULAR NETWORK THREATS
5.6 ANALYSIS
5.7 EXERCISES
5.8 DISCUSSIONS
REFERENCES
6 INTERNET
6.1 THE INTERNET MONOCULTURE
6.2 ANALYZING THE AUTONOMOUS SYSTEM NETWORK
6.3 THE RFC PROCESS
6.4 THE INTERNET OF THINGS (IOT)
6.5 COMMERCIALIZATION
6.6 THE WORLD WIDE WEB
6.7 INTERNET GOVERNANCE
6.8 INTERNATIONALIZATION
6.9 REGULATION AND BALKANIZATION
6.10 EXERCISES
6.11 DISCUSSIONS
7 CYBER THREATS
7.1 THREAT SURFACE
7.2 BASIC VULNERABILITIES
7.3 BOTNETS
7.4 CYBER RISK ANALYSIS
7.5 CYBER INFRASTRUCTURE RISK
7.6 ANALYSIS
7.7 EXERCISES
7.8 DISCUSSIONS
REFERENCES
8 INFORMATION TECHNOLOGY (IT)
8.1 PRINCIPLES OF IT SECURITY
8.2 ENTERPRISE SYSTEMS
8.3 CYBER DEFENSE
8.4 BASICS OF ENCRYPTION
8.5 ASYMMETRIC ENCRYPTION
8.6 PKI
8.7 COUNTERMEASURES
8.8 EXERCISES
8.9 DISCUSSIONS
REFERENCES
9 HACKING SOCIAL NETWORKS
9.1 WEB 2.0 AND THE SOCIAL NETWORK
9.2 SOCIAL NETWORKS AMPLIFY MEMES
9.3 TOPOLOGY MATTERS
9.4 COMPUTATIONAL PROPAGANDA
9.5 THE ECHO CHAMBER
9.6 BIG DATA ANALYTICS
9.7 GDPR
9.8 SOCIAL NETWORK RESILIENCE
9.9 THE REGULATED WEB
9.10 EXERCISES
9.11 DISCUSSIONS
REFERENCES
10 SUPERVISORY CONTROL AND DATA ACQUISITION
10.1 WHAT IS SCADA?
10.2 SCADA VERSUS ENTERPRISE COMPUTING DIFFERENCES
10.3 COMMON THREATS
10.4 WHO IS IN CHARGE?
10.5 SCADA EVERYWHERE
10.6 SCADA RISK ANALYSIS
10.7 NIST‐CSF
10.8 SFPUC SCADA REDUNDANCY
10.9 INDUSTRIAL CONTROL OF POWER PLANTS
10.10 ANALYSIS
10.11 EXERCISES
10.12 DISCUSSIONS
11 WATER AND WATER TREATMENT
11.1 FROM GERMS TO TERRORISTS
11.2 FOUNDATIONS: SDWA OF 1974
11.3 THE BIOTERRORISM ACT OF 2002
11.4 THE ARCHITECTURE OF WATER SYSTEMS
11.5 THE HETCH HETCHY NETWORK
11.6 RISK ANALYSIS
11.7 HETCH HETCHY INVESTMENT STRATEGIES
11.8 HETCH HETCHY THREAT ANALYSIS
11.9 ANALYSIS
11.10 EXERCISES
11.11 DISCUSSIONS
REFERENCES
12 ENERGY
12.1 ENERGY FUNDAMENTALS
12.2 REGULATORY STRUCTURE OF THE ENERGY SECTOR
12.3 INTERDEPENDENT COAL
12.4 THE RISE OF OIL AND THE AUTOMOBILE
12.5 ENERGY SUPPLY CHAINS
12.6 THE CRITICAL GULF OF MEXICO CLUSTER
12.7 THREAT ANALYSIS OF THE GULF OF MEXICO SUPPLY CHAIN
12.8 NETWORK ANALYSIS OF THE GULF OF MEXICO SUPPLY CHAIN
12.9 THE KEYSTONE XL PIPELINE CONTROVERSY
12.10 THE NATURAL GAS SUPPLY CHAIN
12.11 ANALYSIS
12.12 EXERCISES
12.13 DISCUSSIONS
REFERENCES
13 ELECTRIC POWER
13.1 THE GRID
13.2 FROM DEATH RAYS TO VERTICAL INTEGRATION
13.3 OUT OF ORDERS 888 AND 889 COMES CHAOS
13.4 THE NORTH AMERICAN GRID
13.5 ANATOMY OF A BLACKOUT
13.6 THREAT ANALYSIS
13.7 RISK ANALYSIS
13.8 ANALYSIS OF WECC96
13.9 ANALYSIS
13.10 EXERCISES
13.11 DISCUSSIONS
REFERENCES
14 HEALTHCARE AND PUBLIC HEALTH
14.1 THE SECTOR PLAN
14.2 ROEMER'S MODEL
14.3 THE COMPLEXITY OF PUBLIC HEALTH
14.4 RISK ANALYSIS OF HPH SECTOR
14.5 BIOTERRORISM
14.6 EPIDEMIOLOGY
14.7 PREDICTING PANDEMICS
14.8 BIO‐SURVEILLANCE
14.9 NETWORK PANDEMICS
14.10 THE WORLD TRAVEL NETWORK
14.11 EXERCISES
14.12 DISCUSSIONS
REFERENCES
15 TRANSPORTATION
15.1 TRANSPORTATION UNDER TRANSFORMATION
15.2 THE ROAD TO PROSPERITY
15.3 RAIL
15.4 AIR
15.5 AIRPORT GAMES
15.6 EXERCISES
15.7 DISCUSSIONS
REFERENCES
16 SUPPLY CHAINS
16.1 THE WORLD IS FLAT, BUT TILTED
16.2 THE WORLD TRADE WEB
16.3 RISK ASSESSMENT
16.4 ANALYSIS
16.5 EXERCISES
16.6 DISCUSSIONS
REFERENCES
17 BANKING AND FINANCE
17.1 THE FINANCIAL SYSTEM
17.2 FINANCIAL NETWORKS
17.3 VIRTUAL CURRENCY
17.4 HACKING THE FINANCIAL NETWORK
17.5 HOT MONEY
17.6 THE END OF STIMULUS?
17.7 FRACTAL MARKETS
17.8 EXERCISES
17.9 DISCUSSIONS
REFERENCES
18 STRATEGIES FOR A NETWORKED NATION
18.1 WHOLE OF GOVERNMENT
18.2 RISK AND RESILIENCE
18.3 COMPLEX AND EMERGENT CIKR
18.4 COMMUNICATIONS AND THE INTERNET
18.5 INFORMATION TECHNOLOGY (IT)
18.6 SURVEILLANCE CAPITALISM
18.7 INDUSTRIAL CONTROL SYSTEMS
18.8 ENERGY AND POWER
18.9 GLOBAL PANDEMICS
18.10 TRANSPORTATION AND SUPPLY CHAINS
18.11 BANKING AND FINANCE
18.12 DISCUSSIONS
APPENDIX A: MATH: PROBABILITY PRIMER
A.1 A PRIORI PROBABILITY
A.2 A POSTERIORI PROBABILITY
A.3 RANDOM NETWORKS
A.4 CONDITIONAL PROBABILITY
A.5 BAYESIAN NETWORKS
A.6 BAYESIAN REASONING
REFERENCES
FURTHER READING
APPENDIX B: MATH: RISK AND RESILIENCE
B.1 EXPECTED UTILITY THEORY
B.2 BAYESIAN ESTIMATION
B.3 EXCEEDENCE AND PML RISK
B.4 NETWORK RISK
B.5 MODEL‐BASED RISK ANALYSIS (MBRA)
REFERENCES
APPENDIX C: MATH: SPECTRAL RADIUS
C.1 NETWORK AS MATRIX
C.2 MATRIX DIAGONALIZATION
C.3 RELATIONSHIP TO RISK AND RESILIENCE
REFERENCE
APPENDIX D: MATH: TRAGEDY OF THE COMMONS
D.1 LOTKA–VOLTERRA MODEL
D.2 HOPF–HOLLING MODEL
APPENDIX E: MATH: THE DES AND RSA ALGORITHM
E.1 DES ENCRYPTION
E.2 RSA ENCRYPTION
APPENDIX F: GLOSSARY
INDEX
END USER LICENSE AGREEMENT
Chapter 1
TABLE 1.1 The basic critical infrastructure sectors (8) defined by PDD‐63 (1998)...
TABLE 1.2 CIKR (14) as of 2003
TABLE 1.3 CIKR (16) and responsibilities as defined by HSPD‐7
TABLE 1.4 CIKR as defined by PPD‐21 (2013)
TABLE 1.5 Selection of CIKR assets
TABLE 1.6 Some common high‐ and low‐risk hazards are classified according to thei...
Chapter 2
TABLE 2.1 The parameters for the threat–asset pairs in Figure 2.1 include T, V, a...
TABLE 2.2 Inputs and analysis of the hospital fault tree in Figure 2.3 shows inve...
TABLE 2.3 Inputs and resource allocation calculations minimize risk by reduci...
TABLE 2.4 Hazards may be classified as high‐ or low‐risk hazards by the fractal d...
Chapter 4
TABLE 4.1 Cascade failure results for increases in parameters spectral radius, vu...
TABLE 4.2 Constants for the 11 networks analyzed in Figure 4.7 and critical point...
TABLE 4.3 Fractal dimension of cascades in the SEPTA network
TABLE 4.4 Change in allocation is abrupt when the attacker budget exceeds $7098
TABLE 4.5 Example of scoring the Hodges Fragility Framework
Chapter 5
TABLE 5.1 Evolution of cellular telephony
Chapter 6
TABLE 6.1 A sampling of the most common usernames and passwords that are rarely c...
Chapter 7
TABLE 7.1 Input values for the general fault tree model of Figure 7.7 are used to...
Chapter 8
TABLE 8.1 EXCLUSIVE‐OR logic: Only one of the two operands can be 1 in order to p...
TABLE 8.2 Sample countermeasures to vulnerabilities typically found in enterprise s...
Chapter 9
TABLE 9.1 Resilience score for social network analysis using the Hodges conceptua...
Chapter 10
TABLE 10.1 Checklist for the protection step of the NIST‐CSF is composed of acces...
TABLE 10.2 Input and output values used in Figure 10.6 to evaluate risk reduction...
TABLE 10.3 Department of Energy's 21 steps to SCADA security are easy to follow...
Chapter 11
TABLE 11.1 These contaminants are regulated per the 1962 Public Health Service st...
TABLE 11.2 The 1996 SDWA amendments require US EPA to enforce the following
TABLE 11.3 Input data for the top five assets in the MBRA network model of Hetch ...
TABLE 11.4 Hypothetical input values for the fault tree of Figure 11.7 indicate ...
TABLE 11.5 Most of $600 million is allocated to harden pipelines against earthqua...
Chapter 12
TABLE 12.1 Energy density of familiar fuels: Uranium‐235 is off the scale, while ...
TABLE 12.2 Rank, name, and location of the most productive refineries in the Gulf...
TABLE 12.3 Allocation of $300 million reduces energy fault tree risk from $4484 m...
TABLE 12.4 The largest NG pipelines are more than 10,000 miles long and move over...
Chapter 14
TABLE 14.1 An investment of $100 million reduces risk to less than 10% and focuse...
TABLE 14.2 The major causes of death as reported on a death certificate are not t...
Chapter 15
TABLE 15.1 Commuter/light rail systems are very fragile, even though they have lo...
TABLE 15.2 There have been 582 airliner deaths due to suicide, sabotage, and terr...
Chapter 16
TABLE 16.1 Top 10 countries in the WTW ranked by network properties generally ind...
Appendix A
TABLE A.1 There are 16 possible combinations of H and T in four coin tosses
TABLE A.2 Pascal's triangle yields the probability of 0, 1, 2, 3, 4, … H's in fou...
Appendix B
TABLE B.1 This spreadsheet of 2007 raw data and EP calculations was used to produ...
TABLE B.2 Spreadsheet containing SARS data and calculations needed to convert tim...
Chapter 1
FIGURE 1.1 The structure of the cybersecurity and infrastructure protection ...
FIGURE 1.2 Two frameworks for qualitative risk management—one for physical a...
FIGURE 1.3 A resilience triangle is formed by a collapse followed by recover...
FIGURE 1.4 Some hazards are low risk and some are high risk. Risk increases ...
Chapter 2
FIGURE 2.1 The fundamental unit of risk is the threat–asset pair as illustra...
FIGURE 2.2 Risk and return on investment decline after a modest investment i...
FIGURE 2.3 AND fault tree for the hypothetical hospital power supply: redund...
FIGURE 2.4 Return on investment analysis for the redundant hospital power mo...
FIGURE 2.5 A fault tree model of a single trusted path from a user to data c...
FIGURE 2.6 A multipath kill chain connects paths with an OR gate and replica...
FIGURE 2.7 MBRA's network model of the hospital redundant power source requi...
FIGURE 2.8 Bayesian network model of threat consists of three propositions: ...
FIGURE 2.9 The Gutenberg–Richter law for relating the number of earthquakes ...
FIGURE 2.10 Long‐tailed exceedence probability curves: one for high‐risk haz...
FIGURE 2.11 Exceedence probability—in x–y coordinates and log(x)–log(y) coor...
FIGURE 2.12 Exceedence probability of financial consequences from US natural...
FIGURE 2.13 Fractal dimension of the Levy flight of SARS as it spread to 29 ...
FIGURE 2.14 The exceedence probability distributions for consequence, distan...
Chapter 3
FIGURE 3.1 Consequence exceedence for all known nuclear power accidents (195...
FIGURE 3.2 The six‐block apparatus connected by springs illustrates how a si...
FIGURE 3.3 Bak's sand pile experiment simulates a landslide, but it has beco...
FIGURE 3.4 State space diagrams of three tragedy of the commons scenarios: s...
FIGURE 3.5 State space diagrams reveal a system's sustainability or lack of ...
FIGURE 3.6 Two state space diagrams of the electric power grid tragedy of co...
FIGURE 3.7 Economic declines and time to next decline are punctuated events ...
FIGURE 3.8 State space diagram of homeownership from the Great Depression to...
FIGURE 3.9 Probability estimates of the time between subsequent events are b...
FIGURE 3.10 Most CIKR sectors form industrial commons around value chains as...
Chapter 4
FIGURE 4.1 The complex CIKR system network of the Washington, DC, drinking w...
FIGURE 4.2 CIKR networks are typically random, scale‐free, or clustered. (a)...
FIGURE 4.3 Forest fires are simulated as a grid containing cells that are ei...
FIGURE 4.4 Results of the forest fire simulation for two scenarios: (a) freq...
FIGURE 4.5 This cascade frequency heat map of a small microgrid section of t...
FIGURE 4.6 Exceedence probability for small vulnerability network nodes obey...
FIGURE 4.7 Fractal dimension of cascade failures in a CIKR network declines ...
FIGURE 4.8 The normalized cascade resilience metric separates networks into ...
FIGURE 4.9 Heat map display results of analysis of the LA Metro railway netw...
FIGURE 4.10 Risk dramatically increases when critical nodes are attacked in ...
FIGURE 4.11 An illustration of Braess's paradox in network flows: removing a...
FIGURE 4.12 Robustness is a measure of the fraction of nodes and links that ...
FIGURE 4.13 MBRA model of the crude oil transmission pipeline system in the ...
FIGURE 4.14 PML risk profiles for the crude oil CIKR network of Figure 4.13 ...
FIGURE 4.15 MBRA model of the flow network of Figure 4.11 layered on a map, ...
FIGURE 4.16 The diminishing returns curves for prevention, response, and thr...
FIGURE 4.17 Risk ranking and calculated results after allocation of $1500 to...
FIGURE 4.18 The Hodges Fragility Conceptual Framework defines fragility in t...
FIGURE 4.19 The Hodges framework is isomorphic to a fault tree where dimensi...
Chapter 5
FIGURE 5.1 The structure of US governmental agencies involved in the regulat...
FIGURE 5.2 The 1996 Telecommunications Act deregulated the communications se...
FIGURE 5.3 The architecture of the communications sector includes landlines,...
FIGURE 5.4 Major carrier hotels within the United States form the backbone o...
FIGURE 5.5 Human‐caused hazard fault tree risk model for the communications ...
FIGURE 5.6 Telephone outages reported by Kuhn indicate the US communications...
FIGURE 5.7 Critical factor analysis and resilience of the top 30 telecom rou...
FIGURE 5.8 The major submarine cables circling the globe carry most of the c...
Chapter 6
FIGURE 6.1 The DNS is a tree‐shaped network of Internet usernames and number...
FIGURE 6.2 Example of an email message as it travels through the Internet.
FIGURE 6.3 The top 500 autonomous system servers in the Internet, circa 2004...
FIGURE 6.4 Core of the AS500 Internet circa 2004 contained the most connecte...
FIGURE 6.5 The ISO/OSI standard protocol stack for the Internet consists of ...
FIGURE 6.6 The core of Internet governance circa 2010 included W3C, ISOC, IE...
Chapter 7
FIGURE 7.1 The exceedence probability and PML risk profile of a sample of 21...
FIGURE 7.2 The concept of an attack surface or surfaces where computer secur...
FIGURE 7.3 This “family tree” shows the heritage of some of the malware deri...
FIGURE 7.4 TCP/IP is intrinsically flawed because of its simplicity.
FIGURE 7.5 Buffer overflow exploits enter a computer as data but overwrite ...
FIGURE 7.6 DDoS recruits innocent zombies to participate in a denial‐of‐serv...
FIGURE 7.7 Common threat–asset pairs in a general fault tree of cyber threat...
FIGURE 7.8 ROI analysis of the general fault tree model for the values shown...
FIGURE 7.9 AS2000: The top 2000 autonomous systems of the global Internet fo...
FIGURE 7.10 Result of four simulations shows hub hardening to be 3.7 times a...
Chapter 8
FIGURE 8.1 The architecture of a trusted computing base (TCB) consists of se...
FIGURE 8.2 A detailed view of a typical TCB and the security technologies in...
FIGURE 8.3 RSA encryption produces a seemingly random stream of codewords fr...
FIGURE 8.4 Screen display showing “randomized” ciphertext output from the RS...
FIGURE 8.5 An Example of PKI: Alice sends Msg_to_Bob to Bob. Issuing certifi...
Chapter 9
FIGURE 9.1 Forms of harassment experienced on Wikimedia.
FIGURE 9.2 The nonuniform structure of a piece of the Facebook.com social ne...
FIGURE 9.3 The topological structure of a social network determines how fast...
FIGURE 9.4 Filter bubbles are managed by software that rewards users with co...
FIGURE 9.5 A typical CNN contains many layers of hidden neurons and robust i...
FIGURE 9.6 The Hodges conceptual framework adapted to social network resilie...
Chapter 10
FIGURE 10.1 This diagram shows a simple view of a typical SCADA system and i...
FIGURE 10.2 The 132‐mile north‐to‐south Pacific Pipeline delivers crude oil ...
FIGURE 10.3 Most SCADA systems are open to access by a number of partners an...
FIGURE 10.4 General fault tree of possible vulnerabilities of a typical SCAD...
FIGURE 10.5 Major nodes of the SFPUC water SCADA network are connected by la...
FIGURE 10.6 Fault tree of the three most critical nodes with hypothetical th...
FIGURE 10.7 An industrial control system for control of power plants connect...
FIGURE 10.8 PML risk due to cascading before and after optimal allocation of...
FIGURE 10.9 Top plot is damage versus recovery time. Bottom plot is percent ...
FIGURE 10.10 Top plot is node cascade resilience before an investment of $50...
Chapter 11
FIGURE 11.1 Typical community water systems are vertically integrated natura...
FIGURE 11.2 The Hetch Hetchy water and power supply network starts in the He...
FIGURE 11.3 Flow resilience analysis shows the Hetch Hetchy network is modes...
FIGURE 11.4 Simulation of downstream cascades caused by a random failure of ...
FIGURE 11.5 Risk versus investment in both vulnerability reduction (preventi...
FIGURE 11.6 Results of Stackelberg optimization of defender and attacker all...
FIGURE 11.7 A fault tree model of Hetch Hetchy critical nodes identifies the...
FIGURE 11.8 Risk reduction and vulnerability versus investment shows that ri...
Chapter 12
FIGURE 12.1 Most energy consumed in the United States comes from coal, na...
FIGURE 12.2 Past, present, and future energy consumption in the United State...
FIGURE 12.3 BNSF railway network connects the largest source of coal in the ...
FIGURE 12.4 Simplified supply chain model of NG and petroleum energy infrast...
FIGURE 12.5 Petroleum Administration for Defense Districts (PADDs) are still...
FIGURE 12.6 Pipelines are “multiplexed” by combining different products on t...
FIGURE 12.7 Refined petroleum products flow from the Gulf of Mexico oil fiel...
FIGURE 12.8 Equipment failure and human error are the top refinery hazards....
FIGURE 12.9 Equipment failure, corrosion, operational accidents, and aging a...
FIGURE 12.10 Lightning/static electricity, operational/reaction accidents, a...
FIGURE 12.11 Linden Station is the focal point of the massive storage facili...
FIGURE 12.12 General fault tree model of probable threat–asset pairs in the ...
FIGURE 12.13 The core of the Gulf of Mexico oil field network is centered on...
FIGURE 12.14 The 10,600 mile Transco Pipeline carries liquid natural gas (LN...
Chapter 13
FIGURE 13.1 The five major components of the power grid are generation, tran...
FIGURE 13.2 Power outages in the United States increased in number and size ...
FIGURE 13.3 Vertically integrated power companies have been broken into olig...
FIGURE 13.4 Many layers of regulation shape the Grid: Congress, FERC, NERC, ...
FIGURE 13.5 Major power grid interconnect components, reliability coordinato...
FIGURE 13.6 The Grid must obey Kirchhoff's law by adjusting inflows to equal...
FIGURE 13.7 The 2003 Blackout started with reports from FirstEnergy (dark), ...
FIGURE 13.8 MBRA fault tree analysis of red team threats invests most in pro...
FIGURE 13.9 Threat analysis risk declines faster than vulnerability, because...
FIGURE 13.10 The Western power grid of 1996 (WECC96) contained a number of c...
FIGURE 13.11 The WECC96 power grid's blocking nodes hold the grid together a...
Chapter 14
FIGURE 14.1 Roemer's model is a simplified model of the public health sector...
FIGURE 14.2 State, local, and federal government and healthcare spending is ...
FIGURE 14.3 Fault tree analysis of public health sector risk focuses on thre...
FIGURE 14.4 A relatively small investment dramatically reduces risk because ...
FIGURE 14.5 The number of infected people due to the SARS pandemic obeyed a ...
FIGURE 14.6 The SARS pandemic formed a social network that spanned 29 countr...
FIGURE 14.7 The spread of SARS within the United States and globally was abr...
FIGURE 14.8 State diagrams of SIR and SIS epidemics differ—under certain con...
FIGURE 14.9 The four first‐order blocking nodes of the 9/11 terrorist social...
FIGURE 14.10 The OpenFlight100 network shown here with n = 100 airports and
Chapter 15
FIGURE 15.1 The Department of Transportation is the sector‐specific agency f...
FIGURE 15.2 The intermodal transportation system of the United States consis...
FIGURE 15.3 The cost of building and maintaining the 50+ year‐old Interstate...
FIGURE 15.4 The Interstate Highway System forms a transportation network con...
FIGURE 15.5 Railroad technology was one of the earliest examples of technolo...
FIGURE 15.6 Bay Area Rapid Transit (BART) is a light rail commuter train ser...
FIGURE 15.7 The casualty rate among all airlines carrying 14 or more people ...
FIGURE 15.8 The primary airports and routes of the US domestic market form a...
FIGURE 15.9 GUARDS is randomizing software that uses game theory to allocate...
FIGURE 15.10 Bayesian belief network for airport security uses evidence to d...
Chapter 16
FIGURE 16.1 The largest ports in the world are mostly located in Asia. (a) L...
FIGURE 16.2 A typical optimized supply chain contains bottlenecks known as h...
FIGURE 16.3 The GDP of the United States, the European Union (EU), and China...
FIGURE 16.4 The World Trade Web is a network of 178 nations (nodes) and thei...
FIGURE 16.5 Risk ranking in MSRAM considers vulnerability and consequence.
FIGURE 16.6 PROTECT uses Stackelberg competition to allocate limited resourc...
Chapter 17
FIGURE 17.1 The Federal Reserve (Fed) and US Department of Treasury (Treasur...
FIGURE 17.2 Money flows from the Treasury to the Fed and then on to the Fede...
FIGURE 17.3 The Fed balance sheet expanded at an alarming rate following the...
FIGURE 17.4 The 3‐D secure payment protocol enables online e‐commerce using ...
FIGURE 17.5 Two different forms of payment using virtual currency. (a) PayPa...
FIGURE 17.6 Transactions in the UCSD study span the globe.
FIGURE 17.7 Hot money flows from low interest rate economies to high interes...
FIGURE 17.8 Simulation of the nonlinear effects of economic expansion on GDP...
FIGURE 17.9 The S&P 500 index crashed in October 1987. Log‐periodic waves ar...
FIGURE 17.10 The S&P 500 index from July 1985 to October 26, 1987, is shown ...
Chapter 18
FIGURE 18.1 Rewiring the network of Figure 10.7 to reduce PML risk improves ...
Appendix A
FIGURE A.1 Probability distribution for the number of heads occurring in fou...
FIGURE A.2 A random network contains nodes with a binomial degree distributi...
FIGURE A.3 A tabular model of Pascal's ideal world of mathematical precision...
FIGURE A.4 A Bayesian network (BN) model of threat. (a–c) Prior beliefs of a...
Appendix B
FIGURE B.1 The computation tree of the BN in Chapter 2 contains all possible...
FIGURE B.2 Comparison of probability distribution, ranked exceedence probabi...
FIGURE B.3 Forest fires in Southern California are high risk according to th...
FIGURE B.4 Number of people who contracted SARS during an epidemic that star...
FIGURE B.5 Models of threat, vulnerability, and consequence used by MBRA att...
FIGURE B.6 Network model of major Amtrak routes in the United States. Nodes ...
FIGURE B.7 Exceedence probability of cascade episodes is obtained by simulat...
Appendix C
FIGURE C.1 The flow network of Chapter 4 as it is represented in a computer ...
Third Edition
TED G. LEWIS
This third edition first published 2020
© 2020 John Wiley & Sons, Inc.
Edition History
John Wiley & Sons Inc. (1e, 2006); John Wiley & Sons Inc. (2e, 2015)
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Ted G. Lewis to be identified as the author of this work has been asserted in accordance with law.
Registered Office
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
Editorial Office
111 River Street, Hoboken, NJ 07030, USA
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
In view of ongoing research, equipment modifications, changes in governmental regulations, and the constant flow of information relating to the use of experimental reagents, equipment, and devices, the reader is urged to review and evaluate the information provided in the package insert or instructions for each chemical, piece of equipment, reagent, or device for, among other things, any changes in the instructions or indication of usage and for added warnings and precautions. While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read. Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging‐in‐Publication Data
Names: Lewis, T. G. (Theodore Gyle), 1941– author.
Title: Critical infrastructure protection in homeland security : defending a networked nation / Theodore Gyle Lewis.
Description: Third edition. | Hoboken, NJ : John Wiley & Sons Inc., 2020. | Includes bibliographical references and index.
Identifiers: LCCN 2019032791 (print) | LCCN 2019032792 (ebook) | ISBN 9781119614531 (hardback) | ISBN 9781119614555 (adobe pdf) | ISBN 9781119614562 (epub)
Subjects: LCSH: Computer networks–Security measures–United States. | Computer security–United States–Planning. | Terrorism–United States–Prevention. | Terrorism–Government policy–United States. | Civil defense–United States. | Public utilities–Protection–United States.
Classification: LCC QA76.9.A25 L5 2020 (print) | LCC QA76.9.A25 (ebook) | DDC 005.8–dc23
LC record available at https://lccn.loc.gov/2019032791
LC ebook record available at https://lccn.loc.gov/2019032792
Cover design by Wiley
Cover image: © SERGII IAREMENKO/SCIENCE PHOTO LIBRARY/Getty Images
“Today, December 7th, is an auspicious date in our history. We remember Pearl Harbor as the first foreign attack on US soil in modern history. Unfortunately, we also remember Pearl Harbor as a major intelligence failure. As Vice Chairman of the Intel Committee, I've spent the better part of the last two years on an investigation connected to America's most recent intelligence failure. It was also a failure of imagination—a failure to identify Russia's broader strategy to interfere in our elections. Our federal government and institutions were caught flat‐footed in 2016, and our social media companies failed to anticipate how their platforms could be manipulated and misused by Russian operatives. Frankly, we should have seen it coming.
Over the last two decades, adversary nations like Russia have developed a radically different conception of information security—one that spans cyber warfare and information operations. I fear that we have entered a new era of nation‐state conflict: one in which a nation projects strength less through traditional military hardware and more through cyber and information warfare. For the better part of two decades, this was a domain where we thought we had superiority. The thinking was that our cyber capabilities were unmatched. Our supposed superiority allowed us to write the rules.
This confidence appears to have blinded us to three important developments: First, we are under attack, and candidly, we have been for many years. Our adversaries and their proxies are carrying out cyber attacks at every level of our society. We've seen state‐sponsored or sanctioned attacks on healthcare systems, energy infrastructure, and our financial system. We are witnessing constant intrusions into federal networks. We're seeing regular attempts to access parts of our critical infrastructure and hold them ransom. Last year, we saw global ransomware attacks increase by 93%. Denial‐of‐service attacks increased by 91%. According to some estimates, cyber attacks and cybercrime account for up to $175 billion in economic and intellectual property loss per year in North America. Globally, that number is nearly $600 billion. Typically, our adversaries aren't using highly sophisticated tools. They are attacking opportunistically using phishing techniques and rattling unlocked doors. This has all been happening under our noses. The effects have been devastating, yet the attackers have faced few, if any, consequences.
Second, in many ways, we brought this on ourselves. We live in a society that is becoming more and more dependent on products and networks that are under constant attack. Yet the level of security we accept in commercial technology products is unacceptably low—particularly when it comes to rapidly growing Internet of Things. This problem is only compounded by our society‐wide failure to promote cyber hygiene. It is an outrage that more digital services from email to online banking don't come with default two‐factor authentication. And it is totally unacceptable that large enterprises—including federal agencies—aren't using the available tools.
Lastly, we have failed to recognize that our adversaries are working with a totally different playbook. Countries like Russia are increasingly merging traditional cyber attacks with information operations. This emerging brand of hybrid cyber warfare exploits our greatest strengths—our openness and free flow of ideas. Unfortunately, we are just now waking up to it. Looking back, the signs should have been obvious. Twenty years ago, Sergei Lavrov, then serving as Russia's UN Ambassador, advanced a draft resolution dealing with cyber and prohibiting particularly dangerous forms of information weapons. We can debate the sincerity of Russia's draft resolution, but in hindsight, the premise of this resolution is striking. Specifically, the Russians saw traditional cyber warfare and cyber espionage as interlinked with information operations. It's true that, as recently as 2016, Russia continued to use these two vectors—cyber and information operations—on separate tracks. But there is no doubt that Putin now sees the full potential of hybrid cyber operations. By contrast, the United States spent two decades treating information operations and traditional information security as distinct domains. Increasingly, we treated info operations as quaint and outmoded. Just a year after Lavrov introduced that resolution, the United States eliminated the United States Information Agency, relegating counterpropaganda and information operations to a lower tier of foreign policy. In the two decades that followed, the United States embraced the Internet revolution as inherently democratizing. We ignored the warning signs outside the bubble of Western democracies.
The naïveté of US policy makers extended not just to Russia, but to China as well. Recall when President Clinton warned China that attempts to police the Internet would be like nailing Jell‐O to the wall. In fact, China has been wildly successful at harnessing the economic benefits of the Internet in the absence of political freedom. China's doctrine of cyber sovereignty is the idea that a state has the absolute right to control information within its border. This takes the form of censorship, disinformation, and social control. It also takes the form of traditional computer network exploitation. And China has developed a powerful cyber and information affairs bureaucracy with broad authority to enforce this doctrine. We see indications of the Chinese approach in their successful efforts to recruit Western companies to their information control efforts. Just look at Google's recent push to develop a censored version of its search engine for China. Today, China's cyber and censorship infrastructure is the envy of authoritarian regimes around the world. China is now exporting both its technology and its cyber‐sovereignty doctrine to countries like Venezuela, Ethiopia, and Pakistan. With the export of these tools and ideas, and with countries like North Korea and Iran copying Russia's disinformation playbook, these challenges will only get worse. And yet as a country we remain complacent.
Despite a flurry of strategy documents from the White House and DoD, the federal government is still not sufficiently organized or resourced to tackle this hybrid threat. We have no White House cyber czar, nor cyber bureau or senior cyber coordinator at the State Department. And we still have insufficient capacity at State and DHS when it comes to cybersecurity and disinformation. Our Global Engagement Center at the State Department is not sufficiently equipped to counter propaganda from our adversaries. And the White House has still not clarified roles and responsibilities for cyber across the US government. While some in the private sector have begun to grapple with the challenge, many more remain resistant to the changes and regulations needed. And the American people—still not fully aware of the threat—have not internalized the lessons of the last few years. We have a long way to go on cyber hygiene and online media consumption habits. Let me be clear: Congress does not have its act together either. We have no cyber committee. Cyber crosses numerous committee jurisdictions frequently hindering our ability to get ahead of the problem.
It's even worse in the area of misinformation/disinformation. The dangers are only growing as new technologies such as Deepfakes (audio and video manipulation that can literally put words into someone's mouth) are commercialized. The truth is, we are becoming ever more dependent on software. But at the same time, we are treating cybersecurity, network resiliency, and data reliability as afterthoughts. And these vulnerabilities will only continue to grow as our so‐called real economy becomes increasingly inseparable from the digital economy.
If we're going to turn this around, we need not just a whole‐of‐government approach; we need a whole‐of‐society cyber doctrine. So what would a US cyber doctrine look like? It's not enough to simply improve the security of our infrastructure, computer systems, and data. We must also deal with adversaries who are using American technologies to exploit our freedom and openness and attack our democracy.
Let me lay out five recommendations:
First, we need to develop new rules and norms for the use of cyber and information operations. We also need to better enforce existing norms. And most importantly, we need to do this on an international scale. We need to develop shared strategies with our allies that will strengthen these norms. When possible, we need to get our adversaries to buy into these norms as well. The truth is, our adversaries continue to believe that there won't be any consequences for their actions. In the post‐9/11 national security environment, we spent tremendous energy combating terrorism and rogue states. But frankly, we've allowed some of our near‐peer adversaries to operate with relative impunity when they attack the United States in the digital domain. There have been some reports in the press about the United States supposedly punching back at second‐tier adversaries on occasion. But we've largely avoided this with Russia and China out of a fear of escalation. If a cyber attack shuts down Moscow for 24 hours with no power, that's a problem. If someone were to shut down New York for 24 hours, that would be a global crisis. As a result, for Russia and China, it's pretty much been open season on the United States. That has to end.
We need to have a national conversation about the defensive and offensive tools we are willing to use to respond to the ongoing threats we face. In short, we need to start holding our adversaries accountable. Failing to articulate a clear set of expectations about when and where we will respond to cyber attacks is not just bad policy, but it is downright dangerous. We are allowing other nations to write the playbook on cyber norms. Part of this is the result of US inaction: from the late 1990s into the early 2000s, the United States was a consistent dissenting voice in UN meetings where cyber norms were proposed. In part, this reflected our aversion to piecemeal approaches to cybersecurity. But it also reflected a view that we didn't want to be bound by lesser powers. In 2015, there was a major effort at the UN—including the United States—to agree to principles of state behavior in cyberspace. We saw some international consensus around protecting critical infrastructure and investigating and mitigating cybercrime. Unfortunately, those 2015 principles at the UN failed to address economic espionage. And even the 2015 US–China cyber espionage deal was insufficient. And in 2017, disagreements between the United States, China, and Russia at the UN led to a deadlock on the question of how international law should apply to cyber conflicts. Little progress has been made since then.
It's true that some folks in the private sector and the NGO space have stepped up. Look at Microsoft's Digital Geneva Convention. Look at the recent Paris Call for Trust and Security in Cyberspace—signed by 57 nations, but not by the United States. This is yet another example of the United States stepping back on the world stage, with countries like France filling the void.
Recently, the US government and the State Department, in particular, have renewed efforts to advance a norms discussion. These efforts must be elevated and strengthened. But norms on traditional cyber attacks alone are not enough. We also need to bring information operations into the debate.
This includes building support for rules that address the Internet's potential for censorship and repression. We need to present alternatives that explicitly embrace a free and open Internet. And we need that responsibility to extend not only to government, but to the private sector as well. We need multilateral agreements with key allies, just like we've done with international treaties on biological and chemical weapons. That discussion needs to address mutual defense commitments.
We should be linking consensus principles of state behavior in cyberspace, explicitly, with deterrence and enforcement policies. US policy makers, with allies, should predetermine responses for potential targets, perpetrators, and severity of attack. That means clearly and publicly linking actions and countermeasures to specific provocations. That could mean sanctions, export controls, or indictments. It could even include military action or other responses. Now, we should be realistic about the limits of norms in shaping behavior.
Let's not kid ourselves: in the short term, a nation like Russia that routinely ignores global norms is not going to make an about‐face in the cyber domain. This should not deter us, but it should give us a more realistic set of expectations for how quickly we can expect to see results. But the stronger we make these alliances, the more teeth we can apply to these norms, and the more countries we can recruit to them, the more effective these efforts will be at disciplining the behavior of Russia, China, and other adversaries.
My second recommendation is: we need a society‐wide effort to combat misinformation and disinformation, particularly on social media. My eyes were really opened to this through the Intel Committee's Russia investigation. Everyone on the Committee agrees that this linkage between cyber threats and disinformation is a serious challenge—especially on social media. In some ways, this was a whole new world for the IC. It is now clear that foreign agents used American‐made social media to spread misinformation and hijack our civil discourse.
Let's recap. The Russian playbook included:
Cyber penetrations of our election infrastructure;
Hacks and weaponized leaks;
Amplification of divisive, pro‐Kremlin messages via social media;
Overt propaganda;
Funding and supporting extreme candidates or parties; and
Misinformation, disinformation, and actual fake news.
The goal was, and is, to undermine our faith in the facts—our faith in the news media—and our faith in the democratic process. This is an ongoing threat, and not just to the United States. We've also seen these tools used against other Western democracies. We've seen them used to incite racial and ethnic violence in places like Myanmar. This threat is particularly serious in countries with low media literacy. In many ways, social media IS the Internet in some of these countries. So, what do we do? How do we combat this threat? We can start by recognizing that this is a truly global problem. A twenty‐first‐century cyber and misinformation doctrine should lean into our alliances with NATO countries and other allies who share our values.
Earlier this year, Senator Rubio and I brought together a group of 12 parliamentarians from our NATO allies at the Atlantic Council. We held a summit focused on combating Russian election interference. Ironically, this was the very same day that our President stood on stage and kowtowed to Vladimir Putin in Helsinki. Meanwhile, we were working with our NATO allies to develop a road map for increased cooperation and information sharing to counter Russian cyber and misinformation/disinformation aggression. In many cases, these countries are further along in educating their populations about the threat of misinformation and disinformation.
Last month, I met with the Prime Minister of Finland. As he put it, the Finns have been dealing with Russian misinformation and disinformation for over 100 years. Finland is one of the most resilient countries when it comes to countering this threat from its neighbor to the east. Why is that? Again, it is their whole‐of‐society approach. It relies on a free press that maintains trust through strong self‐regulatory mechanisms and journalistic standards. It places limits on social media platforms. They also have a vibrant digital civics initiative.
Finland's approach also depends on national leadership that stays true to its values—even in the midst of contested elections and its own brand of partisan politics. Here in the United States, it will take all of us—the private sector, the government, including Congress, and the American people—to deal with this new and evolving threat.
In terms of the private sector, the major platform companies—like Twitter and Facebook, but also Reddit, YouTube, and Tumblr—aren't doing nearly enough to prevent their platforms from becoming petri dishes for Russian disinformation and propaganda.
I don't have any interest in regulating these companies into oblivion. But as these companies have grown from dorm‐room startups into media behemoths, they have not acknowledged that their power comes with great responsibility. Recall that immediately following the election, Mr. Zuckerberg publicly ridiculed the idea that Russia had influenced the US election via Facebook as a “pretty crazy idea.”
Now, I don't have all the solutions. But I expect these platforms to work with us in Congress so that together we can take steps to protect the integrity of our elections and our civil discourse in the future. Companies like Facebook and Twitter have taken some helpful voluntary steps—but we need to see much more from them.
That's going to require investments in people and technology to help identify misinformation before it spreads widely. I've put forward a white paper, which lays out a number of policy proposals for addressing this: we can start with greater transparency. For example, I think folks have the right to know if information they're receiving is coming from a human or a bot. I've also put forward legislation called the Honest Ads Act that would require greater transparency and disclosure for online political ads.
Companies should also have a duty to identify inauthentic accounts—if someone says they're Mark from Alexandria but it's actually Boris in St. Petersburg, I think people have a right to know. We also need to put in place some consequences for social media platforms that continue to propagate truly defamatory content. I think platforms should give greater access to academics and other independent analysts studying social trends like disinformation. We also discuss a number of other ideas in the white paper around privacy, price transparency, and data portability. These are ideas intended to spark a discussion, and we need social media companies' input. But we're moving quickly to the point where Congress will have no choice but to act on its own. One thing is clear: the wild west days of social media are coming to an end.
Third, we need to harden the security of our computer networks, weapons systems, and IoT devices. Many of the responsibilities for cyber and misinformation/disinformation will fall on the government. But our nation's strategic response must also include greater vigilance by the private sector, which has frequently resisted efforts to improve the security of its products.
For over a decade, the United States thought it could set a light‐touch standard for global data protection by avoiding any legislation. While regulation can have costs, what we've learned is that US inaction can also have costs—as other jurisdictions leap ahead with more stringent privacy and data protections.
We see this with GDPR, where the US failure to adopt reasonable data protection and privacy rules left the field open for much stricter European rules. These standards are now being adopted by major economies like Brazil, India, and Kenya. More broadly, we need to think about a software liability regime that drives the market toward more secure development across the entire product lifecycle. But nowhere is the need for private sector responsibility greater than the Internet of Things. General Ashley, Director of the DIA, has described insecure IoT and mobile devices as the most important emerging cyber threat to our national security.
As a first step, we should use the purchasing power of the federal government to require that devices meet minimum security standards. I have legislation with Senator Cory Gardner to do this. At least at the federal level, we need to make sure that these devices are patchable. We need to make sure they don't have hard‐coded passwords that cannot be changed. We need standards to make sure they're free of known security vulnerabilities. And on a broader level, public companies should have at least one board member who can understand and model cyber risk.
Another area I've been working on is trying to impose some financial penalties on companies like Equifax who fail to take the necessary steps to secure their systems from cyber intrusions. Unfortunately, even in areas where we would expect a higher level of security and cyber hygiene, we find these same problems. In October, a GAO report found that “nearly all” of our new weapons systems under development are vulnerable to attack.
Earlier this year, we successfully included language in the NDAA requiring cyber vulnerability assessments for weapons systems, which hopefully should help correct this. The Pentagon has also taken steps recently to make cybersecurity a greater priority within DoD, but frankly we face some serious workforce challenges in recruiting and retaining the top cyber professionals who have plenty of lucrative opportunities in the private sector.
This is a good segue to my fourth recommendation: realigning our defense spending priorities. The US military budget is more than $700 billion, while Russia spends roughly $70 billion a year on their military. The United States is spending it mostly on conventional weapons and personnel. By contrast, Russia devotes a much greater proportion of its budget to cyber and other tools of asymmetric warfare like disinformation. Russia has come to the realization that they can't afford to keep up with us in terms of traditional defense spending. But when it comes to cyber, misinformation, and disinformation, candidly Russia is already a peer adversary.
As a matter of fact, if you add up everything Russia spent on election interference in 2016 and double it, that's still less than the cost of one new F‐35. I worry we may be buying the world's best twentieth‐century military hardware without giving enough thought to the twenty‐first‐century threats we face. And it's a similar story with China. China spends roughly $200 billion on defense, but it spends a greater proportion on cyber, misinformation, and disinformation. If you look at the delta between what we're spending and what China is spending on defense, they're investing more in AI, quantum computing, 5G, and other twenty‐first‐century technologies. Frankly, they are outpacing us by orders of magnitude. We need to realign our priorities while we still can. Some of DoD's budget should be redirected toward cyber defense. But we also need efforts at other agencies, including R&D funding for quantum computing and AI, as well as investments in cyber technology and cyber workforce development.
The final point is that we desperately need strong federal and presidential leadership for any US cyber doctrine to be truly effective. Because this challenge literally touches every aspect of our society, we need presidential leadership and a senior coordinating official to head the interagency process on this issue.
It's true there are men and women within DoD, DHS, and other agencies who are working hard to defend the United States from cyber attacks. But only the President can mobilize the whole‐of‐society strategy we need. I do want to acknowledge some positive steps that have been taken in recent months.
The White House and DoD have released two important strategic documents on cyber strategy that move us in the right direction. I also welcome the delegation of authorities to defend and deter cyber attacks below the presidential level. This has allowed for quicker responses and greater interagency coordination. But frankly, these efforts are inadequate.
In the most recent NDAA, Congress attempted to establish a more aggressive posture on US cybersecurity policy. This includes the potential use of offensive cyber capabilities to deter and respond to cyber attacks against US interests—as well as authorization to combat info operations. It also grants the President and Defense Secretary authority to direct Cyber Command to respond and deter “an active, systematic, and ongoing campaign of attacks” carried out by Russia, China, North Korea, and Iran. These powers, if used correctly, are important components of a cyber doctrine. But by definition they require thoughtful, decisive leadership at the top.
I'll leave you with some final thoughts. More broadly, we need a coherent strategy for how to deal with the hybrid approach of our adversaries. Let me be clear about what I'm not saying: I am not advocating that the United States mimic the approach of Russia and China—the idea that states have a sovereign right to control or censor information within their borders. Frankly, that vision is incompatible with our American values and our Constitution.
What I am saying is that we need to confront the fact that our adversaries have an approach that considers control of information an essential component of their overall strategies. We have not only failed to recognize this situation, but over the last two decades we have tended to minimize the dangers of information operations. The truth is, the 2016 presidential election served as a wake‐up call in the use of cyber attacks and information operations.
People keep warning of a “digital Pearl Harbor” or a “digital 9/11” as if there will be a single extraordinary event that will force us to action on these issues. But I have news for you: we are already living these events. They're happening every day. Look at the 2017 NotPetya attack. In the United States, we treated this as a one‐day news story, but the global cost of that one attack is over $10 billion. This is the most costly and devastating cybersecurity incident in history, and most Americans have no idea. But the true costs of