An up-to-date and comprehensive treatment of biosurveillance techniques

With the worldwide awareness of bioterrorism and drug-resistant infectious diseases, surveillance systems that accurately detect emerging epidemics are essential for maintaining global safety. Responding to these issues, Disease Surveillance brings together fifteen eminent researchers in the fields of medicine, epidemiology, biostatistics, and medical informatics to define the necessary elements of an effective disease surveillance program, including research, development, implementation, and operations. The surveillance systems and techniques presented in the text are designed to best utilize modern technology, manage emerging public health threats, and adapt to environmental changes. Following a historical overview detailing the need for disease surveillance systems, the text is divided into three parts:

* Part One sets forth the informatics knowledge needed to implement a disease surveillance system, including a discussion of data sources currently used in syndromic surveillance systems.
* Part Two provides case studies of modern disease surveillance systems, including cases that highlight implementation and operational difficulties as well as the successes experienced by health departments in the United States, Canada, Europe, and Asia.
* Part Three addresses practical issues concerning the evaluation of disease surveillance systems and the education of future informatics and disease surveillance practitioners. It also assesses how future technology will shape the field of disease surveillance.

This book's multidisciplinary approach is ideal for public health professionals who need to understand all the facets of a disease surveillance program and implement the technology needed to support surveillance activities.
An outline of the components needed for a successful disease surveillance system combined with extensive use of case studies makes this book well-suited as a textbook for public health informatics courses
Contents
Cover
Half Title page
Title page
Copyright page
Dedication
Contributors
Preface
Acknowledgments
Chapter 1: Disease Surveillance, a Public Health Priority
1.1 Introduction
1.2 The Emerging Role of Informatics in Public Health Practice
1.3 Early Use of Technology for Public Health Practice
1.4 Guiding Principles for Development of Public Health Applications
1.5 Information Requirements for Automated Disease Surveillance
1.6 Historical Impact of Infectious Disease Outbreaks
1.7 Disease as a Weapon
1.8 Modern Disease Surveillance Applications
1.9 Summary
References
Part I: System Design and Implementation
Chapter 2: Understanding the Data: Health Indicators in Disease Surveillance
2.1 Data Source Concepts
2.2 Data from Pharmacy Chains
2.3 Data from EMS and 911
2.4 Data from Telephone Triage Hotlines
2.5 Data from School Absenteeism and School Nurses
2.6 Data from Hospital Visits
2.7 Data from Physicians’ Office Visits
2.8 Laboratories' Role in Pre-Diagnostic Surveillance
2.9 Other Health Indicator Data
2.10 Data Source Evaluation
2.11 Study Questions
References
Chapter 3: Obtaining the Data
3.1 Introduction to Data Collection and Archiving
3.2 Obtaining Access to Surveillance Data
3.3 The Role of Standards in Data Exchange
3.4 Establishing the Data Feeds
3.5 Study Questions
References
Chapter 4: Alerting Algorithms for Biosurveillance
4.1 Need for Statistical Alerting Algorithms
4.2 Features of Alerting Algorithms
4.3 Outbreak Detection as a Signal-to-Noise Problem
4.4 Algorithms Based on Time-Series Data
4.5 Spatiotemporal Alerting Methods
4.6 Methods Considering Multiple Data Sources
4.7 Study Questions
References
Chapter 5: Putting It Together: The Biosurveillance Information System
5.1 Introduction
5.2 System Architectures for Disease Surveillance
5.3 Databases
5.4 Web Applications
5.5 Implementing Syndromic Grouping
5.6 Implementing Detectors
5.7 Visualization in a Disease Surveillance Application
5.8 Communication Among Surveillance Users
5.9 Security
5.10 System Administration
5.11 Summary
5.12 Study Questions
References
Part II: Case Studies
Chapter 6: Modern Disease Surveillance Systems in Public Health Practice
6.1 Public Health Surveillance Requirements
6.2 Identification of Abnormal Health Conditions
6.3 Utility of Disease Surveillance Systems at the Local Level
6.4 Electronic Biosurveillance at the National Level
6.5 Summary
6.6 Study Questions
References
Chapter 7: Canadian Applications of Modern Surveillance Informatics
7.1 Introduction: Disease Surveillance in Canada
7.2 Disease-Specific Surveillance Enabled Through Technology
7.3 Real-Time Syndromic Surveillance
7.4 Conclusions: Public Health Surveillance
7.5 Study Questions
References
Chapter 8: Case Study: Use of Tele-health Data for Syndromic Surveillance in England and Wales
8.1 Introduction
8.2 System Design and Epidemiological Considerations
8.3 Results from The NHS Direct Syndromic Surveillance System
8.4 Adding Value to the Surveillance Data
8.5 Conclusions
8.6 Study Questions
References
Chapter 9: Surveillance for Emerging Infection Epidemics in Developing Countries: EWORS and Alerta DISAMAR
9.1 Improving Surveillance in Resource-Poor Settings
9.2 U.S. Military Overseas Public Health Capacity Building
9.3 Case Study 1: EWORS (Southeastern Asia and Peru)
9.4 Case Study 2: Alerta DISAMAR (Peru)
9.5 Conclusions
9.6 Study Questions
References
Part III: Evaluation, Education, and the Future
Chapter 10: Evaluating Automated Surveillance Systems
10.1 The Context of Evaluation
10.2 Defining the Evaluation
10.3 Identifying or Creating Evaluation Data
10.4 Applying Detection Algorithms and Response Protocols
10.5 Measuring Performance
10.6 Summary
10.7 Study Questions
References
Chapter 11: Educating the Workforce: Public Health Informatics Training
11.1 Competencies for Disease Surveillance
11.2 Professions of Disease Surveillance
11.3 Training Opportunities in Public Health Education
11.4 Informatics Training
11.5 Conclusions
11.6 Study Questions
References
Chapter 12: The Road Ahead: The Expanding Role of Informatics in Disease Surveillance
12.1 Introduction
12.2 Integration of Disease Surveillance Systems
12.3 Surveillance System Enhancements
12.4 Study Questions
References
Index
DISEASE SURVEILLANCE
THE WILEY BICENTENNIAL–KNOWLEDGE FOR GENERATIONS
Each generation has its unique needs and aspirations. When Charles Wiley first opened his small printing shop in lower Manhattan in 1807, it was a generation of boundless potential searching for an identity. And we were there, helping to define a new American literary tradition. Over half a century later, in the midst of the Second Industrial Revolution, it was a generation focused on building the future. Once again, we were there, supplying the critical scientific, technical, and engineering knowledge that helped frame the world. Throughout the 20th Century, and into the new millennium, nations began to reach out beyond their own borders and a new international community was born. Wiley was there, expanding its operations around the world to enable a global exchange of ideas, opinions, and know-how.
For 200 years, Wiley has been an integral part of each generation’s journey, enabling the flow of information and understanding necessary to meet their needs and fulfill their aspirations. Today, bold new technologies are changing the way we live and learn. Wiley will be there, providing you the must-have knowledge you need to imagine new worlds, new possibilities, and new opportunities.
Generations come and go, but you can always count on Wiley to provide you the knowledge you need, when and where you need it!
Copyright © 2007 by John Wiley & Sons, Inc. All rights reserved.
Published by John Wiley & Sons, Inc., Hoboken, New Jersey. Published simultaneously in Canada.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic format. For information about Wiley products, visit our web site at www.wiley.com.
Wiley Bicentennial Logo: Richard J. Pacifico
Library of Congress Cataloging-in-Publication Data:
Disease surveillance : a public health informatics approach / [edited by] Joseph S. Lombardo, David Buckeridge.
p. ; cm.
Includes bibliographical references and index.
ISBN 978-0-470-06812-0 (cloth : alk. paper)
1. Public health surveillance. 2. Medical informatics. I. Lombardo, Joseph S., 1946– II. Buckeridge, David, 1970–
[DNLM: 1. Population Surveillance—methods. 2. Public Health Informatics. WA 105 D6117 2007]
RA652.2.P82D57 2007
362.1—dc22
2006053118
To those public health workers who get the call at 5 P.M. on Friday afternoon and give freely of their own time to protect the health of the populations they serve. It is hoped that the advanced disease surveillance methods described in this book will help them to use their time and talents more efficiently to accomplish their mission.
Contributors
JEFF ARAMINI, DVM, PHD, Public Health Agency of Canada, Guelph, Ontario, Canada
RAJ ASHAR, MA, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
STEVEN BABIN, MD, PHD, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
DAVID BLAZES, MD, MPH, U.S. Naval Medical Research Center Detachment, Lima, Peru
DAVID L. BUCKERIDGE, MD, PHD, McGill University, Montreal, Quebec, Canada
HOWARD BURKOM, PHD, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
JEAN-PAUL CHRETIEN, MD, PHD, Walter Reed Army Institute of Research, Silver Spring, Maryland, USA
JACQUELINE COBERLY, PHD, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
DUNCAN COOPER, BSC, MRES, Health Protection Agency, West Midlands, United Kingdom
R. LOREN ERICKSON, MD, DRPH, Walter Reed Army Institute of Research, Silver Spring, Maryland, USA
JONATHAN GLASS, MD, U.S. Naval Medical Research Center Detachment, Jakarta, Indonesia
SHILPA HAKRE, MPH, DRPH, Walter Reed Army Institute of Research, Silver Spring, Maryland, USA
SHERI HAPPEL LEWIS, MPH, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
LOGAN HAUENSTEIN, MS, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
REKHA HOLTRY, MPH, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
HAROLD LEHMANN, MD, PHD, The Johns Hopkins University School of Medicine, Baltimore, Maryland, USA
KATHY HURT-MULLEN, MPH, Montgomery County (Maryland) Department of Health and Human Services, Rockville, Maryland, USA
JOSEPH S. LOMBARDO, MS, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
WAYNE LOSCHEN, MS, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
HAOBO MA, MD, MS, Science Applications International Corporation, Atlanta, Georgia, USA
STEVEN MAGRUDER, PHD, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
COLLEEN MARTIN, MSPH, Science Applications International Corporation, Atlanta, Georgia, USA
SHAMIR NIZAR MUKHI, MSC, Public Health Agency of Canada, Guelph, Ontario, Canada
CECILIA MUNDACA, MD, U.S. Naval Medical Research Center Detachment, Lima, Peru
DAVID ROSS, SCD, Public Health Informatics Institute (PHII), Decatur, Georgia, USA
MARVIN SIKES, MHA, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
CAROL SNIEGOSKI, MS, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
NATHANIEL TABERNERO, MS, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
MICHAEL W. THOMPSON, PHD, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
JEROME I. TOKARS, MD, MPH, Centers for Disease Control and Prevention (CDC), Atlanta, Georgia, USA
RICHARD WOJCIK, MS, The Johns Hopkins University Applied Physics Laboratory (JHU/APL), Laurel, Maryland, USA
Preface
During the last quarter of the twentieth century, countries with advanced healthcare systems felt comfortable in their ability to manage diseases that would have caused high morbidity and mortality in earlier years. Smallpox had been eradicated, and the memory of the Spanish influenza outbreak of 1918 had faded. Toward the end of the twentieth century and into the twenty-first, the overuse of antibiotics and the resulting drug resistance, the rapid spread of HIV, and the growing threat of bioterrorism were public health issues that began to increase pressure to enhance existing disease surveillance processes.
Among these concerns, the clandestine release of a deadly pathogen on an unsuspecting population may be the most insidious public health threat. Most pathogens available as bioweapons can cause high mortality and could lead to the collapse of the healthcare delivery and emergency response systems in an area under attack. The contamination and possible closure of major medical centers, even if only temporary, would have a serious impact on the health of the population. To mitigate the consequences of this type of public health event, an effective detection and treatment campaign must be launched early in the course of the outbreak.
Because of the threat of bioterrorism and the emergence of new infectious diseases, disease surveillance systems that utilize modern technology are becoming commonplace in public health agencies. The objective of this book is to present the components of an effective, technology-based disease surveillance program: the research, development, implementation, and operational strategies that are finding their way into successful practice.
Advanced disease surveillance systems automatically acquire, archive, process, and present data to the user. The development and maintenance of the systems require skilled personnel from the fields of medicine, epidemiology, biostatistics, and information technology. In addition, for the surveillance systems to be useful, they must adapt to the changing environment in which they operate and accommodate emerging public health requirements that were not conceived previously.
Research and innovation have led to the implementation of surveillance methods that would have been considered impossible or radical only a few years ago. For example, the case definitions or events under surveillance, which traditionally rely on diagnosis, have been altered in many systems to rely on less specific, pre-diagnostic health indicators of syndromes. Correctly filtering data into syndromes or other categories for analysis requires knowledge of the underlying diseases and the health-seeking behaviors of the population. Additionally, for analytical tools to have high specificity, they must take into account the normal range of all of the variables that make up the background for the health indicators. Tools that fuse data and information from inhomogeneous indicators are necessary to provide decision-makers with comprehensible output. Similarly, information technologists must automate data ingestion and cleansing, optimize system architecture, and create user-friendly interfaces while meeting the challenge of using and customizing commercial, off-the-shelf products.
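The syndromic filtering described above is often introduced, at its very simplest, as keyword matching against free-text health-indicator records. The sketch below illustrates that idea only; the syndrome names and keyword lists are hypothetical, and operational systems use much richer, clinically validated classifiers.

```python
# Hypothetical keyword lists: real systems derive syndrome definitions from
# clinical review of chief-complaint and diagnosis data, not a fixed table.
SYNDROME_KEYWORDS = {
    "respiratory": ["cough", "shortness of breath", "wheez", "pneumonia"],
    "gastrointestinal": ["vomit", "diarrhea", "nausea", "abdominal pain"],
    "fever": ["fever", "chills"],
}

def classify_complaint(text: str) -> list[str]:
    """Return the syndromes whose keywords appear in a free-text complaint."""
    lowered = text.lower()
    return [syndrome for syndrome, keywords in SYNDROME_KEYWORDS.items()
            if any(kw in lowered for kw in keywords)]

print(classify_complaint("Fever and productive cough for 3 days"))
# -> ['respiratory', 'fever']
```

Note that a single record can fall into more than one syndrome group, which is one reason counts from different syndromic categories cannot simply be summed.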
Users’ requirements must take priority over solutions that are merely technologically exciting. A continuing dialogue must exist between the users and the multidisciplinary development team to establish an effective surveillance capability that fits the environment where it will be deployed. Without close interaction between these groups, effective advanced disease surveillance will be compromised. Changes to traditional thinking have resulted in improved methods that are better suited to the current challenges facing health departments. Because it is difficult to anticipate future public health emergencies, continuing adaptation is required to maintain satisfactory system performance.
The field of public health informatics is growing rapidly as applications of technology are being applied to permit health departments to recognize and manage disease in the populations they serve. This book is intended for use (1) as a textbook for public health informatics students, (2) as a reference for health departments that are exploring modern information technology to support their surveillance activities, and (3) as training material for workshops that are components of disease surveillance and public health conferences.
The contents of this book provide insight not only into the technology but also into the difficulties and successes that the public health community has had with the implementation and operational use of advanced disease surveillance systems. Hence, chapter authors provide not only the views of academics and developers of the technology but also those of users from health departments in the United States, Canada, the United Kingdom, South America, and Asia. This wide variety of perspectives is intended to provide a broad and balanced treatment of the issues involved in developing and operating advanced surveillance systems.
The book is divided into three parts. Following an introductory chapter (Chapter 1), the first part (Chapters 2 through 5) presents the methods and technologies needed to implement a modern disease surveillance system, including the data sources currently being used in syndromic surveillance systems (Chapter 2); the mechanisms for the acquisition of data for surveillance systems (Chapter 3); an overview of analytical methods for the recognition of abnormal trends within the data captured for surveillance (Chapter 4); and some basics of systems architectures, text parsing, and data visualization techniques (Chapter 5).
The second part of the book (Chapters 6 through 9) is devoted to case studies of modern disease surveillance systems and provides examples of several implementations in the United States, Canada, Europe, and Asia. These chapters indicate the breadth of the techniques used across the globe in applications of modern technology to disease surveillance.
The third and last part of this book (Chapters 10 through 12) addresses practical questions regarding the evaluation of disease surveillance systems, education of future public health informatics personnel and disease surveillance practitioners, and a look to the future to consider how technology will continue to influence the practice of disease surveillance.
Joseph S. Lombardo
The Johns Hopkins University Applied Physics Laboratory
Laurel, Maryland, USA
David L. Buckeridge
McGill University
Montreal, Quebec, Canada
Acknowledgments
The chapter authors drew upon their extensive knowledge and experience to make this book a reality. Their contributions have made it possible for this book to include a depth of information on the wide variety of topics needed to implement, operate, and evaluate advanced disease surveillance applications.
In addition to the chapter authors, the talents of several individuals made this book possible. They took time from their busy lives to help assemble and review the materials from the authors. We are indebted to Steve Babin in particular for his knowledge of LaTeX and for the enthusiasm he brought to the project. The book could not have been assembled without the organizational skills of Sue Pagan, who converted the diverse materials received from authors into chapters. Evelyn Harshbarger should also be recognized for converting draft figures into high-quality images. Christian Jauvin’s help was instrumental in creating the index. Finally, we would also like to thank Judy Marcus for her expert guidance in translating many flavors of jargon into readable material.
Joseph S. Lombardo, David Ross
Chapter 1: Disease Surveillance, a Public Health Priority
Pandemic influenza, West Nile virus, severe acute respiratory syndrome (SARS), and bioterrorism are a few of the current challenges facing public health officials. The need for early notification of, and response to, an emerging health threat is gaining visibility as public opinion increases the pressure to reduce the mortality and morbidity of health threats. With the greater emphasis on the early recognition and management of health threats, federal, state, and local health departments are turning to modern technology to support their disease surveillance activities. Several modern disease surveillance systems are in operational use today. This book presents the components of an effective automated disease surveillance system and is intended for use by public health informatics students, Master of Public Health students interested in modern disease surveillance techniques, and health departments seeking to improve their disease surveillance capabilities.
This introductory chapter provides an overview of the changing requirements for disease surveillance from the perspective of past, present, and future concerns. It includes a brief history of how technology has evolved to enhance disease surveillance, as well as a cursory look at modern disease surveillance technology and activities.
Control of infectious diseases is a cornerstone of public health. Various surveillance methods have been used over the centuries to inform health officials of the presence and spread of disease. The practice of disease surveillance began in the Middle Ages and evolved into the mandatory reporting of infectious disease cases to authorities responsible for the health of populations.
A common definition of surveillance is “the ongoing systematic collection, analysis, and interpretation of outcome-specific data for use in planning, implementation, and evaluation of public health practice” [1]. One of the more challenging aspects of public health surveillance is the early identification of infectious disease outbreaks that have the potential to cause high morbidity and mortality. In recent years, concern over potential uncontrolled outbreaks due to bioterrorism or the appearance of highly virulent viruses such as avian influenza has placed increased pressure on public health officials to monitor for abnormal disease activity. Public concern was heightened when, at the beginning of the twenty-first century, the dissemination of a biological warfare agent through the U.S. mail system revealed weaknesses in the ability of existing public health surveillance systems to provide early detection of a biological attack.
Containment of potential outbreaks is also confounded by advances in transportation. Modern transportation systems allow communicable diseases to be carried around the world, across many public health jurisdictions, within hours. Health authorities can no longer be concerned only with the health status of the populations they serve; they must also cooperate and collaborate in surveillance and containment activities at regional, national, and international levels.
The Internet is an enabling technology for collaboration across wide geographic areas. Information technology in general is also playing a vital role in the timely capture and dissemination of information needed for identification and control of outbreaks. The subject of this book is the use of modern information technology to support the public health mission for early disease recognition and containment.
For more than 50 years, public health has been undergoing a change in identity that strongly affects how the public health sector envisions the use of information technologies. Public health is best viewed as an emergent industry. It has grown from a collection of single-purpose disease prevention and intervention programs to a national network of professionals linked through professional and organizational bonds. The 1988 Institute of Medicine report titled “The Future of Public Health” recognized that public health was established around three core functions and 10 essential services.
The core functions are:
Assessment
Assurance
Policy development
The 10 essential services are:
Monitor health status to identify community health problems
Diagnose and investigate health problems and health hazards in the community
Inform, educate, and empower people about health issues
Mobilize community partnerships to identify and solve health problems
Develop policies and plans that support individual and community health efforts
Enforce laws and regulations that protect health and ensure safety
Link people to needed personal health services and assure the provision of health care when otherwise unavailable
Assure a competent public health and personal health care workforce
Evaluate effectiveness, accessibility, and quality of personal and population-based health services
Research for new insights and innovative solutions to health problems
Information is one of the central products of public health. Protecting community health, promoting health, and preventing disease, injury, and disability require vigorous monitoring and surveillance of health threats and aggressive application of information and knowledge by those positioned to prevent disease and protect the public’s health. Thus, public health informatics supports the activities, programs, and needs of those entrusted with assessing and ensuring that the health status of entire populations is protected and improves over time.
Public health informatics has been defined as the systematic application of information and computer science and technology to public health practice [2]. The discipline supports the programmatic needs of agencies, improves the quality of the population-based information on which public health policy is based, and expands the range of disease prevention, health promotion, and health threat assessment capability in every locale throughout the world [3]. In the future, public health informatics may come to be defined as informatics supporting the public’s health, a discipline practiced beyond the walls of the health department.
In 1854, John Snow conducted the first comprehensive epidemiological study by linking the locations of cholera patients’ homes to a single water pump. In doing so, he established that cholera was a waterborne disease. Using visual data, Snow quickly convinced the authorities to remove the pump handle. Following that simple intervention, the number of infections and deaths fell rapidly [4].
Over the past 30–50 years, public health programs have emerged around specific diseases, behaviors, or intervention technologies (e.g., immunization for vaccine preventable diseases), each having specific data and information needs. Not surprisingly, information systems were developed to meet the specific needs of each categorical program, and a culture of program-specific information system design permeated public health thinking. By the mid-1990s, leaders in public health acknowledged the need to rethink public health information systems, conceive of systems as support tools for enterprise goals, and do so through nationally adopted standards. As noted in [3] “Public health has lagged behind health care delivery and other sectors of industry in adopting new information technologies, in part because public health is a public enterprise depending on funding action by legislative bodies (local, state, and federal). Additionally, adoption of new technologies requires significant effort to work through government procurement processes.” A 1995 Centers for Disease Control and Prevention (CDC) study reported that integrated information and surveillance systems “can join fragments of information by combining or linking the data systems that hold such information. What holds these systems together are uniform data standards, communications networks, and policy-level agreements regarding confidentiality, data access, sharing, and reduction of the burden of collecting data” [5].
In the late 1990s, it became apparent that public health should be more comprehensive in understanding disease and injury threats. Reassessing its information mission has led federal programs such as the CDC and the Health Resources and Services Administration (HRSA) to view information system integration as the driver for future information system funding. Integration across programs and organizations requires interoperability: data from various sources being brought together, collated in a common format, analyzed, and interpreted without manual intervention. Interoperability also requires an underlying architecture for data coding, vocabularies, message formats, message transmission packets, and system security. Interoperability implies connectedness among systems, which requires agreements that cover data standards, communications protocols, and sharing or use agreements. Interconnected, interoperable information systems will allow public health to address larger aspects of the public’s health. The twenty-first century will probably be seen as the enterprise era of public health informatics. Once the domain of humans alone, the process of gathering and interpreting data should now be mediated by computers. Major advances in the quality, timeliness, and use of public health data will require a degree of machine intelligence not presently embedded in public health information systems [6].
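The "collated in a common format" step above can be sketched very simply: each source gets a small mapping routine into one shared schema, after which records can be analyzed jointly. Every field name in this sketch is hypothetical and is not drawn from any real surveillance standard or vocabulary.

```python
from datetime import date

# Illustrative only: the shared schema (event_date, zip, indicator) and the
# source field names are invented for this example.

def from_hospital(rec: dict) -> dict:
    """Map a hospital-style visit record into the shared schema."""
    return {
        "event_date": date.fromisoformat(rec["admit_dt"]),
        "zip": rec["patient_zip"],
        "indicator": rec["chief_complaint"].lower(),
    }

def from_pharmacy(rec: dict) -> dict:
    """Map a pharmacy-sales record into the same shared schema."""
    return {
        "event_date": date.fromisoformat(rec["sale_date"]),
        "zip": rec["store_zip"],
        "indicator": "otc:" + rec["product_category"],
    }

combined = [
    from_hospital({"admit_dt": "2007-02-01", "patient_zip": "21045",
                   "chief_complaint": "Cough"}),
    from_pharmacy({"sale_date": "2007-02-01", "store_zip": "21045",
                   "product_category": "cold-remedy"}),
]
# Records from different health indicators now share one format, so they can
# be counted, joined, and analyzed together without manual intervention.
```

In practice this translation layer is where agreed data standards and vocabularies matter most: without them, each new data feed requires a new, hand-built mapping like the ones above.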
The context in which informatics can contribute to public health progress is changing. New initiatives within public health and throughout the health care industry portend changes in how data are captured, the breadth of data recorded, the speed with which data are exchanged, the number of parties involved in the exchange of data, and how results of analyses are shared. Increasing use of electronic health record systems provides an opportunity to gather more granular, discrete data from a variety of sources, including nursing, pharmacy, laboratory, radiology, and physician notes, thereby changing the specificity and timeliness of knowledge about the distribution of risk factors, preventive measures, disease, and injury within subpopulations.
As agreements are reached on the major information architectural standards (data, transmission, and security) and appropriate approaches to governance and viable business models can be demonstrated, health information exchanges will emerge to assist and transform how health care is delivered. Public health considerations must be central to this transformation, and public health informatics will be central to how public health agencies participate in this rapidly evolving environment.
There are historical accounts in the Bible of social distancing as a control measure to stop the spread of leprosy. During the spread of plague in Europe in the fourteenth century, public health authorities searched vessels looking for signs of disease in passengers waiting to disembark. In the United States, the practice of disease surveillance by public health inspection at immigration has been highly publicized as a result of the renovation of Ellis Island. The immigration law of 1891 required a health inspection, by Public Health Service physicians, of all immigrants coming into the United States. Between 1892 and 1924, over 22 million immigrants seeking to become American citizens were subject to health inspections (Fig. 1.1). The law stipulated the exclusion of “all idiots, insane persons, paupers or persons likely to become public charges, persons suffering from a loathsome or a dangerous contagious disease” [7]. Technology was limited to paper-and-pencil recordkeeping for these surveillance and control activities.
Fig. 1.1 Public health inspectors at Ellis Island looking at the eyes of immigrants for signs of trachoma.
(Photo courtesy of the National Library of Medicine)
One of the earliest technologies used in disease surveillance was the statistical interpretation of mortality data. In 1850, William Farr analyzed the 1849 cholera outbreak in London by deriving a mathematical solution using multiple causation [8].
Florence Nightingale used statistical methods to fight for reform in the British military. She developed the polar-area diagram to demonstrate the needless deaths caused by unsanitary conditions during the Crimean War (1854–1856). Nightingale was an innovator in the collection, tabulation, interpretation, and graphical display of descriptive statistics. Figure 1.2 is Florence Nightingale’s famous diagram depicting the causes of mortality for British troops during the Crimean War. The circle in the figure is divided into wedges, each representing a month of the war. The radius of each wedge is equal to the square root of the number of deaths for the month. The area of each wedge, measured from the center, is proportional to the statistic being represented. Dark gray wedges represent deaths from “preventable or mitigable zymotic” diseases (contagious diseases such as cholera and typhus), medium gray wedges represent deaths from wounds, and light gray wedges are deaths from all other causes [9].
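Nightingale’s square-root scaling can be illustrated with a short calculation (a sketch for illustration, not part of the original account): a wedge of fixed angle has area proportional to the square of its radius, so setting the radius to the square root of the death count makes the wedge’s area, rather than its length, proportional to the count.

```python
import math

def wedge_radius(deaths, months=12, scale=1.0):
    """Radius of a polar-area ('coxcomb') wedge whose area is
    proportional to the death count. Each month gets an equal
    angle of 2*pi/months; wedge area = angle * r**2 / 2, so
    r = sqrt(2 * scale * deaths / angle)."""
    angle = 2 * math.pi / months
    return math.sqrt(2 * scale * deaths / angle)

# Doubling the deaths multiplies the radius by sqrt(2), not 2,
# so the eye compares areas rather than exaggerated lengths.
r1 = wedge_radius(100)
r2 = wedge_radius(200)
assert abs(r2 / r1 - math.sqrt(2)) < 1e-9
```

This is why the diagram avoids the distortion of a simple bar-length encoding: a month with twice the deaths looks twice as large, not four times.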
Fig. 1.2 Florence Nightingale’s visualization of the causes of mortality in British troops during the Crimean War.
(From Cohen [9])
Another example of the use of graphics to support epidemiological investigations is the 1869 chart by C.J. Minard describing Napoleon’s ill-fated 1812–1813 march to Moscow and back [10]. Figure 1.3 is Minard’s chart. The upper portion of the chart provides the strength of the French forces as a function of time superimposed on a map of Russia. The gray band is a measure of the size and location of the force as it advanced to Moscow; the black band represents the size and location of the retreating forces. On the lower portion of the chart is a record of the temperatures that the army encountered upon their retreat. Napoleon’s army numbered 422,000 when it crossed the Polish border on the way to Russia. Only 100,000 survived to participate in the battle at Moscow. The returning army facing the Russians at the Battle of Berezina numbered only 19,000. The returning forces suffered massive casualties due to disease and hypothermia associated with the declining temperatures. Temperatures in Russia dropped to −35 degrees Celsius during the campaign.
Fig. 1.3 C.J. Minard’s chart showing the condition of Napoleon’s forces during the 1812–1813 campaign
(From Dursteler [10])
The invention of the telegraph and Morse code in the mid-nineteenth century provided a means for rapid dissemination of information over a wide geographic area. This technology had important implications for public health surveillance. During the Spanish flu outbreak in 1918, the telegraph and the weekly Public Health Reports became essential tools to provide the Public Health Service with surveillance data on the progression of the pandemic.
Medical computing applications evolved with the development of computing technology. The very earliest applications were patient records to support diagnosis and clinical laboratory work. Bruce Blum describes the objects that are processed by computers as data, information, or knowledge [11]. A data point is a single measurement, element of demographics, or physical condition made available to the computer application or analyst. Information is a set of data with some interpretation or processing to add value. Knowledge is a set of rules, formulas, or heuristics applied to the information and data to create greater understanding.
Applications using data were introduced in the 1960s when the IBM 1401 mainframe computer found use in university and research settings. In the 1970s, with the advent of low-cost minicomputers, such as the DEC PDP series or Data General Nova series, computer processing applications were developed to create information to support diagnosis in various branches of medicine. Medical imaging made great advances because images could now be acquired, stored, and processed as individual pixels, permitting multidimensional slices with high resolution. In 1970, a prototype computerized tomography system, developed by Grant [12], enabled multiaxis images of a region under investigation to be acquired. By 1973, Ledley had begun development of a whole-body CT scanner, called the automatic computerized transverse scanner (ACTA), which began clinical service early in 1974 [13].
One of the initial languages developed specifically for the organization of files in the health care industry was the Massachusetts General Hospital Utility Multi-Programming System (MUMPS). The language was developed by Neil Pappalardo, an MIT student working in the animal laboratory at Massachusetts General Hospital in Boston during 1966 and 1967. The original MUMPS system was built on a spare DEC minicomputer. MUMPS was designed for building database applications that consume as few computing resources as possible. The core feature of MUMPS is that database interaction is built transparently into the language [14].
The Veterans’ Health Administration (VHA) adopted MUMPS as the programming language for an integrated laboratory/pharmacy/patient admission, tracking, and discharge system in the early 1980s. This system, known originally as the Decentralized Hospital Computer Program (DHCP), has been extended continuously in the years since. In March 1988, the Department of Defense launched the Composite Health Care System (CHCS), based on the VHA’s DHCP software, for all of its military hospitals [15]. DHCP and CHCS form the largest medical records archiving systems in the United States. These archives are sources of indicators of emerging diseases and outbreaks.
In the United States, state and local health departments have taken on the role of collecting and archiving vital statistics for the populations they serve. Health departments issue certified copies of birth, death, fetal death, and marriage certificates for events that occur in their population. Many departments also provide divorce verifications and registries on adoption and act as adjudicators of paternity.
The National Center for Health Statistics (NCHS) is the lead U.S. federal government agency for collecting, sharing, and developing procedures and standards for vital statistics. The NCHS vital statistics program is one of the oldest examples of intergovernmental data sharing in public health. The data are provided through contracts between NCHS and individual record systems operated in the various jurisdictions legally responsible for the registration of vital events: births, deaths, marriages, divorces, and fetal deaths. In the United States, legal authority for maintaining registries of vital events and for issuing copies of birth, marriage, divorce, and death certificates resides with the states, some individual cities (Washington, DC, and New York City), and the territories [16, 17].
In 1916, the Illinois Department of Public Health (IDPH) assumed responsibility for collecting data on vital events such as live births, still births, and deaths. In 1938, the department acquired IBM tabulation equipment for the generation of vital statistics and other health data. A computer was first used in population monitoring to support the Census Bureau in tabulating data from the 1950 census. In 1962, the IDPH became the first state health department to convert its applications on tabulation equipment to the newly acquired IBM 1401 computer. Many applications were developed for the IDPH computers, one of the most famous being an application supporting the response to a large salmonellosis outbreak in 1985. The IDPH identified communications with local health departments as a major weakness in that response. As a result, a minicomputer network was established that used modems and phone lines to pass information among state and local health departments. This system was known as the Public Health Information Network [18].
The Public Health Informatics Institute (PHII) was formed in 1992 with a grant from the Robert Wood Johnson Foundation. The Institute helps to foster applications that provide value to public health rather than just using the latest technology for technology’s sake [19]. The Institute has outlined a set of principles to assist in guiding the development and use of computer applications for public health [20]:
Figure 1.4 provides a graphical representation of the PHII principles and the four major steps in the development of a public health informatics application. The first step is to determine how the new system can improve health outcomes by quantifying the health problem, developing a business case for the system, and defining the indicators for measuring success. The second step is to determine how the work will be accomplished through a series of analyses to define the workflow and business processes that will support the application. The third step is to determine the requirements for the application through performance requirements analysis and system design. Once the system is implemented, the final step is to determine how success will be measured through an evaluation and a series of metrics that measure the performance of the system. For advanced disease surveillance systems, the Centers for Disease Control and Prevention (CDC) has developed a framework for evaluating syndromic surveillance systems that contains a series of metrics [21, 22]. It is one of the most comprehensive sets of metrics developed for disease surveillance systems; however, the framework assumes that the system has been fully developed and operational for several years, so a comprehensive evaluation using the framework is not possible in the early stages of implementation. See Chapter 10 for a discussion of this and other frameworks.
Fig. 1.4 Principles and approach for planning and design of an enterprise information system.
(From Public Health Informatics Institute [20], ©PHII)
James Jekel describes surveillance as the entire process of collecting, analyzing, interpreting, and reporting data concerning the incidence of death, diseases, and injuries and the prevalence of certain conditions whose knowledge is considered important for promoting the health of the public [23]. Most surveillance systems are developed and implemented with a clear objective of the specific outcome being sought. Examples are the linkage of specific environmental risk factors to chronic diseases such as cancer or monitoring of behavioral factors associated with the transfer of sexually transmitted diseases (STDs). As mentioned earlier, a main focus of this book is surveillance systems for the early recognition of outbreaks due to highly infectious diseases that have a potential for high morbidity and mortality, such as virulent forms of influenza or disease agents of bioterrorism. A main objective of a system developed around this focus is to reduce the number of cases by enabling rapid administration of prophylaxis or by allowing for social distancing to reduce the spread of disease. To achieve this objective, a disease outbreak must be recognized in the very early stages for a highly contagious disease such as influenza, or during the initial symptoms of a disease like anthrax, so that treatment and control efforts still have a high chance of a successful outcome. Traditional disease surveillance and response can be represented by the steps shown in Fig. 1.5. Health departments have traditionally relied on reporting from health care providers or laboratories before initiating epidemiological investigations. This surveillance approach is highly specific, but neither sensitive nor timely. In the case of anthrax, preventing the mortality of those infected relies on the rapid identification and treatment of the disease.
Fig. 1.5 A traditional method of public health surveillance and response for infectious diseases.
One potential approach for early identification of abnormal disease activity in a community is to collect and analyze data that are not traditionally used for surveillance but may contain early indicators of an outbreak. This approach relies on capturing evidence of health-seeking behavior when a person becomes ill. The concept of how such a system may operate is illustrated in Fig. 1.6. The concept is based on the assumption that a pathogen is released into the environment either in the air or in the water supply. If some type of sensor is present that can detect the presence of the pathogen and determine its identity, the detection phase is complete, but it is not possible for sensors to be located everywhere. Also, environmental sensors may be of little value if the health threat is due to highly contagious persons rather than pathogens released into the environment. If biological or chemical material has been released into the environment, the effect may be seen in animals, birds, and plant life, as well as in humans. Zoonotic diseases such as West Nile virus may first present with animal illness and death before presenting in humans.
Fig. 1.6 Concept for a disease surveillance system using data sources that may contain early indicators and warnings of a health event.
Several types of data that are collected routinely for purposes other than disease surveillance could contain indicators and warnings of an abnormal health event. When continual feeds are established for these data, analytical techniques can be applied to identify abnormal behavior. Signals identified through this process can fall into several classes, of which the most important is an outbreak with the potential for high morbidity or mortality in the population. Once it has been established that a signal is important, additional data are needed to understand what is occurring before a public health response can be executed.
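One common family of analytical techniques for such data feeds compares each day’s count against a short recent baseline. The sketch below implements a simplified variant in the spirit of the CDC EARS C1 aberration detection method; the baseline length, threshold, and minimum standard deviation are illustrative assumptions, not prescribed values.

```python
from statistics import mean, stdev

def c1_alerts(counts, baseline=7, threshold=3.0):
    """Flag days whose count exceeds the mean of the preceding
    `baseline` days by more than `threshold` standard deviations
    (a simplified take on the CDC EARS C1 method)."""
    alerts = []
    for t in range(baseline, len(counts)):
        window = counts[t - baseline:t]
        mu, sigma = mean(window), stdev(window)
        sigma = max(sigma, 0.5)  # guard against flat baselines
        if (counts[t] - mu) / sigma > threshold:
            alerts.append(t)
    return alerts

# A stable daily series with one sudden spike at index 10:
daily = [10, 12, 9, 11, 10, 12, 11, 10, 9, 11, 40, 12]
assert c1_alerts(daily) == [10]
```

The trade-off named in the surrounding text shows up directly in the parameters: lowering `threshold` makes the detector more sensitive but less specific, which is why a statistical alarm is only the starting point for an epidemiological investigation.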
Following the detection of a statistical aberration in surveillance data, several questions must be answered. What disease is present, and what agent is causing it? What are the characteristics of the disease and what methods are used to treat the disease? Where and when did people get infected? Was the exposure at a single point over a short duration, or was exposure over an extended time period and a large geographic area? Knowledge of the population at risk is also necessary to assess the potential public health implications of a surveillance alarm. If the disease is highly contagious, is it contagious before symptoms develop, and which persons are at risk of being infected by contact with those initially infected? Where are those who have been infected, and how can they be contacted? These are just a few of the questions for which answers would be urgently needed.
Health departments need the answers to these questions to develop and execute a response to contain an outbreak. However, surveillance systems that use nonspecific data as early indicators of disease cannot provide many answers; traditional epidemiological investigations are still needed. The best modern disease surveillance systems recognize this burden and attempt to collect as much data as possible to help investigators assemble the needed information in a timely manner.
Modern medicine has had a significant impact on the control of infectious disease outbreaks. During the majority of the past century, Western countries have had abundant supplies of vaccines and antibiotics to control emerging outbreaks. A large outbreak of an unknown strain of an infectious disease agent or a large bioterrorist event could overburden the ability of the medical communities to give high-quality care to all those infected. A review of the history of significant outbreaks provides insight into the challenges facing the public health community.
One of the most significant diseases in the history of humankind is smallpox. Early accounts of smallpox date back to 10,000 B.C., when it appeared in the agricultural settlements of northeastern Africa [24]. Egyptian merchants helped to spread the disease to India in the last millennium B.C. Lesions resembling smallpox were found on the faces of mummies, including the well-preserved mummy of Ramses V, who died in 1157 B.C.
Western civilization has been affected greatly by smallpox. The plague of Antonine, around A.D. 180, killed between 3.5 and 7 million persons and coincided with the beginning of the decline of the Roman Empire [25, 26]. Arab expansionism, the Crusades, and the discovery of the West Indies all contributed to the spread of smallpox. The disease was introduced into the new world by Spanish and Portuguese conquistadors and contributed to the fall of the Aztec and Inca empires. During the decade following the Spanish arrival in Mexico, the population decreased from 25 million to 1.6 million, with disease contributing significantly to the decline [27].
The diseases that ravaged Europe and Asia for centuries were for some time unknown to Native North Americans. Ultimately, infectious diseases introduced by expansionism devastated the American Indian, with the greatest number of deaths caused by smallpox — sometimes intentionally. During the Indian siege of Fort Pitt in the summer of 1763, the British sent smallpox-infected blankets and handkerchiefs to the Indians in a deliberate attempt to start an epidemic [28]. The plan to infect the Indians and quell the siege was documented in a letter written by Colonel Henry Bouquet to Sir Jeffrey Amherst, the commander-in-chief of British forces in North America.
In 1796, Edward Jenner, an English physician, observed that dairymaids who contracted cowpox, a much milder disease, were immune to smallpox. With serum taken from a dairymaid, Jenner began vaccination. When it was available, vaccination became an effective way of controlling the spread of smallpox.
In 1947, the Soviet Union established its first smallpox weapons factory in Zagorsk, northeast of Moscow. Animal tests showed that fewer than five viral particles were needed to cause infection in 50 percent of subjects. In comparison, 1500 plague cells and 10,000 anthrax spores were needed to achieve the same results. By 1970, smallpox was considered so important to the biological weapons arsenal that over 20 tons were stored annually at Zagorsk for immediate use [29].
In 1967, the World Health Organization (WHO) initiated a mass vaccination program that resulted in the eradication of smallpox by 1978 [30, 31, 32]. On May 8, 1980, WHO announced that smallpox had been eradicated from the planet. Smallpox immunization programs were discontinued, and only limited quantities of the virus were retained for research purposes at the Centers for Disease Control in Atlanta and the Ivanovsky Institute of Virology in Moscow. Coincidentally, the Soviet weapons program, Biopreparat, included smallpox in the weapons improvement list in its five-year 1981–1985 plan [29].
Bubonic plague, or Black Death, left an indelible mark on history. In 1346, there were fearful rumors of plague in the East at major European seaports. India was depopulated; Tartary, Mesopotamia, Syria, and Armenia were covered with dead bodies. The disease traveled from the Black Sea to the Mediterranean in galleys following the trade routes to Constantinople, Messina, Sicily, Sardinia, Genoa, Venice, and Marseilles. By 1348, the Black Death had taken a firm grip on Italy. Between the years 1347 and 1352, plague accounted for the destruction of one third to one half the population of Europe, approximately 25 million victims. The disease terrified the populations of European cities because it struck so swiftly and consumed a town or city within weeks. Victims died within days in agony from fevers and infected swellings [33].
Plague had been around London since it first appeared in Britain in 1348, but in 1665, a major outbreak occurred. Two years earlier, plague ravaged Holland. Trade was restricted with the Dutch, but despite the precautions, plague broke out in London, starting in the poorer sections of the city. Initially, the authorities ignored it, but as spring turned into one of the hottest summers in recent years, the number of deaths increased dramatically. In July, over 1000 deaths per week were reported, and by August, the rate peaked at over 6000 deaths per week. A rumor that dogs and cats caused the spread resulted in a drastic reduction in their numbers, leaving the plague-carrying rats without predators.
Control measures consisted of quarantining families in their homes. When a person in a household became infected, the house was sealed until 40 days after the victim either recovered or died. Guards were posted at the door to see that no one left. The guard had to be bribed to allow any food to pass to the homes. Accounting for victims was difficult because the quarantine measures were so harsh that families were not willing to report the death of family members. Nurses went from door to door in an attempt to quantify the number dead. Estimates are that over 100,000 people (about a quarter of the population of London) perished in the outbreak. In 1666, the Great Fire of London burned down the city slums and brought the plague under control.
In colonial times, laws were passed mandating the reporting of smallpox, yellow fever, and cholera [24]. By the nineteenth century, mandatory reporting at the state and federal levels became common. During the twentieth century, increasing use of vaccines and antibiotics, improvements in communication, and the dedication of individuals and organizations led to a significant decline in morbidity and mortality due to highly contagious diseases. The twentieth century also saw the pandemic or world-wide epidemic of the Spanish influenza of 1918 and the belief by government leadership that modern medicine had conquered the risk of infectious disease outbreaks by the end of the century. These beliefs led to complacency in allocating funding to improve disease surveillance activities.
There were three major pandemic influenza outbreaks in the twentieth century [34]. In 1918–1919, Spanish influenza, caused by the H1N1 subtype of the influenza A virus, infected up to one-third of the world’s population. The pandemic erupted during the final stages of World War I and ultimately killed more people than the war. The number of dead is estimated at between 20 and 40 million, with the exact numbers unknown due to inadequate reporting. In the United States, the outbreak claimed 675,000 lives. It has been cited as the most devastating epidemic in recorded world history. More people died of influenza in a single year than in the four years of the Black Death from 1347 to 1351.
An analysis of the virulence of the H1N1 virus strain by the U.S. Armed Forces Institute of Pathology suggested that the Spanish influenza could first have appeared in a young British soldier during the Battle of the Somme in 1916 [35]. In 1916, the supply lines stretching through the French town of Étaples supported not only hundreds of thousands of troops but also piggeries and chicken coops that provided food for the forces. Étaples could have been the incubation site for the transfer of the virus from chickens and pigs to humans. The Institute of Pathology study also included the collection of virus samples from victims buried in the Alaska permafrost. Using documentary evidence and new genetic clues, researchers have been able to trace the flu’s spread in three waves around the world. These studies are being used to speculate about the impact of a potential H5N1 avian influenza pandemic [36].
Camp Funston provides a graphic example of how the 1918 pandemic ravaged communities. The 29th Field Artillery Battalion was constituted on July 5, 1918, as part of the Army’s 10th Division at Camp Funston, Kansas. There, they underwent equipment issue and tactical training and began preparations to deploy to Europe. However, during this period, Camp Funston suffered an influenza outbreak that devastated the installation. Figure 1.7 shows an emergency hospital set up at Camp Funston to care for the influenza patients. By the end of October 1918, there were 14,000 reported cases and 861 deaths at Camp Funston alone, and the State of Kansas reported a total of 12,000 deaths. By the time the flu had run its course and the units were healthy, the war had ended. Camp Funston was originally considered the initial site of the Spanish influenza outbreak.
Fig. 1.7 Emergency hospital set up in Camp Funston, Kansas, during the beginning of the 1918 influenza epidemic.
(Photo courtesy of the National Museum of Health and Medicine)
There are still several questions regarding the characteristics of the 1918–1919 pandemic. Figure 1.8 gives the mortality rate in the United Kingdom for the Spanish Flu. Three distinct waves occurred: in the spring of 1918, the fall of 1918, and the late winter of 1919. The first two waves of the pandemic occurred at a time of the year unfavorable to normal influenza virus strains. Could the virus have mutated around the world so quickly and simultaneously?
Fig. 1.8 Combined influenza and pneumonia mortality rate in the United Kingdom for 1918–1919.
Another major difference between the pandemic strain and normal flu related to the groups affected. Mortality for influenza typically occurs among the very young or aged populations. In the 1918–1919 pandemic, disproportionate numbers of healthy young adults became victims. One theory is that earlier circulating influenza strains provided partial immunity to those previously exposed to a similar strain of the virus, and the elderly would have been exposed to many more strains. Because most of the elderly could also be expected to have weaker immune systems, however, their mortality rates remained high. Figure 1.9 provides a comparison of the number of deaths per 100,000 persons in the United States by age group during 1911–1917 with those that occurred during 1918.
Fig. 1.9 Combined influenza and pneumonia mortality by age at death per 100,000 persons in each age group, United States, 1911–1918. Influenza- and pneumonia-specific death rates are plotted for the nonpandemic years 1911–1917 (dashed line) and for the pandemic year 1918 (solid line).
Two influenza pandemics have swept the world since 1919: the Asian influenza pandemic of 1957 (H2N2) and the Hong Kong influenza pandemic of 1968 (H3N2), both of which derived from avian influenza viruses. The Asian flu pandemic probably made more people sick than the pandemic of 1918, but the availability of antibiotics to treat the secondary infections resulted in a much lower death rate. Asian flu was first identified in China in February 1957. The virus was quickly identified due to advances in scientific technology, and vaccine production began in May 1957, before the disease spread to the United States in June 1957. By August 1957, vaccine was available in limited supply in the United States. The virus claimed 1 million victims worldwide.
The Hong Kong flu pandemic strain of H3N2 evolved from H2N2 by antigenic shift. Antigenic shift is the process by which two different strains of influenza combine to form a new subtype with a mixture of the surface antigens of the two original strains. Annual flu virus mutation occurs through a process called antigenic drift, where the surface proteins change slowly over time. The body’s immune system can react to slow changes but cannot readily adapt to a rapid antigenic shift. Because of its similarity to the 1957 Asian flu and, possibly, the subsequent accumulation of related antibodies in the affected population, the Hong Kong flu resulted in far fewer casualties than in most pandemics. Casualty estimates vary; between 750,000 and 2 million people died of the virus worldwide during the two years (1968–1969) that it was active [37].
A highly virulent form of the avian virus H5N1 is currently being spread across the world by migrating waterfowl. Domestic poultry catch the virus from contact with migratory birds. Humans have caught H5N1 from close contact with infected chickens. The virus was originally endemic only in birds in Southeast Asia, but migratory patterns threaten to spread it to birds everywhere. Tens of millions of birds have died of the H5N1 virus, with hundreds of millions slaughtered in an attempt to control the disease. Figure 1.10 shows an example of the flyways currently used by migratory birds. The flyway patterns cover most populated areas of the globe.
Fig. 1.10 Flyway patterns of migratory birds.
(Adapted from United Nations Food and Agriculture Organization Figure [38])
The present form of the H5N1 virus does not pass efficiently between humans. However, as the virus continues to evolve, another pandemic on the order of the Spanish flu is feared. Table 1.1 presents the number of human cases of H5N1 and related deaths from 2003 until March 16, 2006. Of the 176 confirmed cases, there have been 97 fatalities, yielding a case fatality rate of approximately 55%. The rate far exceeds that of previous pandemics [40].
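The case fatality rate is simply the deaths among confirmed cases expressed as a percentage. A quick check, computed from the counts reported above (97 deaths among 176 confirmed cases):

```python
def case_fatality_rate(deaths, cases):
    """Deaths among confirmed cases, expressed as a percentage."""
    return 100.0 * deaths / cases

# H5N1 counts reported to WHO as of March 2006 (from the text):
cfr = case_fatality_rate(97, 176)
print(f"{cfr:.1f}%")  # → 55.1%
```

Note that a case fatality rate computed from confirmed cases can overstate the true lethality of a virus, since mild or unreported infections never enter the denominator.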
Table 1.1 Cumulative Number of Confirmed Human Cases of Avian Influenza A/(H5N1) Reported to WHO as of March 10, 2006.
Source: World Health Organization [39]
Table 1.2 provides a list, compiled from answers.com, of major outbreaks considered pandemics. There were undoubtedly many more episodes that did not make this list because of the lack of documented historical evidence before the eighteenth century. For the last entry, severe acute respiratory syndrome (SARS), there were fewer than 10,000 cases of the disease, but air travel spread the previously unknown contagious disease quickly.
Table 1.2 Documented Pandemics
165–180        Antonine plague (smallpox)
541            Plague of Justinian (bubonic plague)
1300s          The Black Death (plague)
1732–1733      Influenza
1775–1776      Influenza
1816–1826      Cholera
1829–1851      Cholera
1847–1848      Influenza
1852–1860      Cholera
1857–1859      Influenza
1863–1875      Cholera
1899–1923      Cholera
1918–1919      Spanish flu (influenza)
1957–1958      Asian flu (influenza)
1959–present   AIDS
1960s          El Tor (cholera)
1968–1969      Hong Kong flu (influenza)
1993–1994      Plague (Gujarat, India)
2002–2003      SARS
Before the twentieth century, biological weapons were relatively simple. Infected materials were used to induce illness in an opponent’s forces, or food or water supplies were poisoned. In the sixth century B.C., the Assyrians poisoned the drinking water of their enemies; in medieval times Mongol and Turkish armies catapulted the diseased corpses of animals or humans into fortified castles; and as late as 1710, Russian armies used plague corpses as weapons. During World War I, German agents in the United States inoculated horses and cattle with glanders before they were shipped to France for use by the Allied powers.
In 1925, the first international agreement, known as the Geneva Protocol, to limit the use of chemical and biological weapons was signed. The Protocol prohibited the use in war of asphyxiating gases and of bacteriological methods of warfare. The agreement did not address production, storage, or verification mechanisms and could not be used to support disarmament. As a result, significant research was performed in the twentieth century to increase the performance of biowarfare agents and delivery methods. Biological weapons could be developed very cheaply and cause large numbers of casualties compared with conventional weapons [41].
The Soviet Union established its biological weapons program in the late 1920s after a typhus epidemic in Russia from 1918 to 1922 killed between 2 and 10 million, illustrating graphically the destructive and disruptive power of biological weapons. From the occupation of Manchuria in 1931 to the end of World War II in 1945, the Imperial Japanese Army experimented with biological weapons on thousands of Chinese. These experiments were conducted in a disguised water purification plant known as Unit 731 at Pingfan, near the city of Harbin in northeastern China [42]. Japanese scientists tested plague, cholera, smallpox, botulism, and other diseases on prisoners. Their research led to the development of a defoliation bacilli bomb to destroy crops and a flea bomb to spread bubonic plague. Initial successes with this technology stimulated other developments, which enabled Japanese soldiers to launch biological attacks with anthrax, plague-carrying fleas, typhoid, dysentery, cholera, and other deadly pathogens. At least 11 Chinese cities were attacked with biological weapons, resulting in an estimated 10,000 to 200,000 deaths. In addition, there are firsthand accounts of the Japanese infecting civilians through the distribution of infected food and contaminated water supplies, with estimated casualties of over 580,000 from plague and cholera. Following the war, the United States granted amnesty to the Japanese scientists in exchange for their experimentation data. Figure 1.11 shows a human vivisection experiment conducted by Unit 731 during World War II, in which one team of Japanese surgeons removes organs while another takes measurements on them.
Fig. 1.11 Japanese vivisection experiment conducted on a Chinese victim infected with a biological agent.
(From Hal Gold [42], p. 169)
In 1941, a biological weapons development program initiated by the United States, the United Kingdom, and Canada in response to German and Japanese weapons development activities resulted in the weaponization of anthrax, brucellosis, and botulinum toxin. During World War II, the United Kingdom developed the Allies’ first anthrax bomb by experimenting with sheep on Gruinard Island in Scotland. Sheep were used because they are similar in weight to humans, are highly susceptible to anthrax, and were plentiful in the area. The research left the island contaminated with anthrax spores (Fig. 1.12).
Fig. 1.12 Gruinard Island was the site of an experimental anthrax bomb.
(AP Photo/Press Association, used with permission.)
In another World War II program, termed Operation Vegetarian, the UK manufactured and planned to drop 5 million anthrax cattle cakes on German beef and dairy herds. The plan was to wipe out the German herds and simultaneously infect the German human population. Because antibiotics were not available to the general population, the operation could have caused thousands, if not millions, of human deaths. The operation was abandoned due to the success of the Normandy invasion. At the end of 1945, the British incinerated the 5 million anthrax cattle cakes.
Stockpiles of biological weapons were destroyed after President Nixon unilaterally ended the United States’ offensive biological warfare program in 1969. This initiative ultimately resulted in the Biological Weapons Convention in 1972. Signers of the Convention pledged never to develop, produce, stockpile, acquire, or retain biological warfare agents or the means to deliver them.
