A Framework of Human Systems Engineering: Applications and Case Studies (E-Book)

Description

Explores the breadth and versatility of Human Systems Engineering (HSE) practices and illustrates their value in system development. A Framework of Human Systems Engineering: Applications and Case Studies offers a guide to identifying and improving methods to integrate human concerns into the conceptualization and design of systems. With contributions from a panel of noted experts on the topic, the book presents a series of Human Systems Engineering (HSE) applications on a wide range of topics: interface design, training requirements, personnel capabilities and limitations, and human task allocation. Each of the book's chapters presents a case study of the application of HSE from a different dimension of sociotechnical systems. The examples are organized using a sociotechnical system framework to reference the applications across multiple system types and domains. These case studies are based on real-world examples and highlight the value of applying HSE to the broader engineering community. This important book:

* Includes a proven framework with case studies across different dimensions of practice, including domain, system type, and system maturity
* Contains the tools and methods needed to integrate human concerns within systems
* Encourages the use of Human Systems Engineering throughout the design process
* Provides examples that cross traditional systems engineering sectors and identifies a diverse set of human engineering practices

Written for systems engineers, human factors engineers, and HSI practitioners, A Framework of Human Systems Engineering: Applications and Case Studies provides the information needed for better integration of humans and systems and early resolution of issues based on human constraints and limitations.


Page count: 615

Year of publication: 2020




Table of Contents

Cover

Title Page

Copyright Page

Dedication Page

Editor Biographies

Contributors List

Foreword

Preface

Section 1: Sociotechnical System Types

1 Introduction to the Human Systems Engineering Framework

1.1 Introduction

1.2 Human‐Centered Disciplines

1.3 Human Systems Engineering

1.4 Development of the HSE Framework

1.5 HSE Applications

1.6 Conclusion

References

2 Human Interface Considerations for Situational Awareness

2.1 Introduction

2.2 Situational Awareness: A Global Challenge

2.3 Putting Situational Awareness in Context: First Responders

2.4 Deep Dive on Human Interface Considerations

2.5 Putting Human Interface Considerations in Context: Safe Cities

2.6 Human Interface Considerations for Privacy‐Aware SA

Reference

3 Utilizing Artificial Intelligence to Make Systems Engineering More Human

3.1 Introduction

3.2 Changing Business Needs Drive Changes in Systems Engineering

3.3 Epoch 4: Delivering Capabilities in the Sociotechnical Ecosystem

3.4 The Artificial Intelligence Opportunity for Building Sociotechnical Systems

3.5 Using AI to Track and Interpret Temporal Sociotechnical Measures

3.6 AI in Systems Engineering Frameworks

3.7 AI in Sociotechnical Network Models

3.8 AI‐Based Digital Twins

3.9 Discussion

3.10 Case Study

3.11 Systems Engineering Sociotechnical Modeling Approach

3.12 Results

3.13 Summary

References

4 Life Learning of Smart Autonomous Systems for Meaningful Human‐Autonomy Teaming

4.1 Introduction

4.2 Trust in Successful Teaming

4.3 Meaningful Human‐Autonomy Teaming

4.4 Systematic Taxonomy for Iterative Through‐Life Learning of SAS

4.5 Ensuring Successful SAS

4.6 Developing Case Study: Airborne Shepherding SAS

4.7 Conclusion

Acknowledgment

References

Section 2: Domain Deep Dives

5 Modeling the Evolution of Organizational Systems for the Digital Transformation of Heavy Rail

5.1 Introduction

5.2 Organizational System Evolution

5.3 Model‐Based Systems Engineering

5.4 Modeling Approach for the Development of OCMM

5.5 Implementation

5.6 Case Study: Digital Transformation in the Rail Industry

5.7 OCMM Reception

5.8 Summary and Conclusions

References

6 Human Systems Integration in the Space Exploration Systems Engineering Life Cycle

6.1 Introduction

6.2 Spacecraft History

6.3 HSI in the NASA Systems Engineering Process

6.4 Mission Challenges

6.5 Conclusions

References

7 Aerospace Human Systems Integration: Evolution over the Last 40 Years

7.1 Introduction

7.2 Evolution of Aviation: A Human Systems Integration Perspective

7.3 Evolution with Respect to Models, Human Roles, and Disciplines

7.4 From Rigid Automation to Flexible Autonomy

7.5 How Software Took the Lead on Hardware

7.6 Toward a Human‐Centered Systemic Framework

7.7 Conclusion and Perspectives

References

Section 3: Focus on Training and Skill Sets

8 Building a Socio‐cognitive Evaluation Framework to Develop Enhanced Aviation Training Concepts for Gen Y and Gen Z Pilot Trainees

8.1 Introduction

8.2 Virtual Technologies in Aviation

8.3 Human Systems Engineering Challenges

8.4 Potential Applications Beyond Aviation Training

8.5 Looking Forward

Acknowledgement

References

9 Improving Enterprise Resilience by Evaluating Training System Architecture: Method Selection for Australian Defense

9.1 Introduction

9.2 Defense Training System

9.3 Concept of Resilience in the Academic Literature

9.4 DTS Case Study Methodology

9.5 Research Findings and Future Directions

References

10 Integrating New Technology into the Complex System of Air Combat Training

10.1 Introduction

10.2 Method

10.3 Results and Discussion

10.4 Conclusion

Acknowledgments

References

Section 4: Considering Human Characteristics

11 Engineering a Trustworthy Private Blockchain for Operational Risk Management: A Rapid Human Data Engineering Approach Based on Human Systems Engineering

11.1 Introduction

11.2 Human Systems Engineering and Human Data Engineering

11.3 Human‐Centered System Design

11.4 Practical Issues Leading to Large Complex Blockchain System Development

11.5 Framework for Rapid Human Systems–Human Data Engineering

11.6 Human Systems Engineering for Trustworthy Blockchain

11.7 From Human System Interaction to Human Data Interaction

11.8 Future Work for Trust in Human Systems Engineering

11.9 Conclusion

Acknowledgment

References

12 Light’s Properties and Power in Facilitating Organizational Change

12.1 Introduction

12.2 Implicit Properties and a Mathematical Model of Light

12.3 Materialization of Light

12.4 Leveraging Light to Bring About Organizational Change

12.5 Summary and Conclusion

References

Section 5: From the Field

13 Observations of Real‐Time Control Room Simulation

13.1 Introduction

13.2 Future General‐Purpose Simulators

13.3 Operators

13.4 Data

13.5 Measurement

13.6 Conclusion

Disclaimer

References

14 A Research Agenda for Human Systems Engineering

14.1 The State of Human Systems Engineering

14.2 Recommendations from the Chapter Contributions

14.3 Uniting the Human Systems Engineering Stakeholders

14.4 Summary

Disclaimer

References

Index

End User License Agreement

List of Tables

Chapter 3

Table 3.1 Case study comparative analysis.

Chapter 4

Table 4.1 Key elements of meaningful human control (MHC).

Table 4.2 Technology readiness level (TRL).

Table 4.3 Proposed SAS cognitive competence levels.

Table 4.4 Proposed SAS readiness levels.

Table 4.5 Human–robot interaction roles.

Table 4.6 Situation awareness levels.

Chapter 5

Table 5.1 Organizational evolution example.

Table 5.2 Mapping the user portals to information libraries.

Table 5.3 Model groupings in metamodel.

Table 5.4 Demonstrator scope.

Table 5.5 Task‐based SA assessment (pre‐ETCS example).

Table 5.6 Summary of SA factors (operate train example).

Table 5.7 Workload by variables (ATMS example).

Table 5.8 Workload assessment by process (excerpt).

Chapter 9

Table 9.1 RAN training Force intent, associated actions, and measures of achi...

Table 9.2 Key elements of resilience definition and their associated vocabula...

Table 9.3 Six “increments” of resilience concept.

Table 9.4 Attributes commonly associated with the concept of resilience.

Table 9.5 Resilience attributes as proposed in a single source.

Table 9.6 Examples of the nine resilience attributes' definitions.

Table 9.7 Resilience triggers.

Table 9.8 Resilience attributes and their manifestations.

Table 9.9 DTS resilience questions for training system specialist.

Chapter 10

Table 10.1 Aircraft flown by participants.

Table 10.2 Naval air combat expert profiles.

Table 10.3 Results of the assessment of potential LVC training hazards using ...

List of Illustrations

Chapter 1

Figure 1.1 HSE original framework.

Figure 1.2 HSE framework as an index for the case studies.

Chapter 3

Figure 3.1 Systems engineering evolution.

Figure 3.2 A conceptual architecture for Epoch 4.

Figure 3.3 A taxonomy of artificial intelligence.

Figure 3.4 Risk taxonomy.

Figure 3.5 Example project social network.

Figure 3.6 Project information ecosystem.

Figure 3.7 BBN example element.

Figure 3.8 Case study project social network.

Figure 3.9 Confidence vs. stability.

Figure 3.10 Case study project events.

Figure 3.11 Belief alignment and stability results for case study.

Chapter 4

Figure 4.1 Technology evolution for successful M‐HAT in Sky Shepherd researc...

Figure 4.2 Sky Shepherd pilot sets up a DJI Mavic II Enterprise Duo ready fo...

Figure 4.3 A flock of Dorper sheep (Ovis aries) responding to the presence o...

Chapter 5

Figure 5.1 Organizational system evolution.

Figure 5.2 OCMM development approach.

Figure 5.3 OCMM metamodel (excerpt).

Figure 5.4 Example system introductions.

Figure 5.5 ETCS level 2 system structure.

Figure 5.6 Operate train (ETCS L2 onboard operation modes).

Figure 5.7 Partial activity diagram for ETCS L2 stop at signal.

Figure 5.8 Role allocation (RASCI).

Figure 5.9 RASCI role structure for operate train process under ETCS.

Figure 5.10 OCMM model of situation awareness.

Figure 5.11 Situational awareness assessment (operate train example).

Chapter 6

Figure 6.1 Mercury spacecraft displays and controls.

Figure 6.2 Gemini spacecraft displays and controls.

Figure 6.3 Apollo command module displays and controls.

Figure 6.4 Lunar lander displays and controls.

Figure 6.5 Space Shuttle flight deck.

Figure 6.6 ISS Cupola workstation.

Figure 6.7 Mockup of the Orion D&C glass cockpit.

Figure 6.8 NASA HSI domains.

Figure 6.9 Notional HSI domain interaction.

Figure 6.10 NASA systems engineering engine.

Figure 6.11 Notional systems engineering and HSI interaction.

Figure 6.12 Notional holistic view of NASA HSI and the SE life cycle.

Figure 6.13 Human‐centered design.

Chapter 7

Figure 7.1 ATC‐to‐ATM evolution from procedural control to trajectory manage...

Figure 7.2 The TOP model.

Figure 7.3 Example of separability property of a system of systems (SoS). Fo...

Figure 7.4 The four loops of automation of the airspace.

Figure 7.5 Human‐centered design evolution.

Figure 7.6 Procedures, automation, and problem solving leading to the alloca...

Figure 7.7 An isolated system.

Figure 7.8 A system of systems represented as an infrastructure (i.e. a soci...

Figure 7.9 Synthetic view of the system representation.

Figure 7.10 A function of functions mapped onto a structure of structures.

Figure 7.11 Emerging functions (yellow) and structures (pink) within an acti...

Figure 7.12 A function logically transforms a task into an activity.

Figure 7.13 HSI recursive definition of a system.

Chapter 9

Figure 9.1 Resilience attributes' interrelationships.

Figure 9.2 DTS resilience driving factors.

Chapter 10

Figure 10.1 Overview of the data analysis procedure.

Notes

. ORM, operational...

Chapter 11

Figure 11.1 The rapid human‐centered systems engineering and human data engi...

Figure 11.2 User data interaction design (UDID) of trustworthy blockchain il...

Figure 11.3 Distribution of key pairs and certificate.

Figure 11.4 Key and CSR generation for ICAs.

Figure 11.5 Design of trusted gateways.

Figure 11.6 Design of trusted peers and orderers to form a preliminary block...

Figure 11.7 Trusted channel among the consortium.

Figure 11.8 Systems engineering of human data interaction of the trustworthy...

Figure 11.9 Future of human systems engineering trend based on VDMA predicti...

Figure 11.10 Future of human systems engineering trend.

Chapter 13

Figure 13.1 Contemporary control room.

Figure 13.2 Future general‐purpose simulator.

Figure 13.3 Future on‐site simulator.

Figure 13.4 ISA key values.

Figure 13.5 NASA‐TLX scales.

IEEE Press, 445 Hoes Lane, Piscataway, NJ 08854

IEEE Press Editorial Board
Ekram Hossain, Editor in Chief

Jón Atli Benediktsson

David Alan Grier

Elya B. Joffe

Xiaoou Li

Peter Lian

Andreas Molisch

Saeid Nahavandi

Jeffrey Reed

Diomidis Spinellis

Sarah Spurgeon

Ahmet Murat Tekalp

A Framework of Human Systems Engineering

Applications and Case Studies

Edited by

Holly A. H. Handley and Andreas Tolk

Copyright © 2021 by The Institute of Electrical and Electronics Engineers, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey.

Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per‐copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750‐8400, fax (978) 750‐4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748‐6011, fax (201) 748‐6008, or online at http://www.wiley.com/go/permissions.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762‐2974, outside the United States at (317) 572‐3993 or fax (317) 572‐4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging‐in‐Publication Data:

Names: Handley, Holly A. H., editor. | Tolk, Andreas, editor.
Title: A framework of human systems engineering : applications and case studies / edited by Holly A. H. Handley, Andreas Tolk.
Description: Hoboken, New Jersey : Wiley‐IEEE Press, [2021] | Includes bibliographical references and index.
Identifiers: LCCN 2020038951 (print) | LCCN 2020038952 (ebook) | ISBN 9781119698753 (cloth) | ISBN 9781119698777 (adobe pdf) | ISBN 9781119698760 (epub)
Subjects: LCSH: Systems engineering. | Human engineering.
Classification: LCC TA168 .F736 2021 (print) | LCC TA168 (ebook) | DDC 620.8/2–dc23
LC record available at https://lccn.loc.gov/2020038951
LC ebook record available at https://lccn.loc.gov/2020038952

Cover Design: Wiley
Cover Images: Power robotic arm © Phanie/Alamy Stock Photo; Young woman wearing smart glasses © Tim Robberts/Getty Images; Self‐driving vehicle © metamorworks/Getty Images; Abstract background © MR.Cole_Photographer/Getty Images

To Mark: Quarantined together for 86 days and still married.
HAHH

To my father, who taught me the love for numbers and precision,
And to my mother, who taught me the love for the human mind and inclusion.
Andreas Tolk

Editor Biographies

Holly A. H. Handley is an associate professor in the Engineering Management and Systems Engineering Department of Old Dominion University (ODU). Her research focuses on developing models and methodologies to better represent the human component during the architecting and design of sociotechnical systems. She received her PhD from George Mason University in 1999 and is a licensed professional engineer. Her education includes a BS in Electrical Engineering from Clarkson College (1984), an MS in Electrical Engineering from the University of California at Berkeley (1987), and an MBA from the University of Hawaii (1995). Prior to joining ODU, Dr. Handley worked as a design engineer for Raytheon Company (1984–1993) and as a senior engineer for Pacific Science & Engineering Group (2002–2010). Dr. Handley is a Senior Member of the Institute of Electrical and Electronics Engineers (IEEE) and a member of the International Council on Systems Engineering (INCOSE) and the Human Factors and Ergonomics Society. She is currently the chair of the IEEE Systems Council Human Systems Integration Technical Committee and was recently named an HFES Science Policy Fellow.

Andreas Tolk is a senior divisional staff member at The MITRE Corporation in Charlottesville, VA, and adjunct full professor at Old Dominion University in Norfolk, VA. He holds a PhD and an MSc in Computer Science from the University of the Federal Armed Forces of Germany. His research interests include the computational and epistemological foundations and constraints of model‐based solutions in the computational sciences and their application in support of model‐based systems engineering, including the integration of simulation methods and tools into systems engineering (SE) education and best practices. He has published more than 250 peer‐reviewed journal articles, book chapters, and conference papers and edited 12 textbooks and compendia on SE and modeling and simulation topics. He is a fellow of the Society for Modeling and Simulation International (SCS), a senior member of IEEE and the Association for Computing Machinery (ACM), and the recipient of multiple awards, including professional distinguished contribution awards from SCS and ACM.

Contributors List

Hussein A. Abbass, School of Engineering and Information Technology, University of New South Wales Canberra, Canberra, ACT, Australia

Alliya Anderson, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates

Philip S. Barry, George Mason University, Fairfax, VA, USA

Marius Becherer, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Jean Bogais, University of Sydney, Sydney, NSW, Australia

Amy E. Bolton, Office of Naval Research, Arlington, VA, USA

Guy André Boy, CentraleSupélec, Paris Saclay University, Gif‐sur‐Yvette, France; ESTIA Institute of Technology, Bidart, France

Thien Bui‐Nguyen, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

A. Peter Campbell, SMART Infrastructure Facility, University of Wollongong, Wollongong, NSW, Australia

Elizabeth Chang, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Hugh David, Chartered Institute of Ergonomics and Human Factors, Birmingham, UK

Steve Doskey, The MITRE Corporation, McLean, VA, USA

Mahmoud Efatmaneshnik, Defence Systems Engineering at the University of South Australia (UNISA), Adelaide, SA, Australia

Samuel F. Feng, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates

Florian Gottwalt, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Stuart Green, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Claudine Habak, Emirates College for Advanced Education, Abu Dhabi, United Arab Emirates

Holly A. H. Handley, Old Dominion University, Norfolk, VA, USA

Fabrizio Interlandi, Etihad Aviation Training, Abu Dhabi, United Arab Emirates

Victoria Jnitova, School of Engineering and Information Technology, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Keith F. Joiner, Capability Systems Center, School of Engineering and Information Technology, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Michael Joy, IDEMIA National Security Solutions, New York, NY, USA

Grace A. L. Kennedy, SMART Infrastructure Facility, University of Wollongong, Wollongong, NSW, Australia

Nelson King, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates

Pravir Malik, First Order Technologies, LLC, Berkeley, CA, USA

Angus L. M. T. McLean, Collins Aerospace, Cedar Rapids, IA, USA

Michael Melkonian, Emirates College for Advanced Education, Abu Dhabi, United Arab Emirates

Kelly J. Neville, The MITRE Corporation, Orlando, FL, USA

Vladimir Parezanović, Khalifa University of Science and Technology, Abu Dhabi, United Arab Emirates

Maria Natalia Russi‐Vigoya, KBR, Houston, TX, USA

George Salazar, Johnson Space Center, NASA, Houston, TX, USA

Christian G. W. Schnedler, CISSP®, CSEP®, PMP®, and PSP®; IDEMIA National Security Solutions, New York, NY, USA

William R. Scott, SMART Infrastructure Facility, University of Wollongong, Wollongong, NSW, Australia

Sarah M. Sherwood, Naval Medical Research Unit Dayton, Wright‐Patterson AFB, OH, USA

Farid Shirvani, SMART Infrastructure Facility, University of Wollongong, Wollongong, NSW, Australia

Andreas Tolk, The MITRE Corporation, Charlottesville, VA, USA

Melissa M. Walwanis, Naval Air Warfare Center Training Systems Division, Orlando, FL, USA

M. Lynn Woolsey, Emirates College for Advanced Education, Abu Dhabi, United Arab Emirates

Kate J. Yaxley, School of Engineering and Information Technology, University of New South Wales Canberra, Canberra, ACT, Australia

Michael Zipperle, University of New South Wales at Australian Defence Force Academy, Canberra, ACT, Australia

Foreword

No one would question that we are today living in the age of connectivity. Global communications, global commerce, and global pandemics epitomize current affairs.

From a system of systems perspective, rarely do we ever design and employ a system in isolation. Systems are developed and used in new and innovative ways as needs change, often working with other systems in ways not considered when the systems themselves were conceived. Complex supply chains integral to the modern economy include connections and dependencies that go beyond common understanding. With untold billions of nodes on the Internet, connectivity between systems and people is a bedrock of contemporary society.

People are connected to their workplace, retailers, and their friends and family electronically. The majority of Americans possess “smart phones” that connect them into a growing network of cyber–physical systems – the Internet of Things – where they are part of a complex collaborative exchange with people and systems. People have moved beyond being “users” of systems to become an integral part of systems of systems both in the small and large sense. People no longer simply consume services of systems, but they and their actions are a core part of the dynamics of the larger system of systems. Their actions can affect the systems of systems in ways often not well understood, and changes in human behavior can have considerable ripple effects on large complex societal capabilities.

All of this has profound implications for human systems engineering. While a premium continues to be placed on human‐centered design focusing on the direct relationship between systems and their users, human systems considerations have expanded in this age of connectivity putting new demands on systems engineers as they factor human systems considerations into engineering endeavors.

We as systems engineers are no longer just expected to ensure that our systems are usable by an individual, but we are also expected to integrate users into complex distributed systems of systems where the users are part of the systems of systems and their behavior is part of the larger system of systems dynamics.

Systems engineers are no longer just expected to design systems so they have value for the users, but are increasingly asked to build systems that also bring value to the system owners through the generation of data to support other aspects of the enterprise or to influence people’s economic, political, or social behavior.

Particularly in safety critical situations, it is no longer enough for systems engineers to design systems that enable people to operate systems to meet their immediate needs, but as these systems are part of a larger dynamic environment, a growing need exists to provide sufficient situational awareness to understand the impacts individual actions may have on other systems and people in the larger systems of systems.

Finally, as systems take on functions that had in the past been done by people, there is an increased emphasis on developing approaches to human systems teaming, a challenge heightened by the increased use of machine learning, where the balance between human and systems may shift over time based on experience.

These changes make this book both timely and important. From the framework provided by Handley in the opening chapter to the research agenda by Tolk at the close, the papers here explore numerous dimensions of human systems engineering, providing a window on experiences today and challenges for the future.

Judith Dahmann
MITRE Corporation Technical Fellow
INCOSE Fellow
Alexandria, Virginia

Preface

The International Council on Systems Engineering (INCOSE) defines Systems Engineering (SE) as an interdisciplinary approach and means to enable the realization of successful systems. SE focuses on defining customer needs by documenting requirements and then proceeds with functional analysis, design synthesis, and system validation. Throughout this process the complete system life cycle is considered: operations, performance, test, manufacturing, cost and schedule, training and support, and disposal.

SE promotes a team effort integrating various disciplines and specialty groups into a structured development process that considers both the business and the technical needs of all customers with the goal of providing a quality product that meets the users’ needs. It is therefore considered a methodical, disciplined approach for the design, realization, technical management, operations, and retirement of a system. In all these aspects, humans play a vital role. They define, use, maintain, and, as operators and decision makers, are part of the system. Since a system is only as strong as its weakest component, human potentials, capabilities, constraints, and limitations are pivotal for the successful engineering of systems.

The Human Systems Integration (HSI) Technical Committee (TC) of the IEEE Systems Council was formed in order to increase awareness of the user during SE processes. It focuses on identifying and improving methods to integrate human concerns into the conceptualization and design of systems. It encourages early understanding of human roles and responsibilities, along with limitations and constraints that may impact system design. This consideration of human concerns from the system design perspective is termed human systems engineering (HSE). HSE describes the engineering efforts conducted as part of the system design and analysis processes to evaluate the appropriateness and feasibility of system functions and roles allocated to operators. The importance of this topic is apparent in examples ranging from notable design errors, such as the placement of the iPhone 4 antenna, which caused poor performance when the phone was held, to design successes, such as the Xbox Kinect, which allowed users to interact with the game system without a handheld interface.

One of the goals of the HSI TC is to improve communication between the HSI and SE communities to provide better integration of humans and systems and to expedite the resolution of issues. The HSI TC members promote this collaboration through conference presentations and workshops, as well as through cooperation with other societies via joint events. Our members serve as technical reviewers and society liaisons to promote the role of human factors in engineering. This volume is a continuation of our technical committee outreach efforts.

This book was written for both systems engineers and HSI practitioners who are designing and evaluating different types of sociotechnical systems across various domains. Many engineers have heard of HSE but don’t understand its importance in system development. This book presents a series of HSE applications on a range of topics, such as interface design, training requirements, personnel capabilities and limitations, and human task allocation. Each chapter represents a case study of the application of HSE from different dimensions of sociotechnical systems. The examples are organized using a sociotechnical system framework to reference the applications across multiple system types and domains. These case studies serve to illustrate the value of applying HSE to the broader engineering community and provide real‐world examples. The goal is to provide reference examples in a variety of domains and applications to educate engineers; the integration of the human user is listed as one of the enablers of SE in the Systems Engineering Body of Knowledge (SEBoK).

As IEEE is primarily concerned with the engineering of electrical technologies, our goal is to include the perspective of design engineers who may be removed from the end user and unaware of potential concerns. The book chapters represent specific projects from the HSI TC members; the result is a set of stories that show the value of HSE through the development of human interfaces, improvement of human performance, effective use of human resources, and the design of safe and usable systems. The examples cross traditional SE sectors and identify a diverse set of HSE practices. Our contributed book is a source of information for engineers on current HSE applications.

Holly A. H. Handley, PhD, PE and Andreas Tolk, PhD

Section 1: Sociotechnical System Types

1 Introduction to the Human Systems Engineering Framework

Holly A. H. Handley

Old Dominion University, Norfolk, VA, USA

Keywords: human systems engineering; human systems integration; ergonomics; socio‐technical framework

1.1 Introduction

Many human‐centered disciplines exist that focus on the integration of humans and systems. These disciplines, such as human factors (HF), human systems integration (HSI), and human factors engineering (HFE), are often used interchangeably but have distinct meanings. This introductory chapter identifies these varied disciplines and then defines the domain of human systems engineering (HSE). HSE implies that the human has been “engineered” into the design, in contrast to “integrating” the user into the system at later stages of design.

The use of HSE for increasingly complex and varied sociotechnical systems requires a more context‐specific suite of tools and processes to address the combination of human and system components. More often, a wider range of system stakeholders, including design and development engineers, are becoming involved in, and vested in, the success of both HSE‐ and HSI‐related efforts. To assist these efforts, a framework was developed based on the dimensions of sociotechnical system and domain types, with relationships to specific HSI and SE concerns. The development of this framework and its dimensions is also described in this chapter.

Finally, the framework is used to organize a wide range of case studies across a variety of system types and domains to provide examples of current work in the field. These case studies focus on both the systems engineering (SE) applications and the HSE successes. Linking the cases to the framework identifies the contextual variables, based on both sociotechnical system and domain characteristics, and links them to specific human system concerns. Our goal with this volume is to emphasize the role of systems engineers in the development of successful sociotechnical systems.

1.2 Human‐Centered Disciplines

HF is a broad scientific and applied discipline. As a body of knowledge, HF is a collection of data and principles about human characteristics, capabilities, and limitations. This knowledge base is derived from empirical evidence from many fields and is used to help minimize the risk of systems by incorporating the diversity of human characteristics (England 2017). Ergonomics is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system and the profession that applies theory, principles, data, and methods to design in order to optimize human well‐being and overall system performance (IEA 2018). The term “human factors” is generally considered synonymous with the term “ergonomics.” HF engineers or ergonomics practitioners apply the body of knowledge of HF to the design of systems to make them compatible with the abilities and limitations of the human user.

HF has always employed a systems approach; however, in large complex systems, it was recognized that the role of the human must be considered from multiple perspectives (Smillie 2019). HSI is the interdisciplinary technical process for integrating multiple human considerations into SE practice (DOA 2015). Seven HSI areas of concern have been identified – manpower, personnel, training, HFE, health and safety, habitability, and survivability – all of which need to be addressed in an interconnected approach. The emphasis of the HSI effort is on the trade‐offs within and across these domains in order to evaluate all options in terms of overall system performance, risk, and personnel‐related ownership cost (SAE6906 2019). HSI provides a comprehensive snapshot of how human systems interaction has been addressed throughout the system development process by evaluating each of these domains as the system design progresses through different stages. It identifies what issues remain to be resolved, including their level of risk, and suggests potential mitigations.

Human factors integration (HFI) is a systematic process for identifying, tracking, and resolving human‐related issues ensuring a balanced development of both technological and human aspects of a system (Defence Standard 00‐251 2015). HFI is the term used in the United Kingdom equivalent to HSI. Similar to HSI, HFI draws on the breadth of the HF disciplines and emphasizes the need to facilitate HFI management activities of concern across seven similar domains: manpower, personnel, training, HFE, system safety, health hazard assessment, and social and organizational (England 2017). The methods and processes available for HFI can be broken down into both technical activities and management activities; HFI has a well‐defined process and can draw on many methods, tools, standards, and data in order to prevent operational and development risks (Bruseberg 2009).

1.3 Human Systems Engineering

The HSI discipline was established with the primary objective to enhance the success of Department of Defense (DoD) systems by placing humans on more equal footing with design elements such as hardware and software (SAE6906 2019). SE is an interdisciplinary field of engineering and engineering management that focuses on how to design and manage complex systems over the system life cycle. While HSI is considered an enabler to SE practice, systems engineers need to be actively engaged to continuously consider the human as part of the total system throughout the design and development stages. HSE is the application of human principles, models, and techniques to system design with the goal of optimizing system performance by taking human capabilities and limitations into consideration (DOD 1988). HSE approaches the human system design from the perspective of the systems engineer and views the human component as a system resource. Human‐focused analyses that occur as part of the HSE evaluations determine the required interactions between users and technology and are essential to ensure efficient processes and data exchange between the technology elements and the human users (Handley 2019a). In the United Kingdom, human‐centric systems engineering (HCSE) seeks better ways to address HF within mainstream SE while building on and optimizing the coherence of existing best practice. Similar to HSE, HCSE approaches HF from an SE viewpoint and aims to develop core SE practices that help engineering organizations adopt the best HF processes for their needs (England 2017).

HSE applies what is known about the human to the design of systems. It focuses on the tasks that need to be performed, the allocation of specific tasks to human roles, the interactions required among the human operators, and the constraints imposed by human capabilities and limitations. A key focus of HSE is on the determination of the human role strategy; this allocation determines the implications for manning, training, and ultimately cost (ONR 1998). The human elements of the system possess knowledge, skills, and abilities that must be accounted for in system design, along with their physical characteristics and constraints, similar to other technical elements of the system. The goal of HSE is to augment the system descriptions with human‐centered models and analysis; these purposeful models inform trade‐off analyses between system design, program costs, schedule, and overall performance (Handley 2019a). As part of the SE process, HSE incorporates the human‐related specifications into the system description to improve overall system performance through human performance analysis throughout the system design process.
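
To make the role-allocation concern above concrete, the following minimal Python sketch assigns tasks either to automation or to an operator role and then derives a crude manning estimate from the retained operator workload. The task names, capability flags, workload values, and threshold are invented for illustration only; they are not a method prescribed by the chapter.

```python
import math
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    automatable: bool    # can the system perform this task without a human?
    workload: float      # notional operator workload if a human performs it

# Hypothetical task list for the example.
TASKS = [
    Task("monitor sensor feed", automatable=True, workload=0.3),
    Task("approve engagement", automatable=False, workload=0.5),
    Task("log maintenance fault", automatable=True, workload=0.2),
    Task("communicate with dispatch", automatable=False, workload=0.4),
]

def allocate(tasks: list[Task], prefer_automation: bool = True) -> dict[str, list[Task]]:
    """Split tasks between automation and the operator role under a simple rule."""
    allocation: dict[str, list[Task]] = {"automation": [], "operator": []}
    for task in tasks:
        target = "automation" if (prefer_automation and task.automatable) else "operator"
        allocation[target].append(task)
    return allocation

def operators_required(tasks: list[Task], capacity_per_operator: float = 1.0) -> int:
    """Crude manning estimate: total operator workload divided by per-operator capacity."""
    total = sum(t.workload for t in tasks)
    return max(1, math.ceil(total / capacity_per_operator))

plan = allocate(TASKS)
print([t.name for t in plan["operator"]])      # tasks retained by the human role
print(operators_required(plan["operator"]))    # notional manning implication
```

A real HSE analysis would, of course, base the allocation on task analysis, operator capabilities, and mission context rather than a single boolean flag; the sketch only illustrates how the allocation decision propagates into manning, training, and cost considerations.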

1.4 Development of the HSE Framework

The HSE framework was developed for the SE community to provide a basis for categorizing and understanding applications of HSE for different types of sociotechnical systems. It was developed by cross‐referencing and aligning different aspects of domains, system types, and design stages with applicable HSE and HSI tools and methods. The goal was to categorize projects in such a way that systems engineers and HSI practitioners could leverage tools, processes, and lessons learned across projects (Handley 2019b).

The original framework was developed by a team of Army HSI practitioners and subject matter experts (SMEs). The HSE framework was part of a larger project designed to mitigate human performance shortfalls and maximize system effectiveness by integrating well‐defined HSE (and where applicable HSI) processes and activities into the acquisition life cycle and to make these analyses explicit to stakeholders to increase “buy‐in” early in the design process (Taylor 2016). The resulting ontology could be expanded as needed to provide a common framework to identify elements and relationships important to the application of HSE, including classifying different stakeholders, system types, acquisition timelines, and user needs. This would allow HSI practitioners, systems engineers, and program managers to determine appropriate tools, methodologies, and information. The overall goal was to provide an overall organizing structure for HSE processes and products relevant to the SE effort that could be linked to a comprehensive repository of information and concurrent and past projects (Taylor 2016).

The original HSE framework is shown in Figure 1.1; it is a subset of the envisioned comprehensive ontology. This framework was used successfully to categorize different projects that involved the intersection of SE and HSI, including the Army’s transition to cross‐functional teams (Handley 2018). The framework represents the initial effort to provide a consistent taxonomy to determine appropriate tools and methodologies to address sociotechnical system concerns by offering an organizing structure to identify similar efforts.

Figure 1.1 HSE original framework.

The dimensions and descriptions of the original framework are as follows:

Sociotechnical system type – This dimension represents the different ways that users interact with systems. From the “users are the system,” which represents organizations and teams, to the other extreme “no direct system,” which represents autonomous systems, the intermediary points suggest different interaction points between users and systems.

Domains – This dimension represents the different contexts of use for systems, as different domains can induce different considerations and restrictions. Domain‐induced constraints include environmental variables, operator state, organizational factors, and personnel characteristics. While the framework was developed specifically for military systems, it can be extended and adapted across various domains such as space, transportation, and aerospace.

System design phases – The intent of the original framework was to capture the impact of different tools and methods at different phases of system design, i.e. concept, preliminary design, detailed design, test and evaluation, deployment, and retirement. This approach emphasized the benefits of applying human‐centered analyses early in the system development.

Tools and methods – By mapping the three previous dimensions to available tools and methods, the intent of the framework was that it could be used to suggest tool sets for different human‐centered analyses depending on the system type, domain, and stage of system development.

The framework acts as an index to identify essential information and previously validated findings. It can be used to suggest tools, methods, processes, data, standards, and expertise across similar systems and/or domains. The intent in developing the framework was that the dimensions could be expanded or modified as needed to capture evolving elements in sociotechnical systems and provide the metadata to classify the required HSE efforts.
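
As a rough illustration of the indexing idea, the sketch below models the framework dimensions as simple Python structures and looks up candidate tools and methods for a given combination of system type, domain, and design phase. The dimension vocabularies and the tool mappings are placeholders assumed for the example, not the Army ontology or the framework's actual contents.

```python
from dataclasses import dataclass

# Illustrative dimension vocabularies; the real framework/ontology defines its own.
SYSTEM_TYPES = ["users are the system", "user-operated", "user-supervised", "no direct user"]
DOMAINS = ["military", "space", "transportation", "aerospace"]
DESIGN_PHASES = ["concept", "preliminary design", "detailed design",
                 "test and evaluation", "deployment", "retirement"]

@dataclass(frozen=True)
class Context:
    """One point in the framework: a value for each of three dimensions."""
    system_type: str
    domain: str
    design_phase: str

# Hypothetical mapping from framework contexts to suggested tools and methods.
TOOL_INDEX = {
    Context("user-operated", "aerospace", "concept"): ["task analysis", "function allocation"],
    Context("user-operated", "aerospace", "detailed design"): ["workload assessment (NASA-TLX)"],
    Context("no direct user", "military", "test and evaluation"): ["human-autonomy teaming evaluation"],
}

def suggest_tools(context: Context) -> list:
    """Return the HSE tools/methods recorded for a framework context, if any."""
    return TOOL_INDEX.get(context, [])

print(suggest_tools(Context("user-operated", "aerospace", "concept")))
# ['task analysis', 'function allocation']
```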

1.5 HSE Applications

The original framework has been repurposed here to classify the case studies that compose this volume. The original dimensions have been slightly modified to better provide an index to the cases presented. This revised framework maintains the sociotechnical and domain dimensions; however, the latter two dimensions were modified slightly to represent both HSE and SE concerns, as shown in Figure 1.2. Note that for simplicity, both the HSE and SE concerns dimensions were limited to those that appear in the case studies. The modified framework presents a better categorization of the cases provided and facilitates easy identification of the cases that best match the reader's interest.

Figure 1.2 HSE framework as an index for the case studies.

Additionally, the rendering of the framework has changed from the original tree structure to a multi‐axis plot. Each axis represents one of the framework dimensions, and the hash marks identify the subcategories. This visualization allows the cases to be “plotted” as an intersection of two (or more) dimensions. While the original framework identified the categories for each domain, the new rendering allows these categories to be used as a classification system, easily identifying the key content of each case study. As the applications in this volume are quite varied, the framework provides a logical way to organize and connect the case studies.
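
To suggest how this classification could be used as a lookup, the sketch below tags a few chapters with values along the framework dimensions and filters the collection by a reader's interest. The chapter tags are abbreviated paraphrases of Figure 1.2, and the field names are assumptions made for the example rather than the book's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class CaseStudy:
    chapter: int
    title: str
    sociotechnical_type: str        # e.g. "users are the system", "no direct user"
    domain: str                     # e.g. "public safety", "space", "rail"
    hse_concerns: set = field(default_factory=set)
    se_concerns: set = field(default_factory=set)

# A few entries paraphrased from the chapter descriptions; tags are illustrative.
CASES = [
    CaseStudy(2, "Human Interface Considerations for Situational Awareness",
              "user-operated", "public safety", {"interface design"}, {"system of systems"}),
    CaseStudy(4, "Life Learning of Smart Autonomous Systems",
              "no direct user", "defense", {"human-autonomy teaming"}, {"system maturity"}),
    CaseStudy(6, "HSI in the Space Exploration SE Life Cycle",
              "user-operated", "space", {"human task allocation"}, {"life cycle processes"}),
]

def find_cases(**criteria) -> list:
    """Return case studies whose dimension values match all given criteria."""
    def matches(case: CaseStudy) -> bool:
        for key, wanted in criteria.items():
            value = getattr(case, key)
            if isinstance(value, set):
                if wanted not in value:
                    return False
            elif value != wanted:
                return False
        return True
    return [c for c in CASES if matches(c)]

print([c.chapter for c in find_cases(domain="space")])                    # [6]
print([c.chapter for c in find_cases(hse_concerns="interface design")])   # [2]
```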

As shown in Figure 1.2, each chapter has been located on the framework to show its intersection among the dimensions. The first section of the book contains applications that describe different sociotechnical system types and their relationships with the human user. For example, Chapter 2 describes human considerations for domain awareness and focuses on human interface design. The authors make a comprehensive analysis of situational awareness platforms for public safety and stress the importance of traditional training methods coupled with cutting‐edge technology. Chapter 3 defines the sociotechnical factors shown to improve success when using artificial intelligence in system development. With the integration of artificial intelligence into every system domain, the authors employ a quantitative model of the sociotechnical space to link inadequate consideration of stakeholders to high risk in complex agile projects. Chapter 4 considers both technology readiness and autonomy level to determine meaningful human control based on trust in human‐autonomy teaming. The authors use an example of herding sheep with airborne drones to provide a validation scenario for the proposed concept and process.

The second section of the book provides a “deep dive” focus on specific domains. These chapters provide examples of HSE impacts in specific contexts. For example, Chapter 5 looks at the Australian heavy rail industry and the use of sociotechnical modeling. The authors describe how integrating HF models with SE can be used to introduce new capabilities from an integrated organizational standpoint. Chapter 6 focuses on the engineering life cycle for space exploration systems and the use of human‐centered programs to mitigate risk. The authors describe how HSE can play an important role throughout the SE phases to optimize total system performance. Chapter 7 reviews the evolution of cockpit design based on the impact of evolving technologies in the aerospace domain. The author describes how traditional human–computer interaction practices have given way to user experience (UX) and interaction design methodologies.

The next section, section three, focuses on training and skill sets with cross‐references to different domains. Chapter 8 discusses the impact of generational differences of users on the design of training programs. The authors describe a socio‐cognitive framework that combines the social aspects, i.e. generational differences, with the cognitive aspects, such as neuropsychology, that allows researchers to assess the effectiveness of gamified learning interventions. Chapter 9 investigates how training resiliency impacts readiness in the military domain. The authors identify basic workforce resilience measures that can be used to guide SE efforts to migrate to new training systems. Finally, Chapter 10 describes research that evaluates the introduction of virtual and constructive technology into live air combat training systems. The authors use qualitative methods, influenced by cognitive engineering and action research, to iteratively identify, assess, and mitigate risks stemming from the change of training techniques.

Section four presents two chapters that focus on the intersection of the socio‐component and human characteristics. Chapter 11 presents an approach to build trustworthy blockchain applications for large complex enterprises based on HSE principles. The approach develops a human data integration and interaction methodology by establishing trust and security links. The authors illustrate their approach through an operational risk management example. Chapter 12 offers a unique look at the impact of light technologies on organizational change. The author describes the association between the implicit properties of light and the four organizational principles of presence, power, knowledge, and harmony.

Finally, section five offers some observations “from the field.” Chapter 13 provides a lighter note, offering an unedited account of observations and suggestions for future real‐time control room designs. Chapter 14 concludes the volume with a selection of research topics and challenges compiled into several categories. The chapter author hopes that members of the scholastic community will contribute to the improvement of this first topology of challenges as well as the framework for HSE itself.

1.6 Conclusion

While many systems engineers understand that the human operator and maintainer are part of the system, they often lack the expertise or information needed to fully specify and incorporate human capabilities into the system design (INCOSE 2011). Human systems engineers are actively involved in the development of the system and ensure human‐centered principles are incorporated into design decisions. HSE provides methods for integrating human considerations with and across system elements to optimize human system performance and minimize total ownership costs.

The case studies in this volume provide insights into HSE efforts across different sociotechnical system types across a variety of domains. Currently, most of the existing sociotechnical system case studies are from the HSI perspective, i.e. working with users to improve the system usability and interfaces in deployed systems. The focus of this book, however, is from the SE viewpoint, encouraging early consideration of the human in the system design. While some of the chapters will overlap with the traditional HSI approaches, the goal of the book is to encourage systems engineers to think about the human component earlier in the system development. The chapters are organized and indexed by the framework; the book can be read in order to follow the progression across the framework, or Figure 1.2 can be used to identify specific chapters of interest to the reader based on any one of the four dimensions. The goal of this book is to serve as a reference volume for HSE.

References

Bruseberg, A. (2009). The Human View Handbook for MODAF (Pt. 2, Technical Description). Somerset, UK: Human Factors Integration Defence Technology Centre.

DOA (2015). Soldier‐Materiel Systems Human Systems Integration in the System Acquisition Process. Department of the Army Regulation 602‐2. Washington, DC: DOA.

DOD (1988). Manpower, Personnel, Training, and Safety (MPTS) in the Defense System Acquisition Process. DoD Directive 5000.53. Washington, DC: DOD.

England, R. (2017). Human Factors for SE. INCOSE UK, Z12, Issue 1.0 (March 2017). http://incoseonline.org.uk/Groups/Human_Centric_Systems_Engineering_WG/Main.aspx (accessed 16 March 2020).

Handley, H. (2018). CFT by System Type and HSI Domain, Deliverable to Human Systems Integration (HSI) Tool Gap Analysis Report for Deputy Director. US Army Human Systems Integration.

Handley, H. (2019a). Human system engineering. In: The Human Viewpoint for System Architectures. Springer.

Handley, H. (2019b). A socio‐technical architecture. In: The Human Viewpoint for System Architectures. Springer.

IEA (2018). What Is Ergonomics? International Ergonomics Association. https://iea.cc/what‐is‐ergonomics (accessed 16 March 2020).

INCOSE (2011). Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, 3.2e (ed. H. Cecilia). San Diego, CA: INCOSE.

ONR (1998). Human Engineering Process. Technical Report, SC‐21 S&T Manning Affordability Initiative. Washington, DC: Office of Naval Research.

SAE6906 (2019). Standard Practice for Human System Integration, SAE6906, 2019‐02‐08.

Smillie, R. (2019). Introduction to the human viewpoint. In: The Human Viewpoint for System Architectures (ed. H. Handley). Springer.

Taylor, A. (2016). The Human Systems Integration Workbench. White Paper PJF‐18‐425. US Army Materiel Command (AMC).

UK Defence Standardization (2015). Def Stan 00‐251 Human Factors Integration for Defence Systems, Public Comment Draft, Issue 1, Version 1.0 (September 2015).

2 Human Interface Considerations for Situational Awareness

Christian G. W. Schnedler1 and Michael Joy2

1 CISSP®, CSEP®, PMP®, and PSP®, IDEMIA National Security Solutions, New York, NY, USA

2 IDEMIA National Security Solutions, New York, NY, USA

2.1 Introduction

The field of situational awareness (SA) arguably embodies the most urgent demand for human systems integration (HSI) as it encompasses the real‐time application of (increasingly machine‐assisted) human decision making in all‐too‐often life and death circumstances. Birthed in the maritime and military domains, SA concepts are now applied to fields as diverse as public safety and first responders, facility and border security, autonomous vehicles, and digital marketing. Common across these domains is the need to understand relevance within vast amounts of disparate data and present this information to human operators in an intuitive, timely, and conspicuous manner. To achieve these objectives, SA systems must disambiguate the definition of “relevant” by understanding the rules governing an operator's potential range of actions and the specific context of the operator receiving the information.

Emerging developments in the technology platforms of sensors, data, artificial intelligence (AI), computer vision, and mobile devices are enabling advancements in SA platforms that provide real‐time decision‐making opportunities in both structured and unstructured space. These developments challenge the traditional ways that information has been collected, aggregated, collated, analyzed, and disseminated, and they provide opportunities to empower operators and citizens to gain greater awareness of their surroundings in order to make better informed and more meaningful decisions. Inherent challenges with the volume, variety, velocity, and veracity of this information demand novel approaches to HSI across multiple, concurrent operational theaters.

This chapter summarizes major considerations given to SA platforms and illustrates these through their application to the public safety domain. The authors draw on their decades‐long experience designing and implementing SA systems in municipal and federal public safety organizations in regions as diverse as the United States, the Middle East, and Africa. Due consideration is given to the growing concerns around privacy in Western nations and the apparent paradox around the need to promote transparency within public safety organizations without empowering terrorists, criminals, and others intent on disrupting the lives and liberties of those engaged in democratic societies.

2.2 Situational Awareness: A Global Challenge

Situational awareness is a concept, a system, and a solution. There are well‐established SA definitions and related organizations for the maritime domain, the space domain, and the Arctic. In her seminal Designing for Situation Awareness (Endsley 2011), Dr. Mica Endsley summarizes SA as “being aware of what is happening around you and understanding what that information means to you now and in the future.” Elsewhere, Dr. Endsley has defined SA as “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.”1 It is the internal mental model of the dynamic environment, which, when combined with more static system and procedural knowledge, allows decision makers in these domains to function effectively.

In the wake of the 9/11 attacks, the New York Police Department (NYPD) led a public–private partnership (PPP) effort to create what became the Domain Awareness System (DAS) to counter future terrorist attempts and to improve public safety.2 This initial DAS effort by the NYPD provided a technology framework for the subsequent development of real‐time SA solutions addressing a broad range of public and private use cases, from high‐value facility security and border management, to conflict zone and environmental protection, to healthcare, to opioid crisis response, and to the recovery of persons at risk from human traffickers. In each of these use cases, development was led by industry in partnership with government.

The Chinese central government has led PPP development of its “Sharp Eyes” surveillance system.3 By intertwining digital commerce with public safety, China has created an unprecedented surveillance apparatus with near‐limitless opportunities for machine learning and analytics to process, categorize, and contextualize information for human operators. This surveillance model, now exported around the world as “Safe City” solutions, challenges Western notions of privacy and human rights when employed against targeted population groups such as the Muslim Uighurs in western China.

In light of this range of applications, the definition of “situational awareness” remains somewhat ambiguous and impractical. For the purposes of this chapter, SA follows the foundational definition espoused by Dr. Endsley and refers to the real‐time presentation of pertinent information to a human operator to inform subsequent action. The geographic domain matters only insofar as it is relevant to the human operator in question. Similarly, historical information and trends matter only to the extent that they apply to the operator's real‐time context. Multiple operators may be involved in a single event, and the SA platform must consider the perspective and context of each in order to achieve its intended purpose.
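
Continuing the illustrative sketch introduced above, the same event can be rendered differently for each operator by filtering its data items against each operator's own context. The helper below assumes the hypothetical is_relevant function and data classes defined earlier.

def per_operator_views(event_items, operators):
    """Build a separate, filtered view of one event for every operator involved.

    event_items: list of DataItem for a single event
    operators:   list of OperatorContext, one per responder or dispatcher
    """
    return {
        ctx.operator_id: [item.summary for item in event_items if is_relevant(item, ctx)]
        for ctx in operators
    }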

2.3 Putting Situational Awareness in Context: First Responders

Although much has been written on SA concepts in the aerospace, military, and maritime domains, the proliferation of Internet of Things (IoT) devices and advancements in machine vision and AI have enabled the democratization of SA capabilities. Under the banner of “smart cities,” municipalities have begun implementing static surveillance capabilities and outfitting first responders with mobile and body‐worn devices that act both as sensors and as a means of improving SA. By some estimates, the market for surveillance equipment will reach $77B by 2023.4 This explosion in sensors has led to increased public safety expectations, as well as greater scrutiny over the actions taken by first responders.

To meet these expectations, law enforcement agencies in particular employ a variety of surveillance tools to achieve awareness of events occurring in the geographic domain under their authority. These tools include closed‐circuit television (CCTV) cameras; license plate readers; and chemical, biological, radiological, nuclear, and explosive (CBRNE) sensors. Historically compartmented information warehouses containing criminal histories, emergency calls, use‐of‐force logs, and similar records are increasingly being fused and made available for real‐time search. Moreover, noncriminal information ranging from social media and other open‐source datasets to credit histories and other quasi‐public records is increasingly accessible to provide context to an event. The use of such noncriminal records to assist law enforcement is often vigorously contested and will be addressed later in this chapter, but regardless of a particular agency's implementation, today's challenge remains a big data problem. In other words, identifying the particular set of information relevant to an event is paramount; with few exceptions, the requisite data points to improve an officer's SA are available.
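
As a rough illustration of what “fused and made available for real‐time search” can look like at the code level, the sketch below queries each compartmented store through a thin federation layer and merges the hits into one time‐ordered list. The source names and the callables in the sources mapping are placeholders, not references to any actual agency system.

def federated_search(query, sources):
    """Query each compartmented data source and merge the hits into a single
    list, newest first, for real-time presentation to an operator.

    sources maps a source name (e.g. "cctv_index", "calls_for_service") to a
    callable returning an iterable of dicts, each with an ISO-8601 "timestamp".
    """
    results = []
    for name, search_fn in sources.items():
        for hit in search_fn(query):
            results.append({"source": name, **hit})
    return sorted(results, key=lambda r: r.get("timestamp", ""), reverse=True)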

Complicating the analysis and dissemination of information pertinent to SA are the layers of information security policies applied to the first responder community. For example, law enforcement agencies in the United States must adhere to the Criminal Justice Information Services (CJIS) Security Policy established by the Federal Bureau of Investigation.5 This policy mandates, among other requirements, that anyone accessing law enforcement data first authenticate themselves as a qualified operator and further establish a need to know the information requested. These requirements are often further restricted by agency‐specific policies, such as preventing the disclosure of information pertaining to active cases to anyone not specifically associated with the case in question. Such policies and regulations were generally enacted and expanded in the wake of inadvertent (or deliberate) misuse of information over many decades. Few contemplated the ramifications of nonhuman actors such as AI, and fewer still considered how persistent access to such information might contribute to real‐time SA platforms charged with improving the safety and effectiveness of modern‐day first responders.
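
The sketch below shows, in schematic form only, the two checks such a policy implies before any record is returned: the requester must be an authenticated, qualified operator, and they must establish a need to know, approximated here by assignment to the case in question. The function and field names are invented for illustration and do not reflect the actual CJIS interfaces.

audit_log = []  # in practice, an immutable and independently reviewed access log

def release_record(operator_id, case_id, directory, case_assignments, records):
    """Return a record only after identity and need-to-know checks pass."""
    operator = directory.get(operator_id)
    if operator is None or not operator.get("authenticated"):
        return None                           # check 1 failed: qualified, authenticated operator
    if case_id not in case_assignments.get(operator_id, set()):
        return None                           # check 2 failed: need to know (case assignment)
    audit_log.append((operator_id, case_id))  # every disclosure is recorded
    return records.get(case_id)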

It is in this context that first responders are being asked to employ SA platforms for decision support. With this come myriad HSI concerns, ranging from the physical real estate available for first responders to interact with SA platforms to the means by which this complex set of information can be presented. Underpinning all considerations are the paramount importance of officer safety and the need to understand the operator's context in order to establish information relevance and right to know.

2.4 Deep Dive on Human Interface Considerations

With the advent of IoT sensors and significant increases in both connectivity and storage capacity, big data has become the prime dependency for many new technologies and solutions, especially SA. In public safety, and more particularly for first responders, the sheer breadth of information available is overwhelming. Designing human‐system interfaces that can retrieve, parse, and organize relevant data based on real‐time activities and events, and present it in a meaningful, concise, and unobtrusive (yet attention‐getting) way, is a defining challenge.

At its core, public‐safety‐focused SA is predicated on alerting to noteworthy events in real time while increasing the knowledge and expanding the experience of responding personnel by drawing upon all pertinent historical, concurrent, and predictive information available to the agency. With a primary focus on officer safety, users of such a system have only a few minutes after being notified of an event to ingest the relevant data, decide on tactics, and adjust their response accordingly, all while driving, communicating with dispatch, and coordinating with colleagues and supervisors. As such, the intelligence generated and presented must offer substantive benefits as rapidly and concisely as possible. The immediate goal of all first responders is to protect life, and much of the data available to police departments can support key areas such as subject identification, threat assessment, and response tactics, all of which greatly enhance SA and help to keep everyone safe.
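
A minimal sketch of the triage this implies appears below; the category weights, field names, and three‐item cap are invented for illustration rather than drawn from any deployed system. The idea is simply to rank supporting data by how directly it bears on officer safety and keep only the top few items so the briefing can be absorbed in seconds rather than minutes.

def build_briefing(alert_headline, supporting_items, max_items=3):
    """Rank supporting data by a simple safety-oriented weighting and keep
    only the top items, so the result can be read at a glance.

    supporting_items: list of dicts with "category" and "text" keys.
    """
    weights = {"threat_assessment": 5, "response_tactics": 4, "subject_identification": 3}
    ranked = sorted(
        supporting_items,
        key=lambda item: weights.get(item["category"], 1),
        reverse=True,
    )
    return {"headline": alert_headline, "details": [item["text"] for item in ranked[:max_items]]}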

Machine‐assisted data retrieval, organization, and presentation not only improve the safety of all those involved but also support officer decision making by informing officers of supplementary details and historical activities and actions. These characteristics are unique to every call for service, and a better understanding of them within the context of the current interaction is invaluable. However, the same mechanisms that collate the appropriate information must also exclude the rest. Considering the highly mobile nature of first responders and the inherent limitations of portable hardware in a public safety setting, it is not practical to expose all associated data, even if it could potentially be relevant in some ancillary contexts. Conversely, ignoring that information has its own tangible detriments, most notably suggesting an incorrect narrative to responding personnel and leading them to make poor judgments with lasting impacts.

Computers have a unique ability to project truth, regardless of the quality and completeness of the underlying data. This “machine heuristic”6