Introduction to Humans in Engineered Systems

Roger Remington

Description

Fully up-to-date coverage of human factors engineering, plus online access to interactive demonstrations and exercises.

Engineering accomplishments can be as spectacular as a moon landing or as mundane as an uneventful drive to the local grocery store. Their failures can be as devastating as a plane crash or a massive oil spill. Over the past decade, psychologists and engineers have made great strides in understanding how humans interact with complex engineered systems: human engineering. Introduction to Humans in Engineered Systems provides historical context for the discipline and an overview of some of the real-world settings in which human engineering has been successfully applied, including aviation, medicine, computer science, and ground transportation. It presents findings on the nature and variety of human-engineering environments, human capabilities and limitations, and how these factors influence system performance.

Important features include:

* Contents organized around the interaction of the human operator with the larger environment to guide the analysis of real-world situations
* A web-based archive of interactive demonstrations, exercises, and links to additional readings and tools applicable to a range of application domains
* Web content customizable for focus on particular areas of study or research


Page count: 920

Publication year: 2012




Table of Contents

Cover

Contents

Title Page

Copyright

Dedication

Preface

Part I: Historical Perspective

Chapter 1: Natural and Engineered Systems

Purposeful Design

User-Centered Design

Design against Failure

Summary

References

Chapter 2: Historical Roots

Engineering for Physical Limitations

Engineering for Human Cognition

The Modern Era

A Fractured Field

Summary

References

Chapter 3: The Current Practice

Aerospace

Medicine

Automotive Industry

Computer Industry

Summary

References

Part II: The Environment

Chapter 4: The Varied Nature of Environments

Static vs. Dynamic Domains

Sources of Difficulty in Static Environments

Sources of Difficulty in Dynamic Environments

Internal vs. External Pacing

Error Tolerance

Summary

References

Chapter 5: The Social Context

Methodological Consequences of Group Size

Communication and Coordination Consequences of Group Size

Summary

References

Chapter 6: Analysis Techniques

Modeling Static Environments: Finite State Representations

Modeling Dynamic Environments

Control Theory

Measuring Complexity Using Information Theory

Modeling Throughput Using Queuing Theory

Summary

References

Part III: The Human Element

Chapter 7: Determinants of Human Behavior

The Human Factor

Structure and Content

Levels of Analysis

Summary

References

Chapter 8: The Structure of Human Information Processing

Processing Stages

Cognition and Action

Cognition and Goal-Directed Behavior

Response Selection

The Nature of Capacity Limitations

Summary

References

Chapter 9: Acquiring Information

Sensory Processing

Attention

Summary

References

Chapter 10: Central Processing Limitations on Multitasking

Bottleneck Theories

Central Bottleneck Theory and Human-Computer Interaction

Capacity Theories

Multiple Resource Theory

Applications of Single-Channel and Multiple Resource Theories

Timesharing

Timesharing Strategies and the Control of Processing

Summary

References

Chapter 11: Memory

Types of Memories

Retaining and Forgetting Information

Retrieving Information

Summary

References

Chapter 12: Decision Making

Anatomy of a Decision

Normative Approaches to Decision Making

Nonoptimality of Human Decisions

Cognitive Approaches to Decision Making

Heuristics in Human Decisions

Other Influences on Decision Making

Process Models of Human Decision Making

Naturalistic Decision Making

Relationship between Decision-Making Models and Systems Engineering

Summary

References

Part IV: Human-System Integration

Chapter 13: A Case Study in Human-System Performance: The Exxon Valdez

An Account of the Grounding of the Tankship Exxon Valdez

The Nature of the Error

Summary

References

Chapter 14: Human Error

Human Error and System Error

The Nature of Human Error

Theories of Human Error

Situation Awareness

Summary

References

Chapter 15: Contextual Factors Affecting Human-System Performance

Workload

Interruption

Operator State

Summary

References

Chapter 16: The Role of Automation in Human-System Performance

Using Automated Devices

Levels of Automation

A Taxonomy of Automation Levels

Automation as a Decision Support Aid

Automation and System Safety

Summary

References

Chapter 17: Supporting Human-System Performance

Alarms and Alerts

Information Displays

Create Barriers

Summary

References

Index

End User License Agreement

List of Illustrations

Chapter 2: Historical Roots

Figure 2.1 Early stone tools were adapted to fit the hand.

Figure 2.2 Woodwind instruments.

Figure 2.3 Chinese character types.

Figure 2.4 Simplified control loop for landing an aircraft.

Chapter 4: The Varied Nature of Environments

Figure 4.1 Schematic depiction of factors that affect human performance in complex environments. The importance is indicated by the level in the hierarchy (width), starting with the characteristics of the physical environment and ending with practitioner characteristics.

Figure 4.2 Examples of different control dynamics. (a) Linear dynamics: output proportional to input at all levels of input. (b) Nonlinear negative dynamics: change in output decreases for the same change in input at high input levels (characteristic of negative feedback). (c) Nonlinear positive dynamics: change in output increases for the same change in input at high input levels (characteristic of positive feedback). (d) Nonlinear catastrophic dynamics: output stable but changes abruptly at some level of continuous input. (e) Nonlinear chaotic dynamics: output unpredictable with small changes in input.

Figure 4.3 Nonlinear dynamics of aircraft control surfaces. Diagram (a) shows the accelerated airflow over a curved wing that generates lift. Other forces also act on the aircraft, as indicated in diagram (b). The pilot controls the aircraft by altering lift on the wings and other control surfaces (e.g., rudder). Controlling the aircraft is thus a matter of adjusting the pressure differentials to achieve the correct forces, a process that is inherently nonlinear.

Figure 4.4 Nonlinear dynamics can often be transformed to nearly linear input-output relationships. Graph (a) shows the nonlinear input-output relationship defined by the equation y = x³ + 4x². Graph (b) shows the linear relationship obtained when the logarithm of the input is used to determine the logarithm of the output.
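The transformation the caption describes can be checked numerically: on log-log axes the slope of the curve settles toward 3 as the cubic term dominates. A minimal sketch, with sample points chosen for illustration:

```python
import math

def y(x):
    # Nonlinear input-output relationship from the caption: y = x^3 + 4x^2
    return x**3 + 4 * x**2

# Slope of log(y) versus log(x) between successive sample points
xs = [1, 10, 100, 1000]
logs = [(math.log10(x), math.log10(y(x))) for x in xs]
slopes = [
    (ly2 - ly1) / (lx2 - lx1)
    for (lx1, ly1), (lx2, ly2) in zip(logs, logs[1:])
]
print(slopes)  # slopes approach 3 as the cubic term dominates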

Figure 4.5 Differences in control order. The task is to use a joystick (pictured) to move a “ball” from the bottom to the top of the screen. The user makes a control input and observes a change in position of the ball over time. Zero-order (position) control: position of the ball is determined by position of the joystick. First-order (velocity) control: ball moves at constant speed determined by joystick position. Velocity must be returned to zero to stop the ball. Second-order (acceleration) control: ball moves first with increasing speed, then with decreasing speed. Both acceleration and velocity must be cancelled for the ball to come to rest.

Chapter 5: The Social Context

Figure 5.1 The size of the social network affects system response time as well as the time needed to conduct empirical tests of system functioning.

Figure 5.2 Methods of inquiry change as the scale of operations increases, becoming more qualitative, more subjective, and more focused on the system than the individual.

Figure 5.3 A hierarchical organization of sales representatives.

Figure 5.4 The same sales force depicted in Figure 5.3 with lateral communication and authority linkages.

Chapter 6: Analysis Techniques

Figure 6.1 A simple finite-state model of searching through a list of linked Web pages, such as a menu or the World Wide Web. Each link is examined in turn to determine if it is close enough to the desired target to explore further. If so, it is selected. If not, the next link is selected and the process repeats. If no more links remain, the user is assumed to press the Back button and select a new link from the “parent” page.
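The link-following model in the caption can be sketched as a small state machine. The page graph and the relevance test below are invented for illustration; they are not from the text:

```python
def search(pages, start, promising, target):
    """Finite-state sketch of the model: examine each link in turn,
    select it if promising, move to the next link otherwise, and
    press Back (pop) when no links remain on the current page."""
    stack, seen = [start], {start}
    while stack:
        page = stack[-1]
        link = next((p for p in pages.get(page, []) if p not in seen), None)
        if link is None:
            stack.pop()            # no links left: press Back
        elif promising(link):
            seen.add(link)
            stack.append(link)     # select the link
            if link == target:
                return stack
        else:
            seen.add(link)         # reject it; examine the next link

# Invented page graph for illustration
pages = {"home": ["ads", "docs"], "docs": ["faq"]}
path = search(pages, "home", lambda p: p != "ads", "faq")
print(path)  # ['home', 'docs', 'faq']
```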

Figure 6.2 Simplified control loop for landing an aircraft. K = gain, ∫ = integration. Monitored state variables include airspeed, descent rate, altitude, and heading. The control agent uses feedback of the error from the desired state to select an action. Evaluation of error should include lags in system response, and actions should include lags in internal response.

Figure 6.3 Illustration of the effect of activation strength, noise and signal distributions, and criterion on the decision to say that a target stimulus was present or absent. The abscissa plots the activation strength as a random variable. The ordinate plots the probability of a specific activation strength (on the abscissa) given the noise and signal distributions. The black region represents False Alarms (saying that a target was present when there was only noise). The gray region represents Misses (saying a target was not present when it was). When the distributions overlap, as they often do at threshold, the placement of the criterion determines the proportion of False Alarms and Misses.

Figure 6.4 Illustration of an easy discrimination in signal detection theory. Overlap in signal and noise distributions is minimal, as the means are far apart relative to their variances. Note that placing the criterion (β) midway between the means of the two distributions minimizes joint error (False Alarms and Misses). Moving the criterion to further reduce Misses, as shown in graph (b), produces a large increase in False Alarms for a small reduction in Misses.
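The trade-off the caption describes can be computed directly from the two distributions. The means, variances, and criterion placements below are illustrative assumptions, not values from the figure:

```python
from statistics import NormalDist

# Equal-variance Gaussian noise and signal distributions (assumed values);
# means far apart relative to their variance: an easy discrimination
noise = NormalDist(mu=0.0, sigma=1.0)
signal = NormalDist(mu=3.0, sigma=1.0)

def error_rates(criterion):
    false_alarms = 1 - noise.cdf(criterion)  # noise exceeds the criterion
    misses = signal.cdf(criterion)           # signal falls below the criterion
    return false_alarms, misses

# A criterion midway between the means equates (and minimizes) joint error
fa_mid, miss_mid = error_rates(1.5)
# Lowering the criterion trades a small reduction in misses
# for a larger increase in false alarms, as in graph (b)
fa_low, miss_low = error_rates(0.75)
print(fa_mid + miss_mid, fa_low + miss_low)
```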

Figure 6.5 Elements of queuing theory. Events arrive to a processor (server) randomly with some density. The server takes a specified time to process each item. Which item is chosen to process next is set by a policy. The number of items in a queue, and their average wait time, can be calculated from the arrival rate, processing time, and queue policy.
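For the simplest case the calculation the caption mentions has a closed form: an M/M/1 queue (Poisson arrivals, exponential service, first-come-first-served policy). The rates below are illustrative; the chapter may treat other queue disciplines:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state metrics for an M/M/1 queue."""
    rho = arrival_rate / service_rate        # server utilization
    if rho >= 1:
        raise ValueError("queue grows without bound when arrivals outpace service")
    avg_in_system = rho / (1 - rho)          # mean number of items in the system
    avg_time = avg_in_system / arrival_rate  # mean time in system (Little's law)
    return rho, avg_in_system, avg_time

# Example: 4 items/minute arriving, capacity to serve 5 items/minute
rho, n_items, wait = mm1_metrics(4.0, 5.0)
print(rho, n_items, wait)  # 0.8 utilization, 4 items in system, 1 minute average
```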

Chapter 7: Determinants of Human Behavior

Figure 7.1 Behavior results from the interaction of the user, her goals, and the context (world) in which she is immersed.

Figure 7.2 Depiction of information flow in terms of system control. The state of the controlled system is sensed and assessed against the goals of the context. This leads to a plan of action, the consequences of which are predicted. If the prediction conforms to the desired behavior, the action is taken; if not, the plan is further assessed and refined.

Chapter 8: The Structure of Human Information Processing

Figure 8.1 Flow of human information processing from sensory input to motor output. Each arrow represents a notional unit of information. Selective attention limits the amount of perceptual information input at any given time to cognition. A major function of cognitive processing is action selection, though that term should not be interpreted as a complete description of cognitive activity.

Figure 8.2 Example of goal-directed behavior in a choice response time task. Time flows downward. Task goals must be accessed to determine which of several responses to make.

Figure 8.3 Eye tracking patterns for the picture in the upper left vary dramatically depending on the information the viewer is asked to glean.

Figure 8.4 A notional graph depicting a hypothetical relationship between the number of pitches in a pitcher’s repertoire and the time it will take a batter to select a particular swing. Solid line shows the linear relationship between response time (Swing Selection Time) and entropy (H) when all pitches are equally likely. The dashed line shows predicted swing selection time when the pitch can be predicted 60 percent of the time. The dotted line shows the estimated amount of time available to select the swing for a successful hit.
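The linear relationship between response time and entropy that the caption plots can be sketched with the Hick-Hyman form RT = a + bH. The repertoire size, intercept, and slope below are invented for illustration:

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p))
    return -sum(p * math.log2(p) for p in probs if p > 0)

def response_time(probs, a=0.2, b=0.15):
    # Hick-Hyman law: RT = a + b * H, with illustrative
    # intercept a (seconds) and slope b (seconds per bit)
    return a + b * entropy(probs)

n = 4  # hypothetical repertoire of four pitches
equal = [1 / n] * n                              # all pitches equally likely
predictable = [0.6] + [0.4 / (n - 1)] * (n - 1)  # one pitch expected 60% of the time

print(response_time(equal), response_time(predictable))
```

Predictability lowers entropy, so the predicted selection time falls, as the dashed line in the figure suggests.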

Figure 8.5 Diagram (a) shows the layout of the burners on the stove top of one of the authors (Remington) with the control knobs arranged vertically on the right. Numbers in parentheses indicate the burner controlled by each knob. Although the correspondence is regular, there is no perceptual cue to indicate the spatial correspondence. The layout in diagram (b) arranges the knobs in spatial correspondence to the burners. Note how it is no longer necessary to indicate which knob controls which burner.

Chapter 9: Acquiring Information

Figure 9.1 The visual system is sensitive to only a small fraction of the electromagnetic spectrum, consisting of the wavelengths of light between about 400 and 700 nm.

Figure 9.2 Illumination. Diagram (a) shows a case of pure specular reflection, in which the light is reflected back at the same angle at which it struck without scatter. The angle of incidence (θ) is the same as the angle of reflection. Diagram (b) shows how light at different wavelengths is absorbed or reflected. Blue (B), green (G), yellow (Y), and red (R) wavelengths strike a surface that reflects only wavelengths in the red portion of the spectrum. The surface will appear red under normal sunlight, as only red wavelengths will reach the eye.

Figure 9.3 The power of sunlight is concentrated in and distributed relatively uniformly across the visible spectrum (Taylor & Kerr, 1941). Graph (a) shows the relative percentage of the spectral power of sunlight in the visible spectrum. Graph (b) shows the relative percentage of power in a cool white fluorescent lamp. Units are in percentage of total energy. The vertical dotted lines represent regions corresponding to the wavelengths of the specified colors.

Figure 9.4 The basic components of the eye. Light rays are bent by the cornea, and then enter the eye through the pupil, whose diameter is determined by muscles controlling the iris. The lens produces a focused image on the back of the retina, where photoreceptors transform the light into neural activity which is then processed through layers of neurons and passed on to the visual areas of the brain by way of the optic nerve.

Figure 9.5 The Snellen Eye Chart is commonly used for assessing visual acuity.

Figure 9.6 Diagram (a) shows a high-contrast, low-frequency square wave grating. Diagram (b) shows a low-contrast, high-frequency square wave grating. Corresponding amplitude profiles are shown below each grating. Square wave gratings are constructed by summing the odd harmonics of a sine wave grating.

Figure 9.7 Computation of visual angle. S denotes the size of the object, D the distance of the object from the lens, Y the distance from the lens to the retina, and V the angle subtended by S. R is the image of S projected on the retina.
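With the caption's notation, the standard computation is V = 2·atan(S / 2D). A minimal sketch with an illustrative object size and distance:

```python
import math

def visual_angle_deg(size, distance):
    """Visual angle V (degrees) subtended by an object of size S
    at distance D from the lens: V = 2 * atan(S / 2D)."""
    return math.degrees(2 * math.atan(size / (2 * distance)))

# Example: a 2-cm object viewed from 57 cm subtends about 2 degrees
v = visual_angle_deg(2.0, 57.0)
print(round(v, 2))
```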

Figure 9.8a Contrast sensitivity as a function of spatial frequency in cycles per degree of visual angle (cpd). Contrast sensitivity is the inverse of threshold. The higher the contrast sensitivity, the lower the contrast needed for threshold perception of a luminance difference. This panel shows the effect of differences in mean field luminance on a 525-nm grating. Sensitivity increases with increasing luminance.

Figure 9.8b The effect of retinal eccentricity. Sensitivity increases nearer the fovea. Contrast sensitivity also varies as a function of grating type (square wave, sinusoid) and flicker rate.

Figure 9.9 A demonstration of the effect of luminance contrast on reading. The top bar is a grey-scale image showing the luminance contrast produced by yellow text on a white background. It reads, “White text on a Yellow background is bad because both have high luminance, reducing the luminance contrast.” The middle bar is a similar grey-scale image that reads, “Blue on Black is bad as both are low in luminance and reduce contrast.” The bottom bar reads, “What about Red on Green?” Red on green is difficult because of the way color is processed.

Figure 9.10 An Anstis Array. All letters should be equally visible while fixating on the center dot. The Anstis Array demonstrates that equating the cortical representation of objects compensates for decreases in acuity with retinal eccentricity.

Figure 9.11 The adaptation function, measured as the contrast threshold (left vertical axis) as a function of time in the dark. Rod threshold is greater than cone threshold when light-adapted. Cones adapt more rapidly to the dark than do rods, but sensitivity levels off early.

Figure 9.12 Examples of visual crowding. In each letter grouping, try to read the single letter, or the letter in the middle, while maintaining fixation on the small black dot. Note how flanking letters increase difficulty in the periphery, but not near the fovea.

Figure 9.13 Spectral sensitivity functions for three cone types and rods. The peak of the L cone is the primary cone type for responses to red (R). The M cones correspond roughly to green (G), and the S cones are specialized for the short wavelengths of blue (B). The rod response is also shown. Values on the y-axis reflect the relative sensitivity with respect to the peak sensitivity, not the absolute sensitivity.

Figure 9.14 A simplified version of color and luminance processing. Activation is summed in the luminance channel. Red (R) and green (G) cones provide most of the activation. R and G are summed to get yellow (Y). Color difference channels compute the difference of R and G separately from the difference between B and Y (R + G).

Figure 9.15 CIE XY chromaticity diagram. The central circle represents the point of achromatic white, which equates to equal amounts of all three primaries (X = Y = 1/3). The triangle shows the colors possible given three common primaries for color television.

Figure 9.16 The relationship of airflow and pressure inside an open tube (flute) and a tube open only at one end (clarinet). Sound is reflected from the ends of the tube. At the open end the pressure is equivalent to the external pressure, but not at the closed end.

Figure 9.17 Four different kinds of waveforms. The square, triangle, and sawtooth waves can all be constructed by a weighted sum of the harmonics of sine waves. For example, both the square wave and the triangle wave contain only the odd harmonics of the fundamental sine wave. For the sine wave we note its amplitude (A), and wavelength (λ).

Figure 9.18 The structures of the ear.

Figure 9.19 Equal loudness contours. Each line reflects the sound pressure level at the tested frequency (x-axis) required to match the perceived loudness of a 1000-Hz tone at a specified loudness (phon). The ear is most sensitive in the 2–5 kHz range.

Figure 9.20 Approximate bandwidth (equivalent rectangular bandwidth) of the critical band as a function of frequency. The plot illustrates how the bandwidth increases with frequency. Critical bands describe the frequency tuning in the auditory system and determine the amount of interference (masking) that will occur if two tones are presented.
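The growth of bandwidth with frequency can be sketched with one standard approximation for the equivalent rectangular bandwidth, due to Glasberg and Moore (1990). This formula is an assumption here; the figure may use a different estimate:

```python
def erb_hz(frequency_hz):
    """Equivalent rectangular bandwidth of the auditory critical band,
    using the Glasberg & Moore (1990) approximation:
    ERB = 24.7 * (4.37 * F + 1), with F in kHz."""
    f_khz = frequency_hz / 1000.0
    return 24.7 * (4.37 * f_khz + 1)

print(erb_hz(1000))  # roughly 133 Hz at 1 kHz; bandwidth grows with frequency
```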

Figure 9.21 The effect of masking noise raising the threshold of detection of a target sound.

Figure 9.22 The interaural time difference. The sound arrives at the observer's right ear at T_n and at the left ear at T_(n+D). In this case D = 2l. The head also attenuates the sound, especially at high frequencies, creating a sound shadow.

Figure 9.23 Spatial attention. A cue is presented pointing left or right to indicate the probable location of the target. The arrow cue is 80 percent Valid, 20 percent Invalid. The task is simply to press a key whenever either box is illuminated. Graph (b) compares reaction times for Valid and Invalid cues with those of a Neutral cue. Compared to a Neutral cue, Valid cues facilitate reaction time and Invalid cues inhibit it.

Figure 9.24 Feature and conjunction feature search. In diagram (a), targets are a black X or black T. Search is among white Xs and gray Ts. Targets are easy to find regardless of distractors, and set size has little effect. In diagram (b), targets are either a white T or a black X. Search is among black Ts and white Xs, so the target is the conjunction of color and shape. As the set size gets larger, search becomes increasingly difficult.

Figure 9.25 Typical search times for feature and conjunction search as a function of set size. Lines represent best fitting linear functions. Slope of “Target Absent” trials in conjunction search is twice that of “Target Present” trials, consistent with an item-by-item search that terminates when the target is found.
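The 2:1 slope ratio in the caption falls out of a serial self-terminating search model: absent trials examine every item, while present trials stop on average halfway through. The per-item and base times below are illustrative, not data from the figure:

```python
def search_time_ms(set_size, target_present, per_item=50.0, base=400.0):
    # Serial self-terminating search: on target-present trials the target
    # is found, on average, after examining half the display
    items_examined = set_size / 2 if target_present else set_size
    return base + per_item * items_examined

present_slope = search_time_ms(8, True) - search_time_ms(7, True)   # ms per item
absent_slope = search_time_ms(8, False) - search_time_ms(7, False)  # ms per item
print(present_slope, absent_slope)
```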

Figure 9.26 A model of attention allocation in visual search. The features of stimuli are coded in separate feature maps. Activity at each stimulus location in each feature map is weighted according to task relevance and is then combined to form an activation map. Attention can be allocated to locations and objects in order of these activation values, or by selecting a specific feature map.

Figure 9.27 Capture by abrupt-onset characters. The fixation display consisted of placeholders that indicated where characters would appear. In addition, an abrupt-onset character was presented at a location not indicated in the fixation display. The abrupt onset could be either a target or a distractor. The right panel shows that when the abrupt onset was the target, set size had no effect on search times, consistent with the abrupt onset having captured attention.

Figure 9.28 Two of the four conditions from Folk, Remington, and Johnston (1992). Each of four groups was presented with a single pairing of cue type (color, onset) and target type (color, onset). Diagram (a) shows the sequence of events for the Onset Cue/Color Target group and the Color Cue/Onset Target group. On any trial, cues could be Valid (same location as target) or Invalid (different location). Diagram (b) shows that cue validity had an effect only when the cues and targets were of the same type: Onset Cue/Onset Target and Color Cue/Color Target. When participants were looking for color, onsets would not capture attention, nor would color capture attention when they were looking for an abrupt-onset target.

Chapter 10: Central Processing Limitations on Multitasking

Figure 10.1 The two tasks of Figure 8.2 presented as Task 1 (T1) and Task 2 (T2) in the Psychological Refractory Period paradigm. Onset of the T2 stimulus (S2) occurs at varying times after onset of the T1 stimulus (S1), referred to as the stimulus onset asynchrony (SOA). R1 and R2 refer to the responses to S1 and S2, respectively.

Figure 10.2 Central Bottleneck Theory account of delays in RT2 in the PRP paradigm. At short SOAs, the mental processing of T1 and T2 overlaps, creating competition for the response-selection stage (RS). This competition causes postponement of RS on T2, resulting in elevated RT2. The dotted line in graph (b) plots predicted results from Central Bottleneck Theory based on response times at the shortest and longest SOAs.
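The postponement account in the caption can be sketched numerically: T2 response selection cannot begin until T1 releases the bottleneck, which produces a -1 slope at short SOAs and a flat RT2 at long SOAs. All stage durations below are invented for illustration:

```python
def rt2(soa, t1_pre=100, t1_central=150, t2_pre=80, t2_central=150, t2_post=70):
    # Central Bottleneck sketch (durations in ms are illustrative):
    # T2's response selection waits for T1 to clear the bottleneck
    bottleneck_free = t1_pre + t1_central   # when T1 leaves response selection
    t2_ready = soa + t2_pre                 # when T2 reaches response selection
    start_central = max(bottleneck_free, t2_ready)
    return start_central + t2_central + t2_post - soa

print(rt2(0), rt2(1000))  # elevated RT2 at short SOA, baseline at long SOA
```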

Figure 10.3 Stimuli and responses for dual-task driving study (Levy, Pashler, & Boer, 2006). An auditory tone was presented at varying times (SOA) prior to the illumination of brake lights of the leading car. Participants made a vocal response to the auditory stimulus. In the single-foot driving condition, the participant lifted the foot off the gas pedal and depressed the brake with the same foot as quickly as possible.

Figure 10.4 Results of the single-foot condition of the dual-task driving study (Levy, Pashler, & Boer, 2006). Brake response times (BRT) show PRP elevation similar to that of less familiar manual responses in the laboratory. Movement time is constant across SOAs. The effect of SOA on BRT is due primarily to the time needed for participants to remove the foot from the gas pedal.

Figure 10.5 The resources of the Model Human Processor and their connections.

Figure 10.6 Performance-resource functions for two tasks. Task A performance peaks with 60% allocation of resources, Task B with 10%. Performance on A is resource limited up to 60%, after which it is data limited. B shows strong data limits throughout.

Figure 10.7 Performance-resource functions for tasks of varying difficulty. As task difficulty increases, more resources must be allocated to achieve the same level of performance. Task A is easy, reaching peak performance with about 40% allocation. Task B is more difficult, reaching peak performance only with full resource allocation. Task C is very difficult, with no performance improvements until virtually full allocation.

Figure 10.8 Dual-task trade-offs as depicted by the resource-performance functions. As resources are devoted to T1 (solid line), T2 performance (dashed line) declines. In graph (a), T2 is relatively easy. When performance on T1 reaches 90%, the remaining resources (shaded area) are sufficient to perform T2 at around 85%. In graph (b), T2 is more difficult. T2 performance is less than 40% when T1 is at 90%.

Figure 10.9 Resources and associated processing modules in Multiple Resource Theory. There are four resource dimensions associated with (1) Stages, (2) Modalities, (3) Codes, and (4) Responses. Each enclosed module has its own resources. The perceptual and response stages also have resources that limit independent internal processing. Auditory and visual outputs can lead to both verbal and spatial codes. Verbal codes are associated with vocal output, spatial codes with manual output.

Figure 10.10 The effect of the number of intervening items on the response time to a task switch. Here the x-axis is in logarithmic coordinates.

Figure 10.11 Speed-accuracy trade-off. The shaded region extending from 0–C indicates the region in which speed and accuracy show significant trade-off.

Chapter 11: Memory

Figure 11.1 Components of the modal model of memory and their connections.

Figure 11.2 Forgetting curve.

Figure 11.3 Working memory system.

Figure 11.4 Ebbinghaus forgetting function.

Figure 11.5 Interference effects of intervening items on short-term memory performance.

Figure 11.6 Release from Proactive Interference.

Figure 11.7 Proactive Interference and forgetting as a function of number of previous lists learned.

Figure 11.8 Response time in retrieval from short-term memory.

Figure 11.9 Hierarchical network model of semantic memory. Activation of one node can spread to other nodes increasing activation for related items.

Figure 11.10 Sentence verification task results. The time to respond to TRUE questions increases with distance in the hierarchy, as measured by the number of nodes in between the exemplar and category label.

Figure 11.11 Demonstration of the encoding specificity principle. Recall is better in the same environment as learning.

Chapter 12: Decision Making

Figure 12.1 Framework for decision making.

Figure 12.2 Computation of utility and the expected value. In this example, you wager $1 on the roll of two dice and win $1.75 every time an even number greater than 4 occurs. The expected value is 0, which means that over the long run you expect to break even, neither winning nor losing. E < 0 means you expect to lose in the long run; E > 0 means you expect to win.
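The expected-value computation E = Σ pᵢvᵢ can be sketched generically. The probabilities and payoffs below are illustrative assumptions for a break-even wager, not the dice example in the figure:

```python
from fractions import Fraction

def expected_value(outcomes):
    # E = sum over outcomes of (probability * net payoff)
    return sum(p * v for p, v in outcomes)

# Illustrative break-even wager: win $2 net one time in three,
# lose the $1 stake otherwise
bet = [(Fraction(1, 3), 2), (Fraction(2, 3), -1)]
e = expected_value(bet)
print(e)  # 0: over the long run you expect to break even
```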

Figure 12.3 Estimated and true rates of death from various causes. Pattern illustrates the Availability heuristic, according to which people estimate the frequency of occurrence (base rate) by how accessible their memories are for the last incident. High estimated rates for low-occurrence events are influenced by news reports.

Figure 12.4 Payoff matrix for the Prisoner’s Dilemma. Each cell represents the utility of the strategy for both participants (Suspect A, Suspect B). Staying silent yields the best outcome only if the other stays silent as well. The tendency is to take the suboptimal alternative of testifying, as it yields the minimax solution.

Figure 12.5 A cognitive model of decision making illustrating stages and mental operations in arriving at a decision.

Figure 12.6 The process of selecting appropriate action according to Recognition-Primed Decision-Making (RPDM).

Figure 12.7 Example of recognition and pattern matching as the basis for decision making. The possible solutions are heavily influenced by the decider’s perceptual structure of the world and experience with similar situations.

Chapter 13: A Case Study in Human-System Performance: The Exxon Valdez

Figure 13.1 The geography of Prince William Sound and the track of the Exxon Valdez leading up to the grounding on Bligh Reef.

Figure 13.2 Contextual Control Model. Multitasking difficulty is a function of the number of concurrently active goals the operator must satisfy and the time interval within which they must be done. As multitasking difficulty increases, control becomes less proactive, less responsive to future system states, and more reactive, increasingly focused on managing immediate demands.

Chapter 14: Human Error

Figure 14.1 Overlap in situation awareness for different subsystems of the shuttle Launch Control Center. The NASA Test Director coordinates all subsystems, but does not maintain complete situation awareness of the state of each system.

Chapter 15: Contextual Factors Affecting Human-System Performance

Figure 15.1 Simplified activity network for moving text showing how state networks can represent options. Boxes correspond to states in the process, arrows to operators that transition between states. Dotted arrows represent keyboard inputs, solid arrows mouse actions. The “Move + Click + Drag” represents mouse actions in selecting from a menu.

Figure 15.2 Analysis of cut-and-paste text editing using a mouse and menu. Example uses the Keystroke-Level Model (KLM) from Card, Moran, and Newell (1983).

Figure 15.3 Questions used by the NASA Task Load Index (NASA-TLX).

Chapter 17: Supporting Human-System Performance

Figure 17.1 Emergent properties in human vision. Each “gauge” has a zero point and a normal operating range, indicated by the shaded portion.

Figure 17.2 Conflict detection times as a function of the size of the angle formed by a conflict pair and the density (number) of aircraft in a sector. In graph (a), data are plotted as a column bar graph; in graph (b) the same data are plotted as connected lines. In graph (a), attention is drawn to the comparison at each density level; in graph (b) the lines draw attention to the relationship of each angle to the density.

Figure 17.3 Example of (a) standard inside-out attitude display showing a 30° bank to the right and (b) an outside-in display of the same bank angle. Diagrams (c) and (d) show how this would be represented on a frequency-separated display. The representation in (c) shows the initial response to the bank command. The aircraft symbol rotates as it would in the outside-in display in (b) as the horizon begins to rotate toward its final position. In (d) the aircraft symbol has rotated back to its horizontal orientation while the horizon line now reflects the inside-out orientation of (a). By “quickening” the display with the initial rotation of the aircraft, the pilot gets immediate feedback as to the control action and predictive information about the future state of the aircraft.

Figure 17.4 Data from Figure 17.2 plotted as a stacked bar graph. Note how the relationships in each stack can be judged roughly against one another by inspecting the size of the rectangles. Exact comparisons require estimation of starting and ending points. Comparisons of the middle and end components of different stacks require adjustment based on the starting points, which are a function of earlier stack elements.

Figure 17.5 Frame effects in graph perception. The apparent slope of the line from A to B and from B to C looks steeper inside a tall, narrow frame than inside a wider frame, and shallowest inside a short, wide rectangle.

Figure 17.6 Swiss cheese model showing how human error leads to system error. Barriers prevent latent failures that promote human error at a given layer from leading to system-wide error. Only a few errors that manage to interact with failures in all barriers will lead to system error.

List of Tables

Chapter 6: Analysis Techniques

Table 6.1 Decision outcomes as a function of state of the world

Chapter 14: Human Error

Table 14.1 Types of behavior proposed by Rasmussen (1983) and the types of errors associated with those behavior levels.

Chapter 16: The Role of Automation in Human-System Performance

Table 16.1 Levels of Automation.

Introduction to Humans in Engineered Systems

Roger Remington

Deborah Boehm-Davis

Charles Folk

Cover image: © Pei Ling Wu/iStockphoto

Cover design: N/A

Copyright © 2012 by John Wiley & Sons, Inc. All rights reserved

Published by John Wiley & Sons, Inc., Hoboken, New Jersey

Published simultaneously in Canada

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at www.wiley.com/go/permissions.

Limit of Liability/Disclaimer of Warranty: Although the publisher and authors have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor the author shall be liable for damages arising herefrom.

For general information about our other products and services, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley publishes in a variety of print and electronic formats and by print-on-demand. Some material included with standard print versions of this book may not be included in e-books or in print-on-demand. If this book refers to media such as a CD or DVD that is not included in the version you purchased, you may download this material at http://booksupport.wiley.com. For more information about Wiley products, visit www.wiley.com.

Library of Congress Cataloging-in-Publication Data:

Remington, Roger W., 1947-

Introduction to humans in engineered systems / Roger Remington, Charles L. Folk, Deborah Boehm-Davis.

pages cm

Includes index.

Includes bibliographical references.

ISBN 978-0-470-54875-2 (hardback); ISBN 978-1-118-32995-5 (ebk); ISBN 978-1-118-33222-1 (ebk);

ISBN 978-1-118-33271-9 (ebk); ISBN 978-1-118-39373-4 (ebk); ISBN 978-1-118-39375-8 (ebk);

ISBN 978-1-118-39376-5 (ebk); ISBN 978-1-118-50762-9 (ebk)

1. Human engineering. I. Folk, Charles L. II. Boehm-Davis, Deborah Ann. III. Title.

T59.7.R46 2013

620.8’2—dc23

2012026243

Dedicated to Karen Remington, Stuart Davis, and Valerie Greaud Folk for all their love and support during this project, as always.

Preface

Courses on human factors, human-system integration, engineering psychology, human-computer interaction, or applied psychology, though varying in specific content or approach, all share a common concern with the human as part of a system built by humans. The title of this book—Introduction to Humans in Engineered Systems—reflects that common link. Our core idea was to develop a program for the study of human-system integration based on the combination of a concept-oriented text with a flexible, interactive website. The book is designed to introduce major concepts and principles common across the various disciplines. As an integrating factor, the material is organized around the flow of information in control theoretic diagrams. A high-level treatment of control theory is a powerful way to link the various system elements, including the human, and to guide the analysis of real-world situations. The website (http://www.wiley.com/go/remington) provides a resource for pursuing topics in more depth. The website is conceived as a collection of exercises complete with the necessary programs to demonstrate concepts, case studies that provide a foundation for discussion, links to interesting demonstrations online, and material on topics not covered in detail in the text.

One of the underlying principles of control theory is that the behavior of human operators cannot be fully understood in terms of just mental and physical capabilities. It is necessary also to understand the goals the operator attempts to attain, the system being controlled (aircraft, car, computer), and the influence of the environment in which the system is embedded (including other people). The organization of the text reflects this focus on the human in context by treating four broad thematic areas.

Historical Perspective. This section is designed to prepare the reader for the material in later chapters by providing a fundamental understanding of the human as a component of a system. The concept of human-system integration is introduced with emphasis on systems-level thinking. A brief history chronicles the key role that usability has played in technological progress throughout human history, and documents how the increasing complexity of machinery and manufacturing has given rise to the modern study of human-system integration. Related disciplines (e.g., organizational psychology, engineering psychology) are discussed in terms of how they overlap with, or are different from, human-system integration.

The Environment. The goal of this section is to build awareness of the range of challenges posed by environments that characterize home and work. The key concepts introduced are adaptability and complexity. Because people are adaptable, the demands and incentives of the environment itself are strong determinants of behavior. Reliance on adaptability is seen in management approaches that emphasize a rule-governed, procedural, or incentive-based environment. Limits on adaptability are introduced through a discussion of environmental complexity and its role in human-system performance. Comparisons of fields such as medicine, transportation, and human-computer interaction provide examples of how different environments place different demands on human performance.

This section also introduces the kinds of quantitative techniques that characterize modern human-system analysis. This introduction will familiarize students with task analysis techniques, information theory, finite-state analysis, and signal-detection theory; and provide a brief introduction to human-system modeling. The key organizing concept introduced here, and used throughout the book, is control theory. Control theory is treated at a conceptual level to provide a framework for representing the flow of information in a way that highlights the interaction of all the components of the system. We introduce noise as a real factor in performance, and emphasize the contribution of feedback and lag as issues in human usability. Thus, this section is designed to provide the concepts and knowledge necessary to recognize the potential for user-related issues.

The Human Element. In the first two sections, the human is treated as an adaptable component of the entire system. This section introduces the student to the limits on that adaptability by characterizing human capabilities and limitations in information processing. The control theory framework is again used to represent the flow from perception to situation understanding, from situation understanding to action, and from action back to perception. The key points are not just that people have limited processing capacity, but that we are limited in particular ways which have implications when humans occupy decision-making roles in complex systems. Although all of the many aspects of human behavior are potentially relevant to human-system performance, this section focuses on key characteristics that strongly shape behavior in human-system interactions. To aid students in understanding the range of behavior, we distinguish the characteristics of human behavior associated with the structural properties of the human information-processing system (i.e., the visual and auditory sensory systems, the role of attention in mediating perception, and limits on multitasking) from those associated with the contents of the information-processing system (i.e., memory storage/retrieval and decision making/action selection). Structural factors in general determine the limits on how much information can be processed, whereas content factors determine how that information is used. We emphasize that this distinction is somewhat artificial, in that behavior is ultimately the joint product of these two. Nonetheless, it can be helpful to students in making sense of the large body of literature on human behavior.

Human-System Integration. Up to this point, students have been presented with a broad understanding of the discipline, knowledge of techniques for inquiring into system performance, and how the information-processing and decision characteristics of humans shape performance. In this final section, we present an analysis of an illustrative case history (the Exxon Valdez disaster) with the goal of showing how concepts and principles in the first three sections can be applied to the analysis of real-world situations, again within the context of a control theory framework. The key idea is that common intuition can be replaced by a structured approach to thinking about systems outcomes. Thus, this section examines how the environment, the human element, and the task to be performed come together to affect system performance. Operational constructs of situation awareness, workload, human error, and usability are discussed in terms of the underlying psychological principles developed in the first three sections.

The website (http://www.wiley.com/go/remington) complements the text and is structured around modules. Each module is structured into sections, as appropriate, including the goal of the module, description of the exercise, materials needed, instructions, readings for further information, and reference to the corresponding book section. Some modules contain questions and descriptions of case studies that can be used as the basis for discussion. Others contain interactive exercises that either demonstrate phenomena (e.g., control order) or provide opportunities for students to further explore material described in the book (e.g., task analysis). Links to demonstrations available on the web that illustrate basic psychological phenomena are also provided. Finally, some modules focus on material not covered in depth in the text (e.g., anthropometry). The website is designed to grow over time to include additional modules and materials; we also intend to update the modules to keep the material fresh. For example, we anticipate that as new technologies (for example, the iPhone) are introduced, articles and examples of them will be incorporated into the site. The instructor can tailor these modules to meet various pedagogical goals. Some of the modules will be suitable for undergraduates at the junior or senior level, others more suitable for graduate courses. Instructors also can select web modules as desired to focus on topics as they see fit. Thus, engineering departments may choose modules associated with finite-state modeling of systems, whereas human factors courses may focus on the task analysis modules, and engineering psychology classes may omit both and instead add extra modules on auditory processing. We hope that instructors who adopt the book will contact us with suggestions for new topics that they would like to see covered.

As with any project, this one consumed a great deal of time and effort. We thank those who helped us along the way, including Shayne Loft, Beth Lyall, Jennifer McKneely, and Hal Pashler, who read the book and provided us with many suggestions for improvements (although they should not be faulted for any remaining inaccuracies); and Rebecca Davis, who helped us with reference checking and indexing, as well as editorial feedback. We thank David Kidd, Brian Taylor, and Nicole Werner, who developed the initial ideas and structure for the exercises included in our website. We also thank our (very) patient spouses, Karen Remington, Stuart Davis, and Valerie Folk, who gave us the space we needed to produce the program we desired. Without their support, this project would not have been possible.

Roger Remington, Deborah Boehm-Davis, Charles Folk

Part I Historical Perspective

On 19 April 1770, James Cook, captain of HMS Endeavour, made the first direct recorded observations of the indigenous peoples of Australia, commonly referred to as aboriginals. They were of the Gweagal tribe, whose territory was the area around what is today Sydney in southeastern Australia. The Gweagal were but one of thousands of small groups of hunter-gatherers scattered across the continent. To the European sailors, the aboriginals seemed desperately primitive. For the most part they were naked, bathed in the grease of a native marsupial (the Australian possum) to protect them from the swarming flies and mosquitoes. They built no impressive shelters, nor did they appear to have permanent settlements. The sailors had previously encountered primitive natives in Patagonia along the banks of the Straits of Magellan. Yet the aboriginals seemed to lack even the accoutrements of these primitive natives. Despite this, the aboriginals showed a high degree of social organization, had a remarkable knowledge of the flora and fauna of their territory, possessed an impressive array of hand tools for hunting and fire sticks for keeping fires lit, and were skilled at acquiring ochre and other minerals for painting. As many Europeans would discover to their dismay, their spears could strike with deadly accuracy, and they were skilled in the use of spear throwers. European explorers and settlers were also to discover new tools, as for example, the boomerang and didgeridoo. Everywhere the explorers of the great age of discovery ventured and found people, they found sophisticated tools adapted to the needs of the local people and to which the people owed their existence.

Humans, it seems, are natural engineers. Evolution has imbued us with the capacity and compulsion to sculpt the environment in ways that not only enhance our ability to survive, but also just make the task of living “easier.” Think about it: from the time we wake in the morning until we go to sleep at night, we are surrounded by a world of our own devising, an engineered environment. Alarm clocks wake us; refrigerators keep our food cold; stoves and microwave ovens heat our food; clothes keep us warm; automobiles or trains or buses take us to work, where we communicate and create using telephones, computers, and (the newest of creations) small hand-held devices that instantly put us in contact with even the most remote places in the world. The companies and institutions in which we work are themselves engineered environments. The rigid hierarchy of one company shares with the free-flowing egalitarianism of its competitor the fact that each was created to fulfill a specific vision, to achieve a goal. On a larger scale, our society, though much more complex and difficult to manage, is itself a product of our own engineering. Laws are made with the express intent of achieving some societal outcome. Even customs are often vestiges of explicit solutions whose ancestry may or may not be traceable.

It appears that we were engineers from the very beginning. Using mitochondrial DNA (mtDNA) passed from mother to offspring, geneticists trace a common female ancestor of all living Homo sapiens, “Eve,” to around 200,000 years ago (Cann, Stoneking, & Wilson, 1987; Penny, Steel, Waddell, & Hendy, 1995). Similar analyses of the male Y chromosome yield a roughly comparable date (Cavalli-Sforza & Feldman, 2003; Mitchell & Hammer, 1996). Yet, archeologists have found flaked stone tools for cutting and hewing, made with considerable skill, dating from about 2.5–2.6 million years ago (Dominguez-Rodrigo, Rayne Pickering, Semaw, & Rogers, 2005; Sileshi, 2000). Not only are we engineers, we are descended from engineers. Indeed, it is not too speculative to suggest that our prowess as engineers facilitated our success as a species. There was a shift in climate around the Pliocene-Pleistocene boundary roughly 2.5–1.8 million years ago in which grasslands took over from dense forest, exposing our ancestors to new and dangerous challenges (Bobe, Behrensmeyer, & Chapman, 2002; DeMenocal, 2004; Reed, 1997). This created something of an evolutionary bottleneck: of the many proto-humans who existed at the time, only a few thousand emerged to give rise to the Homo sapiens of today. It may well have been our ability to engineer our societies and our tools that made it possible to survive.

It is true that many animals also engineer tools and alter their environment. Birds build nests, beavers build dams and lodges, termites and bees build hives, and crows have been observed to drop shells from above onto rocks to break them open. What makes us different is not just that we do more tool-making or more environmental modification (damage, if you are of one ideological persuasion). More so than any other species, we humans seem to come equipped with characteristics particularly adapted to engineering.

For example, one important skill for engineers is the ability to build on previous successes. This skill requires being able to observe and learn from the behavior of others. In turn, learning from others includes formal instruction—another engineered system, devised for a purpose—but also informal learning, which occurs by mimicking the behavior of other members of a society. Studies have shown that children will watch an adult or an older child and repeat the actions they observe. In one experiment with Australian and African children (Nielsen & Tomaselli, 2010), children watched an adult go through an elaborate series of steps to open a rather odd-looking box. When presented with the box to open, the vast majority of children mimicked the actions they had seen. The interesting outcome is that children above the age of four tended to mimic even when they knew an easier way to open the box, whereas children under four tended to use the simpler method they knew to get the box open.

This kind of mimicry is not characteristic of even our closest kin, the great apes. It appears that as human children develop, they reach a stage of social maturity where it becomes important to pattern their behavior after that of other members of the group. The importance of this patterning is not simply that it builds social acceptance and cultural identity, but that it provides a natural mechanism for the transmission of skills, one of which would be the design and manufacture of tools. By observation, then, without overt instruction, children learn to manipulate the world in the ways that others of their group do. In this example, we see the foundation for the accumulation of skill and knowledge, and for its transmission from one generation to another.

So, it is abundantly clear that humans are uniquely equipped to engineer their environments. Indeed, we live in a world full of overlapping engineered systems of which we all are a part. But how “good” are these systems? To what extent do they achieve their intended goals? How efficiently, reliably, and safely do they do so? One could argue that systems engineering simply follows a kind of natural selection process, with better systems “surviving” in such a way that there is always movement toward better and better (i.e., more efficient and reliable) systems. The development of tools is certainly one example of this kind of process.

In the past seventy years or so, this process has been accelerated by applying the tools of science to the evaluation and development of engineered systems (see, e.g., Fitts, 1958). Driven by wartime increases in the technological complexity of the “tools of war,” as well as their often-puzzling failures, psychologists and engineers began to systematically study the kinds of factors that influence the success and failure of human–machine systems in general. What has become clear from this study of “human engineering” is that understanding such systems requires a careful analysis of the environment, the human participant, and their interaction.

This book addresses the central conceptual issues associated with each of these three facets of human engineering. It is not meant to be an exhaustive compendium of the relevant research in these areas. Rather, it is meant to introduce students to the main concepts, assumptions, and approaches that have emerged in the study of human engineering. More detailed study of particular issues is available in the accompanying online modules. The book is organized into four sections. Part I provides historical context for the modern study of human-engineered systems, and also gives an overview of some of the real-world settings in which human engineering has been successfully applied. Most of the examples are drawn from aviation domains. In part this is because of the intense and long-standing concern over human error and safety in commercial aviation, as well as performance in military aviation. It is also because aviation environments demand much of the human operators, be they pilots, air traffic controllers, or maintenance workers. Where possible, we include examples from medicine, computer science, and driving. It must be noted, however, that the systematic study of human behavior is a much more recent development in those domains than in aviation. Part II focuses on the nature of environments, how they differ, what creates complexity, and techniques for modeling those environments. Part III focuses on the nature of humans, and their capabilities and limitations. We constrain our treatment of the vast literature on human behavior by shaping the discussion around characteristics that determine which of the many sensory events are perceived, how we construct meaning from sensory input, and how we select an action from many possible actions. 
Finally, Part IV addresses how the structure and content of the human information-processing system influences the capabilities and limitations of human performance, and shows how these characteristics interact with the nature of environments to affect human error and system safety.

REFERENCES

Bobe, R., Behrensmeyer, A. K., & Chapman, R. E. (2002). Faunal change, environmental variability and late Pliocene hominid evolution. Journal of Human Evolution, 42(4), 475–497. doi:10.1006/jhev.2001.0535

Cann, R. L., Stoneking, M., & Wilson, A. C. (1987). Mitochondrial DNA and human evolution. Nature, 325(6099), 31–36. doi:10.1038/325031a0

Cavalli-Sforza, L. L., & Feldman, M. W. (2003). The application of molecular genetic approaches to the study of human evolution. Nature Genetics, 33, 266–275. doi:10.1038/ng1113

DeMenocal, P. B. (2004). African climate change and faunal evolution during the Pliocene-Pleistocene. Earth and Planetary Science Letters, 220(1–2), 3–24. doi:10.1016/S0012-821X(04)00003-2

Dominguez-Rodrigo, M., Rayne Pickering, T., Semaw, S., & Rogers, M. J. (2005). Cutmarked bones from Pliocene archaeological sites at Gona, Afar, Ethiopia: Implications for the function of the world’s oldest stone tools. Journal of Human Evolution, 48(2), 109–121. doi:10.1016/j.jhevol.2004.09.004

Fitts, P. M. (1958). Engineering psychology. Annual Review of Psychology, 9, 267–294. doi:10.1146/annurev.ps.09.020158.001411

Mitchell, R. J., & Hammer, M. F. (1996). Human evolution and the Y chromosome. Current Opinion in Genetics & Development, 6(6), 737–742. doi:10.1016/S0959-437X(96)80029-3

Nielsen, M., & Tomaselli, K. (2010). Overimitation in Kalahari bushman and the origins of human cultural cognition. Psychological Science, 21, 729–736.

Penny, D., Steel, M., Waddell, P. J., & Hendy, M. D. (1995). Improved analyses of human mtDNA sequences support a recent African origin for Homo sapiens. Molecular Biology and Evolution, 12(5), 863–882.

Reed, K. E. (1997). Early hominid evolution and ecological change through the African Plio-Pleistocene. Journal of Human Evolution, 32(2–3), 289–322. doi:10.1006/jhev.1996.0106

Sileshi, S. (2000). The world’s oldest stone artefacts from Gona, Ethiopia: Their implications for understanding stone technology and patterns of human evolution between 2.6–1.5 million years ago. Journal of Archaeological Science, 27(12), 1197–1214. doi:10.1006/jasc.1999.0592

Chapter 1 Natural and Engineered Systems

As its title suggests, this book is concerned with how humans interact with engineered systems. This immediately raises questions as to what we mean by an engineered system, what other systems might exist, and how an engineered system differs from other systems. Can the natural environment be considered an engineered system with evolution (natural selection) as the design driver? If not, what characteristics distinguish evolution through natural selection from the sort of engineered systems that are the topics of this book? Where can we draw the boundary?

In our view, the difference between natural and engineered systems is a function of three factors:

Design for a purpose

Design for a certain class of users

Design against failure

PURPOSEFUL DESIGN

Engineered systems have a goal, a purpose lacking in natural design. The modern scientific view of evolution (i.e., natural design) holds that there is no goal at the level of an individual organism, a species, or an ecosystem as a whole. Rather, evolution uses mutation to generate diversity and natural selection to eliminate variants that are less competitive. According to modern theory, the world we see around us is the result of billions of such experiments having been conducted over billions of years. The natural world exists as it does because it worked, not because someone wanted it to work that way.