Trade-off Analytics (E-Book)

Description

Presents information to create a trade-off analysis framework for use in government and commercial acquisition environments

This book presents a decision management process based on decision theory and cost analysis best practices, aligned with ISO/IEC 15288, the Systems Engineering Handbook, and the Systems Engineering Body of Knowledge. It provides a sound trade-off analysis framework for generating the tradespace and evaluating value and risk to support system decision-making throughout the life cycle. Trade-off analysis and risk analysis techniques are examined, and the authors present an integrated value trade-off and risk analysis framework based on decision theory. These trade-off analysis concepts are illustrated across the life cycle stages using multiple examples from defense and commercial domains.

  • Provides techniques to identify and structure stakeholder objectives and creative, doable alternatives
  • Presents the advantages and disadvantages of tradespace creation and exploration techniques for trade-off analysis of concepts, architectures, design, operations, and retirement
  • Covers the sources of uncertainty in the system life cycle and examines how to identify, assess, and model uncertainty using probability
  • Illustrates how to perform a trade-off analysis using the INCOSE Decision Management Process using both deterministic and probabilistic techniques 

Trade-off Analytics: Creating and Exploring the System Tradespace is written for upper-level undergraduate students and graduate students studying systems design, systems engineering, industrial engineering, and engineering management. This book also serves as a resource for practicing systems designers, systems engineers, project managers, and engineering managers.


Gregory S. Parnell, PhD,
is a Research Professor in the Department of Industrial Engineering at the University of Arkansas. He is also a senior principal with Innovative Decisions, Inc., a decision and risk analysis firm, where he has served as Chairman of the Board. Dr. Parnell has published more than 100 papers and book chapters and was lead editor of Decision Making in Systems Engineering and Management, Wiley Series in Systems Engineering (2nd Ed., Wiley 2011) and lead author of the Handbook of Decision Analysis (Wiley 2013). He is a fellow of INFORMS, INCOSE, MORS, and the Society for Decision Professionals.

Page count: 1051

Year of publication: 2016


Table of Contents

Title Page

Copyright

Wiley Series in Systems Engineering and Management

List of Contributors

About the Authors

Foreword

Preface

Need for More Effective Trade Studies

Audience

Themes

Book Organization

Course Outlines Using the Textbook

Notional Course Objectives

Illustrative Academic Course Outlines

Illustrative Professional Short Course Outline

Reference

Acknowledgments

International Council on Systems Engineering (INCOSE) Corporate Advisory Board (CAB)

INCOSE Technical Directors

INCOSE Decision Analysis Working Group

Chapter Authors

Chapter Reviewers

Department of Industrial Engineering at the University of Arkansas

Research Assistant

Final Note

About the Companion Website

Chapter 1: Introduction to Trade-Off Analysis

1.1 Introduction

1.2 Trade-off Analyses Throughout the Life Cycle

1.3 Trade-off Analysis to Identify System Value

1.4 Trade-off Analysis to Identify System Uncertainties and Risks

1.5 Trade-off Analyses Can Integrate Value and Risk Analysis

1.6 Trade-off Analysis in the Systems Engineering Decision Management Process

1.7 Trade-off Analysis Mistakes of Omission and Commission

1.8 Overview of the Book

1.9 Key Terms

1.10 Exercises

References

Chapter 2: A Conceptual Framework and Mathematical Foundation for Trade-Off Analysis

2.1 Introduction

2.2 Trade-Off Analysis Terms

2.3 Influence Diagram of the Tradespace

2.4 Tradespace Exploration

2.5 Summary

2.6 Key Words

2.7 Exercises

References

Chapter 3: Quantifying Uncertainty

3.1 Sources of Uncertainty in Systems Engineering

3.2 The Rules of Probability and Human Intuition

3.3 Probability Distributions

3.4 Estimating Probabilities

3.5 Modeling Using Probability

3.6 Summary

3.7 Key Terms

3.8 Exercises

References

Chapter 4: Analyzing Resources

4.1 Introduction

4.2 Resources

4.3 Cost Analysis

4.4 Affordability Analysis

4.5 Key Terms

4.6 Exercises

References

Chapter 5: Understanding Decision Management

5.1 Introduction

5.2 Decision Process Context

5.3 Decision Process Activities

5.4 Summary

5.5 Key Terms

5.6 Exercises

References

Chapter 6: Identifying Opportunities

6.1 Introduction

6.2 Knowledge

6.3 Decision Traps

6.4 Techniques

6.5 Tools

6.6 Illustrative Examples

6.7 Key Terms

6.8 Exercises

References

Chapter 7: Identifying Objectives and Value Measures

7.1 Introduction

7.2 Value-Focused Thinking

7.3 Shareholder and Stakeholder Value

7.4 Challenges in Identifying Objectives

7.5 Identifying the Decision Objectives

7.6 The Financial or Cost Objective

7.7 Developing Value Measures

7.8 Structuring Multiple Objectives

7.9 Illustrative Examples

7.10 Summary

7.11 Key Terms

7.12 Exercises

References

Chapter 8: Developing and Evaluating Alternatives

8.1 Introduction

8.2 Overview of Decision-making, Creativity, and Teams

8.3 Alternative Development Techniques

8.4 Assessment of Alternative Development Techniques

8.5 Alternative Evaluation Techniques

8.6 Assessment of Alternative Evaluation Techniques

8.7 Key Terms

8.8 Exercises

References

Chapter 9: An Integrated Model for Trade-Off Analysis

9.1 Introduction

9.2 Conceptual Design Example

9.3 Integrated Approach Influence Diagram

9.4 Other Types of Trade-Off Analysis

9.5 Simulation Tools

9.6 Summary

9.7 Key Terms

9.8 Exercises

References

Chapter 10: Exploring Concept Trade-Offs

10.1 Introduction

10.2 Defining the Concept Space and System Concept of Operations

10.3 Exploring the Concept Space

10.4 Trade-off Analysis Frameworks

10.5 Tradespace and System Design Life Cycle

10.6 From Point Trade-offs to Tradespace Exploration

10.7 Value-Based Multiattribute Tradespace Analysis

10.8 Illustrative Example

10.9 Conclusions

10.10 Key Terms

10.11 Exercises

References

Chapter 11: Architecture Evaluation Framework

11.1 Introduction

11.2 Key Considerations in Evaluating Architectures

11.3 Architecture Evaluation Elements

11.4 Steps in an Architecture Evaluation Process

11.5 Example Evaluation Taxonomy

11.6 Summary

11.7 Key Terms

11.8 Exercises

References

Chapter 12: Exploring the Design Space

12.1 Introduction

12.2 Example 1: Liftboat

12.3 Example 2: Cruise Ship Design

12.4 Example 3: NATO Naval Surface Combatant Ship

12.5 Key Terms

12.6 Exercises

References

Chapter 13: Sustainment Related Models and Trade Studies

13.1 Introduction

13.2 Availability Modeling and Trade Studies

13.3 Sustainment Life Cycle Cost Modeling and Trade Studies

13.4 Optimization in Availability Trade Studies

13.5 Monte Carlo Modeling

13.6 Chapter Summary

13.7 Key Terms

13.8 Exercises

References

Chapter 14: Performing Programmatic Trade-Off Analyses

14.1 Introduction

14.2 System Acceptance Decisions and Trade Studies

14.3 Product Cancelation Decision Trade Study

14.4 Product Retirement Decision Trade Study

14.5 Key Terms

14.6 Exercises

References

Chapter 15: Summary and Future Trends

15.1 Introduction

15.2 Major Trade-Off Analysis Themes

15.3 Future of Trade-Off Analysis

15.4 Summary

References

Index

End User License Agreement

List of Illustrations

Chapter 1: Introduction to Trade-Off Analysis

Figure 1.1 Eagle's beak chart

Figure 1.2 INCOSE decision management process

Figure 1.3 Relationships among trade-off study mistakes and impacts

Figure 1.4 Outline of the book

Chapter 2: A Conceptual Framework and Mathematical Foundation for Trade-Off Analysis

Figure 2.1 Overview of integrated trade-off value, cost, and risk analysis

Figure 2.2 Single-dimensional value functions

Chapter 3: Quantifying Uncertainty

Figure 3.1 Uncertainty in gambling

Figure 3.2 Venn diagram

Figure 3.3 Normal distribution

Figure 3.4 Weibull distribution

Figure 3.5 Beta distribution

Figure 3.6 Triangular distribution

Figure 3.7 Default decision

Figure 3.8 Bayes net example

Figure 3.9 Monte Carlo simulation

Figure 3.10 Tornado diagram for sensitivity analysis

Chapter 4: Analyzing Resources

Figure 4.1 Resource space

Figure 4.2 The three components of the resource space

Figure 4.3 Example of skills for people resources

Figure 4.4 Example of types of facility resources

Figure 4.5 Cost resource by class

Figure 4.6 Example resource framework

Figure 4.7 Life cycle costing concept map (Source: Parnell et al. 2011. Reproduced with permission of John Wiley & Sons)

Figure 4.8 Power CERs

Figure 4.9 Exponential CER

Figure 4.10 Logarithm CER

Figure 4.11 Regression plot

Figure 4.12 Normal probability plot

Figure 4.13 Residual plot

Figure 4.14 Plot of learning curves (Source: Parnell et al. 2011. Reproduced with permission of John Wiley & Sons)

Figure 4.15 Ninety percent learning curve for cumulative average cost and unit cost for 32 units

Figure 4.16 Fitted line plot

Figure 4.17 Example net present value (NPV) tornado chart for a 5-year program

Figure 4.18 Example cumulative net present value hurricane chart

Figure 4.19 Triangular distribution for embedded flight software

Figure 4.20 PDF for total person-months for embedded flight software development

Figure 4.21 PDF for software development cost

Figure 4.22 CDF for total software development costs

Figure 4.23 Sensitivity chart

Figure 4.24 “Big A” and “little a” (Source: MORS Affordability Analysis Community of Practice 2015)

Figure 4.25 Affordability analysis framework (Source: Courtesy of the Military Operations Research Society Affordability Analysis Community of Practice (or MORS AA CoP))

Figure 4.26 Lean six sigma high-level overview product example (Source: Courtesy of the Military Operations Research Society Affordability Analysis Community of Practice (or MORS AA CoP))

Chapter 5: Understanding Decision Management

Figure 5.1 Decision analysis process (Courtesy of Matthew Cilli)

Figure 5.2 Integrated Systems Engineering Decision Management (ISEDM) Process Map

Figure 5.3 Trade-off studies throughout the system's development life cycle

Figure 5.4 Key properties of a high-quality set of fundamental objectives

Figure 5.5 Example of an objectives hierarchy

Figure 5.6 Value function examples

Figure 5.7 Swing weight matrix

Figure 5.8 Objectives hierarchy for sUAV example

Figure 5.9 Graphical representations of value function graphs for sUAV example

Figure 5.10 Weights for sUAV example

Figure 5.11 Assessment flow diagram (AFD) for a hypothetical gun design choice activity (lead author's original graphic)

Figure 5.12 Radar value graph structure

Figure 5.13 Tornado graph structure

Figure 5.14 Value component graph structure

Figure 5.15 Stakeholder value scatterplot structure

Figure 5.16 Value component chart for sUAV

Figure 5.17 Value scatterplot for the sUAV example

Figure 5.18 Weightings as generated by focus group 1 and focus group 2

Figure 5.19 Weight sensitivity line graph structure

Figure 5.20 Stakeholder value scatterplot with uncertainty

Figure 5.21 sUAV performance value sensitivity to changes in priority weight of “avoid impeding soldier sprint” objectives

Figure 5.22 sUAV stakeholder value scatterplot with uncertainty

Figure 5.23 Decision support model construct

Chapter 6: Identifying Opportunities

Figure 6.1 Opportunity space role in tradespace development

Figure 6.2 Potential impact of an ill-framed opportunity space role in tradespace development

Figure 6.3 Format of a decision hierarchy (Source: Parnell et al. 2013. Reproduced with permission of John Wiley & Sons)

Figure 6.4 Example of a vision statement

Figure 6.5 Influence diagram – same as Figure 2.1

Figure 6.6 Commercial drone decision hierarchy example

Figure 6.7 Example influence diagram for desired capability

Chapter 7: Identifying Objectives and Value Measures

Figure 7.1 Comparison of objectives and functional value hierarchy

Figure 7.2 Squad functional value hierarchy showing decomposition from function to objective to measure

Chapter 8: Developing and Evaluating Alternatives

Figure 8.1 Two phases of alternative development (Source: Parnell et al. 2013. Reproduced with permission of John Wiley & Sons)

Figure 8.2 Morphological box for bicycle suspension system (Source: Ullman 2010. Reproduced with permission of McGraw-Hill Education)

Figure 8.3 Allocation of functions to physical components for interceptor system architecture alternatives (Adapted with permission of Salvatore F. (2008). The Value of Architecture. NDIA 11th Annual Systems Engineering Conference. San Diego, US-CA)

Figure 8.4 Generic physical architecture of a hammer

Figure 8.5 Morphological box used to instantiate architectures (Source: Buede 2009. Reproduced with permission of John Wiley & Sons)

Figure 8.6 Strategy table for fifth-generation Corvette (Adapted from Barrager 2001)

Figure 8.7 Zigzagging between domains (from Szatkowski 2000)

Figure 8.8 Typical QFD house of quality matrix

Figure 8.9 Typical QFD house of quality matrix mapping by transposing previous HOQ header row to the next HOQ column

Chapter 9: An Integrated Model for Trade-Off Analysis

Figure 9.1 Concept diagram for the integrated trade-off analysis

Figure 9.2 Integrated approach influence diagram

Figure 9.3 Functional allocation to subsystems

Figure 9.4 Value component chart for the assault scenario

Figure 9.5 Deterministic Pareto chart for the assault scenario

Figure 9.6 The integrated approach

Figure 9.7 Stochastic Pareto chart for the assault scenario

Figure 9.8 Value cumulative distribution chart for the assault scenario

Figure 9.9 Value and cost stochastic tornado diagrams for the performance alternative from the assault scenario

Figure 9.10 Value measure and cost component linkage to system features

Figure 9.11 Example of value measure and cost component linkages to system features

Figure 9.12 Value model named range data entry setup and SIPmath ribbon

Chapter 10: Exploring Concept Trade-Offs

Figure 10.1 Concept, architecture, design, and system abstraction layer examples

Figure 10.2 Example tradespace reflecting designer-controlled parameterized concepts in terms of stakeholder value metrics (Source: Ross, Massachusetts Institute of Technology, 2006. Reproduced with permission of Ross)

Figure 10.3 Concept of flexibility in tradespaces (Source: Ross 2005. Reproduced with permission of John Wiley & Sons)

Figure 10.4 A trade-off hyperspace for an unmanned vehicle

Figure 10.5 System CONOPS ontology (Adapted from Madni 2015c,d)

Figure 10.6 Anatomy of a real option (Source: Mikaelian et al. 2008. Reproduced with permission of Donna H. Rhodes)

Figure 10.7 Four types of trade-offs: 1) local points, 2) frontier points, 3) frontier sets, and 4) full tradespace exploration (Source: Ross 2005. Reproduced with permission of John Wiley & Sons)

Figure 10.8 Multiconcept tradespace with sensor swarms, aircraft, satellites, and systems of systems (SoS) composed of pairs of assets (Source: Chattopadhyay, Massachusetts Institute of Technology, 2009. Reproduced with permission of Ross)

Figure 10.9 Preference change by stakeholder shifts tradespace (Source: Ross 2005. Reproduced with permission of John Wiley & Sons)

Figure 10.10 Example tradespace: uncertainty mitigation through system portfolios (Source: Walton 2002. Reproduced with permission of the Massachusetts Institute of Technology)

Figure 10.11 Tradespace generation and exploration

Figure 10.12 Decision-maker to attribute mapping for a maritime security system

Figure 10.13 Single-attribute utility (SAU) curves for the security mission

Figure 10.14 GUI screenshot for the maritime security agent-based discrete event simulator

Figure 10.15 Example point design for a maritime security system (N = 1)

Figure 10.16 Example Pareto frontier points for a maritime security system (N = 6)

Figure 10.17 Example full tradespace and Pareto frontier set for maritime security system (N = 8586)

Figure 10.18 Tradespaces colored by number of Hermes and Shadows

Figure 10.19 Tradespaces colored by number of manned patrol boats and type of command authority

Figure 10.20 Tradespaces colored by operators per UAV and number of geographic zones

Chapter 11: Architecture Evaluation Framework

Figure 11.1 Role of architecture in the decision space

Figure 11.2 Architecture evaluation in context of other architecture processes

Figure 11.3 Addressing stakeholder concerns through the use of views and models

Figure 11.4 Architecture evaluation framework

Figure 11.5 Objectives-driven architecture evaluation

Figure 11.6 Architecture evaluation approaches: choosing one or more “lines of attack”

Figure 11.7 Value assessment methods: addressing stakeholder concerns

Figure 11.8 Architecture analysis methods: measuring architecture attributes

Figure 11.9 Architecture measurement scales and protocols

Figure 11.10 Key elements in the measurement process of ISO/IEC 15939

Figure 11.11 Business impact methods example

Figure 11.12 Mission impact methods example

Figure 11.13 Architecture attributes example

Chapter 12: Exploring the Design Space

Figure 12.1 Liftboat docked at the Bollinger Shipyard in Louisiana (Courtesy of Cliff Whitcomb)

Figure 12.2 Liftboat leg internals using a plate construction (Courtesy of Cliff Whitcomb)

Figure 12.3 Liftboat leg internals using a lattice construction (Courtesy of Cliff Whitcomb)

Figure 12.4 Liftboat tradespace for the two responses with respect to the three factors

Figure 12.5 Liftboat design tradespace of displacement and lift weight versus leg length and leg diameter, at a leg thickness of 1.875 in

Figure 12.6 Distribution of liftboat model displacement outputs for N = 5000

Figure 12.7 Trade-off space for cruise ships with variables set to the point with the highest revenue

Figure 12.8 Cruise ship design tradespace of revenue, acquisition cost, operating cost, and beam versus passenger capacity and brand quality

Figure 12.9 Cruise ship design tradespace of revenue, acquisition cost, operating cost, and beam versus passenger capacity and brand quality

Figure 12.10 Notional FSSF ship design (Source: NATO. Reproduced with permission of NATO)

Figure 12.11 Stakeholder value functions for the various high-level needs

Figure 12.12 Cost versus OMOE for FSSF surface combatant variants

Figure 12.13 Magnified plot region showing only Pareto frontier of nondominated variants for cost versus OMOE for FSSF surface combatants

Figure 12.14 Effects plot for NATO FSSF surface combatant ship

Figure 12.15 OMOE effects Pareto

Figure 12.16 Cost effects Pareto

Figure 12.17 Trade-off space of speed versus payload (Ship A11)

Figure 12.18 Trade-off space of speed versus payload where payload is restricted and the system becomes infeasible (Ship A11a)

Figure 12.19 Trade-off space of speed versus range (Ship A11a)

Figure 12.20 Trade-off space of speed versus range (Ship A11b)

Figure 12.21 Trade-off space of speed versus payload (Ship A11b)

Figure 12.22 Trade-off space of speed versus payload (Ship A11c)

Figure 12.23 Trade-off space of margin and payload (Ship A11d)

Chapter 13: Sustainment Related Models and Trade Studies

Figure 13.1 System operational concept

Figure 13.2 System operational concept (cont.)

Figure 13.3 Mission timeline

Figure 13.4 System reliability block diagram

Figure 13.5 Influence diagram for the FMDS availability model

Figure 13.6 Reduction in availability due to each failure mode

Figure 13.7 Cost category contributions to the TSLCC

Figure 13.8 A_o as a function of the maximum flight time (T_mf), number of drones per unit (N_dpu), and number of control elements (N_cpu)

Figure 13.9 A_o as a function of total system life cycle cost (TSLCC) for the designs provided in Figure 13.8 (N_cpu = 3)

Figure 13.10 The “A_o Input Parameters” portion of the Control Panel tab

Figure 13.11 The “Decision Variables, Constraints, and Results” portion of the Control Panel tab

Figure 13.12 The Calculations tab

Figure 13.13 The life cycle cost tab

Figure 13.14 Solver window

Figure 13.15 Optimization results

Figure 13.16 Availability/TSLCC tradespace (T_tf = 4 h)

Figure 13.17 Tornado diagram for N_c = 7 and N_d = 6

Figure 13.18 Postflight preparation time triangular distribution

Figure 13.19 Cumulative density functions for control/drone combinations

Figure 13.20 Tornado diagram for the FMDS system

Chapter 14: Performing Programmatic Trade-Off Analyses

Figure 14.1 Generic decision tree for acceptance decision

Figure 14.2 Excel model used to solve the example problem

Figure 14.3 Graph of risk associated with each decision option

Figure 14.4 Receiver operating characteristic curve (for T_d = 4000 h, N_fat = 34)

Figure 14.5 Receiver operating characteristic curves (for T_d = 4000 h, N_fat1 = 34, N_fat2 = 38)

Figure 14.6 Comparison of receiver operating characteristic curves for different test designs

Figure 14.7 Acceptance decision model

Figure 14.8 Expected cost versus decision option

Figure 14.9 R coding for the predictive model

Figure 14.10 p-value significance and AIC scores for Model A

Figure 14.11 Actual project outcome versus prediction for Model A test data

Figure 14.12 ROC curve for training set and test set for Model A

Figure 14.13 Confusion matrix for train and test of Model A

Figure 14.14 Accuracy measures for training data set Model A

Figure 14.15 Accuracy measures for test data set Model A

Figure 14.16 Significance p-values and AIC for Model B

Figure 14.19 Accuracy measures for Model B

Figure 14.20 Actual project outcomes versus predictions for Predictive Model B

Figure 14.21 Generic acquisition phases, decision points, and incorporation of Predictive Model B use in DoDI 5000.02, Operation of the Defense Acquisition System, in determining if the program is viable and whether contractors are fully capable of delivering a successful system within scope, cost, quality, and schedule

Figure 14.22 Incrementally deployed software-intensive program milestone decisions, decision points, and incorporation of Predictive Model B in DoDI 5000.02, Operation of the Defense Acquisition System, to inform these critical software project decisions in determining project cancelation or continuation

Figure 14.23 Project management, ISO/IEC/IEEE 15288 and 12207 systems and software life cycle processes overlay and predictive model incorporation

Figure 14.24 Reactor compartment packages buried in Trench 94 at DOE Hanford Nuclear Reservation in Washington state as of November 2009 (Source: Knot 2012)

Figure 14.25 USS Enterprise, the first nuclear-powered aircraft carrier, commissioned in 1961 and decommissioned in 2011 (Source: US Department of the Navy 2015)

Figure 14.26 Enterprise reactor compartment package barge loading concept for preferred alternative (Source: Knot 2012)

Figure 14.27 The major components of a generic offshore oil and gas platform (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc)

Figure 14.28 Decision tree showing the decommissioning alternatives considered in the study. Options with green boxes were analyzed in greater detail and gray boxes were omitted from quantitative analysis (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc) (The reader is referred to the online version of this book for color indication)

Figure 14.29 Graphical user interface (GUI) for PLATFORM with separated components to define decision options, perform quantitative cost analysis of the scenarios, and conduct multiattribute analysis including all attributes (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc)

Figure 14.30 Analytica influence diagram showing selected variables and influences involved in calculating the programmatic costs for decommissioning (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc)

Figure 14.31 Influence diagram showing how the multiattribute analysis is based on the results of analysis of the eight key attributes used to evaluate the costs and benefits of alternative decommissioning options (Source: Henrion et al. 2015)

Figure 14.32 Definition of levels for impact on marine mammals, a qualitative attribute; each level contains a description and the conditions that would give rise to it. Scores of 70% and 50% are example scores to illustrate user input (Source: Henrion et al. 2015)

Figure 14.33 User interface screen to assist users in assessing swing weights for each attribute by estimating the value to a stakeholder of changing each attribute from its worst to its best outcome, relative to the most important attribute. Cost is identified as the most important attribute and assigned a swing weight of 100 (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc)

Figure 14.34 Range sensitivity analysis tornado chart of the difference in value between complete removal and partial removal for platform Harmony, changing the swing weights for each attribute from 0 low to 100 high and cost uncertainty from 10th to 90th percentile while keeping the other variables as their base values (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc)

Figure 14.35 Preferred decision, partial removal or complete removal for each platform according to the swing weight set for Strict Compliance. The bottom row shows the number of platforms recommended for complete removal. The platforms are ordered by depth (Source: http://www.lumina.com/case-studies/a-win-win-solution-for-californias-offshore-oil-rigs. Reproduced with permission of Max Henrion, Lumina Decision Systems, Inc)

Figure 14.36 The Vee diagram and the placement of the retirement and decommissioning activity in the project life cycle (Source: U.S. Department of Transportation 2013)

Chapter 15: Summary and Future Trends

Figure 15.1 Example developmental training program for systems engineers

List of Tables

Chapter 1: Introduction to Trade-Off Analysis

Table 1.1 Partial List of Decision Opportunities throughout the Life Cycle

Table 1.2 Sources of Systems Risk

Table 1.3 Decision Management Process

Table 1.4 Trade-Off Mistakes

Table 1.5 Illustrative Examples

Chapter 2: A Conceptual Framework and Mathematical Foundation for Trade-Off Analysis

Table 2.1 Key Terms with Examples

Chapter 3: Quantifying Uncertainty

Table 3.1 Distribution of Injuries From a Major Accident

Table 3.2 Distributions and Their Parameters and Functions

Table 3.3 Vehicle Mass Combinations

Chapter 4: Analyzing Resources

Table 4.1 Example Soft Skills

Table 4.2 Example Hard Skills

Table 4.3 Example Set of Hard and Soft Skills for Management

Table 4.4 Example Set of Roles and Functions for People Resources

Table 4.5 Facility Examples

Table 4.6 LCC Techniques by Life Cycle Stage

Table 4.7 AACE International Cost Estimate Classification Matrix

Table 4.8 Unmanned Aerial Vehicle (UAV) Work Breakdown Structure (WBS)

Table 4.9 Linear Transformations for CERs

Table 4.10 Square Footage and Facility Costs for Manufacturing Facility Construction

Table 4.11 Factors for Various Learning Rates

Table 4.12 Unit Cost and Cumulative Average Cost

Table 4.13 Accounting System Data

Table 4.14 Natural Logarithm of Cumulative Units Completed and Cumulative Average Hours

Table 4.15 Example Net Present Value Calculations

Table 4.16 Example Net Present Value Inflation and Interest Rates

Chapter 5: Understanding Decision Management

Table 5.1 Crosswalk Between SE Terms in Figure 5.2 and INCOSE Systems Engineering Handbook V4 and ISO/IEC/IEEE 15288:2015

Table 5.2 Crosswalk Between Fundamental Objectives and Stakeholder Need Statements

Table 5.3 Illustrating the One-to-One Mapping of Objective and Measure

Table 5.4 Properties of a High-Quality Measure

Table 5.5 Measures for sUAV Example

Table 5.6 End and Inflection Points of sUAV Value Functions

Table 5.7 sUAV Physical Architecture Description

Table 5.8 Descriptions for Buzzard I, Buzzard II, Cardinal I, and Cardinal II

Table 5.10 Descriptions for Robin I, Robin II, Dove I, and Dove II

Table 5.11 Physical Architecture to Fundamental Objective Mapping

Table 5.12 Structured Scoring Sheet for a Given Measure

Table 5.13 Consequence Scorecard Structure

Table 5.14 Consequence Scorecard Example for sUAV Case Study

Table 5.15 Value Scorecard Structure

Table 5.16 Value Scorecard for sUAV Example

Chapter 6: Identifying Opportunities

Table 6.1 Stakeholder Analysis Techniques

Table 6.2 Advantages and Disadvantages of Popular Survey Methods

Chapter 7: Identifying Objectives and Value Measures

Table 7.1 Preference for Types of Value Measure

Table 7.2 Value Model Structure

Chapter 8: Developing and Evaluating Alternatives

Table 8.1 The Modes for Making Decisions

Table 8.2 Belbin's Nine Roles for Team Members

Table 8.3 Ullman's Suggestions for Increasing Team Performance

Table 8.4 Ten-Step Process for Controlled Convergence

Table 8.5 Example of Generic Solutions and Specific Alternatives Using the First TRIZ Inventive Principle

Table 8.6 First TRIZ Inventive Principle Interpreted for Marketing, Sales, and Advertising

Table 8.7 Assessment of Alternative Development Techniques

Table 8.8 Initial Quantitative Measures of Value for Alternatives

Table 8.9 Initial Scoring of Pugh Matrix

Table 8.10 Updated Scoring of Pugh Matrix with Different Datum

Table 8.11 Best Practices in Forming a Basis for Good Decisions

Table 8.12 Assessment of Alternative Evaluation Techniques

Chapter 9: An Integrated Model for Trade-Off Analysis

Table 9.1 Squad Enhancement Alternatives and System Features

Table 9.2 Swing Weight Matrix for the Assault Scenario

Table 9.3 Swing Weight Matrix for the Defense Scenario

Table 9.4 Value Measures for the Squad Enhancement Design Example

Table 9.5 Squad Scores on Each Value Measure

Table 9.6 Squad Enhancement Design Example Value Functions

Chapter 10: Exploring Concept Trade-Offs

Table 10.1 Example Trade-Off Analysis Frameworks

Table 10.2 Example Value Functions

Table 10.3 Comparison of Tradespace Exploration with Optimization and Decision-Theoretic Approaches

Table 10.4 List of Attributes for Maritime Security Case

Table 10.5 System Concepts

Table 10.6 Design Variables Parametrizing System's Form and CONOPs

Table 10.7 Attributes, Cost, and MAU for Selected Pareto Points

Chapter 11: Architecture Evaluation Framework

Table 11.1 Distinctions between Value Assessment and Architecture Analysis

Table 11.2 Architecture Analysis Objectives and Criteria Examples

Chapter 12: Exploring the Design Space

Table 12.1 Fractional Factorial Design for Liftboat

Table 12.2 Cruise Ship Taguchi L18 Experimental Design

Table 12.3 Stakeholder Needs for an FSSF

Table 12.4 Prioritized Stakeholder Needs

Table 12.5 DOE Data Table for FSSF Surface Combatant

Table 12.6 Cost and OMOE for FSSF Surface Combatant Variants

Table 12.7 Design Variant

Chapter 13: Sustainment Related Models and Trade Studies

Table 13.1 Mission Activities and Nominal Times

Table 13.2 Summary of Failure Modes

Table 13.3 Excel Instantiation of the FMDS Analytic Availability Model

Table 13.4 Summary of Factors Affecting System Availability

Table 13.5 Availability Sensitivity/Trade Study (“No Standby” Drone Failure Mode)

Table 13.6 Effect of Doubling the Maximum Flight Time

Table 13.7 Availability Sensitivity/Trade Study (“No Standby” Control Element Failure Mode)

Table 13.8 Effect of Increasing the Reliability of the Control Element

Table 13.9 Effect of Increasing the Maximum Flying Time of the Drone, the Reliability of the Control Element, and the Probability of Having Drone Spare Parts

Table 13.10 Model Input Parameters and LCC Calculations

Table 13.11 Total O&S Life Cycle Cost Model (Reference Values)

Table 13.12 Integrated Life Cycle Cost Model for N_dpu = 6, N_cpu = 3, and T_mf = 5 h

Table 13.13 Total Life Cycle Cost (C_tlc) as a Function of the Maximum Flight Time (T_mf) and Number of Drones per Unit (N_dpu)

Chapter 14: Performing Programmatic Trade-Off Analyses

Table 14.1 Outcome Future Costs

Table 14.2 Decision Tree Probabilities

Table 14.3 Decision Tree Expected Costs

Table 14.4 Example Decision Tree Inputs

Table 14.5 Table of MI versus C (Time-Terminated Test, T_d = 4000 h and N_fo = 34)

Table 14.6 Parameters Used to Calculate C

Table 14.7 Expressions of Possible Outcomes of Tests and Associated Probabilities

Table 14.8 Matrix of Possible Outcomes of Tests and Associated Probabilities

Table 14.9 Test Design Trade Space

Table 14.10 Receiver Operating Characteristic Table (T_d = 4000 h, N_fat = 34)

Table 14.11 Receiver Operating Characteristic Table (T_d = 4000 h, N_fat1 = 34, N_fat2 = 36)

Table 14.12 Finding T_d and N_fat that Yield Desired Confidence and Power (M_d3 = 125 h)

Table 14.13 Simple Mathematical Models for the C_dd Components

Table 14.14 Simple Mathematical Models for the S_dpt Components

Table 14.15 Example Excel Implementation of the Cost Model

Table 14.16 Cost Model Results

Table 14.17 Translation for Acceptance Decision Model

Table 14.18 Snapshot of Failed Software Projects ID for Coding

Table 14.19 Failure Factor ID Coding

Table 14.20 One-Time Cost of Decommissioning the Legacy HR System and Migrating its Functionality to the New Web-Enabled Platform

Table 14.21 Cost Gains from Decommissioning the Legacy HR System and Migrating Its Functionality to Faster Web-Enabled Technology Platform

Table 14.22 The Retirement Decision Matrix When Translating the Resulting Net Present Value of a System

Table 14.23 Calculations for the NPV, ROI, and Break-Even Analysis for Decommissioning the Legacy System and Developing the New Web-Enabled System

Table 14.24 Several Relevant Factors in Determining the Decision to Retire/Decommission the ENTERPRISE

Table 14.25 Summary of Findings and Characteristics of the Eight Attributes Included in the Multiattribute Analysis. The Analysis Focused on Identifying the Difference Between the Complete and Partial Removal Alternatives Across All Eight Attributes

Trade-Off Analytics

Creating and Exploring the System Tradespace

 

Edited by

Gregory S. Parnell

 

 

 

Copyright © 2017 by John Wiley & Sons, Inc. All rights reserved

Published by John Wiley & Sons, Inc., Hoboken, New Jersey

Published simultaneously in Canada

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 750-4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permission.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

Library of Congress Cataloging-in-Publication Data:

Names: Parnell, Gregory S., editor.

Title: Trade-off analytics : creating and exploring the system tradespace / [edited by] Gregory S. Parnell.

Description: Hoboken, New Jersey : John Wiley & Sons Inc., [2017] | Includes bibliographical references and index.

Identifiers: LCCN 2016023582| ISBN 9781119237532 (cloth) | ISBN 9781119238300 (epub) | ISBN 9781119237556 (Adobe PDF)

Subjects: LCSH: Systems engineering–Decision making. | Multiple criteria decision making.

Classification: LCC TA168 .T73 2017 | DDC 620.0068/4–dc23 LC record available at https://lccn.loc.gov/2016023582

Wiley Series in Systems Engineering and Management

Andrew P. Sage, Editor

 

ANDREW P. SAGE and JAMES D. PALMER

Software Systems Engineering

 

WILLIAM B. ROUSE

Design for Success: A Human-Centered Approach to Designing Successful Products and Systems

 

LEONARD ADELMAN

Evaluating Decision Support and Expert System Technology

 

ANDREW P. SAGE

Decision Support Systems Engineering

 

YEFIM FASSER and DONALD BRETTNER

Process Improvement in the Electronics Industry, Second Edition

 

WILLIAM B. ROUSE

Strategies for Innovation

 

ANDREW P. SAGE

Systems Engineering

 

HORST TEMPELMEIER and HEINRICH KUHN

Flexible Manufacturing Systems: Decision Support for Design and Operation

 

WILLIAM B. ROUSE

Catalysts for Change: Concepts and Principles for Enabling Innovation

 

LIPING FANG, KEITH W. HIPEL, and D. MARC KILGOUR

Interactive Decision Making: The Graph Model for Conflict Resolution

 

DAVID A. SCHUM

Evidential Foundations of Probabilistic Reasoning

 

JENS RASMUSSEN, ANNELISE MARK PEJTERSEN, and LEONARD P. GOODSTEIN

Cognitive Systems Engineering

 

ANDREW P. SAGE

Systems Management for Information Technology and Software Engineering

 

ALPHONSE CHAPANIS

Human Factors in Systems Engineering

 

YACOV Y. HAIMES

Risk Modeling, Assessment, and Management, Third Edition

 

DENNIS M. BUEDE

The Engineering Design of Systems: Models and Methods, Second Edition

 

ANDREW P. SAGE and JAMES E. ARMSTRONG, Jr.

Introduction to Systems Engineering

 

WILLIAM B. ROUSE

Essential Challenges of Strategic Management

 

YEFIM FASSER and DONALD BRETTNER

Management for Quality in High-Technology Enterprises

 

THOMAS B. SHERIDAN

Humans and Automation: System Design and Research Issues

 

ALEXANDER KOSSIAKOFF and WILLIAM N. SWEET

Systems Engineering Principles and Practice

 

HAROLD R. BOOHER

Handbook of Human Systems Integration

 

JEFFREY T. POLLOCK and RALPH HODGSON

Adaptive Information: Improving Business Through Semantic Interoperability, Grid Computing, and Enterprise Integration

 

ALAN L. PORTER and SCOTT W. CUNNINGHAM

Tech Mining: Exploiting New Technologies for Competitive Advantage

 

REX BROWN

Rational Choice and Judgment: Decision Analysis for the Decider

 

WILLIAM B. ROUSE and KENNETH R. BOFF (editors)

Organizational Simulation

 

HOWARD EISNER

Managing Complex Systems: Thinking Outside the Box

 

STEVE BELL

Lean Enterprise Systems: Using IT for Continuous Improvement

 

J. JERRY KAUFMAN and ROY WOODHEAD

Stimulating Innovation in Products and Services: With Function Analysis and Mapping

 

WILLIAM B. ROUSE

Enterprise Transformation: Understanding and Enabling Fundamental Change

 

JOHN E. GIBSON, WILLIAM T. SCHERER, and WILLIAM F. GIBSON

How to Do Systems Analysis

 

WILLIAM F. CHRISTOPHER

Holistic Management: Managing What Matters for Company Success

 

WILLIAM B. ROUSE

People and Organizations: Explorations of Human-Centered Design

 

MO JAMSHIDI

System of Systems Engineering: Innovations for the Twenty-First Century

 

ANDREW P. SAGE and WILLIAM B. ROUSE

Handbook of Systems Engineering and Management, Second Edition

 

JOHN R. CLYMER

Simulation-Based Engineering of Complex Systems, Second Edition

 

KRAG BROTBY

Information Security Governance: A Practical Development and Implementation Approach

 

JULIAN TALBOT and MILES JAKEMAN

Security Risk Management Body of Knowledge

 

SCOTT JACKSON

Architecting Resilient Systems: Accident Avoidance and Survival and Recovery from Disruptions

 

JAMES A. GEORGE and JAMES A. RODGER

Smart Data: Enterprise Performance Optimization Strategy

 

YORAM KOREN

The Global Manufacturing Revolution: Product-Process-Business Integration and Reconfigurable Systems

 

AVNER ENGEL

Verification, Validation, and Testing of Engineered Systems

 

WILLIAM B. ROUSE (editor)

The Economics of Human Systems Integration: Valuation of Investments in People's Training and Education, Safety and Health, and Work Productivity

 

ALEXANDER KOSSIAKOFF, WILLIAM N. SWEET, SAM SEYMOUR, and STEVEN M. BIEMER

Systems Engineering Principles and Practice, Second Edition

 

GREGORY S. PARNELL, PATRICK J. DRISCOLL, and DALE L. HENDERSON (editors)

Decision Making in Systems Engineering and Management, Second Edition

 

ANDREW P. SAGE and WILLIAM B. ROUSE

Economic Systems Analysis and Assessment: Intensive Systems, Organizations, and Enterprises

 

BOHDAN W. OPPENHEIM

Lean for Systems Engineering with Lean Enablers for Systems Engineering

 

LEV M. KLYATIS

Accelerated Reliability and Durability Testing Technology

 

BJOERN BARTELS, ULRICH ERMEL, MICHAEL PECHT, and PETER SANDBORN

Strategies to the Prediction, Mitigation, and Management of Product Obsolescence

 

LEVENT YILMAZ and TUNCER OREN

Agent-Directed Simulation and Systems Engineering

 

ELSAYED A. ELSAYED

Reliability Engineering, Second Edition

 

BEHNAM MALAKOOTI

Operations and Production Systems with Multiple Objectives

 

MENG-LI SHIU, JUI-CHIN JIANG, and MAO-HSIUNG TU

Quality Strategy for Systems Engineering and Management

 

ANDREAS OPELT, BORIS GLOGER, WOLFGANG PFARL, and RALF MITTERMAYR

Agile Contracts: Creating and Managing Successful Projects with Scrum

 

KINJI MORI

Concept-Oriented Research and Development in Information Technology

 

KAILASH C. KAPUR and MICHAEL PECHT

Reliability Engineering

 

MICHAEL TORTORELLA

Reliability, Maintainability, and Supportability: Best Practices for Systems Engineers

 

DENNIS M. BUEDE and WILLIAM D. MILLER

The Engineering Design of Systems: Models and Methods, Third Edition

 

JOHN E. GIBSON, WILLIAM T. SCHERER, WILLIAM F. GIBSON, and MICHAEL C. SMITH

How to Do Systems Analysis: Primer and Casebook

 

GREGORY S. PARNELL, Editor

Trade-off Analytics: Creating and Exploring the System Tradespace

List of Contributors

Paul Beery

, Systems Engineering Department, Naval Postgraduate School, Monterey, CA, USA

Robert F. Bordley

, Systems Engineering and Design, University of Michigan, Ann Arbor, MI, USA; Booz Allen Hamilton, Troy, MI, USA

Matthew Cilli

, U.S. Army Armament Research Development and Engineering Center (ARDEC), Systems Analysis Division, Picatinny, NJ, USA

Simon R. Goerger

, Institute for Systems Engineering Research, Information Technology Laboratory (ITL), U.S. Army Engineer Research and Development Center (ERDC), Vicksburg, MS, USA

Gina Guillaume-Joseph

, MITRE Corporation, McLean, VA, USA

Alexander D. MacCalman

, Department of Systems Engineering, United States Military Academy, West Point, NY, USA

John E. MacCarthy

, Systems Engineering Education Program, Institute for Systems Research, University of Maryland, College Park, MD, USA

Azad M. Madni

, Department of Astronautical Engineering, Systems Architecting and Engineering and Astronautical Engineering, Viterbi School of Engineering, University of Southern California, Los Angeles, CA, USA

James N. Martin

, The Aerospace Corporation, El Segundo, CA, USA

Kirk Michealson

, Tackle Solutions, LLC, Chesapeake, VA, USA

William D. Miller

, Innovative Decisions, Inc., Vienna, VA, USA

Gregory S. Parnell

, Department of Industrial Engineering, University of Arkansas, Fayetteville, AR, USA

Edward A. Pohl

, Department of Industrial Engineering, University of Arkansas, Fayetteville, AR, USA

Donna H. Rhodes

, Sociotechnical Systems Research Center, Massachusetts Institute of Technology, Cambridge, MA, USA

C. Robert Kenley

, School of Industrial Engineering, Purdue University, West Lafayette, IN, USA

Garry Roedler

, Corporate Engineering LM Fellow and Engineering Outreach Program Manager, Lockheed Martin Corporation, King of Prussia, PA, USA

Adam M. Ross

, Systems Engineering Advancement Research Initiative (SEAri), Massachusetts Institute of Technology (MIT), Cambridge, MA, USA

Sam Savage

, School of Engineering, Stanford University, Stanford, CA, USA

Andres Vargas

, Department of Industrial Engineering, University of Arkansas, Fayetteville, AR, USA

Clifford Whitcomb

, Systems Engineering Department, Naval Postgraduate School, Monterey, CA, USA

About the Authors

Gregory S. Parnell is Director of the M.S. in Operations Management program and a Research Professor in the Department of Industrial Engineering at the University of Arkansas. He teaches systems engineering, decision analysis, operations management, and IE design courses. He coedited Decision Making in Systems Engineering and Management, Wiley Series in Systems Engineering, 2nd Ed., Wiley & Sons Inc., 2011, and cowrote the Wiley & Sons Handbook of Decision Analysis, 2013. Dr Parnell has taught at West Point, the United States Air Force Academy, Virginia Commonwealth University, and the Air Force Institute of Technology. He is a fellow of the International Council on Systems Engineering (INCOSE), the Institute for Operations Research and the Management Sciences (INFORMS), the Military Operations Research Society, the Society for Decision Professionals, and the Lean Systems Society. During his Air Force career, he served in a variety of R&D and operations research positions, including at the Pentagon, where he led two analysis divisions supporting senior Air Force leadership. He is a retired Colonel in the US Air Force. Dr Parnell received a B.S. in Aerospace Engineering from the University of Buffalo, an M.E. in Industrial & Systems Engineering from the University of Florida, an M.S. in Systems Management from the University of Southern California, and a Ph.D. in Engineering-Economic Systems from Stanford University.

Robert F. Bordley is an adjunct professor of decision analysis and systems engineering at the University of Michigan and a full-time consultant for Booz Allen Hamilton. Bob was formerly a Technical Fellow at General Motors and a Program Director at the National Science Foundation. He received his Ph.D., M.S., and MBA in Operations Research from the University of California, Berkeley, and an M.S. in Systems Science, a B.S. in Physics, and a B.A. in Public Policy from Michigan State University. He is an INCOSE-certified Expert Systems Engineering Professional (ESEP), an INFORMS-certified analytics professional (CAP), a professional statistician (PStat), and a certified Project Management Professional (PMP). Bob is a Fellow of the Institute for Operations Research and the Management Sciences, a Fellow of the American Statistical Association, and a Fellow of the Society of Decision Professionals. Bob also received the 2004 Best Decision Analysis Publication Award. At the National Science Foundation, he served as Program Director for Decision, Risk and Management Sciences. As Technical Fellow at General Motors, he received GM's Chairman Award, President's Council Award, Research Award of Excellence, GM's Engineering Award of Excellence, and UAW-GM Quality Award. Bob led the mission analysis group in Project Trilby, which helped launch GM's vehicle systems engineering effort as well as its R&D portfolio management group. Bob was also a Technical Director on GM's corporate strategy staff and served as internal consultant to GM's marketing, product planning, and quality engineering staffs. At Booz Allen Hamilton, Bob supports requirements management and concept selection for the Army.

Matthew Cilli received his Ph.D. in Systems Engineering from Stevens Institute of Technology in Hoboken, NJ, and leads an analytics group at the US Army's Armament Research Development and Engineering Center (ARDEC) in Picatinny, NJ. His research interests are focused on improving strategic decision-making through the integrated application of holistic thinking and analytics. Prior to his current position, Dr Cilli accumulated over 20 years of experience developing proposals, securing resources, and leading effective technology development programs for the US Army. Dr Cilli graduated from Villanova University, Villanova, PA, with a Bachelor of Electrical Engineering and a minor in Mathematics in May 1989. He also graduated from Polytechnic University, Brooklyn, NY, with a Master of Science in Electrical Engineering in January 1992, and from the University of Pennsylvania's Wharton Business School, Philadelphia, PA, with a Master of Technology Management in May 1998.

Simon R. Goerger is the ERDC Director of the Institute for Systems Engineering Research (ISER) at the Information Technology Laboratory (ITL) of the Engineer Research and Development Center (ERDC) in Vicksburg, MS. He has been an Operations Research Analyst with the US Army Corps of Engineers since 2012. Prior to working for the Corps of Engineers, he was a Colonel in the US Army serving as the Director of the Department of Defense Readiness Reporting System (DRRS) Implementation Office (DIO). Simultaneously, he served as Senior Defense Readiness Analyst in the Office of the Undersecretary of Defense (Personnel and Readiness). Simon served as an Assistant Professor and the Director of the Operations Research Center of Excellence in the Department of Systems Engineering at the United States Military Academy, West Point, NY, before deploying to serve as the Joint Multinational Networks Division Chief, Coalition Forces Land Component Command/US Army Central Command, Kuwait. He received his Bachelor of Science from the United States Military Academy, his Master of Science (M.S.) in National Security Strategy from the National War College, and his M.S. in Computer Science and Doctor of Philosophy in Modeling and Simulations from the Naval Postgraduate School. He is a board member for the Military Operations Research Society. His research interests include decision analysis, systems modeling, tradespace analysis, and combat modeling and simulations.

Dr Gina Guillaume-Joseph is an Information Systems Engineer at The MITRE Corporation in McLean, Virginia. In her current role, she acts as a trusted advisor to senior leadership in Federal Agencies by partnering with them to design enhancements to their work systems. Dr Guillaume-Joseph's work has led to improvements that allow the systems and processes to operate more efficiently and effectively in fulfillment of specific functions. Her various roles have included project manager, software developer, test engineer, and quality assurance engineer within the private, government consulting, nonprofit, and telecommunications arenas. Dr Guillaume-Joseph is President of the INCOSE Washington Metro Area (WMA) Chapter. Dr Guillaume-Joseph has a strong record of success based on direct personal contributions. She leads and develops teams that are adaptive, flexible, and highly responsive in the exceptionally dynamic environment of Government support. Her accomplishments and successes are based on strong program performance, leadership discipline, a commitment to developing relevant, innovative and adaptive solutions, and a vigilant focus on best value solutions for her clients. Dr Guillaume-Joseph has advanced knowledge of software development lifecycle activities, such as agile, waterfall, iterative, incremental, and associated processes including planning, requirements management, design and development, testing, and deployment. Her strong communication skills make her adept at conveying specialized technical information to nontechnical audiences. Dr Guillaume-Joseph received her B.A. in Computer Science from Boston College and M.S. in Information Technology Systems from the University of Maryland. She obtained her Ph.D. in Systems Engineering from George Washington University with a topic focused on Predicting Software Project Failure Outcomes using Predictive Analytics and Modeling.

C. Robert Kenley is an Associate Professor of Engineering Practice in Purdue's School of Industrial Engineering in West Lafayette, IN. He teaches courses in systems engineering at Purdue and has over 30 years of experience in industry, academia, and government as a practitioner, consultant, and researcher in systems engineering. He has published papers on systems requirements, technology readiness assessment and forecasting, Bayes nets, applied meteorology, the impacts of nuclear power plants on employment, agent-based simulation, and model-based systems engineering. Professor Kenley holds a Bachelor of Science in Management from Massachusetts Institute of Technology (MIT), a Master of Science in Statistics from Purdue University, and a Doctor of Philosophy in Engineering-Economic Systems from Stanford University.

Azad M. Madni is a Professor of Astronautical Engineering and the Technical Director of the multidisciplinary Systems Architecting and Engineering (SAE) Program at the University of Southern California's Viterbi School of Engineering. He is also a Professor in USC's Keck School of Medicine and Rossier School of Education. Dr Madni is the founder and Chairman of Intelligent Systems Technology, Inc., a high-tech company specializing in modeling, simulation, and gaming technologies for education and training. His research has been sponsored by several prominent government agencies including DARPA, DHS S&T, MDA, DTRA, ONR, AFOSR, AFRL, ARI, RDECOM, NIST, DOE, and NASA, as well as major aerospace companies including Boeing, Northrop Grumman, and Raytheon. He is the Co-Editor-in-Chief of Engineered Resilient Systems: Challenges and Opportunities in the 21st Century, Procedia Computer Science, 2014. His recent awards include the 2011 INCOSE Pioneer Award and the 2014 Lifetime Achievement Award from INCOSE-LA. He is a Fellow of AAAS, AIAA, IEEE, INCOSE, SDPS, and IETE. He is the Strategic Advisor of the INCOSE Systems Engineering Journal. He received his B.S., M.S., and Ph.D. degrees from the University of California, Los Angeles. He is also a graduate of the AEA/Stanford Executive Institute for senior executives.

Alexander D. MacCalman is an Army Special Forces Officer in the Operations Research System Analyst Functional Area and holds a Master's degree in Operations Research and a Ph.D. in Modeling and Simulation from the Naval Postgraduate School. He has served in various assignments within the Special Operations and Army Analytical communities. He is currently an Assistant Professor in the Department of Systems Engineering at the United States Military Academy, where he serves as the Systems Engineering Program Director. His research interests are in simulation experiments and how they can inform decision analysis and trade decisions.

John MacCarthy is currently serving as the Director of the Systems Engineering Education Program at the University of Maryland's Institute for Systems Research (College Park). Prior to taking this position, he completed a 28-year career as a systems engineer that included serving as a research staff member at the Institute for Defense Analyses, as a senior technology and policy advisor to a senior government executive, and in a variety of systems engineering leadership positions within Northrop Grumman and TRW (e.g., Senior Systems Engineer/Manager, Lead Systems Engineer, Manager of Proposal Operations, Deputy Director of the Center for Advanced Technology, and others). He has extensive experience applying the full range of systems engineering processes to diverse domains, including very large defense systems and systems of systems, a national nuclear waste disposal system, and a number of smaller state and local government systems. During his last eight years in industry, Dr MacCarthy taught a variety of graduate-level systems engineering courses as an Adjunct Professor at the University of Maryland, Baltimore County. He began his career as an Assistant Professor of Physics at Muhlenberg College. He holds a Ph.D. in Physics from the University of Notre Dame, an M.S. in Systems Engineering from George Mason University, and a B.A. in Physics from Carleton College. His professional experience and interests include systems engineering; systems analysis, modeling, and simulation; communications and sensor networks; sustainment engineering; life cycle cost analysis; the acquisition process; and science and engineering education.

James N. Martin is a Principal Engineer with The Aerospace Corporation. He teaches courses for The Aerospace Institute on architecting and systems engineering. Dr Martin is an enterprise architect and systems engineer developing solutions for information systems and space systems. He previously worked for Raytheon Systems Company as a lead systems engineer and architect on airborne and satellite communications networks. He has also worked at AT&T Bell Labs on wireless telecommunications products and underwater fiber optic transmission products. His book, Systems Engineering Guidebook, was published by CRC Press in 1996. He is an INCOSE Fellow and was leader of the Standards Technical Committee. Dr Martin is the founder and current leader of INCOSE's Systems Science Working Group. He received the INCOSE Founders Award for his long and distinguished achievements in the field. Dr Martin was a key author on the BKCASE project in the development of the SE Body of Knowledge (SEBOK); his main SEBOK contribution was the articles on Enterprise Systems Engineering. Dr Martin led the working group responsible for developing ANSI/EIA 632, a US national standard that defines the processes for engineering a system. He is the INCOSE representative to ISO for international standards on architecture, one of which deals with architecture evaluation, the topic of the chapter he wrote for this book. Dr Martin received his Ph.D. in Enterprise Architecture from George Mason University, as well as a B.S. from Texas A&M University and an M.S. from Stanford University.

Kirk Michealson is the President of Tackle Solutions, LLC, a consulting firm for operations research analysis, project management, and training. He is an Operations Research Analyst, a Fellow of the Military Operations Research Society (MORS), a Lean Six Sigma Black Belt, a retired Naval Officer, and an Adjunct Professor for the University of Arkansas' M.S. in Operations Management program, where he teaches Decision Support Tools, Analytics, and Decision Models. He holds degrees in Operations Research: a B.S. from the United States Naval Academy and an M.S. from the Naval Postgraduate School. As a MORS Fellow, he leads the Affordability Analysis Community of Practice, which is developing an affordability analysis process for government and industry, and he received the Clayton J. Thomas Award for lifetime achievement as an Operations Research Practitioner. Kirk was formerly a Technical Fellow for Operations Research Analysis at Lockheed Martin, where he designed an Operations Analysis (OA) Practitioner's Success Profile and Competency Model, which defined the skills and expertise needed to be a successful OA practitioner at Lockheed Martin, and developed the corporate-wide experimentation process as the corporation's experimentation lead. Kirk is a retired Commander in the US Navy; during his naval career, he was a surface warfare officer serving on ships and in various operations research positions supporting senior Navy and Department of Defense leadership.

William D. Miller