Liars and Outliers - Bruce Schneier - E-Book

Description

In today's hyper-connected society, understanding the mechanisms of trust is crucial. Issues of trust are critical to solving problems as diverse as corporate responsibility, global warming, and the political system. In this insightful and entertaining book, Schneier weaves together ideas from across the social and biological sciences to explain how society induces trust. He shows the unique role of trust in facilitating and stabilizing human society. He discusses why and how trust has evolved, why it works the way it does, and the ways the information society is changing everything.

Page count: 690

Publication year: 2012




Table of Contents

Cover

Advance Praise for Liars and Outliers

Title

Copyright

Credits

A Note for Readers

Chapter 1: Overview

Part I: The Science of Trust

Chapter 2: A Natural History of Security

Chapter 3: The Evolution of Cooperation

Chapter 4: A Social History of Trust

Chapter 5: Societal Dilemmas

Part II: A Model of Trust

Chapter 6: Societal Pressures

Chapter 7: Moral Pressures

Chapter 8: Reputational Pressures

Chapter 9: Institutional Pressures

Chapter 10: Security Systems

Part III: The Real World

Chapter 11: Competing Interests

Chapter 12: Organizations

Chapter 13: Corporations

Chapter 14: Institutions

Part IV: Conclusions

Chapter 15: How Societal Pressures Fail

Chapter 16: Technological Advances

Chapter 17: The Future

Acknowledgments

Notes

References

About the Author

Index

End User License Agreement

List of Illustrations

Chapter 1: Overview

Figure 1: The Terms Used in the Book, and Their Relationships

Chapter 2: A Natural History of Security

Figure 2: The Red Queen Effect in Action

Figure 3: The Red Queen Effect Feedback Loop

Chapter 3: The Evolution of Cooperation

Figure 4: Metaphorical Knobs to Control a Hawk-Dove Game

Chapter 4: A Social History of Trust

Figure 5: Dunbar Numbers

Chapter 6: Societal Pressures

Figure 6: Societal Pressure Knobs

Figure 7: The Scale of Different Societal Pressures

Figure 8: How Societal Pressures Influence the Risk Trade-Off

Chapter 10: Security Systems

Figure 9: Security's Diminishing Returns

Chapter 11: Competing Interests

Figure 10: Competing Interests in a Societal Dilemma

Figure 11: Scale of Competing Interests

Chapter 14: Institutions

Figure 12: How Societal Pressures Are Delegated

Chapter 15: How Societal Pressures Fail

Figure 13: Societal Pressure's Feedback Loops

Chapter 16: Technological Advances

Figure 14: Societal Pressure Red Queen Effect

Figure 15: The Security Gap



Advance Praise for Liars and Outliers

“A rich, insightfully fresh take on what security really means!”

—DAVID ROPEIK

author of How Risky Is It, Really?

“Schneier has accomplished a spectacular tour de force: an enthralling ride through history, economics, and psychology, searching for the meanings of trust and security. A must read.”

—ALESSANDRO ACQUISTI

Associate Professor of Information Systems and Public Policy at the Heinz College, Carnegie Mellon University

“Liars and Outliers offers a major contribution to the understandability of these issues, and has the potential to help readers cope with the ever-increasing risks to which we are being exposed. It is well written and delightful to read.”

—PETER G. NEUMANN

Principal Scientist in the SRI International Computer Science Laboratory

“Whether it’s banks versus robbers, Hollywood versus downloaders, or even the Iranian secret police against democracy activists, security is often a dynamic struggle between a majority who want to impose their will, and a minority who want to push the boundaries. Liars and Outliers will change how you think about conflict, our security, and even who we are.”

—ROSS ANDERSON

Professor of Security Engineering at Cambridge University and author of Security Engineering

“Readers of Bruce Schneier’s Liars and Outliers will better understand technology and its consequences and become more mature practitioners.”

—PABLO G. MOLINA

Professor of Technology Management, Georgetown University

“Liars and Outliers is not just a book about security—it is the book about it. Schneier shows that the power of humour can be harnessed to explore even a serious subject such as security. A great read!”

—FRANK FUREDI

Professor Emeritus, School of Social Policy, Sociology and Social Research, The University of Kent at Canterbury, and author of On Tolerance: A Defence of Moral Independence

“This fascinating book gives an insightful and convincing framework for understanding security and trust.”

—JEFF YAN

Founding Research Director, Center for Cybercrime and Computer Security, Newcastle University

“By analyzing the moving parts and interrelationships among security, trust, and society, Schneier has identified critical patterns, pressures, levers, and security holes within society. Clearly written, thoroughly interdisciplinary, and always smart, Liars and Outliers provides great insight into resolving society’s various dilemmas.”

—JERRY KANG

Professor of Law, UCLA

“By keeping the social dimension of trust and security in the center of his analysis, Schneier breaks new ground with an approach that’s both theoretically grounded and practically applicable.”

—JONATHAN ZITTRAIN

Professor of Law and Computer Science, Harvard University and author of The Future of the Internet—And How to Stop It

“Eye opening. Bruce Schneier provides a perspective you need to understand today’s world.”

—STEVEN A. LEBLANC

Director of Collections, Harvard University and author of Constant Battles: Why We Fight

“An outstanding investigation of the importance of trust in holding society together and promoting progress. Liars and Outliers provides valuable new insights into security and economics.”

—ANDREW ODLYZKO

Professor, School of Mathematics, University of Minnesota

“What Schneier has to say about trust—and betrayal—lays a groundwork for greater understanding of human institutions. This is an essential exploration as society grows in size and complexity.”

—JIM HARPER

Director of Information Policy Studies, CATO Institute and author of Identity Crisis: How Identification Is Overused and Misunderstood

“Society runs on trust. Liars and Outliers explains the trust gaps we must fill to help society run even better.”

—M. ERIC JOHNSON

Director, Glassmeyer/McNamee Center for Digital Strategies, Tuck School of Business at Dartmouth College

“An intellectually exhilarating and compulsively readable analysis of the subtle dialectic between cooperation and defection in human society. Intellectually rigorous and yet written in a lively, conversational style, Liars and Outliers will change the way you see the world.”

—DAVID LIVINGSTONE SMITH

Associate Professor of Philosophy, University of New England and author of Less Than Human: Why We Demean, Enslave, and Exterminate Others

“Schneier tackles trust head on, bringing all his intellect and a huge amount of research to bear. The best thing about this book, though, is that it’s great fun to read.”

—ANDREW MCAFEE

Principal Research Scientist, MIT Center for Digital Business and co-author of Race Against the Machine

“Bruce Schneier is our leading expert in security. But his book is about much more than reducing risk. It is a fascinating, thought-provoking treatise about humanity and society and how we interact in the game called life.”

—JEFF JARVIS

author of Public Parts: How Sharing in the Digital Age Improves the Way We Work and Live

“Both accessible and thought provoking, Liars and Outliers invites readers to move beyond fears and anxieties about security in modern life to understand the role of everyday people in creating a healthy society. This is a must-read!”

—DANAH BOYD

Research Assistant Professor in Media, Culture, and Communication, New York University

“Trust is the sine qua non of the networked age and trust is predicated on security. Bruce Schneier’s expansive and readable work is rich with insights that can help us make our shrinking world a better one.”

—DON TAPSCOTT

co-author of Macrowikinomics: Rebooting Business and the World

“An engaging and wide-ranging rumination on what makes society click. Highly recommended.”

—JOHN MUELLER

Senior Research Scientist, Mershon Center, Ohio State University and author of Overblown: How Politicians and the Terrorism Industry Inflate National Security Threats, and Why We Believe Them

Liars and Outliers

Enabling the Trust That Society Needs to Thrive

Bruce Schneier

Liars and Outliers: Enabling the Trust That Society Needs to Thrive

Published by

John Wiley & Sons, Inc.

10475 Crosspoint Boulevard

Indianapolis, IN 46256

www.wiley.com

Copyright © 2012 by Bruce Schneier

Published by John Wiley & Sons, Inc., Indianapolis, Indiana

Published simultaneously in Canada

ISBN: 978-1-118-14330-8

ISBN: 978-1-118-22556-1 (ebk)

ISBN: 978-1-118-23901-8 (ebk)

ISBN: 978-1-118-26362-4 (ebk)

No part of this publication may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning or otherwise, except as permitted under Sections 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at www.wiley.com/go/permissions.

Limit of Liability/Disclaimer of Warranty: The publisher and the author make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation warranties of fitness for a particular purpose. No warranty may be created or extended by sales or promotional materials. The advice and strategies contained herein may not be suitable for every situation. This work is sold with the understanding that the publisher is not engaged in rendering legal, accounting, or other professional services. If professional assistance is required, the services of a competent professional person should be sought. Neither the publisher nor the author shall be liable for damages arising herefrom. The fact that an organization or Web site is referred to in this work as a citation and/or a potential source of further information does not mean that the author or the publisher endorses the information the organization or website may provide or recommendations it may make. Further, readers should be aware that Internet websites listed in this work may have changed or disappeared between when this work was written and when it is read.

For general information on our other products and services please contact our Customer Care Department within the United States at (877) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.

Wiley also publishes its books in a variety of electronic formats and by print-on-demand. Not all content that is available in standard print versions of this book may appear or be packaged in all book formats. If you have purchased a version of this book that did not include media that is referenced by or accompanies a standard print version, you may request this media by visiting booksupport.wiley.com. For more information about Wiley products, visit us at www.wiley.com.

Library of Congress Control Number: 2011944879

Trademarks: Wiley and the Wiley logo are trademarks or registered trademarks of John Wiley & Sons, Inc. and/or its affiliates, in the United States and other countries, and may not be used without written permission. All other trademarks are the property of their respective owners. John Wiley & Sons, Inc. is not associated with any product or vendor mentioned in this book.

Credits

Executive Editor: Carol Long

Project Editor: Tom Dinse

Senior Production Editor: Debra Banninger

Copy Editor: Kim Cofer

Editorial Manager: Mary Beth Wakefield

Freelancer Editorial Manager: Rosemarie Graham

Marketing Manager: Ashley Zurcher

Business Manager: Amy Knies

Production Manager: Tim Tate

Vice President and Executive Group Publisher: Richard Swadley

Vice President and Executive Publisher: Neil Edde

Associate Publisher: Jim Minatel

Project Coordinator, Cover: Katie Crocker

Proofreader: Nancy Carrasco

Indexer: Johnna Dinse

Cover Designer: Ryan Sneed

Cover Concept: Luke Fretwell

A Note for Readers

This book contains both notes and references. The notes are explanatory bits that didn't make it into the main text. These are indicated by superscript numbers in both the paper and e-book formats. The references are indicated by links in the main text.

High-resolution versions of the figures can be found at www.schneier.com/lo.

Chapter 1: Overview

Just today, a stranger came to my door claiming he was here to unclog a bathroom drain. I let him into my house without verifying his identity, and not only did he repair the drain, he also took off his shoes so he wouldn't track mud on my floors. When he was done, I gave him a piece of paper that asked my bank to give him some money. He accepted it without a second glance. At no point did he attempt to take my possessions, and at no point did I attempt the same of him. In fact, neither of us worried that the other would. My wife was also home, but it never occurred to me that he was a sexual rival and I should therefore kill him.

Also today, I passed several strangers on the street without any of them attacking me. I bought food from a grocery store, not at all concerned that it might be unfit for human consumption. I locked my front door, but didn't spare a moment's worry at how easy it would be for someone to smash my window in. Even people driving cars, large murderous instruments that could crush me like a bug, didn't scare me.

Most amazingly, this worked without much overt security. I don't carry a gun for self-defense, nor do I wear body armor. I don't use a home burglar alarm. I don't test my food for poison. I don't even engage in conspicuous displays of physical prowess to intimidate other people I encounter.

It's what we call “trust.” Actually, it's what we call “civilization.”

All complex ecosystems, whether they are biological ecosystems like the human body, natural ecosystems like a rain forest, social ecosystems like an open-air market, or socio-technical ecosystems like the global financial system or the Internet, are deeply interlinked. Individual units within those ecosystems are interdependent, each doing its part and relying on the other units to do their parts as well. This is neither rare nor difficult, and complex ecosystems abound.

At the same time, all complex ecosystems contain parasites. Within every interdependent system, there are individuals who try to subvert the system to their own ends. These could be tapeworms in our digestive tracts, thieves in a bazaar, robbers disguised as plumbers, spammers on the Internet, or companies that move their profits offshore to evade taxes.

Within complex systems, there is a fundamental tension between what I'm going to call cooperating, or acting in the group interest; and what I'll call defecting, or acting against the group interest and instead in one's own self-interest. Political philosophers have recognized this antinomy since Plato. We might individually want each other's stuff, but we're collectively better off if everyone respects property rights and no one steals. We might individually want to reap the benefits of government without having to pay for them, but we're collectively better off if everyone pays taxes. Every country might want to be able to do whatever it wants, but the world is better off with international agreements, treaties, and organizations. In general, we're collectively better off if society limits individual behavior, and we'd each be better off if those limits didn't apply to us individually. That doesn't work, of course, and most of us recognize this. Most of the time, we realize that it is in our self-interest to act in the group interest. But because parasites will always exist—because some of us steal, don't pay our taxes, ignore international agreements, or ignore limits on our behavior—we also need security.
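The tension between cooperating and defecting described above is the structure of the classic Prisoner's Dilemma from game theory, which Part I of the book develops into societal dilemmas. A minimal sketch, with payoff values that are purely illustrative (not from the book):

```python
# Illustrative Prisoner's Dilemma: payoff values are hypothetical,
# chosen only to show the cooperate/defect incentive structure.
PAYOFFS = {
    # (my move, other's move) -> (my payoff, other's payoff)
    ("cooperate", "cooperate"): (3, 3),  # everyone respects property rights
    ("cooperate", "defect"):    (0, 5),  # I pay taxes; the other free-rides
    ("defect",    "cooperate"): (5, 0),  # I free-ride on the other's restraint
    ("defect",    "defect"):    (1, 1),  # no one cooperates; everyone loses
}

def best_response(other_move):
    """Whatever the other player does, defecting pays more individually."""
    return max(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, other_move)][0])

# Defection is individually rational against either move...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual cooperation (3, 3) beats mutual defection (1, 1):
# we'd each be better off if the limits didn't apply to us, but
# we're collectively worse off when everyone acts that way.
```

The numbers are arbitrary; what matters is their ordering, which makes defection dominant for each individual while leaving the group worse off.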

Society runs on trust. We all need to trust that the random people we interact with will cooperate. Not trust completely, not trust blindly, but be reasonably sure (whatever that means) that our trust is well-founded and they will be trustworthy in return (whatever that means). This is vital. If the number of parasites gets too large, if too many people steal or too many people don't pay their taxes, society no longer works. It doesn't work both because there is so much theft that people can't be secure in their property, and because even the honest become suspicious of everyone else. More importantly, it doesn't work because the social contract breaks down: society is no longer seen as providing the required benefits. Trust is largely habit, and when there's not enough trust to be had, people stop trusting each other.
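The claim that society stops working once the number of parasites gets too large can be made concrete with a toy model (my own illustrative sketch, not from the book): suppose each cooperator gains a fixed benefit from social cooperation but loses something to each defector they encounter.

```python
# Toy model of the "too many parasites" threshold. The parameter
# values (benefit, loss_per_defector) are hypothetical.
def cooperator_payoff(defector_fraction, benefit=3.0, loss_per_defector=5.0):
    """Average payoff to a cooperator as defectors grow more common."""
    return benefit - loss_per_defector * defector_fraction

# With few parasites, cooperating still pays; past a threshold it doesn't.
assert cooperator_payoff(0.1) > 0   # 3 - 0.5 = 2.5: society still works
assert cooperator_payoff(0.8) < 0   # 3 - 4.0 = -1.0: trust breaks down

# Break-even point in this toy model: defector_fraction = benefit / loss
threshold = 3.0 / 5.0
assert abs(cooperator_payoff(threshold)) < 1e-9
```

The model is deliberately crude, but it captures the text's point: defection need not be eliminated, only kept below the level at which honest participation stops paying.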

The devil is in the details. In all societies, for example, there are instances where property is legitimately taken from one person and given to another: taxes, fines, fees, confiscation of contraband, theft by a legitimate but despised ruler, etc. And a societal norm like “everyone pays his or her taxes” is distinct from any discussion about what sort of tax code is fair. But while we might disagree about the extent of the norms we subject ourselves to—that's what politics is all about—we're collectively better off if we all follow them.

Of course, it's actually more complicated than that. A person might decide to break the norms, not for selfish parasitical reasons, but because his moral compass tells him to. He might help escaped slaves flee into Canada because slavery is wrong. He might refuse to pay taxes because he disagrees with what his government is spending his money on. He might help laboratory animals escape because he believes animal testing is wrong. He might shoot a doctor who performs abortions because he believes abortion is wrong. And so on.

Sometimes we decide a norm breaker did the right thing. Sometimes we decide that he did the wrong thing. Sometimes there's consensus, and sometimes we disagree. And sometimes those who dare to defy the group norm become catalysts for social change. Norm breakers rioted against the police raids of the Stonewall Inn in New York in 1969, at the beginning of the gay rights movement. Norm breakers hid and saved the lives of Jews in World War II Europe, organized the Civil Rights bus protests in the American South, and assembled in unlawful protest at Tiananmen Square. When the group norm is later deemed immoral, history may call those who refused to follow it heroes.

In 2008, the U.S. real estate industry collapsed, almost taking the global economy with it. The causes of the disaster are complex, but it was in large part driven by financial institutions and their employees subverting financial systems to their own ends. They wrote mortgages to homeowners who couldn't afford them, and then repackaged and resold those mortgages in ways that intentionally hid the real risk. Financial analysts, who made money rating these bonds, gave them high ratings to ensure repeat rating business.

This is an example of a failure of trust: a limited number of people were able to use the global financial system for their own personal gain. That sort of thing isn't supposed to happen. But it did happen. And it will happen again if society doesn't get better at both trust and security.

Failures in trust have become global problems:

The Internet brings amazing benefits to those who have access to it, but it also brings with it new forms of fraud. Impersonation fraud—now called identity theft—is both easier and more profitable than it was pre-Internet. Spam continues to undermine the usability of e-mail. Social networking sites deliberately make it hard for people to effectively manage their own privacy. And antagonistic behavior threatens almost every Internet community.

Globalization has improved the lives of people in many countries, but with it came an increased threat of global terrorism. The terrorist attacks of 9/11 were a failure of trust, and so were the government overreactions in the decade following.

The financial network allows anyone to do business with anyone else around the world; but easily hacked financial accounts mean there is enormous profit in fraudulent transactions, and easily hacked computer databases mean there is also a global market in (terrifyingly cheap) stolen credit card numbers and personal dossiers to enable those fraudulent transactions.

Goods and services are now supplied worldwide at much lower cost, but with this change comes tainted foods, unsafe children's toys, and the outsourcing of data processing to countries with different laws.

Global production also means more production, but with it comes environmental pollution. If a company discharges lead into the atmosphere—or chlorofluorocarbons, or nitrogen oxides, or carbon dioxide—that company gets all the benefit of cheaper production costs, but the environmental cost falls on everybody else on the planet.

And it's not just global problems, of course. Narrower failures in trust are so numerous as to defy listing. Here are just a few examples:

In 2009–2010, officials of Bell, California, effectively looted the city's treasury, awarding themselves unusually high salaries, often for part-time work.

Some early online games, such as Star Wars Galaxy Quest, collapsed due to internal cheating.

The senior executives at companies such as WorldCom, Enron, and Adelphia inflated their companies' stock prices through fraudulent accounting practices, awarding themselves huge bonuses but destroying the companies in the process.

What ties all these examples together is that the interest of society was in conflict with the interests of certain individuals within society. Society had some normative behaviors, but failed to ensure that enough people cooperated and followed those behaviors. Instead, the defectors within the group became too large or too powerful or too successful, and ruined it for everyone.

This book is about trust. Specifically, it's about trust within a group. It's important that defectors not take advantage of the group, but it's also important for everyone in the group to trust that defectors won't take advantage.

“Trust” is a complex concept, and has a lot of flavors of meaning. Sociologist Piotr Sztompka wrote that “trust is a bet about the future contingent actions of others.” Political science professor Russell Hardin wrote: “Trust involves giving discretion to another to affect one's interests.” These definitions focus on trust between individuals and, by extension, their trustworthiness.1

When we trust people, we can either trust their intentions or their actions. The first is more intimate. When we say we trust a friend, that trust isn't tied to any particular thing he's doing. It's a general reliance that, whatever the situation, he'll do the right thing: that he's trustworthy. We trust the friend's intentions, and know that his actions will be informed by those intentions.2

The second is less intimate, what sociologist Susan Shapiro calls impersonal trust. When we don't know someone, we don't know enough about her, or her underlying motivations, to trust her based on character alone. But we can trust her future actions.3 We can trust that she won't run red lights, or steal from us, or cheat on tests. We don't know if she has a secret desire to run red lights or take our money, and we really don't care if she does. Rather, we know that she is likely to follow most social norms of acceptable behavior because the consequences of breaking these norms are high. You can think of this kind of trust—that people will behave in a trustworthy manner even if they are not inherently trustworthy—more as confidence, and the corresponding trustworthiness as compliance.4

In another sense, we're reducing trust to consistency or predictability. Of course, someone who is consistent isn't necessarily trustworthy. If someone is a habitual thief, I don't trust him. But I do believe (and, in another sense of the word, trust) that he will try to steal from me. I'm less interested in that aspect of trust, and more in the positive aspects. In The Naked Corporation, business strategist Don Tapscott described trust, at least in business, as the expectation that the other party will be honest, considerate, accountable, and transparent. When two people are consistent in this way, we call them cooperative.

In today’s complex society, we often trust systems more than people. It’s not so much that I trusted the plumber at my door as that I trusted the systems that produced him and protect me. I trusted the recommendation from my insurance company, the legal system that would protect me if he did rob my house, whatever educational system produces and whatever insurance system bonds skilled plumbers, and—most of all—the general societal systems that inform how we all treat each other in society. Similarly, I trusted the banking system, the corporate system, the system of police, the system of traffic laws, and the system of social norms that govern most behaviors.5

This book is about trust more in terms of groups than individuals. I'm not really concerned about how specific people come to trust other specific people. I don't care if my plumber trusts me enough to take my check, or if I trust that driver over there enough to cross the street at the stop sign. I'm concerned with the general level of impersonal trust in society. Francis Fukuyama's definition nicely captures the term as I want to use it: “Trust is the expectation that arises within a community of regular, honest, and cooperative behavior, based on commonly shared norms, on the part of other members of that community.”

Sociologist Barbara Misztal identified three critical functions performed by trust: 1) it makes social life more predictable, 2) it creates a sense of community, and 3) it makes it easier for people to work together. In some ways, trust in society works like oxygen in the atmosphere. The more customers trust merchants, the easier commerce is. The more drivers trust other drivers, the smoother traffic flows. Trust gives people the confidence to deal with strangers: because they know that the strangers are likely to behave honestly, cooperatively, fairly, and sometimes even altruistically. The more trust is in the air, the healthier society is and the more it can thrive. Conversely, the less trust is in the air, the sicker society is and the more it has to contract. And if the amount of trust gets too low, society withers and dies. A recent example of a systemic breakdown in trust occurred in the Soviet Union under Stalin.

I'm necessarily simplifying here. Trust is relative, fluid, and multidimensional. I trust Alice to return a $10 loan but not a $10,000 loan, Bob to return a $10,000 loan but not to babysit an infant, Carol to babysit but not with my house key, Dave with my house key but not my intimate secrets, and Ellen with my intimate secrets but not to return a $10 loan. I trust Frank if a friend vouches for him, a taxi driver as long as he's displaying his license, and Gail as long as she hasn't been drinking. I don't trust anyone at all with my computer password. I trust my brakes to stop the car, ATM machines to dispense money from my account, and Angie's List to recommend a qualified plumber—even though I have no idea who designed, built, or maintained those systems. Or even who Angie is. In the language of this book, we all need to trust each other to follow the behavioral norms of our group.

Many other books talk about the value of trust to society. This book explains how society establishes and maintains that trust.6 Specifically, it explains how society enforces, evokes, elicits, compels, encourages—I'll use the term induces—trustworthiness, or at least compliance, through systems of what I call societal pressures, similar to sociology's social controls: coercive mechanisms that induce people to cooperate, act in the group interest, and follow group norms. Like physical pressures, they don't work in all cases on all people. But again, whether the pressures work against a particular person is less important than whether they keep the scope of defection to a manageable level across society as a whole.

A manageable level, but not too low a level. Compliance isn't always good, and defection isn't always bad. Sometimes the group norm doesn't deserve to be followed, and certain kinds of progress and innovation require violating trust. In a police state, everybody is compliant but no one trusts anybody. A too-compliant society is a stagnant society, and defection contains the seeds of social change.

This book is also about security. Security is a type of a societal pressure in that it induces cooperation, but it's different from the others. It is the only pressure that can act as a physical constraint on behavior regardless of how trustworthy people are. And it is the only pressure that individuals can implement by themselves. In many ways, it obviates the need for intimate trust. In another way, it is how we ultimately induce compliance and, by extension, trust.

It is essential that we learn to think smartly about trust. Philosopher Sissela Bok wrote: “Whatever matters to human beings, trust is the atmosphere in which it thrives.” People, communities, corporations, markets, politics: everything. If we can figure out the optimal societal pressures to induce cooperation, we can reduce murder, terrorism, bank fraud, industrial pollution, and all the rest.

If we get pressures wrong, the murder rate skyrockets, terrorists run amok, employees routinely embezzle from their employers, and corporations lie and cheat at every turn. In extreme cases, an untrusting society breaks down. If we get them wrong in the other direction, no one speaks out about institutional injustice, no one deviates from established corporate procedure, and no one popularizes new inventions that disrupt the status quo—an oppressed society stagnates. The very fact that the most extreme failures rarely happen in the modern industrial world is proof that we've largely gotten societal pressures right. The failures that we've had show we have a lot further to go.

Also, as we'll see, evolution has left us with intuitions about trust better suited to life as a savannah-dwelling primate than as a modern human in a global high-tech society. That flawed intuition is vulnerable to exploitation by companies, con men, politicians, and crooks. The only defense is a rational understanding of what trust in society is, how it works, and why it succeeds or fails.

This book is divided into four parts. In Part I, I'll explore the background sciences of the book. Several fields of research—some closely related—will help us understand these topics: experimental psychology, evolutionary psychology, sociology, economics, behavioral economics, evolutionary biology, neuroscience, game theory, systems dynamics, anthropology, archaeology, history, political science, law, philosophy, theology, cognitive science, and computer security.

All these fields have something to teach us about trust and security.7 There's a lot here, and delving into any of these areas of research could easily fill several books. This book attempts to gather and synthesize decades, and sometimes centuries, of thinking, research, and experimentation from a broad swath of academic disciplines. It will, by necessity, be largely a cursory overview; often, the hardest part was figuring out what not to include. My goal is to show where the broad arcs of research are pointing, rather than explain the details—though they're fascinating—of any individual piece of research.8

In the last chapter of Part I, I will introduce societal dilemmas. I'll explain a thought experiment called the Prisoner's Dilemma, and its generalization to societal dilemmas. Societal dilemmas describe the situations that require intra-group trust, and that therefore use societal pressures to ensure cooperation: they're the central paradigm of my model. Societal dilemmas illustrate how society keeps defectors from taking advantage, taking over, and completely ruining society for everyone. They illustrate how society ensures that its members forsake their own interests when those interests run counter to society's interest. Societal dilemmas have many names in the literature: collective action problem, Tragedy of the Commons, free-rider problem, arms race. We'll use them all.
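The logic of the Prisoner's Dilemma can be made concrete in a few lines of code. This is purely an illustrative sketch with conventional textbook payoff numbers (my assumption, not figures from the book): no matter what the other player does, each player is individually better off defecting, yet both end up worse off when both defect.

```python
# A minimal Prisoner's Dilemma payoff table. Entries map
# (my_choice, other_choice) to (my_payoff, other_payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation: both do well
    ("cooperate", "defect"):    (0, 5),  # the sucker's payoff vs. temptation
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # mutual defection: both do poorly
}

def best_response(other_choice):
    """Return the choice that maximizes my payoff, given the other's choice."""
    return max(["cooperate", "defect"],
               key=lambda mine: PAYOFFS[(mine, other_choice)][0])

# Whatever the other player does, defecting pays more individually...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"

# ...yet mutual cooperation beats mutual defection for everyone.
assert PAYOFFS[("cooperate", "cooperate")][0] > PAYOFFS[("defect", "defect")][0]
```

The tension the assertions capture—individual incentive pointing one way, group interest the other—is exactly the gap societal pressures exist to close.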

Part II fully develops my model. Trust is essential for society to function, and societal pressures are how we achieve it. There are four basic categories of societal pressure that can induce cooperation in societal dilemmas:

Moral pressure. A lot of societal pressure comes from inside our own heads. Most of us don't steal, and it's not because there are armed guards and alarms protecting piles of stuff. We don't steal because we believe it's wrong, or we'll feel guilty if we do, or we want to follow the rules.

Reputational pressure. A wholly different, and much stronger, type of pressure comes from how others respond to our actions. Reputational pressure can be very powerful; both individuals and organizations feel a lot of pressure to follow the group norms because they don't want a bad reputation.

Institutional pressure. Institutions have rules and laws. These are norms that are codified, and whose enactment and enforcement is generally delegated. Institutional pressure induces people to behave according to the group norm by imposing sanctions on those who don't, and occasionally by rewarding those who do.

Security systems. Security systems are another form of societal pressure. This category includes any security mechanism designed to induce cooperation, prevent defection, induce trust, and compel compliance. It includes things that work to prevent defectors, like door locks and tall fences; things that interdict defectors, like alarm systems and guards; things that only work after the fact, like forensic and audit systems; and mitigation systems that help the victim recover faster and care less that the defection occurred.

Part III applies the model to the more complex dilemmas that arise in the real world. First I'll look at the full complexity of competing interests. It's not just group interest versus self-interest; people have a variety of competing interests. Also, while it's easy to look at societal dilemmas as isolated decisions, it's common for people to have conflicts of interest: multiple group interests and multiple societal dilemmas are generally operating at any one time. And the effectiveness of societal pressures often depends on why someone is considering defecting.

Then, I'll look at groups as actors in societal dilemmas: organizations in general, corporations, and then institutions. Groups have different competing interests, and societal pressures work differently when applied to them. This is an important complication, especially in the modern world of complex corporations and government agencies. Institutions are also different. In today's world, it's rare that we implement societal pressures directly. More often, we delegate someone to do it for us. For example, we delegate our elected officials to pass laws, and they delegate some government agency to implement those laws.

In Part IV, I'll talk about the different ways societal pressures fail. I'll look at how changes in technology affect societal pressures, particularly security. Then I'll look at the particular characteristics of today's society—the Information Society—and explain why that changes societal pressures. I'll sketch what the future of societal pressures is likely to be, and close with the social consequences of too much societal pressure.

This book represents my attempt to develop a full-fledged theory of coercion and how it enables compliance and trust within groups. My goal is to suggest some new questions and provide a new framework for analysis. I offer new perspectives, and a broader spectrum of what's possible. Perspectives frame thinking, and sometimes asking new questions is the catalyst to greater understanding. It's my hope that this book can give people an illuminating new framework with which to help understand how the world works.

Before we start, I need to define my terms. We talk about trust and security all the time, and the words we use tend to be overloaded with meaning. We're going to have to be more precise…and temporarily suspend our emotional responses to what otherwise might seem like loaded, value-laden, even disparaging, words.

The word society, as used in this book, isn't limited to traditional societies, but is any group of people with a loose common interest. It applies to societies of circumstance, like a neighborhood, a country, everyone on a particular bus, or an ethnicity or social class. It applies to societies of choice, like a group of friends, any membership organization, or a professional society. It applies to societies that are some of each: a religion, a criminal gang, or all employees of a corporation. It applies to societies of all sizes, from a family to the entire planet. All of humanity is a society, and everyone is a member of multiple societies. Some are based on birth, and some are freely chosen. Some we can join, and to some we must be invited. Some may be good, some may be bad—terrorist organizations, criminal gangs, a political party you don't agree with—and most are somewhere in between. For our purposes, a society is just a group of interacting actors organized around a common attribute.

I said actors, not people. Most societies are made up of people, but sometimes they're made up of groups of people. All the countries on the planet form a society. All corporations in a particular industry form a society. We're going to be talking about both societies of individuals and societies of groups.

Societies have a collection of group interests. These are the goals, or directions, of the society. They're decided by the society in some way: perhaps formally—either democratically or autocratically—perhaps informally by the group. International trade can be in the group interest. So can sharing food, obeying traffic laws, and keeping slaves (assuming those slaves are not considered to be part of the group). Corporations, families, communities, and terrorist groups all have their own group interests. Each of these group interests corresponds to one or more group norms—what each member of that society is supposed to do. For example, it is in the group interest that everyone respect everyone else's property rights. Therefore, the group norm is not to steal (at least, not from other members of the group9).

Every person in a society potentially has one or more competing interests that conflict with the group interest, and competing norms that conflict with the group norm. Someone in that we-don't-steal society might really want to steal. He might be starving, and need to steal food to survive. He just might want other people's stuff. These are examples of self-interest. He might have some competing relational interest. He might be a member of a criminal gang, and need to steal to prove his loyalty to the group; here, the competing interest might be the group interest of another group. Or he might want to steal for some higher moral reason: a competing moral interest—the Robin Hood archetype, for example.

A societal dilemma is the choice every actor has to make between group interest and his or her competing interests. It's the choice we make when we decide whether or not to follow the group norm. Those who follow it cooperate; those who do not, defect. Those are both loaded terms, but I mean them to refer only to the action as a result of the dilemma.

Defectors—the liars and outliers of the book's title—are the people within a group who don't go along with the norms of that group. The term isn't defined according to any absolute morals, but instead in opposition to whatever the group interest and the group norm is. Defectors steal in a society that has declared that stealing is wrong, but they also help slaves escape in a society where tolerating slavery is the norm. Defectors change as society changes; defection is in the eye of the beholder. Or, more specifically, it is in the eyes of everyone else. Someone who was a defector under the former East German government was no longer in that group after the fall of the Berlin Wall. But those who followed the societal norms of East Germany, like the Stasi, were—all of a sudden—viewed as defectors within the new united Germany.

Figure 1: The Terms Used in the Book, and Their Relationships

Criminals are defectors, obviously, but that answer is too facile. Everyone defects at least some of the time. It's both dynamic and situational. People can cooperate about some things and defect about others. People can cooperate with one group they're in and defect from another. People can cooperate today and defect tomorrow, or cooperate when they're thinking clearly and defect when they're reacting in a panic. People can cooperate when their needs are cared for, and defect when they're starving.

When four black North Carolina college students staged a sit-in at a whites-only lunch counter inside a Woolworth's five-and-dime store in Greensboro, in 1960, they were criminals. So are women who drive cars in Saudi Arabia. Or homosexuals in Iran. Or the 2011 protesters in Egypt, who sought to end their country's political regime. Conversely, child brides in Pakistan are not criminalized and neither are their parents, even though in some cases they marry off five-year-old girls. The Nicaraguan rebels who fought the Sandinistas were criminals, terrorists, insurgents, or freedom fighters, depending on which side you supported and how you viewed the conflict. Pot smokers and dealers in the U.S. are officially criminals, but in the Netherlands those offenses are ignored by the police. Those who share copyrighted movies and music are breaking the law, even if they have moral justifications for their actions.

Defecting doesn't necessarily mean breaking government-imposed laws. An orthodox Jew who eats a ham and cheese sandwich is violating the rules of his religion. A Mafioso who snitches on his colleagues is violating omertà, the code of silence. A relief worker who indulges in a long, hot shower after a tiring journey, and thereby depletes an entire village's hot water supply, unwittingly puts his own self-interest ahead of the interest of the people he intends to help.

What we're concerned with is the overall scope of defection. I mean this term to be general, comprising the number of defectors, the rate of their defection, the frequency of their defection, and the intensity (the amount of damage) of their defection. Just as we're interested in the general level of trust within the group, we're interested in the general scope of defection within the group.

Societal pressures are how society ensures that people follow the group norms, as opposed to some competing norms. The term is meant to encompass everything society does to protect itself: both from fellow members of society and from non-members who live within and amongst it. More generally, it's how society enforces intra-group trust.

The terms attacker and defender are pretty obvious. The predator is the attacker, the prey is the defender. It's all intertwined, and sometimes these terms can get a bit muddy. Watch a martial arts match, and you'll see each person defending against his opponent's attacks while at the same time hoping his own attacks get around his opponent's defenses. In war, both sides attack and defend at the tactical level, even though one side might be attacking and the other defending at the political level. These terms are value-neutral. Attackers can be criminals trying to break into a home, superheroes raiding a criminal's lair, or cancer cells metastasizing their way through a hapless human host. Defenders can be a family protecting its home from invasion, the criminal mastermind protecting his lair from the superheroes, or a posse of leukocytes engulfing opportunistic pathogens they encounter.

These definitions are important to remember as you read this book. It's easy for us to bring our own emotional baggage into discussions about security, but most of the time we're just trying to understand the underlying mechanisms at play, and those mechanisms are the same, regardless of the underlying moral context.

Sometimes we need the dispassionate lens of history to judge famous defectors like Oliver North, Oskar Schindler, and Vladimir Lenin.

Part IThe Science of Trust

Chapter 2A Natural History of Security

Our exploration of trust is going to start and end with security, because security is what you need when you don't have any trust and—as we'll see—security is ultimately how we induce trust in society. It's what brings risk down to tolerable levels, allowing trust to fill in the remaining gaps.

You can learn a lot about security from watching the natural world.

Lions seeking to protect their turf will raise their voices in a “territorial chorus,” their cooperation reducing the risk of encroachment by other predators on the local food supply.

When hornworms start eating a particular species of sagebrush, the plant responds by emitting a molecule that warns any wild tobacco plants growing nearby that hornworms are around. In response, the tobacco plants deploy chemical defenses that repel the hornworms, to the benefit of both plants.

Some types of plasmids secrete a toxin that kills the bacteria that carry them. Luckily for the bacteria, the plasmids also emit an antidote; and as long as a plasmid secretes both, the host bacterium survives. But if the plasmid dies, the antidote decays faster than the toxin, and the bacterium dies. This acts as an insurance policy for the plasmids, ensuring that bacteria don't evolve ways to kill them.

In the beginning of life on this planet, some 3.8 billion years ago, an organism's only job was to reproduce. That meant growing, and growing required energy. Heat and light were the obvious sources—photosynthesis appeared 3 billion years ago; chemosynthesis is at least half a billion years older than that—but consuming the other living things floating around in the primordial ocean worked just as well. So life discovered predation.

We don't know what that first animal predator was, but it was likely a simple marine organism somewhere between 500 million and 550 million years ago. Initially, the only defense a species had against being eaten was to have so many individuals floating around the primordial seas that enough individuals were left to reproduce, so that the constant attrition didn't matter. But then life realized it might be able to avoid being eaten. So it evolved defenses. And predators evolved better ways to catch and eat.

Thus security was born, the planet's fourth oldest activity after eating, eliminating, and reproducing.

Okay, that's a pretty gross simplification, and it would get me booted out of any evolutionary biology class. When talking about evolution and natural selection, it's easy to say that organisms make explicit decisions about their genetic future. They don't. There's nothing purposeful or teleological about the evolutionary process, and I shouldn't anthropomorphize it. Species don't realize anything. They don't discover anything, either. They don't decide to evolve, or try genetic options. It's tempting to talk about evolution as if there's some outside intelligence directing it. We say “prehistoric lungfish first learned how to breathe air,” or “monarch butterflies learned to store plant toxins in their bodies to make themselves taste bad to predators,” but it doesn't work that way. Random mutation provides the material upon which natural selection acts. It is through this process that individuals of a species change subtly from their parents, effectively “trying out” new features. Those innovations that turn out to be beneficial—air breathing—give the individuals a competitive advantage and might potentially propagate through the species (there's still a lot of randomness in this process). Those that turn out to be detrimental—the overwhelming majority of them—kill or otherwise disadvantage the individual and die out.

By “beneficial,” I mean something very specific: increasing an organism's ability to survive long enough to successfully pass its genes on to future generations. Or, to use Richard Dawkins's perspective from The Selfish Gene, genes that helped their host individuals—or other individuals with that gene—successfully reproduce tended to persist in higher numbers in populations.

If we were designing a life form, as we might do in a computer game, we would try to figure out what sort of security it needed and give it abilities accordingly. Real-world species don't have that luxury. Instead, they try new attributes randomly. So instead of an external designer optimizing a species' abilities based on its needs, evolution randomly walks through the solution space and stops at the first solution that works—even if just barely. Then it climbs upwards in the fitness landscape until it reaches a local optimum. You get a lot of weird security that way.

You get teeth, claws, group dispersing behavior, feigning injury and playing dead, hunting in packs, defending in groups (flocking and schooling and living in herds), setting sentinels, digging burrows, flying, mimicry by both predators and prey, alarm calls, shells, intelligence, noxious odors, tool using (both offensive and defensive),1 planning (again, both offensive and defensive), and a whole lot more.2 And this is just in largish animals; we haven't even listed the security solutions insects have come up with. Or plants. Or microbes.

It has been convincingly argued that one of the reasons sexual reproduction evolved about 1.2 billion years ago was to defend against biological parasites. The argument is subtle. Basically, parasites reproduce so quickly that they overwhelm any individual host defense. The value of DNA recombination, which is what you get in sexual reproduction, is that it continuously rearranges a species' defenses so parasites can't get the upper hand. For this reason, a member of a species that reproduces sexually is much more likely to survive than a member of a species that clones itself asexually—even though the cloning species will pass twice as many of its genes to its offspring as a sexually reproducing species would.

Life evolved two other methods of defending itself against parasites. One is to grow and divide quickly, something that both bacteria and just-fertilized mammalian embryos do. The other is to have an immune system. Evolutionarily, this is a relatively new development; it first appeared in jawed fish about 300 million years ago.3

A surprising number of evolutionary adaptations are related to security. Take vision, for example. Most animals are more adept at spotting movement than picking out details of stationary objects; it's called the orienting response.4 That's because things that move may be predators that attack, or prey that needs to be attacked. The human visual system is particularly good at spotting animals.5 The human ability, unique on the planet, to throw things long distances is another security adaptation. Related is the size-weight misperception: easier-to-throw rocks are perceived to be lighter than they actually are. It's related to our ability to choose good projectiles. Similar stories could be told about many human attributes.6

The predator/prey relationship isn't the only pressure that drives evolution. As soon as there was competition for resources, organisms had to develop security to defend their own resources and attack the resources of others. Whether it's plants competing with each other for access to the sun, predators fighting over hunting territory, or animals competing for potential mates, organisms had to develop security against others of the same species. And again, evolution resulted in all sorts of weird security. And it works amazingly well.

Security on Earth went on more or less like this for 500 million years. It's a continual arms race. A rabbit that can run away at 30 miles per hour—in short bursts, of course—is at an evolutionary advantage when the weasels and stoats can only run 28 mph, but at an evolutionary disadvantage once predators can run 32 mph.

Figure 2: The Red Queen Effect in Action

It's different when the evolutionary pressure comes from nature rather than from other species. A polar bear has thick fur because the Arctic is cold. But the fur is only thick to a point, because the Arctic doesn't get colder in response to the polar bear's adaptations. That same polar bear, though, has fur that appears white so as to better sneak up on seals. A better-camouflaged polar bear means that only warier seals survive and reproduce, which means the polar bears need to be even better at camouflage to eat, which means the seals need to be warier still, and on and on, up to some physical upper limit on camouflage and wariness.

This only-relative evolutionary arms race is known as the Red Queen Effect, after Lewis Carroll's race in Through the Looking-Glass: “It takes all the running you can do, to keep in the same place.” Predators develop all sorts of new tricks to catch prey, and prey develop all sorts of new tricks to evade predators. The prey get more poisonous, so their predators get more poison-resistant, so the prey get even more poisonous. A species has to continuously improve just to survive, and any species that can't keep up—or bumps up against physiological or environmental constraints—becomes extinct.
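The Red Queen dynamic can be illustrated with a toy simulation (entirely my own construction, not a model from the book): predator and prey speeds both ratchet upward over the generations, yet neither side's relative advantage ever grows, because selection always favors whichever side is currently behind.

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

# Starting speeds echo the rabbit-vs-stoat example: prey 30 mph, predator 28 mph.
prey_speed, predator_speed = 30.0, 28.0
gaps = []

for generation in range(1000):
    # Selection nudges whichever side is currently behind in the race.
    if predator_speed < prey_speed:
        predator_speed += random.uniform(0, 0.1)
    else:
        prey_speed += random.uniform(0, 0.1)
    gaps.append(prey_speed - predator_speed)

# Absolute speeds climb and climb...
assert prey_speed > 45 and predator_speed > 45
# ...but the prey's edge never exceeds its starting 2 mph gap:
# all the running you can do, just to keep in the same place.
assert max(gaps) <= 2.0
```

The model is crude on purpose: real selection acts on populations, not single numbers. But the flat relative gap amid ever-rising absolute investment is the essence of the Red Queen Effect.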

Figure 3: The Red Queen Effect Feedback Loop

Along with becoming faster, more poisonous, and bitier, some organisms became smarter. At first, a little smarts went a long way. Intelligence allows individuals to adapt their behaviors, moment by moment, to suit their environment and circumstances. It allows them to remember the past and learn from experience. It lets them be individually adaptive. No one can pin down exact dates, but vertebrates first appeared about 525 million years ago—and continued to improve on various branches of the tree of life: mammals (215 million years ago), birds (75 million years ago), primates (60 million years ago), the genus Homo (2.5 million years ago), and then humans (somewhere between 200,000 and 450,000 years ago, depending on whose evidence you believe). When it comes to security, as with so many things, humans changed everything.

Let's pause for a second. This isn't a book about animal intelligence, and I don't want to start an argument about which animals can be considered intelligent, or what about human intelligence is unique, or even how to define the word “intelligence.” It's definitely a fascinating subject, and we can learn a lot about our own intelligence by studying the intelligence of other animals. Even my neat intelligence progression from the previous paragraph might be wrong: flatworms can be trained, and some cephalopods are surprisingly smart. But those topics aren't really central to this book, so I'm going to elide them. For my purposes, it's enough to say that there is a uniquely human intelligence.7

And humans take their intelligence seriously. The brain only represents 3% of total body mass, but uses 20% of the body's total blood supply and 25% of its oxygen. And—unlike other primates, even—we'll supply our brains with blood and oxygen at the expense of other body parts.

One of the things intelligence makes possible is cultural evolution. Instead of needing to wait for genetic changes, humans are able to improve their survivability through the direct transmission of skills and ideas. These memes can be taught from generation to generation, with the more survivable ideas propagating and the bad ones dying out. Humans are not the only species that teaches its young, but humans have taken this to a new level.8 This caused a flowering of security ideas: deception and concealment; weapons, armor, and shields; coordinated attack and defense tactics; locks and their continuous improvement over the centuries; gunpowder, explosives, guns, cruise missiles, and everything else that goes “bang” or “boom”; paid security guards and soldiers and policemen; professional criminals; forensic databases of fingerprints, tire tracks, shoe prints, and DNA samples; and so on.

It's not just intelligence that makes humans different. One of the things that's unique about humans is the extent of our socialization. Yes, there are other social species: other primates, most mammals and some birds.9 But humans have taken sociality to a completely different level. And with that socialization came all sorts of new security considerations: concern for an ever-widening group of individuals, concern about potential deception and the need to detect it, concern about one's own and others' reputations, concern about rival groups of attackers and the corresponding need to develop groups of defenders, recognition of the need to take preemptive security measures against potential attacks, and after-the-fact responses to already-occurred attacks for the purpose of deterring others in the future.10

Some scientists believe that this increased socialization actually spurred the development of human intelligence.11 Machiavellian Intelligence Theory—you might also see this called the Social Brain Hypothesis—holds that we evolved intelligence primarily in order to detect deception by other humans. Although the “Machiavellian” term came later, the idea first came from psychologist Nicholas Humphrey. Humphrey observed that wild gorillas led a pretty simple