The Truth About Lies - Aja Raden - E-Book


Aja Raden


Description

Fibbing, prevaricating, stretching the truth, white lies, lies of omission, lies of commission. Lying is so pervasive that we have countless words for it. But have you ever considered why you believed a lie you were told - or why we lie at all? In this witty, whirlwind tour through the annals of deceit, bestselling author Aja Raden combines psychology, popular science and history to explore everything you've ever wanted to know about manipulation and lying, showing how it evolved and why even the birds and the bees do it. From 'big lies' like the Scottish gent who invented a South American country to pyramid schemes like Bernie Madoff's, this is an eye-opening primer that decodes how we behave and function, and reveals how lying shapes our experience of the world around us.




ABOUT THE AUTHOR

Aja Raden is the New York Times bestselling author of Stoned: Jewelry, Obsession, and How Desire Shapes the World. Raden is an experienced jeweler, trained scientist, and well-read historian, and her expertise sits at the intersection of academic history, industry experience, and scientific perspective. She lives in Santa Fe, New Mexico.

 

 

First published in the United States in 2021 by St. Martin’s Press,

An imprint of St. Martin’s Publishing Group

Published in paperback and trade paperback in Great Britain in 2021 by Atlantic Books, an imprint of Atlantic Books Ltd.

Copyright © Aja Raden, 2021

The moral right of Aja Raden to be identified as the author of this work has been asserted by her in accordance with the Copyright, Designs and Patents Act of 1988.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior permission of both the copyright owner and the above publisher of this book.

10 9 8 7 6 5 4 3 2 1

A CIP catalogue record for this book is available from the British Library.

Paperback ISBN: 978-1-83895-192-4

Trade paperback ISBN: 978-1-83895-193-1

E-book ISBN: 978-1-83895-194-8

Printed in Great Britain

Atlantic Books

An imprint of Atlantic Books Ltd

Ormond House

26–27 Boswell Street

London

WC1N 3JZ

www.atlantic-books.co.uk

 

This book is dedicated to everyone who’s ever lied to me.

If nothing else, you made me smarter.

 

INTRODUCTION: THE CURRENCY OF LIVING

PART I: LIES WE TELL EACH OTHER

Perception, Persuasion, and the Evolution of Deceit

1. THE OLDEST TRICK IN THE BOOK: Credulity, Duplicity, and How to Tell a Really Big Lie

2. KEEP YOUR EYE ON THE BALL: Shell Games, Card Games, and Mind Games

3. DON’T BUY IT: Goldbricking and the Often-Misleading Nature of Facts

PART II: LIES WE TELL OURSELVES

Faith, Fraud, and the Funny Thing About Belief

4. HOLY SHIT: Charlatans and Other Authority Figures

5. BITTER PILL: Snake Oil, Salesmen, and Subjective Reality

6. IT’S LOVELY AT THE TOP: Pyramid Schemes and Why You’re Probably Part of One

PART III: LIES WE ALL AGREE TO BELIEVE

Consensus, Control, and the Illusion of Truth

7. FAKE NEWS: Hoaxes, Hysteria, and the Madness of Crowds

8. HOW TO MAKE A BUCK: True Facts About Fake Things

9. WAIT FOR IT . . . At Long Last: The Long Con

AFTERWORD: LIES ABOUT THE TRUTH

ACKNOWLEDGMENTS

NOTES

BIBLIOGRAPHY

INDEX

 

Everything has to be taken on trust; truth is only that which is taken to be true. It’s the currency of living. There may be nothing behind it, but it doesn’t make any difference so long as it is honoured.

—TOM STOPPARD

INTRODUCTION

The Currency of Living

You shall know the truth, and the truth shall make you mad.

—ALDOUS HUXLEY

 

 

 

 

Why do you believe what you believe?

You’ve been lied to. Probably a lot. Maybe you knew, maybe you didn’t. Maybe you found out later. The thing is, when we realize we’ve been deceived, we’re always stunned. We can’t believe we were taken in: What was I thinking? How could I have believed that?

We always wonder why we believed the lie. But have you ever wondered why you believe the truth? People tell you the truth all the time, and you believe them; and if, at some later point, you’re confronted with evidence that the story you believed was indeed true, you never wonder why you believed it in the first place.

But maybe you should have.

Facts are just that which continue to exist, whether or not you believe them. But there’s nothing special about a fact. A fact doesn’t sound different from a falsehood. The truth isn’t written in italics. So why do we believe we can tell the difference?

The Truth About Lies is a book about famous swindles that endeavors to give a telescopic vision of society through the phenomena and mechanics of belief: why we lie, why we believe, and how, if at all, the acts differ.

In just the same way that there are only a handful of actual original stories in the collective human consciousness upon which all other stories are only variations, so too are there only so many unique, primal lies. From those few original lies, all the others are derived, endlessly iterated, and polished for new audiences. As the American economist John Kenneth Galbraith wrote in his book The Age of Uncertainty, “The man who is admired for the ingenuity of his larceny is almost always rediscovering some earlier form of fraud.” Ultimately, as original as the lie may seem in the moment, there are only so many ways to deceive. The Truth About Lies looks at nine basic cons from several angles, among those: the swindlers who worked them, the lies they told, and the people who were taken in.

Each chapter tells the outrageous story of a classic con and illustrates the mechanism by which it works, using both contemporary and historical examples. From the story of a fake Martian invasion that started a very real riot, twice, to the modern madness of Twitter; from a Wild West diamond scam so vast it made fools (and in some cases criminals) of the well-heeled investors of 1872 (including Charles Tiffany) to the tale of that same bait-and-switch scam dressed up in a new investment opportunity called mortgage-backed securities, which nearly toppled the world banking system in 2008.

This book examines the Pyramid Schemes you’ve heard of, the ones you haven’t, and the ones we’ve all bought into without even realizing.

More important, each chapter examines mechanisms of belief and the persistent—and maybe fundamental—role that too-good-to-be-true and faith-based deals have played in human history. Is the twisted tale of selling Snake Oil, which started the craze for so-called patent medicines and led to America’s first Victorian opioid crisis and the subsequent crackdown by the newly formed FDA, really about gullibility, or does the strange science of placebos tell us more about the biology of belief than we realize?

Organized in three parts: Lies We Tell Each Other, Lies We Tell Ourselves, and Lies We All Agree to Believe, The Truth About Lies examines the relationship of truth to lie, belief to faith, and deception to propaganda using neurological, historical, sociological, and psychological insights and examples. It will propose that some of our most cherished institutions are essentially massive versions of those self-same, very old cons and also complicate the vision we have of both the habitual liar and the classic “sucker.”

My first book, Stoned, was ostensibly a book about jewelry, but at its heart Stoned sought to answer a single question: Why do people value what they value? The more I thought about it, the more I began to see something else in those stories. I realized that nearly every story in Stoned, whether about a scandal surrounding a stolen necklace, an island bought with glass beads, or the invention of the diamond engagement ring, had a lie right at its center. That revelation, in combination with the conclusions I had come to in Stoned, led me directly to The Truth About Lies and to its core question: Why do people believe what they believe?

Ask yourself: What are you sure of? We can start simple; let’s just talk about basic facts. How many facts are you certain you know? Quite a few of them, probably. You know your ABCs, you know state capitals, you know water molecules are composed of two hydrogen atoms bonded to one oxygen atom.

You know that the earth is round, right?

Are you sure? How did you come by this certainty? Surely you didn’t do the calculations yourself. The odds are, if you tried to right now, you wouldn’t be able to, because you don’t even know exactly which geometric calculations were used, thousands of years ago, to determine that fact in the first place. And even if you did know what they were, your math skills probably aren’t that strong. My point is not to convince you that the earth is flat—of course it’s not. My point is to show you how many truths you accept without ever considering why you believe them to be true. I don’t want you to question whether or not the earth is round; I just want you to realize that you never really did.
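For the curious, the geometry alluded to above is usually credited to Eratosthenes, around 240 BC. A minimal sketch of his estimate, using commonly cited approximate figures (the exact historical values are debated), might look like this:

```python
# Eratosthenes' estimate of the earth's circumference, as a sketch.
# At noon on the solstice the sun stood directly overhead at Syene,
# while in Alexandria a vertical rod cast a shadow at about 7.2 degrees.
shadow_angle_deg = 7.2   # sun's angle from vertical at Alexandria
distance_stadia = 5000   # reported Syene-to-Alexandria distance, in stadia

# That 7.2-degree shadow equals the arc of the earth between the two
# cities, so the full 360-degree circumference scales in proportion.
circumference_stadia = distance_stadia * (360 / shadow_angle_deg)
print(circumference_stadia)   # roughly 250,000 stadia

# With one common estimate of the stadion (~157.5 m), that works out to
# roughly 39,375 km, against the true value of about 40,075 km.
circumference_km = circumference_stadia * 157.5 / 1000
print(circumference_km)
```

No trigonometry required: the whole argument is a single proportion between an angle and a distance, which is exactly why it could be done with a stick and a well over two thousand years ago.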

We blindly trust certain facts: things we’re taught, things we can observe or reason. And once we “know” these things, we never really question them again. But often we also believe things to be fact simply because we’re presented with them. Neurologists refer to this tendency as an honesty bias. It’s how we know almost everything that we know: someone else told us. Or someone showed us, or we read it in a book. And though honesty bias may sound too stupid to be true,* in a strange, roundabout way, it’s what makes us all—as a group—so formidably intelligent.

Without this tendency to trust, to assume, to simply believe, every human on earth would be born starting from scratch, unable to benefit from the knowledge of the collective. This bias toward simple belief in the truth of what we are told or shown has allowed humans to build higher, see farther, and, through shared collective intelligence, become the dominant species on Earth. And yet this vital ability, this necessity to stand on the shoulders of giants and accept secondhand information as truth, is also the very flaw that allows us to be deceived.

Duplicity and credulity are not opposites; they’re just two sides of the same very old coin, and can’t be spent separately. Could it be that at the ancient and tattered heart of humanity, what drives civilization is the capacity in each of us for both deception and belief—and that without this complex duality, there would also be no progress, no social cohesion, no trust, and no ability to collaborate?

Is it possible, perhaps, that you must believe certain lies in order to believe anything at all?

____________

*Though, if I know any one thing for certain, it’s that nothing is too stupid to be true.

Perception, Persuasion, and the Evolution of Deceit

Natural selection is anything but random.

—RICHARD DAWKINS

One should always play fairly when one has the winning cards.

—OSCAR WILDE

 

 

 

 

WE TEND TO ASSUME THAT DELIBERATELY TELLING LIES IS some sort of pernicious aberration unique to liars—perhaps the result of a mental defect or, more likely, some sort of moral failing. It is not. We all lie, all the time—including you.

Before you dismiss this thought, consider: human deception and evasion are no different than the animal equivalent of camouflage, spots, and stripes. Charm is our very own version of frilly fins and peacock feathers. Whether it’s a stick insect adapted to cheat by hiding among twigs or a pretty pink orchid mantis lying in wait to devour the next gullible hummingbird looking for a little nectar, the effort to deceive, from camouflage to creative bullshit, is an evolutionary arms race as old as organic life.

Humans are not the only species that lies—far from it, in fact; any living species that can communicate, verbally or nonverbally, has absolutely figured it out. Take, for example, the Cryptostylis orchid, adapted to both look and smell like the alluring backside of the aptly named orchid-dupe wasp—giving a whole new meaning to honey trap. Or the snake-mimic hawkmoth caterpillar, sporting a pattern resembling the face of a snake to mislead and frighten away any bird that might otherwise see a tasty meal.

Trickery is fundamental to interaction, and the instinct to sometimes subvert or misrepresent objective reality to suit our own needs is fundamental to communication.

In the evolution of deceit, language only came about quite recently, billions of years after more basic and more effective tools of the con. Yet some argue that humans may have developed language specifically to manipulate each other in new and cleverer ways. It’s just the latest innovation in a billion-year-old chess game. As Robert Trivers, professor of anthropology and biological sciences at Rutgers University, put it: “our most prized possession—language—not only strengthens our ability to lie but greatly extends its range.”1

Consider: when you lie with your scent, your pattern, or your petals you can only lie about what you are, and you can only lie about the here and now. Lie with words, and you can lie about anything, anyone, anywhere; you can rewrite facts past, present, and future.

Human speech allows deceptions to transcend space and time.

Learning to lie is one of the earliest developmental milestones children have to hit to be considered functional. Because once we know there is truth, the next stage of normal development is to attempt to hide, misrepresent, or swap out that truth. Lying is one of our fundamental building blocks. It’s a big part of not just who but what we are. When it comes to humans, dishonesty is a feature, not a bug.

These first three chapters explore mechanisms of deceit—how we lie and how lying works—through the lens of three of the world’s oldest and most basic cons: the Big Lie, the Shell Game, and the Bait and Switch.

The first, the Big Lie, exploits people’s theory of mind through their intrinsic capacity for disbelief simply by employing a lie so big that to disbelieve it would threaten our collective sense of objective reality. If that’s too bold—and big-lying is a con for the very bold—you can also manipulate another’s physical perception, as the Shell Game does, exploiting hardwired flaws in our perceptual cognition. Last, because it’s natural to believe our own eyes (even as the Shell Game teaches us we should not), a Bait and Switch allows real evidence to misrepresent fact, leaving the mark to believe whatever you want them to believe.

Deception is an evolutionary tool no different from any other. Whether you’re the liar or the dupe, you are acting on instincts, cognitive processes, and abilities billions of years in the making. As we examine these three most basic cons, we will explore not only the nuts and bolts of deception, or how a lie actually works, but also why it works, from its evolutionary function and form to what it reveals about our own. Part I, Lies We Tell Each Other, examines the mechanics of lying, the evolution of deceit, and asks the question How do you tell a lie?

Now relax; you were quite literally born to do this.

THE OLDEST TRICK IN THE BOOK

Credulity, Duplicity, and How to Tell a Really Big Lie

The impossible often has a kind of integrity which the merely improbable lacks.

—DOUGLAS ADAMS

The great mass of people will more easily fall victims to a big lie than to a small one.

—ADOLF HITLER1

 

 

 

THE BIG LIE

As cons go, this one’s got training wheels. The Big Lie is accomplished by making an outrageously unbelievable claim with total confidence. It is, very simply, the telling of a great big whopper. Strangely enough, people actually are more likely to believe you if you lie about owning an island than if you lie about owning a boat. And don’t worry about the possibility that your mark isn’t completely brain-dead—you want some healthy skepticism. The Big Lie works in tandem with our belief in truth, rather than in opposition to it: its success is reliant on people’s understanding of, and faith in, shared objective reality.

Starting Small

The Big Lie is actually the simplest kind of swindle. All you have to do is tell—and preferably sell—a really outrageously Big Lie. Think: “I own land on Mars and I’m selling time-shares.” You don’t need to actually have the thing or even evidence that you do; the deception works entirely based on the fact that no reasonable person can believe that another seemingly normal, reasonable person would brazenly lie about something so enormous. As suspicious as the story itself may be, it seems more unbelievable that someone would make a story like that up and expect other people to believe it. But more often than not, they do believe it.

The Big Lie’s power lies in its audacity.

Humans require a shared idea of reality to function—for instance, if you drop a ball, it will fall down, not up. Time moves forward. Things are mostly what they appear to be (wet, solid, broken, etc.). Liars are bad; crazy people seem crazy. We all believe these things together, and our faith in a universal objective reality is necessary, even if it’s not always accurate. In the final analysis it does us far more good than harm, but the fact remains: belief in a shared objective reality can be exploited just by flagrantly lying.

We’ll talk more in this chapter about what creates this shared template and expectation that we call “reality,” how we come by it, and why we need it to function, let alone to believe or disbelieve anything at all. But for now the most important thing to remember is that the tighter we adhere to the very normal and very necessary idea of a shared objective reality, the more susceptible we actually are to its subversion.

You Wanna Hear a Really Big Lie?

Gregor MacGregor was the charming, handsome heir to an ancient noble family from Glengyle, Scotland. But like many ancient noble families, the MacGregor family had seen better days. By the time he was born, the MacGregors were making their livings as local tradesmen. And so, like so many other broke aristocrats, MacGregor joined the military, and off he went to seek fortune and glory.

Mostly fortune.

Alas, MacGregor found that there was not enough of either to be had in the Royal Navy, so in 1811 he ditched it and sailed to South America to fight under the command of the legendary Simón Bolívar, El Libertador, in the Venezuelan war of independence against Spain. Bolívar granted MacGregor a commission, ostensibly on the strength of his record in the Royal Navy, or what he claimed was his record in the Royal Navy. It was harder to fact-check people’s résumés in 1811.

MacGregor, though neither a good soldier nor a good leader (he was said to occasionally cut and run, abandoning his men when the odds looked bad), was charming, daring, and flamboyant. He made a name for himself and made his way up through the ranks quite rapidly. So far up the ranks, in fact, that he married Bolívar’s daughter. But having no discernible ideology, nor personal loyalty, MacGregor abandoned La Revolucion and moved on to fight in various other skirmishes throughout the region. And by 1820 he’d discovered actual pay-for-play killing when he took a job as a mercenary on an expedition against a Spanish settlement called Portobello, on the Mosquito Coast of Panama.

It was there he claimed to have encountered the pristine paradise of Poyais, an undiscovered country, found and founded by MacGregor himself, on the Caribbean coast near what is now Nicaragua and Honduras. While Mosquito Coast sounds horribly buggy, it was actually named after the Miskito Amerindians, who dominated the larger region—not the insects. Mosquito derives from the Spanish mosca, or fly. And so in Spanish mosquito means “tiny fly.” The fact that the Miskito kingdom was also full of mosquitos is just one of those creepy coincidences that make you question whether or not retro-causality is really that far-fetched.

Seeing the potential in this marvelous, idyllic new New World, MacGregor persuaded the local potentate (after getting him blind drunk) to sign over to him 12,500 square miles of territory along what is now Honduras’s Black River and to formally acknowledge him as Gregor I, Cazique (prince) of Poyais.2 Or possibly he named himself Gregor I, Cazique of Poyais. The latter seems slightly more likely, but it’s impossible to say, as one man was blackout drunk and the other was a really big liar. Either way, in October 1822, after over a decade of fighting and traveling through the jungles of South America, Gregor MacGregor returned to England from this paradise found. But he didn’t come home as a mere soldier or even a decorated war hero; MacGregor returned to London as Gregor I, prince of the Caribbean nation of Poyais.3

Paradise Found

Upon his return to London in October 1822, MacGregor immediately began a massive media blitz to educate the public about Poyais. He published articles about Poyais in respected journals, describing the land’s unspoiled beauty and excessive natural resources. The prose was accompanied by detailed illustrations, which he claimed he’d brought back from the country itself. These pictures showed a land slightly larger than Wales, full of clean, fresh water and fertile soil for cultivation. There were forests full of trees and game and other exotic flora and fauna. The riverbeds were lined with big chunks of gold, and numerous other wonders, including precious gems, all there for the taking.4

MacGregor even brought a real-live person back from Poyais, whom he declared an ambassador, as well as a copy of the Poyaisian Constitution and the very land grant and proclamation that made him Cazique of Poyais. He claimed the natives were friendly, that the cities were brimming with culture, and that the land was ripe for development and a Christian colonial ruler—a proposition that was particularly appealing in his native Scotland, as the country had no colonies of its own.

Should anyone require a second source, he pointed them to an entire book published on Poyais, written by one Captain Thomas Strangeways, titled Sketch of the Mosquito Shore, including the Territory of Poyais.* The captain’s account not only confirmed but expanded on MacGregor’s description and fantastical claims that Poyais was a land of plenty, brimming with untapped natural resources. Most promising, Strangeways’s book described a land of endless summer and triannual harvests, with a tropical climate so warm and inviting that fruit was falling off the trees year-round—and yet remarkably not so hot or wet as to host the sort of biting insects and tropical diseases Europeans had learned to fear.

In addition to the almost unbelievable potential for agriculture, prospecting, or just lying on the beach eating tropical fruit, there were urban opportunities as well, for Poyais already had a capital, called Saint Joseph—a small but fully Western city with roads, houses, public buildings, a bank, a civil service, and even an opera house.5 So if neither farming nor mining (not to mention loitering) was really your thing, there were plenty of other types of work and opportunities for trade that an enterprising colonist could pursue in Poyais. Particularly considering Saint Joseph boasted a deep-water port, perfect for mercantile vessels to come and go, allowing for the development of all sorts of transatlantic commerce.

Fortunes were waiting to be made between the climate, the natural resources, and the abundant available labor in the form of the “Poyers.” The Poyers were unreasonably friendly, mythically hardworking natives.6 They were plentiful enough to build an entire European city, staff the civil service, operate a small military, and do anything else you might need; but at the same time, they were not so plentiful that they took up any space, owned any land, or otherwise got in anyone’s way. They were basically Schrödinger’s natives. And they were so excited about the idea of white colonists coming to occupy and employ them that they’d supposedly written up a proclamation welcoming them.

Honesty, Authority, and Other Debatable Claims

Does this sound too good to be true? Well, yes, clearly. The whole idea that Poyais was conveniently perfect in every regard, and that anyone believed it for a second, sounds stupid as hell—now. But the default setting in humans is to accept the reality with which they have been presented.

So much so that a little kink in our thinking called honesty bias constitutes one of the twelve basic cognitive biases that circumscribe our perception of reality. Cognitive biases are systematic errors in cognition that occur in processing and deciphering information we glean from the world around us. They’re not mistakes or logical fallacies; they’re hardwired limitations in our thought process. Honesty bias is pretty much exactly what it sounds like: a heuristic (a sort of mental shortcut our brains take) in which we accept as true anything we’re presented with, in the absence of obvious contradiction. For example, if you ask someone the time and they look at their watch and tell you it’s three p.m., you will believe them. You don’t reflexively question whether they’re lying to you or whether their watch is wrong. Unless, of course, it’s too dark out to be three o’clock or you have reason to suspect that the person wants you to be late.

Though cognitive biases tend to skew our judgment badly in some situations, they exist for a reason. Social psychologists believe that cognitive biases aren’t there to screw us up but, rather, to help us process information more efficiently. Honesty bias may leave you open to being deceived, but by the numbers, the vast majority of information you’re presented with is true. Not having to reason that out every millisecond, about every bit of data you encounter, is a valuable neurological ability, a shortcut that allows us to function and learn. Moreover, by compelling us to accept whatever people present us with as true, honesty bias is a huge part of what creates our shared template for reality, which informs our expectations and judgments.

Consider: If I told you that there was a commercial rocket launch this year, taking a shuttle full of paying customers to the moon, would you believe me? You probably would; people believe in the basic reality that they’re presented with, and this is ours. Something almost that absurd really does happen in aerospace every year. Just a few years ago a guy launched a red convertible blasting a Bowie album into the void, forever, for no obvious reason at all. Your grandparents wouldn’t have believed the story about a commercial passenger shuttle to the moon seventy years ago. Your parents wouldn’t have believed it forty years ago. But you and I would. Because most of us have lived our whole lives post–moon landing, post–space stations, post–SpaceX. The space age and its eventual commercialization of space travel is the reality with which we have been presented our whole lives.

So with that in mind: Does Poyais still sound too good to be true?

Yeah, it still does. But in 1822, so did the rest of the New World. This was the era of empire building via seized foreign territories, country-sized land claims based on very little, and unimaginable stolen riches. India was real, with its gleaming golden palaces and massive gemstones. The Near East was real, with its vast oceans of sand and ancient stone cities. Australia was real, with its bizarre, exotic flora and fauna. Why not Poyais?

A story like the one about the riches and idyllic nature of Poyais, and MacGregor’s claim to it, was not completely unbelievable in the 1820s, essentially the heyday of British colonialism. This story, to one degree or another, reflected the reality of the eighteenth and early nineteenth centuries. It was hardly unprecedented to declare a strange, faraway place full of money up for grabs just because someone from Europe went there and tripped over it.

It’s a claim that almost seems reasonable in that context.

So when Gregor MacGregor returned to London referring to himself as Highness Gregor I, Cazique of Poyais, and told the world that he was the newly minted prince of a South American paradise they’d never seen or heard of, they mostly just believed him—and they badly wanted to hear about this new country. It didn’t hurt that the British had only just lost their North American colonies about forty years earlier and were hungry for more American holdings of their own. Both the royalty and the aristocrats of London accepted all he had to say about Poyais remarkably quickly and easily.7 And once London’s most privileged class had signaled that they believed MacGregor’s claims, the rest of English society quickly joined them, followed in turn by the commoners of England and Scotland, each stratum of society’s trust enhanced by the faith of the one above.

There are a lot of funny quirks in our minds that explain, both neurologically and psychologically, why this sort of cascading failure of basic reason would occur. First and foremost, there’s that pesky honesty bias. But another cognitive bias, called authority bias, describes the way in which we tend to trust and are predisposed to believe the people who we see as having any kind of authority (including mere social stature) greater than our own. We’re wired to believe and trust our “betters,” essentially. Authority bias is also a primary factor in why we act in accordance with or follow orders from perceived authority figures—even when we feel like those authority figures might be in the wrong.

The first experiment in authority bias was the Milgram obedience experiment, conducted by Yale University psychology professor Stanley Milgram in 1961.8 Today it is considered the gold standard in unethical psychological experimentation. In the experiment, which was falsely described to the volunteers as an experiment in “learning and memory,” pairs of participants would give and receive tests. In each pair one participant, the “subject,” quizzed a second participant, the “learner.” Every time the learner got an answer wrong, the subject was ordered to administer increasingly painful and potentially dangerous electric shocks to the learner. They started with 15 volts (“slight shock”) and progressed all the way to 450 volts (“danger: severe shock”).

The point of the Milgram obedience experiment was not to electrocute volunteers to death; it was to determine if, and for how long, the subjects could be compelled to do as they were bid without any potential reward or risk of punishment for themselves.9 Would they do it even though they felt that what they were doing was wrong—and kind of sadistic? Would they continue to administer the electric shocks even when the other supposed volunteer wanted them to stop, even when the other person begged them to stop, or got scared and decided to alert them to their heart condition, or, after pleading and screaming in pain, fell suddenly, alarmingly silent? Would they continue just because the person in charge of the experiment told them to?

Depressingly, the answer is yes.

Most of the participants would indeed, simply because they felt compelled by the perceived authority of the person in charge of the experiment. In fact, 65 percent of them would continue all the way to the end of the experiment, even after their partner had stopped begging for reprieve and had fallen silent. Such is the power of authority bias in human consciousness. It wasn’t until after the experiment was over that subjects were told that their partners were not only fine but acting; the electric shocks were never even real.10

Stranger still, we defer to the opinions and directives of perceived authority figures even when their authority has nothing to do with the matter at hand. For instance, you’re more likely to believe a doctor who tells you you’re sick than you are a friend who tells you the same. This makes sense: we default to generally trusting and believing people we deem to be authority figures. But what’s really interesting is that you’d also be more likely to believe your doctor than your friend if they were to tell you how to program the computer in your car, even if you knew that they knew nothing about it. The same holds true for politicians, professionals, “experts” of any kind—even celebrities. We unconsciously assume that they’re better informed than we are, and we are more inclined to take what they say on faith. It’s why celebrity product endorsements are so valuable to companies and so lucrative for the celebrities: you’re hardwired to trust that a famous actress really does know which fruit juice will prevent aging or that your favorite musician really does have the inside track on which charities are legitimate. We don’t believe them because we have any reason to; we believe them because our brain has taken a shortcut.

Like all heuristics, authority bias benefits us, individually and as a cooperative group: we don’t need to know everything about our math teacher to trust them when they show us how long division works. But at the same time, it’s an open loophole in our thought process, one that can backfire, as it did here, or be deliberately taken advantage of by bad actors. The fact that the aristocratic classes of England and Scotland fell for the Big Lie of His Highness Gregor I, Cazique of Poyais, and that then, like dominoes, everyone else down the social and economic ladder fell for it, too, isn’t confusing or absurd; it’s predictable, and it’s evidence that their brains were all functioning perfectly normally.

Crime Does Pay, Mostly in Cash

Poyais was a land rich in literally everything, except for white Christian colonial overlords—and some start-up cash. So, understandably, MacGregor was looking for investment capital to develop the land and settlers to move there. He started by touring England with a very dramatic and colorful native entourage, all of them immaculately civilized but still charmingly exotic. He spoke about Poyais publicly and privately; he gave interviews and showed exhibits that included all of the samples, pictures, and written materials he’d brought back with him. Finally, when the public could wait no longer, he stopped selling his principality figuratively and began selling it literally. And at that point, all those normally functioning brains went bonkers for Poyais.

In short order, London’s lord mayor held a banquet in MacGregor’s honor. One patron even set MacGregor and his wife up in a posh country estate. He was already the toast of London society by the time King George IV knighted him, which the king did largely to ensure that MacGregor would be motivated to keep Poyais (a very valuable territory) a loyal British colony.

Once he had been made Sir Gregor MacGregor by the king, a very legitimate authority figure in his own right, he had no trouble at all securing a loan of £200,000 from the prestigious bank of Sir John Perring & Co. and floating shares in the Poyais venture on the market.11 That’s a big part of how most Big Lies work—in parts: every smaller, previously believed lie lays the groundwork for the next lie to be seen as more credible.

And soon Sir Gregor had opened offices for the Poyaisian legation to Britain in London. And then he opened land offices in Edinburgh, Stirling, and Glasgow from which to sell Poyaisian land to eager colonists. He sold estates to both the British and Scottish aristocracy. He also sold the same sort of estates and the titles of nobility to accompany them to wealthy commoners looking to move up the social ladder in the New World. He sold vast plantations to the would-be upper class of Poyais and even more modest, hundred-acre farms to average colonists.12 MacGregor, empowered as the sole potentate of Poyais, also sold social and professional positions in his new world; the wealthier but not quite aristocratic investor could buy anything from a posting as an important government official to a commission in the Poyaisian military. For the enterprising merchant or business investor, he sold monopolies on industries and on various trade goods.

Last but not least, MacGregor sold money; that is, he facilitated the exchange of vast amounts of Poyais’s official currency* for equal amounts British currency. After all, what good would their British pounds be in the New World, particularly in a country like Poyais that had not only banks and bankers but its own printed, formally recognized legal tender, thanks to MacGregor? If they intended to make a life there, they would need money, and they’d do well to rid themselves of British paper notes that would become worthless as soon as they got there. It was just this thinking that led hundreds of settlers to exchange every penny they had for Poyaisian currency to use in their new home.

MacGregor succeeded in getting seven massive ships’ worth of colonists to leave their homes and embark on a journey to the New World, having spent or exchanged everything they had to start a new life in the now famous paradise. In September 1822 and January 1823 the first two ships, the Honduras Packet and the Kennersley Castle, embarked for Poyais carrying hundreds of passengers as still more ships back in England were filling with people and preparing to set sail, each head on the block worth money.

Over the year and a half leading up to 1823, MacGregor raised far in excess of £200,000 in cash and brought the bond market value of Poyais up to £1.3 million, or about £3.6 billion (or $4.6 billion) in today’s currency. He personally made hundreds of thousands of pounds, all before the first ship had even dropped anchor.

When the Honduras Packet finally did reach land, the colonists had no idea where they were, except that it was definitely not Poyais.

They assumed they had landed in the wrong location, as they found no ready-made cities, no valuable resources, no farmable land, not even edible food. And there were no friendly natives to meet them. There were no unfriendly natives. In fact, there were no signs of any other humans at all, because even the climate was a lie; that section of coast was a hot, swampy, mosquito-infested jungle so uninhabitable that it remains largely undeveloped to this day.

They did their best to build shelters out of sticks and mud and to find fresh water, but a majority of the stranded settlers died from starvation, exposure, and tropical diseases like yellow fever and malaria on the isolated, dangerous, and basically deserted Mosquito Coast. It wasn’t until a small group of survivors—all that was left—was rescued by a passing British ship from a nearby colony in Belize and taken back to London that the disaster was exposed. Gregor MacGregor had not merely oversold Poyais’s virtues or exaggerated his ownership or authority in the matter; he’d actually made up the entire country.

“Poyais” had never existed at all.

Theory of Mind and Big Lying

It’s one thing to make up a girlfriend who lives in Canada. But who makes up Canada? You’re probably thinking: a crazy person. And that’s true—to some degree. At least, that’s what we believe. So if a person is ostensibly sane, and talks up the existence of a beautiful country no one else has ever visited or heard of, people are more likely to believe him than not.

That’s a shocking and even kind of scary thought. It sheds light on how easily our own beliefs and preconceived notions about the truth—for example, the notion that only a crazy person would lie about the existence of something as enormous (and demonstrably either true or false) as a whole country—can themselves be exploited to deceive.

The Big Lie relies on your disbelief in the possibility that so many of your fundamental assumptions about objective reality could be wrong—far more than it requires your belief in the lie itself. That’s why a Big Lie doesn’t need to be convincing. In fact, the bigger and more absurd it is—the less believable—the more it reinforces your basic instinct that no one would lie about something so obviously preposterous. But where do we get these beliefs about objective reality, and how do we know that everyone’s are the same?

The developing field of social neuroscience emerged about thirty-five years ago with what’s known as theory of mind. The term itself was first used by U.S. psychologist David Premack in a now famous experiment carried out on a chimpanzee named Sarah, to try to determine if she possessed self-awareness. The experiment was called the mirror test; Premack’s team altered Sarah’s appearance by placing a red dot on her forehead. Then they put her in front of a mirror. Instead of assuming she was seeing another chimp with a different appearance and swatting at the imagined intruder, Sarah approached the mirror and peered closely at her reflection. She reached out her hand to her own forehead, not the mirror, and began touching her forehead, attempting to find and wipe away the red dot that didn’t belong there. In doing so, she demonstrated that she recognized her own reflection (that she had a reflection), understood herself to be an individual, and noticed something out of place.

Suffice it to say, she passed the Do you know you exist? test with flying colors.

Since then, however, the term theory of mind has expanded to describe the ability of an individual to take that self-awareness (hey, that’s me in the mirror) and sense of objective reality (there’s some schmutz on my forehead) and to understand, first, that he or she not only exists as an individual, like Sarah, but also has individual perceptions, thoughts, feelings, and beliefs that cause or can predict reactions to information and, second, that other people also have individual perceptions, thoughts, feelings, and beliefs, which similarly create and can predict their reactions to information.

Shorthand: theory of mind means the ability to think about what someone else might be thinking.

Theory of mind describes the facility to ascribe states of mind or intentions to both oneself and to everyone else. Or as A. M. Leslie defines it in the International Encyclopedia of the Social and Behavioral Sciences, “Theory of Mind concerns our ability, not simply to have beliefs as such, but to have beliefs about mental states, including the recursive ability to have beliefs about beliefs.”13 This cognizance that others perceive, think, feel, and even lie just as we do lends itself to a sense of objective reality assumed by each individual but tacitly agreed upon by all.

For example, you’re in a park and you see a stone wall. Naturally, you assume anyone else in the vicinity can also see the wall. And you make many other assumptions; for instance, that it is a wall, that the wall is inanimate, that it’s basically solid—you couldn’t walk through it—and that it’s largely immobile. None of this is remarkable. What is remarkable is that you also assume—you absolutely believe—that everyone else who sees the wall automatically assumes the same “facts,” and accepts these same facts as objectively true.

This active engagement of theory of mind—my thinking about what you see or believe or know about the same wall—is referred to as “mentalizing,” a capacity that includes “the critical ability to make inferences about the intentions of other people and their beliefs and to infer whether the emotions or other states signaled by social cues are or are not an accurate reflection of the actual emotional state of the individual”—in other words, to consider whether others might be lying to us.14

Theory of mind allows us to think about what another person might think, know, assume, or feel. Additionally, our ability to understand that others think, feel, and have intentions in the same way we ourselves do allows us to attribute a variety of mental and emotional states to other people, and then to use those assumptions to interpret, explain, or predict their responses. Our theory of mind and ability to mentalize also allows us to elicit specific responses from those people and influence their reasoning: to manipulate, to lie.

Let’s go back to that wall, an instance of our shared objective reality. I know that rocks are hard and solid—and don’t generally carry cash. Because I possess a functioning theory of mind, I know that you know that too. So, if I stole your wallet and wanted to hide it where I know you wouldn’t think to look, inside a false stone in that wall would be a pretty good choice. You might check my pockets, my car, even my bank account—but we both “know” that rocks are solid, so inside of a rock isn’t even going to occur to you. You see, it turns out all that’s really required to lie is the understanding that other people are thinking in more or less the same way that you are. Once you know that, it’s very easy to present others with information that, true or untrue, will elicit the desired response.

Theory of mind, the very thing that allows us to understand that there is objective fact—and that others might attempt to subvert it—is also the very thing that allows us the ability to lie.

That other people are thinking more or less the same way you are is also why little lies can be so hard to pull off. Remember when I said that it’s easier to convince someone that you own an island than a boat? Because we’re all sharing a theory of mind, we all know that other people lie. It’s why we’re most on guard when we think someone is trying to sell us something: it pays to be careful. You might think the bigger the potential swindle, the more cautious we’d be, but paradoxically the reverse is true. The Big Lie works not by preying on people’s gullibility, but by preying on their anticipation of a swindle. Any seemingly sane, normal person with the confidence to assert such a patently unbelievable claim must, we assume, have grounds to back it up. Otherwise, our theory of mind assures us, they would never expect us to believe it. The Big Lie completely subverts our shared sense of objective reality by telling a falsehood so outrageous that it must be true—a falsehood like an entire country, and like the fictitious economy that drove its bond price to the equivalent of $4.6 billion, a falsehood that sent hundreds of settlers in boats out on the open ocean to die.

Natural Born Liars

Once we develop a theory of mind, we begin to recognize and believe in this shared objective reality. Once we believe in it, we immediately look for ways to subvert it—that is, to deceive. And I do mean immediately; lying is not only a normal human behavior and a profound adaptive advantage, it’s such a fundamental one that we develop and hone it from infancy alongside other basics like walking, talking, and fine-motor skills.

Infants obviously have a far more intense honesty bias than adults. They accept basically everything they’re presented with as true—even when they do have reason to doubt it. It’s how they learn so much more quickly than we do, because cognitive biases exist to help us process almost infinite information more quickly and efficiently, even if that speed leads to a certain number of errors. (It’s also why their little minds are absolutely blown by peek-a-boo.) But in addition to that honesty bias, infants and toddlers also have a fully functioning theory of mind, ready to be put to use deceiving people.

How do we know that—apart from the fact that they’re manipulative little suckers? Actually, that is how we know. There’s a whole host of important social cognitive skills, including establishing joint attention, intentional communication of any kind, and the ability to imitate specific movements and gestures (like patty-cake or waving back at someone) and facial expressions—all of which betray a functioning theory of mind. And the vast majority of infants have mastered these, at least, by nine months. By eighteen months, most toddlers begin to engage in not just manipulative but deliberately subversive or deceptive behavior: creating distractions, hiding food they don’t want to eat, and even feigning emotions to elicit a desired, previously observed response (also known as fake crying).

I’m making them sound like tiny sociopaths, but in fact infants are simply accumulating reference points and practicing interactions with people—honing their ability to mentalize—at an incredibly rapid clip. And all the things they’re doing are very normal and important—so important, in fact, that if a child hasn’t figured out how to blatantly, verbally lie by three or four years old, it’s considered a concerning sign of developmental delay.

Based on all these things, researchers have suggested that the ability to understand and predict the behavior of another person (to mentalize) actually has “an innate, biological, and modular basis.”15 In other words: you were born to lie.

And you’re not the only one.

Bridge for Sale

In 1925, another purveyor of Big Lies, Count Victor Lustig (probably not his real name, definitely not his real title), sold the Eiffel Tower to a scrap-metal dealer. And then he sold it again, a week later, to a different buyer. And then Lustig hightailed it out of Paris, because he did not, in fact, own the Eiffel Tower.16

When he got to America, Lustig was in plentiful, if not necessarily good, company: a con man by the name of William McCloundy had sold the Brooklyn Bridge in 1901 and then spent two and a half years in Sing Sing prison for grand larceny—and he’d sold it only once. Years before him, only a few years after the bridge’s construction was completed in 1883, another big liar, named Reed C. Waddell, ran the same con, successfully selling the Brooklyn Bridge to unwitting marks for almost twenty years. After Waddell, a pair of brothers named Fred and Charles Gondorf had a go at it and improved on the con, timing beat cops’ routes so that they could put out a sign that read bridge for sale and then quickly, if temporarily, take it away again, just as the cops walked back past the bridge. The sign didn’t have a price listed, because that changed with each mark. According to an infamous fellow con man, Joseph “Yellow Kid” Weil, “once they sold half the bridge for $250 because the mark didn’t have enough cash.”17 The Gondorfs sold the Brooklyn Bridge many times, to many different would-be buyers, for amounts ranging from two or three hundred dollars to one thousand dollars, the price dependent upon what they discerned each mark could (and would) pay.

But the year Lustig arrived from Paris, a man named George C. Parker had taken over selling the Brooklyn Bridge. And sell he did. The only thing more unbelievable than how many people sold the Brooklyn Bridge is how many people bought the Brooklyn Bridge, convinced, among other things, that they’d be able to set up a tollbooth. For about a decade police were constantly taking down obstructions and informing recent buyers that they did not own the throughway.18 Of all the liars who sold the bridge, Parker might have been the biggest. In fact, Parker, also the infamous seller—if not owner—of the Statue of Liberty, the Metropolitan Museum of Art, and Grant’s Tomb, sold the Brooklyn Bridge so many times that we have his racket to credit with the famous expression: “If you believe that, I got a bridge to sell you.”

So what sort of dark arts did men like Waddell, the Gondorfs, Lustig, and Parker use to convince people that they owned, and had the right to sell, these famous monuments? None whatsoever. They just told people that they did—a claim so bold and so outrageous that it must be true—and those people, being basically sane and normal, believed them.

Because who makes up an island? So to speak.

You see, despite the somewhat flexible nature of distinctions between what is true and what is false among humans, and constant disagreement about “The Truth,” we all agree that facts are facts. And that certainty that there are absolutes, and that we all perceive, understand, and tacitly agree upon them in the same way, leaves the door unlocked for deception, because a general belief in truth is absolutely required for lies to work.

Consider Jefferson Randolph Smith, a fellow hustler working on the opposite side of the country, who set up the first telegraph office out of Skagway, Alaska, in 1898. For the steep price of five dollars, settlers, frontiersmen, and prospectors could send a telegraph message to anyone in the United States. Unsurprisingly, there were lines of customers around the building every day.19