How "Being Wrong" can be so right

Have you ever been wrong? Well then, this book is for you.

It's a trick question, because everyone is wrong all the time. A more detailed review after the jump, but the bottom line: read it.

I'm barely exaggerating when I say that reading Being Wrong: Adventures in the Margin of Error by Kathryn Schulz should be compulsory for anyone and everyone who has ever thought they know the truth, which is to say everyone. Drawing from history, philosophy, science, current events and a smattering of personal reflection, Schulz takes us through what it means to be wrong, why we get things wrong, how it feels to be wrong, and finally why understanding and embracing wrongness can be a good thing.

Part 1: The Idea of Error
I'm no philosopher, but it seems the understanding that error happens has been around for a long time. As with many things, though, defining it is surprisingly difficult, and Schulz spends the first two chapters describing the problem. The idea of being wrong is inseparable from the idea of being right, and so questions of Truth, Knowledge, and belief must also be dragged into the discussion.

Schulz then describes her two models of being wrong. The "pessimistic model" is one that many people consciously adhere to: error is a terrible thing, to be avoided and eliminated at all costs. Everyone knows that being wrong -- about politics, about people, or even about the bus schedule -- can be at best irritating and at worst disastrous. But she also invokes an "optimistic model," which acknowledges the joy, humor and sometimes the necessity of being wrong. From optical illusions to literature to humor, Schulz makes the case that in being human, wrongness is a feature, not a bug.

My ego (and the egos of many reading this blog, I imagine) gets stroked a bit when she describes science as embracing the optimistic model as an approach to knowledge. Unfortunately, those feelings of superiority come from the certainty that we're right. She manages to chastise that certainty too, and reminds us that the power of science does not come from generating certainty, but from generating doubt (not coincidentally, generating doubt is also what gives this book its power).

Part 2: The Origins of Error
This was my favorite part of the book. It's an impressive accounting of the sources of error, from perception to cognition to the forces of society. I was a bit put off by the characterization of all forms of knowledge as "belief," but that's probably because of my proximity to the gnu atheist arguments. She makes a compelling case, and also acknowledges that some forms of evidence are more reliable than others. In the end, the argument is that ALL beliefs are subject to bias, misrepresentation and error, and I think this is a point well worth making.

The section ends with a chapter on "The Allure of Certainty," which uses the examples of the Zealots and of the 2004 presidential election to demonstrate our passion for being certain, and the terrifying consequences when that certainty reaches pathological levels. This chapter has my favorite passage [emphasis in the original, errors are probably mine]:

The psychologist Rollo May once wrote about the "seeming contradiction that we must be fully committed, but we must also be aware at the same time that we might possibly be wrong." Note that this is not an argument for centrism, or for abandoning our convictions. It is an argument for retaining our convictions -- and our conviction -- while jettisoning the barricade of certainty that surrounds them. Our commitment to an idea, he concluded, "is healthiest when it is not without doubt, but in spite of doubt."

Most of us do not want to be doctrinaire. Most of us do not want to be zealots. And yet it is bitterly hard to put May's maxim into practice. Even with the best of intentions, we are often unable to relinquish certainty about our beliefs. One obstacle to doing so is the feeling of being right, shored up as it is by everything from our sensory impressions to our societal relations to the structure of human cognition. But a second and paradoxical obstacle is our fear of being wrong. True, certainty cannot protect us from error, any more than shouting a belief can make it true. But it can and does shield us, at least temporarily, from facing our fallibility.

Part 3: The Experience of Error
This segment is my least favorite, and probably the least substantive, but that's only because it's in a book that's jam-packed with substance. It documents (as well as is possible) how it feels to be wrong, from the mundane errors of everyday life, to the earth-shattering destruction of an entire world view, to errors that can destroy other people's lives. But this section is not a catalogue of different ways of being wrong; it's an attempt to sew them all together with a common thread, to give a sense of why being wrong often sucks, but also how it can convince us to be better people.

Part 4: Embracing Error
The conclusion of Schulz's thesis begins with a paradox: embracing error as a way to eliminate it. She describes how corporations like Motorola and Ford, as well as the entire aviation industry, have managed to dramatically curtail error by acknowledging it, quantifying it, and actively and consciously eliminating it. The whole book is about the upside of error, but this chapter readily describes the disasters that can unfold as a result of mistakes. This seems to be the fulfillment of the pessimistic model of error discussed in part 1, but in a strange way it undermines it. The only way to eliminate error is to understand it and to recognize that it is inevitable.

Schulz concludes with the optimistic model: error is inevitable, yes, but it's also at the root of our humanity. It makes art and humor possible, and makes human cognition possible. "Our capacity to err," she says, "is inseparable from our imagination."

-------

PS - This is my first ever book review. Was it any good? Did I do anything wrong (ha ha) or anything right? Do you think you'll read the book?

Any feedback would be greatly appreciated.


It looks good. I'm going to have to give it a shot. Recently, I read a book on how math can be used to fool us. At any rate, the author was discussing polling and he pointed out that in a nearly even race, taking a poll is pointless. You might as well flip a coin. This led me to question coin flips. So, I got eight pennies, shook 'em in my hand and tossed them on a counter. I then did this 17 more times, tabulating my results. I kept a tally as I went, always expecting a regression to the norm...I never got it. 1008 coin tosses...535 were heads. A statistically significant result. After I did this, I talked to someone else who informed me this had already been done and it was found that heads was statistically more likely. My point of course is that I was in error. I had a belief that if I kept throwing those coins long enough soon I would see tails coming up very, very, very frequently. If someone else had asked me..I'd have told them about regression to the mean. I'd have been wrong and had been wrong for a long time...til a couple of weeks ago. Oddly, Feynman made the same point with a comment on being comfortable not knowing. During the shuttle investigation he also made similar comments regarding NASA's claims about safety and the statistics they used. They provided comfort, but with consideration what they portrayed was pretty ridiculous.

By Mike Olson (not verified) on 13 Jan 2011 #permalink
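For what it's worth, Mike's tally sits right on the edge of significance. With 1,008 fair-coin tosses you'd expect 504 heads, give or take about 16; 535 heads is just under two standard deviations out, a two-sided p-value of roughly 0.05. A quick back-of-the-envelope check (a Python sketch, using only the numbers from his comment):

```python
import math

# Numbers from Mike's comment: 1008 tosses, 535 heads
n, heads = 1008, 535
p = 0.5  # fair-coin assumption

mean = n * p                      # 504 expected heads
sd = math.sqrt(n * p * (1 - p))   # ~15.9 heads
z = (heads - mean) / sd           # how many standard deviations out?

print(round(sd, 2), round(z, 2))  # → 15.87 1.95
```

So the result is suggestive but hardly a slam dunk -- a fitting anecdote for a book about being wrong.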

Without having read it, or even thought about it, I will defend Part 3. Knowing how people feel about being wrong is an important part of our fight because it determines what they'll do about being wrong, and how they'll be wrong. We really need to know this stuff. Just like a detective needs to know how a killer kills his victims, and how he covers his tracks. If we're ever going to get good at luring people away from pseudoscience, we need to know how it intimidates its victims.

I think I need to read section 3. I've always had the attitude described in section 4, and I'm always grateful to have errors pointed out to me. People getting upset when they're called on a mistake "does not compute" for me. This makes collaborations interesting, in the interesting times sense of the word.

By stripey_cat (not verified) on 14 Jan 2011 #permalink

I'll probably read it.

By herp n. derpington (not verified) on 14 Jan 2011 #permalink

@ Mike - But did you try nickels?

@ Steve & stripey - I did not mean to imply that section 3 wasn't good. It's pretty tough to get inside other people's heads and come up with something like a universal experience of being wrong - I think that Schulz probably does as well as she could have with it.

@ herp - cool, let me know what you think.

Odd you would mention that. I've given real thought to tossing (a representational sample of...) all currently circulating coins to see what would happen. I haven't gotten motivated enough yet to do it.

By Mike Olson (not verified) on 14 Jan 2011 #permalink

You've convinced me to read it.

Re comment by Mike.
You may find this interesting. Regarding regression to the mean: you are right to expect it! But it is not unusual to observe pseudo-trends in large sets of random data.
You may object to the statement that Susskind makes when he discusses the outcome of 1,000 flips.
Oh, and it is easier to simulate flips (computers).

LEONARD SUSSKIND, Physicist, Stanford University

Conversation With a Slow Student

Student: Hi Prof. I've got a problem. I decided to do a little probability experiment -- you know, coin flipping -- and check some of the stuff you taught us. But it didn't work.
Professor: Well I'm glad to hear that you're interested. What did you do?
Student: I flipped this coin 1,000 times. You remember, you taught us that the probability to flip heads is one half. I figured that meant that if I flip 1,000 times I ought to get 500 heads. But it didn't work. I got 513. What's wrong?
Professor: Yeah, but you forgot about the margin of error. If you flip a certain number of times then the margin of error is about the square root of the number of flips. For 1,000 flips the margin of error is about 30. So you were within the margin of error.
Student: Ah, now I get it. Every time I flip 1,000 times I will always get something between 470 and 530 heads. Every single time! Wow, now that's a fact I can count on.
Professor: No, no! What it means is that you will probably get between 470 and 530.
Student: You mean I could get 200 heads? Or 850 heads? Or even all heads?
Professor: Probably not.
Student: Maybe the problem is that I didn't make enough flips. Should I go home and try it 1,000,000 times? Will it work better?
Professor: Probably.
Student: Aw come on Prof. Tell me something I can trust. You keep telling me what probably means by giving me more probablies. Tell me what probability means without using the word probably.
Professor: Hmmm. Well how about this: It means I would be surprised if the answer were outside the margin of error.
Student: My god! You mean all that stuff you taught us about statistical mechanics and quantum mechanics and mathematical probability: all it means is that you'd personally be surprised if it didn't work?
Professor: Well, uh...
If I were to flip a coin a million times I'd be damn sure I wasn't going to get all heads. I'm not a betting man but I'd be so sure that I'd bet my life or my soul. I'd even go the whole way and bet a year's salary. I'm absolutely certain the laws of large numbers -- probability theory -- will work and protect me. All of science is based on it. But I can't prove it, and I don't really know why it works. That may be the reason why Einstein said, "God doesn't play dice." It probably is.
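And since flips really are easy to simulate, here is a minimal sketch of Susskind's square-root rule of thumb (the run counts are arbitrary, chosen just for illustration):

```python
import random

random.seed(1)                    # reproducible; any seed behaves similarly
n_flips, n_runs = 1000, 2000
margin = n_flips ** 0.5           # Susskind's rule of thumb: ~31.6 for 1,000 flips

# Count how many runs land within the margin of error of the expected 500 heads
within = 0
for _ in range(n_runs):
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    if abs(heads - n_flips / 2) <= margin:
        within += 1

print(within / n_runs)            # lands near 0.95 -- "probably," not "always"
```

The square root of N is about two standard deviations for a fair coin (one standard deviation is sqrt(N)/2), so roughly 95% of runs fall inside the margin -- never all of them, which is exactly the student's mistake.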

Nice. one word laughing.

By Mike Olson (not verified) on 14 Jan 2011 #permalink

Seems to me she goes astray with the characterization of all forms of knowledge as "belief"; and that all such are subject to bias, misrepresentation, and error.

Bertrand Russell pointed to the difference between someone holding that something is so, and one who believes it is so. The scientist who holds something will change his mind if presented with evidence to the contrary. The believer will not. Evidence is irrelevant to a belief.

Logician Willard Quine in "Quiddities" (1987) decries the quirk of language that allows "belief" to be applied indiscriminately to both the above:

"We can think hard, but we cannot believe hard. We can believe something, but we cannot think something. Grammar forbids.

"Believing is a disposition. Thinking ... is an activity, however sedentary. We could tire ourselves out thinking, if we put our minds to it, but believing takes no toll. We sit and think, but do we sit and believe? The White Queen, indeed, professed to do so: 'When I was your age, I always did it for half-an-hour a day. Why, sometimes I've believed as many as six impossible things before breakfast.'

"She represented beliefs, some of them anyway, as voluntary activities rather than dispositions. ... To speak of simply deciding to believe something, independently of any evidence real or imagined, is to stretch the term "belief" beyond belief.

"An enamored young man has his reasons for subscribing to the tenets of his fiancée's church ... but these are cases of feigning belief, or of paying lip or pen service, and not of believing."

[Quine closes with the reasonable man's paradox:]

"To believe something is to believe that it is true; yet experience has taught him to expect that some of his beliefs, he knows not which, will turn out to be false. A reasonable person believes, in short, that each of his beliefs is true and that some of them are false. I, for one, had expected better of reasonable persons."

By Mark Saha (not verified) on 15 Jan 2011 #permalink

@ Mark - That's interesting. Though I suppose it (like so many arguments) turns on the definition of belief.

In Schulz's definition, every time you realize you are wrong, you are overturning a belief. Sometimes the beliefs are minor (I believe it will rain this afternoon, so I bring an umbrella) and sometimes they are significant (I believe I will be with this person for the rest of my life). The strength of the belief may vary, along with your willingness to concede you are wrong, but they all take similar forms.

I think she makes her own case better than I can, but I do agree with you to some extent. I'm uncomfortable labeling everything as belief, as that word doesn't lend itself to different forms of evidence, but if it's well qualified (as it is in the book), I can't quibble too much.

Kevin,

Good comments, which suggest maybe I'm taking the book too seriously -- making more of it than the author intends.

Probably she is concerned with how people react or are affected by realizing they are mistaken about something in everyday life.

That would exclude believers in the religious sense. Since nothing will persuade them otherwise, they are immune to the experience of error (at least with regard to religion).

That still leaves the problem of beliefs as dispositions. Many of these are mere preferences, such as believing vanilla ice cream is best. Similarly, in politics I doubt anyone is really "wrong". One might denounce a Nazi for his disposition, but that's about as far as it goes. The Nazi may be "wrong" in a moral sense, but that isn't what she intends.

We can definitely be wrong about a fact, such as the date of Washington's birthday. But a change of disposition could merely be overreacting to an imagined slight, and then apologizing upon realizing this (which alters your disposition).

The real problem with dispositions is that youâre not necessarily wrong. More often you just want to fit in with the gang.

By Mark Saha (not verified) on 16 Jan 2011 #permalink

Good review - I'll add this to my wish-list.

I think Mark (and Quine) are on to something. We lack the terms to grasp the finer points of what "knowing" things -- and therefore being right or in error about them -- really amounts to. My body knows about Newton's laws: it moves to catch balls anticipating they will move in parabolic curves, and it knows about gravity too. And so does a deer's body, or any other calculator of motion. My conscious grasp of these things may be in error (it probably is), but my unconscious grasp of them is "right" in some very certain ways, even if I often miss the balls. There's a big difference between this and "knowing" about, say, the virtues of democracy. Science is more often about making explicit the first kind of knowing than the second, which is belief more than knowledge.

I would add that the core of religion is the knowledge that there's a lot more going on - inside as well as outside your head - than you will ever be able to know.

@ Mark - This is definitely a book for laymen and not for philosophers, but she doesn't exactly exclude religious beliefs. She talks about religious conversion, and about losing belief, many times. I think that precisely because religious beliefs are so deeply ingrained, when someone does experience that error, it is quite profound.

You said, "they are immune to the experience of error," which actually brings up a point she makes early in the book that I find fascinating: we are all immune to the experience of error. Which is to say, you can't ever feel what it's like to actively be wrong. Once you know you were wrong, you are no longer wrong - you can only remember what it was like to think you were right about something you now know to be wrong. (I know this has nothing to do with your point, but I think it's really interesting)

@ Peter - That's an interesting distinction between something your body knows and something you consciously know, though I imagine they're not as separate as you might think. Actually, discovering error about something your body "knows" is probably even more disconcerting.