The Legend of Ezekiel Bulver

Here’s how the legend began: Ezekiel Bulver, at the tender age of five, once heard two people having a dispute. (I’ve modernized the story a bit.) The first person insisted that the sum of two sides of any triangle will always be greater than the length of the third side. The second person argued that the first person only believed that because he was a socialist.

“At that moment”, Ezekiel Bulver assures us, “there flashed across my opening mind the great truth that refutation is no necessary part of argument. Assume that your opponent is wrong, and explain his error, and the world will be at your feet. Attempt to prove that he is wrong or (worse still) try to find out whether he is wrong or right, and the national dynamism of our age will thrust you to the wall.” That is how Bulver became one of the makers of the Twentieth Century.

You’ve never heard of Ezekiel Bulver? Astonishing! Anyone who wants to gain some measure of freedom from our all-too-human tendency toward poor logic, and to cut through the confusing clutter of contemporary media, needs to understand Bulver. Well, no worries–here’s a clever little doodle video to bring you up to speed . . .

*Thanks so much to the CSLewisDoodle YouTube channel!

The Other Side Is Evil (Moralized Disagreements)

Rarely do I come across something so closely aligned with my own goals in blogging that I use it in place of an original post. But this video is such a thing. In the context of the Kavanaugh hearings, Kyle Blanchette skillfully breaks down how we tend to view those who disagree with us as stupid or evil. This is NOT about which side is right, or even the reasons behind each side. It’s about how we judge those who disagree with us. Worth your time.

Do Motives Cloud Judgment?


Can our motives cloud our judgment? Yes. Without a doubt. (See this post and this post.) But does this mean we should always suspect our judgments and the judgments of others? That seems unreasonable. When I say that motives or psychological states can “cloud our judgment,” what I mean is (roughly) this–if we want something to be true, we tend to see the reasons for that view more favorably, and when we don’t want something to be true, we tend to see the reasons for that view less favorably. “More/less favorably” just means that the reasons appear to have more/less force to us than they would to someone with similar intellectual abilities and no desire either way (no horse in the race).

For example, some early scientists believed in “preformationism,” which is the view that a tiny embryo exists in every sperm cell. So, when these scientists looked through primitive microscopes, they were inclined to see the outline of such an embryo in sperm cells. Others who did not hold this view did not see the embryos. Even the most ardent truth-seekers sometimes allow their biases and desires to affect their perception and judgment.

But to leap into the swamp of skepticism is a mistake. Here’s a common line of reasoning I observe.

  1. Psychological states, such as desires, often cloud human reasoning.
  2. Peter is expressing reasons for a view that he desires to be true.
  3. Therefore, I should mistrust Peter’s reasoning.

The most common example of this is when a religious skeptic dismisses the reasons presented by a Christian for her belief (which she wants to be true). Almost as common: a Christian assumes that the skeptic is only a skeptic (thus dismissing his arguments) because he doesn’t want there to be a God! Call this the “bad motives” attack. Several things strike me as wrong-headed about this kind of thinking.

Problems with the “Bad Motives” Attack

First, the reasoning presented by a person for their belief must stand or fall on its own merits. The motivations, desires, fears, etc. of that person are completely irrelevant when asking, “Is the reasoning they present any good?” (i.e., is the argument valid?). To critique or question a person’s motives instead of critiquing their actual argument is evasion. We resort to this red-herring tactic only when we lack the intellectual skills to logically evaluate the argument being presented. (I should also add that you can admire the logic of an argument without agreeing with it! Being wrong is not the same as being irrational. Several very rational theories exist to explain the extinction of the dinosaurs, but most of them are wrong!)

Second, this view is a two-edged sword. If all judgment is suspect because of hidden psychological interference, then the critic must turn this spotlight on her own reasoning as well. Could it be that (speaking as the critic) my own skepticism about Peter’s reasoning (in the example above) is actually the flawed product of my own motives–I don’t want him to be right! We should doubt the skeptic’s reasoning on exactly the same grounds that the skeptic doubts ours.

Third, wanting something to be true does not automatically cripple our judgment and reasoning. In fact, I don’t think anyone really believes it does. I know this because we apply this critique inconsistently. We pick and choose when to apply the “bad motives” attack, typically applying it to arguments for views we personally don’t like. And certainly we shouldn’t refrain from arguing in favor of things we care deeply about. For instance, I care deeply about the evils of human trafficking. Does this mean I am disqualified from making judgments or arguments against human trafficking? That seems absurd. Let me make my arguments, and then evaluate their soundness on their own merit! This is one reason why good academic journals and conferences don’t want the author’s name on a paper submission. The author’s motives and desires should be irrelevant in evaluating the quality of the arguments presented. 

Last Words

True, there is such a thing as confirmation bias. Our wishful thinking can mislead our reasoning at times if we are not vigilant. But hyper-skepticism about everyone’s beliefs and reasoning is unjustified. So, I want to discourage you from using this “bad motives” attack as an easy response to arguments you don’t like. Deconstructing everyone’s judgment this way, including your own critiques, leads us to a dead end.

*I’m indebted to Josh Rasmussen for his insightful comments on his own recent Facebook post.

When Speech Feels Like Violence

Speech sometimes offends, even injures, our sensibilities. Alex Jones and the decisions of Apple and Facebook to remove his content illustrate this. But there are at least two ways speech can “hurt” us. Some hurtful speech stabs to the core of our self and our sense of dignity as a human being. Other times, speech threatens us because our inadequate cognitive defenses and filters fail to protect our psyche. I want to address the second kind of scenario because it is more “up to us” than the first kind.

Epistemic Immune System

My father endured numerous chemotherapy treatments during his battle with cancer in 2002. I distinctly recall one time when his immune system was so severely compromised by the chemo that we had to wear face masks just to come into his hospital room. And if anyone was sick–forget it! A common cold could kill him. If someone walked into the room without a mask, a nurse would immediately escort them out with a stern reprimand. Ordinary germs–ones that any healthy immune system would handle easily–constituted a threat.

Something similar goes on with our beliefs. You could say we have another immune system–an epistemic immune system. Instead of protecting us against bacteria and viruses that threaten our body, the epistemic immune system protects our “worldview” (our system of beliefs about reality) against false ideas and bad logic. When our epistemic immune system is healthy, it identifies bad ideas and bad reasoning and escorts them to the mental trash bin. It also identifies good ideas and sound reasoning and allows them through unharmed, where they find eventual integration with our worldview. If our epistemic immune system functions well, we feel more secure and less fearful because we know our beliefs will remain healthy despite our exposure to bad ideas.

We need a healthy epistemic immune system because bad ideas really can harm us. If bad ideas gain “admission” into our belief structure, they can start to cause problems. They can cause psychological anguish or pain. They can result in actions that harm us or others. They can conflict with other (good) beliefs, or erode the foundations of our worldview. We sometimes feel this in the form of cognitive dissonance or instability. Like a man on a boat for the first time in choppy seas, we wobble around, out of balance and extremely uncomfortable. We sense that any small push might send us tumbling, our worldview crashing like a Jenga tower. Every disagreement feels like a threat, like spoken violence.

Inside and Out

Think of your worldview as a city with two lines of defense: outside the gate and inside the gate. You control what you are exposed to “outside” the gate by choosing what to read, watch, listen to, etc. But once you have seen or heard an idea, it’s through the gate and your internal mental defenses (epistemic immune system) have to do their job. It is very, very difficult to completely control what gets through your gate. It’s like movie spoilers–if you’re using social media, it’s really hard not to find out that everyone dies in Infinity War. (See!?!?) Ideas zip through the gate of your eyes and ears so fast! This is why we need a healthy epistemic immune system on the inside.

Now here’s the real crux of the matter. When our internal defenses are weak, we are too easily thrown off balance by disagreement and contrary views. Fear and insecurity rule us. So here’s what we do: we try to shut the gate. Or we at least build a barricade in front of it to block new ideas out. How do we do this? I’ve observed (even in myself) two main strategies. First, we avoid exposure to new ideas–we become epistemic hypochondriacs. We shun (or censor) books, websites, and people who disagree with us. Second, we use anger or outrage as a shield. Instead of looking carefully at the idea presented and constructing a reasonable response, we try to intimidate the other party into silence with loud, abusive speech.

Now before you write a nasty email or comment, let me clarify something. Remember I mentioned two ways that speech can hurt. When you’re dealing with the first sort (see paragraph 1), epistemic defenses won’t help much. This sort of deeply abusive speech that penetrates to our core does not require careful analysis and logical counterargument. It’s what the Supreme Court referred to as “fighting words.” But the other sort of “hurtful” speech–the kind that only hurts because we lack a healthy internal defense–should not be banned or censored. The challenge lies in discerning which sort you’re dealing with.

Conclusion

So let me offer a suggestion. Cultivate a healthy epistemic immune system. This solves much of the problem. You can do this several ways.

  1. Take a course on logic or critical thinking. Great on-line resources abound as well. For starters, try here and here. If you know of a good resource, share it in the comments.
  2. Spend some time with someone who can mentor you on these skills. Find a philosopher, lawyer, or someone else who gets paid to argue, take them out to lunch and pick their brain.
  3. Lower your shield of anger and moral outrage. A shield helps in certain cases, but overuse will only impede your mental maturation. Just like a healthy physical immune system, you need exposure to “germs” over time to develop your “antibodies.” Learn to stand your ground and respond respectfully and intelligently. Read before you dismiss.
  4. Finally, process new ideas more slowly. Unless you’re dealing with the first kind of hurtful speech, take time to digest and consider what is being said. Then you’ll be in a better position to either accept it or thoughtfully respond.

You Too!

[Image: the band U2–not related to logical fallacies]

Since I know very little about political issues and immigration, I tend to stay out of debates. But what I do know is good debate. So, I won’t often weigh in on one side, but I will comment on the quality of the arguments. In the recent brouhaha over separating children from parents at the border, people used whatever tactics they could to “win the argument.” But there was quite a bit of “tu quoque” (Latin for “you too”) going on. Using this tactic doesn’t get us any closer to knowing what’s true or right.

“You too” happens when side A says that there is something really bad about the policy of side B, and side B responds by saying, “Well, you’re just as bad!” Typically, side B resorts to this tactic because they know their policy is really bad. They don’t want to defend it. So, instead, they shift the focus off of whether the policy is bad and put it on something side A has done which is just as bad. This moves side B onto friendlier ground–an argument they can win. But the original argument about the policy of B is left unresolved. Even worse, resolution is now impossible because side A and side B aren’t even arguing about the same thing anymore. (See this post for other kinds of bad logic.)

Children At the Border

For example, side A argued that the policy (held by side B) of separating children from their parents at the border is really, really bad. But instead of discussing the merits of the policy, side B accused side A of being hypocrites because side A originated the policy years ago! “You’re just as bad as us!” But this, while perhaps true, misses the point completely. The original question is, “Should we continue separating children from their parents?” not, “Who is to blame for this bad policy?” What side B should have done, right from the start, is either defend the policy or admit that the policy is bad and change it, rather than try to return fire. This would be good, helpful conversation and debate. (Also, some people on side B did defend the policy by saying, “Well, it’s the law!” But this amounts to arguing that “this policy is the right policy because it is the current policy.” Try telling that to MLK!)

Please note that my point here is not about who was right. My point is about how one side argued badly. Who was correct is a completely different issue.

But I’m not letting side A off the hook that easily! When side B won a recent Supreme Court decision that made it (legally) permissible for a cake designer to refuse to make a cake for a gay wedding, side A was outraged. But then a new story came out: business owners on side A refused service to people who worked for the Trump administration. This unleashed a “You too!” tornado on social media. Both sides started lobbing “you too” grenades at the other. Instead of debating whether it is right to refuse service, both sides said, “Your side did the same thing!” This simply avoids the actual issue.

It’s always a good idea to stop and think about the tactics you’re using to “win” a debate. Some tactics help us discover what is good, true and beautiful. Others only serve to distract, shut down, or silence our opponents.

An Atheist, an Agnostic, and A Theist Walk Into A Bar

(That literally happened to me one time.) Ok, this joke still needs writing, and that’s not my thing. But I do want to try and tease out a related conversational knot that’s been giving me trouble. In short, the knot involves the answers to the following questions:

  • What does it mean to be an atheist?
  • What does it mean to be a theist?
  • What does it mean to be an agnostic?

Why does this matter? Because labels matter to us. If someone called me a “feminist,” my reaction might depend on what they mean by the term. If it just means “someone who advocates for the complete social, economic, and political equality of the sexes,” then I’m happy to carry the label. But if they use it pejoratively to mean “someone who hates men and wants women to take over the world,” then I’m going to have a problem with that. So which is the correct definition of feminist?

Similarly, if someone calls you an atheist, what exactly does that mean? You might accept the term when defined a certain way, but not when defined in another way. Also, if I say something like, “atheism is irrational,” the reasonableness (or truth) of that claim depends on the definition being used. If it means, “someone who knows with certainty that no gods exist,” then few people will accept the label, and rightly so.

What I want to do here is compare two approaches to defining these terms, and explain why I recommend one over the other. I’ll start with what I call the “four quadrants” model.

The Four Quadrants Model

Some people propose we renovate these terms (atheist, agnostic, theist) a bit to make things clearer and avoid foisting burdensome views upon others. Here is the renovation proposal:

[Diagram: a four-quadrant grid–(a)theism on one axis, (a)gnosticism on the other–yielding gnostic theist, agnostic theist, gnostic atheist, and agnostic atheist]

There is a certain elegance and symmetry to this model. You have ‘theism’ and ‘a-theism’ juxtaposed with ‘gnostic’ and ‘a-gnostic.’ Very nice.

This “quadrant model” carries other advantages as well. First, it takes some pressure off of atheists who don’t want to claim that they “know” there are no gods. Second, it also takes pressure off of theists in exactly the same way. Third, it uses the term ‘agnostic’ in a way more true to the original meaning of the Greek word. In ancient Greek, ‘gnosis’ means ‘knowledge’ and the prefix ‘a’ means ‘without’ or negation. So, to say that I am “agnostic” literally means “I don’t know.”
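For concreteness, here is the quadrant model reduced to a minimal Python sketch–just the taxonomy the diagram above depicts, with the standard labels (the function and its parameters are my own illustrative names, not anyone’s official formulation):

```python
def quadrant(believes_god_exists: bool, claims_to_know: bool) -> str:
    """The four-quadrant model: belief and knowledge-claims
    treated as two independent yes/no dimensions."""
    knowledge = "gnostic" if claims_to_know else "agnostic"
    belief = "theist" if believes_god_exists else "atheist"
    return f"{knowledge} {belief}"

print(quadrant(True, True))    # gnostic theist
print(quadrant(True, False))   # agnostic theist
print(quadrant(False, False))  # agnostic atheist
print(quadrant(False, True))   # gnostic atheist
```

Notice that the model forces every person into exactly one of these four boxes–a feature that, as we’ll see below, turns out to be a bug.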

Cons of the Quadrant Model

Unfortunately, there are some drawbacks to this model. For one, it uses ‘gnostic’ and ‘agnostic’ unconventionally. While I do love etymologies (word origins), most people find them rather pedantic. The simple truth about language is that meanings change over time. A word gets its meaning from usage. If we were arguing about what technical term philosophers should use, that might be a different story. But if we want an ordinary term for common usage, it is simpler to use the word conventionally. So, when it comes to ‘agnostic,’ most people understand this to describe someone who is undecided about God’s existence. And when it comes to ‘gnostic,’ this term refers to followers of a religion that revolves around the possession of esoteric, mystical knowledge (Gnosticism). For Christians, this is especially important, because ‘gnostic’ carries a heavy negative connotation, and has for nearly two thousand years. To be a “gnostic theist” is to be a heretic, for most Christians. So, shifting meanings in this way muddies the waters.

Secondly, this model suggests that knowledge and belief are like height and width–two separate dimensions of thought. But belief and knowledge relate more like acceleration and force. Belief is a component of knowledge, just as acceleration is a component of force (F = ma). So it is misleading to represent them on two perpendicular axes.

This quadrant model also fails to provide a safe conceptual space for the truly undecided. True undecidedness is a real position on many important questions, including scientific ones. In numerous cases (e.g., the multiverse), the most rational thing to say is “I neither believe there is an X, nor do I believe there is not an X.” But the diagram above tells me that I must pick a quadrant.  I must either believe (the space below the ‘x’ axis) or not believe (the space above the ‘x’ axis). If we revised it to allow people to be on the axis, perhaps right at (0,0), then what do we call them? There is no in-between because the model is binary in principle.

Finally, the fellow in the upper-left quadrant confuses a few things. He wants to say “I don’t believe any god exists,” but he also wants to say “I’m not CLAIMING that–I might be wrong.” I understand the discomfort here. He doesn’t want to make a strong claim, because that would require a strong defense, which is a burden he doesn’t want. Fair enough. But he isn’t like a helicopter that has yet to land. He has landed, even if tentatively. He thinks there are no gods. He’s not saying he is certain, or that he can prove anything; he is simply describing where he has landed. And he admits that he may have landed on the wrong spot. That’s fine. But even a tentative landing represents a claim about what you think is true about the universe. I’ll say a bit more below about the difference between “having no belief about p” and “not believing p.”

The Sliding-Scale Model

Instead of a binary-based model, and the restrictions it entails, I prefer a sliding-scale approach. This non-binary model allows for a wide range of possibilities, grouped into three natural categories. Rather than being forced to choose from only four possible positions, people can personalize their position based on their beliefs and confidence level.

I didn’t have a cute graphic for mine, so I made this:

[Image: the sliding scale of belief, from 0% to 100% confidence]

On this view, you can be anywhere between 0-100% confidence about a certain idea or claim. (“p” refers to any claim, or proposition, like “God exists.”) If you find yourself hovering around the 50% mark, we’ll say you neither believe it nor disbelieve it. This is where we should fall on claims like “this fair coin will land on heads when flipped.” Sometimes we say things, loosely, like “I don’t know.” But this conversationally implies that we simply don’t have a belief one way or the other.

If you land roughly between 65-100% confident that p, then you clearly believe it is true. At 100% confidence, you have no doubts and think there is no chance that p is false. (Notice that we’re saying nothing about knowledge here. This is only about beliefs, just to keep things simple and clear.) If you fall anywhere in between 0-35%, you think that p is false, though the closer you get to 50%, the more you lean toward thinking there’s some chance it could be true. For example, suppose I’m looking over a balcony, wondering if I could jump safely into the pool. I give myself about a 15% chance of plunging safely into the water. So, if you ask me, “Do you believe you can make it?” I’d say, “no.” If you have 0% confidence that p, then you have no doubt it is false–you disbelieve it with maximum confidence of its falsehood.

The Sliding-Scale & God

Now, if we apply this to our debate about definitions, here’s how I think it works in terms of belief about God. If you have ~65% or more confidence that God exists, then you believe that God exists and we call you a theist.  (I think most of us agree with that.) But theists, like atheists, can possess little or much confidence. If you are ~35% or less confident that God exists, then you disbelieve that God exists and we call you an atheist. (Nothing about knowledge here!)

Both segments (red and green) of the scale represent a “belief state,” two sides of the same coin.

But that’s ok because merely believing or disbelieving that p doesn’t saddle you with an undue burden. I call them both belief states because disbelieving that p is roughly synonymous with believing that p is false. I.e., “I disbelieve that God exists” is the same as “I believe there is no God.” It’s like someone saying, “I don’t think the coin will be heads”–you wouldn’t need to ask whether they believe it will be tails. It’s just a belief! No big deal. Whether it is rational or whether you know is a different ball game and will require more justification. But the atheist need not attain certainty or prove there is no God in order to be a rational atheist.
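If it helps to see the sliding-scale model stated mechanically, here is a minimal sketch in Python. The ~35%/~65% cutoffs come from the rough numbers above; the function name and the exact thresholds are illustrative, not part of the model itself.

```python
def belief_state(confidence):
    """Map confidence (0.0-1.0) that p is true to a belief state,
    using the rough ~35%/~65% cutoffs sketched above."""
    if confidence >= 0.65:
        return "believes p"       # a theist, if p = "God exists"
    if confidence <= 0.35:
        return "disbelieves p"    # an atheist, if p = "God exists"
    return "undecided about p"    # an agnostic, in the modern sense

for c in (0.15, 0.50, 0.90):
    print(f"{c:.0%} confident -> {belief_state(c)}")
# 15% confident -> disbelieves p
# 50% confident -> undecided about p
# 90% confident -> believes p
```

Notice that the output is one of three states along a single dimension–there is no separate “knowledge” axis, which is exactly the point of the model.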

Questions & Concerns

Some atheists prefer the quadrant model because they are more comfortable saying “I don’t have a belief about God–I lack belief in God.” But saying you lack a belief about God’s existence is not accurate. Atheists lack an affirmation of God’s existence, but they have a belief state (doxastic attitude), and that belief state is disbelief. They take the claim “God exists” to be false. If you don’t take it to be false, then you are either undecided or a theist. The only people who truly lack a belief about God are those who have never considered God’s existence, like my dog Duke or my friend’s baby. They just have no belief state about God whatsoever.

What about agnostics? Now, I admit that the term ‘agnostic’ as a label for the undecided is somewhat regrettable, given the literal Greek meaning. Coined by Thomas Huxley in the late 19th century, the term served to contrast his position against those who felt they had attained “gnosis” or knowledge of answers to the big questions. Huxley used the term to express either skepticism or humility or both. But regardless of Huxley’s intentions, the term now refers to someone who is undecided on a matter, religious or otherwise. For now, it works. Launch a campaign to shift the usage if you dislike it, but it isn’t quite right to tell people that they’re using it wrong now.

To avoid mixing up atheism and agnosticism, note that the claim “I don’t believe that any gods exist” (as in the four quadrant graphic above) can mean several different things. Consider the claim E: “the number of stars in the universe is even.” If I say that “I don’t believe E,” that could mean: (1) I think E is false, which implies that I hold the odd-number-stars view; or (2) I don’t believe E, but I don’t think it’s false either. I’m just undecided, or agnostic on the matter. So when you want to express atheism and NOT agnosticism, it is better to say something like, “I believe there are no gods,” or more simply “I disbelieve theism.”

If you discover that the belief-state you are in is difficult to defend, welcome to the club! Each position has its unique challenges and weaknesses. There’s no problem with redefining your position in order to make it more defensible, as long as the changes are not ad hoc and the new definition is coherent and unconfusing.

The Rationality of a Flu Shot

I don’t like shots; in fact, I avoid them. Ironically, I visited my doctor yesterday and left with a band-aid on my arm. I didn’t plan to get a flu shot–in fact, I’ve never had one and never wanted one–but he talked me into it. I thought the whole dialectic was interesting, so I’ll share it with you. I think it illustrates some valuable principles of rationality and good belief formation. (The doctor actually said some of these things, and some of them I said to myself during the conversation.)

The Conversation

“Have you considered getting a flu shot?”
“No, not really. I never get them.”
“Would you be open to the idea?”
“Isn’t the flu a whole range of viruses rather than only one virus?”
“Yes.”
“But aren’t flu vaccines just aimed at one strain of the flu? That means that it protects you (imperfectly) against one strain out of many, which doesn’t seem very helpful. It would be like having an air bag that only inflates when I hit a red car.”
“Actually, the vaccine is aimed at multiple flu viruses, based on the most common ones from last year.”
“Ok, that’s good to know. But still, I hardly ever get sick or get the flu.”
“Well, even if you have a very low risk of getting the flu, the shot will lower the risk even more.” (The CDC website says that, “flu vaccination reduces the risk of flu illness by between 40% and 60% among the overall population.”)
“Yeah, that seems right. But I’m still not sure it lowers the risk enough to make it worth it.”
“What’s the downside of getting one, especially if it’s free?”
“I don’t like shots. Yeah, that’s not a great argument, I suppose.”
“Consider this: Lowering your own risk also benefits family health and public health. If your chance is lower, that lowers the risk of your kids getting sick or anyone around you getting sick, like in your church. That’s just good for everyone in Columbia.”
“Ok, I’m starting to realize that I don’t have any good reason, or enough good reasons to justify not getting a shot.” (He did address the concerns many people have about the vaccine causing various side-effects or illness, though I wasn’t worried about it. The chances are negligible. He also explained that the vaccine they use is protein-based, which means it doesn’t contain the actual virus, so it can’t give you the flu.)

So, next thing I know, the nurse comes in with the syringe. I tried to relax and remember that this is a very fleeting pain. Happily, the nurse was quite skilled and I hardly felt it. The arm is a bit sore today, but that’s the only negative effect.
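To make the doctor’s risk argument concrete, here’s a toy calculation. The 50% figure is the midpoint of the CDC’s 40-60% estimate quoted above; the 10% baseline is a number I made up purely for illustration.

```python
baseline_risk = 0.10       # hypothetical chance of catching the flu (made up)
relative_reduction = 0.50  # midpoint of the CDC's 40-60% estimate

vaccinated_risk = baseline_risk * (1 - relative_reduction)
print(f"Unvaccinated risk: {baseline_risk:.1%}")    # 10.0%
print(f"Vaccinated risk:   {vaccinated_risk:.1%}")  # 5.0%
```

Even for someone with a low baseline risk, the shot roughly halves it–which was exactly the doctor’s point.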

The Takeaway 

What’s the takeaway here? 1) Be open to dialogue. You might learn something. You also might discover that your reasons, once they are out on the table, turn out not to be very strong. 2) Irrational fears shouldn’t guide our actions. The fear of a shot, for us needlephobes, is generally way over-blown and not realistic. I.e., it isn’t as bad as you think. 3) Public health may not have occurred to you as a relevant factor, but it should. It isn’t just about *you*.

For the Flu Shot Skeptics

Now, I know people worry about certain dangers of vaccines or flu shots. But I researched it a little (perhaps inadequately), and I couldn’t find any documented sources citing scientific evidence about the dangers of today’s flu shots. Flu shots have been modified over the years to eliminate anything that was discovered to be harmful.

“But what about the dangers we have yet to discover?” True, we must always admit the possibility that we’ll discover a dangerous chemical  later, after the damage is done. But it simply isn’t reasonable or practical to live your life dodging mere possible dangers. There would be no way to avoid everything that could harm you. We should try to avoid probable harms — things that we have good evidence for. That’s the only feasible way to live. Right now, the research says that flu shots are safe. Also, if you avoid flu shots based on a few bad stories you’ve heard, you’re probably falling prey to the availability bias (I might be doing this as well) or the fallacy of probability neglect.

“But given that we’ve repeatedly found new dangers in some medicines and treatments, shouldn’t we expect that there are lots of undiscovered dangers lurking in these drugs?” That’s an inductive argument, and I think it’s weak. Here’s why: medicine isn’t progressing slowly, like repeatedly adding 1 to a number and watching it grow. It progresses more like multiplying. So not only do we detect and solve new problems every year, but our methods for detecting, solving, and preventing problems get better every year, multiplying the effectiveness of medicine. That’s my perception, but I could be wrong.

Feedback

Persuaded? Let me know what you think. I’m open to hearing the arguments on the other side, provided you have documented evidence from reliable sources.

Got Thinking Skills?

It’s hard to say whether the internet contains more good resources than bad, but anytime we highlight a good resource, we help tip the balance a little. So here’s a cat video. Just kidding! This website, “Critical Thinking Web,” stands out among many similar sites for its ease of use, rigor, and interactive design. If you want to explore the world of logic and analytical thinking (and more!), this is a wonderful playground. I’ve used it as a supplement in my philosophy and logic courses.

Founded in 2004 by Dr. Joe Lau of the University of Hong Kong, the site offers free tutorials on logic, fallacies, critical thinking, and more.

Bad Thinking, Part 3: The SI Jinx

[Image: Missouri’s Chase Daniel–not Pete Rose]

Pete Rose, infamous Cincinnati Reds baseball player, appeared on the cover of Sports Illustrated in August of 1978, in the midst of a 44-game hitting streak. That same week, his streak ended. Numerous other examples over the years foster the belief that players or teams who achieve SI cover-status will experience the “SI Jinx” soon thereafter. A pair of local favorites: the University of Kansas football program appeared on the November 2007 cover after an 11-0 start, and lost the following week to rival Missouri; Missouri then graced the cover in December 2007 after reaching their first #1 ranking, and lost the following week to Oklahoma. The SI Jinx strikes again!

Coincidence or curse?

To this day, many athletes shun appearing on the cover of Sports Illustrated. In January 2002, Kurt Warner declined to pose for the cover, so the magazine ran a photo of a black cat instead. The headline: “The Cover that No One Would Pose For.” Are their fears well-founded? If it isn’t a curse, then what explains the bizarre coincidence?

Thankfully, Daniel Kahneman provides enlightenment. In Thinking, Fast and Slow, Kahneman describes a statistical phenomenon called “regression to the mean.” (Ch. 17) According to Wikipedia,

Regression to the mean is the phenomenon that if a variable is extreme on its first measurement, it will tend to be closer to the average on its second measurement—and if it is extreme on its second measurement, it will tend to have been closer to the average on its first.

In other words, if an athlete performs at a remarkably high level one week or one season, the following week or season is very likely to be worse, and vice versa. I imagine that if SI started featuring especially low-performing athletes on their magazine cover, we would soon discover an SI Cover Miracle!
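You don’t even have to take Kahneman’s word for it; regression to the mean falls out of a toy simulation. Here’s a minimal sketch (all the numbers are invented for illustration): each “athlete” has a fixed skill, each week’s performance is skill plus fresh random luck, and we check what happens the week after a cover-worthy performance.

```python
import random

random.seed(42)
N = 100_000

# Fixed skill per athlete; performance = skill + fresh luck each week.
skills = [random.gauss(0, 1) for _ in range(N)]
week1 = [s + random.gauss(0, 1) for s in skills]
week2 = [s + random.gauss(0, 1) for s in skills]

# "Cover-worthy" = roughly the top 1% of week-1 performances.
cutoff = sorted(week1)[N - N // 100]
cover_athletes = [i for i in range(N) if week1[i] >= cutoff]

worse = sum(1 for i in cover_athletes if week2[i] < week1[i])
print(f"{worse / len(cover_athletes):.0%} of cover athletes did worse next week")
# Typically around 90% decline -- no jinx required.
```

A cover-worthy week is almost always a lucky week, and luck doesn’t carry over–so the next week looks like a “jinx” even though nothing causal is going on.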

Getting Lucky

One reason for our error in judgment here: we fail to account for luck. In many endeavors, luck plays a huge role, including sports, academic testing, and business success. Our performance in these areas tends to follow a curve, with frequent average performances (relative to personal skill), and few examples of either really awful or amazing performances. Luck (or the lack of it) is usually what accounts for the “outlier” performances at the edges of the curve. But we attribute them to skill or other factors that aren’t actually the cause.

Kahneman relates an interesting anecdote about a flight instructor who claimed that praise for good performances was detrimental, but intense criticism for bad performances was helpful. Why? Because when he yelled at a pilot for an especially poor flight, the pilot performed better the next time out. And when he praised a pilot for “clean execution,” the pilot got worse. The instructor failed to realize that this was statistically predictable and probably attributable to the pilot’s luck. A classic example of regression to the mean.

The Upshot

In my daily life, identifying regression to the mean can help me avoid emotional whiplash. I know that an amazing day is likely to be followed by an average day, so I’m not as disappointed when this occurs. Similarly, a really horrendous day will probably be succeeded by a better day, so there’s hope! Substitute whatever professional metrics you like for “day,” and you can apply the same truth in your life: sales figures, enrollment, attendance, stock performance, child behavior, or team wins.

I also remember to include luck, or perhaps unpredictable Divine intervention, in my evaluation of performance. This means that my absolute best and worst performances are probably not solely attributable to my skill. I should look at my average as a better gauge for evaluation, rather than taking the “outlier” as the norm.

Finally, we can do away with belief in jinxes. Even if you could show a high correlation between some odd event and bad performance, this would not prove causation. Interestingly, while 37% of SI cover stars were “jinxed,” 58% maintained or improved their performance following their cover appearance, according to a 1984 study. The jinx myth endures because of yet another kind of “bad thinking”: the negativity bias! We tend to remember negative events and give them more weight in our reasoning.

I still plan to give away a copy of Kahneman’s book to a lucky subscriber! Sign up for Ground Belief updates with your email for a very high chance to win (I only have 2 subscribers as of yesterday).