Do Motives Cloud Judgment?

Can our motives cloud our judgment? Yes. Without a doubt. (See this post and this post.) But does this mean we should always suspect our judgments and the judgments of others? That seems unreasonable. When I say that motives or psychological states can “cloud our judgment,” what I mean is (roughly) this–if we want something to be true, we tend to see the reasons for that view more favorably, and when we don’t want something to be true, we tend to see the reasons for that view less favorably. “More/less favorably” just means that the reasons appear to have more/less force to us than they would to someone with similar intellectual abilities and no desire either way (no horse in the race).

For example, some early scientists believed in “preformationism,” which is the view that a tiny embryo exists in every sperm cell. So, when these scientists looked through primitive microscopes, they were inclined to see the outline of such an embryo in sperm cells. Others who did not hold this view did not see the embryos. Even the most ardent truth-seekers sometimes allow their biases and desires to affect their perception and judgment.

But to leap into the swamp of skepticism is a mistake. Here’s a common line of reasoning I observe.

  1. Psychological states, such as desires, often cloud human reasoning.
  2. Peter is expressing reasons for a view that he desires to be true.
  3. Therefore, I should mistrust Peter’s reasoning.

The most common example of this is when a religious skeptic dismisses the reasons presented by a Christian for her belief (which she wants to be true). Almost as common: a Christian assumes that the skeptic is only a skeptic (thus dismissing his arguments) because he doesn’t want there to be a God! Call this the “bad motives” attack. Several things strike me as wrong-headed about this kind of thinking.

Problems with the “Bad Motives” Attack

First, the reasoning presented by a person for their belief must stand or fall on its own merits. The motivations, desires, fears, etc. of that person are completely irrelevant when asking, “Is the reasoning they present any good?” (i.e., is the argument valid). To critique or question a person’s motives instead of critiquing their actual argument is evasion. We resort to this red-herring tactic only when we lack the intellectual skills to logically evaluate the argument being presented. (I should also add that you can admire the logic of an argument without agreeing with it! Being wrong is not the same as being irrational. Several very rational theories exist to explain the extinction of the dinosaurs, but most of them are wrong!)

Second, this view is a two-edged sword. If all judgment is suspect because of hidden psychological interference, then the critic must turn this spotlight on her own reasoning as well. Could it be that (speaking as the critic) my own skepticism about Peter’s reasoning (in the example above) is actually the flawed product of my own motives–I don’t want him to be right! We should doubt the skeptic’s reasoning on exactly the same grounds that the skeptic doubts ours.

Third, wanting something to be true does not automatically cripple our judgment and reasoning. In fact, I don’t think anyone really believes it does. I know this because we apply this critique inconsistently. We pick and choose when to apply the “bad motives” attack, typically applying it to arguments for views we personally don’t like. And certainly we shouldn’t refrain from arguing in favor of things we care deeply about. For instance, I care deeply about the evils of human trafficking. Does this mean I am disqualified from making judgments or arguments against human trafficking? That seems absurd. Let me make my arguments, and then evaluate their soundness on their own merit! This is one reason why good academic journals and conferences don’t want the author’s name on a paper submission. The author’s motives and desires should be irrelevant in evaluating the quality of the arguments presented. 

Last Words

True, there is such a thing as confirmation bias. Our wishful thinking can mislead our reasoning at times if we are not vigilant. But hyper-skepticism about everyone’s beliefs and reasoning is unjustified. So, I want to discourage you from using this “bad motives” attack as an easy response to arguments you don’t like. Deconstructing everyone’s judgment this way, including your own critiques, leads us to a dead end.

*I’m indebted to Josh Rasmussen for his insightful comments on his own recent Facebook post.

When Speech Feels Like Violence

Speech sometimes offends, even injures, our sensibilities. Alex Jones and the decisions of Apple and Facebook to remove his content illustrate this. But there are at least two ways speech can “hurt” us. Some hurtful speech stabs to the core of our self and our sense of dignity as a human being. Other times, speech threatens us because our inadequate cognitive defenses and filters fail to protect our psyche. I want to address the second kind of scenario because it is more “up to us” than the first kind.

Epistemic Immune System

My father endured numerous chemotherapy treatments during his battle with cancer in 2002. I distinctly recall one time when his immune system was so severely compromised by the chemo that we had to wear face masks just to come into his hospital room. And if anyone was sick–forget it! A common cold could kill him. If someone walked into the room without a mask, a nurse would immediately escort them out with a stern reprimand. Ordinary germs–ones that any healthy immune system would handle easily–constituted a threat.

Something similar goes on with our beliefs. You could say we have another immune system–an epistemic immune system. Instead of protecting us against bacteria and viruses that threaten our body, the epistemic immune system protects our “worldview” (our system of beliefs about reality) against false ideas and bad logic. When our epistemic immune system is healthy, it identifies bad ideas and bad reasoning and escorts them to the mental trash bin. It also identifies good ideas and sound reasoning and allows them through unharmed, where they find eventual integration with our worldview. If our epistemic immune system functions well, we feel more secure and less fearful because we know our beliefs will remain healthy despite our exposure to bad ideas.

We need a healthy epistemic immune system because bad ideas really can harm us. If bad ideas gain “admission” into our belief structure, they can start to cause problems. They can cause psychological anguish or pain. They can result in actions that harm us or others. They can conflict with other (good) beliefs, or erode the foundations of our worldview. We sometimes feel this in the form of cognitive dissonance or instability. Like a man on a boat for the first time in choppy seas, we wobble around, out of balance and extremely uncomfortable. We sense that any small push might send us tumbling, our worldview crashing like a Jenga tower. Every disagreement feels like a threat, like spoken violence.

Inside and Out

Think of your worldview as a city with two lines of defense: outside the gate and inside the gate. You control what you are exposed to “outside” the gate by choosing what to read, watch, listen to, etc. But once you have seen or heard an idea, it’s through the gate and your internal mental defenses (epistemic immune system) have to do their job. It is very, very difficult to completely control what gets through your gate. It’s like movie spoilers–if you’re using social media, it’s really hard not to find out that everyone dies in Infinity War. (See!?!?) Ideas zip through the gate of your eyes and ears so fast! This is why we need a healthy epistemic immune system on the inside.

Now here’s the real crux of the matter. When our internal defenses are weak, we are too easily thrown off balance by disagreement and contrary views. Fear and insecurity rule us. So here’s what we do: we try to shut the gate. Or we at least build a barricade in front of it to block new ideas out. How do we do this? I’ve observed (even in myself) two main strategies. First, we avoid exposure to new ideas–we become epistemic hypochondriacs. We shun (or censor) books, websites, and people who disagree with us. Second, we use anger or outrage as a shield. Instead of looking carefully at the idea presented and constructing a reasonable response, we try to intimidate the other party into silence with loud, abusive speech.

Now before you write a nasty email or comment, let me clarify something. Remember I mentioned two ways that speech can hurt. When you’re dealing with the first sort (see paragraph 1), epistemic defenses won’t help much. This sort of deeply abusive speech that penetrates to our core does not require careful analysis and logical counterargument. It’s what the Supreme Court referred to as “fighting words.” But the other sort of “hurtful” speech–the kind that only hurts because we lack a healthy internal defense–should not be banned or censored. The challenge lies in discerning which sort you’re dealing with.

Conclusion

So let me offer a suggestion. Cultivate a healthy epistemic immune system. This solves much of the problem. You can do this several ways.

  1. Take a course on logic or critical thinking. Great on-line resources abound as well. For starters, try here and here. If you know of a good resource, share it in the comments.
  2. Spend some time with someone who can mentor you on these skills. Find a philosopher, lawyer, or someone else who gets paid to argue, take them out to lunch and pick their brain.
  3. Lower your shield of anger and moral outrage. A shield helps in certain cases, but overuse will only impede your mental maturation. Just like a healthy physical immune system, you need exposure to “germs” over time to develop your “antibodies.” Learn to stand your ground and respond respectfully and intelligently. Read before you dismiss.
  4. Finally, process new ideas more slowly. Unless you’re dealing with the first kind of hurtful speech, take time to digest and consider what is being said. Then you’ll be in a better position to either accept it or thoughtfully respond.

You Too!

[Image: the band U2, not related to logical fallacies]

Since I know very little about political issues and immigration, I tend to stay out of debates. But what I do know is good debate. So, I won’t often weigh in on one side, but I will comment on the quality of the arguments. In the recent brouhaha over separating children from parents at the border, people used whatever tactics they could to “win the argument.” But there was quite a bit of “tu quoque” (Latin for “you too”) going on. Using this tactic doesn’t get us any closer to knowing what’s true or right.

“You too” happens when side A says that there is something really bad about the policy of side B, and side B responds by saying, “Well, you’re just as bad!” Typically, side B resorts to this tactic because they know their policy is really bad. They don’t want to defend it. So, instead, they shift the focus off of whether the policy is bad and put it on something side A has done which is just as bad. This puts side B on better footing, arguing about something they can win. But the original argument about the policy of B is left unresolved. Even worse, resolution is now impossible because side A and side B aren’t even arguing about the same thing anymore. (See this post for other kinds of bad logic.)

Children At the Border

For example, side A argued that the policy (held by side B) of separating children from their parents at the border is really, really bad. But instead of discussing the merits of the policy, side B accuses side A of being hypocrites because side A originated the policy years ago! “You’re just as bad as us!” But this, while perhaps true, misses the point completely. The original question is, “Should we continue separating children from their parents?” not, “Who is to blame for this bad policy?” What side B should have done, right from the start, is either defend the policy or admit that the policy is bad and change it, rather than try to return fire. This would be good, helpful conversation and debate. (Also, some people on side B did defend the policy by saying, “Well, it’s the law!” But this amounts to arguing that “this policy is the right policy because it is the current policy.” Try telling that to MLK!)

Please note that my point here is not about who was right. My point is about how one side argued badly. Who was correct is a completely different issue.

But I’m not letting side A off the hook that easily! When side B won a recent Supreme Court decision that made it (legally) permissible for a cake designer to refuse to make a cake for a gay wedding, side A was outraged. But then a new story came out: business owners on side A refused service to people who worked for the Trump administration. This unleashed a “You too!” tornado on social media. Both sides started lobbing “you too” grenades at the other. Instead of debating whether it is right to refuse service, both sides said, “Your side did the same thing!” This simply avoids the actual issue.

It’s always a good idea to stop and think about the tactics you’re using to “win” a debate. Some tactics help us discover what is good, true and beautiful. Others only serve to distract, shut down, or silence our opponents.

Rushing Into Fake News

Just had to post this story as a classic example of bad believing. (AKA, bad epistemology.) Football coach Mike Leach allows himself to be suckered in by “fake news,” and compounds the error by broadcasting it to thousands in his Twitter feed. Ironically, Leach has a brilliant offensive mind when it comes to football. He probably uses “fakes” (deception) in his plays all the time, and expects the defense to fall for them. Let this serve to encourage you to always proceed with extreme caution when processing things on the internet. Especially conspiracy theories.

Should I Change What I Believe?

In the summer of 2017, I visited the University of Oxford and walked the flower-covered grounds of Magdalen (oddly pronounced “Maudlin”) College. I imagined myself retracing the steps of C. S. Lewis as he first wrestled with the idea of faith in God. He describes his conversion this way:

“You must picture me alone in that room in Magdalen, night after night, feeling, whenever my mind lifted even for a second from my work, the steady, unrelenting approach of Him whom I so earnestly desired not to meet. That which I greatly feared had at last come upon me. In the Trinity Term of 1929 I gave in, and admitted that God was God, and knelt and prayed: perhaps, that night, the most dejected and reluctant convert in all England.” (emphasis mine)

Lewis experienced a process that everyone goes through at one time or another. We start with a belief, we encounter something that unsettles that belief, and then we either find a way to retain our belief, or we change to a different belief. Personally, I don’t think we can directly control what we believe, but we can often indirectly influence the process. But how do we decide what to do when we feel that unsettling?

Winds of Change

Only in the last few years have I come to appreciate the expression “winds of change.” When there is a change in air pressure in one place, you feel that change in the form of air moving quickly in or out of your location. That moving air brings a change in weather. (Apologies to any meteorologists out there for my crude description.) Sometimes we feel the “winds of change” in our mental life. Something is unsettled and moving. We encounter new evidence (either in the form of an experience or a set of reasons presented to us) against our view of something and our belief becomes unstable.

The question is, what should we do when we feel that unsettledness? It seems there are several possibilities:

  1. Do nothing. I can take a passive stance and just let the winds of belief blow me wherever they will. Change? Sure! Anytime, any belief.
  2. Stick my head in the sand. I can ignore the new evidence and distract myself from thinking about it further, until I can hopefully just forget about it. Then I will avoid anything that reminds me of that evidence.
  3. Investigate. I can check out the new evidence and test its quality or seek corroboration. I can also seek counter-evidence (reasons to doubt the new evidence) and additional evidence for the position I currently hold. Once this is done, I can move toward a new position or affirm my current one.

Something about the first approach appeals to us. It takes no effort, for one. It also sounds so flexible and open-minded. But while flexibility and open-mindedness can be virtues, you can have too much of a good thing. One danger with this approach is that some evidence is misleading evidence.

Ever read a good murder mystery where someone tries to frame another person for the murder? Jenna plants a bloody knife in Jake’s house, she transfers money into his bank account, etc. The average person sees these “clues” and believes Jake must be guilty. But a good detective doesn’t form conclusions quite so quickly or easily. They hold out, they investigate and test the evidence. This sort of reactionary believing happens on social media all too often. We swallow “fake news” or posts that turn out to be hoaxes or just mistakes. So, it seems better to form beliefs more carefully, but without losing flexibility and open-mindedness.

Believe it or not, approach #2 also has a virtue. If all your current beliefs are true, then the head-in-sand technique can help you avoid ever forming a false belief! But it will be at the cost of ever learning any new truths. And besides, I know that my current stock of beliefs isn’t perfect. #2 isn’t as safe as it seems.

Investigation

Of the three options, #3 provides the best way to ensure you are moving toward the truth, or at least toward the most reasonable belief. If you care deeply about having true and reasonable beliefs, then it is wise to invest some time in investigation when you experience an “unsettling” in your worldview.

My senior year in college, I flew west for a summer mission initiative in San Francisco. My roommate in the dorms, Jasper, didn’t at all fit into the box of what I thought an evangelical college student should be. He had long, crazy hair, dressed in a sort of “grunge” style, and (gasp) listened to secular music! So I thought I was more “mature” than Jasper in my Christian faith, since I wasn’t as “worldly.”

By the end of the summer, however, it became clear that not only was Jasper more mature in his faith than I was, but he emerged as the spiritual leader of the entire mission. Over those 6 weeks, I watched Jasper carefully, and I saw enough evidence in his life to “shift” my belief about secular music. When I returned home from California, I had changed my view and finally felt the freedom to re-embrace my favorite band, U2. (I know how ironic that sounds, given that U2 are very Christian in their message.)

I’ve also experienced times where my view has been challenged, and after investigation, I’ve held my ground. I’ve even moved to a position of “I don’t know” on a few topics. It’s not about which position you take, it’s about responsibility. I want to be responsible with my mind and my beliefs, the same way I try to be careful what I eat and assimilate into my body. (I wrote about another time I changed beliefs here.)

Flexible, but Discerning

I actually went back to live in California for a few years, about a decade after that summer mission. Loved it. Except for the earthquakes. When you feel that rumbling, and your picture frames start rattling off the shelves, it’s quite unsettling.

Sometimes we feel that rumble in our worldview when we have new experiences and talk to people with different perspectives. But we don’t have to respond in panic and fear. Quality buildings are strong, but also flexible, to better withstand quakes. We need that, too. Stay flexible and ready to adjust as needed when the quake comes. We can stop and decide to take some time to investigate. “It is the mark of a mature mind,” Aristotle says, “to be able to entertain an idea without accepting it.” Hold that idea (and the evidence) in your hand and give it a good hard look. Then you can rationally, responsibly discern whether to toss it, table it, or move toward it.

Is Faith Irrational?

I came across this wonderful post by Liz Jackson, a Notre Dame PhD candidate in philosophy. She argues for the rationality of faith by taking an argument against her view and showing that it fails. Of course, this doesn’t “prove” anything, but it does undermine several common attacks made against the rationality of faith. I’d be interested to hear from skeptical readers whether they think Jackson succeeds, or if they have an alternative way to argue for faith’s irrationality.

One point that stands out to me is that skeptics shouldn’t just define faith as irrational. She explains why in the post.

Read her post here.

I discovered this blog (The Open Table) just today, but it seems like a good one.

An Atheist, an Agnostic, and A Theist Walk Into A Bar

(That literally happened to me one time.) Ok, this joke still needs writing, and that’s not my thing. But I do want to try and tease out a related conversational knot that’s been giving me trouble. In short, the knot involves the answers to the following questions:

  • What does it mean to be an atheist?
  • What does it mean to be a theist?
  • What does it mean to be an agnostic?

Why does this matter? Because labels matter to us. If someone called me a “feminist,” my reaction might depend on what they mean by the term. If it just means “someone who advocates for the complete social, economic, and political equality of the sexes,” then I’m happy to carry the label. But if they mean it pejoratively to mean “someone who hates men and wants women to take over the world,” then I’m going to have a problem with that. So which is the correct definition of feminist?

Similarly, if someone calls you an atheist, what exactly does that mean? You might accept the term when defined a certain way, but not when defined in another way. Also, if I say something like, “atheism is irrational,” the reasonableness (or truth) of that claim depends on the definition being used. If it means, “someone who knows with certainty that no gods exist,” then few people will accept the label, and rightly so.

What I want to do here is compare two approaches to defining these terms, and explain why I recommend one over the other. I’ll start with what I call the “four quadrants” model.

The Four Quadrants Model

Some people propose we renovate these terms (atheist, agnostic, theist) a bit to make things clearer and avoid foisting burdensome views upon others. Here is the renovation proposal:

[Image: the four-quadrants belief grid, with theism/atheism on one axis and gnostic/agnostic on the other]

There is a certain elegance and symmetry to this model. You have ‘theism’ and ‘a-theism’ juxtaposed with ‘gnostic’ and ‘a-gnostic.’ Very nice.

This “quadrant model” carries other advantages as well. First, it takes some pressure off of atheists who don’t want to claim that they “know” there are no gods. Second, it also takes pressure off of theists in exactly the same way. Third, it uses the term ‘agnostic’ in a way more true to the original meaning of the Greek word. In ancient Greek, ‘gnosis’ means ‘knowledge’ and the prefix ‘a’ means ‘without’ or negation. So, to say that I am “agnostic” literally means “I don’t know.”

Cons of the Quadrant Model

Unfortunately, there are some drawbacks to this model. For one, it uses ‘gnostic’ and ‘agnostic’ unconventionally. While I do love etymologies (word origins), most people find them rather pedantic. The simple truth about language is that meanings change over time. A word gets its meaning from usage. If we were arguing about what technical term philosophers should use, that might be a different story. But if we want an ordinary term for common usage, it is simpler to use the word conventionally. So, when it comes to ‘agnostic,’ most people understand this to describe someone who is undecided about God’s existence. And when it comes to ‘gnostic,’ this term refers to followers of a religion that revolves around the possession of esoteric, mystical knowledge (Gnosticism). For Christians, this is especially important, because ‘gnostic’ carries a heavy negative connotation, and has for nearly two thousand years. To be a “gnostic theist” is to be a heretic, for most Christians. So, shifting meanings in this way muddies the waters.

Second, this model suggests that knowledge and belief are like height and width–two separate dimensions of thought. But belief and knowledge relate more like acceleration and force. Belief is a component of knowledge, just as acceleration is a component of force (F = ma). So it is misleading to represent them on two perpendicular axes.

This quadrant model also fails to provide a safe conceptual space for the truly undecided. True undecidedness is a real position on many important questions, including scientific ones. In numerous cases (e.g., the multiverse), the most rational thing to say is “I neither believe there is an X, nor do I believe there is not an X.” But the diagram above tells me that I must pick a quadrant.  I must either believe (the space below the ‘x’ axis) or not believe (the space above the ‘x’ axis). If we revised it to allow people to be on the axis, perhaps right at (0,0), then what do we call them? There is no in-between because the model is binary in principle.

Finally, the fellow in the upper-left quadrant confuses a few things. He wants to say “I don’t believe any god exists,” but he also wants to say “I’m not CLAIMING that–I might be wrong.” I understand the discomfort here. He doesn’t want to make a strong claim, because that would require a strong defense, which is a burden he doesn’t want. Fair enough. But he isn’t like a helicopter that has yet to land. He has landed, even if tentatively. He thinks there are no gods. He’s not saying he is certain, or that he can prove anything, he is simply describing where he has landed. And he admits that he may have landed on the wrong spot. That’s fine. But even a tentative landing represents a claim about what you think is true about the universe. I’ll say a bit more below about the difference between “having no belief about p” and “not believing p.”

The Sliding-Scale Model

Instead of a binary model, with the restrictions that entails, I prefer a sliding-scale approach. This non-binary model allows for a wide range of possibilities, grouped into three natural categories. Rather than being forced to choose from only four possible positions, people can personalize their position based on their beliefs and confidence level.

I didn’t have a cute graphic for mine, so I made this:

[Image: a sliding scale of belief, from 0% to 100% confidence that p]

On this view, you can be anywhere between 0-100% confidence about a certain idea or claim. (“p” refers to any claim, or proposition, like “God exists.”) If you find yourself hovering around the 50% mark, we’ll say you neither believe it nor disbelieve it. This is where we should fall on claims like “this fair coin will land on heads when flipped.” Sometimes we say things, loosely, like “I don’t know.” But this conversationally implies that we simply don’t have a belief one way or the other.

If you land roughly between 65% and 100% confident that p, then you clearly believe it is true. At 100% confidence, you have no doubts and think there is no chance that p is false. (Notice that we’re saying nothing about knowledge here. This is only about beliefs, just to keep things simple and clear.) If you fall anywhere between 0% and 35%, you think that p is false, though the closer you get to 50%, the more you lean toward thinking there’s some chance it could be true. For example, suppose I’m looking over a balcony, wondering if I could jump safely into the pool. I give myself about a 15% chance of plunging safely into the water. So, if you ask me, “Do you believe you can make it?” I’d say, “no.” If you have 0% confidence that p, then you have no doubt it is false–you disbelieve it with maximum confidence of its falsehood.

The Sliding-Scale & God

Now, if we apply this to our debate about definitions, here’s how I think it works in terms of belief about God. If you have ~65% or more confidence that God exists, then you believe that God exists and we call you a theist.  (I think most of us agree with that.) But theists, like atheists, can possess little or much confidence. If you are ~35% or less confident that God exists, then you disbelieve that God exists and we call you an atheist. (Nothing about knowledge here!)
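
To make those cutoffs concrete, here is a minimal sketch in Python. This is my own illustration, not part of the model itself; the function name is made up, and the ~35%/~65% boundaries are the rough figures above, not sharp lines.

```python
def doxastic_label(confidence: float) -> str:
    """Map a confidence level (0.0 to 1.0) that "God exists" is true
    onto a label, using the rough ~35%/~65% cutoffs described above."""
    if confidence >= 0.65:
        return "theist"    # believes that God exists
    if confidence <= 0.35:
        return "atheist"   # disbelieves (believes there is no God)
    return "agnostic"      # undecided: neither believes nor disbelieves

print(doxastic_label(0.90))  # theist
print(doxastic_label(0.50))  # agnostic
print(doxastic_label(0.15))  # atheist
```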

Both segments (red and green) of the scale represent a “belief state,” two sides of the same coin.

But that’s ok because merely believing or disbelieving that p doesn’t saddle you with an undue burden. I call them both belief states because disbelieving that p is roughly synonymous with believing that p is false. I.e., “I disbelieve that God exists” is the same as “I believe there is no God.” It’s like someone saying, “I don’t think the coin will be heads”–you wouldn’t need to ask whether they believe it will be tails. It’s just a belief! No big deal. Whether it is rational or whether you know is a different ball game and will require more justification. But the atheist need not attain certainty or prove there is no God in order to be a rational atheist.

Questions & Concerns

Some atheists prefer the quadrant model because they are more comfortable saying “I don’t have a belief about God–I lack belief in God.” But saying you lack a belief about God’s existence is not accurate. Atheists lack an affirmation of God’s existence, but they have a belief state (doxastic attitude), and that belief state is disbelief. They take the claim “God exists” to be false. If you don’t take it to be false, then you are either undecided or a theist. The only people who truly lack a belief about God are those who have never considered God’s existence, like my dog Duke or my friend’s baby. They just have no belief state about God whatsoever.

What about agnostics? Now, I admit that the term ‘agnostic’ as a label for the undecided is somewhat regrettable, given the literal Greek meaning. Coined by Thomas Huxley in the late 19th century, the term served to contrast his position against those who felt they had attained “gnosis” or knowledge of answers to the big questions. Huxley used the term to express either skepticism or humility or both. But regardless of Huxley’s intentions, the term now refers to someone who is undecided on a matter, religious or otherwise. For now, it works. Launch a campaign to shift the usage if you dislike it, but it isn’t quite right to tell people that they’re using it wrong now.

To avoid mixing up atheism and agnosticism, note that the claim “I don’t believe that any gods exist” (as in the four quadrant graphic above) can mean several different things. Consider the claim E: “the number of stars in the universe is even.” If I say that “I don’t believe E,” that could mean: (1) I think E is false, which implies that I hold the odd-number-stars view; or (2) I don’t believe E, but I don’t think it’s false either. I’m just undecided, or agnostic on the matter. So when you want to express atheism and NOT agnosticism, it is better to say something like, “I believe there are no gods,” or more simply “I disbelieve theism.”
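
If it helps, the three-way distinction (believe, disbelieve, suspend) can be modeled explicitly rather than as a yes/no toggle. Here is a minimal sketch, again my own illustration; the enum and variable names are hypothetical:

```python
from enum import Enum

class Attitude(Enum):
    BELIEVE = "takes p to be true"
    DISBELIEVE = "takes p to be false"  # i.e., believes not-p
    SUSPEND = "neither believes nor disbelieves p"

# "I don't believe E" only rules out Attitude.BELIEVE; it remains
# ambiguous between the two other attitudes:
consistent_with_not_believing_E = [Attitude.DISBELIEVE, Attitude.SUSPEND]
```

On this representation, “I disbelieve theism” picks out exactly one attitude, while “I don’t believe theism” leaves two live options.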

If you discover that the belief-state you are in is difficult to defend, welcome to the club! Each position has its unique challenges and weaknesses. There’s no problem with redefining your position in order to make it more defensible, as long as the changes are not ad hoc and the new definition is coherent and not confusing.

Feelings, Beliefs, and Evidence

In the iconic scene, Darth Vader tells Luke that his feelings will lead him to the truth. Is this true? My feelings aren’t helping here.

If you rely on feelings to tell you what is true, are your beliefs less stable? Are they less likely to be true? (This is a post about a post about a post about a podcast about beliefs and evidence. I’ll thank the relevant people as I go.)

Experience v. Evidence

In a recent Unbelievable podcast, hosted by Justin Brierley, this question jumps onto the table. Brierley interviews two sons-of-famous-Christian-fathers, Bart Campolo and Sean McDowell. Both grew up in the shadow of their fathers’ worldwide influence, charismatic speaking, and prolific publishing. Both followed their fathers into Christian ministry, but their paths diverged at that point. Somewhere along the way, Bart Campolo lost his belief in God, while Sean’s faith became even stronger. What made the difference?

Some hay has been made on the blogosphere about this. I found out about the story from Jeremy Smith at Faith Ascent, and he read about it on Alisa Childers’ excellent blog. So thanks to both of them! The narrative being suggested is roughly this: one man built his faith on the sand of experience (feelings), and the other on the solid rock of evidence, so to speak. The former’s crumbled in the storm, and the latter’s held firm.

More To the Story

This narrative may capture one aspect of the stories involved, but surely we can (and should) say more. I’ll share two thoughts. The first is this: “feelings” or “experience” aren’t opposed to evidence, they are evidence. More precisely, I take them to be a kind of evidence, much in the way that chicken is a kind of poultry. I outline several kinds of evidence in this post (see #15-19).

This means that Campolo’s shift from theism to humanism may not be due to a simple lack of evidence. Instead, I think it may have been due to a lack of diversity in his evidence. Financial advisors always tell their clients to “diversify.” I.e., they should invest in many companies so that if one company tanks, they will still have a stable portfolio. It’s the “don’t put all your eggs in one basket” maxim. My hunch is that McDowell possessed a wider variety of evidence, including experience, philosophical arguments, testimony of reliable sources, and historical evidence. It’s certainly possible that Campolo had plentiful amounts (comparable to McDowell’s) of all these types as well, but I’m guessing this wasn’t the case.

Here’s the second thought: whether or not Campolo enjoyed a copious and varied evidence base, there is another factor that hasn’t been highlighted. Campolo experienced a decades-long struggle with the problem of evil. This experience holds significant evidential power, and can tip the scales against just about any collection of pro-theism evidence, perhaps with a few exceptions. Campolo mentions in the podcast that while ministering in the inner city, he saw horrible things happening to people, including children. He prayed and prayed, with no noticeable results. I can understand how his faith eroded over time in such a milieu. And I don’t know that McDowell ever experienced a “storm” of comparable magnitude. So, that’s a difference that should factor into the explanation.

Conclusion

In sum, stories are complex. Everyone’s evidence base is different, necessarily. Different evidence supports different beliefs. Still, I think the contrast between Campolo and McDowell illustrates the importance of a diversified evidential portfolio, if you’re wanting a stable belief set.

Criticism, Knowledge, and Authority

Learning about informal logical fallacies turns young philosophy students into gun-slinging logic vigilantes. I love how this comic (courtesy of Existential Comics) portrays the phenomenon.

[Comic: “Fallacy Man,” courtesy of Existential Comics]

But, as Alexander Pope wrote, “a little learning is a dangerous thing.” In his Essay on Criticism, Pope critiques the critics, warning them against evaluating beyond their skill. The essay (written in verse) holds great wisdom, well worth the hour it might take to read through. One takeaway is this: if you plan to engage in criticism of a view, be sure you know what you’re talking about. Otherwise your photo may end up on Wikipedia’s Dunning-Kruger Effect page. “Drink deep, or taste not the Pierian spring.”

The Appeal To Authority

One of the fallacies mentioned above that gets frequent abuse is the “appeal to authority.” Those who have only sipped at the Pierian Spring misunderstand this concept, and so make two common errors: 1) they accuse others of it falsely, and 2) they become oblivious to their own appeals to authority. Let me illustrate a little.

Fallacious appeal to authority:  Brett claims that beer causes Alzheimer’s Disease. Conrad replies, “That’s silly.” Brett says, “My friend, Dr. Swanson, said it. Therefore, it’s true.”

Legitimate appeal to authority: Mark claims that black holes emit radiation. Kenny says, “But nothing can escape from a black hole.” Mark retorts, “Stephen Hawking has argued powerfully for this and talks about it in his book, A Brief History of Time.” 

What’s the difference? For one, Stephen Hawking clearly satisfies any reasonable criteria for being a legitimate expert on black holes. It is not at all clear that Dr. Swanson is an expert on Alzheimer’s. Conrad may not even know who Dr. Swanson is. Second, Brett bases his argument solely on the word (hearsay) of Dr. Swanson, while Mark offers at least one checkable resource. Third, Brett fashions his argument in deductive form. But an argument from authority should take inductive form, i.e., the evidence from authority does not guarantee the conclusion–it only makes it more likely to be true. A fourth mistake sometimes made in appeals to authority, though not in this case, is when someone misquotes or misrepresents an expert.
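
Those differences suggest a small checklist. Here is a hedged sketch in Python (my own illustration, not a formal test from the post or the logic literature; the field names are made up):

```python
from dataclasses import dataclass

@dataclass
class AuthorityAppeal:
    relevant_expert: bool         # legitimate expert in the relevant field?
    checkable_source: bool        # citable resource, not mere hearsay?
    framed_inductively: bool      # "likely true," not "therefore it's true"?
    accurately_represented: bool  # no misquoting or misrepresenting?

def looks_legitimate(appeal: AuthorityAppeal) -> bool:
    # All four conditions discussed above should hold.
    return all(vars(appeal).values())

# Mark's appeal to Hawking plausibly passes all four checks;
# Brett's appeal to Dr. Swanson fails the first three:
brett = AuthorityAppeal(False, False, False, True)
print(looks_legitimate(brett))  # False
```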

We All Do It

The bottom line is: we all rely on legitimate appeals to authority, and rightly so. Testimony (information transmitted to us from other persons, as in court) acts as one of at least five sources of knowledge (inference, memory, perception, and consciousness being the others). I simply cannot help but rely on the words of other people to help me form my beliefs about the world, like when my daughter tells me she is at a friend’s house. And I especially rely on those who have expertise in various areas: scientists, philosophers, doctors, lawyers, musicians, etc.

But I still need to treat authority carefully. When I decide whether to believe something I read or hear, I should make sure I know the source. Not all sources (people, publications, websites) are created equal. I would check to see whether the writer/speaker is an expert or is quoting an expert. And I still use reason and background knowledge to filter the expert’s claims. I address some of these ideas in this 2-minute clip from a talk at the University of Missouri Skeptics Club:

(You can see this video, “Responsible Believing,” in its entirety here.)

A Final Paraklesis

(I like the Greek word ‘paraklesis’ because it can mean both “encouragement” and “exhortation.”) Sometimes extra caution is required. I may take risks, at times, with my own health–like when I indulge in pipe-smoking. But I should think twice about the health risks when recommending such things to others. Similarly, I am sometimes negligent with my epistemic health–like believing something without sufficient consideration. But I try to exercise extra caution and care when conveying ideas (teaching, writing, speaking, using social media), based on authority, to others. Take an extra moment to ask, before you post or assert something based on authority,

  • Is the authority legitimate? (not always an easy question)
  • If the issue is controversial, have I portrayed it as one-sided by only quoting one expert?
  • Is the authority an expert in the relevant field?
  • Did I accept this expert’s word uncritically, or have I checked it out?
  • Have I represented the authority accurately?

And before you draw your fallacy six-gun and dispense epistemic justice on someone, ask whether they might be making an appropriate appeal to authority.