If you haven’t listened to my latest podcast, here’s a preview:
I always appreciate comments here on the blog or on Facebook. Subscribe to my YouTube channel!
My second “official” podcast (on Stephen Hawking) is now available on iTunes! Here’s the iTunes link. If you don’t have iTunes, you can listen on SoundCloud. Feedback on the podcast, including production features, is welcome.
I continue my interview with Dr. Kenny Boyce, Asst. Prof. of Philosophy at the University of Missouri. This episode focuses on the work of Stephen Hawking, who passed away on March 14, and the implications of his work for philosophy and theology.
In part 2, we focus on three main topics, all centered around the epistemology of science. First, we discuss the difference between realism and anti-realism in science and how this affects arguments for or against God. Second, we explore whether science can say anything about the evidence for God. Third, we talk about the “god of the gaps” objection to theism that is commonly raised by skeptics.
This one is a bit longer than the first, and I still have enough material left over for another podcast! We’ll see if it ends up becoming Part 3.
Thank you, Dr. Kenny Boyce!
Dr. Boyce (the one on the right).
Speech sometimes offends, even injures, our sensibilities. Alex Jones and the decisions of Apple and Facebook to remove his content illustrate this. But there are at least two ways speech can “hurt” us. Some hurtful speech stabs to the core of our self and our sense of dignity as a human being. Other times, speech threatens us because our inadequate cognitive defenses and filters fail to protect our psyche. I want to address the second kind of scenario because it is more “up to us” than the first kind.
My father endured numerous chemotherapy treatments during his battle with cancer in 2002. I distinctly recall one time when his immune system was so severely compromised by the chemo that we had to wear face masks just to come into his hospital room. And if anyone was sick–forget it! A common cold could kill him. If someone walked into the room without a mask, a nurse would immediately escort them out with a stern reprimand. Ordinary germs–ones that any healthy immune system would handle easily–constituted a threat.
Something similar goes on with our beliefs. You could say we have another immune system–an epistemic immune system. Instead of protecting us against bacteria and viruses that threaten our body, the epistemic immune system protects our “worldview” (our system of beliefs about reality) against false ideas and bad logic. When our epistemic immune system is healthy, it identifies bad ideas and bad reasoning and escorts them to the mental trash bin. It also identifies good ideas and sound reasoning and allows them through unharmed, where they find eventual integration with our worldview. If our epistemic immune system functions well, we feel more secure and less fearful because we know our beliefs will remain healthy despite our exposure to bad ideas.
We need a healthy epistemic immune system because bad ideas really can harm us. If bad ideas gain “admission” into our belief structure, they can start to cause problems. They can cause psychological anguish or pain. They can result in actions that harm us or others. They can conflict with other (good) beliefs, or erode the foundations of our worldview. We sometimes feel this in the form of cognitive dissonance or instability. Like a man on a boat for the first time in choppy seas, we wobble around, out of balance and extremely uncomfortable. We sense that any small push might send us tumbling, our worldview crashing like a Jenga tower. Every disagreement feels like a threat, like spoken violence.
Think of your worldview as a city with two lines of defense: outside the gate and inside the gate. You control what you are exposed to “outside” the gate by choosing what to read, watch, listen to, etc. But once you have seen or heard an idea, it’s through the gate and your internal mental defenses (epistemic immune system) have to do their job. It is very, very difficult to completely control what gets through your gate. It’s like movie spoilers–if you’re using social media, it’s really hard not to find out that everyone dies in Infinity War. (See!?!?) Ideas zip through the gate of your eyes and ears so fast! This is why we need a healthy epistemic immune system on the inside.
Now here’s the real crux of the matter. When our internal defenses are weak, we are too easily thrown off balance by disagreement and contrary views. Fear and insecurity rule us. So here’s what we do: we try to shut the gate. Or we at least build a barricade in front of it to block new ideas out. How do we do this? I’ve observed (even in myself) two main strategies. For one, we avoid exposure to new ideas–we become epistemic hypochondriacs. We shun (or censor) books, websites and people who disagree with us. Secondly, we use anger or outrage as a shield. Instead of looking carefully at the idea presented and constructing a reasonable response, we try to intimidate the other party into silence with loud, abusive speech.
Now before you write a nasty email or comment, let me clarify something. Remember I mentioned two ways that speech can hurt. When you’re dealing with the first sort (see paragraph 1), epistemic defenses won’t help much. This sort of deeply abusive speech that penetrates to our core does not require careful analysis and logical counterargument. It’s what the Supreme Court referred to as “fighting words.” But the other sort of “hurtful” speech–the kind that only hurts because we lack a healthy internal defense–should not be banned or censored. The challenge lies in discerning which sort you’re dealing with.
So let me offer a suggestion. Cultivate a healthy epistemic immune system. This solves much of the problem. You can do this several ways.
Since I know very little about political issues and immigration, I tend to stay out of debates. But what I do know is good debate. So, I won’t often weigh in on one side, but I will comment on the quality of the arguments. In the recent brouhaha over separating children from parents at the border, people used whatever tactics they could to “win the argument.” But there was quite a bit of “tu quoque” (Latin for “you too”) going on. Using this tactic doesn’t get us any closer to knowing what’s true or right.
“You too” happens when side A says that there is something really bad about the policy of side B, and side B responds by saying, “Well, you’re just as bad!” Typically, side B resorts to this tactic because they know their policy is really bad. They don’t want to defend it. So, instead, they shift the focus off of whether the policy is bad and put it on something side A has done which is just as bad. This puts side B on better terms with an argument they can win. But the original argument about the policy of B is left unresolved. Even worse, resolution is now impossible because side A and side B aren’t even arguing about the same thing anymore. (See this post for other kinds of bad logic.)
For example, side A argued that the policy (held by side B) of separating children from their parents at the border is really, really bad. But instead of discussing the merits of the policy, side B accuses side A of being hypocrites because side A originated the policy years ago! “You’re just as bad as us!” But this, while perhaps true, misses the point completely. The original question is, “Should we continue separating children from their parents?” not, “Who is to blame for this bad policy?” What side B should have done, right from the start, is either defend the policy or admit that the policy is bad and change it, rather than try to return fire. This would be good, helpful conversation and debate. (Also, some people on side B did defend the policy by saying, “Well, it’s the law!” But this amounts to arguing that “this policy is the right policy because it is the current policy.” Try telling that to MLK!)
Please note that my point here is not about who was right. My point is about how one side argued badly. Who was correct is a completely different issue.
But I’m not letting side A off the hook that easily! When side B won a recent Supreme Court decision that made it (legally) permissible for a cake designer to refuse to make a cake for a gay wedding, side A was outraged. But then a new story came out: business owners on side A refused service to people who worked for the Trump administration. This unleashed a “You too!” tornado on social media. Both sides started lobbing “you too” grenades at the other. Instead of debating whether it is right to refuse service, both sides said, “Your side did the same thing!” This simply avoids the actual issue.
It’s always a good idea to stop and think about the tactics you’re using to “win” a debate. Some tactics help us discover what is good, true and beautiful. Others only serve to distract, shut down, or silence our opponents.
Just had to post this story as a classic example of bad believing (a.k.a. bad epistemology). Football coach Mike Leach allows himself to be suckered in by “fake news,” and compounds the error by broadcasting it to thousands in his Twitter feed. Ironically, Leach has a brilliant offensive mind when it comes to football. He probably uses “fakes” (deception) in his plays all the time, and expects the defense to fall for them. Let this serve to encourage you to always proceed with extreme caution when processing things on the internet–especially conspiracy theories.
In the summer of 2017, I visited the University of Oxford and walked the flower-covered grounds of Magdalen (oddly pronounced “Maudlin”) College. I imagined myself retracing the steps of C. S. Lewis as he first wrestled with the idea of faith in God. He describes his conversion this way:
“You must picture me alone in that room in Magdalen, night after night, feeling, whenever my mind lifted even for a second from my work, the steady, unrelenting approach of Him whom I so earnestly desired not to meet. That which I greatly feared had at last come upon me. In the Trinity Term of 1929 I gave in, and admitted that God was God, and knelt and prayed: perhaps, that night, the most dejected and reluctant convert in all England.” (emphasis mine)
Lewis experienced a process that everyone goes through at one time or another. We start with a belief, we encounter something that unsettles that belief, and then we either find a way to retain our belief, or we change to a different belief. Personally, I don’t think we can directly control what we believe, but we can often indirectly influence the process. But how do we decide what to do when we feel that unsettling?
Only in the last few years have I come to appreciate the expression “winds of change.” When there is a change in air pressure in one place, you feel that change in the form of air moving quickly in or out of your location. That moving air brings a change in weather. (Apologies to any meteorologists out there for my crude description.) Sometimes we feel the “winds of change” in our mental life. Something is unsettled and moving. We encounter new evidence (either in the form of an experience or a set of reasons presented to us) against our view of something and our belief becomes unstable.
The question is, what should we do when we feel that unsettledness? It seems there are several possibilities:
Something about the first approach appeals to us. It takes no effort, for one. It also sounds so flexible and open-minded. But while flexibility and open-mindedness can be virtues, you can have too much of a good thing. One danger with this approach is that some evidence is misleading evidence.
Ever read a good murder mystery where someone tries to frame another person for the murder? Jenna plants a bloody knife in Jake’s house, she transfers money into his bank account, etc. The average person sees these “clues” and believes Jake must be guilty. But a good detective doesn’t form conclusions quite so quickly or easily. They hold out, they investigate and test the evidence. This sort of reactionary believing happens on social media all too often. We swallow “fake news” or posts that turn out to be hoaxes or just mistakes. So, it seems better to form beliefs more carefully, but without losing flexibility and open-mindedness.
Believe it or not, approach #2 also has a virtue. If all your current beliefs are true, then the head-in-sand technique can help you avoid ever forming a false belief! But it will be at the cost of ever learning any new truths. And besides, I know that my current stock of beliefs isn’t perfect. #2 isn’t as safe as it seems.
Of the three options, #3 provides the best way to ensure you are moving toward the truth, or at least toward the most reasonable belief. If you care deeply about having true and reasonable beliefs, then it is wise to invest some time in investigation when you experience an “unsettling” in your worldview.
My senior year in college, I flew west for a summer mission initiative in San Francisco. My roommate in the dorms, Jasper, didn’t at all fit into the box of what I thought an evangelical college student should be. He had long, crazy hair, dressed in a sort of “grunge” style, and (gasp) listened to secular music! So I thought I was more “mature” than Jasper in my Christian faith, since I wasn’t as “worldly.”
By the end of the summer, however, it became clear that not only was Jasper more mature in his faith than I was, but he emerged as the spiritual leader of the entire mission. Over those 6 weeks, I watched Jasper carefully, and I saw enough evidence in his life to “shift” my belief about secular music. When I returned home from California, I had changed my view and finally felt the freedom to re-embrace my favorite band, U2. (I know how ironic that sounds, given that U2 are very Christian in their message.)
I’ve also experienced times where my view has been challenged, and after investigation, I’ve held my ground. I’ve even moved to a position of “I don’t know” on a few topics. It’s not about which position you take, it’s about responsibility. I want to be responsible with my mind and my beliefs, the same way I try to be careful what I eat and assimilate into my body. (I wrote about another time I changed beliefs here.)
I actually went back to live in California for a few years, about a decade after that summer mission. Loved it. Except for the earthquakes. When you feel that rumbling, and your picture frames start rattling off the shelves, it’s quite unsettling.
Sometimes we feel that rumble in our worldview when we have new experiences and talk to people with different perspectives. But we don’t have to respond in panic and fear. Quality buildings are strong, but also flexible, to better withstand quakes. We need that, too. Stay flexible and ready to adjust as needed when the quake comes. We can stop and decide to take some time to investigate. “It is the mark of a mature mind,” goes a saying often attributed to Aristotle, “to be able to entertain an idea without accepting it.” Hold that idea (and the evidence) in your hand and give it a good hard look. Then you can rationally, responsibly discern whether to toss it, table it, or move toward it.
I came across this wonderful post by Liz Jackson, a Notre Dame PhD candidate in philosophy. She argues for the rationality of faith by taking an argument against her view and showing that it fails. Of course, this doesn’t “prove” anything, but it does undermine several common attacks made against the rationality of faith. I’d be interested to hear from skeptical readers whether they think Jackson succeeds, or if they have an alternative way to argue for faith’s irrationality.
One point that stands out to me is that skeptics shouldn’t just define faith as irrational. She explains why in the post.
Read her post here.
I discovered this blog (The Open Table) just today, but it seems like a good one.
(That literally happened to me one time.) Ok, this joke still needs writing, and that’s not my thing. But I do want to try and tease out a related conversational knot that’s been giving me trouble. In short, the knot involves the answers to the following questions:
Why does this matter? Because labels matter to us. If someone called me a “feminist,” my reaction might depend on what they mean by the term. If it just means “someone who advocates for the complete social, economic, and political equality of the sexes,” then I’m happy to carry the label. But if they mean it pejoratively to mean “someone who hates men and wants women to take over the world,” then I’m going to have a problem with that. So which is the correct definition of feminist?
Similarly, if someone calls you an atheist, what exactly does that mean? You might accept the term when defined a certain way, but not when defined in another way. Also, if I say something like, “atheism is irrational,” the reasonableness (or truth) of that claim depends on the definition being used. If it means, “someone who knows with certainty that no gods exist,” then few people will accept the label, and rightly so.
What I want to do here is compare two approaches to defining these terms, and explain why I recommend one over the other. I’ll start with what I call the “four quadrants” model.
Some people propose we renovate these terms (atheist, agnostic, theist) a bit to make things clearer and avoid foisting burdensome views upon others. Here is the renovation proposal:
There is a certain elegance and symmetry to this model. You have ‘theism’ and ‘a-theism’ juxtaposed with ‘gnostic’ and ‘a-gnostic.’ Very nice.
This “quadrant model” carries other advantages as well. First, it takes some pressure off of atheists who don’t want to claim that they “know” there are no gods. Second, it takes pressure off of theists in exactly the same way. Third, it uses the term ‘agnostic’ in a way truer to the original meaning of the Greek word. In ancient Greek, ‘gnosis’ means ‘knowledge’ and the prefix ‘a-’ means ‘without’ or negation. So, to say that I am “agnostic” literally means “I don’t know.”
Unfortunately, there are some drawbacks to this model. For one, the unconventional use of ‘gnostic’ and ‘agnostic.’ While I do love etymologies (word origins), most people find them rather pedantic. The simple truth about language is that meanings change over time. A word gets its meaning from usage. If we were arguing about what technical term philosophers should use, that might be a different story. But if we want an ordinary term for common usage, it is simpler to use the word conventionally. So, when it comes to ‘agnostic,’ most people understand this to describe someone who is undecided about God’s existence. And when it comes to ‘gnostic,’ this term refers to followers of a religion that revolves around the possession of esoteric, mystical knowledge (Gnosticism). For Christians, this is especially important, because ‘gnostic’ has carried a heavy negative connotation for nearly two thousand years. To be a “gnostic theist” is to be a heretic, for most Christians. So, shifting meanings in this way muddies the waters.
Secondly, this model suggests that knowledge and belief are like height and width–two separate dimensions of thought. But belief and knowledge relate more like acceleration and force. Belief is a component of knowledge, just as acceleration is a component of force (F = ma). So it is misleading to represent them on two perpendicular axes.
This quadrant model also fails to provide a safe conceptual space for the truly undecided. True undecidedness is a real position on many important questions, including scientific ones. In numerous cases (e.g., the multiverse), the most rational thing to say is “I neither believe there is an X, nor do I believe there is not an X.” But the diagram above tells me that I must pick a quadrant. I must either believe (the space below the ‘x’ axis) or not believe (the space above the ‘x’ axis). If we revised it to allow people to be on the axis, perhaps right at (0,0), then what do we call them? There is no in-between because the model is binary in principle.
Finally, the fellow in the upper-left quadrant confuses a few things. He wants to say “I don’t believe any god exists,” but he also wants to say “I’m not CLAIMING that–I might be wrong.” I understand the discomfort here. He doesn’t want to make a strong claim, because that would require a strong defense, which is a burden he doesn’t want. Fair enough. But he isn’t like a helicopter that has yet to land. He has landed, even if tentatively. He thinks there are no gods. He’s not saying he is certain, or that he can prove anything; he is simply describing where he has landed. And he admits that he may have landed on the wrong spot. That’s fine. But even a tentative landing represents a claim about what you think is true about the universe. I’ll say a bit more below about the difference between “having no belief about p” and “not believing p.”
Instead of a binary-based model, and the restrictions that entails, I prefer a sliding-scale approach. This non-binary model allows for a wide range of possibilities, grouped into three natural categories. Rather than being forced to choose from only four possible positions, people can personalize their position based on their beliefs and confidence level.
I didn’t have a cute graphic for mine, so I made this:
On this view, you can be anywhere between 0-100% confidence about a certain idea or claim. (“p” refers to any claim, or proposition, like “God exists.”) If you find yourself hovering around the 50% mark, we’ll say you neither believe it nor disbelieve it. This is where we should fall on claims like “this fair coin will land on heads when flipped.” Sometimes we say things, loosely, like “I don’t know.” But this conversationally implies that we simply don’t have a belief one way or the other.
If you land roughly between 65% and 100% confidence that p, then you clearly believe it is true. At 100% confidence, you have no doubts and think there is no chance that p is false. (Notice that we’re saying nothing about knowledge here. This is only about beliefs, just to keep things simple and clear.) If you fall anywhere between 0% and 35%, you think that p is false, though the closer you get to 50%, the more you lean toward thinking there’s some chance it could be true. For example, suppose I’m looking over a balcony, wondering if I could jump safely into the pool. I give myself about a 15% chance of plunging safely into the water. So, if you ask me, “Do you believe you can make it?” I’d say, “no.” If you have 0% confidence that p, then you have no doubt it is false–you disbelieve it with maximum confidence.
Now, if we apply this to our debate about definitions, here’s how I think it works in terms of belief about God. If you have ~65% or more confidence that God exists, then you believe that God exists and we call you a theist. (I think most of us agree with that.) But theists, like atheists, can possess little or much confidence. If you are ~35% or less confident that God exists, then you disbelieve that God exists and we call you an atheist. (Nothing about knowledge here!)
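The sliding scale above can be sketched as a tiny classifier. This is only an illustration of the thresholds as I’ve described them–the ~35% and ~65% cutoffs are rough, fuzzy bands, not sharp boundaries, and the function name and labels are my own:

```python
def doxastic_attitude(confidence):
    """Map a subjective confidence (0.0 to 1.0) that claim p is true
    onto the rough categories of the sliding-scale model.

    The 0.35 / 0.65 cutoffs are the approximate thresholds described
    in the post; they are illustrative, not precise boundaries.
    """
    if confidence >= 0.65:
        return "believes p"
    if confidence <= 0.35:
        return "disbelieves p"
    return "undecided about p"

# Applied to p = "God exists":
#   believes p     -> theist
#   disbelieves p  -> atheist
#   undecided      -> agnostic (in the conventional sense)
print(doxastic_attitude(0.50))  # the fair-coin claim: undecided
print(doxastic_attitude(0.15))  # the balcony jump: disbelief
```

Notice that nothing in this sketch mentions knowledge–it only sorts belief states by confidence, which is the whole point of the model.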
Both segments (red and green) of the scale represent a “belief state,” two sides of the same coin.
But that’s ok because merely believing or disbelieving that p doesn’t saddle you with an undue burden. I call them both belief states because disbelieving that p is roughly synonymous with believing that p is false. I.e., “I disbelieve that God exists” is the same as “I believe there is no God.” It’s like someone saying, “I don’t think the coin will be heads”– you wouldn’t need to ask whether they believe it will be tails. It’s just a belief! No big deal. Whether it is rational or whether you know is a different ball game and will require more justification. But the atheist need not attain certainty or prove there is no God in order to be a rational atheist.
Some atheists prefer the quadrant model because they are more comfortable saying “I don’t have a belief about God–I lack belief in God.” But saying you lack a belief about God’s existence is not accurate. Atheists lack an affirmation of God’s existence, but they have a belief state (doxastic attitude), and that belief state is disbelief. They take the claim “God exists” to be false. If you don’t take it to be false, then you are either undecided or a theist. The only people who truly lack a belief about God are those who have never considered God’s existence, like my dog Duke or my friend’s baby. They just have no belief state about God whatsoever.
What about agnostics? Now, I admit that the term ‘agnostic’ as a label for the undecided is somewhat regrettable, given the literal Greek meaning. Coined by Thomas Huxley in the late 19th century, the term served to contrast his position against those who felt they had attained “gnosis” or knowledge of answers to the big questions. Huxley used the term to express either skepticism or humility or both. But regardless of Huxley’s intentions, the term now refers to someone who is undecided on a matter, religious or otherwise. For now, it works. Launch a campaign to shift the usage if you dislike it, but it isn’t quite right to tell people that they’re using it wrong now.
To avoid mixing up atheism and agnosticism, note that the claim “I don’t believe that any gods exist” (as in the four quadrant graphic above) can mean several different things. Consider the claim E: “the number of stars in the universe is even.” If I say that “I don’t believe E,” that could mean: (1) I think E is false, which implies that I hold the odd-number-stars view; or (2) I don’t believe E, but I don’t think it’s false either. I’m just undecided, or agnostic on the matter. So when you want to express atheism and NOT agnosticism, it is better to say something like, “I believe there are no gods,” or more simply “I disbelieve theism.”
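The star-parity example can be made explicit with a toy three-valued belief state. This encoding is my own illustration (the names are made up, not from the quadrant graphic): the sentence “I don’t believe E” comes out true in two *different* states, which is exactly why it underdetermines the speaker’s position.

```python
from enum import Enum

class Attitude(Enum):
    BELIEVE = "believes E is true"       # holds the even-number-stars view
    DISBELIEVE = "believes E is false"   # holds the odd-number-stars view
    SUSPEND = "undecided about E"        # agnostic on the matter

def i_dont_believe_E(state):
    # "I don't believe E" is literally true in two distinct states,
    # so the utterance alone can't tell us which one the speaker is in.
    return state in (Attitude.DISBELIEVE, Attitude.SUSPEND)

# Both the disbeliever and the undecided person can truly say
# "I don't believe the number of stars is even":
print(i_dont_believe_E(Attitude.DISBELIEVE))  # True
print(i_dont_believe_E(Attitude.SUSPEND))     # True
print(i_dont_believe_E(Attitude.BELIEVE))     # False
```

This is why “I believe there are no gods” (Attitude.DISBELIEVE toward theism) communicates atheism unambiguously, while “I don’t believe any gods exist” leaves both readings open.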
If you discover that the belief state you are in is difficult to defend, welcome to the club! Each position has its unique challenges and weaknesses. There’s no problem with redefining your position to make it more defensible, as long as the changes are not ad hoc and the new definition is coherent and clear.
In the iconic scene, Darth Vader tells Luke that his feelings will lead him to the truth. Is this true? My feelings aren’t helping here.
If you rely on feelings to tell you what is true, are your beliefs less stable? Are they less likely to be true? (This is a post about a post about a post about a podcast about beliefs and evidence. I’ll thank the relevant people as I go.)
In a recent Unbelievable podcast, hosted by Justin Brierley, this question jumps onto the table. Brierley interviews two sons of famous Christian fathers, Bart Campolo and Sean McDowell. Both grew up in the shadow of their fathers’ worldwide influence, charismatic speaking, and prolific publishing. Both followed their fathers into Christian ministry, but their paths diverge at that point. Somewhere along the way, Bart Campolo lost his belief in God, while Sean’s faith became even stronger. What made the difference?
Some hay has been made on the blogosphere about this. I found out about the story from Jeremy Smith at Faith Ascent, and he read about it on Alisa Childers’ excellent blog. So thanks to both of them! The narrative being suggested is roughly this: one man built his faith on the sand of experience (feelings), and the other on the solid rock of evidence, so to speak. The former’s crumbled in the storm, and the latter’s held firm.
This narrative may capture one aspect of the stories involved, but surely we can (and should) say more. I’ll share two thoughts. The first is this: “feelings” or “experience” aren’t opposed to evidence, they are evidence. More precisely, I take them to be a kind of evidence, much in the way that chicken is a kind of poultry. I outline several kinds of evidence in this post (see #15-19).
This means that Campolo’s shift from theism to humanism may not be due to a simple lack of evidence. Instead, I think it may have been due to a lack of diversity in his evidence. Financial advisors always tell their clients to “diversify”–to invest in many companies so that if one company tanks, they will still have a stable portfolio. It’s the “don’t put all your eggs in one basket” maxim. My hunch is that McDowell possessed a wider variety of evidence, including experience, philosophical arguments, testimony of reliable sources, and historical evidence. It’s certainly possible that Campolo had plentiful amounts (comparable to McDowell’s) of all these types as well, but I’m guessing this wasn’t the case.
Here’s the second thought: whether or not Campolo enjoyed a copious and varied evidence base, there is another factor that hasn’t been highlighted. Campolo experienced a decades-long struggle with the problem of evil. This experience holds significant evidential power, and can tip the scales against just about any collection of pro-theism evidence, perhaps with a few exceptions. Campolo mentions in the podcast that while ministering in the inner city, he saw horrible things happening to people, including children. He prayed and prayed, with no noticeable results. I can understand how his faith eroded over time in such a milieu. And I don’t know that McDowell ever experienced a “storm” of comparable magnitude. So, that’s a difference that should factor into the explanation.
In sum, stories are complex. Everyone’s evidence base is different, necessarily. Different evidence supports different beliefs. Still, I think the contrast between Campolo and McDowell illustrates the importance of a diversified evidential portfolio, if you’re wanting a stable belief set.