Logical Mistakes, Part 2


Logic is like a superpower. Without logic, we are like Superman exposed to Kryptonite: vulnerable to attack and without our X-ray vision. But when you begin to acquire skill in logic, you experience the powers natural to a flourishing human being. You won’t see through walls, but you’ll see through manipulative commercial and political advertisements. You can’t bounce bullets off your chest, but emotional appeals without logic will be useless against you.

For example, many “attack ads” exaggerate negatives and omit important details in order to sound more persuasive. Also, most ads for consumer products use sentimentality and humor to give you a warm fuzzy feeling about their product, even though those things are irrelevant to the quality or value of the product.

When you understand basic logic, this sort of manipulation has little effect on you, because you understand that warm fuzzies are not a logical reason to believe in the quality of a product. You can spot distraction and misdirection.

Bad Logic as Exploitation

It’s important to realize that bad logic isn’t always the result of carelessness or ignorance. Sometimes fallacious reasoning is used intentionally to exploit you. I heard a great story on NPR a few years ago about how a man named Edward Bernays, a nephew of Sigmund Freud, used Freud’s ideas to pioneer the public relations/advertising industry. Essentially, he learned from his uncle that people aren’t motivated by reason and rational argument, but rather by unconscious, primitive desires and repressed sexual urges. Thus, successfully marketing a product involved bypassing the rational and appealing to the unconscious, irrational and emotional. Sound anything like today’s TV commercials?

So, if you want to be able to (1) avoid making logical errors that often lead to false beliefs, and (2) resist the manipulation of money-driven media, then read on! (See Part 1 here.)

Eleven Logical Mistakes, #7-11

#7 Traditional Fallacy

X has always been done, so X is true and good

This mistake is grounded in what some call the “is/ought fallacy.” Suppose we see a blue ball on the ground. Wouldn’t it seem odd to look at that ball and say, “That ball, and in fact all balls, ought to be blue”? You can’t simply look at the way things are and say, “that’s how they should be!”

Tevye (Fiddler on the Roof) wrestled with tradition

In the Christian realm, tradition weighs in heavily. But we sometimes become lazy and let tradition become a substitute for thinking. “That’s the way it’s always been done” doesn’t mean it should continue. In the Bible, we often forget the distinction between description and prescription. Actions are often described, in story form, with no intention of condoning or recommending those actions. Sometimes people in the Bible do bad things! One example worth considering is patriarchy. It’s true that nearly the entire Bible is written within the milieu of patriarchy, or male rule. But does the Bible actually teach and command that this is good and right? That’s a tricky question and needs careful analysis. It should never be assumed that, because Abraham and Paul did it that way, we should do it that way.

#8 Confirmation Bias

Anything that supports my view is good/true.

This mistake held center stage during the recent Brett Kavanaugh confirmation discussions. People who were already for Kavanaugh saw only heroism and persecution. People who were already against Kavanaugh saw only emotional intemperance and evasion. I’m not sure anyone’s viewpoint was affected by the proceedings. This is because human beings are naturally disposed to pay almost exclusive attention to what supports their beliefs.

Not only this, but confirmation bias makes us more likely to endorse bad arguments, as long as they support our view. When a Christian hears something like, “Well, billions can’t be wrong!” she should cringe, not applaud. An argument is not good simply because it supports your view.

Even when we seek out additional evidence and research a debate, we tend to latch onto evidence that supports our view and ignore it when it opposes us. We need to be extra vigilant in our thinking and research so that we can avoid this tendency.

#9 Either/Or Fallacy

Claiming there are only two options when, in fact, there are more.

In the “conflict” between religion and science, we often hear only two options. “Either you believe science and reject religious stories, or you reject science and turn off your brain!” In truth, many subtle positions exist in between these two extreme viewpoints.

Christians often portray morality this way: “Either God exists or there is no objective morality and we’ll all become Nazis!” Again, there are atheists who are very good people, occupying that middle ground. Pushing people to choose between your view and something ridiculous is manipulation and bullying, not logic.

#10 Bad Appeal To Authority

“Jones said X is true, therefore it is true!” (But Jones is not an expert on X.)

Legitimate appeals to authority can serve as evidence or reasons to believe a claim. “Supreme Court Justice Ginsburg says that this position is unconstitutional.” She is an expert, and her opinion can be taken as evidence in favor of that view. But the appeal to authority can go wrong in several ways.

First, it is fallacious to claim that the authority’s word settles the issue. It is only one (weighty) bit of evidence, not a substitute for thinking and reason.

Second, the appeal goes wrong when the authority quoted is speaking on matters outside his or her expertise. Pastors sometimes make this mistake in speaking authoritatively on science. Similarly, scientists do it when they speak authoritatively on religion. Or if you quote someone who isn’t an expert in that field, like saying, “Peyton Manning says this camera is the best!”

#11 Straw Man Fallacy

Misrepresenting your opponent’s view in order to mock or easily refute it.

Bill: What are your views on God?
Ted: I don’t believe in any God.
Bill: So you believe we are just products of evolving pond scum and live in a self-creating universe?
Ted: I didn’t say that.
(Example courtesy of Colin Burgess.)

In the “old days,” if you didn’t like someone (say, a politician or football coach), you would make a scarecrow, put the person’s name on it, and burn it publicly in effigy. Why? Because it is much easier to burn a scarecrow/straw man than the actual person. Straw men don’t fight back! But this is a sneaky or ignorant tactic to use in a debate. By constructing a flimsy caricature of your opponent’s view, it is much easier to refute it and make it look ridiculous. Instead, try constructing a “steel man” of their view — the best possible version of it — and refute THAT. Then you’ll have accomplished something.


The only way to avoid these mistakes is with humility and vigilance. Humility helps us consider our views and our words more carefully, because we know how prone we are to logical error. Humility helps us focus more on the arguments and ideas, rather than egos.

Vigilance helps us keep our mental eyes peeled so that we can spot the mistakes we make and prevent or correct them. Sometimes when I teach, just to correct for my own biases, I intentionally skew my presentations the opposite way or slightly favor the students who disagree with me.

If you want to build up your logical immune system, read a book on reasoning or visit online resources. I recently enjoyed (and highly recommend) Alan Jacobs’ book, How To Think. One of my favorite go-to websites for building critical thinking skills is the Critical Thinking Web. Leave your own suggestions in the comments below.

Burden of Proof, Pt. 2

An atheist, an agnostic and a Christian walk into a bar. For real. I sat in the Bird Dog Bar in Lawrence, Kansas with my fellow panelists from the “Beliefs Matter” event at the University of Kansas. Friends of various religious and secular persuasions surrounded our table. The event, completed only an hour earlier, featured three distinct perspectives on meaning, justice and morality. We each presented a short sketch of our view, followed by about an hour of Q&A. Now we continued the conversation over drinks. But what is the takeaway from all this, and how is it relevant to the burden of proof?

One thing I hope people take away from such an event is this: everyone has a worldview, and every worldview must stand or fall on its own merits. No worldview gets a “pass”; there is no true “default” view. What is a worldview? A worldview encompasses a set of beliefs about reality—a set of answers to the big questions. Is there a God? Are humans more than physical matter? Are there objective moral truths? Do human beings have objective value and purpose? Why is there suffering? What is justice? Each of the panelists at our event answered these big questions from their unique perspective and defended the coherence and rationality of their answers.

The Playing Field Is Level

So if all worldviews stand on equal footing, then where does the notion of “burden of proof” come from? I think it comes into play when one person tries to persuade another. (Or perhaps when you are called to give a defense of your view!) So here’s another plausible principle:

If you are trying to persuade someone of something, you (probably) have a burden of proof.

(I add ‘probably’ because there are exceptions.) If I’m trying to convince you that artificial sweeteners are bad for you, and you say, “Why do you believe that?” it would be inappropriate for me to reply with, “Well, why do you believe they aren’t?” But if we were chatting with a mutual friend who brought up the issue of net neutrality, and you and I took opposite views, we could both be expected to explain the reasons for our positions. Or better yet, if you and I were wondering whose worldview was more coherent, we would both need to provide a reasoned defense. The last two examples don’t involve just one person trying to persuade another. Only in the first example does there seem to be a burden. I’m relying here on intuitions about what seems appropriate in conversation.

Positive Claims

Some people think that the burden of proof lies with whoever has the “positive claim.” But this is clearly not the case. If I hold the view that trees exist, and you hold the view that they don’t, the burden of proof would still be on you. Why is this? I think it reveals another relevant principle:

If you hold a view which goes against common sense or against the consensus of experts, you (probably) have a burden of proof.

Tree-denial is both against common sense and against the consensus of relevant experts, which I assume to be dendrologists. If you think evolution is false, the burden is on you, since the consensus of biologists is against you. Theism would not receive a burden of proof on this principle, because it is neither against common sense (the vast majority of people who have ever lived have been theists), nor against the consensus of relevant experts. Who are the experts? Philosophers of religion would be the natural answer, since they study reasons for belief in gods and need not assume theism in their work. A recent survey (2009) among philosophers of religion found that 72.3% accept or lean toward theism.

Additionally, (in the case of trees) the burden of proof is squarely on the tree-denier, despite the complaint that “you can’t prove a negative.” Remember that we’re using the term ‘proof’ here very loosely. Nothing can truly be proven outside of mathematics, geometry, and symbolic logic. You need not “prove” your view—you only need to construct a solid argument for it. And this can certainly be done for negative statements. For example, “there are no dinosaurs in this room,” or “no triangles have four sides” are both easy to argue for.

What I’ve said so far is intended to answer the “Is atheism the default view?” question. It should be apparent that I don’t think any view enjoys true “default” status. Everyone should believe according to their evidence—the total evidence they have now, including inferences and experiences. There’s nothing neutral or default about atheism. The only view that might be considered “default” would be to withhold (sometimes called agnosticism).

Not Enough Evidence

So what about the other questions from my last post? Let me tackle one more and save the last two for another post.

  • Is “not enough evidence” a good reason for atheism?

It depends. On my view, you should only believe a proposition when you have more reasons in favor of it than against it. The greater the imbalance, the higher your confidence should be. If you have equal reasons on both sides, then the correct position is to withhold—neither believe it nor disbelieve it. So, if you have lots of reasons against theism (the problem of evil, for example), and you have no reasons for theism, then the “not enough evidence” claim is appropriate. But if you don’t have good reasons for theism, and you don’t have good reasons for atheism, then you can’t use the “not enough evidence” defense for your atheism.
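The rule I just described (believe when the reasons in favor outweigh the reasons against, disbelieve when they are outweighed, withhold when they balance) can be sketched in a few lines of Python. The function name and the numeric weights are my own illustration, not anything official:

```python
def doxastic_attitude(reasons_for, reasons_against):
    """Map the balance of evidence to believe/disbelieve/withhold.

    reasons_for and reasons_against are lists of numeric weights,
    one per reason, representing how strong each reason is.
    (Illustrative only; the weights are my own device.)
    """
    balance = sum(reasons_for) - sum(reasons_against)
    if balance > 0:
        return "believe"      # more reason in favor
    if balance < 0:
        return "disbelieve"   # more reason against
    return "withhold"         # evenly balanced, or no reasons at all

# Strong reasons against a claim and none for it: disbelief is apt.
print(doxastic_attitude([], [2, 3]))   # disbelieve
# No good reasons either way: withholding, not disbelief, is the result.
print(doxastic_attitude([], []))       # withhold
```

The last line is the point of the paragraph above: with no reasons on either side, the rule yields withholding, so the “not enough evidence” defense supports agnosticism rather than atheism.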

Should We Remain Open to New Evidence?

I filmed a short commentary to respond to something I came across a few weeks ago. Randy Helzerman posted a video response (in 2007) to William Lane Craig’s claim that Bayes’ Theorem can be employed to argue for the resurrection of Jesus. Here’s Helzerman’s video: (if you don’t want to watch the whole thing, maybe try starting around 4:23)

I’m not certain that Helzerman is an atheist, but if not, he plays a great Devil’s advocate. I’m also not sure whether he’s saying that it’s a psychological fact that atheists cannot entertain new evidence, or that they shouldn’t entertain new evidence. Interesting either way. So here are my thoughts:

Here’s my post on Cromwell’s Rule.
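For readers who haven’t seen it, Bayes’ Theorem (the tool at issue between Craig and Helzerman) gives the probability of a hypothesis H after learning evidence E:

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}
```

Cromwell’s Rule is the advice to reserve prior probabilities of exactly 0 and 1 for logical falsehoods and truths. The reason is visible in the formula: if P(H) = 0, the numerator is 0 no matter what E is, so P(H | E) = 0 for every possible piece of evidence. A mind that assigns a hypothesis a probability of zero has, in this precise sense, closed itself to new evidence.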

Ground Belief Podcast #2 with Mark Swanson

My first ever attempt at a podcasty thing. I “interviewed” Mark Swanson, Associate Professor in the MU School of Journalism. Mark is also the creator of Feudum, a new tabletop “Euro” style strategy game. Mark and I talk frequently about how complex board games require and develop critical thinking skills, and that’s the subject of our conversation on this “podcast.” This is part 2 of the interview — part 1 is here. The audio quality isn’t great, since we recorded the whole thing completely on a whim using my iPhone. If you like board games, nerds, and the psychology of critical thinking and game play, take a listen.

Since this is my first attempt at podcasting, I would appreciate your feedback!

What To Listen For

  • Do strategy board games rely more on System 1 or System 2 type thinking?
  • Mark talks about his preferences for the intuitive approach to games and other kinds of problem solving.
  • We discuss how a person’s intuitive talents might be developed, not just in games but in art and music as well. I reminisce a bit about my jazz saxophone days.
  • I ask Mark about how game play could help older people maintain their mental sharpness, similar to other kinds of games found on popular websites.
  • We talk briefly about how games are a microcosm of life.
  • We discuss the best “Gateway” games.
  • Mark muses about how becoming a game developer has brought about growth in other areas of life.




How I Believe

Below are 21 statements that form the basis for my own epistemology: how I believe. I’ve tried to avoid technical, philosophical language wherever possible, but it might still sound clunky to some readers. The sub-points, also numbered, offer something like an example of the claim. (Omitted from this post, for the sake of space, is any discussion about updating beliefs based on new evidence.) If you love this topic and want to go deeper, click the links.

Here’s the challenge: Read all the statements, see if you disagree with any of them, then tell me why. Refer to sub-points as “6.1 or 14.3.” Let’s avoid technical nitpicking and focus on substantial differences. I’m open to suggestions for revision. I think these statements can have important implications for what you believe, and they can help us clarify how we talk about our beliefs, regardless of your worldview.

  1. A claim is expressed by a descriptive sentence like, “Bananas are fruits,” or “Triangles have three sides,” or “Unicorns do not exist.” (Philosophers like to call these ‘propositions.’)
  2. It is irrational to believe a claim without any evidence to support it.*
  3. If my overall evidence is strongly against a claim, then it is irrational to believe it.
  4. If my overall evidence strongly supports a claim, then it is rational to believe it.
  5. If my evidence for and against a claim is (roughly) even, then the most rational thing is to remain undecided (or “suspend judgment” or “withhold belief”).
    1. The evidence for and against Jake’s guilt is even, so I don’t know what to think.
  6. If I have no evidence for or against a claim, then the most rational thing is to remain undecided.
    1. I am undecided whether polar bears enjoy raspberry sorbet.
    2. I am undecided whether there is a largest prime number.
  7. If I’m undecided on a claim, then I think the chances of it being true or false are roughly even.
    1. I’m undecided on whether this coin will land on heads or tails.
    2. I’m undecided on whether there are twelve ants on that plant.
  8. I can believe a claim without being 100% convinced it is true.
    1. I believe that my car will last another 2 years.
  9. I can disbelieve a claim without being 100% convinced it is false.
    1. I disbelieve that I will be in a car accident today.
    2. (It is more common to say “I don’t believe that I will be . . .,” but for clarity, we’ll say ‘disbelieve.’)
  10. If I’ve never thought about a claim before (or if I hear a claim that I don’t understand), then I neither believe it, disbelieve it, nor remain undecided. I have no position on it at all.
    1. Prior to typing this sentence, I had no position at all on the claim that there are twelve ants on a plant in my front yard.
    2. I have no position on whether all blorgs are quazzies.
    3. If I have no position, then I have no idea whether there is evidence for or against the claim.
  11. Believing that a claim is false is equivalent to disbelieving that it is true.
    1. I believe that “Tom is a kangaroo” is false. I disbelieve that Tom is a kangaroo.
    2. I disbelieve that triangles have four sides. I believe that “Triangles have four sides” is false.
  12. If I am rationally undecided about a claim, then I neither believe it nor do I disbelieve it.
    1. I am undecided whether the number of stars in the universe is even. Thus, I do not believe it, and I do not disbelieve it.
  13. Having a certain belief means that I believe in a certain claim.
    1. I believe that green is a color. I have the belief that green is a color. I lack the belief that green is a number.
  14. If I lack a certain belief, then either (i) I have never considered the claim, (ii) I disbelieve the claim, or (iii) I am undecided on the claim.
    1. Prior to typing this sentence, I lacked the belief that my dog understands Klingon. I had never considered whether he did or not. Now I disbelieve that he understands Klingon.
    2. I lack the belief that green is a number. I disbelieve that green is a number.
    3. I lack the belief that the number of stars in the universe is even. I am undecided about this claim.
  15. My experiences are a part of my evidence.
    1. My experience of seeing the legal pad as yellow is evidence for the belief that it is yellow.
    2. My experience of Cassie being friendly is evidence for the belief that she is friendly.
    3. My experience of my own thoughts is evidence for the belief that I exist.
  16. The testimony of others is a part of my evidence.
    1. Clark telling me that his shoes fit well is evidence for the belief that Clark’s shoes fit well.
    2. Julia telling me that she ate toast for breakfast is evidence for the belief that Julia ate toast for breakfast.
    3. Montgomery-Smith telling me that there is no known solution to Goldbach’s conjecture is evidence for the belief . . . (you get the idea).
  17.  My memories are a part of my evidence.
  18.  My perceptions are a part of my evidence.
  19.  My inferences are a part of my evidence.
    1. My belief that it will probably rain is supported by other beliefs (there are dark clouds outside, the temperature has suddenly dropped) and a logical inference that is made from them.
    2. My belief that every human has a mother is supported by my beliefs about human reproduction and a logical inference that is made from them.
    3. My belief that all triangles have three angles is supported by my belief that all triangles have three sides and a logical inference that is made from it.
  20.  Evidence can be misleading.
    1. Sometimes we remember incorrectly, misunderstand testimony, make faulty inferences, or have perceptual hallucinations.
  21. We should trust our evidence unless we have a good reason to doubt it.
    1. A good reason to doubt my evidence is either (i) that there was a problem with the source of the evidence, or (ii) an independent reason to think the evidence-belief is false.
    2. My experience of a frog in front of me is evidence for the belief that there is a frog before me. A good reason to doubt my evidence is that I recently took LSD, which makes perception unreliable. (It’s possible that there’s still a frog there.)
    3. My daughter telling me that the door is locked is evidence for the belief that the door is locked. A good reason to doubt this evidence would be that I tried the door myself and found it unlocked.
    4. Note that it would be circular reasoning (or mere contradiction) to claim that my evidence for the frog is bad because there is no frog in front of me, without an independent reason to think there is no frog.

* Some philosophers have argued that if a belief is formed automatically by my brain in an appropriate way (the way brains should work), then that belief is a good one, even without anything that resembles evidence in the usual sense.
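Statements 10–14 carve out four mutually exclusive attitudes toward a claim and define “lacking a belief” in terms of them. Here is a minimal sketch in Python (the enum and function names are my own, not the author’s):

```python
from enum import Enum

class Attitude(Enum):
    """Four mutually exclusive attitudes toward a claim (statements 10-12)."""
    NO_POSITION = "never considered or not understood"  # statement 10
    BELIEVE = "believe the claim"
    DISBELIEVE = "believe the claim is false"           # statement 11
    UNDECIDED = "suspend judgment"                      # statement 12

def lacks_belief(attitude: Attitude) -> bool:
    """Statement 14: 'lacking a belief' covers exactly three cases."""
    return attitude in (Attitude.NO_POSITION,
                        Attitude.DISBELIEVE,
                        Attitude.UNDECIDED)

# Every attitude except belief counts as lacking the belief:
assert lacks_belief(Attitude.NO_POSITION)
assert lacks_belief(Attitude.DISBELIEVE)
assert lacks_belief(Attitude.UNDECIDED)
assert not lacks_belief(Attitude.BELIEVE)
```

The upshot of statement 14 is that “I lack the belief that p” is ambiguous among three very different attitudes, which is worth keeping in mind in debates over which view is the “default.”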

3 Reasons Why You Love Click-bait

Click bait. The impossibly enticing headline. We love it the way fish love . . . whatever it is they love. (I’m not a fisherman.) Maybe like proverbial mice love the cheese in the trap. But the allure of click bait isn’t that visceral, like some leftover of evolution. It is intellectual, or at least cognitive. We bite on those juicy stories because they give us something our minds crave. I admit it—I feel the pull of those tabloid headlines when I’m standing in the check-out line, or scrolling to the bottom of a news feed. I think there are at least three reasons we love click bait.

  1. They tell us what we want to hear. Some people call this “confirmation bias.” We reach for and swallow these stories unchewed because they confirm our precious beliefs. Of course, the stories may (luckily) be true and actually lend support to our worldview, but to wolf them down like so many children of Cronus slowly corrupts our crucial ability to think critically. Still, the allure of stories that play our favorite ideological tunes is powerful.
  2. We’re suckers for sensational headlines and images. It’s why tabloids sell. We just have to know whether Angelina Jolie really had a bat-baby or whether that dolphin grew human arms! The possibility of the macabre and fantastical is magnetic. “Abraham Lincoln was a woman??!!”
  3. We forget that media outlets are businesses. If we kept this fact in mind, we would find those headlines far less appealing. Imagine the 19th century American villager titillated by the flashy, handsome snake-oil salesman who rolls into town. He looks impressive, and his claims are magical. But now imagine that same villager who later discovers that he was sold a bottle of sugar-water with no medicinal effects whatsoever. When that salesman rolls back through his town, his new skepticism shields him from the mesmerizing show. The bait no longer allures.

Consider two recent stories that generated plenty of clicks. One story explained how scientists have discovered genetic links between modern Lebanese and ancient Canaanites. The Canaanites were famous for being the unfortunate victims of ancient Israel’s attempts at genocide in Palestine—well-documented failures. But despite this readily available information, the typical headline read “Scientists Disprove the Bible.”

Another story exaggerated with equal flair. This one detailed how archaeologists found evidence for the burning of Jerusalem 2600 years ago. But “found some evidence for X” isn’t nearly as sexy as “proved X!” One headline read “Biblical Event Proven TRUE” and another actually announced this as proof of God’s existence!

So why do we love to click on these stories? Run each one through the three reasons above. Some people desperately WANT them to be true. Or the claims are so outrageous that we have to see if they’re real, like a carnival freak show. And most of us think of news outlets, scientists and religious folks as automatically worthy of trust, forgetting that the headline is probably the product of a business plan, rather than top-notch journalism.

How do we develop a healthy attitude of mild skepticism that will slow down our mouse button just enough to let a little evaluation squeeze into our media consumption process? For me, it came through trial and error. I bought enough snake oil and ate enough crow over the years that I became wary—wary enough to get some training and education to slow down my thinking a bit. Our brains are somewhat wired to do this:

[diagram: belief process]

But I know that I have to consciously discipline my mind to do this:

[diagram: belief process 2]

We have to work at slowing down our cognition enough to squeeze in this extra step. I confess that at least once, I shared an article on Facebook without even reading beyond the headline! (I’ve also shared a few “fake” things as a joke to see who would jump straight to acceptance.)

I should mention that you won’t get it right every time. You’ll get fooled occasionally, even if you’re careful. This happened to me just a couple weeks ago when I shared an article that looked totally unbiased, only to have a friend point out that the source, which was extremely biased, had “disguised” itself in the post. The goal should be to do our “due diligence” in thinking things over, passing judgment in the same circumspect manner you would want a judge to decide your own case. Have you seen a “click bait” headline recently? Share it in the comments.

How to Lose an Argument

I only hate losing when it comes to things I’m good at. I’m happy to concede a basketball game or a tennis match. But I hate losing arguments. Since childhood, I’ve relished a good adrenaline-surging verbal exchange. It’s probably one part genetic, one part environment. You know how most families have a variety of personality types who complement and balance one another? My parents, my sister, and I were all hyper-assertive, stubborn fighters. You adapt to survive. You learn to like it.

My wife, on the other hand, hates conflict. So that has been challenging. And just as she thought I might be mellowing out a bit, I went and got a PhD in arguing. I probably laughed more than most people when I read humorist Dave Barry’s essay, “How to Win Arguments.” He writes,

I argue very well. Ask any of my remaining friends. I can win an argument on any topic, against any opponent. People know this, and steer clear of me at parties. Often, as a sign of their great respect, they don’t even invite me. You too can win arguments.

Barry tenders such advice as, “make things up,” “use meaningless, but weighty-sounding words,” and “use snappy and irrelevant comebacks.” And though Barry’s piece is satire, he gets the point across in a witty, indirect way: don’t be an ass. I wish I could say that I’ve reined in my argument demon sufficiently and in time to avoid passing it along to my children. Now I’m just hoping one of them becomes a cutthroat attorney and cashes in on their pugilistic legacy.

But I suppose I have made a little progress, halting though it’s been. So here’s a bit of what I’ve learned. First, stop trying so hard to win. Good arguments don’t have to be winning arguments. As soon as you turn it into a battle or a competition, someone has to lose, and there is no one harder to persuade than the person reeling from your verbal violence. And isn’t persuasion what we’re after?

A little terminological clarification might help illuminate things here. The word ‘argument’ carries multiple meanings. One is relational. It is a (hopefully civil) discussion between two or more people who disagree about some point. Despite what you see on TV, or on the hidden cameras you’ve installed in my home, arguments don’t require yelling, insults and emotional outbursts. The other sense of ‘argument’ is a collection of ideas, put together in a logical way to support a conclusion. So, in this sense, we don’t have an argument with someone, we give an argument to someone. And since logic and truth reign supreme in a good argument, emotions, egos, and agendas must be set aside as much as possible. (Now, if you are a particularly adept ass, you can throw this in people’s faces. There’s nothing a highly-aggravated person loves to hear more than, “calm down!”) Here are some tips that promote good arguments, in both senses:

  • Believe you may be wrong – the foundation of all helpful dialogues; don’t even bother without it
  • Take the other person seriously – respect, seek to understand
  • Be a truth-seeker – are you honestly after truth, or just out to make your point?
  • Do your homework – don’t just expect people to take your word for it; have some evidence or research to back it up
  • Assertions vs. arguments – just saying “Joe is a moron” is not an argument
  • Know when to walk away – if you or they get too angry or disrespectful

If I could offer just one word, it would be humility. Humility of character and intellect. Socrates himself claimed that what set him apart from other people was that he alone realized how little he knew. Perhaps this is what motivated his method of asking questions (often called the “Socratic method”) in a dialogue rather than launching into a lecture or a monologue. Because of his humility and his logical, conversational approach, Socrates persuaded people. Well, except for the Athenian officials who executed him.

So instead of trying to win, think about persuasion as your goal. Just as “revenge is a dish best served cold,” persuasion is a dish best served warm and crispy with tasty seasonings and a colorful garnish. It may be cliché to talk about “win-win scenarios,” but when you and your conversation partner move closer to the truth through argument, everyone wins.