The Power of Puns

What do you call a blind dinosaur? A doyouthinkhe-saurus.

Puns hold great power. But it is a pernicious kind of power and one that should be avoided.

John Cleese (of Monty Python fame) maintains that there are three inviolable rules of comedy:

  • No puns
  • No puns
  • No puns

Now it's possible his objections are entirely aesthetic. Comedy really is a matter of taste. But I think there is something to it. While some people do enjoy the humour in puns, there is an inescapable groan-worthy quality to them that few would deny—it's just a different kind of funny.

I was thinking about this when I was thinking about the "Odd Problem of Sugar Cubes". I first encountered this problem in a Linear Algebra course at the University of Toronto, from a professor who reported that he had learned it in another Linear Algebra course he attended as a student at the University of Toronto. So forgive me if you've heard this one before:

How do you divide thirty cubes of sugar amongst three cups of coffee such that each cup contains an odd amount of sugar?

This is followed by the usual brain teaser caveats like, "no breaking sugar cubes into parts" and "no banishing cubes of sugar into alternate dimensions".

The "correct" answer (if that word is to have any coherent meaning) is that the task is impossible. An odd number plus an odd number is necessarily an even number and an even number plus an odd number is necessarily odd. So solutions only exist if the total number of sugar cubes is odd.
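The parity argument can also be confirmed by brute force. Here is a quick Python sketch (my own illustration, not part of the original puzzle) that checks every possible split of thirty cubes across three cups:

```python
from itertools import product

# Search every split of 30 cubes across three cups (0-30 cubes each)
# and keep only the splits where all three cups hold an odd count.
odd_splits = [
    (a, b, c)
    for a, b, c in product(range(31), repeat=3)
    if a + b + c == 30 and a % 2 == b % 2 == c % 2 == 1
]
print(odd_splits)  # → [] : no such split exists
```

The empty result is exactly what the parity argument predicts: three odd numbers always sum to an odd number, and 30 is even.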

However, there is another answer:

You place one cube in the first cup [odd], one cube in the second cup [odd], and then twenty-eight cubes in the final cup, which is an odd quantity of sugar cubes to have in a cup of coffee.

Unsatisfying? Yes!

While this may pass as a minor intellectual joke, and you can respect the ingenuity and cleverness behind it, the answer strikes all who hear it as cheating. But is this a legitimate objection? Doesn't it only matter whether or not the solution is effective?

The answer is, "Yes, this is cheating!" precisely because the solution does not work. One of the clearest signs that it does not work is that if you translate the problem into any language besides English, it becomes insoluble. And intuitively we know that if the problem is about properties of objects in the real world, those properties are independent of language. The properties of a sugar cube and a coffee cup should be invariant under translation from English to French.

There is a Chinese proverb which states:

The beginning of wisdom is to call things by their right names.

There is actually a rule in making a logical argument: you are not allowed to alter the definitions of the words you use in the middle. To break that rule is to commit the fallacy of equivocation. The idea that words keep their definitions over the course of an argument is central to all kinds of argument. If A implies B and B implies C, then A implies C. But this is only true if someone hasn't made sneaky modifications to what we mean by B in the middle. Otherwise the whole enterprise breaks down.

Think about how you would feel if you had a significant other who promised never to cheat on you, but felt free to be flexible with the definition of "cheat".

A rather goofy example appeared recently in Zach Weinersmith's Saturday Morning Breakfast Cereal:


If you want a truly frightening example watch this excerpt from a recent episode of Last Week Tonight: 

If you skip to the 13:30 mark, you'll see someone try to claim that "Uh huh" doesn't strictly mean "yes" so that they can deceive an insurance company into paying for prescription opioids.

This is not to advocate for linguistic prescriptivism, where you have a committee which decides once and for all what words must mean and resists any effort to change them. "Wicked" can occasionally mean something good or something bad, and that definition can change, but not in the same conversation.

But this kind of strategic redefinition of words is not about any kind of intellectually honest problem solving. It's more about cherry picking and massaging the facts to agree with a pre-determined conclusion.

What brought this on?

The appearance of the comic and the newscast coincided with my coming across a solution to the sugar problem which actually works. (Thanks to Richard Wiseman's 101 Bets You Will Always Win.) Instead of messing with the definition of "odd", we mess with the definition of "coffee cup": make it a paper cup (as at Tim Hortons or Starbucks) so that one cup can fit nested inside another. Arrange the cubes so that you have two cups containing odd numbers and one containing an even number. (The 1-1-28 split from above works here.) To solve the problem, lift one of the odd cups and place it inside the cup with the even number of cubes. Now this cup contains an odd number of cubes (albeit with a cup in the way) and so does the one above.

It's not a funny solution but it is an effective one.

The Lesson

When trying to engage in problem solving and critical thinking, the solutions you explore need to be grounded in reality. There is a (very dangerous) kind of magical thinking where we can be led to believe that by changing the name of a thing, we can change the properties of a thing. In the real world it's not possible to define your problems away.

Along these lines I was shown this "math" problem (taken from this clip by #Mind Warehouse).


The solution (in case you're not willing to watch the video... spoiler alert) is that the "equals sign" is counting the number of holes in the numerals of the numbers. (0000 = 4, 1111 = 0). Setting aside the enjoyment and satisfaction you might feel in treating this as a brain teaser and staring at it for an hour or two, as a "problem to be solved" it fails to grasp how numbers actually work. The shape of the numeral isn't a meaningful property of the number. In fact, it's entirely arbitrary, an accident of history, what shape they take. The shape of the digits in 200 is meaningless. That's just the way we've agreed to represent, as a convenient shorthand, a pile of two hundred somethings. The answer changes if you change to a different set of numerals (or if you insist, Roman numerals).
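The trick can be made explicit in code. This is my own sketch, not from the video, and the per-digit hole counts below are an assumption that depends entirely on the typeface, which is exactly the point: the "rule" lives in the shapes, not in the numbers.

```python
# Holes per digit in one common typeface. Digits like 4 vary by font
# (an open-top 4 has no hole), which is why this is a property of the
# drawing, not of the number itself.
HOLES = {'0': 1, '1': 0, '2': 0, '3': 0, '4': 0,
         '5': 0, '6': 1, '7': 0, '8': 2, '9': 1}

def puzzle_value(digits: str) -> int:
    """Count enclosed loops in the written numeral, ignoring its value."""
    return sum(HOLES[d] for d in digits)

print(puzzle_value("0000"))  # 4
print(puzzle_value("1111"))  # 0
```

Change the font (or the numeral system) and `HOLES` changes with it, while the numbers themselves stay exactly what they were.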

You can't pursue this kind of thinking in real world problem solving. Once you do, you may as well just redefine bankruptcy as victory and borrow your way to a successful business.

Mystery Solved

On my second day at the University of Toronto, in a course called Introduction to Proof (which really was a life-changing course that I hear they stopped offering), the professor gave this question (actually a variation with 100 people and no aliens) and (owing to the fact that all math teachers are inherently creatures of pure evil) neglected to provide the answer.

In the dozen or so years since, this is the first time I've seen that problem, so here's the answer. Now you don't have to wait quite as long as I did.

Logic is overrated

I was having a discussion with a person of faith who was trying out the latest intellectual arguments for demonstrating that Christianity is the world's one true religion, and I was trying to explain to him (shockingly) that logic is overrated. As a brief bit of catch-up for those who don't keep up with this sort of thing, the current trend is to use pseudo-logical arguments. They're decades past the point of trying to use empirical evidence to make their case. They've quietly admitted there is none and moved on. God doesn't interact with the world in any measurable way. No studies have ever produced any evidence that intercessory prayer has any benefit above and beyond placebo effects. And the wrath of god we're all supposed to be afraid of doesn't exist. Natural disasters and diseases deploy in an entirely materialistic way. If god were using tsunamis and earthquakes to wipe out sinners, his aim would be terrible, obliterating innocent children in locations geographically unrelated to the sin in question. And no one has ever been able to provide the slightest hint as to what souls are made of, or where they go after death (is heaven in outer space; a very spacious hotel on the far side of the moon?).

The new trend is to talk about vague, ill-defined philosophical concepts. In particular, this conversation revolved around how "atheists can account for things like morality, value, purpose and the laws of logic without a God?" [The Christian] can provide reasons for why it's wrong to kill people.

After a period of interacting with these arguments, it becomes clear that they aren't there to convince anyone. These "arguments" exist to provide busy work for your interlocutor, making it more difficult for them to get out their own arguments, so a believer can sit back and believe in peace. What really gives this away is the part about the laws of logic. The argument, expressed in its most blunt and cheeky form, is: "I don't have to consider any of your arguments until you can explain to me how logic works from first principles."

There are two problems with this question. A little bit of science will show that humans have existed as a species for something on the order of a hundred thousand years, plus a few million years since we shared a common ancestor with modern-day apes. Depending on how you want to pin down "logic", what we currently use as the laws of formal logic is between 300 and 3000 years old. So at the most conservative (3k out of 100k), it took 97% of human history to work our way up to something called logic. Asking someone you just met to explain it to you quickly and for free is slightly rude. It's sort of like having someone pour you a cup of tea, quietly pouring your tea back into the teapot, then demanding indignantly to know when tea will be served.

The other immediate problem is that humans clearly accumulated a great deal of knowledge in those 97+ thousand years. Where was this knowledge coming from if these "people" weren't thinking logically? The answer to that question goes a huge way toward answering these "unanswerable" questions posed by the religious. A large part of it is the privileged position given to "logic" in explaining how we know what we know. If it turns out that logic isn't actually the source of most of our knowledge, the demand to "account for" it loses most of its force.

There are a few philosophical background items to deal with before going further. Patience, grasshopper!

Abandoning Platonism

Platonism is the belief that abstract concepts "exist" in some metaphysically necessary way. If you ask a normal person whether it's possible to touch "the number eight" in the same way it's possible to touch "an apple", most will misinterpret the question and assume you're talking about a piece of paper with the number eight written on it, or a pile of eight grapes you can hold in the palm of your hand. If you ask a Platonist, they will say yes, although most philosophers today will agree they are wrong. There doesn't appear to be a literal sense in which "the number eight" or "perfect circles" or "objective moral values" exist in the sense of having a material composition and acting causally in the universe. "Two" is a useful word to have, in the sense that it names the trait shared by a pair of rocks and a pair of velociraptors, but it doesn't have existence in the literal sense.

Platonism sneaks into religious arguments in subtle ways. The presumption is that "moral values" or "objective purposes" exist in a Platonic sense, which would imply that someone has to have created them. But if you reject their Platonic existence, which seems reasonable, then you lose the need for them to be "created".

Most religious people are Platonists without realizing it. You can tell by the way they use language and construct arguments. Abstract concepts are constantly treated in logical terms as though they were magical energy fields: the kind of bright glowing goo from fantasy films that leaks out of the magical crystal and slips in through the eyes, creating a demon-possessed villain. Good and evil are often spoken of as entities or forces with anthropomorphic properties, or as physical objects that you could eliminate. It's as though evil were something you could physically extract and shoot into the sun, the way you filter bacteria out of your drinking water.

Abandoning Essentialism

Essentialism is the compulsive sorting of things in the universe into non-overlapping classes. Intuitively we want to think of a person as a "good person" or a "bad person". Ellen is a "good person" and Hitler was a "bad person". But when someone forces you to step back and accept a particular individual as a complex collection of attributes, some of which are "good", "bad" or "neutral", you realize that your definition of "good person" is really just an arbitrary cutoff for having "enough" good characteristics and for our ability to get over the "bad" ones, and that we'd have a really hard time trying to defend our choices rationally. We usually recoil, say "you're just quibbling over semantics", and try to push the conversation forward on our ill-defined terms so we can go back to the comfortable black hat/white hat universe we started in.

Most people, myself included, find this kind of thinking emotionally unsettling. Deep down we have a strong desire for everything to be fit-able into a clearly labeled box. We want to know "that's a conservative", "he's pro-choice", "she's a determinist". People who are difficult to classify tend to irritate us. Adam Rubin recently posted a long-form interview with a gay conservative. Our essentialist instincts demand that such a person is a contradiction in terms, like a married bachelor. But he's there, and his opinions are real, and that instinct, while it might be generally useful, isn't guaranteed to be true.

Richard Dawkins provided his reasons for why essentialism is a pernicious bias in his answer to one of the 2014 Edge Questions. You can read his full article for free here.

Unnecessary visual aid.

Not being comfortable with this kind of thinking leads to both edges of the "slippery slope" sword. The first says that if we have one opinion about a situation at one point on the slope, we will be incapable of having a different opinion about another point somewhere else on the slope. The other is simply to deny that a slope exists at all and to claim, a priori, that the world exists as a set of discrete flat plateaus, like a Mario platform level.

To be able to reason about the world, we need to become comfortable with reasoning on a sliding scale. Glenn Gould was a pianist. If ever there were a definition of "pianist", Glenn Gould would have to be in it. (I chose this example because I recently performed at an event at the Glenn Gould Studio in downtown Toronto, where every single room backstage contained at least one piano.) But is a high school student taking piano lessons, sitting at a keyboard, also a pianist? Clearly "pianist" is a definition on a sliding scale. But is Glenn Gould more of a pianist than the high school student, or are they equally pianistic but of different kinds? What about a cat who steps onto the keys of a piano? Clearly not a pianist, because there's no intent to make music. But on a continuous slope you can find a point between any two points, so what about a toddler who wanders by a piano and strikes some keys to experience the novel sounds? He's creating music, albeit very unskilfully, so can we call him a pianist? You're not going to be able to decide based on the quality of the music produced, because art is subjective.

Our inability to demarcate the exact line where pianist-ness ends doesn't force us to say that any life form that happens to be in the same room as a piano is a pianist.

When you combine this non-essentialist thinking with our freedom from Platonism, you can see that logic and reason aren't metaphysical entities that can be delivered through revelation (say, by listing a set of logical axioms on a stone tablet delivered on a mountaintop). The "logic" and "reason" under discussion when we are asked to justify logic and reason are words that aren't pointing to anything real in our universe. They're philosophical unicorns: things we can picture in our minds but that aren't really there.

Abandoning Teleology

Teleology is about planning and end goals. The unstated premise is that things can only work if they were constructed deliberately by a conscious agent with forethought. Minds impose order on a world that would otherwise be complete chaos. Interesting things can only be the result of thoughtful planning. Nothing good (and "good" can have any manner of wishy-washy definition here) can happen by chance.

This almost sounds reasonable, but it's the result of a rather simple fallacy: the arrow of inference is simply pointed in the wrong direction. It's true that if you engage in thoughtful planning, you can construct some pretty impressive stuff (building ships, writing books, painting paintings). If you invert the arrow (something our minds do quite readily, because we don't readily differentiate between correlation and causation), the argument becomes: if something appears impressive (think of the watchmaker analogy), then it was the product of design and planning. And if you take the baby step of mushing the two together, you get the unwarranted stronger statement that impressive things can only come to be as the result of design. It's the "can only" part which allows you to make the leap that there must exist a designer. Otherwise you only get the much wimpier "there could be a designer out there, maybe".

But saying design works as a method for creating complex systems is a far cry from establishing it as the only way. If a meal is tasty, does it matter whether it came from a chef with decades of experience in a high-end restaurant or from a ten-year-old who threw a bunch of ingredients into a pot and got lucky? We can say things about which process is more likely to produce tasty food. We don't get to claim that tasty food can only come from high-end chefs.

But that's precisely the argument I encounter constantly. It's a prior assumption that difficult questions require complex, top-down, magical answers. The only way to have purpose is for someone up above to sprinkle magical purpose powder, and in the absence of such powder we are forced to declare life empty and purposeless. It's actually an overlap of Platonic purpose and teleological thinking. And it forms a strange circular argument: anything short of a divine-miracle-level answer to the question will be rejected, because it's been established in advance as a question which requires a divine-miracle-level answer.

In The Blind Watchmaker, Richard Dawkins referred to this as the argument from insufficient imagination. We now know that all sorts of interesting and complex systems can arise by combining simple systems. Appreciating this requires a strange inversion of reasoning, the most popular example of which is Darwinian evolution by natural selection.

Learn what "Chance" really means

Humans are notoriously bad at understanding probability. I just finished reading Statistics Done Wrong: The Woefully Complete Guide, which outlines how frequently scientific papers published in peer-reviewed journals get statistics wrong in significant ways. If the people who are supposed to have received formal training in the subject get it wrong, what chance are the rest of us supposed to have?

But most people confuse three things:

  1. Mindless natural processes
  2. Complex systems whose future behaviour is difficult to predict
  3. Non-deterministic systems whose future behaviour is impossible to predict (being unpredictable is almost the definition of non-deterministic, but I went with the clunkier phrasing for clarity)

If you think of that as a single class of phenomena, it's easy to see why people view large parts of the world as magical intractable mysteries. But taken separately, it's more manageable.

The first class can easily be understood in terms of the experiments you likely saw in science class. You put a ball at the top of a hill, it rolls to the bottom. There is no choice on the part of the ball or the hill. The combination of ball, hill and gravity just produce the result of the ball winding up at the bottom. You wouldn't try and say that the ball's purpose is to reach the bottom of hills or that the hill's purpose is to escort balls to its bottom (my gosh that sounded dirty!).

The third class, non-deterministic systems, simply does not exist. As far as we know, the universe is deterministic down to the last particle. There is one set of physical laws from which there is no escape, and those laws guarantee that the future can always be predicted accurately; the only limits are your ignorance and uncertainty about the present situation. Think of a gunman whose hand is shaking: your inability to predict where the bullet will land does not arise from the bullet defying the laws of physics.

The first class, then, would seem to describe the entire universe. If everything in the universe is made up of particles that must follow a uniform set of unchanging physical laws at all times, then everything in the universe is simply determined by the interactions of those particles according to the physical laws. There is no magical mind element that enters the universe at any point. So a super-intelligent creature with enough computational power, who knew enough about the current state of the universe, would be able to predict the future perfectly with no uncertainty.

For historical reasons, this creature is usually called Laplace's Demon. The objection "Hey, wait, but quantum mechanics is indeterministic" is irrelevant. There is a fundamental unpredictability in individual interactions, but it is washed out by the absolute predictability of the amalgamation of large numbers of them. (You can't tell where one electron will strike the screen, but you can describe the distribution of the behaviour of billions of electrons with mind-boggling accuracy.) And in practical terms, the amount of quantum uncertainty is dwarfed effortlessly by ordinary measurement error. That's why it took until the 20th century for anyone to notice that quantum phenomena were even a thing.
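The point about individual unpredictability being washed out in the aggregate is easy to demonstrate. A minimal sketch of my own, in Python: one coin flip is anyone's guess, but the average of a million flips is pinned down very tightly.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# One flip: genuinely unpredictable in advance.
one_flip = random.random() < 0.5

# A million flips: the aggregate is predictable to high precision.
n = 1_000_000
heads = sum(random.random() < 0.5 for _ in range(n))
print(heads / n)  # within a fraction of a percent of 0.5
```

The standard deviation of the sample mean shrinks like 1/sqrt(n), which is why billions of "unpredictable" electrons produce a distribution you can describe with mind-boggling accuracy.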

The second category is where the problems come in. You can't predict the future, but you can make guesses about it, and you can quantify how right you think you're going to be. This is the familiar phenomenon of a weather forecaster saying there is a 50% chance of rain tomorrow. People get freaked out by uncertainty and often overestimate its importance. The forecaster is saying it might rain and it might not, and equal chances of rain/not rain can make it seem like he doesn't know anything at all. But saying there is a 50% chance of rain actually conveys a massive amount of certainty about the world: certainty that there won't be a hurricane, and that what falls will be water droplets instead of marshmallows or donkeys.

So when the religious use the phrase "just random chance" pejoratively, as though it were a no-holds-barred, anything-can-happen situation, they're seeing only the uncertainty and ignoring all of the certainty we do possess. When you roll a die, you're certain the result will be a 1, 2, 3, 4, 5 or 6 (and that the die won't turn into a Pokemon or a black hole). When you actually consider what the words mean, random chance is a source of very little uncertainty.
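To make the certainty hiding inside "chance" concrete, here's a small sketch (again my own illustration): ten thousand simulated die rolls. Which roll comes next is unpredictable, but the space of possible outcomes is completely certain.

```python
import random

random.seed(7)
rolls = [random.randint(1, 6) for _ in range(10_000)]

# No individual roll is predictable in advance...
# ...but every single roll is guaranteed to land in {1, ..., 6}.
print(set(rolls))  # {1, 2, 3, 4, 5, 6}
```

"Random" here constrains the outcome enormously; the uncertainty is confined to picking one element from a small, fully known set.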

This makes phenomena like Darwinian natural selection very easy to appreciate because they're not "random" in the anything-can-happen sense. They're just unpredictable in the sense of not knowing whether your coin will land heads or tails.

Learn Logic

Formal logic is a highly artificial system constructed for organizing our thoughts. It's a wonderful way for groups of people to get together (or get together metaphorically by putting their work on paper for others to read) and express their thoughts and review them in a methodical way to weed out as many errors and false beliefs as possible. But it's not the way most people think and it's not the way most people generate new knowledge.

Logic works as a set of rules of inference. It has three working parts (plus some accessories):

  1. A
  2. B
  3. If A is true then B must also be true

Now any of the three may be true or false for any given choice of statements. It's easy to see how individual statements can be true or false, but if/then statements can be true or false as well. You could make the if/then statement "If an animal is a bird, then it lays eggs to reproduce", which is true. You could also make one like "If an animal is a whale, then Justin Bieber is the president of the United States", which is false.

Logic helps us to organize our thoughts so that if we agree that "if A then B" and that A is true, we also agree that B is true. In this example, if both parties agree that if an animal is a bird then it lays eggs to reproduce, and that the animal we're discussing is a bird, we both agree that it's not possible for this bird to reproduce without laying eggs.
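The inference pattern just described (modus ponens) can be checked mechanically. A minimal Python sketch of my own, treating "if A then B" as the truth-functional conditional:

```python
from itertools import product

def implies(a: bool, b: bool) -> bool:
    """Truth-functional 'if a then b': false only when a is true and b is false."""
    return (not a) or b

# Across every possible truth assignment, whenever we grant both
# A and 'if A then B', B is forced to be true as well.
for a, b in product([False, True], repeat=2):
    if a and implies(a, b):
        assert b  # never fails: that's modus ponens

print("modus ponens holds in every case")
```

Note what the check does and doesn't establish: it verifies that the rule preserves truth, not that any particular premise about birds or whales is true in the real world.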

There is a very subtle distinction here which gets missed. This doesn't imply that it's actually impossible for a bird to reproduce without laying eggs. New evidence could convince us that a species of bird exists that can do just that, and therefore that our belief "if A then B" is false. Our agreement that A implies B is not a guarantee that A actually implies B in real life. Logic can't produce any new knowledge about the world. The best it can do is alert you when new knowledge conflicts with knowledge you already had.

You can only use logic as a persuasive tool if you start from mutually agreed premises. There's a macho thing which happens in philosophy where people try to reduce these statements (the As, the Bs and the if/thens) to the minimum number possible without losing any information. It has marginal utility because logic operates like a computer program, garbage in, garbage out, so by reducing the number of starting assumptions you reduce the number of things people could possibly disagree over. Ultimately, whether or not we accept premises comes down to observations of the real world.

Several times I've had people attempt to repudiate this point. Eventually, I discovered that they actually agreed with me, but that they had taken everyday experience and mislabeled it as assumptions or presuppositions. So, for example, they claim that statements like "everything which begins to exist has a cause" (a premise which is demonstrably false, but that's a discussion for another post) are things they "just know", when these are actually generalizations from a very large number of empirical observations made throughout their lives. Douglas Adams famously satirized this point with his ultimate computer Deep Thought, who was able to start from the premise "I think therefore I am" and deduce the existence of rice pudding and income tax. Logical arguments can't be persuasive unless the premises have strong empirical support.

Why is this so painful?

These conversations are difficult to the point of being painful because they are often (necessarily) had with people who lack the formal training to know that they don't understand the topic they're speaking about.

When most people use the phrase "let's think about this logically" what they actually mean is "let's make a series of educated guesses until we arrive at a conclusion which is compatible with our intuitions and then claim success." What they think logic is and what logic actually is are most often very different.

The image we have of logic is usually a caricature in the form of Sherlock Holmes or Mr. Spock. These are portrayed as hyper-rational, robot-like humans who also have the benefit of a huge storehouse of background knowledge and impeccable powers of observation. Most people, when they observe an actor in the role of Holmes, are honest enough to recognize that their brains don't operate like that. And often the representation is misleading. Most people think that Sherlock Holmes solves cases through deduction, which he doesn't; his method is closer to what logicians call abduction, inference to the best explanation. "Deduction" is one of those words that means something different from what most people think it does.

Our use of logic

Think about as objective an empirical fact as you're likely to find: the earth is round. Did you discover that information through logical thinking? The answer that few people are prepared to admit is: absolutely not. You believe the world is round because when you were young, someone pointed to a picture or a globe and said "this is where we live", and you accepted it as true. You have likely never flown around the world, never been to space to see for yourself, and you never asked the person who told you to cite a peer-reviewed academic source to support the assertion that the planet we live on looks more or less like that globe.

Of course, I'm not trying to deny that the earth is round, and I'm not trying to deny the value of relying on information provided to you by others. There are good reasons to infer that all the people telling you the earth is round are not lying. But when you learned the fact, you were essentially taking the information on faith. Because the fact happens to be true, we go back and rewrite the narrative and think that we believe the earth is round because of all the reasons (sails of ships disappearing over the horizon, photos from NASA, etc.). You're altering history, imagining yourself as being more logical and rational than you actually were at the age when your parents first told you that miscellaneous fact about the globe.

As with the teleological problem above, yes, it's possible to arrive at correct beliefs by working them out through pure Spock-like reason. But it's also possible to guess the answer and happen to be right. And once you have the correct belief, you get all the benefits of the belief regardless of what mechanism generated it. It's the evolution argument in an intellectual form: you don't need someone to sit down and design an eye to wind up with a working eye.

The "guess and check" method of reasoning we all explored when we were younger actually accounts for a large part of human knowledge. Most of that hard-won knowledge has since been filed under the term "presupposition" by people who forgot how we came to know it in the first place.

Most of us feel it's important to be thought of as reasonable and rational people. So the admission that we are fundamentally irrational creatures seems unutterable. It's also deeply disturbing to people to realize that we, as a species, seem to be doing rather well considering our collective inability to reason properly.

Four thousand words later, and this is barely a dent in the surface of an area which people dedicate their lives to studying. The point is to show that concepts like logic aren't mystical, monolithic, metaphysical entities handed down on stone tablets from on high, and that this so-called "failure to account for such-and-such philosophical concept under materialism" is an imaginary problem invented to require a magical solution.

These aren't genuine criticisms of "atheism"; they're arguments clad in the guise of philosophy, unleashed in the hope that people will go away and not come back.

The Atheist's (alleged) Nightmare

This is another one of those things that I didn't think I would be spending time on. A polite, cute young Christian (pictured in the video below) asked both the internet in general, and me specifically, whether I would be willing to post something to refute some of his arguments.

His entire channel is a wonderful source of intellectual stimulation; a kind of academic whack-a-mole. Matt is a Young Earth Creationist, meaning he takes the bible as the inspired and inerrant word of god to Her glorious creation.

Christians will ideally twitch physically when I do things like refer to their deity with a female pronoun. If you're going to call an entity an "intelligent designer", why would it get a male pronoun?

Now biblical inerrancy is a problematic position to hold. Claiming to be infallible is a tall order to live up to: if someone were to discover a single mistake, the bubble is burst, and then it's all downhill as readers feel free to pick the text apart and dismiss any parts they like. So creationism requires constant vigilance; if one inconvenient fact slips through, the entire edifice might come crashing down.

I take the position of biblical literaturism, which means I think it should be read in the same way as Shakespeare or Harry Potter. You can extract "truth" and "wisdom" from it in the form of elegant ideas which stand on their own without granting the entirety of the text any unwarranted significance, or requiring any of the events told within to be historical fact. Hermione Granger can be a role model without being a real person.

Matt has a major problem because it's hard to be less inerrant than the bible (it was the bronze age; they didn't know better), and Matt is stuck in the unenviable position of being forced to deny massive amounts of established science outright; something which is especially hard to do without any formal training in math and science. It's the equivalent of trying to incubate a chicken egg on the tip of the CN Tower; it's windy and it's a long way straight down. With that much science denial, it's difficult to get a foothold to reason from; to find a piece of common ground to start at and work from. And you're unlikely to be able to sustain a discussion with them long enough to convince them of enough science to build up a consistent picture. You have to deal with issues in isolation (hence my whack-a-mole analogy above).

He's chosen the most difficult position to start from; which I'm eager to try and answer, more for the intellectual stimulation of it.

Once, when speaking on free speech, Christopher Hitchens asked the rhetorical question of how you would debate someone who believed the earth was flat. It's not so simple a question. On short notice, the best any of us could probably come up with is, "It just is, you moron... Trust me." It's even worse now: you could show him a photograph taken from the space station, and he could reply quickly and confidently with "Photoshop." I feel this will be like trying to prove I'm not Keanu Reeves in The Matrix.


I'm going to ignore completely the question/s as posed in his video/s. I hope the reader will agree, and I'm sure the cute Christian will concede, that he was never looking for a serious answer, but rather to stir up some controversy to draw an audience for some Christian preaching (or at least some making-fun-of-atheists... notice the request that people participate and respond, adding fuel to the fire.) The question as posed is far too vacuous to offer a serious response. This may go on at some length; I am, after all, refuting an entire world view. I'll be passing over a lot, so if there's something you'd like me to elaborate on, or you want to offer a correction, I'll try my best in a follow-up post of some sort.

I'm not going to try to undermine the bible - reading it is more than sufficient to do that.

I also want to say that I am not addressing his position in a direct way. I fully intend to approach it sideways, from behind and from the inside out. In other words, I'm a magician: I cheat. Lector emptor.

I want to make it very clear that I recognize that "Christian" is a broad umbrella term for a very heterogeneous group of people. Some Christians believe the Bible is literal, others don't. Many make use of an obscure and ad hoc system for excluding parts of the Bible that are embarrassing. Some Christians adopt secular morality (for things like marriage equality) and others don't. A portion accept Darwinian Evolution by Natural Selection; others reject it; and yet a third group tries to accept common ancestry but replaces the mechanism with some kind of divine guidance. I'm sure a large fraction of Christians would laugh at this gentleman as much as, or even more than, my fellow heretics and I do.

I also don't want it to sound like I am insulting Matt's intelligence. I learned a long time ago that people don't reach different conclusions because they are more smart or less smart than you are. They come to different conclusions because they have access to different information. When I watch defenders of religion, they appear to me as holy as Swiss cheese. But I'm the exception, rather than the rule. I'm a confessed math and science geek and I've been absorbing science like a sponge since my teens. Unstated assumptions glow like radioactive three-eyed fish to me and I can spot bad logical inferences in my sleep. (I have faults to balance this out... under no circumstances should you invite me to sing, draw or play any kind of team sport.) The result is that when someone like Matt makes statements about the history of science or the current scientific consensus, I almost always know when he's made an error.

I maintain that Matt is coming to very silly conclusions because he is lacking fundamental information, both about specific branches of science, and how science works in general. He's also probably been told untrue things by people he trusts, like if he were to change his mind, he would be subjected to everlasting torment in a sea of fire, or suffer a more practical punishment like his family would refuse to pay for his education. If he were operating under that kind of ideological blackmail, I couldn't really blame him for not wanting to change his mind. I'm not sure I would be able to. But again, I'm able to let my curiosity run rampant and read about anything I want and don't have to feel bad about following evidence wherever it may lead. If he were to take the time to do some serious study, by which I mean take a complete work on a subject that interests him like biology or cosmology, supported by a nice grounding in math and statistics so he could learn a bit about the rules of inference and Bayesian reasoning, then he could really start to discover some amazing things that would rock his intellectual world.

Now Matt has been the recipient of widespread ridicule on the internet, and this is as it should be. This is the price you pay for freedom of speech. Matt has every right to say what he believes is true (although he should be denied the right to say so in a science classroom for various legal and practical reasons), and people who find him amusing have the right to say so. This is the tradeoff. And since it is highly unlikely I'll be able to reach the end of this essay without making fun of him, I want to be clear that I'm mocking an individual, not an entire religion. Besides, he's been at this for a few years, seems like a good sport, and I'm sure he can take it.

Presuppositional apologetics (in brief)

Young-earth creationists accept the bible as the inspired and inerrant word of the creator of the universe, and therefore a reliable reference source on matters of fact (like who created the universe, how many days he took to do it and the approved technique for sacrificing baby goats.)

Presuppositional apologetics is the additional position that an explanation is needed for how we are able to reason; otherwise our reasoning will be faulty or unreliable. They contend the only way to surmount this obstacle would be if a being with ultimate knowledge offered revealed truth. They claim that the creator entity is a necessary assumption to make logical reasoning internally consistent. This being is necessarily Jehovah, for reasons that are never well articulated.

Note that apologetics here has nothing to do with saying you're sorry. This form of the word loosely means "defence" or "defender", so he's defending his beliefs (from people like me). Equally important, it should not escape notice that defending a position is not equivalent to seeking the truth. Apologists are like government lobbyists; they only collect a paycheque if they reach the right conclusion. It is their job to steer you away from evidence which might contradict their position. The term "spin-doctor" springs to mind. The bible must be explained! To conclude that it's just a piece of ancient literature is an unacceptable outcome for them.

As a magician I know a thing or two about leading people to false conclusions. This brings me to James' First Law of Logic: Any statement can be made to sound plausible as long as you're willing to ignore the evidence against it. 

Presup (because I'm too lazy to type the entire word) is the new in thing for young-earth creationism. Eric Hovind, the head of a prominent anti-evolution "ministry" used to use an evidence-based approach to dispute evolution and promote Christianity (usually by quoting scientific sources out of context. He even won the coveted Golden Crocoduck last year for some grotesquely bad math.) He has switched to presup as his primary debate technique. Matt seems to be taking his lead, and jumped on the presup bandwagon.

This is significant because Eric is the son of Mr. Kent Hovind. (If anyone wants a copy of Kent's doctoral dissertation, I'll send it to you - it's the funniest thing I read this year.) Hovind the Father was basically the lord high mucky-muck anti-evolution debater - to see him in action was truly astonishing. That Eric would switch tactics away from his dad's stuff is shocking. (Interesting tie-in: the font on all of Kent's powerpoint slides is the same one that wound up in the Abracadabaret logo... I hide my head in shame.)

The presup tactic is very appealing for a debater. Because it deals with the very subject of reason itself, it makes it very difficult to reason about. On the one hand it's obviously silly, but I'm not sure if it's possible to refute it concisely without first priming the listener with a few years of formal philosophy training.

It also gives the Christian a very curious debating point:

God is necessary for reason; therefore any argument you make which involves reason implies you accept the existence of god. You're speaking. Therefore god exists.

Now, while he may dress it up in a different grammatical guise, this is Matt's main premise. Stephen Law described this tactic as "going nuclear." It's a debate stopper. Regardless of whether or not you agree with it, once reason itself is in question, it's almost impossible to carry on any kind of productive discussion about anything.

This is the point and the reason I want to address the argument.

No sensible human being is going to find the argument convincing. No non-believer is going to hear this and say, "Oops, I guess I was wrong about this god lady. Sign me up for my baptism." (Or at least I've never heard of one... if I'm wrong about this, please let me know.) Conversely, anyone who has his head stuck so far up the bible as to resort to it is unlikely to embrace rational empiricism anytime soon.

Nevertheless, the challenge needs to be addressed precisely because it stops the conversation dead. Which means it's a powerful means to insulate indoctrinated Christians who might be in danger of being educated. "You can't explain how reason works, so you can't explain [insert subject here]." For the already-faithful, it gives one more layer of armour that stands in the way of that pesky thing called learning.

Going around in circles

Its fundamental design flaws are completely hidden by its superficial design flaws. -Douglas Adams

There are two superficial problems that would make it easy to dismiss the argument. I'm slightly more stubborn and will give more, but these two are sufficient for a casual, "Get lost creatard."

The first is that the argument is circular: god exists because it says so in the bible, and the bible is true because it's revelation from god. While it's possible for the premises inside a circular argument to be true, you can't use them to convince anyone. So the rational person is correct in concluding they're probably wrong in the absence of evidence.

The second problem is that the argument is missing pieces. This is the deism/theism problem.

Deism, or a deist god is the god of Einstein or Spinoza - a designer that sets the machine in motion but lets it run without interference. Theism is a god that gets involved in the world, answers prayers and decides who gets into heaven.  

All of the arguments that don't rely on personal experience (cosmological, moral, ontological, teleological) only get you to Deism. They're also not that convincing; barely held together by bad grammar. To establish an intervening god requires more arguments (and those are even worse). It's supremely arrogant to throw out a bunch of arguments for a god and then throw in as an afterthought, "and of course, it's our god... naturally!"

There's a final superficial objection that, while not strictly logical, is fairly compelling: this is now their best argument. If this is the best you've got... pu-lease. It's basically a confession, up front, that they have no evidence. A leading proponent of presup, Sye Ten Bruggencate, has even admitted that it is unwise to defend Christianity based on evidence. He will even confess it's crazy (talking snakes, magic fruit) and only makes sense if you assume that there's a god behind it.

If the bible is to be believed, there would be a list of evidence that we would expect to see ahead of this circular flotsam about reason. We would love to see a scientific study that shows Christian prayer is measurably more effective at healing the sick than any other religious prayer or no prayer. We would love to see medical miracles happening to Christians: regrown limbs, resurrections (which were a relatively common occurrence in biblical times). We would pay to visit the flock of Crocoducks created specially and living in Ray Comfort's swimming pool, proving not only god's power, but also her sense of humour.

Drifting Deeper Into Madness

Above were relatively quick, superficial objections to the presuppositionalist. But they are just criticisms and don't replace the trashed beliefs with anything constructive, so they tend to leave me feeling unsatisfied. I hope these will appear a bit more profound.

The first observation is something drawn from computers (shout out to Dan Dennett's book). I'm sure you could argue this on purely abstract terms if you wanted to, but a concrete example is better. What computers teach philosophy is that you don't have to understand how a tool works in order to use it. Imagine a carefully constructed machine, something like Charles Babbage's Difference Engine, and imagine setting it up to perform a simple task like long division. It's a machine and works by whatever principles it works by (I certainly don't know how it works), but if it can accomplish long division, then it works. Its origins become, as they say, purely academic.

Now, I agree it would be better to know how and why it works. I'm sure that if you knew about its inner workings, you might be able to do something creative, like generalize it and build a multiplication machine, or a logarithm machine, or an iPhone. You would also be in a better position to fix it if it broke. The machine may also have limitations. For example, it may only be able to divide numbers up to eight digits long, or it may only be able to give answers to three decimal places. You might be able to deduce that if you knew how the machine worked; otherwise you'd have to infer those limitations after watching the machine in action for a long period of time.
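To make "a machine that divides without understanding" concrete, here is a minimal sketch, assuming Python and using repeated subtraction; the function name and the mechanism are my own illustration, not anything Babbage actually built:

```python
def divide(dividend, divisor):
    """Mechanical long division by repeated subtraction.

    Returns (quotient, remainder). The 'machine' just follows
    the steps; it needs no theory of why the steps work.
    """
    if divisor <= 0 or dividend < 0:
        raise ValueError("toy machine: non-negative dividend, positive divisor only")
    quotient = 0
    while dividend >= divisor:
        dividend -= divisor   # take away one more 'divisor' worth
        quotient += 1         # ...and count that we did so
    return quotient, dividend  # whatever is left is the remainder

print(divide(30, 7))  # (4, 2): 30 = 4*7 + 2
```

Like the Difference Engine, this sketch has limitations you could discover either by reading the mechanism or just by watching it run: it only handles non-negative whole numbers, and it gets very slow when the quotient is huge.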

So being unable to explain the origin of reason in no way undermines your ability to make use of it successfully. Anyone can use an iPhone, even if they don't know how a telephone works. This would be true even if the iPhone were assembled by 747-style randomness (see The God Delusion or The Blind Watchmaker). This is the important point, because it means we can agree that reason works. We also agree that reason only works some of the time (because the unbeliever and the believer cannot both be right at the same time). So, at least we're back in a position to have, as Matt calls it, a polite discussion.

Now entering the rabbit hole

The longer you spend with the presup argument, the more you're left thinking something's not right, though you can't quite put your finger on it. The argument comes attached to several assumptions, which often go unstated. You can usually figure out what they are based on the tone of the speaker.

In this case, they're talking about reason. But which kind of reason? You can reason:

  • inductively (from the specific to the general)
  • deductively (from simple to complex statements)
  • intuitively (a series of mental shortcuts like educated guesses)
  • experimentally (trying a bunch of alternatives to see which one works)
  • statistically (an answer which varies depending on how right you need to be)

No, I'm not using the technical terms. Sue me.

I think (and I'm reading this into Matt's argument) that he's referring to deductive logic, where I will linger a while. This is what I have extensive training in using as a math geek. Math is a wonderful domain to reason in because, as long as the writer is properly trained, any "sentence" in math has only a single, unique and unambiguous meaning. Logic in philosophy exported this way of reasoning, but due to the limitations of written language, it never seems to achieve the same level of precision.

Now speaking as someone who has done some serious training in math and worked for years tutoring high school students, I will state with authority: Human beings suck at deductive reasoning. It's not a natural process for the human brain, it takes a great deal of energy and it's counterintuitive in the extreme.

But that's not the kind of "reason" we use. About 99% of what we know doesn't come through reason; it comes through acquired learning. The way a child learns not to touch the stove is by touching the stove and seeing what happens. This becomes a fact in memory. A large number of these facts get accumulated (maybe even shared with other people) and develop into folk wisdom (touching stovetops is a bad idea).

Very rarely, some slightly clever person may come along and assemble this folk wisdom into a theory (cooking on the stove makes it hot, hot things burn, you get the idea) and that may lead to a more sophisticated understanding. In this uber-simple example it might read like this:

Excessive and painful hotness on the stovetop is caused by cooking. When no cooking takes place, the hotness goes away. If cooking is going on, or has gone on recently, touching the stove is a bad idea. If no cooking has gone on for a while, touching the stove is fine. However, my house has young children in it and they're quite stupid and helpless, so our house rule will be that "we never touch the stove with our bare hands."

Now a huge amount of information comes out of this very simple example. Again, it reiterates the wisdom from above, that we can have a useful strategy (don't touch the stove) which is independent of knowing how the stove works. But at the same time, we can have a more sophisticated rule that comes out of knowing how the stovetop works (only touch the stove when it's not hot).

Notice that we still know relatively little about the stovetop. We have not established that the stove is made of atoms, or that the burning is a statistical effect caused by the energy of many many rapidly jiggling little atoms. And we have no clue yet that those atoms are made of quarks and leptons. All things considered, we're doing quite well on very little understanding.

But really, most of what is being subsumed in Matt's definition of "reason" is just blind trial and error. Let's try this... Did it work? No? Okay, let's try something else...

This reminds me of a student I worked with to prep for an important multiple choice math test. (The story will make him seem slightly dim, he wasn't. You should also cut him some slack because he was twelve.) He read a question, thought about it briefly, then announced confidently, "It's B." I told him it wasn't B. He thought some more then said, possibly more confidently, "It's A." He was shocked to find out it wasn't A. Repeat the same with D. The fourth thing he tried was C, which happened to be the right answer. He guessed, by my facial expression that he had the correct answer, to which he added triumphantly, "YES, I got it."

Of course, he didn't get anything. And it was precisely this habit we were trying to break. This is the "reason" which is innate in us. It's just a memory for what works and what doesn't. Only a small fraction of the population reasons in an orderly, deductive fashion. They are the exception, not the rule. It's very much an acquired skill.
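The habit is easy to caricature in code. A minimal sketch, assuming Python; the `marks_correct` checker standing in for my facial expression is invented for illustration:

```python
def marks_correct(answer):
    # Stand-in for the tutor's reaction: only C happens to be right.
    return answer == "C"

tried = []                          # a memory for what didn't work
for guess in ["B", "A", "D", "C"]:  # the order my student actually used
    tried.append(guess)
    if marks_correct(guess):
        break

print(tried)  # ['B', 'A', 'D', 'C'] -- "YES, I got it."
```

The loop "succeeds" every time there are finitely many options, which is exactly why the strategy feels like reasoning to the person using it.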

Presup assumes an understanding of what reason is and how reason works that is far too vague to be useful, which makes it useless for a discussion like this, one that is necessarily pedantically nuanced. At its heart, the human brain is not a machine for reasoning; it's a guessing machine that has a memory. It just so happens that we gradually pieced together a series of thinking tools (now lumped together under "reason") that improve the likelihood of guesses being correct.

Deductive detail

What if instead of talking about a stovetop, we were talking about math? (Don't run away.) A curious individual like Matt would consult a very clever person (even more clever than the quite clever person just above talking about the stove). He might have a fact which he is curious about, like one of the most fascinating formulas in all of math:


e^(iπ) + 1 = 0

First he would have to ask what e is, then pi, then i. That might take a while. Sooner or later, he would ask the five-year-old's question, "How do you know?"

Actually, he could ask many such questions. How do you know what the value of e is? How do you know what the value of pi is? What is an imaginary number and how do you use it as an exponent? He could continue to ask "Why?" and "How do you know?" and it would be like peeling the layers of an onion. This particular onion would take him through calculus, trigonometry, complex analysis and basic algebra, before winding up at addition.
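Incidentally, this circles back to the tool-without-understanding point: a machine can "confirm" the formula numerically with no grasp of any layer of the onion. A sketch assuming Python's standard `cmath` and `math` modules:

```python
import cmath
import math

# Euler's identity: e^(i*pi) + 1 = 0
z = cmath.exp(1j * math.pi) + 1

# Floating-point arithmetic leaves a tiny residue instead of an exact 0,
# so we check that the result is vanishingly small rather than exactly zero.
print(abs(z) < 1e-12)  # True
```

Of course, a numerical check for one value of the exponent is evidence, not a proof; the proof lives several onion layers down, in complex analysis.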

Side note: If he were serious about actually understanding the answers to all of the various why and how questions down the ladder, the process of asking and answering would likely take him years. More likely, the person explaining it to him would start at the bottom and work his way up, so that by the end, he would have a rather complete understanding of math. Which goes to show that just because there is an explanation, it doesn't necessarily follow that you have the ability to understand it and confirm its validity for yourself. If you're going to ask hard questions, this is a real practical problem worth keeping in mind. At some point, you are going to have to find a way to trust experts to give you information you're not capable of verifying for yourself; there just aren't enough days in a lifetime.

Ignoring the practical problems: does the chain of why and how questions proceed ad infinitum, or is there an end? The answer is that there is an end. You reach a point where the claims are so atomically simple that you cannot justify them in terms of simpler claims. You run into the situation you ran into above with the flat earther, just saying, "They're true, just trust me." These are called axioms. They're not proven, because we don't have any simpler statements we can use to derive them; they're a kind of assumption that floats around ungrounded in the intellectual ether.

Fortunately, these propositions are not in dispute. They're statements of the kind "there is a number called 1", "parallel lines in a plane never cross", and "if you change the order of addition, the answer doesn't change" (that's a + b = b + a; remember it, because we'll return to it shortly). The list would fill about a page, and you can figure out all of mathematics from them, with some definitions added in mostly for brevity and convenience.

The process of reducing math to the smallest number of these statements is called axiomatization. It's an intellectually macho exercise - a kind of nerd contest - where you try to use as few of these axioms as possible. This minimalism exists primarily for aesthetic reasons: since the facts form a coherent network, the higher-level facts just fall out with enough of what a professor of mine called "squigglies".

Many have asked the question of whether it's possible to eliminate the axioms; to have a coherent system that contains no assumptions. It was actually one of the great unanswered questions at the end of the Nineteenth Century.

The answer, it turns out, is a definitive NO. The "Incompleteness Theorem", proven by Kurt Gödel (pronounced roughly "Girdle") in the 1930s, demonstrates that it's an impossible task. Remember that math is the only domain where it's actually possible to prove things to be impossible. The rest of the world just has to settle for degrees of uncertainty. Although even in mathematics we don't have absolute certainty, because when someone "proves" a result, there's always a non-zero probability that there's a mistake in the argument, and no amount of peer review can reduce the chances of a mistake absolutely to zero.

Of course, it may worry some people that the whole of mathematics (and therefore huge chunks of the rest of science) rests on "unproven assumptions." You may have noticed that the mathematics community at large hasn't jumped on board endorsing Matt's presuppositional argument, and they show no evidence of losing any sleep over it either. What's their escape?

Can you guess? It's observation... evidence. The e-word a presup must shy away from. The real world (whatever you mean by real) is the get-out-of-jail-free card. You don't have to have perfectly valid deductive logic. You just have to poke your head out the window to see if your conclusions line up with reality. Trial and error, a kind of evolution of ideas by rational selection, is what is responsible for most of our knowledge. Remember all of Edison's failed attempts at the lightbulb.

Let's look at the axiom from before:

a + b = b + a *

*This is only one of the axioms. You'd have to give each one its own separate similar treatment.

This is an unproven statement - the sort of thing that makes Matt and Eric and Sye Ten drool. But it's mutually agreed upon by everyone in the field as being true by the following process: imagine two rows of stones; the first row contains a stones, the second row contains b stones (I realize I'm being pedantic, but Matt brought it up). Now, if you want to find the total, you move the two rows of stones close together, then count the total number of stones, starting from one and stopping when you run out of stones... this is how addition works, boys and girls (remember... Matt asked for this; it's his fault).

Now suppose that instead of calculating a + b, you wanted to calculate b + a. You can imagine standing up and walking around the stones to stand on the other side. You are looking at the same stones and they haven't moved, but you're now counting them in the opposite order. You get the same number of stones. Case closed, you've proven it.

No, you haven't.

This is only an observational fact - anecdotal evidence, the worst kind of evidence - and we have no way of knowing if it's true for any number of stones. We can try the experiment many times counting small numbers of stones (say less than 100). We can imagine what the experiment would look like with very large numbers of stones and imagine getting the same answer. But there could be a certain number of stones where this is no longer the case; where walking around the stones to look at them from the other side changes their number. (Okay, I can't imagine it. I can imagine trying to imagine it.)
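The stone-counting experiment is trivial to automate, and automating it makes the limitation vivid: no matter how many pairs you check, you have evidence, not proof. A sketch assuming Python:

```python
import random

random.seed(0)  # make the "experiment" reproducible

# Check commutativity for a few thousand random pairs of "stone counts".
for _ in range(5000):
    a = random.randrange(10**6)
    b = random.randrange(10**6)
    assert a + b == b + a   # never fires... for these particular pairs

print("No counterexample found -- which is evidence, not proof.")
```

Five thousand trials, or five billion, leaves the logical situation unchanged: we've still only sampled finitely many pairs out of infinitely many.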

I might also be able to imagine in place of stones, some sort of quantum mechanical particles, where the act of "looking" at the particles actually changes the number of particles present. ("Looking" in quantum mechanics looks nothing like the looking you're used to looking at... trust me.) Or if you wanted to switch from a plus b to a times b, there are lots of mathematical objects, more complicated than ordinary numbers, that when multiplied in different orders give different answers. So it's not impossible to imagine a world where the order of addition matters; it's just difficult.
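Those "more complicated mathematical objects" aren't exotic, either; 2x2 matrices already multiply differently depending on order. A minimal sketch assuming Python, with the multiplication written out by hand so nothing hides in a library:

```python
def matmul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]  # swaps columns when applied on the right, rows on the left

print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]] -- so A*B != B*A
```

So "order doesn't matter" is a fact about the particular objects we usually add and multiply, not a law of thought; change the objects and it quietly stops being true.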

So the presuppositionalist has stumbled onto a very real problem. You can't have a complete logical system without at least some axioms. When we dig down deep enough, we hit a kind of intellectual bedrock below which there is no logical foundation.

Now quickly, before a dishonest presuppositionalist quotes the above out of context and implies that I've conceded his position, I'll offer two solutions:

The first is a practical solution. You may not be able to prove that a + b always equals b + a, even though you strongly suspect it's true for all a and b. But you can estimate the probability that it is universally true. That estimate might be 99.9% with so many repeating 9's that it's indistinguishable from 100% certainty to anyone except the most anal philosopher, but it will never truly be 100%. But if you ran into someone in the street who was sweating and fretting that perhaps it might turn out to be a false assumption sometime soon, you could confidently tell him to chill out, go get some ice cream and forget the whole silly idea. So with the same level of confidence, you can take a young presuppositionalist and tell him to chill out, go get some ice cream and forget the whole silly idea... although with my luck there will be a ban on ice cream tucked in the sillier parts between Deuteronomy and Numbers; those books ruin all the fun stuff.

The second is a more philosophical solution. When looked at from a distance, this axiom can be seen as a type of symmetry in the universe. All of the conservation laws (energy, momentum, electric charge, etc.) have a symmetry associated with them, which basically assumes that, from moment to moment and place to place, the processes that govern the universe (it's misleading to call them laws, because that brings the temptation to invoke a law giver) are the same. These become the basis of what creationists pejoratively refer to as uniformitarianism.

Once this fundamental reliance on the observed regularities is noticed, it will quickly become apparent that all of the axioms of mathematics and logic are just abstractions of this regularity into linguistic terms. The "Law of Non Contradiction" (to choose an example randomly) is not a fundamental principle, but is a generalization derived from the natural world.

As near as we can tell, uniformitarianism is a necessary precondition for life. We can't wake up tomorrow and discover that God has enacted a new fundamental law, say "The Law of Conservation of Twerk". Imagine a world where new particles spontaneously popped into and out of existence (violating what we call the "laws" of mathematics and physics) or spontaneously transmuted into other particles. A creature who woke up one morning to find important parts of its heart had poofed out of existence, or that some of the H2O in its blood had magically been replaced with cyanide, would not last long. Unpredictability is deadly to life forms, and so we require a universe in which these types of "laws" operate just to make it past breakfast.

This may sound familiar... it's the Anthropic Principle: the problem that you can only find intelligent and reflective creatures in a universe where it's possible for intelligent and reflective creatures to exist. And since we have absolutely no way of gathering any information about universes in which we cannot exist, or about what goes on "outside" our universe, anyone who claims to know anything about these situations, or even claims to know they are properly posed questions, is either lying or a moron... or both.

So there you have it, Matt. I hope you slogged through it. Five thousand words, and you can see that your argument is just a re-hash of the Anthropic problem. This is not a new issue. I heard about it in a Stephen Hawking book from the 80s, and it wasn't new then. So you can rest easy knowing that an army of physicists around the world is working on the problem, racing to be the first to have an answer for you. (Kind of makes you feel special, doesn't it?)

I don't have an answer for the anthropic problem. It continues to make my head hurt. I'm comfortable enough to admit I don't know the answer, and also knowledgeable enough to know that you don't have the answer either, nor will you be able to find it in any book presently available. Remember that you can determine that an answer is wrong even if you don't have the right answer.

I guess that all we have left now is to go out for ice cream.

An idiot's guide to apologetics

For those of you who didn't go out for ice cream and are still reading, I offer some tidbits of advice for anyone who wants to try to defend their faith in the public square. I watch, read and listen to a lot of this stuff and thought I would offer some helpful hints to reduce the amount of mockery you have to deal with. Just my 2 cents. Hope you can make use of them.

1. Leave your ego at the door

A religious person, almost by definition, considers himself special. He, and the universe he grew up in, were created by a deity (because apparently god needed a cheerleading squad to tell her how great she was). That means someone consciously decided that she wanted to have a human playmate to have some kind of relationship with. He has a soul, and the ability to live on in some kind of afterlife - something not available to any other animal we know of. And more than 99% of the species that have ever lived have gone extinct.

Of course there are about 4,000 different religions in the world (it really depends on how you count; I'm not married to that number. Suffice it to say it's more than twenty.) But of course a religious person is safe because he picked the correct one. On top of that, Christianity itself can be subdivided into tens of thousands of sects, but still you've managed to find the correct one... even without examining 99% of the alternatives; which is like mad skillz dawg!

Undaunted by large numbers, he believes the entire universe was created as part of this "plan" as well. The number of stars in the visible universe is twenty-five digits long. They don't do much except provide shiny lights to base our horoscopes on. (Although Scientology made up a use for them.)

He's special!

Or much more likely, just incredibly biased. If your conclusion happens to end with the good news that you're part of a divine plan and are in some sense the centre of the universe created by someone who notes whether or not you say she exists, don't expect anyone to take you seriously. You need to develop a more reasonable evaluation of your place in the universe.

Develop an appreciation for the limitations of your knowledge and cognitive faculties. Know how quickly and easily you can slip into self-deception and reach fallacious conclusions. (If you need help reaching this state of mind, go watch a magic show. It's an intellectually liberating experience.)

Turn the lame "what percentage of the world's knowledge would you say you have?" back on yourself and accept that the next book you pick up could completely change your worldview... and pick up the book anyway. And accept the corollary: your incredulity or lack of understanding can't be used as the basis for refuting someone actually working in a given field. (There's no shame in admitting that. We have to accept that life has gotten too complicated for individual humans to master everything in a single lifetime; we need to work together and share knowledge. But at the end of the day, the only people whose opinions about quantum physics matter are people who've seriously studied quantum physics. Business people are fond of saying, "You don't count votes, you weigh them.")

And never assume (or speak in such a tone that people will think you assume) that because you aren't familiar with the refutation of your argument, such a refutation doesn't exist. You'd be amazed how often I stumble across 20 and 30 year old references that contain perfect rebuttals to creationist questions. You'd be even more amazed how many of them are dealt with directly in On the Origin of Species... which you probably never bothered to read.

And don't presume that your question will give atheists nightmares. Even if the answer really is "I don't know," remember that we don't mind "I don't know" as an answer.

I can live with doubt. I think it's much more interesting to live not knowing than to have answers that might be wrong. -Richard Feynman

2. Avoid simplistic ideas and generalizations

Life is complicated. If you can explain the origin of the universe, the origin of life, the basis for morality and what happens to you after you die with a single theory simple enough to be grasped by a second grader, you do not have an accurate picture of the world.

Stop seeing the world in black and white and learn to appreciate that on complex issues we try to take nuanced positions, viewing the world not just in shades of grey, but in all of the fabulous colours. Learn to parse the difference between the magic p-words: possible, plausible, probable and proven. Learn how to distinguish between correlation and causation. Learn the difference between refutation and repudiation. Don't generalize conditional statements into biconditional statements without evidence to support the converse. Remember that nature is far more clever than you ever could be; problems tend to have more than one solution. Learn the difference between skepticism and cynicism. Learn the difference between an authority and an expert. Learn the difference between someone admitting that they could be wrong and demonstrating that they are, in fact, wrong.

To mention the video above (finally): don't assume that because something was not designed by an intelligence, it doesn't work. True, you can argue that producing something intelligently is more likely to create something that works. But it's a wholly unjustified leap to conclude that intelligent forethought, from draughting table direct to finished product, is the only way to arrive at a working design. This is especially true since an alternative was discovered 150 years ago and has been actively researched ever since.

3. Learn the Real Scientific Method

This is the one flaw that I think the scientific community can take the full blame for. Historically, scientists have been lousy communicators to the lay public. Scientists have even been guilty of looking down on those who tried to share their love of science with the public (think of Carl Sagan or Stephen Hawking). Media coverage of scientific discoveries is dreadful: overhyped and misleading, making many events sound more important and more certain than they actually are. Ironically, if you go to the primary scientific source material, the language is much more conservative. You'll never find the phrase "earth shattering breakthrough" in a real scientific document. And anyone who says that scientists "prove" things is misled, stupid or both.

Okay, that's not literally true. Certain scientists do deductively prove theorems about the mathematical relationships between quantities measured in physics. But that's not quite the same as mathematical proof, because the assumptions aren't axiomatic in nature.

And the state of science education in schools is sorely lacking (it's even worse in schools where teachers are trying to foist stultifying nonsense like the idea that the universe is 6,000 years old).

I would wager that less than 1 in 20 people stopped randomly in the street could give an accurate description of the scientific method. And yes, that's a real problem that still needs to be dealt with.

Rather than try to give you my definition of the scientific method, I will offer you the definitive explanation of the scientific method by a Nobel Prize winning physicist, Richard Feynman. Thanks to the generosity of Bill Gates, the lecture is available to watch for free on the Project Tuva website. The lecture of primary importance to this discussion is #7: Seeking New Laws. Of course the rest of them are marvellous as well, so if it's a rainy day, go nuts.

And learning about science will teach you how ideas get overturned in science. Restating the thesis in a derisive tone does not constitute a convincing refutation. I'm thinking of your "You believe that hydrogen gas turns into people given billions of years" line of argument. While you could semantically argue it's true that we believe that, you omit the things that drive the process (like gravity, nuclear physics and chemistry), and hydrogen gas didn't even form for the first few hundred thousand years because the universe was too hot. There was stuff before that, which was even cooler. I'll refer you to a great book on the subject.

4. Learn history

I'm thinking specifically of the history of science, although you can't go wrong with history in general (it would help you get over nonsense like America being a Christian nation and the bible being historically accurate). Science is far more than a collection of facts; it's also the stories of the people who made discoveries through cleverness, accident and happenstance.

And if god were real, science would just be the story of us fixing god's mistakes. Body not intelligently designed? Let's invent some medical science for that, perhaps some clothes and buildings to protect us from the elements while we're at it.

You'll discover just how badass and clever humans can be. You'll also learn things that disconcert you and make you reevaluate your preconceptions. There are also four things that you should take particular note of:

1. Major discoveries are always the result of hard work. We may look back at them and spin them into narratives with "Aha!" moments. But if you look at the stories, you notice that the glorious "Aha!" always has some lengthy and arduous work process behind it, and many discarded failed attempts, that get left out of the movies. Darwin didn't stroll off a boat, sketch a few finches and think "I'll write a book about this." You should see the notes he kept on the artificial selection of plants and pigeons, his barnacle stuff... Really awesome.

2. Realize that major advances are always sparked by the discovery of new evidence. (Good quality evidence is always lacking in religious argumentation... and no, your bible doesn't count as evidence.) Germ theory got its start when Semmelweis analyzed hospital statistics. Relativity was developed because of a confusing experimental result about the speed of light (then validated experimentally years afterwards). Kepler worked out how the planets move around the sun from Tycho Brahe's painstaking naked-eye measurements of the solar system. Hubble discovered the universe was expanding by making spectroscopic observations of starlight. The list goes on.

You'll also start to understand what constitutes good evidence. You'll learn that people get things wrong when they use their gut or rely on bad evidence (like, say, eyewitness testimony or hearsay), and you'll see how evidence can be misinterpreted. Witches were executed on mountains of well documented "evidence", and we have signed eyewitness testimony about Joseph Smith and his golden plates. Then there's UFOs... If we're able to reject those despite the quantity of evidence, a 2000 year old book doesn't stand a chance.

3. Understand that significant advances in science have always been counterintuitive. The heliocentric theory, universal gravitation, the germ theory of disease, evolution by natural selection, just about any result in statistics, relativity, quantum mechanics, atomic theory, Newton's laws of motion... All of this stuff runs completely against the grain of how we "think" the world ought to work.

4. A HUGE number of phenomena were, at one point in time, attributed to the actions of conscious, anthropomorphic supernatural agents (gods), and the majority of them are now well understood as the results of mindless natural processes. Similarly, many processes were deemed to be beyond human understanding, and those claims turned out to be wrong. (I think even Bill O'Reilly knows why the tides go in and out by now.) If you use history as a guide, asserting that something is due to a supernatural cause is undeniably a losing bet.

5. Learn to talk the talk

It's extremely easy to spot someone who doesn't know what they're talking about (and therefore will not be taken seriously) and that's because they use words without knowing what they mean.

You need to learn to separate atheism, evolution, abiogenesis, particle physics and cosmology. For example, evolution only deals with the nonrandom survival of randomly varying replicators, while having nothing to say on the subject of where the original replicators came from. Cosmology can explain how a whole bunch of energy develops into stars and planets and puppies. The big bang is a theory that unifies a bunch of observations within cosmology.

If you say "Evolutionists believe..." followed by a statement about cosmology or the multiverse, or refer to the big bang as an "explosion", you sound incompetent. Even though you may be using the terms "evolutionist", "the religion of atheism" or "the religion of evolution" as a term of derision, the unintended consequence is to make you seem less articulate and less credible.

I realize that if you change "evolutionists believe" to "serious scientists who have spent decades studying biology believe" then it's much harder to form a compelling argument. No one said life was easy.

Being clear and specific is essential in science. So within science, words like uncertainty, energy, nothing, information, and mutation all have extremely specific meanings, often different from the way those words are used in everyday speech. Using words in non-traditional ways (like applying "faith" to scientific theories or "information" to the organization of DNA molecules) also costs you major credibility.

And keep in mind the limits of language: lots of things have misleading names because in the past we knew less than we know now. So the big bang was neither big, nor did it bang. Electric eels aren't eels. Tin foil is made of aluminium. Pencil lead is made from carbon. And Creation Science has nothing to do with science.

You also need to learn to rein in your analogies. It's not enough to find a useful analogy; you also have to know what its limits are, where it breaks down. An atheist is exactly like a soda can fizzing, if by "exactly" you mean its neo-post-modern definition of "not at all".

If you don't understand analogies, don't use them. It's like a pig on a tightrope. -Penn Jillette

If you're trying to persuade people with words but don't seem to understand the words coming out of your own mouth, you will be dismissed and mocked. I get that proselytizing is as much about getting a slap on the back from your fellow Christians, and that you're talking as much to them, if not more, than to the unbelievers you might hope to sway. But you should take the high road; be and sound smart for its own sake. Being smart is awesome.


He said, heading off to watch NPH host the Emmys... Hope there's a big dance number!