There's a wonderful article in the New Yorker by Maria Konnikova (available free online here). A research topic recently in vogue (and thankfully so) is the study of how easily humans are deceived. I've read a substantial amount about the various biases we are prone to, but this article highlighted something I hadn't really considered before: facts become more deceptive simply by being arranged in a narrative. The entire article is nicely summed up by this:
As the economist Robert Heilbroner once confided to Bruner, “When an economic theory fails to work easily, we begin telling stories about the Japanese imports.” When a fact is plausible, we still need to test it. When a story is plausible, we often assume it’s true.
The irrationality of humans is not a controversial thing (it's how I earn my living!). I have said before that when most people use the phrase "let's think logically (or rationally) about ______," what they really mean is, "let's make a series of educated guesses and stop when we reach a result consistent with our intuitions." Irrationality really does seem to hinge on tricking people into stopping their thinking too soon.
The upside is that Konnikova apparently has a forthcoming book on the subject, which I'll try to read when the opportunity presents itself.