Scientific American

On What Can Be Known

A wonderful short article appeared yesterday on the Scientific American blog about the limits of human knowledge — "How Much Can We Know?"

What makes my job truly special is that I get to show people things they can't explain. And — assuming I do my job well enough — things they have no hope of explaining. But the fact that I can do this consistently means that what I do is not un-understandable, because clearly I understand it. Still, I find many people are all too eager to rope off certain areas of inquiry, declaring in advance that no understanding is possible and therefore no attempt is necessary.

The article begins 

“What we observe is not nature in itself but nature exposed to our method of questioning,” wrote German physicist Werner Heisenberg, who was the first to fathom the uncertainty inherent in quantum physics. To those who think of science as a direct path to the truth about the world, this quote must be surprising, perhaps even upsetting. Is Heisenberg saying that our scientific theories are contingent on us as observers? If he is, and we take him seriously, does this mean that what we call scientific truth is nothing but a big illusion?

But just like Darwin's treatment of the eye in On the Origin of Species, this is a set-up. There is scientific light at the end of the tunnel:

Sometimes people take this statement about the limitation of scientific knowledge as being defeatist: “If we can’t get to the bottom of things, why bother?” This kind of response is misplaced. There is nothing defeatist in understanding the limitations of the scientific approach to knowledge. Science remains our best methodology to build consensus about the workings of nature. What should change is a sense of scientific triumphalism—the belief that no question is beyond the reach of scientific discourse.

The Power of Self-Deception

Self-deception is supposed to be a bad thing, right? There's no way being less informed about the way the world actually is could be beneficial... Surely.

Or not. According to a recent article in the Scientific American Blog:

People mislead themselves all day long. We tell ourselves we’re smarter and better looking than our friends, that our political party can do no wrong, that we’re too busy to help a colleague. In 1976, in the foreword to Richard Dawkins’s The Selfish Gene, the biologist Robert Trivers floated a novel explanation for such self-serving biases: We dupe ourselves in order to deceive others, creating social advantage. Now after four decades Trivers and his colleagues have published the first research supporting his idea.

I'm both fascinated and disturbed by the idea that misleading yourself can be beneficial in small doses. Not being entirely aware of your own shortcomings can lead to the confidence necessary to take a risk you wouldn't otherwise take, or nail a job interview when you may not actually be the most qualified candidate.

I've been aware of the phenomenon for years and I've read both The Selfish Gene and Trivers' full work on the subject, The Folly of Fools. Both are extremely important texts in modern science that help you to see human behaviour from a new perspective. 

I know from performing experience that simple acting is incredibly powerful. When I perform, I consciously adjust how hard I appear to be working, so members of the audience can't tell how difficult any particular trick is. When I need to mentally work my ass off behind the scenes, I've figured out how to project confidence so those internal machinations remain hidden.

Appearances can be deceiving, even to ourselves. 

Learning to Think More Rationally

Wouldn't it be nice?

Thinking rationally means not being led astray by misleading information, false assumptions and bad arguments. As Daniel Willingham writes for the Scientific American Blog:

[R]ational thinking encompasses our ability to draw justifiable conclusions from data, rules and logic.

It's not always easy:

In general, our brain did not evolve to think in this logical fashion, and some types of reasoning are simply a bad fit for what our brain can do.

And there are things that get in the way: cognitive biases.

We tend to fear a loss more than we relish an equivalent or greater gain. For example, most people would turn down a favorable gamble in which they could earn $22 if a coin lands on heads but lose $20 if it settles on tails. Although most recognize that taking such a bet makes sense, people often choose not to because the potential pain of losing often outweighs the pleasure of winning. These types of reasoning problems are widespread and interfere with our ability to cultivate rational skills.
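The arithmetic behind that gamble is easy to check. Here's a quick sketch (using the $22/$20 figures from the quoted example; the function names are my own, just for illustration) showing that the bet is favorable on average, even though loss aversion makes most people refuse it:

```python
import random

def expected_value(win: float, lose: float, p_win: float = 0.5) -> float:
    """Expected payoff of a two-outcome gamble: win `win` with
    probability p_win, otherwise lose `lose`."""
    return p_win * win - (1 - p_win) * lose

def simulate(win: float, lose: float, trials: int, seed: int = 1) -> float:
    """Average payoff over many repeated coin-flip bets."""
    rng = random.Random(seed)
    total = sum(win if rng.random() < 0.5 else -lose for _ in range(trials))
    return total / trials

# Heads: earn $22. Tails: lose $20. Fair coin.
print(f"Expected value per bet: ${expected_value(22, 20):+.2f}")
print(f"Average over 100,000 simulated bets: ${simulate(22, 20, 100_000):+.2f}")
```

The expected value works out to +$1.00 per bet, so a gambler who takes it repeatedly comes out ahead, yet the prospect of a single $20 loss looms larger than the $22 gain.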

What I found surprising is that when we train ourselves to think more rationally (through that pesky process called education), the training typically does not cross over into other domains of life:

But decades of research have also consistently found that students improve only in the type of reasoning skills emphasized in the course, not in other tasks.

That is, if students work on logic puzzles, they get better at logic puzzles but not at other things, such as forming coherent arguments or winning debates.

This pattern makes sense. Rational thinking requires different skill sets in different situations. The logic we use when interpreting a science experiment is not the same logic we need when buying a car or following a new recipe.

That goes a long way towards explaining how someone can be extremely well educated, but still fall victim to believing silly things outside their own sphere of expertise. I'm reminded of a recent US presidential candidate and acclaimed neurosurgeon who didn't know enough about biology to understand the theory of evolution. 

The full article is available at the Scientific American blog.

A New Kind of Deception?

Deception is the backbone of my work. I'm fascinated both by its practical applications to the world of entertainment — creating a profound sense of astonishment by doing things that appear at first glance to be impossible — and by how that knowledge can be turned on its head so people can learn to arm themselves against being deceived by others.

A recent post on the Scientific American blog discussed a kind of deception I had never heard of: the blue lie.

Little White Lies, I've heard of. Those are the lies you tell to someone to spare their feelings; lies told out of compassion. Dark lies are told for selfish reasons — to protect yourself from blame or to mislead others. 

As an aside, I almost blew right by one of the most profound insights in the entire piece. 

Children start to tell selfish lies at about age three, as they discover adults cannot read their minds...

We each have a silent monologue that only we can hear; our thoughts stay contained within our own skulls, inaccessible to the outside world. This is something that has to be learned! It's one of those things about the brain we simply take for granted.

The new variety, the blue lie, is selfish and harmful to outsiders while being beneficial to an in-group. Here again, the article about deception is offered in relation to T**** and his nasty habit of prevarication. It's an interesting theory. But it also applies to groups like government spy agencies, which deceive for the benefit of their nation's citizens. Wonderful food for thought.

Steven Pinker explained years ago in a talk that the notion that we ought to believe things because they are true is relatively new. Mainly we profess beliefs as a way of showing allegiance: a way of saying "I'm on your side."