You’re Irrational For A Reason

“We are Homo sapiens. That’s the distinguishing characteristic about us, that sapiens part. We’re supposed to be smart.” — Carl Sagan

While we have an incredible capacity to apply logic and effort to our thought processes, there are many times we don’t, or we think we do but go about it the wrong way. We’re rational up to the point that we’re not, and we’re not in surprisingly predictable ways.

Take a look at Wikipedia’s page on cognitive biases and you’ll find almost 200 ways in which people make systematic errors. Most of them are mental shortcuts: intuitions and rules of thumb that unfortunately lead us astray.

So what gives?

The Evolved Mind

We didn’t evolve to be perfectly rational all the time. Natural selection shaped the mind to take actions that have, in our past, tended to increase reproductive success. These mental shortcuts are features, not flaws, when viewed in evolutionary terms.

To aid us in survival and procreation, we developed tools to attract mates, find food, cooperate, and fend off threats.

We built a predilection for sugar that now has us overindulging.

We prefer immediate gratification because the future was uncertain.

We stress over big decisions because that’s how we dealt with threats.

We group together into tribes because there’s safety in numbers.

A great deal of our behaviour is related to sexual selection. Just as peacocks bear a heavy cost to sport a quality tail, thereby signalling a certain level of fitness, people signal their fitness through conspicuous consumption.

Whether it’s a fast car, big house, fashionable clothes, or a wide range of unread books, people often attempt to highlight their wealth, intelligence, kindness, or health.

Much of this behaviour could be seen as irrational, especially when judged purely from an economic standpoint. But we do these things for a reason, even if we’re not cognizant of that reason as we do them.

The Wrong Model

To understand our behaviour, it’s important to use the correct model, and that model will need to take human evolution into account. It’s all too easy to compare people to Homo economicus, a perfectly rational being capable of finding the optimal solution to any decision.

Rationality of this type is a pipe dream for most humans. A perfectly rational, logical agent wouldn’t decline a gamble that was in their favour, wouldn’t keep setting deadlines they couldn’t meet, and wouldn’t continue on a course of action when they should cut their losses.
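To make “a gamble in their favour” concrete, here’s a minimal sketch with hypothetical numbers (the figures are mine, not the article’s): a fair coin flip that pays $110 on heads and costs $100 on tails has a positive expected value, yet loss aversion predicts many people will refuse it.

```python
# Hypothetical gamble: fair coin, win $110 on heads, lose $100 on tails.
p_win, win_amount, loss_amount = 0.5, 110, -100

# Expected value = probability-weighted average of the outcomes.
expected_value = p_win * win_amount + (1 - p_win) * loss_amount
print(f"Expected value per flip: ${expected_value:+.2f}")  # -> $+5.00
```

A strictly rational agent takes any positive expected-value bet (setting aside risk of ruin); that real people routinely turn down gambles like this is exactly the kind of predictable deviation described below.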

Beginning in the late 1960s, Daniel Kahneman and Amos Tversky gave rise to behavioural economics by highlighting the many predictable ways that people do not behave like Homo economicus. Since then the list of biases has grown, with almost 200 now filling the Wikipedia page.

Economist Jason Collins compares this to early models of the solar system. In 1500 the dominant model of the universe had the sun, planets, and stars orbiting the earth. 

However, because it wasn’t accurate, there was a long list of deviations as the planets moved in ways the model couldn’t properly explain. What was needed was a new model.

As with those early theories of the universe, it’s unlikely that we are beings of impeccable logic with the near-200 biases on the Wikipedia page representing mere deviations. Rather, those are near-200 indications that we should be using a different model.

The Information Landscape

To better organise the many biases, Buster Benson, a marketing manager at Slack, created a bias codex by grouping them into broad categories (the graphic was designed by John Manoogian III):

[Image: The Cognitive Bias Codex, by Buster Benson and John Manoogian III]

“I made several different attempts to try to group these 20 or so at a higher level, and eventually landed on grouping them by the general mental problem that they were attempting to address. Every cognitive bias is there for a reason — primarily to save our brains time or energy. If you look at them by the problem they’re trying to solve, it becomes a lot easier to understand why they exist, how they’re useful, and the trade-offs (and resulting mental errors) that they introduce.” — Buster Benson

The categories ended up as follows:

  1. There’s too much information: We’re inundated with information, so we have no choice but to filter it selectively, to somehow find the signal in the noise. This can lead to overvaluing the information we find and focus on while undervaluing the things we miss.
  2. There’s not enough meaning: We have to make sense of the filtered information, so we spot patterns, think in terms of cause and effect, and construct stories by filling in gaps. This can lead us to spot irrelevant patterns and mistake correlation for causation.
  3. We need to act fast: We’re constrained by time, but it’s hard to act if we acknowledge our uncertainty, so we jump to conclusions and tend towards overconfidence.
  4. What we should remember: Memory is not perfect; we favour generalisations over specifics, and we edit and embellish memories over time. What we remember also informs what information we pay attention to.

It’s hard to think of anything that has gone through as dramatic a change as the information landscape: how much of it there is, how we access it, how we make sense of and store it, and how we communicate it.

There are costs to all of this. While it’s certainly easier to access information, finding the most important and reliable of it, consuming it, making sense of it, and connecting it with other relevant information all take time, effort, and the smarts to tell the right method from the wrong one.

Making Better Choices

Perfect decisions and information-processing in this noisy and information-heavy environment are near impossible. What choice do we have but to take certain shortcuts, fallible as they are?

This isn’t to say there aren’t improvements to be had. By recognising that many of these errors result from our relationship with information, certain habits might help us in our search for better decision-making and more accurate knowledge.

Here are some ideas for making better use of our time, mind, and the information at our disposal:

Know the difference between decisions and opinions:

This might seem obvious, but it’s very easy to let a little information form a conclusion in your mind. Unless that conclusion must be acted on in the near term, it’s possible to stay open-minded and hold off on making any definitive assessment.

“We are not obliged to make up our minds before the evidence is in. It’s okay not to be sure.” — Carl Sagan, The Burden of Skepticism

Think of decisions as placing bets:

Don’t judge the quality of a decision by the outcome alone; doing so ignores the role of luck and the significance of information you didn’t have. Instead, treat each decision like placing a bet at a poker table: you want to find the best odds.

“What makes a decision great is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge. That state of knowledge, in turn, is some variation of ‘I’m not sure.’” — Annie Duke, Thinking In Bets
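As a rough illustration of outcome versus process (hypothetical numbers, mine rather than Duke’s): simulate repeatedly taking a bet with a 60% chance of winning $100 and a 40% chance of losing $100. Taking it is a good decision every single time, yet roughly four in ten individual outcomes are still losses.

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

# Hypothetical bet: 60% chance to win $100, 40% chance to lose $100.
P_WIN, STAKE, TRIALS = 0.60, 100, 10_000

outcomes = [STAKE if random.random() < P_WIN else -STAKE for _ in range(TRIALS)]

loss_rate = sum(o < 0 for o in outcomes) / TRIALS
print(f"Expected value per bet: ${P_WIN * STAKE - (1 - P_WIN) * STAKE:+.0f}")
print(f"Bets that lost money:   {loss_rate:.0%}")  # ~40%, despite a good decision
```

Judging each bet by its result would call four in ten of these good decisions mistakes; judging by the odds gets it right every time.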

Be a good sceptic:

You want to be critical of new ideas and information, but not so critical that good ideas don’t make it in. From the other angle, you want to be open-minded, but not so open that your brains fall out, or that you start accumulating bad ideas.

“What is called for is an exquisite balance between two conflicting needs: the most skeptical scrutiny of all hypotheses that are served up to us and at the same time a great openness to new ideas.” — Carl Sagan, The Burden of Skepticism

Know when to trust your intuitions:

When you have a gut response to something, stop to consider where it comes from. An intuition formed in a reasonably stable field in which you have plenty of experience is likely to be reliable; one formed without experience, or in an irregular field, is not.

Basically, good intuitions must be learned, whether by educating yourself and becoming an expert in your field, or by evolutionary means, which requires the situation to resemble those our ancestors faced in the distant past.

“Claims for correct intuitions in an unpredictable situation are self-delusional at best, sometimes worse. In the absence of valid cues, intuitive “hits” are due either to luck or to lies. … Remember this rule: intuition cannot be trusted in the absence of stable regularities in the environment.” — Daniel Kahneman, Thinking, Fast and Slow

Mistakes Will Be Made

We are not the perfectly rational, logical creatures once supposed. The future is uncertain and there’s only so much we can make sense of. Even on a collective level, there is so much we have yet to discover.

An unbiased information search, an accurate and objective view of the world, and an impartial interpretation of evidence are high expectations for a human mind with such limitations. We weren’t built to accomplish such feats, and while it’s noble to strive towards them, we have to acknowledge how truly difficult they are.

Perhaps it’s more rational to accept that our decisions won’t be perfect and learn to be okay with our mistakes and misconceptions. Instead of aiming for perfect, right, or true, we should aim for good enough, probable, and less wrong.
