When you encounter new information but have yet to form a belief about it, what’s the immediate response? Do you hold it up in your mental spotlight and subject it to critical analysis? Do you suspend judgement until you’ve either confirmed or denied it?
We have limited time and mental resources to properly evaluate everything that impinges on our mind. We don’t stop to fact-check every headline that passes us by on social media, or every claim made by certain news outlets, or each idea presented in a book.
What happens to those things that we encounter but don’t critically analyse? Research suggests that they can slip into our truth baskets. Once there, they can persist rather stubbornly.
Believing, it turns out, is the easy part. The second step is where things get tricky.
The Library of Belief
René Descartes wrote, “All that the intellect does is to enable me to perceive, without affirming or denying anything, the ideas which are subjects for possible judgments.”
To Descartes, the mind is supposed to hold ideas up to be examined, to keep them in limbo until a decision about their accuracy is made.
Baruch Spinoza had a different idea. He thought that by perceiving or comprehending something we implicitly believe it first, and only through an extra step can we disconfirm it.
William James, writing of Spinoza’s position, said, “All propositions, whether attributive or existential, are believed through the very fact of being conceived.”
In this account, if we don’t have the time or energy to critically consider the information impinging on our mind, we’re at risk of accepting it as true. Belief is the first step, evaluation the second.
Psychologist Dan Gilbert uses a library metaphor to understand the two ideas.
Imagine you have a library. Most of the books are nonfiction, but there are a few works of fiction in there too. To distinguish between them, you decide to tag the spines of the books using one of two systems:
- In Descartes’ system, you place a red tag on the fiction books and a blue tag on the nonfiction.
- In Spinoza’s system, you only place a red tag on the fiction.
Both systems function effectively in that you can tell the nonfiction and fiction books apart. Spinoza’s system actually saves you time, since you don’t have to label every book.
However, what happens when new books arrive?
- In Descartes’ system, the new book doesn’t have a label, so even if you don’t read and label it right away, it will stand out amongst all the other labelled books.
- In Spinoza’s system, the new book lacks a label, just like all your nonfiction books. If you don’t examine it then and there, it appears like a nonfiction book when you look for it later.
When new books are being fed into the library, Spinoza’s system can break down.
Perhaps if you are already familiar with the book, you can label it true or false right away. But unfamiliar books need to be reviewed, and what if you don’t have the time to look through them all? What if new books are coming in too quickly for you to evaluate them one by one?
Then some fiction books will look like nonfiction books.
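To make the difference concrete, here’s a minimal sketch of the two tagging schemes. The tags, statuses, and the idea of coding them this way are purely illustrative, not from Gilbert: under Descartes’ scheme an unread book is visibly unevaluated, while under Spinoza’s scheme it is indistinguishable from nonfiction.

```python
# An illustrative sketch (not from Gilbert's paper) of the two tagging schemes.
# Descartes: fiction gets a red tag, nonfiction a blue tag, unread books no tag.
# Spinoza: only fiction gets a red tag; everything else is untagged.

def status_descartes(tag):
    # With two explicit tags, an untagged book is visibly unevaluated.
    return {"red": "fiction", "blue": "nonfiction", None: "unevaluated"}[tag]

def status_spinoza(tag):
    # With one tag, "no tag" is read as nonfiction, whether or not it was checked.
    return "fiction" if tag == "red" else "nonfiction"

new_book_tag = None  # a new arrival, shelved before anyone has read it

print(status_descartes(new_book_tag))  # "unevaluated" -> it stands out
print(status_spinoza(new_book_tag))    # "nonfiction"  -> it passes as true
```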
Does the mind sort through incoming information and ideas more like Descartes’ system or Spinoza’s? Gilbert ran several studies and found Spinoza’s system to be the more accurate description of how we take information in.
How To Make Someone Believe Something
In a 1990 study, Gilbert told participants statements such as “A monishna is an armadillo,” and followed that up by telling them whether the statement was true or false.
On some trials, a tone sounded immediately after participants heard whether the statement was true or false, disrupting their thought processes. This disruption caused them to mislabel false statements as true, but not true statements as false.
If information remained neutral until we confirmed or denied it, then either we shouldn’t make these mistakes at all, recognising which ideas haven’t yet been tested, or mistakes should be made in both directions, with false statements becoming true and true statements becoming false. Neither happened.
In other studies, he gave participants statements about a crime. The information was clearly labelled as reliable or false by its colour, so its status was identifiable at the same time as it was read. The statements either exacerbated or extenuated the severity of the crime.
On some trials, participants had a number task to do as they tried to read the information. The added mental effort caused false information to seep through as true, but didn’t cause the true information to be labelled false.
When asked to sentence the criminal, those who read the exacerbating false information gave sentences nearly twice as long as those who read the extenuating false information, even though they should both have been ignoring it.
Gilbert concludes:
“Findings from a multitude of research literatures converge on a single point: People are credulous creatures who find it very easy to believe and very difficult to doubt. In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.”
New information is believed first, refuted second. But if something disrupts that refutation, whether a drain on our mental resources or some reluctance on our part, false information can slip by unnoticed.
Consider reading a book, or scrolling through social media. You’re exposed to a lot of information, and it’s unreasonable to fact-check every claim or critique every idea. It’s difficult to know how much of that information leaves its mark on your memory without ever going through the evaluation phase.
First Impressions Are Lasting Impressions
If we accept information as true until we find a reason to discredit it, we need to be good at recognising when it’s been discredited. It raises the value of humility and a slight scepticism of our own ideas.
Here, too, research often finds us wanting.
Initial pieces of information often form the base from which we branch out and evaluate others. First impressions guide our interpretation of new evidence more than new evidence guides our re-evaluation of old ideas.
In 1946, Solomon Asch experimented with the order of words in lists describing people. Take a look at these descriptions of Ben and Alan, and consider what you think of them:
- Ben: intelligent, industrious, impulsive, critical, stubborn, envious
- Alan: envious, stubborn, critical, impulsive, industrious, intelligent
You probably noticed they’re the same descriptors, yet even so, Ben came off best. The first characterisations form the basis that the later ones only slightly adjust.
Many of our social interactions work this way. We meet someone who seems grumpy on a first encounter, and assume it’s a general disposition. Even if they seem pleasant on a second encounter, it doesn’t completely nullify that first impression.
Now consider someone with no opinion or knowledge of a hotly debated and complex issue. First, they see some statistics and hear a brief argument, and not having anything to compare it to or any obvious reason to reject it, they let it sink into their mind—believing is the default.
Might that initial impression influence how willing they are to accept other arguments or interpret different statistics? They might, of course, spend the time and exert the effort to properly evaluate different claims and reject their first hypothesis.
But believing is easy and first impressions are enduring; proper evaluation requires time, energy, and an openness to being wrong, which I fear makes it the less common route.
Finding Support for Your Beliefs
The confirmation bias is now one of the more well-known topics in psychology. We look for and readily accept information that supports our ideas and beliefs.
Tom Gilovich writes that when we evaluate positions we agree with, we ask ourselves “can I believe this?” However, when we evaluate positions we disagree with, we ask “must I believe this?”
This makes us lenient towards ideas we find acceptable and critical of those we don’t align with. When approaching an idea you don’t agree with, you look around for any reason to invalidate it. When you find an idea you agree with, you look only for the minimal requirements to accept it.
“By framing the question in such ways … we can often believe what we prefer to believe, and satisfy ourselves that we have an objective basis for doing so.”
Gilovich ran an experiment using different versions of the classic Wason task. In the original task, you are shown four cards; each has a number on one side and a colour on the other. Among the faces you can see are an 8 and an orange card.
The task is to pick which cards you need to flip over to test the rule that all even-numbered cards have red on the other side.
Most people look only to confirm the rule, so they turn over the 8 to check that it shows red on the other side. However, the orange card is also necessary: if it has an even number on the other side, the rule is proven false.
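To make the card logic concrete, here’s a minimal sketch in Python. The exact faces in Gilovich’s version aren’t reproduced here, so the four visible faces (8, 3, red, orange) are assumptions for illustration; the point is simply that only the 8 and the orange card could ever falsify the rule.

```python
# A small sketch of the Wason selection task logic. The four visible faces
# below are assumed for illustration; the rule being tested is
# "every even-numbered card has red on the other side".

numbers = ["8", "3"]                   # possible number faces
colours = ["red", "orange"]            # possible colour faces
visible = ["8", "3", "red", "orange"]  # one face of each card is showing

def hidden_options(face):
    """The hidden side of a number card is a colour, and vice versa."""
    return colours if face in numbers else numbers

def worth_flipping(face):
    """A card is worth flipping only if some hidden side could break the rule."""
    for hidden in hidden_options(face):
        number = face if face in numbers else hidden
        colour = face if face in colours else hidden
        # The rule fails only when an even number is paired with a non-red colour.
        if int(number) % 2 == 0 and colour != "red":
            return True
    return False

for face in visible:
    print(face, "-> flip" if worth_flipping(face) else "-> leave")
# Only the 8 and the orange card can possibly falsify the rule,
# so those are the two that need to be turned over.
```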
Only about 20% answer both the 8 and the orange card. But by changing the content slightly, Gilovich pushed it up to around 52%.
Instead of numbers and colours, Gilovich used negative stereotypes. When participants belonged to the stereotyped group, they looked for the information that would prove the rule false, since disconfirming it would preserve their image of themselves.
Because they didn’t want to believe the rule, they asked “must I believe this?” and looked for any piece of evidence that might disconfirm it. When it’s just numbers and colours, people don’t have the same incentive; they simply ask “can I believe this?”
There is some rationality to this. Why investigate what already makes sense to you? Why spend time and effort evaluating information that works cohesively with your existing knowledge? Why question something unless it raises a question?
Unfortunately, given that believing is easy and initial opinions are held onto in the face of discrepant information, questioning what we agree with is the only way we’re going to erase errors and arrive at a more accurate view.
Don’t Wait For A Question to Start Questioning
Belief is easy at three different levels: the initial process of comprehension, the lasting quality of first impressions, and our lopsided evaluation of ideas we do or don’t agree with.
When we want to, we can remain firm in our opinions. It takes motivation and effort to scrutinise our beliefs, to overcome our biases, and to accept ideas we don’t initially like.
Gilbert makes the point that children are fairly suggestible. We grow into doubt and scepticism. Our ability to rationally examine ideas comes later in life, probably with the development of our frontal lobes.
I wonder how much of our belief system is built in those early years, how much we accept as kids that flows on through life without ever being properly questioned.
Do you still believe that we only use 10% of our brain? That gum will sit in your stomach for seven years? That humans only have five senses? That your childhood pet was sent to the farm to live with other animals?
If you want your beliefs to be accurate, change is unavoidable. Nobody was born with perfect rationality, and nobody goes through life accumulating only good ideas. At some point, you will have to right the wrongs.
And we do. We entertain and reject certain ideas and arguments, sometimes even our own. Despite biases and a general reluctance towards discrepant information, we change our minds. It’s just not easy.
It pays to look at your own beliefs and opinions with some scepticism, as constructs which will inevitably be altered, refined, and improved over time. Some will be removed altogether, making way for better ones.
You don’t have to get it all right to begin with, but you do have to be open and accepting of new information and other ideas. Don’t reject ideas just because they’re not yours, and don’t accept ideas just because they are.