There are two ways you can judge something as true or false: do the work and figure it out for yourself, or let your gut tell you.
Putting in the effort to examine an argument is clearly the more reliable choice, but it takes time and energy that most people can't spare for every claim they encounter. And so quite often we accept the intuition that pushes up from the gut: an answer that just feels correct.
Those intuitions can get us into trouble, however. We are susceptible to bullshit, misinformation, fake news, and manipulation—most of which rely on our inability or lack of desire to go into the nitty-gritty complex details that might lead us towards the truth.
An article by psychologists Norbert Schwarz and Eryn Newman examined the research behind our evaluations of truth, and in particular those evaluations that rely upon unconscious mechanisms.
“Analytic answers, akin to knowledge from the book, draw on relevant knowledge and may involve extensive information search, which is taxing and requires cognitive resources. Intuitive answers, akin to knowledge from the gut, are less demanding and rely on feelings of fluency and familiarity. The easier a claim is to process and the more familiar it feels, the more likely it is judged “true.” When thoughts flow smoothly, people nod along.”
They highlight five criteria for judging truth, along with the ways we can meet each one either analytically or intuitively.
1. Social Consensus
Do other people believe it? Is there agreement among reliable sources and informed minds? When there is, it is more likely we'll be safe in siding with them, even if we lack the same level of knowledge and understanding.
However, the analytical approach to social consensus requires searching for these opinions and ideas, checking the backgrounds of who and where these ideas come from, and evaluating the other competing ideas.
The intuitive alternative, the researchers state, is a sense of familiarity.
“… determining the extent of consensus can be difficult and familiarity offers a plausible shortcut — if many people think so, one should have heard it a few times, making it familiar.”
But of course a feeling of familiarity can go awry. Research has found that we are more likely to believe a statement if it is printed several times on one page, even as the result of a printing error, and even when we know that the repeated opinion came from only one member of a group.
Familiarity can also increase the aesthetic value we give to music and art. The more we listen to a song, for example, the more we grow to like it—something termed the mere-exposure effect. Collectively this research shows that repeating something can lead people to place more value in it, even when that information is flawed.
2. Compatibility
Is this new information consistent with what you already know and believe? If it fits nicely with our current ideas then we are more likely to accept it, while discrepancies and conflicts raise our distrust.
The analytic approach to compatibility involves a careful analysis of the claim and introspection on what we already know, along with how certain we are of those beliefs. Alternatively, we rely on feelings and subjective experience to guide us.
“Information that is inconsistent with one’s beliefs elicits negative feelings and is processed less fluently than information that is consistent with one’s beliefs. These subjective experiences serve as problem signals that trigger more careful assessments of the veracity of a statement.”
The researchers use a question to illustrate this point: how many animals of each kind did Moses take on the ark? Most people answer two, even if they are aware that it was Noah, not Moses, who was responsible for the ark.
Familiarity with the story, and the fact that both names carry Biblical weight, leads to the sense that everything is coherent and correct. Were the question about Tom's ark, we'd surely notice the problem. Even writing the question in a difficult-to-read font, which makes the information harder to process, makes people more likely to spot the issue.
3. Coherence
Do the elements within the claim make sense together? Does the information being provided truly support the conclusion being drawn?
We are suckers for a good story, and when it's relatable and filled with emotion, even better. This is why we find single-person narratives so compelling: an image of a child in the midst of starvation is more likely to propel us into action than a cold statistic, even if the latter provides more context and objective analysis.
In one study, university students were asked to complete a short questionnaire, after which they'd be given $5. When they received their money, however, they were also presented with a letter from the Save the Children foundation. The letter contained either statistics about food and water shortages in parts of Africa, or a story about a young girl and her struggle to survive in such terrible circumstances. Those presented with the story donated $2.38 on average, compared to $1.14 from those presented with the facts.
4. Credibility
Is the person offering the information knowledgeable in this domain? Have they been correct in the past? Credibility can also be judged intuitively, in which case familiarity and fluency again come up trumps.
Nobel prize-winning psychologist Daniel Kahneman writes in his book Thinking, Fast and Slow that the only time we should trust someone's intuition is when the field being predicted is stable or regular, and the person making the prediction has had enough experience within it to be considered an expert.
“If the environment is sufficiently regular and if the judge has had a chance to learn its regularities, the associative machinery [intuition] will recognize situations and generate quick and accurate predictions and decisions.”
Failing either of these should greatly lower our trust in the information being shared, no matter how confident the source.
5. Supporting Evidence
Is there much supporting evidence? Is there any information that contradicts it?
The analytical way to find support for a claim is, of course, to go looking for it. The lazy way is to rely on what information comes easily to mind—the availability heuristic.
“Evidence can be assessed analytically by consulting relevant literature or one’s own knowledge. But it can also be gauged by how easy it is to bring some evidence to mind — the more evidence exists, the easier it should be to think of some.”
That intuitive response, however, can overvalue the ease of bringing examples to mind rather than the reliability of the examples themselves. The researchers point out that people can be more confident in a claim when they successfully present two supporting arguments than when they come up with six: it is not the number of supporting arguments that matters but the ease with which they come to mind, and it's easier to think of two arguments than six.
When searching for evidence it is also very easy to slip into confirmation bias, where we only look for information that confirms an idea. Actively seeking out contradictory information is again time-consuming, and more likely to leave us confused or uncertain, but it's the only way to know for sure whether the initial claim is correct.
Such effects stand to be problematic in the online age, when everyone has a voice and can dress up information to look and sound credible—people still believe the Earth is flat, and have bought products incapable of doing what they claim.
It is also easy to disregard inconvenient claims as inconclusive or doctored, for instance ignoring the fact that most scientists agree that climate change is a problem. The authors note how easy it is to be misled on the likes of Facebook:
“On Facebook, one’s friends (a credible source) post a message that is liked and reposted by other friends (social consensus), resulting in multiple exposures to the same message. With each exposure, processing becomes easier and perceptions of social consensus, coherence and compatibility increase. Comments and related posts provide additional supporting evidence and further enhance familiarity. At the same time, the filtering mechanism of the feed makes exposure to opposing information less likely.”
Sorting fact from fiction involves more analytical thinking than it does relying on our gut. This isn’t always possible—and very often it is more efficient to go with our intuition—but when the consequences of our decisions loom large, we must take care to see both sides of an argument rather than jumping to conclusions.
Most people will probably agree with such an analysis, yet think that it applies more to everyone else than it does to themselves. But these effects are not limited to a certain type of person; we are all susceptible, and more than likely we each hold a belief that just isn't true. We'll only find out which one when we examine what we think we know.