Is it good to think you’re better than you are? By most accounts, confidence is a positive characteristic. It convinces people to trust you, motivates you towards big goals, and is for the most part a more positive experience than doubt or uncertainty.
Low confidence suggests a lack of belief in oneself. It will likely inhibit us from reaching our full potential, as our bar will be set lower than what we're capable of. It signals to others that we lack a quality important in leadership.
It is no wonder we find courses, coaches and self-help books aimed at raising our humble confidence levels. But is overconfidence that much better?
The large majority of startups fail, yet every day a new batch of entrepreneurs marches headstrong toward the same results. We benefit on the whole when a few of them create a Google, but there are a lot of people out there with little to show for it.
Overconfidence also blinds us to our inadequacies. Thinking you're better than you are means you don't see where you could improve. You are also likely to take on challenges at which you cannot succeed, making failure more probable.
Failure isn’t bad when we learn from it. But what we should learn is that we’re not as good as we thought, or that what we believed was wrong. We should adjust our confidence level down a notch, to match reality—and in essence, escape overconfidence.
General or specific?
It is worth distinguishing between general and specific confidence. In a general sense, confidence should mean a belief that we can take on whatever the day throws at us. We should walk out our front door in the morning with our head held high.
Confidence in more specific terms refers to us knowing or being able to accomplish a particular goal. It’s this confidence that sets in when we know the answer to a question, or we’re performing a task we’ve done many times before.
When we make this distinction we see that it is good to have a general sense of confidence. This is what we want in leaders, and this is what inspires trust in the people around us. What we don’t want is people confident they know something they don’t or that they can do something they can’t.
Doubt may not be as noble a trait as confidence, but when reality calls for it, we should listen. We should expect leaders to admit when they don't know something, just as we should admit it to ourselves. This doubt of specific abilities need do nothing to inhibit our general confidence: one can even admit to inability or ignorance with confidence, as in, "I am sure that I do not know."
We should encourage general confidence, a belief in one’s potential. Specific confidence, however, should be based on experience—yet it is here many of us get lost.
Confidence that you know an answer, can perform some action, or solve some problem, exists on a spectrum. If asked how sure you are that the body contains two kidneys, you can probably couple your answer with an estimation of your confidence level, with 100% being dead certain.
It turns out we overshoot that estimation often. In a 1977 study, people were given a series of multiple-choice, general-knowledge questions, to which they would say which answer they felt was correct, along with their level of certainty. When people were 90% certain, they were right only 75% of the time.
In 1999 the psychologists David Dunning and Justin Kruger published a study inspired by a curious criminal case: McArthur Wheeler had robbed two banks while believing he was invisible. The belief came from a misunderstanding of how lemon juice works as invisible ink; he smeared it on his face, thinking it would hide him from the security cameras.
In follow-up studies the psychologists found that people who know very little think they know more than they do, while those who know quite a lot think they know relatively little.
“If you’re incompetent, you can’t know you’re incompetent … The skills you need to produce a right answer are exactly the skills you need to recognize what a right answer is.” —David Dunning
Unknowledgeable people are unaware of everything they don't know, but as you learn more, you begin to realize just how much there is to know, and how little of that vast unknown you have actually covered. This creates a paradox wherein both the ignorant and the knowledgeable tend to place themselves in the other group.
Why aren’t we more aware?
For the most part, our judgments of knowledge, truth, and the resulting confidence in them, come from the gut. We have a feeling that something is correct and we go with it. While we may be able to come to better conclusions through directed cognitive effort, and by examining our intuitions more closely, this takes time and energy we’re often not prepared to invest.
Take this question: if a bat and a ball together cost $1.10, and the bat costs $1.00 more than the ball, how much does the ball cost? Like most people, you probably had the answer $0.10 thrust upon your mind; wherever it came from, you should send it back. The correct answer is $0.05: the ball costs $0.05 and the bat $1.05, which is exactly $1.00 more.
“Many thousands of university students have answered the bat-and-ball puzzle, and the results are shocking. More than 50% of students at Harvard, MIT, and Princeton gave the intuitive—incorrect—answer.” —Daniel Kahneman
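The trap here is purely arithmetic, so it can be checked directly. A quick sketch of the algebra, with x standing for the ball's price:

```python
# Let x be the ball's price. The bat costs x + 1.00, and together
# they cost 1.10, so: x + (x + 1.00) = 1.10, i.e. 2x = 0.10.
ball = (1.10 - 1.00) / 2
bat = ball + 1.00

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}, total = ${ball + bat:.2f}")
# ball = $0.05, bat = $1.05, total = $1.10
```

Had the ball cost the intuitive $0.10, the bat would cost $1.10 and the pair $1.20, which contradicts the question. Slowing down long enough to run that one check is exactly the kind of directed effort the intuition skips.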
Putting too much faith in our gut can open us up to manipulation. A pair of researchers presented people with statements such as “Osorno is a city in Chile,” with the participants asked to determine the truth of each statement.
You would expect people to judge the truth based on their own knowledge, yet the respondents' confidence levels could be swayed by altering an odd variable: how clear the text was. By changing the color of the text the statements were printed in, the researchers could alter how difficult it was to read, and that difficulty skewed the participants' judgments.
“How many animals of each type did Moses take on the ark?” Researchers presented this question to participants in two different fonts. An easy-to-read font prompted 88% of people to answer with the intuitive ‘two,’ while a difficult-to-read font led to only 53%—the struggle of interpreting the text caused more people to realize it wasn’t Moses who built the ark, it was Noah.
Why would difficulty of processing the information have anything to do with our confidence in the truth of the statement? This plays into something called the fluency illusion. The easier some information is to process, the better it makes us feel, and the more we judge it as truthful and beautiful. The more difficult it is, the more critical we are of it.
Technology hasn’t made our confidence any more realistic. The internet holds a wealth of information, and we can learn almost anything through it. We certainly use it to find answers to all our niggling questions. And yet, if anything, the internet seems to be having the opposite effect.
For starters, getting all our answers from Google doesn’t help us remember them. In fact, we forget them fairly quickly, and to make matters worse, we don’t notice that we forget them—we mistake Google’s knowledge for our own.
“If you don’t know the answer to a question, it’s very apparent to you that you don’t know, and it takes time and effort to find the answer. With the Internet, the lines become blurry between what you know and what you think you know.”—Matthew Fisher
When we do search for answers, we are likely to search for things that confirm our preconceptions. We are also likely to arrive at these answers quickly and easily, which, as we established with the fluency illusion, can cause us to place greater trust in them.
Couple this with the echo chambers we’re forming through social media. Every like, share, and comment provides information on our interests, which informs these networks regarding what we should be shown in the future to maximize engagement.
We end up creating a social feed of curated information. We’re shown what we’re deemed to like and agree with, while being distanced from opposing views. Our biases then come to be reinforced as the online community disperses into little bubbles of support and agreement.
“The global village that was once the internet has been replaced by digital islands of isolation that are drifting further apart each day. … Without realizing it, we develop tunnel vision. Rarely will our Facebook comfort zones expose us to opposing views, and as a result we eventually become victims to our own biases.”—WIRED
Most young adults get their facts from Google and news from social media. The result? We think we know more than we do, we trust our intuition to know right from wrong, we think most people share our opinion, and we think any opposing views come from a less-informed minority.
Keeping our feet on the ground
Over half of people think they’re above-average drivers, cooks, and lovers. People still believe the world is flat, that global warming isn’t a thing, that lemon juice can make us invisible. We have far more online arguments and abuse than we do discussion and careful analysis.
If we all stay in our online bubbles and place too much trust in our intuition, these issues will not get resolved. Unfortunately, it is unpleasant to question our beliefs or to engage with views that seem so opposed to our own. So how can we fix it?
It is easy to point the finger, to see this as being everyone else’s problem. We are the educated people who read articles on overconfidence, we are the ones that keep ourselves balanced and true by carefully analyzing information from multiple perspectives. It’s the others that are so wrong.
But that’s exactly the type of overconfidence we’re talking about. We cannot sit on our high horse, condemning the unintelligent. We cannot solve the problem by telling other people to learn more or doubt themselves more—if anything, this will make things worse. Instead, we need to treat this as our issue, yours and mine. We need to set the example.
There are a few ways we can keep ourselves in check.
1. Keep an adjustment journal: note down whenever you’re wrong, your expectations are broken, or you’re surprised by some information. This will help you see the common errors and biases in your judgment.
2. Play the devil’s advocate: try to write an argument against your own ideas and assumptions to see how well you really understand the opposing arguments.
3. Create an opposing social media account: follow people who think differently from you, so that you can expose yourself to different bubbles and media expressing other ideas.
4. Question your intuitions: whenever an answer just pops into your head, examine its source, pick it apart, and use the slower, deliberate part of your brain to test it.
When it comes to other people, we might try to encourage them to follow the same tactics. But even if they don’t, we should be able to debate (or argue) with them from a more informed perspective. Quite often when someone can see that we have taken the time to delve into their views and ideas, they will be more open to hearing what we have to say.
I believe the best way to change someone’s mind is to let them discover the truth in their own way. That makes it a decidedly personal journey, but it can be coaxed along through good questioning. Don’t respond with your opinion; ask people why they think something, why they don’t think something else, and try to lead them toward an area of doubt, somewhere their knowledge is obviously lacking, so they can bear witness to their own overconfidence.
If we fail to convince anyone, at least we will feel—and rightly so—that we have adjusted our confidence in our own knowledge to a more accurate depiction of reality. In the end, it should first be about aligning our confidence and knowledge before we go off trying to correct everyone else.