When you turn on your new computer, what do you do? You play around, figure out how it works, and get to know it. You figure out what you like and don’t like. Importantly, you likely personalize it with a new background, maybe even a name.
You go on to log into important websites and, on most occasions, hit “save password on this device.” You sync them across your other devices so your preferences carry over.
Over time you develop a hierarchical system of folders that come to represent a great deal of your life, even if you hate to admit it. To save space you keep all your personal photos on it, home videos, your journal and school notes. You install specific programs that help you with specific tasks, and that you would be screwed without.
Now you turn to your computer as a trusted confidant and problem solver, one more ubiquitous than ever, more reliable at remembering than we are, and vaster in its knowledge base than anything the world has ever seen.
Mind + 1
The very act of creating tools is an act of improving upon our own capacities. We buy hammers because we cannot—most of us at least—drive a nail into a thick surface with our bare hands; we built shields because our skin is easily penetrated by sharp weapons—which themselves were created to improve upon the deadliness of our limbs. From cars to snowboards, cutlery to TV remotes, so much of the world around us is there to improve and extend upon our bodies.
It’s not only the body we can extend—our minds, too, have been augmented for ages. We drew on cave walls, wrote books, and took pictures either because our memories are not as clear or reliable, or to improve our ability to impart wisdom to others. Calculators extend our mathematical abilities, clocks improve our timing, and todo lists help us organize and prioritize our thoughts.
That little device in your pocket may be the greatest extension yet. Not only does it have a calculator, camera, and todo list app, it has answers. Whatever question we have, our device is there for us. No need to think for ourselves, just open the browser and type.
Wants > Needs
A good tool should be reliable. The hammer shouldn’t bend, the calculator shouldn’t guess, and the internet shouldn’t lie. Yet it does. Unlike other tools, the internet has a mind of its own. We don’t only use it, it uses us.
When you type a question into Google, what does Google want to do? Answer the question? Sure, but in what way? Objectively? With multiple perspectives? Nope. It wants to answer in a way that appeals to you or benefits it. If you have a slight bias, and have given that away through previous searches, you will be shown answers in line with that bias, because that’s more likely to get you to click.
It’s the classic tale of tell me the truth vs. tell me what I want to hear, and the scales are tipping toward the latter.
If, like many others, you tend to get your news from Facebook, the effect is in full swing. Your newsfeed is personalized around your interests, and therefore your biases. You see what Facebook thinks you want to see, not what you need to see. If you believe the sun circles the Earth, it’s less likely you’ll see information stating otherwise.
You come to exist in an ideological bubble. By telling the internet what you like, it not only knows what you want to see, it knows what you likely don’t want to see. It’s a reinforcing cycle, what some call an echo chamber—like a post, Facebook updates your feed to show you more of that, you like more of it, and so more of it comes.
“The Newsfeed algorithm is a super-optimized gratification machine, observing what types of content you enjoy, and then curating that content for maximum user engagement, regardless of whether that content is true or false. … Users are shielded from stories the algorithm determines they won’t enjoy.”
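The reinforcing cycle the quote describes is easy to model. Below is a minimal sketch—not Facebook’s actual algorithm, which is proprietary; the topic names, click rates, and allocation rule are all invented for illustration—of how a feed that hands out slots in proportion to past clicks amplifies even a mild initial preference:

```python
def simulate_feed(rounds=200):
    """Toy expected-value model of an engagement-optimized feed.

    The user has only a mild preference: they click topic A 70% of
    the time it appears, topic B 30%. But the feed allocates slots in
    proportion to accumulated clicks, so the gap widens every round.
    """
    click_rate = {"A": 0.7, "B": 0.3}   # user's fixed preference (assumed)
    clicks = {"A": 1.0, "B": 1.0}       # start from a near-neutral feed
    shown = {"A": 0.0, "B": 0.0}
    for _ in range(rounds):
        total = clicks["A"] + clicks["B"]
        for topic in ("A", "B"):
            slot_share = clicks[topic] / total  # feed's allocation rule
            shown[topic] += slot_share
            clicks[topic] += slot_share * click_rate[topic]
    return shown

feed = simulate_feed()
print(f"share of feed: A={feed['A']/200:.0%}, B={feed['B']/200:.0%}")
```

Even though the user’s taste never changes, topic A steadily crowds out topic B: each click buys more exposure, which buys more clicks. That is the echo chamber in miniature.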
This past election was a prime example. Facebook is under fire for allowing fake news to propagate throughout its feeds. This wasn’t simply news from the other side, it was fake, made up, not true of either party. Yet people believed it, shared it, and used it to reinforce current biases.
Worryingly, research has found that 61% of millennials use Facebook as their primary source for news about politics.
“The global village that was once the internet has been replaced by digital islands of isolation that are drifting further apart each day. … Without realizing it, we develop tunnel vision. Rarely will our Facebook comfort zones expose us to opposing views, and as a result we eventually become victims to our own biases.”
Truth – 1
From Facebook’s perspective this is a difficult line to tread. Its goal is to show you more of what you like; I doubt it wants to wade into the muddy waters of defining what is true and enforcing a standard on what people share.
People want free speech. However, if what’s said is fake—beyond reasonable doubt—and if it can influence people—or at least maintain their ignorance—should we still allow it? What’s more, if Facebook were to filter items in some way, would we trust them not to inject their own biases, knowingly or otherwise?
We shouldn’t only be cautious of what we read on Google or Facebook. Any blogger with a professional-looking website can write whatever they want. Companies can produce scientific papers spun in a way to support whatever product they’re selling. Even popular news sources can carry sponsored content, which is sometimes difficult to tell apart from real news.
“An explanation of climate change from a Nobel Prize-winning physicist looks exactly the same on your Facebook page as the denial of climate change by somebody on the Koch brothers payroll.”
Researchers from Stanford sought to examine how critically participants evaluated online content and its truthfulness. Participants were shown images of websites, advertisements, and articles, and asked to describe which they found more believable and why. Things did not go so well.
“Nearly 70% argued that the Shell article was more reliable because it provided more data and information about the problem. … In contrast, only about 15% of respondents wrote that the article from the ‘Science’ section was more reliable than a sponsored post by Shell.”
It takes time to check sources. Heck, we don’t even have the time to read beyond a headline. How are we going to know what to believe? It would be naive to believe everything we read, but if what we read has been put there because we believe it already, or are likely to, stopping ourselves from blindly accepting it can be difficult.
Content designed to mislead or coerce is becoming harder to spot, hidden in places we have come to trust. Worse, we’re going with the flow: in many cases accepting it, hardly ever critically analyzing it, and even less often stepping outside our comfort zone to examine another perspective.
Now more than ever, critical thinking and media literacy skills are a must. We need to be trained in the methods of information examination, so that intuitive red flags pop up in our minds whenever a source or article is only feigning legitimacy.
We shouldn’t hope for Facebook or Google to solve the problem; this is our fight, and it starts with educating ourselves and alerting each other to misinformation. The internet is an extension of our mind like nothing we have ever seen, but we shouldn’t let it do our thinking for us.
. . .
Check out more in the Digital Brain series here