Modern human beings have inherited a vast treasure trove of hard data, a wealth of knowledge. With it have come the slings and arrows inherent in any such outrageous fortune.
Humankind has been both liberated and burdened by the Information Age. On the one hand, we have all of recorded human history at our fingertips; the accumulation of all our medical, scientific and moral knowledge; the perspectives of other civilizations, past and present.
On the other hand, human nature is still human nature. We don’t see the world as it is; we see the world as we are.
All that knowledge, all those irresistible facts and trivia, still must be strained, reframed and resolved through a series of human biological systems we only vaguely understand.
Our sensory organs and nervous system gather information about the world; our brains, which extend far beyond the mere grey matter in our craniums into our gut biology and beyond, extrapolate and interpret that data.
Add a few layers of enculturation, life experiences, media influence, family heritage, ad infinitum, and one begins to understand how things can seem so complicated.
How do we know what we know?
And given our tendency toward logical fallacy, can we trust ourselves to recognize the truth?
Even what we “see” isn’t as straightforward as we like to think. What you are seeing right now has been helpfully flipped over and otherwise altered by your brain’s microprocessors to help you make sense of it.
The visual input you are receiving even arrives with a hole in the image, a blind spot where the optic nerve exits each retina; your brain fills that hole. Your brain is in fact just guessing about what should appear there, which makes what we see a partial hallucination.
If what you see is suspect, what you “remember” is even more so. Human memory is nothing as straightforward as a video recording that can be played back at will. Not at all.
What you think of as a “memory” is really more a memory of the last time you recalled that particular memory than a dispassionate surveillance video of actual events. Older memories become like a copy of a copy of a copy and are thus degraded over time.
We don’t all think alike either.
It isn’t just that humans disagree about things, even when given the exact same data, though of course we certainly do. Some human minds can conjure up the visual image of a sunset in their mind’s eye; some people are “mind blind” and can’t.
Then there are hundreds of cognitive biases to which we naturally fall victim. We see these play out in large ways and small ways across every aspect of modern society.
Cognitive bias is one reason people believe in conspiracy theories on the internet. We need huge, historical events—like 9/11 or the assassination of President John F. Kennedy—to have a correspondingly huge reason behind them. Our brains don’t like the idea that such events could have been the work of ordinary criminals who aren’t at all special or important.
The banality of evil is something we find harder to accept than a vast criminal conspiracy.
Cognitive bias is at work when someone playing a game of chance falls prey to the gambler’s fallacy, thinking their chances improve the longer they lose. It is at work when we attribute to malice what is just as easily explained by stupidity.
We try mind reading. We predict the future. We are usually terrible at both, yet no matter how often we are proven wrong—the person we thought was mad at us wasn’t thinking about us at all, or that terrible thing we were dreading never happens—we stubbornly cling to our mind reading and our fortune-telling.
We project our own fears, insecurities and issues onto other people. We do this so unconsciously, and so often, that we confuse it with reality. We don’t listen; we jump to conclusions; we act without thinking.
Confirmation bias is driving the divisiveness in American politics and around the world; almost no one wants to read news they don’t agree with anymore. And with so many choices available, they don’t have to.
Confirmation bias causes us to seek out, consume, and remember things that reinforce our existing beliefs, and to forget, ignore, or dismiss things that don’t. Instead of trying to understand the other side of arguments or disagreements, our confirmation bias causes us to dig deeper into our own beloved position of moral certitude, forsaking all others.
This is terribly inconvenient at a time when we could all use a little less certainty from just about everyone, and a great deal more inquiry.
Because the worst logical fallacy to which human beings are prone is our tendency to constantly overestimate our own objectivity and good sense.
The greatest human invention was invention itself. The greatest human blind spot is that we can’t see our own blind spot.
Researchers and behavioral scientists have been studying this phenomenon for decades.
“Across these studies, approximately 85% of participants indicated that they were more objective than the average member of the group from which they were drawn, a state of affairs that cannot possibly be true.” — David Alain Armor, Stanford University, 1999.
Like that hole in our visual field, our blind spot is that we don’t think we have a blind spot at all. Worse, we fill in what we think should be there if we suspect one.
Thinking yourself unbiased, or even less biased than the average person, is itself a logical and moral fallacy.
But there is good news: neuroplasticity.
If you are reading these words right now, it means you are currently in possession of a top-quality, state-of-the-art thinking machine. It is a machine that can change itself; a self-examining, self-diagnosing supercomputer.
The brain can think about itself and actually change itself, physically altering neural pathways and resetting circuits over time. It really is incredible.
So there isn’t any reason to let your cognitive biases get you down. Getting neuroplasticity to work for you is easy.
By cultivating what neuroscientists and psychologists call a “growth” mindset or a “flexible” mindset over a rigid or “fixed” mindset, anyone can learn to look beyond their biases.
Expect to change your mind over time; accept the limits of your own intelligence, perspective and objectivity. Stay open to information and experiences that challenge existing beliefs.
Buddhists might call this flexible mindset a “beginner’s mind”.
“In the beginner’s mind there are many possibilities; in the expert’s mind there are few.” — Shunryu Suzuki
(contributing writer, Brooke Bell)