The Cognitive Biases That Make Us Terrible People

One of the things we need most right now, as a society, is the willingness to say “I don’t know.” The ability to admit that we aren’t experts on every issue, to recognise that our feelings are often wrong, and to be willing to hear new evidence instead of sticking our fingers in our ears.

Why is this so hard? Because we always think we’re right. In the face of our obvious correctness, under the weight of all of the evidence we feel that we have, how can any other argument possibly hope to stand? And everybody else thinks the same, and nobody ever agrees on anything.

Mark Manson brings us a deep dive on some of the most common biases which lead to this problem. He sums up the entire problem so perfectly in his introduction that I was hooked:

I know what you’re thinking. You’re thinking, “How are there so many idiots in the world who can’t seem to see what is right in front of them?” You’re thinking, “Why do *I* seem to be blessed (or cursed) with the ability to see truth through a torrential downpour of bullshit?” You’re thinking “What can I do to make people understand? How can I make them see what I see?”

I know you think this because everyone thinks this. The perception that we understand life in a way that nobody else does is an inherent facet of our psychology. That disconnect we feel is universal.

Discourse is truly a “be the change you want to see in the world” issue. If we want other people to think more deeply, we have to think more deeply. If we want others to listen more, we have to be willing to listen. If we want our perspective to be understood, we have to recognise that there are perspectives out there other than our own.

Understanding Unconscious Bias

Most of us have a great deal of confidence in our subconscious mind’s ability to make decisions. And rightly so. Every day we subconsciously make hundreds, if not thousands, of decisions which keep us safe, help us to navigate our relationships, and keep us happy.

But there’s one area where it lets us down more often than we realise: rational thinking:

The unconscious mind is amazing. It can process vastly more information than our conscious mind by using shortcuts based on our background, cultural environment and personal experiences to make almost instantaneous decisions about everything around us.

The snag is, it’s wrong quite a lot of the time. Especially on matters that need rational thinking.

This video from the Royal Society explores how our subconscious can lead us to judge people and situations unfairly. As it highlights, all of us are guilty of this; it’s a normal aspect of human psychology. But we can counteract it by becoming more aware of its existence and its influence on our decision making, by resisting the temptation to make snap judgements, by questioning the reasoning behind our decisions, and by questioning cultural stereotypes.

Again, there’s nothing wrong with the fact that our brains do this any more than there’s something wrong with the fact that our taste buds prefer ice cream to vegetables. What matters is that we engage our rational minds, so that we can make healthy decisions more often.

"Meditation isn’t an escape from life. It’s an encounter with it."
Which are deadlier: horses or sharks?

In a fairly short period of time, the term “fake news” has worked its way into the public consciousness. And whatever you might think about the man who popularised the term, there’s no escaping the fact that a lot of news is fake.

This isn’t to say that we’re being fed outright lies (although who can say for sure), but that the news media is incentivised to present the world to us in the most emotive, divisive, and incendiary terms possible. The more strongly our emotions are triggered, whether they be sympathy, disgust or, preferably, anger, the more likely we are to share the article or video, bringing them more of those sweet, sweet advertising bucks.

But this is bad news for us. The more common this becomes, the less closely our view of the world matches up with reality. We start to believe that everybody is out to get us, or that violence and danger are everywhere, or that our political opponents are pure evil.

None of this is true. In fact, by pretty much any measure you care to choose, the world is a happier, safer, kinder and fairer place than it has been at any point in human history. It’s because we’re fed such a steady diet of bad news, while being positively starved of good news, that we feel as if everything is terrible. Here’s a quote from the video above on the subject:

Unfortunately we remember some things much more easily than others, so this “availability heuristic” can lead us to make really bad assessments of risk.

Things that are more commonly and vividly shown in the media, like a violent murder, will be a lot easier to remember than things that are far more common in real life but aren’t as dramatic, like a cancer patient dying peacefully at home.

So what can we do about it? Sadly, not much. The news has been playing into this feature of human psychology for a long time, and probably isn’t about to stop any time soon. But we can go some way to protecting our sanity simply by being aware of the way our minds work. Simply by recognising that our brains are more likely to hold onto and believe bad news, we become a little more suspicious of the most pessimistic voices in our heads. We can remind ourselves that the information we see on our phones is just a tiny, negatively biased sliver of what’s actually going on in the world. And maybe we can remember to watch a little less bad news.
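To make the availability heuristic a little more concrete, here’s a minimal sketch. Every death rate and “media weight” below is invented purely for illustration; the point is what happens when risk is estimated from remembered headlines rather than from actual frequencies:

```python
import random

# All figures are made up for illustration: rough annual deaths per
# million people, plus an invented "media weight" for how much coverage
# each cause attracts relative to how often it actually happens.
causes = {
    "heart disease": {"true_rate": 1650, "media_weight": 1},
    "cancer":        {"true_rate": 1500, "media_weight": 2},
    "car accidents": {"true_rate": 110,  "media_weight": 20},
    "homicide":      {"true_rate": 50,   "media_weight": 100},
    "shark attacks": {"true_rate": 0.01, "media_weight": 500},
}

def perceived_risk(causes, headlines=10_000):
    """Estimate risk the way the availability heuristic does: by
    counting how often each cause turns up in remembered headlines."""
    names = list(causes)
    coverage = [causes[c]["true_rate"] * causes[c]["media_weight"] for c in names]
    sample = random.choices(names, weights=coverage, k=headlines)
    return {name: sample.count(name) / headlines for name in names}

perceived = perceived_risk(causes)
total = sum(c["true_rate"] for c in causes.values())
for name in causes:
    actual = causes[name]["true_rate"] / total
    print(f"{name:14} perceived {perceived[name]:7.2%}  actual {actual:7.2%}")
```

Run it and the rare-but-dramatic causes rocket up the perceived rankings, while the things most likely to actually kill us sink towards the bottom.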

Better Reasoning For A Storytelling Species

Humans understand the world through stories. It’s hardwired into us in the same way as our sense of fairness or our tendency to see familiar objects in random shapes like clouds.

But unlike seeing a cloud that looks like a bunny rabbit, our love of stories isn’t always harmless. The simplicity of stories works in direct opposition to the complexity of reasoning. Stories, as wonderful as they are, don’t often capture the nuance of real life. In stories there is a good guy and a bad guy; a beginning, a middle and an end; happy endings and deserved punishments. Life is rarely so cooperative.

But thankfully, the human brain is remarkably adaptable. Even if we can’t stop ourselves from processing information as stories, we can guard against some of the errors this tendency produces. This article in Areo magazine aims to do just that:

Our heuristics—the rules of thumb we use in our reasoning—can create and maintain wonderful stories, at the cost of nuance and accuracy: such as when we assign more blame to those we dislike, and more merit to those we like, creating a clear separation between the good guys and the bad guys; when we feel that rhymes imply truth; assess ambiguous information as confirming our preconceived narratives, and evaluate arguments as if they were stories—on the basis of their believability, instead of their logical structures and evidential bases.

Overcoming these tendencies is a crucial step on the path towards intellectual progress, since reality has shown, time and time again, that it is under no obligation to conform to the stories we tell about it.

5 Tips To Improve Your Critical Thinking.

More than at any time in history we need to be able to think critically. The Information Age, rather than making decisions easier by offering us more and better information, has made them harder by blurring the line between trustworthy and unreliable sources. Not to mention that the sheer volume of information we’re bombarded with every day is overwhelming.

Nonetheless, we still need to make choices. And with so many important decisions to be made about health and politics at the moment, as well as the usual financial and life decisions we have to worry about, the video above about how to improve our critical thinking skills feels like it couldn’t have come at a better time.

Rather than choosing an answer because it feels right, a person who uses critical thinking subjects all available options to scrutiny and scepticism. Using the tools at their disposal, they’ll eliminate everything but the most useful and reliable information.

Why Do We Make Irrational Decisions?

Why do we make irrational decisions, sometimes even when we know they’re irrational? Well, it turns out they’re only irrational if you overlook the mechanics behind the decision-making process. Not only are a lot of our decisions heavily influenced by unconscious bias, our emotions make us weigh identical outcomes differently depending on whether we feel like we’re winning or losing:

Under rational economic theory, our decisions should follow a simple mathematical equation that weighs the level of risk against the amount at stake. But studies have found that for many people, the negative psychological impact we feel from losing something is about twice as strong as the positive impact of gaining the same thing….

As convenient as it would be if we could weigh up our options as calm, reasonable human beings, the fact is, it’s quite rare that we make decisions this way. But understanding the mechanisms at work increases our ability to avoid the traps our natural irrationality creates.
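To see what that asymmetry does to a decision, here’s a minimal sketch, assuming the roughly two-to-one loss weighting from the quote above and leaving out prospect theory’s other components (the curved value function and probability weighting):

```python
def expected_value(outcomes):
    """Rational-economics valuation: probability times amount."""
    return sum(p * x for p, x in outcomes)

def felt_value(outcomes, loss_aversion=2.0):
    """Loss-averse valuation: losses weigh roughly twice as much as
    equivalent gains. A simplified sketch, not full prospect theory."""
    return sum(p * (x if x >= 0 else loss_aversion * x) for p, x in outcomes)

# A fair coin flip: win $100 on heads, lose $100 on tails.
gamble = [(0.5, 100), (0.5, -100)]
print(expected_value(gamble))  #   0.0 -> a rational agent is indifferent
print(felt_value(gamble))      # -50.0 -> feels like a losing bet, so we refuse
```

Which is one way of explaining why most of us turn down a fair coin flip unless the potential win is roughly twice the potential loss.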

Curbing implicit bias

Implicit bias is a difficult topic to talk about. After all, it’s one thing to recognise cognitive biases such as survivorship bias or availability bias, where the flaw in thinking is largely a statistical error, and another to accept that our biases genuinely affect the way we treat other human beings.

Knowable Magazine discusses the problem of implicit bias with social psychologist Anthony Greenwald. It’s a fascinating insight into one of the more unfortunate aspects of our minds:

A quarter-century ago, social psychologist Anthony Greenwald of the University of Washington developed a test that exposed an uncomfortable aspect of the human mind: People have deep-seated biases of which they are completely unaware. And these hidden attitudes — known as implicit bias — influence the way we act toward each other, often with unintended discriminatory consequences.
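To give a feel for how a test like this works: the Implicit Association Test infers bias from how much slower we are to sort words and images under pairings that clash with our associations. Here’s a heavily simplified sketch; the reaction times are invented, and the published scoring procedure adds error penalties, trial filtering and block structure that this leaves out:

```python
import statistics

def iat_style_score(congruent_ms, incongruent_ms):
    """Standardised slowdown on 'clashing' pairings versus 'easy' ones.
    A positive score suggests an implicit association. This is only the
    core idea, not the actual published IAT scoring algorithm."""
    slowdown = statistics.mean(incongruent_ms) - statistics.mean(congruent_ms)
    pooled_sd = statistics.stdev(congruent_ms + incongruent_ms)
    return slowdown / pooled_sd

# Invented reaction times (milliseconds) for one respondent.
congruent = [612, 580, 645, 598, 633, 571, 604, 590]    # pairings that fit
incongruent = [745, 812, 768, 790, 725, 801, 759, 784]  # pairings that clash
print(f"score: {iat_style_score(congruent, incongruent):.2f}")
```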

Are You Missing Something?

A couple of months ago I watched this video about a preacher named Gerald Glenn. Gerald believed that the coronavirus was nothing to worry about. He went as far as to advise his parishioners, and even his children, to ignore social distancing measures, saying that though the virus was out there, God was out there too. Gerald tested positive for coronavirus shortly after the video was made, and died around a week later.

I’ve been thinking about it ever since. What makes somebody, especially a layperson, feel so secure in their opinion about a pandemic which has proven to be dangerous, that they would advise their children not to take it seriously?

Of course, the easy answer here is faith. He was a man of God. But since then we’ve seen countless examples of people failing to take the pandemic seriously. Some of whom have paid the same price. Did they all imagine that God would protect them? Or was there something else behind their complacency?

In his excellent book, “Thinking, Fast and Slow”, Daniel Kahneman says this about the mind’s tendency towards bias:

…the focus on error does not denigrate human intelligence, any more than the attention to diseases in medical texts denies good health. Most of us are healthy most of the time, and most of our judgements and actions are appropriate most of the time.

Of course, this is true. We all make countless decisions every day about people’s emotional state, about which of numerous courses of action to take, about how dangerous a situation is, and for the most part, we get it right.

The problem today is that many of our decisions aren’t simply at this local, personal level. Today we’re required to take in vast amounts of complex, often conflicting information, and come to conclusions which don’t just affect our immediate circumstances, but the lives of those around us.

In the age of coronavirus, we’re all making decisions about whether to wear masks and under what circumstances. Soon, we’ll be deciding whether to take a vaccine which has been developed far more quickly than usual, but is also our best hope for protecting lives and returning to some degree of normalcy. We need to decide which voices to listen to and which to ignore, about issues which we don’t have a good grasp on.

When making decisions such as these, it’s vital to recognise that our gut instincts can be wrong, because the price of being wrong might be very high indeed.

In the interest of encouraging a healthy scepticism of our gut instincts, the video above introduces the concept of survivorship bias. Survivorship bias describes our tendency to focus on the people or things that succeeded in overcoming an obstacle, and to overlook those (usually a much greater number) that didn’t.

It’s the reason why we listen to the advice of celebrities who tell us that the key to success is to “work hard and believe in yourself”, even though countless people who did both failed to achieve the same results because they weren’t as lucky.

In a situation like a global pandemic, it’s tempting to take the threat lightly if it doesn’t affect you personally. In this case, you’re one of the lucky people who avoided infection, and so believe that the things you’ve done are obviously sufficient to minimise the risks of becoming sick. But this overlooks the millions of people who became sick even though they behaved similarly to you but weren’t as lucky.
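A quick simulation makes the trap visible. In the sketch below (every rate in it is made up), success is pure luck and following the advice does nothing at all, yet a survey of the survivors would still conclude that the advice works:

```python
import random

def survivorship_survey(n=1_000_000, follows_advice=0.9, luck=0.001):
    """Success here is pure luck; following the advice changes nothing.
    Every rate is made up for illustration."""
    people = [(random.random() < follows_advice, random.random() < luck)
              for _ in range(n)]
    survivors = [followed for followed, succeeded in people if succeeded]
    followers = [succeeded for followed, succeeded in people if followed]
    # The survey we actually see: only the successful get interviewed.
    print(f"Survivors who followed the advice: {sum(survivors) / len(survivors):.0%}")
    # The denominator we never see: everyone who followed the advice.
    print(f"Followers who actually succeeded:  {sum(followers) / len(followers):.2%}")

survivorship_survey()
# Survivors who followed the advice: 90%
# Followers who actually succeeded:  0.10%
```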

I’m not trying to tell anybody what to do. Frankly, nobody who isn’t an expert in epidemiology has any business telling anybody what to do. We’re all in the dark on this. But our gut feelings are unlikely to be a reliable guide on how best to keep ourselves and others safe. We’re all decent people. None of us wants to put other people at risk needlessly. When making decisions about how to deal with this problem, let’s keep those feelings at the forefront of our minds.

The struggle against inevitable bias.

In an age where we’re bombarded with more information than ever before, our ability to process it accurately is crucial. Unfortunately, we’re not very good at it.

This problem is made worse by the fact that we feel like we’re good at it, which is why it’s so important to be conscious of our biases, so that we’re not so easily duped by them. This isn’t a problem that can be solved by intelligence either; in fact, more intelligent people are often more vulnerable to their biases, as they’re able to construct stronger arguments to support them.

Here, Adam Wakeling writes for Quillette about how bias affects the best and brightest minds on both sides of the political aisle. Even seemingly unrelated skills like numeracy are affected:

In another study conducted by Yale Law School, subjects were asked their political views and given a short numeracy test. They were then divided into groups and asked to interpret the results of a fictional study. When the study dealt with the efficacy of a skin cream, those subjects who had the best results in the numeracy test understood the study’s results best. But when the study related to the effect of gun control on crime, ideology and partisan affiliation played a much stronger role. Self-identified conservative Republicans struggled to correctly interpret results which suggested that gun control reduced crime, while self-identified liberal Democrats were equally stumped by results suggesting it increased it. They didn’t challenge the results or complain about them—they just couldn’t make the sums work.

Among those in the top 90th percentile for numeracy, 75 percent of people got the answer right for the skin cream question, but only 57 percent for the gun control question. In fact, people who were good at maths often did worse than those without a bent for numbers. The experiment seemed to vindicate Michael Shermer’s maxim that “smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.” In “Notes on Nationalism,” Orwell noted that some of the best-educated embraced some of the most bizarre ideas. “One has to belong to the intelligentsia to believe things like that,” he wrote, after describing some 1940s-era conspiracy theories. “No ordinary man could be such a fool.”
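For context, the reasoning task in that study looks something like this. The counts below are illustrative, chosen so that, as in the published problem, the intuitive answer and the correct answer point in opposite directions:

```python
# Patients whose rash improved or got worse, with and without the cream.
results = {
    "used cream": {"improved": 223, "got worse": 75},
    "no cream":   {"improved": 107, "got worse": 21},
}

for group, counts in results.items():
    total = counts["improved"] + counts["got worse"]
    print(f"{group}: {counts['improved'] / total:.0%} improved")

# used cream: 75% improved
# no cream: 84% improved
#
# The intuitive (wrong) read compares raw counts: 223 improvers beats
# 107, so the cream "works". The correct read compares proportions:
# the untreated group did better, so the cream appears to be harmful.
```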

The case against using memories as evidence.

To say that it’s tempting to use memory as the basis for our arguments feels like taking understatement to the level of high art. How else are we supposed to draw conclusions? Are we just supposed to overlook the information we have right there in our heads?!

Actually, there’s an argument for suggesting just that. Of course, I’m not suggesting that we stop trusting the contents of our minds, that way madness lies, but there’s no doubt that we depend too heavily on a faculty that we know is heavily vulnerable to manipulation, bias, and just plain error.

In this article for Knowable Magazine, Chris Woolston explores the research of psychologist Elizabeth Loftus. Loftus has spent over 40 years pioneering research into the fallibility of memory. Her research has even changed the way that judges, lawyers and juries interpret the recollections of witnesses:

One 1975 Loftus paper that Roediger discusses with his students shows that memories of fast-moving events can be easily manipulated. Subjects in the studies watched film clips of car accidents and then answered questions about what they saw, and Loftus showed that when misleading details were slipped into the questions, the subjects often later remembered things that weren’t there. For example, when asked, “Did you see the children getting on the school bus?” 26 percent of respondents said they had, in fact, seen a school bus in the film, even though there wasn’t one. Of those given no misleading prompt, only 6 percent remembered a bus.

In another classic experiment, students watching film clips of car accidents were asked to estimate the speed of the cars involved. Loftus showed that the wording of follow-up questions had a big effect on the answers. When asked how fast cars were going when they “smashed” each other, the average answer was more than 40 miles per hour. When asked about the cars’ speed when they “contacted” each other, the average answer was just over 30 mph.

Memory is one of the tools we have at our disposal, but shouldn’t be treated as the only tool, or the seat of a personal truth which nobody has the right to dispute. History has shown us that reason, logic and consensus aren’t guaranteed to protect us from making mistakes either, but if we take them as seriously as we take our memories, we give ourselves a chance of getting things right more often.