Reductio ad absurdum might sound like a spell from the Chamber of Secrets, but it actually refers to the practice of testing an argument by pushing it to its logical extremes.
Perhaps the best-known example is that favourite of parents everywhere: “if your friends all decided to jump off a cliff, would you do it too?”
But reductio ad absurdum does have more intellectual applications. In fact, as Daniel Dennett explains in this video, Galileo used reductio ad absurdum to demonstrate that light objects and heavy objects must fall at the same speed (accounting for air resistance) all the way back in the 16th century:
Let’s suppose that heavier things do fall faster than light things. Now take a stone, (A), which is heavier than another stone, (B).
That means, if we tied B to A with a string, B should act as a drag on A when we drop it, because A will fall faster, B will fall slower, and so A tied to B (A+B) should fall slower than A by itself. But A tied to B is heavier than A by itself, so A+B should fall faster.
If lighter objects fall more slowly, A+B should fall both faster and slower than A by itself.
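To make the contradiction explicit, here is a minimal sketch of the argument in code. The weights and the `fall_speed` function are purely illustrative assumptions, not physics; any strictly increasing speed-with-weight function produces the same clash:

```python
# Sketch of Galileo's reductio: assume fall speed increases with weight.
# Weights and the speed function are illustrative, not real physics.

def fall_speed(weight):
    # The assumption under test: heavier objects fall strictly faster.
    return weight  # any strictly increasing function would do

weight_A, weight_B = 10, 5  # stone A is heavier than stone B

speed_A = fall_speed(weight_A)

# Tied together, slower B acts as a drag on faster A,
# so A+B should fall slower than A alone:
speed_combined_as_drag = (fall_speed(weight_A) + fall_speed(weight_B)) / 2
assert speed_combined_as_drag < speed_A

# But A+B is heavier than A, so by the same assumption
# it should fall faster than A alone:
speed_combined_as_heavier = fall_speed(weight_A + weight_B)
assert speed_combined_as_heavier > speed_A

# Both deductions follow from the assumption, yet they demand that the
# same tied pair fall both slower and faster than A. The assumption
# "heavier falls faster" refutes itself, so it must be false.
```

The point is that the contradiction is not an empirical finding but a consequence of the assumption alone, which is exactly what makes it a reductio.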
One of the things we need most right now, as a society, is the willingness to say “I don’t know.” The ability to admit that we aren’t experts on every issue, to recognise that our feelings are often wrong, and to be willing to hear new evidence instead of sticking our fingers in our ears.
Why is this so hard? Because we always think we’re right. In the face of our obvious correctness, under the weight of all of the evidence we feel that we have, how can any other argument possibly hope to stand? And everybody else thinks the same, and nobody ever agrees on anything.
Mark Manson brings us a deep dive on some of the most common biases which lead to this problem. He sums up the entire problem so perfectly in his introduction that I was hooked:
I know what you’re thinking. You’re thinking, “How are there so many idiots in the world who can’t seem to see what is right in front of them?” You’re thinking, “Why do *I* seem to be blessed (or cursed) with the ability to see truth through a torrential downpour of bullshit?” You’re thinking “What can I do to make people understand? How can I make them see what I see?”
I know you think this because everyone thinks this. The perception that we understand life in a way that nobody else does is an inherent facet of our psychology. That disconnect we feel is universal.
Discourse is truly a “be the change you want to see in the world” issue. If we want other people to think more deeply, we have to think more deeply. If we want others to listen more, we have to be willing to listen. If we want our perspective to be understood, we have to recognise that there are perspectives out there other than our own.
"The greatest enemy of knowledge isn’t ignorance, it’s the illusion of knowledge."
Most of us have a great deal of confidence in our subconscious mind’s ability to make decisions. And rightly so. Every day we subconsciously make hundreds, if not thousands, of decisions which keep us safe, help us to navigate our relationships, and keep us happy.
But there’s one area where it lets us down more often than we realise, and that’s rational thinking:
The unconscious mind is amazing. It can process vastly more information than our conscious mind by using shortcuts based on our background, cultural environment and personal experiences to make almost instantaneous decisions about everything around us.
The snag is, it’s wrong quite a lot of the time. Especially on matters that need rational thinking.
This video from the Royal Society explores how our subconscious can lead us to judge people and situations unfairly. As it highlights, all of us are guilty of this; it’s a normal aspect of human psychology. But we can limit its influence by becoming more aware of its existence and its effect on our decision making, by resisting the temptation to make snap judgements, by questioning the reasoning behind our decisions, and by questioning cultural stereotypes.
Again, there’s nothing wrong with the fact that our brains do this any more than there’s something wrong with the fact that our taste buds prefer ice cream to vegetables. What matters is that we engage our rational minds, so that we can make healthy decisions more often.
Sometimes the solution to a problem is so obvious that your brain just stops working. Asking it to continue to look for a solution is frankly an insult to its information processing abilities. This is why it’s sometimes so easy to be wrong whilst being convinced you’re right.
In fact, that’s exactly what happens to the people in this video. They’re approached by a man with a simple three-number sequence (2, 4, 8), and then asked to figure out what rule he is using to generate the sequence. He won’t give them clues to help them get the answer, but they can give him a three-number sequence of their own, and he’ll tell them whether their sequence fits his rule or not.
Like you probably have, they come up with a rule fairly quickly, but it’s not the rule the man has in mind. What happens next is interesting. Instead of proposing sequences which don’t fit their previous rule, but might give them insight into his, they just keep proposing sequences that fit their assumption, even though he’s told them this isn’t correct.
They get stuck on their original idea so completely that they become incapable of looking for a different solution (even though the actual solution is very easy to find). Here’s how he summed it up with the people he approached once they’d figured it out (or given up):
I was looking for you guys to propose a set of numbers that didn’t follow your rule, and didn’t follow my rule [Ed: I think he meant to say “and might have followed my rule” here]. I was looking for you guys not to try to confirm what you believed…You’re always asking something where you expect the answer to be yes, right? You wanna get the “no” because that’s much more informational for you than the “yes”.
In all kinds of spheres, it’s tempting to stick with the first answer that feels right and defend it rather than putting it aside and looking for others. But as this simple little experiment shows, sometimes that’s the only way to see the other answers that are staring you in the face.
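The experiment’s logic can be sketched in a few lines of code. Note the hidden rule used here (“any increasing sequence”) and the “doubling” hypothesis are assumptions for illustration; the point is the testing strategy, not the specific rules:

```python
# Sketch of the number-rule game: why confirming guesses can't
# distinguish a wrong hypothesis from the real rule.

def hidden_rule(seq):
    # Assumed hidden rule for illustration: numbers strictly increase.
    return seq[0] < seq[1] < seq[2]

def my_hypothesis(seq):
    # The rule most people jump to after seeing 2, 4, 8: doubling.
    return seq[1] == 2 * seq[0] and seq[2] == 2 * seq[1]

# Confirmation strategy: only propose sequences that fit my hypothesis.
for seq in [(1, 2, 4), (3, 6, 12), (10, 20, 40)]:
    # Every test comes back "yes", so the wrong hypothesis survives.
    assert my_hypothesis(seq) and hidden_rule(seq)

# Disconfirmation strategy: propose a sequence that should FAIL my rule.
probe = (1, 2, 3)
assert not my_hypothesis(probe)  # violates "doubling"...
assert hidden_rule(probe)        # ...yet the answer is still "yes"
# That single "yes" to a non-doubling sequence falsifies the doubling
# hypothesis and points toward the real, broader rule.
```

Every doubling sequence is also an increasing one, which is why “yes” answers alone can never separate the two rules; only a sequence designed to get a “no” carries new information.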
In a sense, you have two brains. There’s the brain that handles intuitive tasks. It reads facial expressions and tone of voice. It processes information very quickly but as a result, sometimes does so imperfectly. This makes it useful in situations which require immediate action and improvisation.
Your other brain handles more complex situations. It does long division, and figures out how to solve unfamiliar problems. It works much more slowly, but is much more precise. It handles problems that require consideration and effortful thought.
Your brains make an excellent team, but the problem is that they don’t always stay in their lanes. Your intuitive brain tries to handle tasks that require time and deliberation, and your reasoning brain tries to interfere in tasks that should be handled intuitively and instinctively.
It’s fair to say that the healthier the balance between the two aspects of your brain, the more reasonable and logical a person you’re likely to be, which is why the video above is so interesting. It tests your ability to use your reasoning brain in a situation where your intuitive brain is tempted to take over. Give it a go and see how logical you are.
August 22, 2020 7:17 pm - Steve Peters
If you didn’t already know that there was a blind spot in your visual field it would be hard to convince you it was there. Even after being shown how to find it, it’s still difficult to believe that anything is missing. Our brains fill in the blanks so seamlessly that we don’t notice the…continue reading on Medium…
Humans understand the world through stories. It’s hardwired into us in the same way as our sense of fairness or our tendency to see familiar objects in random shapes like clouds.
But unlike seeing a cloud that looks like a bunny rabbit, our love of stories isn’t always harmless. The simplicity of stories works in direct opposition to the complexity of reasoning. Stories, as wonderful as they are, don’t often capture the nuance of real life. In stories there is a good guy and a bad guy, a beginning, middle and end, there are happy endings and deserved punishments. Life is rarely so cooperative.
But thankfully, the human brain is remarkably adaptable. Even if we can’t stop ourselves from processing information as stories, we can guard against some of the errors this tendency produces. This article in Areo Magazine aims to do just that:
Our heuristics—the rules of thumb we use in our reasoning—can create and maintain wonderful stories, at the cost of nuance and accuracy: such as when we assign more blame to those we dislike, and more merit to those we like, creating a clear separation between the good guys and the bad guys; when we feel that rhymes imply truth; when we assess ambiguous information as confirming our preconceived narratives; and when we evaluate arguments as if they were stories—on the basis of their believability, instead of their logical structures and evidential bases.
Overcoming these tendencies is a crucial step on the path towards intellectual progress, since reality has shown, time and time again, that it is under no obligation to conform to the stories we tell about it.
More than at any time in history we need to be able to think critically. The Information Age, rather than making decisions easier by offering us more and better information, has made them harder by blurring the line between trustworthy and unreliable sources. Not to mention that the sheer volume of information we’re bombarded with every day is overwhelming.
Nonetheless, we still need to make choices. And with so many important decisions to be made about health and politics at the moment, as well as the usual financial and life decisions we have to worry about, the video above about how to improve our critical thinking skills feels like it couldn’t have come at a better time.
Rather than choosing an answer because it feels right, a person who uses critical thinking subjects all available options to scrutiny and scepticism. Using the tools at their disposal, they’ll eliminate everything but the most useful and reliable information.
We could all use a little wisdom now and then, so 68 pieces feels like we’re being spoiled. In honour of his 68th birthday, Kevin Kelly brings us 68 morsels of wisdom that he’s picked up over the years.
It’s positively brimming with things I wish I’d known 20 years ago, like:
Learn how to learn from those you disagree with, or even offend you. See if you can find the truth in what they believe.
Everyone is shy. Other people are waiting for you to introduce yourself to them, they are waiting for you to send them an email, they are waiting for you to ask them on a date. Go ahead.
To say that it’s tempting to use memory as the basis for our arguments feels like taking understatement to the level of high art. How else are we supposed to draw conclusions? Are we just supposed to overlook the information we have right there in our heads?!
Actually, there’s an argument for suggesting just that. Of course, I’m not suggesting that we stop trusting the contents of our minds, that way madness lies, but there’s no doubt that we depend too heavily on a faculty that we know is heavily vulnerable to manipulation, bias, and just plain error.
In this article for Knowable Magazine, Chris Woolston explores the research of psychologist Elizabeth Loftus. Loftus has spent over 40 years pioneering research into the fallibility of memory. Her research has even changed the way that judges, lawyers and juries interpret the recollections of witnesses:
One 1975 Loftus paper that Roediger discusses with his students shows that memories of fast-moving events can be easily manipulated. Subjects in the studies watched film clips of car accidents and then answered questions about what they saw, and Loftus showed that when misleading details were slipped into the questions, the subjects often later remembered things that weren’t there. For example, when asked, “Did you see the children getting on the school bus?” 26 percent of respondents said they had, in fact, seen a school bus in the film, even though there wasn’t one. Of those given no misleading prompt, only 6 percent remembered a bus.
In another classic experiment, students watching film clips of car accidents were asked to estimate the speed of the cars involved. Loftus showed that the wording of follow-up questions had a big effect on the answers. When asked how fast cars were going when they “smashed” each other, the average answer was more than 40 miles per hour. When asked about the cars’ speed when they “contacted” each other, the average answer was just over 30 mph.
Memory is one of the tools we have at our disposal, but shouldn’t be treated as the only tool, or the seat of a personal truth which nobody has the right to dispute. History has shown us that reason, logic and consensus aren’t guaranteed to protect us from making mistakes either, but if we take them as seriously as we take our memories, we give ourselves a chance of getting things right more often.