The five main flaws of human reasoning

Human errors

People exhibit certain mental patterns that lead them to draw inaccurate conclusions about probabilities. On almost every complex issue, intelligent people who have been exposed to the same information still reach different, even contradictory, conclusions. It turns out that the human mind copes poorly with high levels of uncertainty: when we try to analyze many, often contradictory, pieces of evidence with differing levels of reliability, our conclusions often prove wildly inaccurate.
Want to see what we mean? Test your intuition in this video.
So why do we fail so miserably? Usually, it’s a combination of the following five flaws:

1. Filtering of evidence - Cherry-picking, biases and interests

Unfortunately, analyses carried out in the real world are rarely built on entirely accurate and unbiased foundations. Studies consistently find that people ignore or underestimate evidence that is inconsistent with their existing beliefs.
Furthermore, even before we add our own biases, the information available to us has already been heavily filtered and distorted. The media’s incentives are partly to blame: very few media organizations have the accurate reporting of facts as their main incentive. Most seek to maximize profit, while others are funded to promote specific agendas or ideologies. High-quality investigative journalism is expensive, and it ultimately does not attract significantly more traffic or revenue than fluff entertainment or superficial PR pieces.
Going one step further, the information available to the media is itself biased. Stories that benefit commercial or political interests arrive as neatly packaged PR pieces, while special efforts are made to conceal damaging information. In security-related matters, governments typically control the release of information, favoring journalists who are deemed friendly and non-threatening over those who are more critical in their reporting. This creates a deterrent effect and harms journalistic integrity.

2. Reliance on gut feelings and conventional wisdom to assess likelihoods

Once we have collected the evidence, and done our best to correct its biases, the next step of an analysis is to evaluate how each piece of evidence affects our conclusions.
Too often, people rely on their own personal experiences, intuitions, and conventional wisdom to evaluate the weight of evidence, and fail to realize how different reality may be. A tragic example is the enormous number of people falsely convicted on the basis of their own confessions. For centuries, courts have assigned excessive weight to confessions given by suspects under police interrogation, on the assumption that people would not ruin their entire lives just to escape pressure. Only recently have we discovered how wrong we were about people’s behavior under stress: statistics on DNA-based exonerations revealed that about 20% of the people exonerated had confessed to a crime they did not commit.
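To get a feel for how much the assumed false-confession rate matters, here is a minimal Bayes’ rule sketch in Python. The numbers are illustrative assumptions only, not statistics from any real case: it compares the weight a confession carries when we assume innocents almost never confess against the weight it carries once we admit that interrogation pressure produces false confessions too.

```python
# Illustrative sketch only: how the evidential weight of a confession
# collapses once we allow a realistic false-confession rate.
# All probabilities below are assumptions chosen for illustration.

def posterior_guilt(prior_guilt, p_confess_if_guilty, p_confess_if_innocent):
    """P(guilty | confession) via Bayes' theorem."""
    num = prior_guilt * p_confess_if_guilty
    den = num + (1 - prior_guilt) * p_confess_if_innocent
    return num / den

prior = 0.5  # assume the other evidence leaves guilt at 50/50

# Conventional wisdom: innocents (almost) never confess.
print(posterior_guilt(prior, 0.5, 0.001))  # ~0.998: the confession looks decisive

# Allowing that interrogation pressure also produces false confessions:
print(posterior_guilt(prior, 0.5, 0.1))    # ~0.833: far weaker evidence
```

Under the first assumption the confession all but settles the case; under the second, it is merely one more piece of evidence.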

3. Prosecutor's fallacy - Ignoring a hypothesis’s initial plausibility

Another common trap of human intuition is failing to take into account the plausibility of an event before considering the context-specific evidence of the case at hand. For instance, take the following example (see an interactive version of this riddle):
A person is selected at random to take an HIV test. The test comes out positive. We know that this type of test is 99% accurate - that is, the probability that the test indicates that a person has HIV when they don’t (or vice versa) is only 1%.
Most people, when asked what the probability is that this person really has HIV, would respond that it is very likely, since the test is 99% accurate. However, they fail to take into account that only about 0.5% of the population actually has HIV. In other words, out of 200 random people, on average only 1 will have HIV, but the test will come out positive for about 2 people who don’t have HIV. Thus, it is about twice as likely that the person we tested does not have HIV, even though the 99%-accurate test came out positive!
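For readers who want to check the arithmetic, here is a short Python sketch of the underlying Bayes’ theorem calculation, using only the prevalence (0.5%) and accuracy (99%) figures given above:

```python
# Bayes' theorem for the HIV riddle above.

prevalence = 0.005          # P(HIV): about 0.5% of the population
sensitivity = 0.99          # P(positive | HIV): the test is 99% accurate
false_positive_rate = 0.01  # P(positive | no HIV): 1% error rate

# Total probability of a positive result, over both groups:
p_positive = prevalence * sensitivity + (1 - prevalence) * false_positive_rate

# Probability of actually having HIV, given the positive result:
p_hiv_given_positive = prevalence * sensitivity / p_positive

print(f"P(HIV | positive test) = {p_hiv_given_positive:.1%}")  # ~33.2%
```

A positive result from a 99%-accurate test thus leaves only about a one-in-three chance of actual infection, exactly the two-to-one ratio described above.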

4. Weak intuition for compound probabilities

Humans have an intuitive understanding of basic probability, but only up to a certain point. When dealing with complex issues involving more than a few pieces of evidence, people lose their intuitive understanding of likelihoods, regardless of their overall rationality or intelligence.
Imagine that you have two boxes in front of you. Box A is filled with balls: ⅔ of the balls are white and ⅓ are black. Box B is also filled with balls, but with the opposite distribution: ⅔ black and ⅓ white.
You pick one of the boxes at random, but you don’t know whether it’s Box A (with mostly white balls) or Box B (with mostly black balls). You draw a ball from the box 50 times (returning it after each draw), recording the color each time. Out of the 50 draws, you got a white ball 30 times.
Does this mean that it’s more likely that you picked Box A with ⅔ white balls, and if so, then by how much? Before reading the answer, think about what your intuition tells you.
It turns out that it’s about a thousand times more likely that you picked Box A, the one with ⅔ white balls, rather than Box B. This is a much higher probability than intuition leads us to expect: most people guess around 70%, and even those who suspect it’s higher don’t come close to the real likelihood (99.9% that it’s Box A). This is because human intuition fails to account for the compounding effect of multiple independent events.
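The calculation itself fits in a few lines of Python. Because the draws are independent, each extra white ball multiplies the odds in favor of Box A by 2, and the net surplus of 10 white balls yields an odds ratio of 2^10 = 1,024, which is where the “thousand times” figure comes from:

```python
# Likelihood ratio for the two-box puzzle. With replacement, each draw is
# independent, so the evidence compounds multiplicatively.

p_white_A, p_white_B = 2/3, 1/3
whites, blacks = 30, 20

likelihood_A = p_white_A**whites * (1 - p_white_A)**blacks
likelihood_B = p_white_B**whites * (1 - p_white_B)**blacks

ratio = likelihood_A / likelihood_B  # = (2/1)^30 * (1/2)^20 = 2^10
print(ratio)                         # ~1024

# With equal priors on the two boxes, the posterior for Box A is:
print(ratio / (1 + ratio))           # 1024/1025 ~ 0.999, i.e. 99.9%
```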

5. Overlooking dependencies

We’ve seen that the brain is not equipped to deal with multiple pieces of evidence. However, the problem is much worse than that: we must also consider all possible interdependencies. A dependency between two events means that the occurrence of one affects the probability that the other occurs.
Sally Clark’s conviction is a poignant example of overlooking probabilistic dependency. Sally Clark was convicted of having killed her first baby (at 11 weeks of age) and her second baby (at 8 weeks of age). An expert testified that the probability that both had died of SIDS (the alternative hypothesis to murder) was about 1 in 72 million, calculated by squaring 1 in 8,500, the probability of a single SIDS death. The expert, and eventually the court, overlooked the dependency between the two events: it is reasonable to assume that a death from SIDS is significantly more likely if a previous child in the same family has already died under similar circumstances, due to shared factors such as genetic predisposition and environment.
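A short Python sketch makes the size of the error concrete. The 1-in-8,500 base rate is the figure cited above; the 1-in-100 conditional rate for a second death in the same family is a purely hypothetical assumption (the true value is debated), chosen only to show how strongly the dependency moves the result:

```python
# The expert's squaring error vs. a dependency-aware calculation.
# The 1-in-100 conditional rate below is a hypothetical illustration only.

p_first_sids = 1 / 8_500  # base rate cited in the case

# Flawed calculation: treat the two deaths as independent and square.
p_both_independent = p_first_sids ** 2
print(f"1 in {1 / p_both_independent:,.0f}")  # 1 in 72,250,000

# With dependency: shared genetic/environmental factors raise the risk
# for a second child to, say, 1 in 100 (assumed for illustration).
p_second_given_first = 1 / 100
p_both_dependent = p_first_sids * p_second_given_first
print(f"1 in {1 / p_both_dependent:,.0f}")    # 1 in 850,000, about 85x more likely
```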
Another flaw related to dependencies is the use of inconsistent explanations for evidence. When we try to support a certain hypothesis and encounter contradictory evidence, we need to find likely explanations for each piece of evidence. Of course, many people simply choose to ignore such information, but even those who are careful not to will frequently neglect the dependencies between the different explanations. So while each piece of evidence has a valid explanation on its own, the overall story is inconsistent and very unlikely to be true. In logic this is known as kettle logic.
An example of this flaw may again be found in many cases of false confessions. When a court reviews a confession, it may be impressed by the details that match the crime, while simultaneously dismissing inconsistent statements as manipulations. In doing so, the court ignores the fact that this implies an unlikely story in which the suspect constantly switched between being repentant and cooperative one moment and cunning and manipulative the next.
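The same point can be made numerically with a deliberately simplified Python sketch (all probabilities here are made up for illustration): each explanation may be plausible on its own, but if the two explanations describe mutually exclusive versions of the suspect, the combined story collapses.

```python
# Kettle-logic sketch with illustrative numbers only. Each explanation is
# plausible in isolation, but the two describe mutually exclusive suspects.

p_repentant = 0.6     # P(the matching details came from a cooperative suspect)
p_manipulative = 0.6  # P(the inconsistencies came from a manipulative suspect)

# Naive scoring multiplies the explanations as if they were independent:
print(p_repentant * p_manipulative)  # 0.36: the combined story looks tenable

# But "cooperative throughout" and "manipulative throughout" exclude each
# other, so the probability that both explanations hold at once is zero.
p_joint = 0.0
print(p_joint)
```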