Logical Fallacies when Assessing Risks

by Flavio Gerbino
time to read: 17 minutes

Scientific research in the fields of behavioural economics and psychology has mercilessly uncovered our everyday fallacies in thinking. It seems as if this science is currently completely revising the way we go about solving problems.

The subject is a prominent one. In bookshops, many a shelf is dedicated to countless popular-science books that deal with the art of solving problems, and many of them frequently find their way to the top of the bestseller lists. It therefore seems reasonable to ask whether and how we can apply the knowledge gained from research into logical fallacies to IT and IT Security in order to handle risks adequately. What useful information can we gain from studying human behaviour so that we can make better, more informed, more objective decisions in the world of IT in general and in the assessment of risks in particular?

The category of cognitive biases is particularly interesting. There are a number of interesting but controversial works that document the fallacies of our thinking systematically and comprehensively (sources below). I have read some of them and extracted the logical fallacies that I thought would help in assessing situations and risks in IT Security: cases where, in my view, a distortion of our perception has grave consequences or leads to bad, if not outright wrong, decisions.

It seems decisive to have insight into our own thinking and into the errors of judgment we make frequently. Because whenever we have to deal with uncertainty in thinking, and that's always the case in the field of IT, the way in which we perceive a problem is vital.

An example: We dislike losses more than we like wins. But where wins end and losses begin is a matter of individual perception, and that perception can be influenced by many things. This means: it is not the problem per se that is decisive, but our perception of it.

However, this article would be blown way out of proportion if we incorporated all of these insights, treated them systematically and enriched them with real-life examples. But with a bit of imagination, it becomes obvious what cognitive biases can do when we give in to them, something that can and will happen time and time again.

Definition of Cognitive Biases

Cognitive biases can be defined as tendencies to think in certain ways. They can cause systematic deviations from an intended rational norm (common sense) or from a good and balanced judgment. This makes it apparent that our mistakes follow characteristic patterns.

Even though the reality of these biases is confirmed, there is controversy about how they should be classified and explained. Some are effects of our mental shortcuts, also known as heuristics, which the brain uses to make decisions or pass judgment. The effects arising from these heuristics are referred to as cognitive biases.

There is also discussion about whether some of these biases are really irrational or whether they lead to useful behaviour and attitudes.

An example: People tend to ask suggestive questions upon meeting someone. The goal of these questions is to get an answer that fits the preconceived notion the asker has of the person asked. This pattern is known as confirmation bias and designates the tendency to process information selectively: unconsciously, we block out information that contradicts our own expectations.

On the other hand, this cognitive bias can serve as an example of social competence: as an empathic tool to establish a connection, or the basis for one, with another person.

But this is just a side note to illustrate the point that certain biases can be something other than a disadvantage, and to show that intuitive thinking and acting is important.

Overview: Biases

Depending on the literature, there are between 100 and 200 documented cognitive biases. It's all a matter of classification: grouping, considering the independence of biases or clustering by subject. For example, you could group them into classes: biases of decision-making, biases of behaviour, social biases, memory biases et cetera.

Therefore, the following list is not a systematic listing but a small, subjective selection with the sole intent of giving some food for thought: maybe we can reduce our own biases a bit by taking this list into account. Because if we are aware of our own biases, we might become a bit more self-critical and a bit more sceptical. This knowledge of ourselves is essential: we have to learn to accept that we could have been wrong. This forms a basis for efficient and critical thinking.

Making mistakes is important. Errors are a necessary transitional state on the way to understanding. The lesson is, among other things, that it's necessary to analyse your own mistakes and draw conclusions from them in order to restructure and reorganize your own ways of thinking and acting.

Ambiguity Bias
Tendency to avoid options that lead to an unpredictable outcome, perhaps because of insufficient data.

This causes people to prioritize an option that has a known chance of success over an option whose outcome is unclear.
By the same logic, we also rate the likelihood of risks we find memorable much higher than that of risks we consider less memorable.

Risks should therefore always be explained using real-life examples and vivid illustrations, not just abstract terminology.
Anchoring or Focalism
Tendency to focus or anchor our judgment on a certain piece of information when we have to decide something. Usually, this is the first information we receive on the issue at hand, even though it doesn't necessarily have to be related to the issue itself. The anchor can be derived from circumstances, given explicitly, or simply available by chance. This information becomes crucial when making a decision, regardless of whether it is relevant or useful for making a rational decision.

This applies particularly to numerical values. Whenever you have to present something to management, start off small: the low numbers will act as anchors, making the bigger risks seem less dramatic in the eyes of management because the smaller numbers are still anchored in their minds. Conversely, if you want to raise awareness for a certain risk, the information relevant to the risk should be presented right after the numerical values associated with it.
Attentional Bias
Tendency of our thinking to be influenced by recurring thoughts and interests.

These interests influence which information becomes the focal point of a person's attention. For example: if you're interested in hacker attacks, you will subconsciously give more weight to information that supports your stance on hacker attacks.
When assessing risks, this means there is a high likelihood that we neglect risks we are not interested in and think about less.

Risks we are very interested in, however, such as spectacular hacking cases, will be ranked way above where they should be.
Availability Heuristic
Tendency to overestimate events that are more readily available in our memory, depending on how recent, extraordinary or emotionally charged the events are.

The availability heuristic kicks in subconsciously whenever the likelihood of an event needs to be assessed while the time, the means or the will to rely on precise data is lacking. In these cases, the verdict is influenced by how present the event, or similar ones, is in our minds. Events that are easily remembered seem more likely to recur than events that we can hardly recall.
That’s why the likelihood of being hacked or falling victim to a virus is perceived to be higher right after having read or seen news reports on the topic.
Availability Cascade
A self-reinforcing process in which a collectively held opinion gains more and more plausibility as it is repeated in public. Repeat something often enough and it will be accepted as the truth. Statements that are established in an environment are given more weight and credibility than those heard for the first time. New risks therefore have to be well established before they carry the same weight as previously established ones.

On the other hand, it's worth paying attention to the importance given to a risk that keeps coming up in discussion, whether it is ultimately unimportant or critical.
Bias Blind Spot
The tendency to see oneself as more objective than others or to be less affected by cognitive biases than others.

We readily recognize the effects of biases in other people's judgments while being unable to see the same biases in our own.

As a bit of solace: others also know much less than they think they do.
Because we're all subject to this bias, it seems reasonable, when assessing risks, to get input from more than one independent source.

In a second step, the assessments and their severity ratings are compared and reconciled.
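A minimal sketch of this two-step idea. The assessors, ratings, scale and threshold below are hypothetical; the point is only that independent inputs are averaged and large disagreement is flagged for discussion:

```python
from statistics import mean, stdev

# Hypothetical severity ratings (scale 1-5) for the same risk from
# independent assessors; names and numbers are made up for illustration.
ratings = {"internal audit": 4, "external pentester": 2, "risk officer": 3}

def consolidate(ratings, spread_threshold=1.0):
    """Average independent severity ratings and flag large disagreement."""
    values = list(ratings.values())
    avg, spread = mean(values), stdev(values)
    # A spread above the threshold signals that at least one assessment
    # may be biased and the differences should be discussed before synching.
    return avg, spread, spread > spread_threshold

avg, spread, needs_discussion = consolidate(ratings)
print(avg, spread, needs_discussion)
```

The spread check is the interesting part: it surfaces exactly the divergences that the bias blind spot would otherwise let each assessor explain away.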
Conservatism (Regressive Bias)
A pronounced tendency to overestimate high probabilities and underestimate low ones. When assessing risks, bigger risks get overestimated at the potential cost of neglecting the smaller ones.
Contrast Effect
Amplification or reduction of certain stimuli when they appear in the context of a contrasting, recently observed object or piece of information. A risk seems bigger when directly compared to a smaller risk, and smaller when compared to a bigger one.

The attractiveness of a possible course of action in a scenario can be elevated by pitting it against a similar but worse alternative or lowered by pitting it against a similar but better alternative.
Framing Effect
Tendency to draw different conclusions from the same base data, depending on how the information was presented and whom it was presented by.

This means that the presentation of information can influence the behaviour of the recipient of the information.

The French saying C'est le ton qui fait la musique (translated: it's the tone that makes the music) sums this up nicely.
Applied to risks, this means it becomes important to treat risks equally and work with them according to the same schema in order to assess them objectively and compare them fairly.
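One way to enforce such a uniform schema is a fixed scoring rule. The sketch below is a minimal, hypothetical example; the scales, labels and the multiplicative score are assumptions for illustration, not a standard:

```python
# A minimal, hypothetical scoring schema: every risk is rated on the same
# likelihood and impact scales, so the way a risk is presented (the framing)
# cannot tilt the comparison.
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3, "almost certain": 4}
IMPACT = {"low": 1, "moderate": 2, "high": 3, "critical": 4}

def risk_score(likelihood, impact):
    """Score = likelihood rating times impact rating, identical for everyone."""
    return LIKELIHOOD[likelihood] * IMPACT[impact]

# The same underlying risk gets the same score no matter who presents it:
print(risk_score("likely", "high"))  # 9
```

Whatever schema is chosen, the design point is that the mapping from risk to score is agreed on before any individual risk is discussed.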
Gambler’s Fallacy
The tendency to believe that future odds are influenced by past occurrences while, in reality, they’re completely independent of one another.

This is a logical and mathematical error that follows the perceived but erroneous logic that an event is more likely to occur when it hasn’t occurred for a longer period of time. Conversely, the same erroneous assumption claims that something is unlikelier to happen if it occurred recently.
This error in thinking is also wide-spread when it comes to the assessment of risks and their likelihood of happening.

Avoiding this tendency is possible by constantly reminding yourself of the sentence: chance has no memory.
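That chance has no memory can be checked with a short simulation. The sketch below assumes a fair coin and a made-up trial count, and estimates the probability of heads on the flip that follows a run of five tails:

```python
import random

def next_flip_after_streak(streak_len, trials=100_000):
    """Estimate P(heads) on the flip that follows `streak_len` tails in a row."""
    random.seed(42)  # fixed seed so the estimate is reproducible
    observed = heads = 0
    tails_run = 0
    for _ in range(trials):
        flip_is_heads = random.random() < 0.5
        if tails_run >= streak_len:  # this flip follows a long tails streak
            observed += 1
            heads += flip_is_heads
        tails_run = 0 if flip_is_heads else tails_run + 1
    return heads / observed

print(round(next_flip_after_streak(5), 2))  # stays close to 0.5
```

The estimate stays near 0.5 however long the preceding streak is, which is exactly what the gambler's fallacy denies.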
Hindsight Bias, aka the I knew it all along effect
The I knew it all along effect describes the tendency to claim that a recently passed event could have been predicted.

After the event has passed, we remember our earlier predictions wrongly. We distort our initial assessments to match the outcome.

In hindsight, we overestimate the possibility that the event could have been predicted.

The explanation for this is that the knowledge of the actual event influences our way of thinking about the connected information and the entire cognitive coordinate grid will be shifted to match the event.
A CSO observes the threat landscape on his radar with great care. All agreed-upon controls and mitigation measures are implemented and, formally, nothing has been missed. Nevertheless, during a security incident it becomes apparent that personal data has been massively compromised. Immediately, the question arises of how this could have happened despite all the security measures in place.

This is the point where not only laymen but also professionals are affected by the Hindsight Bias: they review information while already knowing the outcome and being influenced by it. In this way they overestimate the predictability of the event.

The Hindsight Bias is especially critical when it comes to passing blame and responsibility. It is vital to know that knowledge of facts cannot negate the influence of the Hindsight Bias.

Legally, there is the ex ante viewpoint: an assessment based on the earlier state of knowledge. Using this technique, later developments that were not known at the time the events unfolded are ignored.
Hostile Media Effect
The phenomenon that the followers of a certain position or a certain belief tend to call the media coverage concerning the topic biased, unscientific and unfair.

Recipients who see the media coverage as one-sided are also under the impression that the media is biased in a way that harms their position. In this way, they feel disadvantaged by the media coverage.
This effect is familiar to everyone. No matter how factually accurate and balanced an article is, proponents of the opposing opinion will call it proof of an inherent media bias against their position.

The same phenomenon can be observed in companies when it comes to audit reports, test reports or risk reports to just name a few.

Interestingly enough, the influence of the Hostile Media Effect can be observed on both sides of an argument. The deciding factor is not the factual veracity of the article in question but the depth of the chasm between the positions held. The deeper this chasm is and the more people see themselves as representatives of a position, the more vehement their denial of facts becomes.
Illusion of Control
The tendency to overestimate our own influence on external factors: the belief that we are able to control certain things that are demonstrably beyond our influence. Strongly held illusions of control can increase the drive to get something done, but they rarely lead to error-free decisions.

In the worst case, strongly held illusions of control lead to important feedback being ignored, which completely annuls any learning experience. They also lead to a greater objective willingness to take risks, because the subjective ability to properly assess risks declines.
Information Bias / Illusion of Validity
Conviction that additionally gained information always adds value to previously gathered data when making a prediction, even if the newly gained information is plainly irrelevant. Even an avalanche of new information won't lead to better decisions.

What you don’t have to know in order to make a good decision will remain worthless, even if it’s known.

On the contrary: too much information can lead to confusion and wrong decisions due to disinformation.
Neglect of Probability
Tendency to completely disregard probabilities when a decision concerning an unknown situation has to be made.

Neglect of probability usually appears in emotionally charged situations.
It is the simplest way to regularly act against the normal rules of decision-making: small risks are either completely ignored or given far too much weight, and the entire spectrum between the two extremes is not taken into account.

The fear of the consequences of a threat scenario leads not only to an overestimation of the threat becoming reality but also to an overestimation of the usefulness of preventive and protective measures.
Pseudocertainty Effect
The pseudocertainty effect causes the tendency to make risk-averse decisions if the expected outcome of a situation is positive, but risk-seeking decisions when negative outcomes are to be avoided.

When a decision has to be made under uncertainty, the difference between two possibilities is valued much more highly if one of them achieves absolute certainty.
In practice, this means that an increase from 99 to 100 percent is valued far more than an increase from 50 to 51 percent. While both are quantitatively identical, the former creates an absolute outcome.

This means that to lower a risk from 5 to 0 percent (the risk is completely gone), people will invest much more than to lower it from 10 to 5 percent.
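Put into numbers, with a made-up impact figure, both reductions remove exactly the same expected loss:

```python
def expected_loss(probability_pct, impact):
    """Expected loss = probability of occurrence (in percent) times damage."""
    return probability_pct / 100 * impact

IMPACT = 1_000_000  # hypothetical damage per incident; the figure is made up

# Reducing the probability from 5% to 0% and from 10% to 5% removes
# the same expected loss, yet only the first feels like absolute safety.
saved_going_to_zero = expected_loss(5, IMPACT) - expected_loss(0, IMPACT)
saved_halving = expected_loss(10, IMPACT) - expected_loss(5, IMPACT)

print(saved_going_to_zero, saved_halving)
```

Quantitatively the two options are interchangeable; the pseudocertainty effect is the gap between that arithmetic and how the options feel.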
Risk Compensation Effect
Tendency of people to become more reckless after the implementation of security measures (be they technical or legal). This is a paradoxical phenomenon: security measures meant to mitigate risks can be partially or completely useless, or even harmful to an environment, because they create a feeling of security. Users in the environment feel more secure, act more recklessly than before and are exposed to the reckless behaviour of others. This is because we then perceive the likelihood of a threat coming to pass as less plausible or less grave.
Zero-Risk Bias
The Zero-Risk Bias describes our preference for reducing a small risk to zero over tackling the reduction of a bigger risk, even when the latter would lead to a greater overall reduction of risk.

This goes hand in hand with the fact that reducing a risk reassures us less the bigger or more emotionally charged the danger is.
The Zero-Risk Bias is not about the wishful thinking that risks can be eliminated completely. We all know only too well that this won't work. And yet we're still attracted to zero risk, aren't we?

This is about not fixating on small things until they're completely done away with (because they will keep bugging us until then), but instead keeping the bigger picture and its overall improvement in mind.

It makes little sense to put a lot of effort into eliminating a tiny bit of residual risk completely. In almost every situation, however, it's worth making the same effort to reduce a much larger risk.
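A quick expected-loss comparison makes the point; the probabilities, impact figures and the equal-cost assumption below are all hypothetical:

```python
def expected_loss(probability, impact):
    """Expected loss = probability of occurrence times damage."""
    return probability * impact

# Two mitigation options with the same hypothetical cost:
# option A eliminates a small risk entirely, option B halves a large risk.
small_risk_reduction = expected_loss(0.02, 100_000)      # small risk drops to zero
large_risk_reduction = expected_loss(0.20, 500_000) / 2  # large risk is merely halved

print(small_risk_reduction, large_risk_reduction)  # B removes far more expected loss
```

The zero-risk option feels better because it ends the discussion, but the half-measure on the large risk does far more for the overall risk position.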


Often, we fall for the illusion that we understand complex current events, or we succumb to a skewed retrospective on past events. The overrating of information combined with the overrating of expertise leads to an illusion of competence.

This illusion is not just an individual error in judgment; it is deeply rooted in IT. Facts that challenge basic assumptions, and thereby seemingly challenge people's success and self-respect, are simply ignored.

But even self-reflection can be very successful. Challenging your own ways of thinking, without any outside interference or guidance, can improve your ability to think critically.

Because sometimes, we can be oblivious to the obvious. But what’s even worse than that is if we’re oblivious to being oblivious.

Trust in our intuitions and our personal preferences is usually justified. But not always: we're often convinced of being right when we're wrong. So don't let yourself be fooled by flawed intuitive thinking.


About the Author

Flavio Gerbino

Flavio Gerbino has been in information security since the late 1990s. His main areas of expertise in cybersecurity are the organizational and conceptual security of a company.
