Dangerous Cognitive Biases

Most people tremendously enjoy having their thought patterns, routines, and beliefs confirmed. We also tend to select and interpret information in such a way that it corresponds to and reinforces our own expectations. Some insist on their opinion even when new information has long since proven them wrong. And with all of this, we still believe that we are right and unaffected.

The cognitive biases that we can fall prey to in this context are numerous, with around 30 documented. Cognitive bias is a collective term in cognitive psychology for faulty tendencies in perceiving, remembering, thinking, and judging. We are usually not aware of them and unfortunately often fall for them.

Time and again our brain also fools us into believing something; for instance, it fills memory gaps with apparently suitable material. This shows, for example, when two people have experienced the same thing but tell us very different stories about it. If we repeat an incident mentally or verbally often enough, we end up believing in our own lies.

Everyone who deals with people should know the explanatory approaches underlying these cognitive biases, because they can play a significant role in decision-making and implementation. Anne M. Schüller and Alex T. Steffen provide a summarized collection in their book “Die Orbit-Organisation”, including:


  • The self-herding effect: people like to repeat activities in which they were once successful; such behavior is known as “self-herding”. As with herd behavior, we unreflectively follow the “herd” of our own past decisions, which causes us to fall in love with our own ideas. If these are often crowned with success, a dangerous belief in one's own greatness can result. “His success went to his head,” as the saying goes. Freed from self-doubt, this can lead to fantasies of omnipotence, a loss of reality, and the illusion of invincibility.

  • The social proof effect: this is a psychological phenomenon in which people orient their behavior towards that of their fellow human beings. We adopt or imitate their actions on the assumption that they are appropriate in the given situation. Because everyone does it that way, it must be right, we imagine. The opinion of the masses can serve as a reference, as can that of a single authority. For example, if the boss is for or against something, suddenly everyone is for or against it – sometimes consciously, but often unconsciously. “Executive isolation” is a dangerous consequence: upper management only gets to hear what it wants to hear.

  • The possession effect (also known as the endowment effect): this effect says that people tend to regard something as more valuable as soon as they own it. So, after a decision, there can initially be a kind of buyer's regret (“Wouldn't the other one have been better?”). We resolve this uncomfortable tension, known as cognitive dissonance, by enhancing the decision we made and glossing it over. This also applies to rules: we tend to regard rules that we obey as more useful than they actually are in everyday business life, because admitting to following a nonsensical rule would trigger cognitive dissonance. The same applies to processes, structures, and methods: those who developed or introduced them often overvalue their usefulness and hold on to them more tightly – and this is exactly what then gets in the way of necessary innovations.

  • Loss aversion: this is the tendency to weigh possible losses more heavily than possible gains. This behavioral peculiarity has been demonstrated above all in various competitive experiments and on the stock exchange. However, it is not limited to monetary situations but is more universal. In organizations, for example, it becomes noticeable when the question arises whether an old method or an established product should be replaced by something new. The potential downsides of abolishing the old tend to be overrated compared to the potential benefits of the new. As a result, we prefer to leave many things as they are, even if that sets us back.

  • The omission effect: it describes the human tendency to overestimate the risks of action and to underestimate the risks of inaction. So, better not to abolish the apparently superfluous process after all, because who knows, maybe it is still good for something. This effect probably arises because, in communities, undesirable actions are usually sanctioned more strongly than omissions, even when the consequences of both options are the same.

  • The status quo bias: discarding a concept, a process, or a ritual always means change, but most people prefer everything to stay as it is. One reason is that every change requires the brain to adapt, and the brain is programmed to save energy. This excessive preference for the current state is known as status quo bias. The power of habit has a grip on us in many situations and is omnipresent in day-to-day business.

  • The overconfidence effect: here we tend to overestimate our own abilities or performance – and underestimate those of others. Almost everyone believes they are a good driver and that other drivers are less to be trusted. Closely related is the Dunning-Kruger effect, which essentially concerns the self-image of incompetent people who overestimate their own knowledge and skills. This tendency stems from the inability to judge oneself objectively, to recognize superior abilities in others, and to gauge the extent of one's own incompetence.

Once such thought patterns have solidified, they control our entire perception: whatever confirms our convictions we amplify; whatever does not fit we ignore.

By Daniela La Marca