Psychology researchers are proposing a new model to account for the multitude of cognitive biases, most of which have been demonstrated experimentally in the laboratory. All of them, or almost all, could derive from fundamental beliefs combined with information processing congruent with those beliefs.
The existence of cognitive biases is no longer a secret, so commonplace has the use of the term become. Such trivialization, as is often the case, does not spare concepts that enter the social sphere: by talking so much about cognitive biases, we end up seeing them everywhere and thinking only through that lens. Psychological research has in fact described more than 150 cognitive biases. Of course, this count only holds if we reason within a theoretical framework in which cognitive biases exist (ecological rationality, for example, does not treat heuristics as biases), which will be the case for this article.
The concept of cognitive bias is therefore well worn, yet research is still trying to demonstrate new ones. It is in this context that a psychology researcher from the University of Hagen (Germany) and a psychology researcher from the University of Mainz (Germany) offer us a new, more parsimonious way of conceiving of biases: through fundamental beliefs and a way of processing information congruent with these beliefs.
The fundamental beliefs that inhabit us
There are beliefs that have been with us for a very long time and that are part of who we are. In their article, the authors define a belief as "a hypothesis about certain aspects of the world that arises in our mind with a sense of accuracy". They suggest that such hypotheses are found in all human beings. Indeed, psychological research has shown that we are "machines" for generating beliefs: schemas, patterns, stereotypes, generalizations from our environment, and so on.
We even generate beliefs when it is not rationally warranted, because we have no other models available for understanding the world around us and how to behave within it. Religion is a good example of a belief system that provides an effective model for both of these questions. Based on these observations, the two researchers consider beliefs to be ubiquitous and essential to human cognition. According to them, our cognitive biases derive from fundamental beliefs such as "my experience is a reasonable reference for reflection" or "I make correct evaluations of events", and they do so because our way of processing information strives to be congruent with these fundamental beliefs.
A biased way of processing information
Here again, psychological research has clearly shown that the way we process information is biased in favor of our beliefs. We tend to test hypotheses that confirm what we already think, to discredit beliefs opposed to ours, and to protect our own. This tendency appears at every stage of information processing: perception, evaluation, reconstruction and the search for new information. The authors therefore suggest that this way of processing information is inherent to our human condition: "Belief-consistent information processing seems to be a fundamental principle in human information processing that is not only ubiquitous but also a human condition." Consequently, this mode of processing requires no particular motivation: motivated or not, we do not, in this conception, really have a choice. By extension, the authors consider that most biases are derivatives of the famous confirmation bias.
Some examples of biases that would follow this pattern
The authors give several examples to illustrate their model. The spotlight effect, the illusion of transparency and the false consensus effect, for instance, all derive from the fundamental belief "my personal experience is a reasonable reference" coupled with information processing congruent with this belief. In the case of the spotlight effect and the illusion of transparency, we interpret how others perceive us on the basis of our own experience. For the false consensus effect, we infer other people's beliefs from our own. The fundamental belief "I make correct evaluations" would give rise to biases such as the bias blind spot or the hostile media effect. In both cases, we judge others (or the media) against our own evaluations, which we deem correct. We therefore tend to notice the errors of others more readily, or to consider that media coverage of a subject is biased.
In this video, Thibaut Giraud, alias Monsieur Phi, demonstrates the two epistemic virtues discussed below by striving to develop the arguments for a position with which he intuitively disagrees. © Monsieur Phi, YouTube
Intellectual honesty and empathy as solutions
In the last part of their article, the authors point out that most debiasing strategies do not yield very good results. On the other hand, they cite several studies showing the effectiveness of challenging one's own beliefs by honestly considering opposing positions and actively seeking arguments in their favor. In short, the best way to guard against biases may not be to acquire skills for avoiding them, but rather to cultivate epistemic virtues such as intellectual honesty and intellectual empathy, which could instill some doubt into our endlessly biased fundamental beliefs.