Have you ever wondered why debates are becoming more and more polarized? Why, when two people argue, it is nearly impossible for them to reach an agreement? How it is possible that, despite strong evidence to the contrary, people defend their views so aggressively?
No matter how rational we consider ourselves to be, we humans seem to have a natural tendency to seek out, interpret, favor, and remember information that supports our prior beliefs and values, regardless of whether the facts contradict them.
This natural tendency has a name: myside bias. Below, we take a look at this widespread, and potentially harmful, psychological phenomenon and at the research that has shed some light on how it occurs.
What is myside bias?
It is common that, when we talk to someone about a topic, we tell them what we think and what the “facts” are, laying out all the evidence we have found in all kinds of “reliable” sources. We know this person holds an opinion contrary to ours, and we are convinced that once we present this proof, they will change their mind. But that just does not happen. They are not deaf, nor have they ignored us; it turns out that, because what we told them contradicts what they think, they dismissed our “facts”, concluding that we are the ones who are misinformed.
Myside bias is a psychological phenomenon that gives us a tendency to seek out, interpret, favor, and remember information that supports or confirms our prior beliefs and values, while ignoring or downplaying any evidence that contradicts what we believe. In essence, this bias is an inherent flaw in the way our brains process information, one that leads us to make biased decisions and adopt mistaken views.
Although all human beings are subject to this bias, it is considered potentially dangerous in the sense that it makes us practically blind to any information that, however true it may be, we will regard as false or not very rigorous simply because it runs contrary to what we think. In fact, some theorists of this pattern of thought, such as Keith E. Stanovich, see it as primarily responsible for the idea of post-truth: we only see what we want to see.
Implications of this cognitive bias
Over the past several decades, Stanovich and other cognitive researchers such as Richard F. West and Maggie E. Toplak have studied this bias experimentally. One of its main implications is that humans tend to seek out information that reinforces our opinions, omitting or rejecting any data that, however true and demonstrable, we consider less rigorous. We look for information that reinforces our hypotheses instead of seeking out all the evidence, both confirming and disconfirming.
In fact, this is fairly easy to see in how people behave on virtually any subject they read about. For example, a person who is pro-life, i.e. against abortion, will be more likely to look for information that proves them right, and it is even possible that this will make them even more opposed to abortion. They will rarely look for information explaining why abortion should be a universal right, or whether a fetus a few weeks old can feel anything; and if they do, they will read that content from a very skeptical and superficial perspective.
Interestingly, seeking out information on both sides of a debate, that is, looking for data both favorable and unfavorable to the opinion one has held since the start, seems to be related to personality traits rather than to intelligence. In fact, some research suggests that more self-confident people tend to seek out data that could prove or refute both sides of a debate, while more insecure people look for whatever lends strength to their existing beliefs.
Another clear implication of this bias is that the same information is interpreted differently depending on our core beliefs. If two people receive exactly the same information on a topic, they are likely to end up with different views, totally or partially opposed, because even if the message is the same, their interpretation of it will not be, and each person's reading will be skewed in a personal way.
The death penalty experiment
A good example of this comes from an experiment conducted at Stanford University, in which researchers recruited participants who already held sharply divided opinions on the same subject: being for or against the death penalty. Each participant received descriptions of two studies, one comparing U.S. states with and without the death penalty, and the other comparing murder rates in a state before and after the introduction of the death penalty.
After this description, they were given more detailed information about the two studies and asked to rate how reliable they believed the research methods of each one were. In both groups, both those in favor of the death penalty and those against it reported that their attitudes had shifted somewhat at the start of the study, when given the brief descriptions; but when given more detail, most reverted to their previous beliefs, despite evidence supporting both studies. They were more critical of the sources that contradicted their own opinion.
German cars and American cars
Another study showed that intelligence does not protect us from myside bias. In this case, participants' intelligence was measured before they received information about an issue on which they had to express an opinion: cars that could pose safety problems. Participants, all Americans, were asked whether they would allow German cars with safety concerns onto U.S. streets. The question was also asked in reverse: whether they thought defective American cars should be allowed to drive in Germany.
Participants briefed on German cars with safety concerns said those cars should be banned in the United States because they posed a danger to the country's road safety. By contrast, those briefed on their American counterparts said those cars should be allowed to drive in Germany. In other words, they were more critical of the safety of German cars because they were German and would be driving in their home country, and more lax with American cars because they were American and would be driving abroad. Intelligence did not reduce the likelihood of myside bias.
Memory and myside bias
Even when people try to interpret a fact as neutrally as possible, our memory, biased by our own beliefs, works in favor of remembering whatever supports our point of view; in other words, we have selective memory. Psychologists have theorized that information matching our existing expectations is more easily stored and remembered than information that does not. Put simply, we memorize and recall better what proves us right, and more easily forget what goes against us.
What does this have to do with social media?
Given all of this, it is possible to appreciate the seriousness of the implications of myside bias for how we receive and interpret any kind of information. This bias makes us unable to assess effectively and logically the arguments and evidence presented to us, however solid they may be. We can believe more firmly in something questionable simply because it is “on our side”, and be very critical of something that, although very well demonstrated, we dismiss as neither rigorous nor reliable because it is “against us”.
But of all the implications this entails, one relates directly to social media, and especially to its algorithms. These platforms, by means of cookies and by remembering our search history, cause us to be presented with content related to things we have already seen. For example, if we search for photos of kittens on Instagram, we will start seeing more photos of these animals in the Explore (magnifying glass) section.
What do these algorithms have to do with myside bias? A lot, because on social media we do not only look for images of animals or food, but also for opinions and “facts” that confirm our pre-established views. So if we look up a blog on vegetarianism, the search section will start showing us related content, ranging from politically neutral material such as vegetarian recipes to blog entries, images, and other resources that denounce animal cruelty and criminalize meat-eaters.
Since we are hardly going to look for information contrary to our point of view, it is only a matter of time before our opinions become more radical. As the networks keep showing us content that favors our point of view, we gradually go deeper into the subject and, sticking with the example of vegetarianism, it is likely that we will even end up in vegan circles that advocate more intense action against the meat industry.
On this basis, and particularly as applied to political ideologies, many people believe that these algorithms are killing democracy. The reasoning is that because the algorithm does not present us with all the available views on a topic, but rather with whatever favors our existing opinion, we become less likely to compare options. Because we are never confronted with different “truths” and remain stuck in the comfort of our own point of view thanks to social media, we really are being manipulated.
For this reason, in an attempt to escape the trap of our own minds and the way social media helps us become even more closed in what we think, it never hurts to seek out opinions contrary to our own. Granted, myside bias will make us tend to read them more critically and superficially, but at least the attempt can grant us some ideological freedom. Or, at the very least, clear your search history and do not give social networks the chance to trap you inside your own beliefs.
- Macpherson, R., & Stanovich, K. E. (2007). Cognitive ability, thinking dispositions, and instructional set as predictors of critical thinking. Learning and Individual Differences, 17, 115–127.
- Stanovich, K. E., & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking & Reasoning, 13(3), 225–247.
- Stanovich, K. E., & West, R. F. (2008). On the failure of cognitive ability to predict myside bias and one-sided thinking biases. Thinking & Reasoning, 14(2), 129–167.
- Sternberg, R. J. (2001). Why schools should teach for wisdom: The balance theory of wisdom in educational settings. Educational Psychologist, 36, 227–245.
- Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22(4), 259–264. doi:10.1177/0963721413480174
- Toplak, M. E., & Stanovich, K. E. (2003). Associations between myside bias on an informal reasoning task and amount of postsecondary education. Applied Cognitive Psychology, 17, 851–860.
- Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.