Making mistakes is one of the most common fears people have, despite the stoicism with which Greek and Roman philosophers treated it (errare humanum est, as Seneca the Younger said). Or rather, we fear the expected consequences of errors, which for the great majority are pre-imagined catastrophes that cause considerable psychological distress and, not infrequently, blockages in decision-making.
What exactly is a mistake?
By "mistake" we mean, in principle, an assessment that turns out to be inadequate or invalid, whether in making a decision or in the actions that follow from it. We know it missed the mark because the prediction we made about the outcome is not fulfilled. Of course, we only categorize it as an error if the mismatch has a negative balance; if instead we gain an unexpected advantage, it immediately becomes a success, dissonance notwithstanding.
There has been a great deal of research on how we deal with errors, across different fields of study, and more or less all of it points in the direction indicated by Haselton and Buss (2000) in their Error Management Theory. Simply put, when we must make a decision on an issue that involves some degree of uncertainty, we can make two types of mistakes.
In Type I errors, or false positives, we predict that an event will happen which ultimately does not; in Type II errors, or false negatives, we bet that an event will not happen when in fact it does. The theory holds that, in deciding, it is not possible to minimize both probabilities at once: reducing one increases the other.
Which is better? It depends on the perceived cost, and therefore on the context. If I am an engineer designing a fire protection system, I will tend to minimize Type II errors, which would be a real disaster; an alarm must lean toward producing false positives, for obvious reasons. In general, though, we tend toward more conservative options when we expect a gain, while in a loss scenario we are more willing to take risks (Johnson et al., 2013).
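The asymmetry between the two error costs can be made concrete with a small numerical sketch. The following Python snippet compares the expected cost of sounding an alarm versus staying silent; all probabilities and cost figures are illustrative assumptions, not empirical values from the theory:

```python
# Error-management sketch: pick the action that minimizes expected cost.
# All numbers below are made-up illustrations.

def cheaper_action(p_fire: float, cost_false_alarm: float, cost_missed_fire: float) -> str:
    """Return the action with the lower expected cost."""
    # Sounding the alarm risks a Type I error (false positive) when there is no fire.
    cost_if_alarm = (1 - p_fire) * cost_false_alarm
    # Staying silent risks a Type II error (false negative) when there is a fire.
    cost_if_silent = p_fire * cost_missed_fire
    return "alarm" if cost_if_alarm < cost_if_silent else "stay silent"

# If a missed fire is 1000x costlier than a false alarm, even a 1% chance
# of fire makes the trigger-happy alarm the rational choice.
print(cheaper_action(p_fire=0.01, cost_false_alarm=1.0, cost_missed_fire=1000.0))    # alarm
print(cheaper_action(p_fire=0.0001, cost_false_alarm=1.0, cost_missed_fire=1000.0))  # stay silent
```

This is why a well-designed alarm "prefers" false positives: with such lopsided costs, a bias toward one error type is the cost-minimizing strategy, not a flaw.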
How do errors occur?
Most decisions are made with what Kahneman and other authors call System 1, the autopilot of our mental processes.
Anyone who has tried to put dirty dishes in the fridge, or searched the house for glasses while wearing them on their head, knows that our automatisms fail. Nevertheless, that margin of error is a price worth paying for the speed, efficiency and adaptability to the environment that this automatic mode offers. The most important decisions are ideally made with the intervention of System 2, in which action is voluntary, deliberate and far more effortful.
In general, when we believe we have made a mistake, it is due to a lack of information at the moment of assessment: either the information was inaccessible (it is very hard to know what the working climate will be like at that brand-new job that seems like such a great opportunity), or we misinterpreted what was available, and here we enter the territory of cognitive biases in decision-making. It is not uncommon to ignore or underestimate data that does not fit our preconceived ideas, or to overestimate rather weak signals.
In fact, beyond the negative consequences an error may have, what worries us most is the emotional cost of that terrible moment when we come back down to earth. Managing the frustration of unmet wants, needs or aspirations is a skill that is learned from an early age, and not everyone learns to handle it properly.
Anger at someone else or at ourselves, sadness over losing what we expected, and the helplessness we sometimes feel make for a hard pill to swallow.
Fear of making mistakes: how can we manage it?
In general, to expose ourselves to error without overly serious psychological consequences, some key points must be kept in mind.
1. Accept that error is pervasive and part of daily life
We make thousands of decisions a day, most of them handled by System 1, which saves us a great deal of tedious work. We will therefore be wrong dozens, if not hundreds, of times. The more accustomed I am to the possibility of error, the less I will suffer when it happens.
2. Learn to estimate the real costs
The cost of an error is not always high, nor is it a tragedy. In fact, among the dozens of mistakes we make every day, most pass unnoticed because they have no consequences. There are even mistakes that protect us from bigger ones, like the "positive illusions" that overestimate our capacity to cope with certain situations and can help us resolve them again and again (McKay & Dennett, 2009).
3. Appreciate our biases
Many of the biased decisions we make are, paradoxically, adaptive; for example, looking both ways before crossing the road, even when no cars are passing, is a behavioral bias whose cost is minimal. The famous negativity bias is adaptive because it promotes survival, although it is not always right. Biases minimize the cost of errors.
The point is, if we notice that a bad outcome keeps repeating itself, there may be a bias that no longer serves us ("beware of everyone", "men only want sex", etc.). A thoughtful assessment of how we decide is important.
4. Manage emotions adequately
We will be angry, frustrated, and may even hyperventilate if we miss a delivery deadline, choose a career we don't like, or enter a relationship with a toxic person. But beware of making that unpleasant feeling last longer than recommended. Negative emotions serve to indicate where there is a problem, no more, no less. Our job, then, is to identify it properly and put solutions in place.
5. Incorporate new information
It is about seeking adaptability in our mental patterns, incorporating new behaviors and adjusting our models once we have located what is interfering with our predictions. Humans frequently change the way we do things, even though in many cases we do not do it consciously.
We do not always seek the maximum benefit, but the best fit. Therefore, we need to examine our errors carefully. To avoid the influence of our own biases, we can always seek help, whether professional or "amateur"; the perspective of another trusted person can be very helpful.
- Johnson, D. D. P., Blumstein, D. T., Fowler, J. H., & Haselton, M. G. (2013). The evolution of error: error management, cognitive constraints, and adaptive decision-making biases. Trends in Ecology & Evolution, 28(8).
- Haselton, M. G., & Buss, D. M. (2000). Error Management Theory: A new perspective on biases in cross-sex mind reading. Journal of Personality and Social Psychology, 78(1), 81-91.
- Psyrdellis, M., & Justel, N. (2017). Psychological constructs related to the frustration response in humans. Anuario de Investigaciones, XXIV, 301-310. Universidad de Buenos Aires.
- Keith, N., & Frese, M. (2005). Self-regulation in error management training: Emotional control and metacognition as mediators of performance effects. Journal of Applied Psychology, 90(4), 677-691.