Stop the B.S. (Biases in Science)

“Pseudoscience is attractive because it promises certainty, whereas science gives us probability and doubt.”

- Dr. Carol Tavris

As the new wave of cases washes over us in this viral pandemic, irrational behavior continues to set up the perfect storm for the progressive decimation of our poorly developed healthcare system. Like a frog slowly boiling alive in a pot of water, we are once again waiting for the pandemic to peak before we realize the perils of playing musical chairs with death.

When I decided to write this post, I searched for the reasons that got us to this point. And even though there are plenty of crystal-clear causes, such as new variants, institutional inefficacy, and the simple fact that the virus has always been highly transmissible, the reality is that many social, economic, cultural, biological, and political factors all contribute to this scenario.

However, I’ve seen one major pattern in Peru: the recurrent dismissal of science as a legitimate means of understanding the world.

In the following paragraphs, I’m going to go over some common biases that may explain why, as we limp our way into 2021, we remain complacent and science-denying in the face of the most significant public health challenge of our lifetime. I’m going to try to apply each of them to Peruvian society as I have experienced it. I’m not a cognitive neuroscientist, so feel free to take my analyses with a grain of salt, and if I’m wrong, make sure to correct me in the comments.

So which biases are (probably) preventing us from embracing science in Peru at a time when not doing so is like playing Russian roulette with a fully loaded gun?

Confirmation bias

This is perhaps the most dangerous bias I’ve encountered so far. I see it everywhere, from politicians to patients, and even in myself: the feeling that you already know something and don’t need to seek out any other information. It is incredibly powerful, and incredibly dangerous, in science and in life in general.

The internet and the sheer volume of modern science have made this an almost insurmountable challenge, as everyone can simply google whatever information they wish to confirm. The Peruvian government’s irrational aversion to scientific evidence is another excellent example. As the scientific consensus confirmed that ivermectin and hydroxychloroquine were not effective at treating COVID-19 and could, in fact, be harmful, the Peruvian Ministry of Health decided to purchase both drugs and create domestic supply chains to produce them.

To this day, we have congressmen pushing for the use of ivermectin, an antiparasitic medication, against a virus. The problem with this bias is that the need to feel you have the right answer overcomes any rational strategy to find the correct answer. Congress probably knows this (or its members are themselves entrenched in confirmation bias).

Coupled with the monetary incentive of producing the drugs, they engaged in a zero-sum game against their own people: they passed unscientific public health policies, created a supply chain for placebo worm-killing drugs, and exploited our need to feel we had the right answer.

Possible solution: Acknowledging that this bias exists, and questioning yourself whenever an answer feels good simply because it aligns with your current beliefs, would probably work at the individual level. Speaking for myself, though, I feel the best solution would be to create a culture that values intellectual honesty and respect for knowledge. This is incredibly difficult in a society so focused on ideological narratives that are now basically useless in the face of a real existential problem. We need people to trust science, but how can they trust science if they have been conditioned to distrust it?

Status quo bias

Status quo bias is the tendency to think that the current situation is better than any alternative. It is a barrier to change because we avoid change unless we have a powerful reason to make it. Its main driver is loss aversion: we are much more sensitive to losses than to gains. From an evolutionary perspective, this makes sense; our primitive ancestors needed to guard what they had, such as food or shelter, and were much less likely to take risks to gain something they didn’t have.
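To put a rough number on that asymmetry, here is a minimal Python sketch of Tversky and Kahneman’s prospect-theory value function; the parameters are their published 1992 median estimates, and the amounts are made up purely for illustration:

```python
# A minimal sketch of the prospect-theory value function
# (Tversky & Kahneman, 1992). The parameters alpha and loss_aversion
# are their median estimates; the amounts below are made up purely
# for illustration.

def subjective_value(x, alpha=0.88, loss_aversion=2.25):
    """Perceived value of gaining (x > 0) or losing (x < 0) an amount."""
    if x >= 0:
        return x ** alpha
    return -loss_aversion * ((-x) ** alpha)

print(round(subjective_value(100), 1))   # 57.5   -> how good +100 feels
print(round(subjective_value(-100), 1))  # -129.5 -> how bad -100 feels
```

Under these estimates, losing 100 hurts more than twice as much as gaining 100 feels good, which is why “don’t lose what we have” so easily beats “imagine what we could gain.”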

The only time we are likely to change is when we are unhappy with the current situation; it is much harder to get people to move from a comfortable position, even if that position could be improved. If people are not unhappy, status quo bias drives them to avoid change at all costs. When it comes to science, many people feel they are “better off” without it. There is no perceived loss in ignoring it, so why should they embrace it?

Possible solution: Maybe, as a general rule, we should be as concerned about missing an opportunity as we are about missing a threat. If we want the best possible outcome for all, we should be open to change even when we currently enjoy a very good situation. This seems like low-hanging fruit anyone could grab. However, in my experience, it is a hard sell in a conservative society that fears change and holds the ingrained belief that a great deal of scientific thinking is linked to liberalism, atheism, or communism. How do you convince people that ignoring scientific advances is itself a loss, even if there is no current threat?

Association bias

In Peru, we don’t tend to associate science with our cultural identity. This is despite the fact that, in specific fields, Peru has some of the most impressive scientific institutions in Latin America, as well as multiple brilliant, award-winning scientists.

It’s almost as if people are thinking, “Why should I accept science when my country is so poor?” or “We don’t have good scientists, so why bother?” If they were to visit any of our major Peruvian scientific institutions, I’m sure their opinions would change. Furthermore, people tend to forget that the reason some countries are considered “developed” is that they invested in science and technology to get there in the first place.

Possible solution: Maybe we should try to show that science is actually very “Peruvian” and “Andean.” It’s part of our history and was even a primary driver of our “pre-Inca” cultures. The Incas were masters of irrigation: they used their aqueducts to grow all sorts of food, which allowed them to sustain a significant population density, and empiricism was clearly an essential part of that development. The same can be said of the Nazca lines, arguably one of the most impressive early examples of art and engineering in South America. The point is that science is actually very Peruvian; we just don’t like to associate it with our identity.

Optimism bias

By optimism I don’t mean the emotional sense of hope. Optimism bias is when people assume that science and progress will take care of themselves without the need for action, a sort of cynical optimism. We like to think that if we are not actively trying to solve a problem, it’s not our fault that it hasn’t been solved yet. It could be that a large portion of the population in Peru is more concerned with their next meal, or with their kids, than with science or the world’s problems. It’s not that they don’t appreciate it; it’s that they don’t feel empowered to do anything about it. There is a great deal of resignation that, no matter what they do, things will not change.

The fact that there is no good science education in schools doesn’t help either.

Possible solution: I’m not sure this is a problem scientists can solve alone, but it’s worth pointing out. Most people think of optimism as a form of positive thinking, but this cynical optimism is rooted in a lack of information and a feeling of powerlessness. Why would you worry about the economy or science if there is nothing you can do about it? If it’s not your fault, why should you worry? The answer is that the things we don’t do have as much impact on the world as the things we do.

Representativeness bias

In Peru, when someone says “scientist,” people immediately picture “geeks” or “nerds” who sit all day in front of computer screens plotting their next evil experiment for personal benefit. We tend to believe that things are representative of what they “appear” to be, and this takes on a whole new meaning when people start assuming that valid conclusions can be drawn from a stereotype.

Many people think of science as some kind of mystical cult or religion in which scientists believe they are superior to everyone else and can’t be wrong. In fact, science is about disproving things rather than proving them, and is therefore highly self-critical. Science is a self-correcting process; people who don’t know this may assume that anyone calling themselves a scientist must be infallible, which fuels distrust towards science once they realize that scientists are not.

Possible solution: We have to dispel this stereotype and make people aware that science is a method, not a person. It’s an approach to understanding how the world works, a way to accurately map the terrain.

Planning fallacy

I’m just going to quote Daniel Kahneman, winner of the 2002 Nobel Memorial Prize in Economic Sciences, who (with Amos Tversky) coined the term: “The planning fallacy is that you make a plan, which is usually a best-case scenario. Then you assume that the outcome will follow your plan, even when you should know better.” In other words, the best-case scenario is that everything goes as planned, but the actual outcome is often very different from what we originally planned.

We tend to think that if we follow a plan, the results will match our original predictions. So when the plan fails, it feels entirely unexpected; when it succeeds, we give ourselves credit for a job well done. But this isn’t how the world works. Most plans don’t work out because we don’t have all the future variables needed to make an accurate probabilistic prediction.
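To see how quickly a best-case plan falls apart, here is a toy Monte Carlo sketch in Python; the five steps, the durations, and the snag probability are all invented numbers, chosen only to illustrate the mechanism:

```python
import random

# A toy Monte Carlo sketch of the planning fallacy. All numbers are
# invented for illustration: a five-step plan where each step takes
# 2 days in the best case but has a 30% chance of a snag that doubles it.

def simulate(steps=5, base_days=2, snag_prob=0.3, trials=100_000):
    best_case = steps * base_days
    durations = []
    for _ in range(trials):
        total = sum(base_days * (2 if random.random() < snag_prob else 1)
                    for _ in range(steps))
        durations.append(total)
    on_plan = sum(d <= best_case for d in durations) / trials
    print(f"Best-case plan:  {best_case} days")
    print(f"Average actual:  {sum(durations) / trials:.1f} days")
    print(f"Plan holds:      {on_plan:.0%}")   # about 0.7**5 = 17%

simulate()
```

Even with a modest 30% chance of trouble per step, the full plan holds only about 17% of the time (0.7 to the fifth power), and the average project runs 30% over the best case. The best-case scenario isn’t the expected outcome; it’s the lucky one.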

As Peruvians wait for vaccines to arrive after botched negotiations, we have to prepare for the worst-case scenario: assume the vaccines won’t arrive and plan accordingly. If they do arrive, that’s the best-case scenario. When we don’t actively embrace science, we plan on the best-case scenario, forgetting it is often a false positive, while dismissing the worst-case scenario as unlikely, until it happens. There is a general sense of surprise and victimization whenever we are confronted with the harsh reality that science predicted all along, as if we had forgotten that probability doesn’t care about our hopes and wishes.

Possible solution: Although I advocate rational thinking, I understand that we are not programmed to be rationally pessimistic. Actually planning for your worst outcome is hard, but I believe that if people did it, they could avoid the devastating effects of the planning fallacy. If people planned for the worst possible outcome as if it were 100% certain to happen, perhaps they would be more careful in their decisions. One exercise: think of the worst possible outcome that could affect your life. Imagine it’s happening right now; what do you do? How do you get out of it? If the entire population did this, it’s possible we would have been much better prepared for the pandemic and avoided our healthcare system’s ongoing decimation.

Sunk cost fallacy

The sunk cost fallacy is when we keep doing something because we have already invested in it. We might have put time, money, or effort into something that isn’t going to work out, but we keep going because we are afraid of losing the investment we’ve already made. It is tough to let go of an investment, especially if you have sunk a lot of time and effort into it. Even when a project is going nowhere and costs more than it returns, we find it hard to walk away and invest the remaining resources somewhere else.

One crucial component of this fallacy, in my opinion, is the human ego. When our ego is tied to the investment, further losses to it are not easily accepted; the need to prove that we are right plays a significant role in this fallacy.

Science has been the single most important tool for progress in the last century. Yet the sunk cost fallacy is so strong that we keep “investing” in worldviews that science disproved long ago. Examples in Peru are countless, from pseudoscientific beliefs to conspiracy theories to the near-fanatical following of charismatic spiritual leaders. Many of these beliefs, even without any scientific backing, are accepted because of this fallacy: people simply don’t want to let go of the investment they have made in them. They want to be right.

Possible solution: Sometimes it is necessary to learn how to jump from a sinking ship, and the sunk cost fallacy often stops us from doing so. It is difficult to move on and invest time and effort into something else when we have already put so much into something that didn’t work out. In a way, the sunk cost fallacy turns us into emotional gamblers. It doesn’t matter what the game is, be it investing, education, science, or relationships; the more we invest, the harder it is to accept failure and walk away, even when that is the more rational decision.

Many more cognitive biases have been described in different fields, including economics, social sciences, evolutionary biology, law, philosophy, and psychology.

Why would I go out of my way to try and explain our collective irrationality?

It is very easy to blame and criticize people for not doing enough to prevent the current crisis, but that would be somewhat hypocritical on my part; I can’t really call other people ignorant when I know I have fallen for my fair share of cognitive biases. One might argue that it is not the place of a young medical researcher to worry about such things, but I feel it is crucial to understand how irrationality plays a major role in our societal aversion to change.

Till next time!

Physician and researcher currently focusing on infectious diseases.