Bayes' theorem: why absolute certainty is evil

Science popularizer and one of the most successful female poker players, Liv Boeree explains how Bayes' probability formula helped her get rid of hypochondria and improve her logical thinking skills.

I have been a hypochondriac for most of my life.

When I was 13, I read an article about a girl my age who had suddenly gone bald. For the next six months, I obsessively counted every hair left on my comb.

A few years later, as a university freshman, I had a headache for three days in a row and lay weeping in bed, certain I had a brain tumor. (It went away.)

In 2008, my neuroticism reached a dizzying peak. I went wakeboarding on a warm lake during a trip to Las Vegas, and a few days later I woke up feeling unwell. After three hours of googling, I was in a complete panic.

You see, there is an extremely rare but nonetheless terrifying amoeba called Naegleria fowleri, which sometimes turns up in warm freshwater lakes in the southern states; if you get lake water up your nose, the amoeba can travel along the olfactory nerve, multiply and literally eat your brain. And although I understood what “extremely rare” means, the plot was too perfect – a neurotic hypochondriac who constantly feared rare, terrible diseases falls victim to a rare, terrible disease.

Of course, I was wrong again. The only thing eating my brain was my irrational worry, and after several sleepless nights I felt well enough to rejoin the partying in Vegas.

Fast-forward to today, and I'm pleased to say that my hypochondria – and my logical thinking skills in general – have improved significantly. Mostly I owe this to my profession: I started playing poker professionally shortly after the amoeba incident, and over ten years the game has taught me to handle uncertainty far better.

But the strongest antidote to my irrationality came from a surprising source: an 18th-century English clergyman, the Rev. Thomas Bayes. His pioneering work in statistics produced an extremely powerful tool that, used correctly, can radically improve how we perceive the world.

Bayes' theorem
Our modern world, as you know, is unpredictable and complex. Should I buy bitcoin? Should I believe this headline? Is my anxiety real, or has it simply been imposed on me?

Whether it's finances, career or love life, we have to make difficult decisions every day. On top of that, our smartphones bombard us around the clock with an endless stream of news and information. Some of it is reliable, some is just noise, and some is deliberately invented to mislead us. So how do we decide what to believe?

Rev. Bayes made tremendous strides toward solving this age-old problem. He was a statistician, and his work on the nature of probability laid the foundation for what is now known as Bayes' theorem. Although its formal definition looks like a rather intimidating mathematical equation, it essentially boils down to this:

Prior beliefs (prior probability) × new evidence = new beliefs (posterior probability)
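
Written out formally – the article deliberately keeps things informal, but this is the standard statement of the rule – updating a hypothesis H in light of evidence E looks like this:

```latex
P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}
% P(H)      -- prior probability: how plausible the hypothesis H was beforehand
% P(E | H)  -- likelihood: how expected the new evidence E is if H were true
% P(H | E)  -- posterior probability: the updated belief in H
```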


In other words, whenever we receive new evidence, the question is how much it should change what we currently believe to be true. Does the information support our beliefs, undermine them, or leave them untouched?

This approach is known as “Bayesian” thinking, and you have most likely been forming beliefs this way your whole life without realizing it has a formal name.

For example, imagine a colleague comes to you with shocking news: he suspects that your boss is siphoning money out of the company. You have always respected your boss, and if you had been asked to estimate the probability that he is a thief before hearing any gossip (the “prior probability”), you would have considered it extremely unlikely. Meanwhile, this colleague is known to exaggerate and dramatize things, especially where management is concerned. So his word alone carries little evidential weight, and you don't take the allegation too seriously. Statistically speaking, your “posterior probability” barely moves.

Now take the same scenario, but instead of hearsay, your colleague shows you paperwork indicating that company money is flowing into your boss's bank account. Here the weight of evidence is much greater, so your estimate of the probability that the boss is a thief rises sharply. The stronger the evidence, the more your beliefs should shift – and if the evidence is convincing enough, it should make you change your mind about the boss entirely.
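
To put rough numbers on this scenario – and these figures are entirely made up for illustration, not anything from the original article – here is a minimal Python sketch of the update:

```python
# Hypothetical numbers only: the prior and likelihoods below are invented
# to illustrate how weak vs. strong evidence moves the posterior.

def posterior(prior, p_evidence_if_true, p_evidence_if_false):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

prior = 0.01  # you start out thinking theft is very unlikely

# Weak evidence: the gossipy colleague would probably make the accusation
# whether or not it were true, so the two likelihoods are close together.
weak = posterior(prior, p_evidence_if_true=0.9, p_evidence_if_false=0.5)

# Strong evidence: bank records are very unlikely to exist if the boss is innocent.
strong = posterior(prior, p_evidence_if_true=0.9, p_evidence_if_false=0.01)

print(f"after gossip only:  {weak:.3f}")    # ~0.018 - barely moved
print(f"after bank records: {strong:.3f}")  # ~0.476 - a big shift
```

The only thing that changes between the two calls is how probable the evidence would be if the boss were innocent; that single number is what separates gossip from proof.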

If this seems obvious and intuitive, it should. The human brain is, to some extent, a natural Bayesian-reasoning machine, thanks to a process known as predictive processing. The trouble is that almost all of our intuitions evolved for simpler situations, such as surviving on the savannah. The complexity of modern decisions can sometimes cause our Bayesian reasoning to break down, especially when the subject is something we care deeply about.

Traps of Motivated Reasoning
What if, instead of respecting your boss, you resent him because you think he was unfairly promoted into the position you deserved? Objectively speaking, your prior belief that he is stealing funds should be just as unlikely as in the previous example. But because you dislike him for an unrelated reason, you now have extra motivation to believe your colleague's gossip. As a result, your posterior belief may shift dramatically despite the lack of convincing evidence … perhaps to the point where you do or say something rash.

This slide from reasoning correctly to leaning on personal desires or emotions is known as “motivated reasoning,” and it affects all of us, no matter how rational we think we are. I've lost count of how many objectively stupid plays I've made at the poker table because of excessive emotional attachment to a particular outcome – from chasing lost chips and bluffing recklessly after a bad hand to desperate heroics against opponents who got under my skin.


When we identify too strongly with a deeply held belief, idea, or outcome, all sorts of cognitive biases can creep in. Take confirmation bias, for example: our tendency to readily accept any information that confirms our opinion and to discount anything that contradicts it. It is very easy to spot in other people (especially those you disagree with politically), but very hard to detect in yourself, because the bias operates unconsciously. Yet it is always there.

And such Bayesian failures can have very real and tragic consequences: criminal trials in which a juror unconsciously discounts exculpatory evidence and sends an innocent person to prison because of a past negative encounter with someone from the defendant's demographic group; the growing inability to hear arguments from the other end of the political spectrum; conspiracy theorists absorbing whatever fringe belief comes to hand – that the Earth is flat, that movie stars are lizards, that a random pizzeria is the hub of a sex-trafficking ring – all on the strength of comments read on the Internet.

So how do we overcome this deeply rooted part of human nature? How do we apply Bayesian thinking properly?

Extraordinary claims require extraordinary evidence
For motivated reasoning, the solution is obvious: self-awareness.

Although confirmation bias is usually invisible to us, its physiological triggers are easier to spot. Is there someone the mere mention of whom makes you grit your teeth and your blood boil? A social or religious belief so dear to you that you find it absurd even to discuss it?

We all hold some deep conviction that makes us instantly defensive. That doesn't mean the belief is false. But it does mean we are vulnerable to reasoning poorly about it. And if you learn to recognize those emotional signals in yourself, you stand a much better chance of evaluating the other side's evidence and arguments objectively.

For some Bayesian errors, however, the best remedy is accurate information. That is what helped me in my battle with hypochondria. Learning the actual numerical probabilities of the diseases I feared meant I could handle the risks just as I do in poker.

A friend, tired of my neuroticism, estimated the rough odds that someone of my age, sex and medical history would pick up this deadly amoeba after swimming in that particular lake. “Liv, the probability of that is far lower than making a royal flush twice in a row,” he said. “You've played thousands of hands, and that has never happened to you or to anyone you know. Stop worrying about the fucking amoeba.”

If I had wanted to go a step further, I could have applied Bayes' formula to that prior probability and weighed it against the evidential value of my symptoms. To do the math, I would consider the reverse question: how likely are my symptoms in the absence of the amoeba? (Answer: very likely!) Since people get headaches all the time, the symptoms are extremely weak evidence of an amoebic infection, so the posterior probability stays almost unchanged.
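
A minimal numerical sketch of that update, using invented figures (the one-in-ten-million prior and both likelihoods are purely illustrative, not the odds my friend actually quoted):

```python
# Illustrative figures only: the prior and likelihoods are invented to show
# why a very common symptom barely moves a tiny prior.

prior = 1e-7  # hypothetical prior: catching the amoeba is vanishingly rare

p_headache_if_infected = 0.99  # an infection would almost always cause symptoms
p_headache_if_healthy = 0.05   # but ordinary headaches are common anyway

numerator = p_headache_if_infected * prior
posterior = numerator / (numerator + p_headache_if_healthy * (1 - prior))

print(f"posterior probability of infection: {posterior:.1e}")  # ~2.0e-06, still effectively zero
```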


And this is an important lesson. When it comes to statistics, it's easy to fixate on sensational headlines like “thousands of people died from terrorism last year” and forget the other, equally important part of the equation: the number of people who did not die from it last year.

Conspiracy enthusiasts sometimes fall into a similar statistical trap. On the face of it, challenging established beliefs is good scientific practice: it can expose injustice and keep society from repeating systemic mistakes. But for some people, proving the mainstream view wrong becomes an all-consuming mission. That is especially dangerous in the Internet era, when a Google search will always turn up something that matches your beliefs. Bayes' rule teaches that extraordinary claims require extraordinary evidence.

Yet for some people, the less likely the explanation, the more eager they are to believe it. Take those who claim the Earth is flat. They start from the notion that every pilot, astronomer, geologist, physicist and GPS engineer in the world is in on a conspiracy to mislead the public about the shape of the planet. The prior probability of that scenario, weighed against every other conceivable possibility, is vanishingly small. Yet, absurdly, any demonstration to the contrary, however strong, only reinforces their worldview.

Unconditional uncertainty
If there is one thing Bayes lets us be sure of, it is that we can never be absolutely sure of anything. Like a spaceship trying to reach the speed of light, the posterior probability can only approach 100% (or 0%); it can never actually get there.
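
A small sketch of why this is so, assuming the prior starts strictly between 0 and 1 and each piece of evidence has only finite strength (the likelihood ratio of 10 below is an arbitrary illustrative choice): repeated updates push the posterior toward 1 without ever reaching it.

```python
# Repeated Bayesian updates with finitely strong evidence: the posterior
# creeps toward 1 but never actually reaches it.

prior = 0.5
likelihood_ratio = 10.0  # each observation is 10x more likely if the hypothesis is true

odds = prior / (1 - prior)
for i in range(1, 11):
    odds *= likelihood_ratio       # Bayes' rule in odds form: posterior odds = prior odds x LR
    posterior = odds / (1 + odds)
    print(f"after {i:2d} pieces of evidence: {posterior:.12f}")
# The printed values approach 1.0 but, as long as the evidence is finite, never equal it.
```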

When we say or think “I am 100% sure!” – even about something extremely probable, like the Earth being round – it isn't just overconfidence; it is a factual error. We are asserting that no evidence in the world, however strong, could ever change our mind. That is as absurd as saying “I know everything about everything that could ever happen in the Universe,” because there is always something unknown that we cannot imagine, no matter how knowledgeable and wise we are.

That is why science never officially proves anything – it simply gathers evidence for or against existing theories until confidence approaches 0% or 100%. It should serve as a reminder always to leave room for changing our minds in the face of sufficiently strong evidence. And most importantly, we should see our beliefs for what they are: just more prior probabilities drifting in a sea of uncertainty.
