Bayes’s Theorem

The following extract is taken from Nate Silver’s book “The Signal and the Noise”.

“Thomas Bayes’s paper, ‘An Essay towards Solving a Problem in the Doctrine of Chances’, was not published until after his death, when it was brought to the Royal Society’s attention in 1763 by a friend of his named Richard Price. It concerned how we formulate probabilistic beliefs about the world when we encounter new data.

Price, in framing Bayes’s essay, gives the example of a person who emerges into the world (perhaps he is Adam, or perhaps he came from Plato’s cave) and sees the sun rise for the first time. At first, he does not know whether this is typical or some sort of freak occurrence. However, each day that he survives and the sun rises again, his confidence increases that it is a permanent feature of nature. Gradually, through this purely statistical form of inference, the probability he assigns to his prediction that the sun will rise again tomorrow approaches (although never exactly reaches) 100 percent.”

To summarise, Bayesian reasoning dictates that we should continually update our views as we are presented with new information.

With this philosophy in mind, Bayes’s Theorem provides an algebraic expression enabling us to update the probability we assign to an event occurring as new information becomes available.

To start, you’ll first need a hypothesis. In Richard Price’s example above, the hypothesis may be that the sun will rise each morning. Nate Silver, however, chooses to illustrate the theorem with an altogether more difficult scenario.

“Suppose you are living with a partner and come home from a business trip to discover a strange pair of underwear in your dresser drawer. […] The hypothesis you are interested in evaluating is the probability that you are being cheated on.”

Nate explains how Bayes’s Theorem requires you to estimate a probability that the event occurred because your hypothesis is correct. Let’s call this probability Y.

“Let’s say that the probability of the panties’ appearing, conditional on his cheating on you, is 50 percent.”

Secondly, you need to estimate the probability of the event occurring conditional on the hypothesis being false. Let’s call this probability Z.

“If he isn’t cheating, are there some innocent explanations for how they got there? […] Collectively you put their probability at 5 percent.”

Thirdly, and perhaps most importantly, you need to define the probability that you would have assigned to the hypothesis before the event occurred. This is called a Prior Probability and we’ll label it X.

“Studies have found, for instance, that about 4 percent of married partners cheat on their spouses in any given year, so we’ll set that as our prior.”

Nate continues,

“If we’ve estimated these values, Bayes’s theorem can then be applied to establish a posterior probability. This is the number that we’re interested in: how likely is it that we’re being cheated on, given that we’ve found the underwear?”

Plugging the three values into Bayes’s algebraic equation, XY / (XY + Z(1 − X)), delivers a final probability that he is cheating on you of 29 percent.
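The arithmetic above can be sketched in a few lines of Python, using the three estimates quoted earlier (the function and variable names here are my own, not Nate Silver’s):

```python
def bayes_posterior(x, y, z):
    """Posterior probability of the hypothesis given the event.

    x: prior probability that the hypothesis is true
    y: probability of the event if the hypothesis is true
    z: probability of the event if the hypothesis is false
    """
    return (x * y) / (x * y + z * (1 - x))

x = 0.04  # prior: roughly 4 percent of partners cheat in a given year
y = 0.50  # probability of the underwear appearing, given cheating
z = 0.05  # probability of an innocent explanation, given no cheating

posterior = bayes_posterior(x, y, z)
print(f"{posterior:.0%}")  # prints 29%
```

The denominator is simply the total probability of finding the underwear at all, whether or not he is cheating, which is why the posterior is a proper probability between 0 and 1.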

“This may still seem counterintuitive—aren’t those panties pretty incriminating? But it stems mostly from the fact that you had assigned a low prior probability to him cheating. Although an innocent man has fewer plausible explanations for the appearance of the panties than a guilty one, you had started out thinking he was an innocent man, so that weighs heavily into the equation.”

Before the new information, the probability of him cheating was just 4 percent. The new information was presented and weighed, and it increased this probability to 29 percent. Objectively that’s a significant, roughly seven-fold increase; subjectively, however, our emotional reaction deems it to be too low.
It seems that when we are presented with new and uncomfortable information, our emotions naturally take over and we immediately jump to an extreme.

Bayes’s Theorem teaches us to take a statistical approach to reasoning. Perhaps more importantly, it teaches us to constantly re-evaluate our beliefs as we come across new information. Too often we form an opinion and then stubbornly stick to it, believing that wavering would be a sign of weakness. We should, however, be open to new information. We should be open to contradictory evidence. In fact, we should actively seek it out and adjust our course accordingly. We should seek to improve our position, not defend it.

Hey. I’m Alex Murrell. I'm a Planner at Epoch Design in Bristol where I help deliver highly creative, innovative and effective pack, instore and online communications for some of the world’s biggest FMCG brands. Want to know more? You can find me on Twitter or LinkedIn.
