Bertillonage

Matt Wolfe, writing for The New Republic:

“Ever since the old practice of marking convicts by mutilating their ears or branding their skin had been abolished, police had lacked a reliable system to determine if a person had a criminal record. Some larger departments assembled photographic rogues’ galleries, but photos were imprecise: People could look alike, or alter their appearance. And matching a suspect to his rap sheet might require leafing, one by one, through hundreds or thousands of pictures. Police chiefs began offering cash bonuses to any officer who successfully recognized a felon.

A solution was engineered in 1881 by Alphonse Bertillon, a sickly, 26-year-old record keeper for the Parisian police. The black sheep in a family of social science luminaries—his father had co-founded the School of Anthropology in Paris—Bertillon worked in the basement at police headquarters, tasked with transcribing physical descriptions of criminals, most of them incomplete or ambiguous. Frustrated, he sought to apply to his work some of his family’s scientific precision. Using a pair of calipers, he studied the physical attributes of prison inmates and found eleven measures unlikely to alter with age or a change in weight, such as sitting height, arm span, and the length and breadth of the head. He concluded that, while two people might share one measurement, the odds of sharing eleven were infinitesimally small. These figures—human flesh rendered into numbers—were placed on a single index card. This system, known as Bertillonage, was quickly put into wide use. An officer could now locate a suspect’s file in minutes, verifying his identity and hitching him, eternally, to his criminal record.”
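Bertillon's "infinitesimally small" odds follow from the multiplication rule for independent events. As a back-of-the-envelope sketch, with an entirely hypothetical 1-in-4 match rate per measurement (the article gives no figure):

```python
# Hypothetical: each of the eleven measurements matches a random
# person with probability 1/4, and the measurements are independent.
p_single = 1 / 4
p_all_eleven = p_single ** 11
print(f"{p_all_eleven:.2e}")  # ~2.4e-07, roughly 1 in 4.2 million
```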

Statistics and true crime. What’s not to like?

Law of large numbers

From Priceless by American author, columnist and skeptic William Poundstone:

“This says that flipping a fair coin a large number of times will give you a percentage of heads close to 50. That is all you can ask of a fair coin. You can’t predict the outcome of a small number of tosses. However, Tversky and Kahneman noted, people want to believe just that. They suppose that flipping a coin 10 times will yield five heads and five tails, or something close to it. In reality, lopsided outcomes (like eight heads and two tails) are more common than people believe. Tversky and Kahneman surveyed some mathematical psychologists at a meeting and found that even the experts were subject to this error. The article’s most memorable line displays a playful wit rarely encountered in scientific papers: ‘people’s intuitions about random sampling appear to satisfy the law of small numbers, which asserts that the law of large numbers applies to small numbers as well.’”

Put simply, the more times an experiment is run, the closer the average result will be to the expected value.
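A quick simulation makes both halves of the point visible: long runs settle near 50 percent, short runs swing. This is an illustrative sketch (the sample sizes and seed are arbitrary, not from Poundstone):

```python
import random

def heads_fraction(n_tosses: int) -> float:
    """Fraction of heads observed in n_tosses flips of a fair coin."""
    return sum(random.random() < 0.5 for _ in range(n_tosses)) / n_tosses

random.seed(42)  # fixed seed so the run is reproducible

# The law of large numbers: bigger samples settle closer to 0.5.
for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9,} tosses: {heads_fraction(n):.4f} heads")

# The "law of small numbers" error: lopsided 10-toss runs are common.
trials = 100_000
lopsided = sum(abs(heads_fraction(10) - 0.5) >= 0.3 for _ in range(trials))
print(f"8-2 or worse: {lopsided / trials:.1%} of {trials:,} ten-toss runs")
```

Runs as lopsided as 8–2 turn up in roughly one ten-toss trial in nine, which is exactly the kind of result people intuitively dismiss as unlikely.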

Chaos theory

The following extract is taken from Nate Silver’s book on the art and science of prediction, ‘The Signal and the Noise’.

“You may have heard the expression: the flap of a butterfly’s wings in Brazil can set off a tornado in Texas. It comes from the title of a paper delivered in 1972 by MIT’s Edward Lorenz, who began his career as a meteorologist.”

Later,

“Lorenz and his team were working to develop a weather forecasting program on an early computer known as a Royal McBee LGP-30. They thought they were getting somewhere until the computer started spitting out erratic results. They began with what they thought was exactly the same data and ran what they thought was exactly the same code—but the program would forecast clear skies over Kansas in one run, and a thunderstorm in the next.

After spending weeks double-checking their hardware and trying to debug their program, Lorenz and his team eventually discovered that their data wasn’t exactly the same: one of their technicians had truncated it in the third decimal place. Instead of having the barometric pressure in one corner of their grid read 29.5168, for example, it might instead read 29.517. Surely this couldn’t make that much difference?

Lorenz realized that it could. The most basic tenet of chaos theory is that a small change in initial conditions—a butterfly flapping its wings in Brazil—can produce a large and unexpected divergence in outcomes—a tornado in Texas. This does not mean that the behavior of the system is random, as the term “chaos” might seem to imply. Nor is chaos theory some modern recitation of Murphy’s Law (“whatever can go wrong will go wrong”). It just means that certain types of systems are very hard to predict.

The problem begins when there are inaccuracies in our data. (…). Imagine that we’re supposed to be taking the sum of 5 and 5, but we keyed in the second number wrong. Instead of adding 5 and 5, we add 5 and 6. That will give us an answer of 11 when what we really want is 10. We’ll be wrong, but not by much: addition, as a linear operation, is pretty forgiving. Exponential operations, however, extract a lot more punishment when there are inaccuracies in our data. If instead of taking 5^5—which should be 3,125—we instead take 5^6 (five to the sixth power), we wind up with an answer of 15,625. That’s way off: we’ve missed our target by 500 percent.

This inaccuracy quickly gets worse if the process is dynamic, meaning that our outputs at one stage of the process become our inputs in the next. For instance, say that we’re supposed to take five to the fifth power, and then take whatever result we get and raise it to the fifth power again. If we’d made the error described above, and substituted a 6 for the second 5, our results will now be off by a factor of more than 3,000. Our small, seemingly trivial mistake keeps getting larger and larger.”
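Silver’s arithmetic is easy to check; a few purely illustrative lines of Python:

```python
# Linear error: keying 6 instead of 5 misses the target by 1.
print(5 + 5, 5 + 6)                    # 10 vs 11

# Exponential error: the same slip now misses by 400-500 percent.
print(5 ** 5, 5 ** 6)                  # 3125 vs 15625

# Dynamic and nonlinear: feed each result back in, and the gap
# between the two runs blows up to a factor of 5**5 = 3125.
print((5 ** 6) ** 5 // (5 ** 5) ** 5)  # 3125
```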

Silver summarises,

“Chaos theory applies to systems in which each of two properties hold:

  • The systems are dynamic, meaning that the behavior of the system at one point in time influences its behavior in the future;
  • And they are nonlinear, meaning they abide by exponential rather than additive relationships.”

When systems are dynamic and nonlinear, small changes compound, and keep compounding, exponentially.

In tightly coupled systems, an event has knock-on effects, like the toppling of the first domino. In chaotic systems, for every domino that falls, two get added to the line.
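Lorenz’s weather model is far too large to reproduce here, but the same sensitivity shows up in the logistic map, a textbook one-line chaotic system. The sketch below is mine, not Silver’s; the starting values just echo the third-decimal truncation that tripped up Lorenz’s technician:

```python
def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map, a textbook chaotic system."""
    return r * x * (1 - x)

# Two starting points differing only in the third decimal place,
# echoing the technician's truncation of 29.5168 to 29.517.
a, b = 0.29516, 0.295
for step in range(1, 31):
    a, b = logistic(a), logistic(b)
    if step % 5 == 0:
        print(f"step {step:2d}: {a:.5f} vs {b:.5f}  (gap {abs(a - b):.5f})")
```

Within a few dozen iterations the two trajectories bear no resemblance to each other, even though they started a sixth of a thousandth apart.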

Bayes’s Theorem

The following extract is taken from Nate Silver’s book “The Signal and the Noise”.

Thomas Bayes’s paper, “‘An Essay toward Solving a Problem in the Doctrine of Chances’, was not published until after his death, when it was brought to the Royal Society’s attention in 1763 by a friend of his named Richard Price. It concerned how we formulate probabilistic beliefs about the world when we encounter new data.


Network effect

According to Wikipedia:

A network effect […] is the effect that one user of a good or service has on the value of that product to other people. When a network effect is present, the value of a product or service is dependent on the number of others using it.

In essence, if more people using a system makes the system more valuable, a network effect is at play.
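One common (and much-debated) way to make that quantitative is Metcalfe’s law, which ties a network’s value to its number of possible pairwise connections. A small sketch of the counting, illustrative only:

```python
def possible_connections(users: int) -> int:
    """Pairwise links among `users` people: n * (n - 1) / 2."""
    return users * (users - 1) // 2

# On this crude proxy, value grows quadratically, not linearly.
for n in (2, 10, 100, 1_000):
    print(f"{n:>5,} users -> {possible_connections(n):>8,} possible connections")
```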
