Chaos theory

The following extract is taken from Nate Silver’s book on the art and science of prediction, ‘The Signal and the Noise’.

“You may have heard the expression: the flap of a butterfly’s wings in Brazil can set off a tornado in Texas. It comes from the title of a paper delivered in 1972 by MIT’s Edward Lorenz, who began his career as a meteorologist.”

Later,

“Lorenz and his team were working to develop a weather forecasting program on an early computer known as a Royal McBee LGP-30. They thought they were getting somewhere until the computer started spitting out erratic results. They began with what they thought was exactly the same data and ran what they thought was exactly the same code—but the program would forecast clear skies over Kansas in one run, and a thunderstorm in the next.

After spending weeks double-checking their hardware and trying to debug their program, Lorenz and his team eventually discovered that their data wasn’t exactly the same: one of their technicians had truncated it in the third decimal place. Instead of having the barometric pressure in one corner of their grid read 29.5168, for example, it might instead read 29.517. Surely this couldn’t make that much difference?

Lorenz realized that it could. The most basic tenet of chaos theory is that a small change in initial conditions—a butterfly flapping its wings in Brazil—can produce a large and unexpected divergence in outcomes—a tornado in Texas. This does not mean that the behaviour of the system is random, as the term “chaos” might seem to imply. Nor is chaos theory some modern recitation of Murphy’s Law (“whatever can go wrong will go wrong”). It just means that certain types of systems are very hard to predict.

The problem begins when there are inaccuracies in our data. (…). Imagine that we’re supposed to be taking the sum of 5 and 5, but we keyed in the second number wrong. Instead of adding 5 and 5, we add 5 and 6. That will give us an answer of 11 when what we really want is 10. We’ll be wrong, but not by much: addition, as a linear operation, is pretty forgiving. Exponential operations, however, extract a lot more punishment when there are inaccuracies in our data. If instead of taking 5⁵—which should be 3,125—we instead take 5⁶ (five to the sixth power), we wind up with an answer of 15,625. That’s way off: we’ve missed our target by 500 percent.

This inaccuracy quickly gets worse if the process is dynamic, meaning that our outputs at one stage of the process become our inputs in the next. For instance, say that we’re supposed to take five to the fifth power, and then take whatever result we get and apply it to the fifth power again. If we’d made the error described above, and substituted a 6 for the second 5, our results will now be off by a factor of more than 3,000. Our small, seemingly trivial mistake keeps getting larger and larger.”
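Silver’s arithmetic can be checked in a few lines of Python (a minimal sketch; the variable names are mine, not his):

```python
# A quick check of Silver's arithmetic (pure Python, no dependencies).

# Linear case: keying in 6 instead of 5 barely matters.
correct_sum = 5 + 5          # 10
wrong_sum = 5 + 6            # 11
print(wrong_sum - correct_sum)            # 1 -- "wrong, but not by much"

# Exponential case: the same keying error is punished far more harshly.
correct_power = 5 ** 5       # 3,125
wrong_power = 5 ** 6         # 15,625
print(wrong_power / correct_power)        # 5.0 -- five times the intended target

# Dynamic case: the output of one stage becomes the input of the next,
# i.e. the result is raised to the fifth power again.
correct_dynamic = correct_power ** 5      # (5 ** 5) ** 5
wrong_dynamic = wrong_power ** 5          # (5 ** 6) ** 5
print(wrong_dynamic / correct_dynamic)    # 3125.0 -- "off by a factor of more than 3,000"
```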
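The extract doesn’t give the actual model Lorenz was running, so the sketch below uses a standard textbook stand-in, the logistic map, to show the same truncation effect: two runs whose starting values differ by only 0.0002, much like the rounded barometric pressure, stay close for a while and then bear no resemblance to each other.

```python
# Toy illustration of sensitivity to initial conditions (not Lorenz's model).
# The logistic map x -> 4 * x * (1 - x) is a textbook chaotic system: two runs
# whose starting values differ by 0.0002 soon diverge completely.

def logistic(x, r=4.0):
    return r * x * (1 - x)

full = 0.5168        # the "full precision" input
truncated = 0.517    # the same input, truncated like the barometric pressure

for step in range(1, 31):
    full = logistic(full)
    truncated = logistic(truncated)
    if step % 5 == 0:
        print(f"step {step:2d}: full={full:.6f}  truncated={truncated:.6f}  "
              f"gap={abs(full - truncated):.6f}")
```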

Silver summarises,

“Chaos theory applies to systems in which each of two properties hold:

  • The systems are dynamic, meaning that the behavior of the system at one point in time influences its behavior in the future;
  • And they are nonlinear, meaning they abide by exponential rather than additive relationships.”

When systems are dynamic and nonlinear, small changes compound, and keep compounding, exponentially.

In tightly coupled systems, an event has knock-on effects, like the toppling of the first domino. In chaotic systems, for every domino that falls, two more get added to the line.
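The domino picture can be put in rough numbers too (a toy count, assuming one “generation” of toppling per step):

```python
# A toy count behind the domino analogy (assumption: one "generation" per step).
generations = 10

# Tightly coupled chain: each fallen domino topples exactly one more,
# so the total grows additively.
linear_total = generations

# Chaotic chain: every fallen domino adds two new dominoes to the line,
# so the number falling doubles each generation: 1, 2, 4, 8, ...
chaotic_total = sum(2 ** g for g in range(generations))   # 2**generations - 1

print(linear_total)     # 10
print(chaotic_total)    # 1023 -- exponential rather than additive growth
```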

Hey. I’m Alex Murrell. I’m a Planner at Epoch Design in Bristol where I help deliver highly creative, innovative and effective pack, in-store and online communications for some of the world’s biggest FMCG brands. Want to know more? You can find me on Twitter or LinkedIn.
