## The Limits of Knowledge, Part III: Big Surprises in Little Packages

(Part of a continuing series: part I and part II)

I love tales of serendipitous scientific discovery. A spot of mold in a Petri dish leads to penicillin; a spill on the stovetop becomes vulcanized rubber. The true hero is penetrating curiosity: instead of dumping a ruined experiment in the trash, the keen-eyed scientist frowns and wonders: *what does this mean*?

Most stories of serendipity occur among test tubes and Bunsen burners, but today computers allow numerical experiments and computational “accidents.” And one of the most paradigm-shattering accidents of the twentieth century involved neither dawdling clocks moving at the speed of light, nor slippery electrons dancing around an atom, but humble calculations of the weather.

Newton’s mechanics engendered a view of a steadily ticking universe, as intricate but as ultimately predictable as clockwork. In the century following Newton, physicists and mathematicians devised powerful conceptual tools with which to track the cosmic machinery.

This is not easy. Humans have a hard time thinking of more than one thing at a time. Most tools on a physicist’s utility belt are tricks to make a large, complex problem look smaller and simpler.

For example, total energy and total momentum must balance out at each step. (Please note that the conservation of energy, etc., is *not* something idly assumed just because Grandpa said so, but is based upon enormous amounts of empirical evidence and years of experiments designed to test said conservation.) Following the energy helps a physicist to intuit the broad sweep of mechanical evolution.

But there have always been ambitions to understand *everything*. In 1814 the French mathematician Pierre-Simon, marquis de Laplace, postulated an intellect who, at one moment knowing the positions and velocities of all particles in the universe, could predict all future events. Such a thought sent a chill through philosophers, for it shattered the concept of free will, shackling us to Newton’s implacable cogs of force and momentum.

#

Of course, quantum mechanics places fundamental limits on simultaneous knowledge of the position and velocity of any particle, as devoted readers of this blog know. But even without the uncertainty principle, Laplace’s intellect faces an uphill battle–and a little butterfly leads the opposition.

During the 1940s and 1950s, general-purpose electronic computers, the kind of tireless intellect Laplace envisioned, began to be developed, with ENIAC in 1946 and UNIVAC in 1951. By the early 1960s computers were being applied to a wide variety of problems, including meteorology.

One of those researchers was the meteorologist Edward Lorenz, working on a Royal McBee LGP-30. Hoping to unravel the master principles behind weather, Lorenz developed a simplified set of differential equations with only twelve variables and numerically solved them.

At one point Lorenz wanted to reproduce some earlier results, and so re-ran his program. Comparing old and new results, he found they quickly diverged, giving wildly disparate predictions.

The source of the discrepancy was something seemingly innocuous: Lorenz had kept only three out of six decimal places in the data for restarting the calculation, for example approximating 0.506127 as 0.506.

But this small difference, one part in ten thousand, grew exponentially, until the two runs looked utterly unlike.
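The effect is easy to reproduce on any modern machine. The sketch below is not Lorenz’s actual program: it uses his later, famous three-variable system (with the standard parameters σ=10, ρ=28, β=8/3) rather than the original twelve-variable model, and the step size, run length, and starting point are arbitrary choices for illustration. Two runs begin one part in ten thousand apart, and their separation is tracked:

```python
# Two runs of Lorenz's three-variable system, started one part in
# ten thousand apart, integrated with a simple fixed-step RK4 scheme.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # Lorenz's standard parameters

def lorenz(s):
    x, y, z = s
    return (SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z)

def rk4_step(s, dt):
    k1 = lorenz(s)
    k2 = lorenz(tuple(v + 0.5 * dt * k for v, k in zip(s, k1)))
    k3 = lorenz(tuple(v + 0.5 * dt * k for v, k in zip(s, k2)))
    k4 = lorenz(tuple(v + dt * k for v, k in zip(s, k3)))
    return tuple(v + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for v, a, b, c, d in zip(s, k1, k2, k3, k4))

def separation(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5

a = (1.0, 1.0, 1.0)
b = (1.0001, 1.0, 1.0)    # differs by one part in ten thousand
dt, steps = 0.01, 3000    # integrate to t = 30 (arbitrary choices)
max_sep = 0.0
for _ in range(steps):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    max_sep = max(max_sep, separation(a, b))
print(max_sep)  # grows from 1e-4 to the size of the attractor itself
```

The initial discrepancy does not merely grow; it grows until it is as large as the weather itself, at which point the two runs are effectively unrelated.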

Many scientists would have simply concluded that their set of equations was “pathological” and started over from scratch. But Lorenz saw deeper.

He had discovered a “sensitive dependence on initial conditions.” Originally Lorenz used the image of a seagull flapping its wings and changing the weather around the world, but later settled on the more poetic idea of a butterfly as the revolutionary.

Lorenz’s genius was to realize that this sensitive dependence on initial conditions–rechristened *chaos* by James Yorke in 1975–was not pathological or rare, but in fact common. It occurs in even simpler systems: Lorenz eventually boiled his equations down to just three variables, with the same result.

As it turns out, Lorenz was not the first to discover the principle of chaos. In the 1880s Henri Poincaré (who, always the bridesmaid, also helped lay the foundations for relativity) realized there would exist divergent solutions for systems of three or more bodies. But most physicists shied away from Poincaré’s finding, not believing the universe to be so obstinately unpredictable. It required large scale number-crunching to rub our noses in the truth.

#

Chaos is not randomness as found in quantum mechanics. Poincaré and Lorenz had discovered *deterministic chaos*. It turns out chaos easily arises in Newton’s clockwork. Laplace’s intellect is doomed to failure the moment it uses only a fixed number of decimal places.

And, importantly, chaos is more than just the slogan “a sensitive dependence on initial conditions.” It is a rigorous topic, bristling with many powerful mathematical tools. (This can be contrasted with so-called “complexity,” a field devoid of rigor but brimming with meaningless aphorisms, making it beloved of consultants and other charlatans.)

The *Lyapunov exponent*, for example, is a precisely defined measure of just *how* sensitive a system is to initial conditions.

And some systems are not chaotic, but “integrable”–for *each* variable there is a conserved quantity like energy or momentum, reducing the problem to a set of independent, one-dimensional systems. But *most* systems are chaotic, or have interwoven chaotic and nonchaotic regimes (in technical terms–and I know, Dear Reader, you love technical terms–interwoven regions with positive and negative Lyapunov exponents).

And then there is Lorenz’s three-variable system, whose two-lobed chaotic trajectory, a staple of screensavers everywhere and sometimes called the Lorenz butterfly, traces similar yet never quite identical paths over and over and over.

#

The butterfly effect has been used many times in both genre and non-genre fiction.

Ray Bradbury’s 1952 short story, “A Sound of Thunder,” anticipated the butterfly effect: a time-traveler crushes a Cretaceous butterfly and changes history.

It’s unknown if Lorenz was aware of Bradbury’s story.

Kim Stanley Robinson, whose sf often explores the vagaries of history, even wrote an essay titled “A Sensitive Dependence on Initial Conditions,” and his new and brilliant novel, *Galileo’s Dream*, envisions alternate paths history might have taken, hinging on Galileo’s trial before the Inquisition.

The butterfly effect also gives us pause as to whether Asimov’s psychohistory (invented well before Lorenz, to be sure) would ever have a chance; Asimov, with a Ph.D. in biochemistry, was clearly thinking in analogy with thermodynamics, statistical averages of the energy and momentum of trillions of random gas particles.

But a better paradigm for history might be the Lorenz butterfly. Each lobe, or *basin of attraction*, could represent a swing of the historical pendulum…except that no one could predict when and where the swing will occur.

And that’s part of the fun for sf writers. Let’s be honest: sf’s track record in predicting the future is poor. We got communication satellites and cell phones right, but where are the jetpacks? The orbiting Hiltons? The AIs plotting genocide of the human race? Sf cannot, and should not, try to play the role of Laplace’s intellect.

Instead, as Kim Stanley Robinson emphasizes, we write morality plays about how skittish and fickle history is. Down one path lies nuclear Armageddon, down another galactic utopia. And sitting at the crossroads, resting her wings for a moment, is our heroine: a little yellow butterfly.

Nicely done. I didn’t realize that the Cretaceous butterfly story predated the butterfly effect idea. Interesting.

I work with living systems (plant communities) in which initial conditions can make a huge difference. Not just what species are planted, but weather, soils, disturbance. You’d think that you would get the same things from the same starting point, but as any gardener can tell you, that isn’t always true. Probability-based tools — I’m partial to fuzzy logic — help. If you start with X, you’ll get Y most of the time, but Z sometimes, and once in a great while you’ll end up with Q. And even when you can predict aggregate behavior, the fate of a particular individual is still uncertain.

Thanks. It’s possible there is a connection between Bradbury and Lorenz, but my researches turned up nothing and several sources (on the internet, mind you, so take with a grain of NaCl) said there is no known connection.

Although I didn’t say so in the post, for many years I held out hope for Lorenz to win a Nobel prize. He certainly instituted a paradigm shift in physics and in my mind deserved one.

Wonderfully done, simultaneously rigorous and poetic as science should ideally be — and witty, too (pauvre Poincaré!). I agree that Lorenz deserved the Nobel. He personifies the dictum that chance favors the prepared mind.

I think that it’s fascinating to see how science often uncovers what is already embedded in the universe: coastlines, trees and snowflakes are fractal; daisies and nautilus shells describe a perfect helix; splicing happens without pause in our cells…

I wouldn’t be so hard on complexity, though. There’s a reason why consultants and other charlatans latch on to the concept: a bona fide complex phenomenon takes as long to explain as it does to describe. This means that anyone who can come up with a catchy soundbite can be elevated to the status of prophet.

There are some interesting proto-paradigms in complexity “theory,” most importantly self-organized criticality as evidenced in sandpiles, and a few others like Conway’s game of life. But there are precious few rigorous results, and I myself am tired of hearing bland aphorisms used as a substitute for actual results. It’s the physics equivalent of stating, “Organisms always evolve to a state of higher complexity” (which comes from the New Age so-called thinker Ken Wilber, and which was quoted in a “Science and Literature” course I sat in on, and deconstructed in real time, many years ago; my own response was to blurt out, “I’ve read better biology on the back of cereal boxes!”).

Well, most people think they “know” such biological “facts” as ever-increased complexity. To which my soundbite answer (less funny than yours, but more to the scientific point) is “bacteria”. Eubacteria, which are all the garden-variety beasties we normally encounter, streamlined their genomes, jettisoned introns and the splicing apparatus… and became wildly successful. This is also “complex” as a process. But “simplified” as an outcome.

The soundbite truisms become particularly annoying and potentially harmful when people use them to extrapolate matters of exobiology: constraints for life, for intelligence, for civilization.

Of course; I used bacteria as a counter-example to Wilber’s quote, once I had made my snarky comment…