Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


The long con


In 1969, Jay W. Forrester, a computer scientist and professor of management at M.I.T., published the paper “Planning Under the Dynamic Influences of Complex Social Systems.” Despite its daunting title, it’s really about the limits of intuition, which is a subject that I’ve often discussed here before. Forrester, who died just last year, specialized in the analysis of complex systems, which he defined as “high-order, multiple-loop, nonlinear, feedback structures,” such as cities, governments, or economies. And he pointed out that when it comes to dealing with such structures, our usual cognitive tools aren’t just useless, but actively misleading:

These complex systems have characteristics which are commonly unknown. They are far different from the simple systems on which our intuitive responses have been sharpened. They are different from the behavior studied in science and mathematics where only simple systems have received orderly attention and analysis…As a first characteristic of complex systems, their behavior appears counterintuitive to the average person. Intuition and judgment, generated by a lifetime of experience with the simple systems that surround one’s every action, create a network of expectations and perceptions which could hardly be better designed to mislead the unwary when he moves into the realm of complex systems.

Forrester, who was deeply influenced by the cybernetics of Norbert Wiener, noted that we’re used to dealing with simple systems in which the connection between cause and effect is relatively clear. From such everyday activities as picking up an object from a table or driving a car, Forrester wrote, we learn “the obvious and ever-present fact that cause and effect are closely related in time and in space. A difficulty or failure of the simple system is observed immediately. The cause is obvious and immediately precedes the consequences.” This isn’t true of complex systems. And it isn’t just a matter of finding the correct answer, but of resisting those that are plausible, but wrong:

When one goes to complex systems all of these facts become fallacies. Cause and effect are no longer closely related either in time or in space. Causes of a symptom may actually lie in some far distant sector of a social system. Furthermore, symptoms may appear long after the primary causes. But the complex system is far more devious and diabolical than merely being different from the simple systems with which we have had experience. Not only is it truly different, but it appears to be the same. Having been conditioned to look close by for the cause of the trouble, the complex system provides a plausible relationship and pattern for us to discover. When we look nearby in time and in location, we find what appears to be a cause, but actually it is only a coincident symptom. Variables in complex systems are highly correlated, but time correlation means little in distinguishing cause and effect.

He concluded: “With a high degree of confidence we can say that the intuitive solutions to the problems of complex social systems will be wrong most of the time.”

This seems true enough, if we go by the track record of politicians, economists, and social scientists. But the most striking thing about Forrester’s discussion, at least to my eyes, was how he personified the system itself. He described it as “devious and diabolical,” writing a little later: “Again the complex system is cunning in its ability to mislead.” It sounds less like an impersonal problem than an active adversary, foiling any attempt to control it by distracting us with red herrings and tantalizing false hypotheses. To put it another way, it’s a con artist who presents us with one obvious—and tempting—interpretation of events while concealing the truth, which is often removed in time and space. You could even turn this argument on its head, and say that the most effective liars and deceivers are the ones who take their cues from the way in which the world misleads us, doing consciously what the universe does on account of its own complexity. The successful con artist is the one who can trick us into using our intuitions about patterns of behavior to draw erroneous conclusions. And even if this is nothing but a metaphor, it reminds us to actively distrust our own inclinations. Con artists prey on hope and greed, assuming that our usual standards of conduct and rationality go out the window as soon as we see some benefit for ourselves. As a result, we should be particularly suspicious of solutions that we want to believe. If an answer seems too good to be true, it probably is. (One good example is the Laffer curve, a drawing on a napkin that has influenced tax policy for over forty years with minimal evidence, simply because it tells people what they want to hear.)

So how do we beat the long con of complexity? Forrester’s answer, which was innovative at the time, was to use computer models, which removed analysis from the fallible realm of human intuition. Even if we accept this on principle, it isn’t an option that most of us have—but we also engage in a kind of mental modeling at all times. Elsewhere, Forrester wrote that in generating models, you should start with one question: “Where is the boundary that encompasses the smallest number of components within which the dynamic behavior under study is generated?” Break it down, and you’re left with what we all do, consciously or otherwise, when we try to figure out the world around us. We draw a boundary around the problem that ideally includes all relevant information while excluding irrelevancies and noise. If we’ve done it properly, it contains just the right number of pieces: not so many that we become overwhelmed, and not so few that we miss something important. We do this whenever we make an outline or a prototype, and, less systematically, whenever we try to predict the future in any way. And the nice thing about a model, which basically consists of a set of rules, isn’t just that it can handle more variables than we can comfortably grasp, but that it keeps us at arm’s length from our own assumptions. Many of the most useful creative activities involve the manipulation of symbols according to fixed procedures, which is sometimes taken as a means of encouraging intuition, but is really a way to move past our initial hunches to ones that are less obvious. (Randomness, the most powerful creative tool of all, wrenches us out of familiar patterns of cause and effect.) It doesn’t always work, but sometimes it does. As George Box said a decade after Forrester: “All models are wrong, but some are useful.”
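Forrester’s actual models were elaborate industrial and urban simulations, but the flavor of the thing can be sketched in a few lines of Python. The model below—a backlog of work and a workforce hired in response to it—is my own invented toy, with made-up numbers, not anything from Forrester’s paper; it just illustrates the kind of feedback structure he had in mind:

```python
# A toy stock-and-flow model in the spirit of Forrester's system dynamics:
# work piles up in a backlog, and hiring chases the backlog with a delay.
# All variable names and numbers here are illustrative assumptions.

def simulate(hiring_delay, steps=120):
    backlog = 100.0          # stock: units of work waiting
    workforce = 10.0         # stock: people on the job
    inflow = 10.0            # new work arriving per step
    productivity = 1.0       # work completed per person per step
    target_backlog = 50.0
    adjustment_time = 20.0   # how gently hiring chases the backlog gap
    history = []
    for _ in range(steps):
        backlog += inflow - workforce * productivity
        # The feedback loop: hiring responds to the backlog gap,
        # but only gradually, over `hiring_delay` steps.
        desired = (inflow / productivity
                   + (backlog - target_backlog) / adjustment_time)
        workforce += (desired - workforce) / hiring_delay
        history.append(backlog)
    return history

fast = simulate(hiring_delay=2)   # hiring responds quickly
slow = simulate(hiring_delay=12)  # hiring responds slowly
```

With these particular numbers, the quick-hiring run settles smoothly toward the target, while the slow-hiring run overshoots and swings below it: a backlog “symptom” showing up around step sixty traces back to a hiring policy set dozens of steps earlier, which is exactly the separation of cause and effect in time that Forrester describes.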

Written by nevalalee

July 12, 2017 at 8:45 am

Quote of the Day


There is a not uncommon phenomenon—sometimes called mystical experience—from which a person emerges with the conviction that some unsolvable problem (like the purpose of existence) has been completely explained; one can’t remember quite how, only that it was answered so well as to leave no doubt at all. This, I venture, reflects some mental mechanism (perhaps one of last resort) that, in a state of particularly severe turmoil or distress, can short-circuit the entire intellectual process—by creating the illusion that the problem has been settled.

Marvin Minsky, “Jokes and their Relation to the Cognitive Unconscious”

Written by nevalalee

July 12, 2017 at 7:30 am


