In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:
This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.
I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.
And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.
Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:
What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.
This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.
In scientific thought we adopt the simplest theory which will explain all the facts under consideration and enable us to predict new facts of the same kind. The catch in this criterion lies in the word “simplest.” It is really an aesthetic canon, such as we find implicit in our criticisms of poetry or painting. The layman finds such a law as dx/dt = K(d²x/dy²) much less simple than “it oozes,” of which it is the mathematical statement. The physicist reverses this judgment, and his statement is certainly the more fruitful of the two, so far as prediction is concerned.
To know only one thing well is to have a barbaric mind: civilization implies the graceful relation of all varieties of experience to a central humane system of thought. The present age is peculiarly barbaric: introduce, say, a Hebrew scholar to an ichthyologist or an authority on Danish place names and the pair of them would have no single topic in common but the weather or the war…But that so many scholars are barbarians does not matter so long as a few of them are ready to help with their specialized knowledge the few independent thinkers, that is to say the poets, who try to keep civilization alive. The scholar is a quarryman, not a builder, and all that is required of him is that he should quarry cleanly. He is the poet’s insurance against factual error. It is easy enough for the poet in this hopelessly muddled and inaccurate modern world to be misled into false etymology, anachronism, and mathematical absurdity by trying to be what he is not. His function is truth, whereas the scholar’s is fact. Fact is not to be gainsaid; one may put it in this way, that fact is a Tribune of the People with no legislative right, but only the right of veto. Fact is not truth, but a poet who willfully defies fact cannot achieve truth.
It can be said with complete confidence that any scientist of any age who wants to make important discoveries must study important problems. Dull or piffling problems yield dull or piffling answers. It is not enough that a problem be “interesting”—almost any problem is interesting if it is studied in sufficient depth.
As an example of research work not worth doing, Lord Zuckerman invented the cruelly apt but not ridiculously farfetched example of a young zoology graduate who has decided to try to find out why thirty-six percent of sea urchin eggs have a tiny little black spot on them. This is not an important problem; such a graduate student will be lucky if his work commands the attention or interest of anyone except perhaps the poor fellow next door who is trying to find out why sixty-four percent of sea urchin eggs do not have a little black spot on them. Such a student has committed a kind of scientific suicide, and his supervisors are very much to blame. The example is purely imaginary, of course, for Lord Zuckerman knows very well that no sea urchin eggs are spotted.
It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.
Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)
Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?
The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”
Years ago, one of Broadway’s great play doctors and original writers commented that the classical three-act structure of a well-made play could be summed up this way: In the first act, you get a guy up in a tree. In the second act, you throw rocks at him. In the third act, you get him down again. When I told this to Steven [Spielberg], he observed that making Jaws was a four-act structure. “In Act One, I get into a tree, and for the next three acts, people throw rocks at me.”
At last night’s presidential debate, when moderator Chris Wallace asked if he would accept the outcome of the election, Donald Trump replied: “I’ll keep you in suspense, okay?” It was an extraordinary moment that immediately dominated the headlines, and not just because it was an unprecedented repudiation of a crucial cornerstone of the democratic process. Trump’s statement—it seems inaccurate to call it a “gaffe,” since it clearly reflects his actual views—was perhaps the most damaging remark anyone could have made in that setting, and it reveals a curious degree of indifference, or incompetence, in a candidate who has long taken pride in his understanding of the media. It was a short, unforgettable sound bite that could instantly be put to members of both parties for comment. And it wasn’t an arcane matter of policy or an irrelevant personal issue, but an instantly graspable attack on assumptions shared by every democratically elected official in America, and presumably by the vast majority of voters. Even if Trump had won the rest of the debate, which he didn’t, those six words would have erased whatever gains he might have made. Not only was it politically and philosophically indefensible, but it was a ludicrous tactical mistake, an unforced error in response to a question that he and his advisors knew was going to be asked. As Julia Azari put it during the live chat on FiveThirtyEight: “The American presidency is not the latest Tana French novel—leaders can’t keep the people in suspense.”
But the phrase that he used tells us a lot about Trump. I’m speaking as someone who has devoted my fair share of thought to suspense itself: I’ve written a trilogy of thrillers and blogged here about the topic at length. When I think about the subject, I often start with what John Updike wrote in a review of Nabokov’s Glory, which is that it “never really awakens to its condition as a novel, its obligation to generate suspense.” What Updike meant is that stories are supposed to make us wonder about what’s going to happen next, and it’s that state of pleasurable anticipation that keeps us reading. It can be an end in itself, but it can also be a literary tool for sustaining the reader’s interest while the writer tackles other goals. As Kurt Vonnegut once said of plot, it isn’t necessarily an accurate representation of life, but a way to keep readers turning pages. Over time, the techniques of suspense have developed to the point where you can simulate it using purely mechanical tricks. If you watch enough reality television, you start to notice how the grammar of the editing repeats itself, whether you’re talking about Top Chef or Project Runway or Jim Henson’s Creature Shop. The delay before the judges deliver their decision, the closeups of the faces of the contestants, the way in which an editor pads out the moment by inserting cutaways between every word that Padma Lakshmi says—these are all practical tools that can give a routine stretch of footage the weight of the verdict in the O.J. Simpson trial. You can rely on them when you can’t rely on the events of the show itself.
And the best trick of all is to have a host who keeps things moving whenever the contestants or guests start to drag. That’s where someone like Trump comes in. He’s an embarrassment, but he’s far from untalented, at least within the narrow range of competence in which he used to operate. When I spent a season watching The Celebrity Apprentice—my friend’s older sister was on it—I was struck by how little Trump had to do: he was only onscreen for a few minutes in each episode. But he was good at his job, and he was also the obedient instrument of his producers. He has approached the campaign with the same mindset, but with few of the resources that are at an actual reality show’s disposal. Trump’s strategy has been built around the idea that he doesn’t need to spend money on advertising or a ground game, as long as the media provides him with free coverage. It’s an interesting experiment, but there’s a limit to how effective it can be. In practice, Trump is less like the producer or the host than a contestant, which reduces him to acting like a reality star who wants to maximize his screen time: say alarming things, pick fights, act unpredictably, and generate the footage that the show needs, while never realizing that the incentives of the contestants and producers are fundamentally misaligned. (He should have just watched the first season of UnREAL.) When he says that he’ll keep us in suspense about accepting the results of the election, he’s just following the reality show playbook, which is to milk such climactic moments for all they’re worth.
Yet this approach has backfired, and television provides us with some important clues as to why. I once believed that the best analogy to Trump’s campaign was the rake gag made famous by The Simpsons. As producer Al Jean described it: “Sam Simon had a theory that if you repeat a joke too many times, it stops being funny, but if you keep on repeating it, it might get really funny.” Trump performed a rake gag in public for months. First we were offended when he made fun of John McCain’s military service; then he said so many offensive things that we became numb to it; and then it passed a tipping point, and we got really offended. I still think that’s true. But there’s an even better analogy from television, which is the practice of keeping the audience awake by killing off major characters without warning. As I’ve said here before, it’s a narrative trick that used to seem daring, but now it’s a form of laziness: it’s easier to deliver shocking death scenes than to tell interesting stories about the characters who are still alive. In Trump’s case, the victims are ideas, or key constituents of the electorate: minorities, immigrants, women. When Trump turned on Paul Ryan, it was the equivalent of one of those moments, like the Red Wedding on Game of Thrones, when you’re supposed to gasp and realize that nobody is safe. His attack on a basic principle of democracy might seem like more of the same, but there’s a difference. The strategy might work for a few seasons, but there comes a point at which the show cuts itself too deeply, and there aren’t any characters left that we care about. This is where Trump is now. And by telling us that he’s going to keep us in suspense, he may have just made the ending a lot less suspenseful.