Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

The strange loop of Westworld

The maze in Westworld

In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:

This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.

I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.

And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.

Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:

What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.

This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.

The test of tone

Brendan Gleeson and Colin Farrell in In Bruges

Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on April 22, 2014.

Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.

The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so your reaction is often how Roger Ebert felt upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.

Hugh Dancy on Hannibal

As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.

At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the great pleasures of watching Hannibal lay in how it learned to acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. Fargo remains the most fascinating drama on television in large part because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.

The Coco Chanel rule

Coco Chanel

“Before you leave the house,” the fashion designer Coco Chanel is supposed to have said, “look in the mirror and remove one accessory.” As much as I like it, I’m sorry to say that this quote is most likely apocryphal: you see it attributed to Chanel everywhere, but without the benefit of an original source, which implies that it’s one of those pieces of collective wisdom that have attached themselves parasitically to a famous name. Still, it’s valuable advice. It’s usually interpreted, correctly enough, as a reminder that less is more, but I prefer to think of it as a statement about revision. The quote isn’t about reaching simplicity from the ground up, but about taking something and improving it by subtracting one element, like the writing rule that advises you to cut ten percent from every draft. And what I like the most about it is that its moment of truth arrives at the very last second, when you’re about to leave the house. That final glance in the mirror, when it’s almost too late to make additional changes, is often when the true strengths and weaknesses of your decisions become clear, if you’re smart enough to distinguish it from the jitters. (As Jeffrey Eugenides said to The Paris Review: “Usually I’m turning the book in at the last minute. I always say it’s like the Greek Olympics—’Hope the torch lights.'”)

But which accessory should you remove? In the indispensable book Behind the Seen, the editor Walter Murch gives us an important clue, using an analogy from filmmaking:

An interior might have four different sources of light in it: the light from the window, the light from the table lamp, the light from the flashlight that the character is holding, and some other remotely sourced lights. The danger is that, without hardly trying, you can create a luminous clutter out of all that. There’s a shadow over here, so you put another light on that shadow to make it disappear. Well, that new light casts a shadow in the other direction. Suddenly there are fifteen lights and you only want four.

As a cameraman what you paradoxically do is have the gaffer turn off the main light, because it is confusing your ability to really see what you’ve got. Once you do that, you selectively turn off some of the lights and see what’s left. And you discover that, “OK, those other three lights I really don’t need at all—kill ’em.” But it can also happen that you turn off the main light and suddenly, “Hey, this looks great! I don’t need that main light after all, just these secondary lights. What was I thinking?”

This principle, which Murch elsewhere calls “blinking the key,” implies that you should take away the most important piece, or the accessory that you thought you couldn’t live without.

Walter Murch

This squares nicely with a number of principles that I’ve discussed here before. I once said that ambiguity is best created out of a network of specifics with one crucial piece removed, and when you follow the Chanel rule, on a deeper level, the missing accessory is still present, even after you’ve taken it off. The remaining accessories were presumably chosen with it in mind, and they preserve its outlines, resulting in a kind of charged negative space that binds the rest together. This applies to writing, too. “The Cask of Amontillado” practically amounts to a manual on how to wall up a man alive, but Poe omits the one crucial detail—the reason for Montresor’s murderous hatred—that most writers would have provided up front, and the result is all the more powerful. Shakespeare consistently leaves out key explanatory details from his source material, which renders the behavior of his characters more mysterious, but no less concrete. And the mumblecore filmmaker Andrew Bujalski made a similar point a few years ago to The New York Times Magazine: “Write out the scene the way you hear it in your head. Then read it and find the parts where the characters are saying exactly what you want/need them to say for the sake of narrative clarity (e.g., ‘I’ve secretly loved you all along, but I’ve been too afraid to tell you.’) Cut that part out. See what’s left. You’re probably close.”

This is a piece of advice that many artists could stand to take to heart, especially if they’ve been blessed with an abundance of invention. I like Interstellar, for instance, but I have a hunch that it would have been an even stronger film if Christopher Nolan had made a few cuts. If he had removed Anne Hathaway’s speech on the power of love, for instance, the same point would have come across in the action, but more subtly, assuming that the rest of the story justified its inclusion in the first place. (Of course, every film that Nolan has ever made strives valiantly to strike a balance between action and exposition, and in this case, it stumbled a little in the wrong direction. Interstellar is so openly indebted to 2001 that I wish it had taken a cue from that movie’s script, in which Kubrick and Clarke made the right strategic choice by minimizing the human element wherever possible.) What makes the Chanel rule so powerful is that when you glance in the mirror on your way out the door, what catches your eye first is likely to be the largest, flashiest, or most obvious component, which often adds the most by its subtraction. It’s the accessory that explains too much, or draws attention to itself, rather than complementing the whole, and by removing it, we’re consciously saying no to what the mind initially suggests. As Chanel is often quoted as saying: “Elegance is refusal.” And she was right—even if it was really Diana Vreeland who said it. 

“Open the bomb bay doors, please, Ken…”

Slim Pickens in Dr. Strangelove

After the legendary production designer Ken Adam died last week, I found myself browsing through the book Ken Adam: The Art of Production Design, a wonderfully detailed series of interviews conducted by the cultural historian Christopher Frayling. It’s full of great stories, but the one I found myself pondering the most is from the making of Dr. Strangelove. Stanley Kubrick had just cast Slim Pickens in the role of Major Kong, the pilot of the B-52 bomber that inadvertently ends up triggering the end of the world, and it led the director to a sudden brainstorm. Here’s how Adam tells it:

[The bomber set] didn’t have practical bomb doors—we didn’t need them in the script at that time—and the set was almost ready to shoot. And Stanley said, “We need practical bomb doors.” He wanted this Texan cowboy to ride the bomb like a bronco into the Russian missile site. I did some setups, sketches for the whole thing, and Stanley asked me when it would be ready. I said, “If I work three crews twenty-four hours a day, you still won’t have it for at least a week, and that’s too late.” So now I arrive at Shepperton and I’m having kittens because I knew it was a fantastic idea but physically, mechanically, we couldn’t get it done. So again it was Wally Veevers, our special effects man, who saved the day, saying he’d sleep on it and come up with an idea. He always did that, even though he was having heart problems and wasn’t well. Wally came back and said, “We’re going to take a ten-by-eight still of the bomb bay interior, cut out the bomb-door opening, and shoot the bomb coming down against blue backing.” And that’s the way they did it.

I love this story for a lot of reasons. The first is the rare opportunity it affords to follow Kubrick’s train of thought. He had cast Peter Sellers, who was already playing three other lead roles, as Major Kong, but the performance wasn’t working, and when Sellers injured his ankle, Kubrick used this as an excuse to bring in another actor. Slim Pickens brought his own aura of associations, leading Kubrick to the movie’s single most memorable image, which now seems all but inevitable. And he seemed confident that any practical difficulties could be overcome. As Adam says elsewhere:

[Kubrick] had this famous theory in those days that the director had the right to change his mind up until the moment the cameras started turning. But he changed his mind after the cameras were rolling! For me, it was enormously demanding, because until then I was basically a pretty organized person. But I wasn’t yet flexible enough to meet these sometimes impossible demands that he came up with. So I was going through an anxiety crisis. But at the same time I knew that every time he changed his mind, he came up with a brilliant idea. So I knew I had to meet his demands in some way, even if it seemed impossible from a practical point of view.

Which just serves as a reminder that for Kubrick, who is so often characterized as the most meticulous and obsessive of directors, an intense level of preparation existed primarily to enable those moments in which the plan could be thrown away—a point that even his admirers often overlook.

Design by Ken Adam for Dr. Strangelove

It’s also obvious that Kubrick couldn’t have done any of this if he hadn’t surrounded himself with brilliant collaborators, and his reliance on Adam testifies to his belief that he had found someone who could translate his ideas into reality. (He tried and failed to get Adam to work with him on 2001, and the two reunited for Barry Lyndon, for which Adam deservedly won an Oscar.) We don’t tend to think of Dr. Strangelove as a movie that solved enormous technical problems in the way that some of Kubrick’s other projects did, but like any film, it presented obstacles that most viewers will never notice. Creating the huge maps in the war room, for instance, required a thousand hundred-watt bulbs installed behind perspex, along with an improvised air-conditioning system to prevent the heat from blistering the transparencies. Like the bomb bay doors, it’s the sort of issue that would probably be solved today with digital effects, but the need to address it on the set contributes to the air of authenticity that the story demands. Dr. Strangelove wouldn’t be nearly as funny if its insanities weren’t set against a backdrop of painstaking realism. Major Kong is a loving caricature, but the bomber he flies isn’t: it was reconstructed down to the tiniest detail from photos in aeronautical magazines. And there’s a sense in which Kubrick, like Christopher Nolan, embraced big logistical challenges as a way to combat a tendency to live in his own head—which is the one thing that these two directors, who are so often mentioned together, really do have in common.

There’s also no question that this was hard on Ken Adam, who was driven to something close to a nervous breakdown during the filming of Barry Lyndon. He says:

I became so neurotic that I bore all of Stanley’s crazy decisions on my own shoulders. I was always apologizing to actors for something that had gone wrong. I felt responsible for every detail of Stanley’s film, for all his mistakes and neuroses. I was apologizing to actors for Stanley’s unreasonable demands.

In Frayling’s words, Adam was “the man in the middle, with a vengeance.” And if he ended up acting as the ambassador, self-appointed or otherwise, between Kubrick and the cast and crew, it isn’t hard to see why: the production designer, then as now, provides the primary interface between the vision on the page—or in the director’s head—and its realization as something that can be captured on film. It’s a role that deserves all the more respect at a time when physical sets are increasingly being replaced by digital environments that live somewhere on a hard drive at Weta Digital. A director is not a designer, and even Adam says that Kubrick “didn’t know how to design,” although he also states that the latter could have taken over any number of the other technical departments. (This wasn’t just flattery, either. Years later, Adam would call Kubrick, in secret, to help him light the enormous supertanker set for The Spy Who Loved Me.) A director has to be good at many things, but it all emerges from a willingness to confront the problems that arise where the perfect collides with the possible. And it’s to the lasting credit of both Kubrick and Adam that they never flinched from that single combat, toe to toe with reality.

“He had played his part admirably…”

"Laszlo, the bosun of the megayacht..."

Note: This post is the forty-first installment in my author’s commentary for Eternal Empire, covering Chapter 40. You can read the previous installments here.

A few weeks ago, I briefly discussed the notorious scene in The Dark Knight Rises in which Bruce Wayne reappears—without any explanation whatsoever—in Gotham City. Bane’s henchmen, you might recall, have blown up all the bridges and sealed off the area to the military and law enforcement, and the entire plot hinges on the city’s absolute isolation. Bruce, in turn, has just escaped from a foreign prison, and although its location is left deliberately unspecified, it sure seems like it was in a different hemisphere. Yet what must have been a journey of thousands of miles and a daring incursion is handled in the space of a single cut: Bruce simply shows up, and there isn’t even a line of dialogue acknowledging how he got there. Not surprisingly, this hiatus has inspired a lot of discussion online, with most explanations boiling down to “He’s Batman.” If asked, Christopher Nolan might reply that the specifics don’t really matter, and that the viewer’s attention is properly focused elsewhere, a point that the writer John Gardner once made with reference to Hamlet:

We naturally ask how it is that, when shipped off to what is meant to be his death, the usually indecisive prince manages to hoist his enemies with their own petard—an event that takes place off stage and, at least in the surviving text, gets no real explanation. If pressed, Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to…

Gardner concludes: “The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.” And while this might seem to apply equally well to The Dark Knight Rises, it doesn’t really hold water. The absence of an explanation did yank many of us out of the movie, however briefly, and it took us a minute to settle back in. Any explanation at all would have been better than this, and it could have been conveyed in less than a sentence. It isn’t an issue of plausibility, but of narrative flow. You could say that Bruce’s return to the city ought to be omitted, in the same way a director like Kurosawa mercilessly cuts all transitional moments: when you just need to get a character from Point A to Point B, it’s best to trim the journey as much as you can. In this instance, however, Nolan erred too much on one side, at least in the eyes of many viewers. And it’s a reminder that the rules of storytelling are all about context. You’ve got to judge each problem on its own terms and figure out the solution that makes the most sense in each case.

"He had played his part admirably..."

What’s really fascinating is how frequently Nolan himself seems to struggle with this issue. In terms of sheer technical proficiency, I’d rank him near the top of the list of all working directors, but if he has one flaw as a filmmaker, aside from his lack of humor, it’s his persistent difficulty in finding the right balance between action and exposition. Much of Inception, which is one of my ten favorite movies of all time, consists of the characters breathlessly explaining the plot to one another, and it more or less works. But he also spends much of Interstellar trying with mixed success to figure out how much to tell us about the science involved, leading to scenes like the one in which Dr. Romilly explains the wormhole to Cooper seemingly moments before they enter it. And Nolan is oddly prone to neglecting obligatory beats that the audience needs to assemble the story in their heads, as when Batman appears to abandon a room of innocent party guests to the Joker in The Dark Knight. You could say that such lapses simply reflect the complexity of the stories that Nolan wants to tell, and you might be right. But David Fincher, who is Nolan’s only peer among active directors, tells stories of comparable or greater complexity—indeed, they’re often about their own complexity—and we’re rarely lost or confused. And if I’m hard on Nolan about this, it’s only a reflection of how difficult such issues can be, when even the best mainstream director of his generation has trouble working out how much information the audience needs.

It all boils down to Thomas Pynchon’s arch aside in Gravity’s Rainbow: “You will want cause and effect. All right.” And knowing how much cause will yield the effect you need is a problem that every storyteller has to confront on a regular basis. Chapter 40 of Eternal Empire provides a good example. For the last hundred pages, the novel has been building toward the moment when Ilya sneaks onto the heavily guarded yacht at Yalta. There’s no question that he’s going to do it; otherwise, everything leading up to it would seem like a ridiculous tease. The mechanics of how he gets aboard don’t really matter, but I also couldn’t avoid the issue, or else readers would rightly object. All I needed was a solution that was reasonably plausible and that could be covered in a few pages. As it happens, the previous scene ends with this exchange between Maddy and Ilya: “But you can’t just expect to walk on board.” “That’s exactly what I intend to do.” When I typed those lines, I didn’t know what Ilya had in mind, but I knew at once that they pointed at the kind of simplicity that the story needed, at least at this point in the novel. (If it came later in the plot, as part of the climax, it might have been more elaborate.) So I came up with a short sequence in which Ilya impersonates a dockwalker looking for work on the yacht, cleverly ingratiates himself with the bosun, and slips below when Maddy provides a convenient distraction. It’s a cute scene—maybe a little too cute, in fact, for this particular novel. But it works exactly as well as it should. Ilya is on board. We get just enough cause and effect. And now we can move on to the really good stuff to come…

“And what does that name have to do with this?”

"The word on the side of your yacht..."

Note: This post is the thirtieth installment in my author’s commentary for Eternal Empire, covering Chapter 29. You can read the previous installments here.

Earlier this week, in response to a devastating article in the New York Times on the allegedly crushing work environment in Amazon’s corporate offices, Jeff Bezos sent an email to employees that included the following statement:

[The article] claims that our intentional approach is to create a soulless, dystopian workplace where no fun is had and no laughter is heard. Again, I don’t recognize this Amazon and I very much hope you don’t, either…I strongly believe that anyone working in a company that really is like the one described in the [Times] would be crazy to stay. I know I would leave such a company.

Predictably, the email resulted in numerous headlines along the lines of “Jeff Bezos to Employees: You Don’t Work in a Dystopian Hellscape, Do You?” Bezos, a very smart guy, should have seen it coming. As Richard Nixon learned a long time ago, whenever you tell people that you aren’t a crook, you’re really raising the possibility that you might be. If you’re concerned about the names that your critics might call you, the last thing you want to do is put words in their mouths—it’s why public relations experts advise their clients to avoid negative language, even in the form of a denial—and saying that Amazon isn’t a soulless, dystopian workplace is a little like asking us not to think of an elephant.

Writers have recognized the negative power of certain loaded terms for a long time, and many works of art go out of their way to avoid such words, even if they’re central to the story. One of my favorite examples is the film version of The Girl With the Dragon Tattoo. Coming off Seven and Zodiac, David Fincher didn’t want to be pigeonholed as a director of serial killer movies, so the dialogue exclusively uses the term “serial murderer,” although it’s doubtful how effective this was. Along the same lines, Christopher Nolan’s superhero movies are notably averse to calling their characters by their most famous names: The Dark Knight Rises never uses the name “Catwoman,” while Man of Steel, which Nolan produced, avoids “Superman,” perhaps following the example of Frank Miller’s The Dark Knight Returns, which indulges in similar circumlocutions. Robert Towne’s script for Greystoke never calls its central character “Tarzan,” and The Walking Dead uses just about every imaginable term for its creatures aside from “zombie,” for reasons that creator Robert Kirkman explains:

One of the things about this world is that…they’re not familiar with zombies, per se. This isn’t a world [in which] the Romero movies exist, for instance, because we don’t want to portray it that way…They’ve never seen this in pop culture. This is a completely new thing for them.

"And what does that name have to do with this?"

Kirkman’s reluctance to call anything a zombie, which has inspired an entire page on TV Tropes dedicated to similar examples, is particularly revealing. A zombie movie can’t use that word because an invasion of the undead needs to feel like something unprecedented, and falling back on a term we know conjures up all kinds of pop cultural connotations that an original take might prefer to avoid. In many cases, avoiding particular words subtly encourages us to treat the story on its own terms. In The Godfather, the term “Mafia” is never uttered—an aversion, incidentally, not shared by the original novel, the working title of which was actually Mafia. This quietly allows us to judge the Corleones according to the rules of their own closed world, and it circumvents any real reflection about what the family business actually involves. (According to one famous story, the mobster Joseph Colombo paid a visit to producer Al Ruddy, demanding that the word be struck from the script as a condition for allowing the movie to continue. Ruddy, who knew that the screenplay only used the word once, promptly agreed.) The Godfather Part II is largely devoted to blowing up the first movie’s assumptions, and when the word “Mafia” is uttered at a senate hearing, it feels like the real world intruding on a comfortable fantasy. And the moment wouldn’t be as effective if the first installment hadn’t been as diligent about avoiding the term, allowing it to build a new myth in its place.

While writing Eternal Empire, I found myself confronting a similar problem. In this case, the offending word was “Shambhala.” As I’ve noted before, I decided early on that the third novel in the series would center on the Shambhala myth, a choice I made as soon as I stumbled across an excerpt from Rachel Polonsky’s Molotov’s Magic Lantern, in which she states that Vladimir Putin had taken a particular interest in the legend. A little research, notably in Andrei Znamenski’s Red Shambhala, confirmed that the periodic attempts by Russia to verify the existence of that mythical kingdom, carried out in an atmosphere of espionage and spycraft in Central Asia, were a rich vein of material. The trouble was that the word “Shambhala” itself was so loaded with New Age connotations that I’d have trouble digging my way out from under it: a quick search online reveals that it’s the name of a string of meditation centers, a music festival, and a spa with its own line of massage oils, none of which is exactly in keeping with the tone that I was trying to evoke. My solution, predictably, was to structure the whole plot around the myth of Shambhala while mentioning it as little as possible: the name appears perhaps thirty times across four hundred pages. (The mythological history of Shambhala is treated barely at all, and most of the references occur in discussions of the real attempts by Russian intelligence to discover it.) The bulk of those references appear here, in Chapter 29, and I cut them all down as much as possible, focusing on the bare minimum I needed for Maddy to pique Tarkovsky’s interest. I probably could have cut them even further. But as it stands, it’s more or less enough to get the story to where it needs to be. And it doesn’t need to be any longer than it is…

Gatsby’s fortune and the art of ambiguity

F. Scott Fitzgerald

In November 1924, the editor Maxwell Perkins received the manuscript of a novel tentatively titled Trimalchio in West Egg. He loved the book—he called it “extraordinary” and “magnificent”—but he also had a perceptive set of notes for its author. Here are a few of them:

Among a set of characters marvelously palpable and vital—I would know Tom Buchanan if I met him on the street and would avoid him—Gatsby is somewhat vague. The reader’s eyes can never quite focus upon him, his outlines are dim. Now everything about Gatsby is more or less a mystery, i.e. more or less vague, and this may be somewhat of an artistic intention, but I think it is mistaken. Couldn’t he be physically described as distinctly as the others, and couldn’t you add one or two characteristics like the use of that phrase “old sport”—not verbal, but physical ones, perhaps…

The other point is also about Gatsby: his career must remain mysterious, of course…Now almost all readers numerically are going to feel puzzled by his having all this wealth and are going to feel entitled to an explanation. To give a distinct and definite one would be, of course, utterly absurd. It did occur to me, though, that you might here and there interpolate some phrases, and possibly incidents, little touches of various kinds, that would suggest that he was in some active way mysteriously engaged.

The novel, of course, ultimately appeared under the title The Great Gatsby, and before it was published, F. Scott Fitzgerald took many of the notes from Perkins to heart, adding more descriptive material on Gatsby himself—along with several repetitions of the phrase “old sport”—and the sources of his mysterious fortune. Like Tay Hohoff, whose work on To Kill a Mockingbird has recently come back into the spotlight, Perkins was the exemplar of the editor as shaper, providing valued insight and active intervention for many of the major writers of his generation: Fitzgerald, Hemingway, Wolfe. But my favorite part of this story lies in Fitzgerald’s response, which I think is one of the most extraordinary glimpses into craft we have from any novelist:

I myself didn’t know what Gatsby looked like or was engaged in and you felt it. If I’d known and kept it from you you’d have been too impressed with my knowledge to protest. This is a complicated idea but I’m sure you’ll understand. But I know now—and as a penalty for not having known first, in other words to make sure, I’m going to tell more.

Which is only to say that there’s a big difference between what an author deliberately withholds and what he doesn’t know himself. And an intelligent reader, like Perkins, will sense it.

On Growth and Form

And it has important implications for the way we create our characters. I’ve never been a fan of the school that advocates working out every detail of a character’s background, from her hobbies to her childhood pets: the questionnaires and worksheets that spring up around this impulse always seem like an excuse for procrastination. My own sense of character is closer to what D’Arcy Wentworth Thompson describes in On Growth and Form, in which an animal’s physical shape is determined largely by the outside pressures to which it is subjected. Plot emerges from character, yes, but there’s also a sense in which character emerges from plot: these men and women are distinguished primarily by the fact that they’re the only people in the world to whom these particular events could happen. When I combine this with my natural distrust of backstory, even if I’m retreating from this a bit, I’ll often find that there are important things about my characters I don’t know myself, even after I’ve lived with them for years. There can even be something a little false about keeping the past constantly present in a character’s mind, as we see in so much “realistic” fiction: even if we’re all the sum of our childhood experiences, in practice, we reveal more about ourselves in how we react to the pattern of forces in our lives at the moment, and our actions have a logic that can be worked out independently, as long as the situation is honestly developed.

But that doesn’t apply to issues, like the sources of Gatsby’s fortune, in which the reader’s curiosity might be reasonably aroused. If you’re going to hint at something, you’d better have a good idea of the answer, even if you don’t plan on sharing it. This applies especially to stories that generate a deliberate ambiguity, as Chris Nolan says of the ending of Inception:

Interviewer: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”

Nolan: Oh no, I’ve got an answer.

Interviewer: You do?!

Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated.

Ambiguity, as I’ve said elsewhere, is best created out of a network of specifics with one crucial piece removed. That specificity requires a great deal of knowledge on the author’s part, perhaps more here than anywhere else. And as Fitzgerald notes, if you do it properly, they’ll be too impressed by your knowledge to protest—or they’ll protest in all the right ways.
