Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Christopher Nolan’

Thinking on your feet


The director Elia Kazan, whose credits included A Streetcar Named Desire and On the Waterfront, was proud of his legs. In his memoirs, which the editor Robert Gottlieb calls “the most gripping and revealing book I know about the theater and Hollywood,” Kazan writes of his childhood:

Everything I wanted most I would have to obtain secretly. I learned to conceal my feelings and to work to fulfill them surreptitiously…What I wanted most I’d have to take—quietly and quickly—from others. Not a logical step, but I made it at a leap. I learned to mask my desires, hide my truest feeling; I trained myself to live in deprivation, in silence, never complaining, never begging, in isolation, without expecting kindness or favors or even good luck…I worked waxing floors—forty cents an hour. I worked at a small truck farm across the road—fifty cents an hour. I caddied every afternoon I could at the Wykagyl Country Club, carrying the bags of middle-aged women in long woolen skirts—a dollar a round. I spent nothing. I didn’t take trolleys; I walked. Everywhere. I have strong leg muscles from that time.

The italics are mine, but Kazan emphasized his legs often enough on his own. In an address that he delivered at a retrospective at Wesleyan University in 1973, long after his career had peaked, he told the audience: “Ask me how with all that knowledge and all that wisdom, and all that training and all those capabilities, including the strong legs of a major league outfielder, how did I manage to mess up some of the films I’ve directed so badly?”

As he grew older, Kazan’s feelings about his legs became inseparable from his thoughts on his own physical decline. In an essay titled “The Pleasures of Directing,” which, like the address quoted above, can be found in the excellent book Kazan on Directing, Kazan observes sadly: “They’ve all said it. ‘Directing is a young man’s game.’ And time passing proves them right.” He continues:

What goes first? With an athlete, the legs go first. A director stands all day, even when he’s provided with chairs, jeeps, and limos. He walks over to an actor, stands alongside and talks to him; with a star he may kneel at the side of the chair where his treasure sits. The legs do get weary. Mine have. I didn’t think it would happen because I’ve taken care of my body, always exercised. But I suddenly found I don’t want to play singles. Doubles, okay. I stand at the net when my partner serves, and I don’t have to cover as much ground. But even at that…

I notice also that I want a shorter game—that is to say also, shorter workdays, which is the point. In conventional directing, the time of day when the director has to be most able, most prepared to push the actors hard and get what he needs, usually the close-ups of the so-called “master scene,” is in the afternoon. A director can’t afford to be tired in the late afternoon. That is also the time—after the thoughtful quiet of lunch—when he must correct what has not gone well in the morning. He better be prepared, he better be good.

As far as artistic advice goes, this is as close to the real thing as it gets. But it can only occur to an artist who can no longer take for granted the energy on which he has unthinkingly relied for most of his life.

Kazan isn’t the only player in the film industry to draw a connection between physical strength—or at least stamina—and the medium’s artistic demands. Guy Hamilton, who directed Goldfinger, once said: “To be a director, all you need is a hide like a rhinoceros—and strong legs, and the ability to think on your feet…Talent is something else.” None other than Christopher Nolan believes so much in the importance of standing that he’s institutionalized it on his film sets, as Mark Rylance recently told The Independent: “He does things like he doesn’t like having chairs on set for actors or bottles of water, he’s very particular…[It] keeps you on your toes, literally.” Walter Murch, meanwhile, noted that a film editor needed “a strong back and arms” to lug around reels of celluloid, which is less of a concern in the days of digital editing, but still worth bearing in mind. Murch famously likes to stand while editing, like a surgeon in the operating room:

Editing is sort of a strange combination of being a brain surgeon and a short-order cook. You’ll never see those guys sitting down on the job. The more you engage your entire body in the process of editing, the better and more balletic the flow of images will be. I might be sitting when I’m reviewing material, but when I’m choosing the point to cut out of a shot, I will always jump out of the chair. A gunfighter will always stand, because it’s the fastest, most accurate way to get to his gun. Imagine High Noon with Gary Cooper sitting in a chair. I feel the fastest, most accurate way to choose the critically important frame I will cut out of a shot is to be standing. I have kind of a gunfighter’s stance.

And as Murch suggests, this applies as much to solitary craftsmen as it does to the social and physical world of the director. Philip Roth, who worked at a lectern, claimed that he paced half a mile for every page that he wrote, while the mathematician Robert P. Langlands reflected: “[My] many hours of physical effort as a youth also meant that my body, never frail but also initially not particularly strong, has lasted much longer than a sedentary occupation might have otherwise permitted.” Standing and walking can be a proxy for mental and moral acuity, as Bertrand Russell implied so memorably:

Our mental makeup is suited to a life of very severe physical labor. I used, when I was younger, to take my holidays walking. I would cover twenty-five miles a day, and when the evening came I had no need of anything to keep me from boredom, since the delight of sitting amply sufficed. But modern life cannot be conducted on these physically strenuous principles. A great deal of work is sedentary, and most manual work exercises only a few specialized muscles. When crowds assemble in Trafalgar Square to cheer to the echo an announcement that the government has decided to have them killed, they would not do so if they had all walked twenty-five miles that day.

Such energy, as Kazan reminds us, isn’t limitless. I still think of myself as relatively young, but I don’t have the raw mental or physical resources that I did fifteen years ago, and I’ve had to come up with various tricks—what a pickup basketball player might call “old-man shit”—to maintain my old levels of productivity. I’ve written elsewhere that certain kinds of thinking are best done sitting down, but there’s also a case to be made for thinking on your feet. Standing is the original power pose, and perhaps the only one likely to have any real effects. And it’s in the late afternoons, both of a working day and of an entire life, that you need to stand and deliver.

The Battle of Dunkirk


During my junior year in college, I saw Christopher Nolan’s Memento at the Brattle Theatre in Cambridge, Massachusetts, for no other reason than that I’d heard it was great. Since then, I’ve seen all of Nolan’s movies on their initial release, which is something I can’t say of any other director. At first, it was because I liked his work and his choices intrigued me, and it only occurred to me around the time of The Dark Knight that I was witnessing a career like no other. It’s tempting to compare Nolan to his predecessors, but when you look at his body of work from Memento to Dunkirk, it’s clear that he’s in a category of his own. He’s directed nine theatrical features in seventeen years, all mainstream critical and commercial successes, including some of the biggest movies in recent history. No other director alive comes close to that degree of consistency, at least not at the same level of productivity and scale. Quality and reliability alone aren’t everything, of course, and Nolan pales a bit compared to, say, Steven Spielberg, who over a comparable stretch of time went from The Sugarland Express to Hook, with Jaws, Close Encounters, E.T., and the Indiana Jones trilogy along the way, as well as 1941 and Always. By comparison, Nolan can seem studied, deliberate, and remote, and the pockets of unassimilated sentimentality in his work—which I used to assume were concessions to the audience, but now I’m not so sure—only point to how unified and effortless Spielberg is at his best. But the conditions for making movies have also changed over the last four decades, and Nolan has threaded the needle in ways that still amaze me, as I continue to watch his career unfold in real time.

Nolan sometimes reminds me of the immortal Byron the Bulb in Gravity’s Rainbow, of which Thomas Pynchon writes: “Statistically…every n-thousandth light bulb is gonna be perfect, all the delta-q’s piling up just right, so we shouldn’t be surprised that this one’s still around, burning brightly.” He wrote and directed one of the great independent debuts, leveraged it into a career making blockbusters, and slowly became a director from whom audiences expected extraordinary achievements while he was barely out of the first phase of his career. And he keeps doing it. For viewers of college age or younger, he must feel like an institution, while I can’t stop thinking of him as an outlier that has yet to regress to the mean. Nolan’s most significant impact, for better or worse, may lie in the sheer, seductive implausibility of the case study that he presents. Over the last decade or so, we’ve seen a succession of young directors, nearly all of them white males, who, after directing a microbudgeted indie movie, are handed the keys to a huge franchise. This has been taken as an instance of category selection, in which directors who look a certain way are given opportunities that wouldn’t be offered to filmmakers of other backgrounds, but deep down, I think it’s just an attempt to find the next Nolan. If I were an executive at Warner Bros. whose career had overlapped with his, I’d feel toward him what Goethe felt of Napoleon: “[It] produces in me an impression like that produced by the Revelation of St. John the Divine. We all feel there must be something more in it, but we do not know what.” Nolan is the most exciting success story to date of a business model that he defined and that, if it worked, would solve most of Hollywood’s problems, in which independent cinema serves as a farm team for directors who can consistently handle big legacy projects that yield great reviews and box office. And it’s happened exactly once.

You can’t blame Hollywood for hoping that lightning will strike twice, but it’s obvious now that Nolan is like nobody else, and Dunkirk may turn out to be the pivotal film in trying to understand what he represents. I don’t think it’s his best or most audacious movie, but it was certainly the greatest risk, and he seems to have singlehandedly willed it into existence. Artistically, it’s a step forward for a director who sometimes seemed devoted to complexity for its own sake, telling a story of crystalline narrative and geographical clarity with a minimum of dialogue and exposition, with clever tricks with time that lead, for once, to a real emotional payoff. The technical achievement of staging a continuous action climax that runs for most of the movie’s runtime is impressive in itself, and Nolan, who has been gradually preparing for this moment for years, makes it look so straightforward that it’s easy to undervalue it. (Nolan’s great insight here seems to have been that by relying on the audience’s familiarity with the conventions of the war movie, he could lop off the first hour of the story and just tell the second half. Its nonlinear structure, in turn, seems to have been a pragmatic solution to the problem of how to intercut freely between three settings with different temporal and spatial demands, and Nolan strikes me as the one director both to whom it would have occurred and who would have actually been allowed to do it.) On a commercial level, it’s his most brazen attempt, even more than Inception, to see what he could do with the free pass that a director typically gets after a string of hits. And the fact that he succeeded, with a summer box office smash that seems likely to win multiple Oscars, only makes me all the more eager to see what he’ll do next.

It all amounts to the closest film in recent memory to what Omar Sharif once said of Lawrence of Arabia: “If you are the man with the money and somebody comes to you and says he wants to make a film that’s four hours long, with no stars, and no women, and no love story, and not much action either, and he wants to spend a huge amount of money to go film it in the desert—what would you say?” Dunkirk is half as long as Lawrence and consists almost entirely of action, and it isn’t on the same level, but the challenge that it presented to “the man with the money” must have been nearly as great. (Its lack of women, unfortunately, is equally glaring.) In fact, I can think of only one other director who has done anything comparable. I happened to see Dunkirk a few weeks after catching 2001: A Space Odyssey on the big screen, and as I watched the former movie last night, it occurred to me that Nolan has pulled off the most convincing Kubrick impression that any of us have ever seen. You don’t become the next Kubrick by imitating him, as Nolan did to some extent in Interstellar, but by figuring out new ways to tell stories using all the resources of the cinema, and somehow convincing a studio to fund the result. In both cases, the studio was Warner Bros., and I wonder if executives with long memories see Nolan as a transitional figure between Kubrick and the needs of the DC Extended Universe. It’s a difficult position for any director to occupy, and it may well prevent Nolan from developing along more interesting lines that his career might otherwise have taken. His artistic gambles, while considerable, are modest compared to even Barry Lyndon, and his position at the center of the industry can only discourage him from running the risk of being difficult or alienating. But I’m not complaining. Dunkirk is the story of a retreat, but it’s also the latest chapter in the life of a director who just can’t stop advancing.

Written by nevalalee

July 26, 2017 at 9:21 am

Gatsby’s fortune and the art of ambiguity



Note: I’m taking a short break this week, so I’ll be republishing a few posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on July 17, 2015. 

In November 1924, the editor Maxwell Perkins received the manuscript of a novel tentatively titled Trimalchio in West Egg. He loved the book—he called it “extraordinary” and “magnificent”—but he also had a perceptive set of notes for its author. Here are a few of them:

Among a set of characters marvelously palpable and vital—I would know Tom Buchanan if I met him on the street and would avoid him—Gatsby is somewhat vague. The reader’s eyes can never quite focus upon him, his outlines are dim. Now everything about Gatsby is more or less a mystery, i.e. more or less vague, and this may be somewhat of an artistic intention, but I think it is mistaken. Couldn’t he be physically described as distinctly as the others, and couldn’t you add one or two characteristics like the use of that phrase “old sport”—not verbal, but physical ones, perhaps…

The other point is also about Gatsby: his career must remain mysterious, of course…Now almost all readers numerically are going to feel puzzled by his having all this wealth and are going to feel entitled to an explanation. To give a distinct and definite one would be, of course, utterly absurd. It did occur to me, though, that you might here and there interpolate some phrases, and possibly incidents, little touches of various kinds, that would suggest that he was in some active way mysteriously engaged.

The novel, of course, ultimately appeared under the title The Great Gatsby, and before it was published, F. Scott Fitzgerald took many of the notes from Perkins to heart, adding more descriptive material on Gatsby himself—along with several repetitions of the phrase “old sport”—and the sources of his mysterious fortune. Like Tay Hohoff, whose work on To Kill a Mockingbird has received greater recognition in recent years, or even John W. Campbell, Perkins was the exemplar of the editor as shaper, providing valued insight and active intervention for many of the major writers of his generation: Fitzgerald, Hemingway, Wolfe. But my favorite part of this story lies in Fitzgerald’s response, which I think is one of the most extraordinary glimpses into craft we have from any novelist:

I myself didn’t know what Gatsby looked like or was engaged in and you felt it. If I’d known and kept it from you you’d have been too impressed with my knowledge to protest. This is a complicated idea but I’m sure you’ll understand. But I know now—and as a penalty for not having known first, in other words to make sure, I’m going to tell more.

Which is only to say that there’s a big difference between what an author deliberately withholds and what he doesn’t know himself. And an intelligent reader, like Perkins, will sense it.


And it has important implications for the way we create our characters. I’ve never been a fan of the school that advocates working out every detail of a character’s background, from her hobbies to her childhood pets: the questionnaires and worksheets that spring up around this impulse can all too easily turn into an excuse for procrastination. My own sense of character is closer to what D’Arcy Wentworth Thompson describes in On Growth and Form, in which an animal’s physical shape is determined largely by the outside pressures to which it is subjected. Plot emerges from character, yes, but there’s also a sense in which character emerges from plot: these men and women are distinguished primarily by the fact that they’re the only people in the world to whom these particular events could happen. When I combine this with my natural distrust of backstory, I’ll frequently find that there are important things about my characters I don’t know myself, even after I’ve lived with them for years. There can even be something a little false about keeping the past constantly present in a character’s mind, as we often see in “realistic” fiction: even if we’re all the sum of our childhood experiences, in practice, we reveal more about ourselves in how we react to the pattern of forces in our lives at any given moment, and the resulting actions have a logic that can be worked out independently, as long as the situation is honestly developed.

But that doesn’t apply to issues, like the sources of Gatsby’s fortune, in which the reader’s curiosity might be reasonably aroused. If you’re going to hint at something, you’d better have a good idea of the answer, even if you don’t plan on sharing it. This applies especially to stories that generate a deliberate ambiguity, as Chris Nolan says of the ending of Inception:

Interviewer: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”

Nolan: Oh no, I’ve got an answer.

Interviewer: You do?!

Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated.

Ambiguity, as I’ve said elsewhere, is best created out of a network of specifics with one crucial piece removed. That specificity requires a great deal of knowledge on the author’s part, perhaps more here than anywhere else. And as Fitzgerald notes, if you do it properly, they’ll be too impressed by your knowledge to protest—or they’ll protest in all the right ways.

The strange loop of Westworld



In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:

This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.

I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.
And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.

Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:

What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.

This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.

The test of tone



Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on April 22, 2014.

Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.

The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so your reaction is often how Roger Ebert felt upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.


As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.

At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the great pleasures of watching Hannibal lay in how it learned to acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history for it to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. Fargo remains the most fascinating drama on television in large part because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.

Written by nevalalee

April 26, 2016 at 9:00 am

The Coco Chanel rule



“Before you leave the house,” the fashion designer Coco Chanel is supposed to have said, “look in the mirror and remove one accessory.” As much as I like it, I’m sorry to say that this quote is most likely apocryphal: you see it attributed to Chanel everywhere, but without the benefit of an original source, which implies that it’s one of those pieces of collective wisdom that have attached themselves parasitically to a famous name. Still, it’s valuable advice. It’s usually interpreted, correctly enough, as a reminder that less is more, but I prefer to think of it as a statement about revision. The quote isn’t about reaching simplicity from the ground up, but about taking something and improving it by subtracting one element, like the writing rule that advises you to cut ten percent from every draft. And what I like the most about it is that its moment of truth arrives at the very last second, when you’re about to leave the house. That final glance in the mirror, when it’s almost too late to make additional changes, is often when the true strengths and weaknesses of your decisions become clear, if you’re smart enough to distinguish it from the jitters. (As Jeffrey Eugenides said to The Paris Review: “Usually I’m turning the book in at the last minute. I always say it’s like the Greek Olympics—’Hope the torch lights.'”)

But which accessory should you remove? In the indispensable book Behind the Seen, the editor Walter Murch gives us an important clue, using an analogy from filmmaking:

An interior might have four different sources of light in it: the light from the window, the light from the table lamp, the light from the flashlight that the character is holding, and some other remotely sourced lights. The danger is that, without hardly trying, you can create a luminous clutter out of all that. There’s a shadow over here, so you put another light on that shadow to make it disappear. Well, that new light casts a shadow in the other direction. Suddenly there are fifteen lights and you only want four.

As a cameraman what you paradoxically do is have the gaffer turn off the main light, because it is confusing your ability to really see what you’ve got. Once you do that, you selectively turn off some of the lights and see what’s left. And you discover that, “OK, those other three lights I really don’t need at all—kill ’em.” But it can also happen that you turn off the main light and suddenly, “Hey, this looks great! I don’t need that main light after all, just these secondary lights. What was I thinking?”

This principle, which Murch elsewhere calls “blinking the key,” implies that you should take away the most important piece, or the accessory that you thought you couldn’t live without.


This squares nicely with a number of principles that I’ve discussed here before. I once said that ambiguity is best created out of a network of specifics with one crucial piece removed, and when you follow the Chanel rule, on a deeper level, the missing accessory is still present, even after you’ve taken it off. The remaining accessories were presumably chosen with it in mind, and they preserve its outlines, resulting in a kind of charged negative space that binds the rest together. This applies to writing, too. “The Cask of Amontillado” practically amounts to a manual on how to wall up a man alive, but Poe omits the one crucial detail—the reason for Montresor’s murderous hatred—that most writers would have provided up front, and the result is all the more powerful. Shakespeare consistently leaves out key explanatory details from his source material, which renders the behavior of his characters more mysterious, but no less concrete. And the mumblecore filmmaker Andrew Bujalski made a similar point a few years ago to The New York Times Magazine: “Write out the scene the way you hear it in your head. Then read it and find the parts where the characters are saying exactly what you want/need them to say for the sake of narrative clarity (e.g., ‘I’ve secretly loved you all along, but I’ve been too afraid to tell you.’) Cut that part out. See what’s left. You’re probably close.”

This is a piece of advice that many artists could stand to take to heart, especially if they’ve been blessed with an abundance of invention. I like Interstellar, for instance, but I have a hunch that it would have been an even stronger film if Christopher Nolan had made a few cuts. If he had removed Anne Hathaway’s speech on the power of love, for instance, the same point would have come across in the action, but more subtly, assuming that the rest of the story justified its inclusion in the first place. (Of course, every film that Nolan has ever made strives valiantly to strike a balance between action and exposition, and in this case, it stumbled a little in the wrong direction. Interstellar is so openly indebted to 2001 that I wish it had taken a cue from that movie’s script, in which Kubrick and Clarke made the right strategic choice by minimizing the human element wherever possible.) What makes the Chanel rule so powerful is that when you glance in the mirror on your way out the door, what catches your eye first is likely to be the largest, flashiest, or most obvious component, which often adds the most by its subtraction. It’s the accessory that explains too much, or draws attention to itself, rather than complementing the whole, and by removing it, we’re consciously saying no to what the mind initially suggests. As Chanel is often quoted as saying: “Elegance is refusal.” And she was right—even if it was really Diana Vreeland who said it. 

“Open the bomb bay doors, please, Ken…”


Slim Pickens in Dr. Strangelove

After the legendary production designer Ken Adam died last week, I found myself browsing through the book Ken Adam: The Art of Production Design, a wonderfully detailed series of interviews that he conducted with the cultural historian Christopher Frayling. It’s full of great stories, but the one I found myself pondering the most is from the making of Dr. Strangelove. Stanley Kubrick had just cast Slim Pickens in the role of Major Kong, the pilot of the B-52 bomber that inadvertently ends up triggering the end of the world, and it led the director to a sudden brainstorm. Here’s how Adam tells it:

[The bomber set] didn’t have practical bomb doors—we didn’t need them in the script at that time—and the set was almost ready to shoot. And Stanley said, “We need practical bomb doors.” He wanted this Texan cowboy to ride the bomb like a bronco into the Russian missile site. I did some setups, sketches for the whole thing, and Stanley asked me when it would be ready. I said, “If I work three crews twenty-four hours a day, you still won’t have it for at least a week, and that’s too late.” So now I arrive at Shepperton and I’m having kittens because I knew it was a fantastic idea but physically, mechanically, we couldn’t get it done. So again it was Wally Veevers, our special effects man, who saved the day, saying he’d sleep on it and come up with an idea. He always did that, even though he was having heart problems and wasn’t well. Wally came back and said, “We’re going to take a ten-by-eight still of the bomb bay interior, cut out the bomb-door opening, and shoot the bomb coming down against blue backing.” And that’s the way they did it.

I love this story for a lot of reasons. The first is the rare opportunity it affords to follow Kubrick’s train of thought. He had cast Peter Sellers, who was already playing three other lead roles, as Major Kong, but the performance wasn’t working, and when Sellers injured his ankle, Kubrick used this as an excuse to bring in another actor. Slim Pickens brought his own aura of associations, leading Kubrick to the movie’s single most memorable image, which now seems all but inevitable. And he seemed confident that any practical difficulties could be overcome. As Adam says elsewhere:

[Kubrick] had this famous theory in those days that the director had the right to change his mind up until the moment the cameras started turning. But he changed his mind after the cameras were rolling! For me, it was enormously demanding, because until then I was basically a pretty organized person. But I wasn’t yet flexible enough to meet these sometimes impossible demands that he came up with. So I was going through an anxiety crisis. But at the same time I knew that every time he changed his mind, he came up with a brilliant idea. So I knew I had to meet his demands in some way, even if it seemed impossible from a practical point of view.

Which just serves as a reminder that for Kubrick, who is so often characterized as the most meticulous and obsessive of directors, an intense level of preparation existed primarily to enable those moments in which the plan could be thrown away—a point that even his admirers often overlook.

Design by Ken Adam for Dr. Strangelove

It’s also obvious that Kubrick couldn’t have done any of this if he hadn’t surrounded himself with brilliant collaborators, and his reliance on Adam testifies to his belief that he had found someone who could translate his ideas into reality. (He tried and failed to get Adam to work with him on 2001, and the two reunited for Barry Lyndon, for which Adam deservedly won an Oscar.) We don’t tend to think of Dr. Strangelove as a movie that solved enormous technical problems in the way that some of Kubrick’s other projects did, but like any film, it presented obstacles that most viewers will never notice. Creating the huge maps in the war room, for instance, required a thousand hundred-watt bulbs installed behind perspex, along with an improvised air-conditioning system to prevent the heat from blistering the transparencies. Like the bomb bay doors, it’s the sort of issue that would probably be solved today with digital effects, but the need to address it on the set contributes to the air of authenticity that the story demands. Dr. Strangelove wouldn’t be nearly as funny if its insanities weren’t set against a backdrop of painstaking realism. Major Kong is a loving caricature, but the bomber he flies isn’t: it was reconstructed down to the tiniest detail from photos in aeronautical magazines. And there’s a sense in which Kubrick, like Christopher Nolan, embraced big logistical challenges as a way to combat a tendency to live in his own head—which is the one thing that these two directors, who are so often mentioned together, really do have in common.

There’s also no question that this was hard on Ken Adam, who was driven to something close to a nervous breakdown during the filming of Barry Lyndon. He says:

I became so neurotic that I bore all of Stanley’s crazy decisions on my own shoulders. I was always apologizing to actors for something that had gone wrong. I felt responsible for every detail of Stanley’s film, for all his mistakes and neuroses. I was apologizing to actors for Stanley’s unreasonable demands.

In Frayling’s words, Adam was “the man in the middle, with a vengeance.” And if he ended up acting as the ambassador, self-appointed or otherwise, between Kubrick and the cast and crew, it isn’t hard to see why: the production designer, then as now, provides the primary interface between the vision on the page—or in the director’s head—and its realization as something that can be captured on film. It’s a role that deserves all the more respect at a time when physical sets are increasingly being replaced by digital environments that live somewhere on a hard drive at Weta Digital. A director is not a designer, and even Adam says that Kubrick “didn’t know how to design,” although he also states that the latter could have taken over any number of the other technical departments. (This wasn’t just flattery, either. Years later, Adam would call Kubrick, in secret, to help him light the enormous supertanker set for The Spy Who Loved Me.) A director has to be good at many things, but it all emerges from a willingness to confront the problems that arise where the perfect collides with the possible. And it’s to the lasting credit of both Kubrick and Adam that they never flinched from that single combat, toe to toe with reality.
