Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Christopher Nolan’

Back to the Future

with 2 comments

It’s hard to believe, but the paperback edition of Inventor of the Future: The Visionary Life of Buckminster Fuller is finally in stores today. As far as I’m concerned, this is the definitive version of this biography—it incorporates a number of small fixes and revisions—and it marks the culmination of a journey that started more than five years ago. It also feels like a milestone in an eventful writing year that has included a couple of pieces in the New York Times Book Review, an interview with the historian Richard Rhodes on Christopher Nolan’s Oppenheimer for The Atlantic online, and the usual bits and bobs elsewhere. Most of all, I’ve been busy with my upcoming biography of the Nobel Prize-winning physicist Luis W. Alvarez, which is scheduled to be published by W.W. Norton sometime in 2025. (Oppenheimer fans with sharp eyes and good memories will recall Alvarez as the youthful scientist in Lawrence’s lab, played by Alex Wolff, who shows Oppenheimer the news article announcing that the atom has been split. And believe me, he went on to do a hell of a lot more. I can’t wait to tell you about it.)

The Ballad of Jack and Rose

leave a comment »

Note: To commemorate the twentieth anniversary of the release of Titanic, I’m republishing a post that originally appeared, in a slightly different form, on April 16, 2012.

Is it possible to watch Titanic again with fresh eyes? Was it ever possible? When I caught Titanic 3D five years ago in Schaumburg, Illinois, it had been a decade and a half since I last saw it. (I’ve since watched it several more times, mostly while writing an homage in my novel Eternal Empire.) On its initial release, I liked it a lot, although I wouldn’t have called it the best movie of a year that gave us L.A. Confidential, and since then, I’d revisited bits and pieces of it on television, but had never gone back and watched the whole thing. All the same, my memories of it remained positive, if somewhat muted, so I was curious to see what my reaction would be, and what I found is that this is a really good, sometimes even great movie that looks even better with time. Once we set aside our preconceived notions, we’re left with a spectacularly well-made film that takes a lot of risks and seems motivated by a genuine, if vaguely adolescent, fascination with the past, an unlikely labor of love from a prodigiously talented director who willed himself into a genre that no one would have expected him to understand—the romantic epic—and emerged with both his own best work and a model of large-scale popular storytelling.

So why is this so hard for some of us to admit? The trouble, I think, is that the factors that worked so strongly in the film’s favor—its cinematography, special effects, and art direction; its beautifully choreographed action; its incredible scale—are radically diminished on television, which was the only way that it could be seen for a long time. On the small screen, we lose all sense of scope, leaving us mostly with the charisma of its two leads and conventional dramatic elements that James Cameron has never quite been able to master. Seeing Titanic in theaters again reminds us of why we responded to it in the first place. It’s also easier to appreciate that it was made at precisely the right moment in movie history, an accident of timing that allowed it to take full advantage of digital technology while still deriving much of its power from stunts, gigantic sets, and practical effects. If it were made again today, even by Cameron himself, it’s likely that much of this spectacle would be rendered on computers, which would be a major aesthetic loss. A huge amount of this film’s appeal lies in its physicality, in those real crowds and flooded stages, all of which can only be appreciated in the largest venue possible. Titanic is still big; it’s the screens that got small.

It’s also time to retire the notion that James Cameron is a bad screenwriter. It’s true that he doesn’t have any ear for human conversation, and that he tends to freeze up when it comes to showing two people simply talking—I’m morbidly curious to see what he’d do with a conventional drama, but I’m not sure that I want to see the result. Yet when it comes to structuring exciting stories on the largest possible scale, and setting up and delivering climactic set pieces and payoffs, he’s without equal. I’m a big fan of Christopher Nolan, for instance—I think he’s the most interesting mainstream filmmaker alive—but his films can seem fussy and needlessly intricate compared to the clean, powerful narrative lines that Cameron sets up here. (The decision, for instance, to show us a simulation of the Titanic’s sinking before the disaster itself is a masterstroke: it keeps us oriented throughout an hour of complex action that otherwise would be hard to understand.) Once the movie gets going, it never lets up. It moves toward its foregone conclusion with an efficiency, confidence, and clarity that Peter Jackson, or even Spielberg, would have reason to envy. And its production was one of the last great adventures—apart from The Lord of the Rings—that Hollywood ever allowed itself.

Despite James Cameron’s reputation as a terror on the set, I met him once, and he was very nice to me. In 1998, as an overachieving high school senior, I was a delegate at the American Academy of Achievement’s annual Banquet of the Golden Plate in Jackson Hole, Wyoming, an extraordinarily surreal event that I hope to discuss in more detail one of these days. The high point of the weekend was the banquet itself, a black-tie affair in a lavish indoor auditorium with the night’s honorees—a range of luminaries from science, politics, and the arts—seated in alphabetical order at the periphery of the room. One of them was James Cameron, who had swept the Oscars just a few months earlier. Halfway through the evening, leaving my own seat, I went up to his table to say hello, only to find him surrounded by a flock of teenage girls anxious to know what it was like to work with Leonardo DiCaprio. Seeing that there was no way of approaching him yet, I chatted for a bit with a man seated nearby, who hadn’t attracted much, if any, attention. We made small talk for a minute or two, but when I saw an opening with Cameron, I quickly said goodbye, leaving the other guest on his own. It was Dick Cheney.

Written by nevalalee

December 20, 2017 at 9:00 am

The act of killing

with one comment

Over the weekend, my wife and I watched the first two episodes of Mindhunter, the new Netflix series created by Joe Penhall and produced by David Fincher. We took in the installments over successive nights, but if you can, I’d recommend viewing them back to back—they really add up to a single pilot episode, arbitrarily divided in half, and they amount to a new movie from one of the five most interesting American directors under sixty. After the first episode, I was a little mixed, but I felt better after the next one, and although I still have some reservations, I expect that I’ll keep going. The writing tends to spell things out a little too clearly; it doesn’t always avoid clichés; and there are times when it feels like a first draft of a stronger show to come. Fincher, characteristically, sometimes seems less interested in the big picture than in small, finicky details, like the huge titles used to identify the locations onscreen, or the fussily perfect sound that the springs of the chair make whenever the bulky serial killer Ed Kemper sits down. (He also gives us two virtuoso sequences of the kind that he does better than just about anyone else—a scene in a noisy club with subtitled dialogue, which I’ve been waiting to see for years, and a long, very funny montage of two FBI agents on the road.) For long stretches, the show is about little else than the capabilities of the Red Xenomorph digital camera. Yet it also feels like a skeleton key for approaching the work of a man who, in fits and starts, has come to seem like the crucial director of our time, in large part because of his own ambivalence toward his fantasies of control.

Mindhunter is based on a book of the same name by John Douglas and Mark Olshaker about the development of behavioral science at the FBI. I read it over twenty years ago, at the peak of my morbid interest in serial killers, which is a phase that a lot of us pass through and that Fincher, revealingly, has never outgrown. Apart from Alien 3, which was a project that he barely understood and couldn’t control, his real debut was Seven, in which he benefited from a mechanical but undeniably compelling script by Andrew Kevin Walker and a central figure who has obsessed him ever since. John Doe, the killer, is still the greatest example of the villain who seems to be writing the screenplay for the movie in which he appears. (As David Thomson says of Donald Sutherland’s character in JFK: “[He’s] so omniscient he must be the scriptwriter.”) Doe’s notebooks, rendered in comically lavish detail, are like a nightmare version of the notes, plans, and storyboards that every film generates, and he alternately assumes the role of writer, art director, prop master, and producer. By the end, with the hero detectives reduced to acting out their assigned parts in his play, the distinction between Doe and the director—a technical perfectionist who would later become notorious for asking his performers for hundreds of takes—seems to disappear completely. It seems to have simultaneously exhilarated and troubled Fincher, much as it did Christopher Nolan as he teased out his affinities with the Joker in The Dark Knight, and both men have spent much of their subsequent careers working through the implications of that discovery.

Fincher hasn’t always been comfortable with his association with serial killers, to the extent that he made a point of having the characters in The Girl With the Dragon Tattoo refer to “a serial murderer,” as if we’d be fooled by the change in terminology. Yet the main line of his filmography is an attempt by a surprisingly smart, thoughtful director to come to terms with his own history of violence. There were glimpses of it as early as The Game, and Zodiac, his masterpiece, is a deconstruction of the formula that turned out to be so lucrative in Seven—the killer, wearing a mask, appears onscreen for just five minutes, and some of the scariest scenes don’t have anything to do with him at all, even as his actions reverberate outward to affect the lives of everyone they touch. Dragon Tattoo, which is a movie that looks a lot better with time, identifies its murder investigation with the work of the director and his editors, who seemed to be asking us to appreciate their ingenuity in turning the elements of the book, with its five acts and endless procession of interchangeable suspects, into a coherent film. And while Gone Girl wasn’t technically a serial killer movie, it gave us his most fully realized version to date of the antagonist as the movie’s secret writer, even if she let us down with the ending that she wrote for herself. In each case, Fincher was processing his identity as a director who was drawn to big technical challenges, from The Curious Case of Benjamin Button to The Social Network, without losing track of the human thread. And he seems to have sensed how easily he could become a kind of John Doe, a master technician who toys sadistically with the lives of others.

And although Mindhunter takes a little while to reveal its strengths, it looks like it will be worth watching as Fincher’s most extended attempt to literally interrogate his assumptions. (Fincher only directed the first two episodes, but this doesn’t detract from what might have attracted him to this particular project, or the role that he played in shaping it as a producer.) The show follows two FBI agents as they interview serial killers in search of insights into their madness, with the tone set by a chilling monologue by Ed Kemper:

People who hunt other people for a vocation—all we want to talk about is what it’s like. The shit that went down. The entire fucked-upness of it. It’s not easy butchering people. It’s hard work. Physically and mentally, I don’t think people realize. You need to vent…Look at the consequences. The stakes are very high.

Take out the references to murder, and it might be the director talking. Kemper later casually refers to his “oeuvre,” leading one of the two agents to crack: “Is he Stanley Kubrick?” It’s a little out of character, but also enormously revealing. Fincher, like Nolan, has spent his career in dialogue with Kubrick, who, fairly or not, still sets the standard for obsessive, meticulous, controlling directors. Kubrick never made a movie about a serial killer, but he took the equation between the creative urge and violence—particularly in A Clockwork Orange and The Shining—as far as anyone ever has. And Mindhunter will only become the great show that it has the potential to be if it asks why these directors, and their fans, are so drawn to these stories in the first place.

The man with the plan

with one comment

This month marks the twenty-fifth anniversary of the release of Reservoir Dogs, a film that I loved as much as just about every other budding cinephile who came of age in the nineties. Tom Shone has a nice writeup on its legacy in The New Yorker, and while I don’t agree with every point that he makes—he dismisses Kill Bill, which is a movie that means so much to me that I named my own daughter after Beatrix Kiddo—he has insights that can’t be ignored: “Quentin [Tarantino] became his worst reviews, rather in the manner of a boy who, falsely accused of something, decides that he might as well do the thing for which he has already been punished.” And there’s one paragraph that strikes me as wonderfully perceptive:

So many great filmmakers have made their debuts with heist films—from Woody Allen’s Take the Money and Run to Michael Mann’s Thief to Wes Anderson’s Bottle Rocket to Bryan Singer’s The Usual Suspects—that it’s tempting to see the genre almost as an allegory for the filmmaking process. The model it offers first-time filmmakers is thus as much economic as aesthetic—a reaffirmation of the tenet that Jean-Luc Godard attributed to D. W. Griffith: “All you need to make a movie is a girl and a gun.” A man assembles a gang for the implementation of a plan that is months in the rehearsal and whose execution rests on a cunning facsimile of midmorning reality going undetected. But the plan meets bumpy reality, requiring feats of improvisation and quick thinking if the gang is to make off with its loot—and the filmmaker is to avoid going to movie jail.

And while you could nitpick the details of this argument—Singer’s debut was actually Public Access, a movie that nobody, including me, has seen—it gets at something fundamental about the art of film, which lies at the intersection of an industrial process and a crime. I’ve spoken elsewhere about how Inception, my favorite movie of the last decade, maps the members of its mind heist neatly onto the crew of a motion picture: Cobb is the director, Saito the producer, Ariadne the set designer, Eames the actor, and Arthur is, I don’t know, the line producer, while Fischer, the mark, is a surrogate for the audience itself. (For what it’s worth, Christopher Nolan has stated that any such allegory was unconscious, although he seems to have embraced it after the fact.) Most of the directors whom Shone names are what we’d call auteur figures, and aside from Singer, all of them wear a writer’s hat, which can obscure the extent to which they depend on collaboration. Yet in their best work, it’s hard to imagine Singer without Christopher McQuarrie, Tarantino without editor Sally Menke, or Wes Anderson without Owen Wilson, not to mention the art directors, cinematographers, and other skilled craftsmen required to finish even the most idiosyncratic and personal movie. Just as every novel is secretly about the process of its own creation, every movie is inevitably about making movies, which is the life that its creators know most intimately. One of the most exhilarating things that a movie can do is give us a sense of the huddle between artists, which is central to the appeal of The Red Shoes, but also Mission: Impossible—Rogue Nation, in which Tom Cruise told McQuarrie that he wanted to make a film about what it was like for the two of them to make a film.

But there’s also an element of criminality, which might be even more crucial. I’m not the first person to point out that there’s something illicit in the act of watching images of other people’s lives projected onto a screen in a darkened theater—David Thomson, our greatest film critic, has built his career on variations on that one central insight. And it shouldn’t surprise us if the filmmaking process itself takes on aspects of something done in the shadows, in defiance of permits, labor regulations, and the orderly progression of traffic. (Werner Herzog famously advised aspiring directors to carry bolt cutters everywhere: “If you want to do a film, steal a camera, steal raw stock, sneak into a lab and do it!”) If your goal is to tell a story about putting together a team for a complicated project, it could be about the Ballet Lermontov or the defense of a Japanese village, and the result might be even greater. But it would lack the air of illegality on which the medium thrives, both in its dreamlife and in its practical reality. From the beginning, Tarantino seems to have sensed this. He’s become so famous for reviving the careers of neglected figures for the sake of the auras that they provide—John Travolta, Pam Grier, Robert Forster, Keith Carradine—that it’s practically become his trademark, and we often forget that he did it for the first time in Reservoir Dogs. Lawrence Tierney, the star of Dillinger and Born to Kill, had been such a menacing presence both onscreen and off that he was effectively banned from Hollywood after the forties, and he remained a terrifying presence even in old age. He terrorized the cast of Seinfeld during his guest appearance as Elaine’s father, and one of my favorite commentary tracks from The Simpsons consists of the staff reminiscing nervously about how much he scared them during the recording of “Marge Be Not Proud.”

Yet Tarantino still cast him as Joe Cabot, the man who sets up the heist, and Tierney rewarded him with a brilliant performance. Behind the scenes, it went more or less as you might expect, as Tarantino recalled much later:

Tierney was a complete lunatic by that time—he just needed to be sedated. We had decided to shoot his scenes first, so my first week of directing was talking with this fucking lunatic. He was personally challenging to every aspect of filmmaking. By the end of the week everybody on set hated Tierney—it wasn’t just me. And in the last twenty minutes of the first week we had a blowout and got into a fist fight. I fired him, and the whole crew burst into applause.

But the most revealing thing about the whole incident is that an untested director like Tarantino felt capable of taking on Tierney at all. You could argue that he already had an inkling of what he might become, but I’d prefer to think that he both needed and wanted someone like this to symbolize the last piece of the picture. Joe Cabot is the man with the plan, and he’s also the man with the money. (In the original script, Joe says into the phone: “Sid, stop, you’re embarrassing me. I don’t need to be told what I already know. When you have bad months, you do what every businessman in the world does, I don’t care if he’s Donald Trump or Irving the tailor. Ya ride it out.”) It’s tempting to associate him with the producer, but he’s more like a studio head, a position that has often drawn men whose bullying and manipulation is tolerated as long as they can make movies. When he wrote the screenplay, Tarantino had probably never met such a creature in person, but he must have had some sense of what was in store, and Reservoir Dogs was picked up for distribution by a man who fit the profile perfectly—and who never left Tarantino’s side ever again. His name was Harvey Weinstein.

Thinking on your feet

leave a comment »

The director Elia Kazan, whose credits included A Streetcar Named Desire and On the Waterfront, was proud of his legs. In his memoirs, which the editor Robert Gottlieb calls “the most gripping and revealing book I know about the theater and Hollywood,” Kazan writes of his childhood:

Everything I wanted most I would have to obtain secretly. I learned to conceal my feelings and to work to fulfill them surreptitiously…What I wanted most I’d have to take—quietly and quickly—from others. Not a logical step, but I made it at a leap. I learned to mask my desires, hide my truest feeling; I trained myself to live in deprivation, in silence, never complaining, never begging, in isolation, without expecting kindness or favors or even good luck…I worked waxing floors—forty cents an hour. I worked at a small truck farm across the road—fifty cents an hour. I caddied every afternoon I could at the Wykagyl Country Club, carrying the bags of middle-aged women in long woolen skirts—a dollar a round. I spent nothing. I didn’t take trolleys; I walked. Everywhere. I have strong leg muscles from that time.

The italics are mine, but Kazan emphasized his legs often enough on his own. In an address that he delivered at a retrospective at Wesleyan University in 1973, long after his career had peaked, he told the audience: “Ask me how with all that knowledge and all that wisdom, and all that training and all those capabilities, including the strong legs of a major league outfielder, how did I manage to mess up some of the films I’ve directed so badly?”

As he grew older, Kazan’s feelings about his legs became inseparable from his thoughts on his own physical decline. In an essay titled “The Pleasures of Directing,” which, like the address quoted above, can be found in the excellent book Kazan on Directing, Kazan observes sadly: “They’ve all said it. ‘Directing is a young man’s game.’ And time passing proves them right.” He continues:

What goes first? With an athlete, the legs go first. A director stands all day, even when he’s provided with chairs, jeeps, and limos. He walks over to an actor, stands alongside and talks to him; with a star he may kneel at the side of the chair where his treasure sits. The legs do get weary. Mine have. I didn’t think it would happen because I’ve taken care of my body, always exercised. But I suddenly found I don’t want to play singles. Doubles, okay. I stand at the net when my partner serves, and I don’t have to cover as much ground. But even at that…

I notice also that I want a shorter game—that is to say also, shorter workdays, which is the point. In conventional directing, the time of day when the director has to be most able, most prepared to push the actors hard and get what he needs, usually the close-ups of the so-called “master scene,” is in the afternoon. A director can’t afford to be tired in the late afternoon. That is also the time—after the thoughtful quiet of lunch—when he must correct what has not gone well in the morning. He better be prepared, he better be good.

As far as artistic advice goes, this is as close to the real thing as it gets. But it can only occur to an artist who can no longer take for granted the energy on which he has unthinkingly relied for most of his life.

Kazan isn’t the only player in the film industry to draw a connection between physical strength—or at least stamina—and the medium’s artistic demands. Guy Hamilton, who directed Goldfinger, once said: “To be a director, all you need is a hide like a rhinoceros—and strong legs, and the ability to think on your feet…Talent is something else.” None other than Christopher Nolan believes so much in the importance of standing that he’s institutionalized it on his film sets, as Mark Rylance recently told The Independent: “He does things like he doesn’t like having chairs on set for actors or bottles of water, he’s very particular…[It] keeps you on your toes, literally.” Walter Murch, meanwhile, noted that a film editor needed “a strong back and arms” to lug around reels of celluloid, which is less of a concern in the days of digital editing, but still worth bearing in mind. Murch famously likes to stand while editing, like a surgeon in the operating room:

Editing is sort of a strange combination of being a brain surgeon and a short-order cook. You’ll never see those guys sitting down on the job. The more you engage your entire body in the process of editing, the better and more balletic the flow of images will be. I might be sitting when I’m reviewing material, but when I’m choosing the point to cut out of a shot, I will always jump out of the chair. A gunfighter will always stand, because it’s the fastest, most accurate way to get to his gun. Imagine High Noon with Gary Cooper sitting in a chair. I feel the fastest, most accurate way to choose the critically important frame I will cut out of a shot is to be standing. I have kind of a gunfighter’s stance.

And as Murch suggests, this applies as much to solitary craftsmen as it does to the social and physical world of the director. Philip Roth, who worked at a lectern, claimed that he paced half a mile for every page that he wrote, while the mathematician Robert P. Langlands reflected: “[My] many hours of physical effort as a youth also meant that my body, never frail but also initially not particularly strong, has lasted much longer than a sedentary occupation might have otherwise permitted.” Standing and walking can be a proxy for mental and moral acuity, as Bertrand Russell implied so memorably:

Our mental makeup is suited to a life of very severe physical labor. I used, when I was younger, to take my holidays walking. I would cover twenty-five miles a day, and when the evening came I had no need of anything to keep me from boredom, since the delight of sitting amply sufficed. But modern life cannot be conducted on these physically strenuous principles. A great deal of work is sedentary, and most manual work exercises only a few specialized muscles. When crowds assemble in Trafalgar Square to cheer to the echo an announcement that the government has decided to have them killed, they would not do so if they had all walked twenty-five miles that day.

Such energy, as Kazan reminds us, isn’t limitless. I still think of myself as relatively young, but I don’t have the raw mental or physical resources that I did fifteen years ago, and I’ve had to come up with various tricks—what a pickup basketball player might call “old-man shit”—to maintain my old levels of productivity. I’ve written elsewhere that certain kinds of thinking are best done sitting down, but there’s also a case to be made for thinking on your feet. Standing is the original power pose, and perhaps the only one likely to have any real effects. And it’s in the late afternoons, both of a working day and of an entire life, that you need to stand and deliver.

The Battle of Dunkirk

leave a comment »

During my junior year in college, I saw Christopher Nolan’s Memento at the Brattle Theatre in Cambridge, Massachusetts, for no other reason except that I’d heard it was great. Since then, I’ve seen all of Nolan’s movies on their initial release, which is something I can’t say of any other director. At first, it was because I liked his work and his choices intrigued me, and it only occurred to me around the time of The Dark Knight that I was witnessing a career like no other. It’s tempting to compare Nolan to his predecessors, but when you look at his body of work from Memento to Dunkirk, it’s clear that he’s in a category of his own. He’s directed nine theatrical features in seventeen years, all mainstream critical and commercial successes, including some of the biggest movies in recent history. No other director alive comes close to that degree of consistency, at least not at the same level of productivity and scale. Quality and reliability alone aren’t everything, of course, and Nolan pales a bit compared to, say, Steven Spielberg, who over a comparable stretch of time went from The Sugarland Express to Hook, with Jaws, Close Encounters, E.T., and the Indiana Jones trilogy along the way, as well as 1941 and Always. By comparison, Nolan can seem studied, deliberate, and remote, and the pockets of unassimilated sentimentality in his work—which I used to assume were concessions to the audience, but now I’m not so sure—only point to how unified and effortless Spielberg is at his best. But the conditions for making movies have also changed over the last four decades, and Nolan has threaded the needle in ways that still amaze me, as I continue to watch his career unfold in real time.

Nolan sometimes reminds me of the immortal Byron the Bulb in Gravity’s Rainbow, of which Thomas Pynchon writes: “Statistically…every n-thousandth light bulb is gonna be perfect, all the delta-q’s piling up just right, so we shouldn’t be surprised that this one’s still around, burning brightly.” He wrote and directed one of the great independent debuts, leveraged it into a career making blockbusters, and slowly became a director from whom audiences expected extraordinary achievements while he was barely out of the first phase of his career. And he keeps doing it. For viewers of college age or younger, he must feel like an institution, while I can’t stop thinking of him as an outlier that has yet to regress to the mean. Nolan’s most significant impact, for better or worse, may lie in the sheer, seductive implausibility of the case study that he presents. Over the last decade or so, we’ve seen a succession of young directors, nearly all of them white males, who, after directing a microbudgeted indie movie, are handed the keys to a huge franchise. This has been taken as an instance of category selection, in which directors who look a certain way are given opportunities that wouldn’t be offered to filmmakers of other backgrounds, but deep down, I think it’s just an attempt to find the next Nolan. If I were an executive at Warner Bros. whose career had overlapped with his, I’d feel toward him what Goethe felt of Napoleon: “[It] produces in me an impression like that produced by the Revelation of St. John the Divine. We all feel there must be something more in it, but we do not know what.” Nolan is the most exciting success story to date of a business model that he defined and that, if it worked, would solve most of Hollywood’s problems, in which independent cinema serves as a farm team for directors who can consistently handle big legacy projects that yield great reviews and box office. And it’s happened exactly once.

You can’t blame Hollywood for hoping that lightning will strike twice, but it’s obvious now that Nolan is like nobody else, and Dunkirk may turn out to be the pivotal film in trying to understand what he represents. I don’t think it’s his best or most audacious movie, but it was certainly the greatest risk, and he seems to have singlehandedly willed it into existence. Artistically, it’s a step forward for a director who sometimes seemed devoted to complexity for its own sake, telling a story of crystalline narrative and geographical clarity with a minimum of dialogue and exposition, with clever tricks with time that lead, for once, to a real emotional payoff. The technical achievement of staging a continuous action climax that runs for most of the movie’s runtime is impressive in itself, and Nolan, who has been gradually preparing for this moment for years, makes it look so straightforward that it’s easy to undervalue it. (Nolan’s great insight here seems to have been that by relying on the audience’s familiarity with the conventions of the war movie, he could lop off the first hour of the story and just tell the second half. Its nonlinear structure, in turn, seems to have been a pragmatic solution to the problem of how to intercut freely between three settings with different temporal and spatial demands, and Nolan strikes me as the one director both to whom it would have occurred and who would have actually been allowed to do it.) On a commercial level, it’s his most brazen attempt, even more than Inception, to see what he could do with the free pass that a director typically gets after a string of hits. And the fact that he succeeded, with a summer box office smash that seems likely to win multiple Oscars, only makes me all the more eager to see what he’ll do next.

It all amounts to the closest film in recent memory to what Omar Sharif once said of Lawrence of Arabia: “If you are the man with the money and somebody comes to you and says he wants to make a film that’s four hours long, with no stars, and no women, and no love story, and not much action either, and he wants to spend a huge amount of money to go film it in the desert—what would you say?” Dunkirk is half as long as Lawrence and consists almost entirely of action, and it isn’t on the same level, but the challenge that it presented to “the man with the money” must have been nearly as great. (Its lack of women, unfortunately, is equally glaring.) In fact, I can think of only one other director who has done anything comparable. I happened to see Dunkirk a few weeks after catching 2001: A Space Odyssey on the big screen, and as I watched the former movie last night, it occurred to me that Nolan has pulled off the most convincing Kubrick impression that any of us have ever seen. You don’t become the next Kubrick by imitating him, as Nolan did to some extent in Interstellar, but by figuring out new ways to tell stories using all the resources of the cinema, and somehow convincing a studio to fund the result. In both cases, the studio was Warner Bros., and I wonder if executives with long memories see Nolan as a transitional figure between Kubrick and the needs of the DC Extended Universe. It’s a difficult position for any director to occupy, and it may well prevent Nolan from developing along more interesting lines that his career might otherwise have taken. His artistic gambles, while considerable, are modest compared to even Barry Lyndon, and his position at the center of the industry can only discourage him from running the risk of being difficult or alienating. But I’m not complaining. Dunkirk is the story of a retreat, but it’s also the latest chapter in the life of a director who just can’t stop advancing.

Written by nevalalee

July 26, 2017 at 9:21 am

Gatsby’s fortune and the art of ambiguity

leave a comment »

F. Scott Fitzgerald

Note: I’m taking a short break this week, so I’ll be republishing a few posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on July 17, 2015. 

In November 1924, the editor Maxwell Perkins received the manuscript of a novel tentatively titled Trimalchio in West Egg. He loved the book—he called it “extraordinary” and “magnificent”—but he also had a perceptive set of notes for its author. Here are a few of them:

Among a set of characters marvelously palpable and vital—I would know Tom Buchanan if I met him on the street and would avoid him—Gatsby is somewhat vague. The reader’s eyes can never quite focus upon him, his outlines are dim. Now everything about Gatsby is more or less a mystery, i.e. more or less vague, and this may be somewhat of an artistic intention, but I think it is mistaken. Couldn’t he be physically described as distinctly as the others, and couldn’t you add one or two characteristics like the use of that phrase “old sport”—not verbal, but physical ones, perhaps…

The other point is also about Gatsby: his career must remain mysterious, of course…Now almost all readers numerically are going to feel puzzled by his having all this wealth and are going to feel entitled to an explanation. To give a distinct and definite one would be, of course, utterly absurd. It did occur to me, though, that you might here and there interpolate some phrases, and possibly incidents, little touches of various kinds, that would suggest that he was in some active way mysteriously engaged.

The novel, of course, ultimately appeared under the title The Great Gatsby, and before it was published, F. Scott Fitzgerald took many of the notes from Perkins to heart, adding more descriptive material on Gatsby himself—along with several repetitions of the phrase “old sport”—and the sources of his mysterious fortune. Like Tay Hohoff, whose work on To Kill a Mockingbird has received greater recognition in recent years, or even John W. Campbell, Perkins was the exemplar of the editor as shaper, providing valued insight and active intervention for many of the major writers of his generation: Fitzgerald, Hemingway, Wolfe. But my favorite part of this story lies in Fitzgerald’s response, which I think is one of the most extraordinary glimpses into craft we have from any novelist:

I myself didn’t know what Gatsby looked like or was engaged in and you felt it. If I’d known and kept it from you you’d have been too impressed with my knowledge to protest. This is a complicated idea but I’m sure you’ll understand. But I know now—and as a penalty for not having known first, in other words to make sure, I’m going to tell more.

Which is only to say that there’s a big difference between what an author deliberately withholds and what he doesn’t know himself. And an intelligent reader, like Perkins, will sense it.

On Growth and Form

And it has important implications for the way we create our characters. I’ve never been a fan of the school that advocates working out every detail of a character’s background, from her hobbies to her childhood pets: the questionnaires and worksheets that spring up around this impulse can all too easily turn into an excuse for procrastination. My own sense of character is closer to what D’Arcy Wentworth Thompson describes in On Growth and Form, in which an animal’s physical shape is determined largely by the outside pressures to which it is subjected. Plot emerges from character, yes, but there’s also a sense in which character emerges from plot: these men and women are distinguished primarily by the fact that they’re the only people in the world to whom these particular events could happen. When I combine this with my natural distrust of backstory, I’ll frequently find that there are important things about my characters I don’t know myself, even after I’ve lived with them for years. There can even be something a little false about keeping the past constantly present in a character’s mind, as we often see in “realistic” fiction: even if we’re all the sum of our childhood experiences, in practice, we reveal more about ourselves in how we react to the pattern of forces in our lives at any given moment, and the resulting actions have a logic that can be worked out independently, as long as the situation is honestly developed.

But that doesn’t apply to issues, like the sources of Gatsby’s fortune, in which the reader’s curiosity might be reasonably aroused. If you’re going to hint at something, you’d better have a good idea of the answer, even if you don’t plan on sharing it. This applies especially to stories that generate a deliberate ambiguity, as Chris Nolan says of the ending of Inception:

Interviewer: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”

Nolan: Oh no, I’ve got an answer.

Interviewer: You do?!

Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated.

Ambiguity, as I’ve said elsewhere, is best created out of a network of specifics with one crucial piece removed. That specificity requires a great deal of knowledge on the author’s part, perhaps more here than anywhere else. And as Fitzgerald notes, if you do it properly, they’ll be too impressed by your knowledge to protest—or they’ll protest in all the right ways.

The strange loop of Westworld

leave a comment »

The maze in Westworld

In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:

This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.

I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.

Inception

And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.

Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:

What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.

This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.

The test of tone

with one comment

Brendan Gleeson and Colin Farrell in In Bruges

Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on April 22, 2014.

Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.

The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so your reaction is often how Roger Ebert felt upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.

Hugh Dancy on Hannibal

As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.

At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the great pleasures of watching Hannibal lay in how it learned to acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history for it to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. Fargo remains the most fascinating drama on television in large part because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.

Written by nevalalee

April 26, 2016 at 9:00 am

The Coco Chanel rule

with 4 comments

Coco Chanel

“Before you leave the house,” the fashion designer Coco Chanel is supposed to have said, “look in the mirror and remove one accessory.” As much as I like it, I’m sorry to say that this quote is most likely apocryphal: you see it attributed to Chanel everywhere, but without the benefit of an original source, which implies that it’s one of those pieces of collective wisdom that have attached themselves parasitically to a famous name. Still, it’s valuable advice. It’s usually interpreted, correctly enough, as a reminder that less is more, but I prefer to think of it as a statement about revision. The quote isn’t about reaching simplicity from the ground up, but about taking something and improving it by subtracting one element, like the writing rule that advises you to cut ten percent from every draft. And what I like the most about it is that its moment of truth arrives at the very last second, when you’re about to leave the house. That final glance in the mirror, when it’s almost too late to make additional changes, is often when the true strengths and weaknesses of your decisions become clear, if you’re smart enough to distinguish it from the jitters. (As Jeffrey Eugenides said to The Paris Review: “Usually I’m turning the book in at the last minute. I always say it’s like the Greek Olympics—’Hope the torch lights.'”)

But which accessory should you remove? In the indispensable book Behind the Seen, the editor Walter Murch gives us an important clue, using an analogy from filmmaking:

An interior might have four different sources of light in it: the light from the window, the light from the table lamp, the light from the flashlight that the character is holding, and some other remotely sourced lights. The danger is that, without hardly trying, you can create a luminous clutter out of all that. There’s a shadow over here, so you put another light on that shadow to make it disappear. Well, that new light casts a shadow in the other direction. Suddenly there are fifteen lights and you only want four.

As a cameraman what you paradoxically do is have the gaffer turn off the main light, because it is confusing your ability to really see what you’ve got. Once you do that, you selectively turn off some of the lights and see what’s left. And you discover that, “OK, those other three lights I really don’t need at all—kill ’em.” But it can also happen that you turn off the main light and suddenly, “Hey, this looks great! I don’t need that main light after all, just these secondary lights. What was I thinking?”

This principle, which Murch elsewhere calls “blinking the key,” implies that you should take away the most important piece, or the accessory that you thought you couldn’t live without.

Walter Murch

This squares nicely with a number of principles that I’ve discussed here before. I once said that ambiguity is best created out of a network of specifics with one crucial piece removed, and when you follow the Chanel rule, on a deeper level, the missing accessory is still present, even after you’ve taken it off. The remaining accessories were presumably chosen with it in mind, and they preserve its outlines, resulting in a kind of charged negative space that binds the rest together. This applies to writing, too. “The Cask of Amontillado” practically amounts to a manual on how to wall up a man alive, but Poe omits the one crucial detail—the reason for Montresor’s murderous hatred—that most writers would have provided up front, and the result is all the more powerful. Shakespeare consistently leaves out key explanatory details from his source material, which renders the behavior of his characters more mysterious, but no less concrete. And the mumblecore filmmaker Andrew Bujalski made a similar point a few years ago to The New York Times Magazine: “Write out the scene the way you hear it in your head. Then read it and find the parts where the characters are saying exactly what you want/need them to say for the sake of narrative clarity (e.g., ‘I’ve secretly loved you all along, but I’ve been too afraid to tell you.’) Cut that part out. See what’s left. You’re probably close.”

This is a piece of advice that many artists could stand to take to heart, especially if they’ve been blessed with an abundance of invention. I like Interstellar, for instance, but I have a hunch that it would have been an even stronger film if Christopher Nolan had made a few cuts. If he had removed Anne Hathaway’s speech on the power of love, for instance, the same point would have come across in the action, but more subtly, assuming that the rest of the story justified its inclusion in the first place. (Of course, every film that Nolan has ever made strives valiantly to strike a balance between action and exposition, and in this case, it stumbled a little in the wrong direction. Interstellar is so openly indebted to 2001 that I wish it had taken a cue from that movie’s script, in which Kubrick and Clarke made the right strategic choice by minimizing the human element wherever possible.) What makes the Chanel rule so powerful is that when you glance in the mirror on your way out the door, what catches your eye first is likely to be the largest, flashiest, or most obvious component, which often adds the most by its subtraction. It’s the accessory that explains too much, or draws attention to itself, rather than complementing the whole, and by removing it, we’re consciously saying no to what the mind initially suggests. As Chanel is often quoted as saying: “Elegance is refusal.” And she was right—even if it was really Diana Vreeland who said it. 

“Open the bomb bay doors, please, Ken…”

leave a comment »

Slim Pickens in Dr. Strangelove

After the legendary production designer Ken Adam died last week, I found myself browsing through the book Ken Adam: The Art of Production Design, a wonderfully detailed series of interviews that Adam conducted with the cultural historian Christopher Frayling. It’s full of great stories, but the one I found myself pondering the most is from the making of Dr. Strangelove. Stanley Kubrick had just cast Slim Pickens in the role of Major Kong, the pilot of the B-52 bomber that inadvertently ends up triggering the end of the world, and the casting led the director to a sudden brainstorm. Here’s how Adam tells it:

[The bomber set] didn’t have practical bomb doors—we didn’t need them in the script at that time—and the set was almost ready to shoot. And Stanley said, “We need practical bomb doors.” He wanted this Texan cowboy to ride the bomb like a bronco into the Russian missile site. I did some setups, sketches for the whole thing, and Stanley asked me when it would be ready. I said, “If I work three crews twenty-four hours a day, you still won’t have it for at least a week, and that’s too late.” So now I arrive at Shepperton and I’m having kittens because I knew it was a fantastic idea but physically, mechanically, we couldn’t get it done. So again it was Wally Veevers, our special effects man, who saved the day, saying he’d sleep on it and come up with an idea. He always did that, even though he was having heart problems and wasn’t well. Wally came back and said, “We’re going to take a ten-by-eight still of the bomb bay interior, cut out the bomb-door opening, and shoot the bomb coming down against blue backing.” And that’s the way they did it.

I love this story for a lot of reasons. The first is the rare opportunity it affords to follow Kubrick’s train of thought. He had cast Peter Sellers, who was already playing three other lead roles, as Major Kong, but the performance wasn’t working, and when Sellers injured his ankle, Kubrick used this as an excuse to bring in another actor. Slim Pickens brought his own aura of associations, leading Kubrick to the movie’s single most memorable image, which now seems all but inevitable. And he seemed confident that any practical difficulties could be overcome. As Adam says elsewhere:

[Kubrick] had this famous theory in those days that the director had the right to change his mind up until the moment the cameras started turning. But he changed his mind after the cameras were rolling! For me, it was enormously demanding, because until then I was basically a pretty organized person. But I wasn’t yet flexible enough to meet these sometimes impossible demands that he came up with. So I was going through an anxiety crisis. But at the same time I knew that every time he changed his mind, he came up with a brilliant idea. So I knew I had to meet his demands in some way, even if it seemed impossible from a practical point of view.

Which just serves as a reminder that for Kubrick, who is so often characterized as the most meticulous and obsessive of directors, an intense level of preparation existed primarily to enable those moments in which the plan could be thrown away—a point that even his admirers often overlook.

Design by Ken Adam for Dr. Strangelove

It’s also obvious that Kubrick couldn’t have done any of this if he hadn’t surrounded himself with brilliant collaborators, and his reliance on Adam testifies to his belief that he had found someone who could translate his ideas into reality. (He tried and failed to get Adam to work with him on 2001, and the two reunited for Barry Lyndon, for which Adam deservedly won an Oscar.) We don’t tend to think of Dr. Strangelove as a movie that solved enormous technical problems in the way that some of Kubrick’s other projects did, but like any film, it presented obstacles that most viewers will never notice. Creating the huge maps in the war room, for instance, required a thousand hundred-watt bulbs installed behind perspex, along with an improvised air-conditioning system to prevent the heat from blistering the transparencies. Like the bomb bay doors, it’s the sort of issue that would probably be solved today with digital effects, but the need to address it on the set contributes to the air of authenticity that the story demands. Dr. Strangelove wouldn’t be nearly as funny if its insanities weren’t set against a backdrop of painstaking realism. Major Kong is a loving caricature, but the bomber he flies isn’t: it was reconstructed down to the tiniest detail from photos in aeronautical magazines. And there’s a sense in which Kubrick, like Christopher Nolan, embraced big logistical challenges as a way to combat a tendency to live in his own head—which is the one thing that these two directors, who are so often mentioned together, really do have in common.

There’s also no question that this was hard on Ken Adam, who was driven to something close to a nervous breakdown during the filming of Barry Lyndon. He says:

I became so neurotic that I bore all of Stanley’s crazy decisions on my own shoulders. I was always apologizing to actors for something that had gone wrong. I felt responsible for every detail of Stanley’s film, for all his mistakes and neuroses. I was apologizing to actors for Stanley’s unreasonable demands.

In Frayling’s words, Adam was “the man in the middle, with a vengeance.” And if he ended up acting as the ambassador, self-appointed or otherwise, between Kubrick and the cast and crew, it isn’t hard to see why: the production designer, then as now, provides the primary interface between the vision on the page—or in the director’s head—and its realization as something that can be captured on film. It’s a role that deserves all the more respect at a time when physical sets are increasingly being replaced by digital environments that live somewhere on a hard drive at Weta Digital. A director is not a designer, and even Adam says that Kubrick “didn’t know how to design,” although he also states that the latter could have taken over any number of the other technical departments. (This wasn’t just flattery, either. Years later, Adam would call Kubrick, in secret, to help him light the enormous supertanker set for The Spy Who Loved Me.) A director has to be good at many things, but it all emerges from a willingness to confront the problems that arise where the perfect collides with the possible. And it’s to the lasting credit of both Kubrick and Adam that they never flinched from that single combat, toe to toe with reality.

“He had played his part admirably…”

leave a comment »

"Laszlo, the bosun of the megayacht..."

Note: This post is the forty-first installment in my author’s commentary for Eternal Empire, covering Chapter 40. You can read the previous installments here.

A few weeks ago, I briefly discussed the notorious scene in The Dark Knight Rises in which Bruce Wayne reappears—without any explanation whatsoever—in Gotham City. Bane’s henchmen, you might recall, have blown up all the bridges and sealed off the area to the military and law enforcement, and the entire plot hinges on the city’s absolute isolation. Bruce, in turn, has just escaped from a foreign prison, and although its location is left deliberately unspecified, it sure seems like it was in a different hemisphere. Yet what must have been a journey of thousands of miles and a daring incursion is handled in the space of a single cut: Bruce simply shows up, and there isn’t even a line of dialogue acknowledging how he got there. Not surprisingly, this hiatus has inspired a lot of discussion online, with most explanations boiling down to “He’s Batman.” If asked, Christopher Nolan might reply that the specifics don’t really matter, and that the viewer’s attention is properly focused elsewhere, a point that the writer John Gardner once made with reference to Hamlet:

We naturally ask how it is that, when shipped off to what is meant to be his death, the usually indecisive prince manages to hoist his enemies with their own petard—an event that takes place off stage and, at least in the surviving text, gets no real explanation. If pressed, Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to…

Gardner concludes: “The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.” And while this might seem to apply equally well to The Dark Knight Rises, it doesn’t really hold water. The absence of an explanation did yank many of us out of the movie, however briefly, and it took us a minute to settle back in. Any explanation at all would have been better than this, and it could have been conveyed in less than a sentence. It isn’t an issue of plausibility, but of narrative flow. You could say that Bruce’s return to the city ought to be omitted, in the same way a director like Kurosawa mercilessly cuts all transitional moments: when you just need to get a character from Point A to Point B, it’s best to trim the journey as much as you can. In this instance, however, Nolan erred too much on one side, at least in the eyes of many viewers. And it’s a reminder that the rules of storytelling are all about context. You’ve got to judge each problem on its own terms and figure out the solution that makes the most sense in each case.

"He had played his part admirably..."

What’s really fascinating is how frequently Nolan himself seems to struggle with this issue. In terms of sheer technical proficiency, I’d rank him near the top of the list of all working directors, but if he has one flaw as a filmmaker, aside from his lack of humor, it’s his persistent difficulty in finding the right balance between action and exposition. Much of Inception, which is one of my ten favorite movies of all time, consists of the characters breathlessly explaining the plot to one another, and it more or less works. But he also spends much of Interstellar trying with mixed success to figure out how much to tell us about the science involved, leading to scenes like the one in which Dr. Romilly explains the wormhole to Cooper seemingly moments before they enter it. And Nolan is oddly prone to neglecting obligatory beats that the audience needs to assemble the story in their heads, as when Batman appears to abandon a room of innocent party guests to the Joker in The Dark Knight. You could say that such lapses simply reflect the complexity of the stories that Nolan wants to tell, and you might be right. But David Fincher, who is Nolan’s only peer among active directors, tells stories of comparable or greater complexity—indeed, they’re often about their own complexity—and we’re rarely lost or confused. And if I’m hard on Nolan about this, it’s only a reflection of how difficult such issues can be, when even the best mainstream director of his generation has trouble working out how much information the audience needs.

It all boils down to Thomas Pynchon’s arch aside in Gravity’s Rainbow: “You will want cause and effect. All right.” And knowing how much cause will yield the effect you need is a problem that every storyteller has to confront on a regular basis. Chapter 40 of Eternal Empire provides a good example. For the last hundred pages, the novel has been building toward the moment when Ilya sneaks onto the heavily guarded yacht at Yalta. There’s no question that he’s going to do it; otherwise, everything leading up to it would seem like a ridiculous tease. The mechanics of how he gets aboard don’t really matter, but I also couldn’t avoid the issue, or else readers would rightly object. All I needed was a solution that was reasonably plausible and that could be covered in a few pages. As it happens, the previous scene ends with this exchange between Maddy and Ilya: “But you can’t just expect to walk on board.” “That’s exactly what I intend to do.” When I typed those lines, I didn’t know what Ilya had in mind, but I knew at once that they pointed at the kind of simplicity that the story needed, at least at this point in the novel. (If it came later in the plot, as part of the climax, it might have been more elaborate.) So I came up with a short sequence in which Ilya impersonates a dockwalker looking for work on the yacht, cleverly ingratiates himself with the bosun, and slips below when Maddy provides a convenient distraction. It’s a cute scene—maybe a little too cute, in fact, for this particular novel. But it works exactly as well as it should. Ilya is on board. We get just enough cause and effect. And now we can move on to the really good stuff to come…

“And what does that name have to do with this?”

with 2 comments

"The word on the side of your yacht..."

Note: This post is the thirtieth installment in my author’s commentary for Eternal Empire, covering Chapter 29. You can read the previous installments here.

Earlier this week, in response to a devastating article in the New York Times on the allegedly crushing work environment in Amazon’s corporate offices, Jeff Bezos sent an email to employees that included the following statement:

[The article] claims that our intentional approach is to create a soulless, dystopian workplace where no fun is had and no laughter is heard. Again, I don’t recognize this Amazon and I very much hope you don’t, either…I strongly believe that anyone working in a company that really is like the one described in the [Times] would be crazy to stay. I know I would leave such a company.

Predictably, the email resulted in numerous headlines along the lines of “Jeff Bezos to Employees: You Don’t Work in a Dystopian Hellscape, Do You?” Bezos, a very smart guy, should have seen it coming. As Richard Nixon learned a long time ago, whenever you tell people that you aren’t a crook, you’re really raising the possibility that you might be. If you’re concerned about the names that your critics might call you, the last thing you want to do is put words in their mouths—it’s why public relations experts advise their clients to avoid negative language, even in the form of a denial—and saying that Amazon isn’t a soulless, dystopian workplace is a little like asking us not to think of an elephant.

Writers have recognized the negative power of certain loaded terms for a long time, and many works of art go out of their way to avoid such words, even if they’re central to the story. One of my favorite examples is the film version of The Girl With the Dragon Tattoo. Coming off Seven and Zodiac, David Fincher didn’t want to be pigeonholed as a director of serial killer movies, so the dialogue exclusively uses the term “serial murderer,” although it’s doubtful how effective this was. Along the same lines, Christopher Nolan’s superhero movies are notably averse to calling their characters by their most famous names: The Dark Knight Rises never uses the name “Catwoman,” while Man of Steel, which Nolan produced, avoids “Superman,” perhaps following the example of Frank Miller’s The Dark Knight Returns, which indulges in similar circumlocutions. Robert Towne’s script for Greystoke never calls its central character “Tarzan,” and The Walking Dead uses just about every imaginable term for its creatures aside from “zombie,” for reasons that creator Robert Kirkman explains:

One of the things about this world is that…they’re not familiar with zombies, per se. This isn’t a world [in which] the Romero movies exist, for instance, because we don’t want to portray it that way…They’ve never seen this in pop culture. This is a completely new thing for them.

"And what does that name have to do with this?"

Kirkman’s reluctance to call anything a zombie, which has inspired an entire page on TV Tropes dedicated to similar examples, is particularly revealing. A zombie movie can’t use that word because an invasion of the undead needs to feel like something unprecedented, and falling back on a term we know conjures up all kinds of pop cultural connotations that an original take might prefer to avoid. In many cases, avoiding particular words subtly encourages us to treat the story on its own terms. In The Godfather, the term “Mafia” is never uttered—an aversion, incidentally, not shared by the original novel, the working title of which was actually Mafia. This quietly allows us to judge the Corleones according to the rules of their own closed world, and it circumvents any real reflection about what the family business actually involves. (According to one famous story, the mobster Joseph Colombo paid a visit to producer Al Ruddy, demanding that the word be struck from the script as a condition for allowing the movie to continue. Ruddy, who knew that the screenplay only used the word once, promptly agreed.) The Godfather Part II is largely devoted to blowing up the first movie’s assumptions, and when the word “Mafia” is uttered at a senate hearing, it feels like the real world intruding on a comfortable fantasy. And the moment wouldn’t be as effective if the first installment hadn’t been as diligent about avoiding the term, allowing it to build a new myth in its place.

While writing Eternal Empire, I found myself confronting a similar problem. In this case, the offending word was “Shambhala.” As I’ve noted before, I decided early on that the third novel in the series would center on the Shambhala myth, a choice I made as soon as I stumbled across an excerpt from Rachel Polonsky’s Molotov’s Magic Lantern, in which she states that Vladimir Putin had taken a particular interest in the legend. A little research, notably in Andrei Znamenski’s Red Shambhala, confirmed that the periodic attempts by Russia to verify the existence of that mythical kingdom, carried out in an atmosphere of espionage and spycraft in Central Asia, were a rich vein of material. The trouble was that the word “Shambhala” itself was so loaded with New Age connotations that I’d have trouble digging my way out from under it: a quick search online reveals that it’s the name of a string of meditation centers, a music festival, and a spa with its own line of massage oils, none of which is exactly in keeping with the tone that I was trying to evoke. My solution, predictably, was to structure the whole plot around the myth of Shambhala while mentioning it as little as possible: the name appears perhaps thirty times across four hundred pages. (The mythological history of Shambhala is treated barely at all, and most of the references occur in discussions of the real attempts by Russian intelligence to discover it.) The bulk of those references appear here, in Chapter 29, and I cut them all down as much as possible, focusing on the bare minimum I needed for Maddy to pique Tarkovsky’s interest. I probably could have cut them even further. But as it stands, it’s more or less enough to get the story to where it needs to be. And it doesn’t need to be any longer than it is…

Gatsby’s fortune and the art of ambiguity

with 3 comments

F. Scott Fitzgerald

In November 1924, the editor Maxwell Perkins received the manuscript of a novel tentatively titled Trimalchio in West Egg. He loved the book—he called it “extraordinary” and “magnificent”—but he also had a perceptive set of notes for its author. Here are a few of them:

Among a set of characters marvelously palpable and vital—I would know Tom Buchanan if I met him on the street and would avoid him—Gatsby is somewhat vague. The reader’s eyes can never quite focus upon him, his outlines are dim. Now everything about Gatsby is more or less a mystery, i.e. more or less vague, and this may be somewhat of an artistic intention, but I think it is mistaken. Couldn’t he be physically described as distinctly as the others, and couldn’t you add one or two characteristics like the use of that phrase “old sport”—not verbal, but physical ones, perhaps…

The other point is also about Gatsby: his career must remain mysterious, of course…Now almost all readers numerically are going to feel puzzled by his having all this wealth and are going to feel entitled to an explanation. To give a distinct and definite one would be, of course, utterly absurd. It did occur to me, though, that you might here and there interpolate some phrases, and possibly incidents, little touches of various kinds, that would suggest that he was in some active way mysteriously engaged.

The novel, of course, ultimately appeared under the title The Great Gatsby, and before it was published, F. Scott Fitzgerald took many of the notes from Perkins to heart, adding more descriptive material on Gatsby himself—along with several repetitions of the phrase “old sport”—and the sources of his mysterious fortune. Like Tay Hohoff, whose work on To Kill a Mockingbird has recently come back into the spotlight, Perkins was the exemplar of the editor as shaper, providing valued insight and active intervention for many of the major writers of his generation: Fitzgerald, Hemingway, Wolfe. But my favorite part of this story lies in Fitzgerald’s response, which I think is one of the most extraordinary glimpses into craft we have from any novelist:

I myself didn’t know what Gatsby looked like or was engaged in and you felt it. If I’d known and kept it from you you’d have been too impressed with my knowledge to protest. This is a complicated idea but I’m sure you’ll understand. But I know now—and as a penalty for not having known first, in other words to make sure, I’m going to tell more.

Which is only to say that there’s a big difference between what an author deliberately withholds and what he doesn’t know himself. And an intelligent reader, like Perkins, will sense it.

On Growth and Form

And it has important implications for the way we create our characters. I’ve never been a fan of the school that advocates working out every detail of a character’s background, from her hobbies to her childhood pets: the questionnaires and worksheets that spring up around this impulse always seem like an excuse for procrastination. My own sense of character is closer to what D’Arcy Wentworth Thompson describes in On Growth and Form, in which an animal’s physical shape is determined largely by the outside pressures to which it is subjected. Plot emerges from character, yes, but there’s also a sense in which character emerges from plot: these men and women are distinguished primarily by the fact that they’re the only people in the world to whom these particular events could happen. When I combine this with my natural distrust of backstory, even if I’ve been retreating from it a bit, I’ll often find that there are important things about my characters I don’t know myself, even after I’ve lived with them for years. There can even be something a little false about keeping the past constantly present in a character’s mind, as we see in so much “realistic” fiction: even if we’re all the sum of our childhood experiences, in practice, we reveal more about ourselves in how we react to the pattern of forces in our lives at the moment, and our actions have a logic that can be worked out independently, as long as the situation is honestly developed.

But that doesn’t apply to questions, like the sources of Gatsby’s fortune, about which the reader’s curiosity might reasonably be aroused. If you’re going to hint at something, you’d better have a good idea of the answer, even if you don’t plan on sharing it. This applies especially to stories that generate a deliberate ambiguity, as Christopher Nolan says of the ending of Inception:

Interviewer: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”

Nolan: Oh no, I’ve got an answer.

Interviewer: You do?!

Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated.

Ambiguity, as I’ve said elsewhere, is best created out of a network of specifics with one crucial piece removed. That specificity requires a great deal of knowledge on the author’s part, perhaps more here than anywhere else. And as Fitzgerald notes, if you do it properly, readers will be too impressed by your knowledge to protest—or they’ll protest in all the right ways.

My ten great movies #10: Inception

with 3 comments

Inception

Note: Four years ago, I published a series of posts here about my ten favorite movies. Since then, the list has evolved, as all such rankings do, with a few new titles and a reshuffling of the survivors, so it seems like as good a time as any to revisit it now.

Five years after its release, when we think of Inception, what we’re likely to remember first—aside from its considerable merits as entertainment—is its apparent complexity. With five or more levels of reality and a set of rules being explained to us, as well as to the characters, in parallel with breathless action, it’s no wonder that its one big laugh comes at Ariadne’s bewildered question: “Whose subconscious are we going into?” It’s a line that gives us permission to be lost. Yet it’s all far less confusing than it might have been, thanks largely to the work of editor Lee Smith, whose lack of an Oscar nomination, in retrospect, seems like an even greater scandal than Nolan’s snub as Best Director. This is one of the most comprehensively organized movies ever made. Yet a lot of credit is also due to Nolan’s script, and in particular to the shrewd choices it makes about where to walk back its own complications. As I’ve noted before, once the premise has been established, the action unfolds more or less as we’ve been told it will: there isn’t the third-act twist or betrayal that similar heist movies, or even Memento, have taught us to expect. Another nudge would cause it all to collapse.

It’s also in part for the sake of reducing clutter that the dream worlds themselves tend to be starkly realistic, while remaining beautiful and striking. A director like Terry Gilliam might have turned each level into a riot of production design, and although the movie’s relative lack of surrealism has been taken as a flaw, it’s really more of a strategy for keeping the clean lines of the story distinct. The same applies to the characters, who, with the exception of Cobb, are defined mostly by their roles in the action. Yet they’re curiously compelling, perhaps because we respond so instinctively to stories of heists and elaborate teamwork. I admire Interstellar, but I can’t say I need to spend another three hours in the company of its characters, while Inception leaves me wanting more. This is also because its premise is so rich: it hints at countless possible stories, but turns itself into a closed circle that denies any prospect of a sequel. (It’s worth noting, too, how ingenious the device of the totem really is, with the massive superstructure of one of the largest movies ever made coming to rest on the axis of a single trembling top.) And it’s that unresolved tension, between a universe of possibilities and a remorseless cut to black, that gives us the material for so many dreams.

Tomorrow: The greatest story in movies. 

Written by nevalalee

May 11, 2015 at 8:27 am

The poster problem

leave a comment »

Avengers: Age of Ultron

Three years ago, while reviewing The Avengers soon after its opening weekend, I made the following remarks, which seem to have held up fairly well:

This is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many…so that a lot of the film, probably too much, is spent slotting all the components into place.

If the early reactions to Age of Ultron are any indication, I could copy and paste this text and make it the centerpiece of a review of any Avengers movie, past or future. This isn’t to say that the latest installment—which I haven’t seen—might not be fine in its way. But even the franchise’s fans, of which I’m not really one, seem to admit that much of it consists of Whedon dealing with all those moving parts, and the extent of your enjoyment depends largely on how well you feel he pulls it off.

Whedon himself has indicated that he has less control over the process than he’d like. In a recent interview with Mental Floss, he says:

But it’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant. And now I find myself with a huge crew of people and, although I’m not as bloodthirsty as some people like to pretend, I think it’s disingenuous to say we’re going to fight this great battle, but there’s not going to be any loss. So my feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.

Which, when you think about it, is a startling statement to hear from one of Hollywood’s most powerful directors. But it accurately describes the situation. Any Avengers movie will always feel less like a story in itself than like a kind of anomalous weather pattern formed at the meeting point of several huge fronts: the plot, such as it is, emerges in the transition zone, and it’s dwarfed by the masses of air behind it. Marvel has made a specialty of exceeding audience expectations just ever so slightly, and given the gigantic marketing pressures involved, it’s a marvel that it works as well as it does.

Inception

It’s fair to ask, in fact, whether any movie with that poster—with no fewer than eight names above the title, most belonging to current or potential franchise bearers—could ever be more than an exercise in crowd control. But there’s a telling counterexample, and it looks, as I’ve said elsewhere, increasingly impressive with time: Christopher Nolan’s Inception. As the years pass, Inception remains a model movie in many respects, but particularly when it comes to the problem of managing narrative complexity. Nolan picks his battles in fascinating ways: he’s telling a nested story with five or more levels of reality, and like Thomas Pynchon, he selectively simplifies the material wherever he can. Once the logic of the plot has been explained, for instance, it unfolds more or less as we expect, without the twist or third-act betrayal that we’ve been trained to anticipate in most heist movies. The characters, with the exception of Cobb, are defined largely by their surfaces, with a specified role and a few identifying traits. Yet they don’t come off as thin or underdeveloped, and although the poster for Inception is even more packed than that for Age of Ultron, with nine names above the title, we don’t feel that the movie is scrambling to find room for everyone.

And a glance at the cast lists of these movies goes a long way toward explaining why. The Avengers has about fifty speaking parts; Age of Ultron has sixty; and Inception, incredibly, has only fifteen or so. Inception is, in fact, a remarkably underpopulated movie: aside from its leading actors, only a handful of other faces ever appear. Yet we don’t particularly notice this while watching. In all likelihood, there’s a threshold number of characters necessary for a movie to seem fully peopled—and to provide for enough interesting pairings—and any further increase doesn’t change our perception of the whole. If that’s the case, then it’s another shrewd simplification by Nolan, who gives us exactly the number of characters we need and no more. The Avengers movies operate on a different scale, of course: a movie full of superheroes needs some ordinary people for contrast, and there’s a greater need for extras when the stage is as big as the universe. (On paper, anyway. In practice, the stakes in a movie like this are always going to remain something of an abstraction, since we have eight more installments waiting in the wings.) But if Whedon had been more ruthless at paring down his cast at the margins, we might have ended up with a series of films that seemed, paradoxically, larger: each hero could have expanded to fill the space he or she deserved, rather than occupying one corner of a masterpiece of Photoshop.

Written by nevalalee

April 29, 2015 at 8:44 am

Left brain, right brain, samurai brain

leave a comment »

Seven Samurai

The idea that the brain can be neatly divided into its left and right hemispheres, one rational, the other intuitive, has been largely debunked, but that doesn’t make it any less useful as a metaphor. You could play an instructive game, for instance, by placing movie directors on a spectrum defined by, say, Kubrick and Altman as the quintessence of left-brained filmmaking and its right-brained opposite, and although such distinctions may be artificial, they can generate their own kind of insight. Christopher Nolan, for one, strikes me as a fundamentally left-brained director who makes a point of consciously willing himself into emotion. (Citing some of the cornier elements of Interstellar, the writer Ta-Nehisi Coates theorizes that they were imposed by the studio, but I think it’s more likely that they reflect Nolan’s own efforts, not always successful, to nudge the story into recognizably human places. He pulled it off beautifully in Inception, but it took him ten years to figure out how.) And just as Isaiah Berlin saw Tolstoy as a fox who wanted to be a hedgehog, many of the recent films of Wong Kar-Wai feel like the work of a right-brained director trying to convince himself that the left hemisphere is where he belongs.

Of all my favorite directors, the one who most consistently hits the perfect balance between the two is Akira Kurosawa. I got to thinking about this while reading the editor and teacher Richard D. Pepperman’s appealing new book Everything I Know About Filmmaking I Learned Watching Seven Samurai, which often reads like the ultimate tribute to Kurosawa’s left brain. It’s essentially a shot for shot commentary, cued up to the definitive Criterion Collection release, that takes us in real time through the countless meaningful decisions made by Kurosawa in the editing room: cuts, dissolves, wipes, the interaction between foreground and background, the use of music and sound, and the management of real and filmic space, all in service of story. It’s hard to imagine a better movie for a study like this, and with its generous selection of stills, the book is a delight to browse through—it reminds me a little of Richard J. Anobile’s old photonovels, which in the days before home video provided the most convenient way of revisiting Casablanca or The Wrath of Khan. I’ve spoken before of the film editor as a kind of Apollonian figure, balancing out the Dionysian personality of the director on the set, and this rarely feels so clear as it does here, even, or especially, when the two halves are united in a single man.

Seven Samurai

As for Kurosawa’s right brain, the most eloquent description I’ve found appears in Donald Richie’s The Films of Akira Kurosawa, which is still the best book of its kind ever written. In his own discussion of Seven Samurai, Richie speaks of “the irrational rightness of an apparently gratuitous image in its proper place,” and continues:

Part of the beauty of such scenes…is just that they are “thrown away” as it were, that they have no place, that they do not ostensibly contribute, that they even constitute what has been called bad filmmaking. It is not the beauty of these unexpected images, however, that captivates…but their mystery. They must remain unexplained. It has been said that after a film is over all that remains are a few scattered images, and if they remain then the film was memorable…Further, if one remembers carefully one finds that it is only the uneconomical, mysterious images which remain…

Kurosawa’s films are so rigorous and, at the same time, so closely reasoned, that little scenes such as this appeal with the direct simplicity of water in the desert…[and] in no other single film are there as many as in Seven Samurai.

What one remembers best from this superbly economical film then are those scenes which seem most uneconomical—that is, those which apparently add nothing to it.

Richie goes on to list several examples: the old crone tottering forward to avenge the death of her son, the burning water wheel, and, most beautifully, the long fade to black before the final sequence of the villagers in the rice fields. My own favorite moment, though, occurs in the early scene when Kambei, the master samurai, rescues a little boy from a thief. In one of the greatest character introductions in movie history, Kambei shaves his head to disguise himself as a priest, asking only for two rice balls, which he’ll use to lure the thief out of the barn where the boy has been taken hostage. This information is conveyed in a short conversation between the farmers and the townspeople, who exit the frame—and after the briefest of pauses, a woman emerges from the house in the background, running directly toward the camera with the rice balls in hand, looking back for a frantic second at the barn. It’s the boy’s mother. There’s no particular reason to stage the scene like this; another director might have done it in two separate shots, if it had occurred to him to include it at all. Yet the way in which Kurosawa films it, with the crowd giving way to the mother’s isolated figure, is both formally elegant and strangely moving. It offers up a miniature world of story and emotion without a single cut, and like Kurosawa himself, it resists any attempt, including this one, to break it down into parts.

The Ian Malcolm rule

with one comment

Jeff Goldblum in Jurassic Park

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except how to best serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.

There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to wear the same clothes every day than it is for a woman in the same situation. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

Mark Zuckerberg

If you were to draw a family tree connecting all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.

The light of distant stars

with 2 comments

Matthew McConaughey in Interstellar

By now, Interstellar has inspired plenty of conversation on subjects ranging from the accuracy of its science to the consistency of its intricate timelines, but I wanted to highlight one aspect of the film that hasn’t received as much attention: its use of physical miniatures. If you’re a visual effects nerd like me, Interstellar represents a welcome return to a style of filmmaking that other directors seem to have all but abandoned, with huge, detailed models—the one for the spacecraft Endurance was a full twenty-five feet across—shot against star fields in the studio, a tradition that stretches back through Star Wars to 2001. And the result speaks for itself. The effects are so good that they practically fade into the background; for long stretches of the film, we’re barely aware of them as effects at all, but as elements in a story that persuasively takes place on the largest imaginable scale. (There’s even a sense in which the film’s scientific rigor and its reliance on modelwork go hand in hand. Dealing with big, unwieldy miniatures and hydraulics can only make a filmmaker more aware of the physics involved.)

Last week, I suggested that Christopher Nolan, the most meticulous creator of blockbusters we have, is drawn to IMAX and the logistical problems it presents as a way of getting out of his own head, or of grounding his elaborate conceits in recognizably vivid environments, and much the same is true of his approach to effects. If Inception had unfolded in a flurry of digital imagery, as it might easily have done in the hands of a lesser filmmaker, the story itself would have been far less interesting. Dreams, as Cobb reminds Ariadne, feel real while you’re inside them, and it’s revealing that the most controlling of directors understands the value of techniques that force him to give up control, while paradoxically allowing for greater realism. As Nolan says:

These are things you could try to calculate into CG if you had to, but the wonderful thing about miniature shooting is that it shows you things you never knew were there or couldn’t plan for. I refer to it as serendipity—this random quality that gives the image a feeling of life.

And the randomness is key. Critics often speak of the uncanny valley when describing how virtual actors are never as convincing as the real thing, and a similar principle seems to be at work with other visual effects. Computers have made enormous advances in depicting anything a filmmaker likes, but there are still crucial details—artifacts of lighting, the behavior of surfaces seen against real backdrops—that digital artistry struggles to replicate, precisely because they’re so unpredictable.

George Clooney on the set of Gravity

Light, it seems, is a problem as intractable, in its own way, as the subtleties of human expression, and while we may feel less of a visceral reaction when the technology falls short, it still prevents us from immersing ourselves completely in the experience. Even in films like The Return of the King or Avatar, which look undeniably spectacular, we’re often conscious of how expertly the imagery has been constructed, with the uniform, unreal light of a world that exists only on a hard drive at Weta. It holds us at arm’s length even as it draws us in. That said, technology marches on, and it’s telling that Interstellar arrives in theaters almost exactly one year after Gravity, a movie that takes a diametrically opposite approach to many of the same problems: few practical sets or models were built, and for much of the film, everything in sight, from the spacesuits to the interiors to the panorama of the earth in the background, is a digital creation. The result, to put it mildly, looks fantastic, even in IMAX, and it’s the first movie I’ve seen in a long time in which computer effects are truly indistinguishable from reality.

At first glance, then, it might seem like Interstellar arrives on the scene a few months too late, at a point where digital effects have met and exceeded what might be possible using painstaking practical techniques. Really, though, the two films have a great deal in common. If the effects in Gravity work so well, it’s in large part due to the obsessiveness that went into lighting and wirework during principal photography: Emmanuel Lubezki’s famous light box amounts to a complicated way of addressing the basic—and excruciatingly specific—challenge of keeping the actors’ faces properly lit, a detail destined to pass unnoticed until it goes wrong. Interstellar takes much the same approach, with enormous projections used on the sound stage, rather than green screens, in order to immerse the actors in the effects in real time. In other words, both films end up converging on similar solutions from opposite directions, ultimately meeting in the same place: on the set itself. They understand that visible magic only works when grounded in invisible craft, and if the tools they use are very different, they’re united in a common goal. And the cinematic universe, thankfully, is big enough for them both.

Written by nevalalee

November 11, 2014 at 10:05 am

Stellar mass

leave a comment »

Interstellar

Note: This post does its best to avoid spoilers for Interstellar. I hope to have a more detailed consideration up next week.

Halfway through the first showing of Interstellar at the huge IMAX theater at Chicago’s Navy Pier, the screen abruptly went black. At a pivotal moment, the picture cut out first, followed immediately by the sound, and it took the audience a second to realize that the film had broken. Over the five minutes or so that followed, as we waited for the movie to resume, I had time to reflect on the sheer physicality of the technology involved. As this nifty featurette points out, a full print of Interstellar weighs six hundred pounds, mounted on a six-foot platter, and just getting it to move smoothly through the projector gate presents considerable logistical challenges, as we found out yesterday. (The film itself is so large that there isn’t room on the platter for any previews or extraneous features: it’s the first movie I’ve ever seen that simply started at the scheduled time, without any tedious preliminaries, and its closing credits are startlingly short.) According to Glenn Newland, the senior director of operations at IMAX, the company started making calls eighteen months ago to theater owners who were converting from film to digital, saying, in effect: Please hold on to that projector. You’re going to need it.

And they were right. I’ve noted before that if Christopher Nolan has indelibly associated himself with the IMAX format, that’s no accident. Nolan’s intuition about his large-scale medium seems to inform the narrative choices he makes: he senses, for instance, that plunging across a field of corn can be as visually thrilling as a journey through a wormhole or the skyline of Gotham City. Watching it, I got the impression that Nolan is drawn to IMAX as a kind of corrective to his own naturally hermetic style of storytelling: the big technical problems that the format imposes force him to live out in the world, not simply in his own head. And if the resulting image is nine times larger than that of conventional celluloid, that squares well with his approach to screenwriting, which packs each story with enough ideas for nine ordinary movies. Interstellar sometimes groans under the weight of its own ambitions; it lacks the clean lines provided by the heist plot of Inception or the superhero formula of his Batman films. It wants to be a popcorn movie, a visionary epic, a family story, and a scientifically rigorous adventure that takes a serious approach to relativity and time dilation, and it succeeds about two-thirds of the time.

Christopher Nolan on the set of Interstellar

Given the loftiness of its aims, that’s not too bad. Yet it might have worked even better if it had taken a cue from the director whose influence it struggles so hard to escape. Interstellar is haunted by 2001 in nearly every frame, from small, elegant touches, like the way a single cut is used to cover a vast stretch of time—in this case, the two-year journey from Earth to Saturn—to the largest of plot points. Like Kubrick’s film, it pauses in its evocation of vast cosmic vistas for a self-contained interlude of intimate, messy drama, which in both cases seems designed to remind us that humanity, or what it creates, can’t escape its most primitive impulses for self-preservation. Yet it also suffers a little in the comparison. Kubrick was shrewd enough to understand that a movie showing mankind in its true place in the universe had no room for ordinary human plots, and if his characters seem so drained of personality, it’s only a strategy for eliminating irrelevant distractions. Nolan wants to have it all, so he ends up with a film in which the emotional pieces sit uneasily alongside the spectacle, jostling for space when they should have had all the cosmos at their disposal.

Like most of Nolan’s recent blockbuster films, Interstellar engages in a complicated triangulation between purity of vision and commercial appeal, and the strain sometimes shows. It suffers, though much less glaringly, from the same tendency as Prometheus, in which characters stand around a spacecraft discussing information, like what the hell a wormhole is, that should probably have been covered long before takeoff. And while it may ultimately stand as Nolan’s most personal film—it was delivered to theaters under the fake title Flora’s Letter, named after his daughter—its monologues on the transcendent power of love make a less convincing statement than the visual wonders on display. (All praise and credit, by the way, are due to Matthew McConaughey, who carries an imperfectly conceived character with all the grace and authority he brought to True Detective, which also found him musing over the existence of dimensions beyond our own.) For all its flaws, though, it still stands as a rebuke to more cautious entertainments, a major work from a director who hardly seems capable of anything else. In an age of massless movies, it exerts a gravitational pull all its own, and if it were any larger, the theater wouldn’t be able to hold it.

Written by nevalalee

November 6, 2014 at 8:30 am