Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Inception’

The man with the plan

with one comment

This month marks the twenty-fifth anniversary of the release of Reservoir Dogs, a film that I loved as much as just about every other budding cinephile who came of age in the nineties. Tom Shone has a nice writeup on its legacy in The New Yorker, and while I don’t agree with every point that he makes—he dismisses Kill Bill, which is a movie that means so much to me that I named my own daughter after Beatrix Kiddo—he has insights that can’t be ignored: “Quentin [Tarantino] became his worst reviews, rather in the manner of a boy who, falsely accused of something, decides that he might as well do the thing for which he has already been punished.” And there’s one paragraph that strikes me as wonderfully perceptive:

So many great filmmakers have made their debuts with heist films—from Woody Allen’s Take the Money and Run to Michael Mann’s Thief to Wes Anderson’s Bottle Rocket to Bryan Singer’s The Usual Suspects—that it’s tempting to see the genre almost as an allegory for the filmmaking process. The model it offers first-time filmmakers is thus as much economic as aesthetic—a reaffirmation of the tenet that Jean-Luc Godard attributed to D. W. Griffith: “All you need to make a movie is a girl and a gun.” A man assembles a gang for the implementation of a plan that is months in the rehearsal and whose execution rests on a cunning facsimile of midmorning reality going undetected. But the plan meets bumpy reality, requiring feats of improvisation and quick thinking if the gang is to make off with its loot—and the filmmaker is to avoid going to movie jail.

And while you could nitpick the details of this argument—Singer’s debut was actually Public Access, a movie that nobody, including me, has seen—it gets at something fundamental about the art of film, which lies at the intersection of an industrial process and a crime. I’ve spoken elsewhere about how Inception, my favorite movie of the last decade, maps the members of its mind heist neatly onto the crew of a motion picture: Cobb is the director, Saito the producer, Ariadne the set designer, Eames the actor, and Arthur is, I don’t know, the line producer, while Fischer, the mark, is a surrogate for the audience itself. (For what it’s worth, Christopher Nolan has stated that any such allegory was unconscious, although he seems to have embraced it after the fact.) Most of the directors whom Shone names are what we’d call auteur figures, and aside from Singer, all of them wear a writer’s hat, which can obscure the extent to which they depend on collaboration. Yet in their best work, it’s hard to imagine Singer without Christopher McQuarrie, Tarantino without editor Sally Menke, or Wes Anderson without Owen Wilson, not to mention the art directors, cinematographers, and other skilled craftsmen required to finish even the most idiosyncratic and personal movie. Just as every novel is secretly about the process of its own creation, every movie is inevitably about making movies, which is the life that its creators know most intimately. One of the most exhilarating things that a movie can do is give us a sense of the huddle between artists, which is central to the appeal of The Red Shoes, but also Mission: Impossible—Rogue Nation, in which Tom Cruise told McQuarrie that he wanted to make a film about what it was like for the two of them to make a film.

But there’s also an element of criminality, which might be even more crucial. I’m not the first person to point out that there’s something illicit in the act of watching images of other people’s lives projected onto a screen in a darkened theater—David Thomson, our greatest film critic, has built his career on variations on that one central insight. And it shouldn’t surprise us if the filmmaking process itself takes on aspects of something done in the shadows, in defiance of permits, labor regulations, and the orderly progression of traffic. (Werner Herzog famously advised aspiring directors to carry bolt cutters everywhere: “If you want to do a film, steal a camera, steal raw stock, sneak into a lab and do it!”) If your goal is to tell a story about putting together a team for a complicated project, it could be about the Ballet Lermontov or the defense of a Japanese village, and the result might be even greater. But it would lack the air of illegality on which the medium thrives, both in its dreamlife and in its practical reality. From the beginning, Tarantino seems to have sensed this. He’s become so famous for reviving the careers of neglected figures for the sake of the auras that they provide—John Travolta, Pam Grier, Robert Forster, Keith Carradine—that it’s practically become his trademark, and we often forget that he did it for the first time in Reservoir Dogs. Lawrence Tierney, the star of Dillinger and Born to Kill, had been such a menacing presence both onscreen and off that he was effectively banned from Hollywood after the forties, and he remained terrifying even in old age. He terrorized the cast of Seinfeld during his guest appearance as Elaine’s father, and one of my favorite commentary tracks from The Simpsons consists of the staff reminiscing nervously about how much he scared them during the recording of “Marge Be Not Proud.”

Yet Tarantino still cast him as Joe Cabot, the man who sets up the heist, and Tierney rewarded him with a brilliant performance. Behind the scenes, it went more or less as you might expect, as Tarantino recalled much later:

Tierney was a complete lunatic by that time—he just needed to be sedated. We had decided to shoot his scenes first, so my first week of directing was talking with this fucking lunatic. He was personally challenging to every aspect of filmmaking. By the end of the week everybody on set hated Tierney—it wasn’t just me. And in the last twenty minutes of the first week we had a blowout and got into a fist fight. I fired him, and the whole crew burst into applause.

But the most revealing thing about the whole incident is that an untested director like Tarantino felt capable of taking on Tierney at all. You could argue that he already had an inkling of what he might become, but I’d prefer to think that he both needed and wanted someone like this to symbolize the last piece of the picture. Joe Cabot is the man with the plan, and he’s also the man with the money. (In the original script, Joe says into the phone: “Sid, stop, you’re embarrassing me. I don’t need to be told what I already know. When you have bad months, you do what every businessman in the world does, I don’t care if he’s Donald Trump or Irving the tailor. Ya ride it out.”) It’s tempting to associate him with the producer, but he’s more like a studio head, a position that has often drawn men whose bullying and manipulation are tolerated as long as they can make movies. When he wrote the screenplay, Tarantino had probably never met such a creature in person, but he must have had some sense of what was in store, and Reservoir Dogs was picked up for distribution by a man who fit the profile perfectly—and who never left Tarantino’s side again. His name was Harvey Weinstein.

Gatsby’s fortune and the art of ambiguity

leave a comment »

F. Scott Fitzgerald

Note: I’m taking a short break this week, so I’ll be republishing a few posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on July 17, 2015. 

In November 1924, the editor Maxwell Perkins received the manuscript of a novel tentatively titled Trimalchio in West Egg. He loved the book—he called it “extraordinary” and “magnificent”—but he also had a perceptive set of notes for its author. Here are a few of them:

Among a set of characters marvelously palpable and vital—I would know Tom Buchanan if I met him on the street and would avoid him—Gatsby is somewhat vague. The reader’s eyes can never quite focus upon him, his outlines are dim. Now everything about Gatsby is more or less a mystery, i.e. more or less vague, and this may be somewhat of an artistic intention, but I think it is mistaken. Couldn’t he be physically described as distinctly as the others, and couldn’t you add one or two characteristics like the use of that phrase “old sport”—not verbal, but physical ones, perhaps…

The other point is also about Gatsby: his career must remain mysterious, of course…Now almost all readers numerically are going to feel puzzled by his having all this wealth and are going to feel entitled to an explanation. To give a distinct and definite one would be, of course, utterly absurd. It did occur to me, though, that you might here and there interpolate some phrases, and possibly incidents, little touches of various kinds, that would suggest that he was in some active way mysteriously engaged.

The novel, of course, ultimately appeared under the title The Great Gatsby, and before it was published, F. Scott Fitzgerald took many of the notes from Perkins to heart, adding more descriptive material on Gatsby himself—along with several repetitions of the phrase “old sport”—and the sources of his mysterious fortune. Like Tay Hohoff, whose work on To Kill a Mockingbird has received greater recognition in recent years, or even John W. Campbell, Perkins was the exemplar of the editor as shaper, providing valued insight and active intervention for many of the major writers of his generation: Fitzgerald, Hemingway, Wolfe. But my favorite part of this story lies in Fitzgerald’s response, which I think is one of the most extraordinary glimpses into craft we have from any novelist:

I myself didn’t know what Gatsby looked like or was engaged in and you felt it. If I’d known and kept it from you you’d have been too impressed with my knowledge to protest. This is a complicated idea but I’m sure you’ll understand. But I know now—and as a penalty for not having known first, in other words to make sure, I’m going to tell more.

Which is only to say that there’s a big difference between what an author deliberately withholds and what he doesn’t know himself. And an intelligent reader, like Perkins, will sense it.

On Growth and Form

And it has important implications for the way we create our characters. I’ve never been a fan of the school that advocates working out every detail of a character’s background, from her hobbies to her childhood pets: the questionnaires and worksheets that spring up around this impulse can all too easily turn into an excuse for procrastination. My own sense of character is closer to what D’Arcy Wentworth Thompson describes in On Growth and Form, in which an animal’s physical shape is determined largely by the outside pressures to which it is subjected. Plot emerges from character, yes, but there’s also a sense in which character emerges from plot: these men and women are distinguished primarily by the fact that they’re the only people in the world to whom these particular events could happen. When I combine this with my natural distrust of backstory, I’ll frequently find that there are important things about my characters I don’t know myself, even after I’ve lived with them for years. There can even be something a little false about keeping the past constantly present in a character’s mind, as we often see in “realistic” fiction: even if we’re all the sum of our childhood experiences, in practice, we reveal more about ourselves in how we react to the pattern of forces in our lives at any given moment, and the resulting actions have a logic that can be worked out independently, as long as the situation is honestly developed.

But that doesn’t apply to issues, like the sources of Gatsby’s fortune, in which the reader’s curiosity might be reasonably aroused. If you’re going to hint at something, you’d better have a good idea of the answer, even if you don’t plan on sharing it. This applies especially to stories that generate a deliberate ambiguity, as Chris Nolan says of the ending of Inception:

Interviewer: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”

Nolan: Oh no, I’ve got an answer.

Interviewer: You do?!

Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated.

Ambiguity, as I’ve said elsewhere, is best created out of a network of specifics with one crucial piece removed. That specificity requires a great deal of knowledge on the author’s part, perhaps more here than anywhere else. And as Fitzgerald notes, if you do it properly, they’ll be too impressed by your knowledge to protest—or they’ll protest in all the right ways.

Strange currencies

with 2 comments

Doctor Strange

Last Monday, I took a break. I don’t normally take much time off, but I wanted to tune out of election coverage on what I feared, implausibly but correctly, might be the last entirely happy day I’d have for the next four years. My book project was in good shape, I’d finished a decent draft of a short story, and I had nothing else pressing to hold my attention. So I lit out. I treated myself to a Lyft ride into Chicago, where I dropped into two of my favorite used bookstores—Bookman’s Corner and Booklegger’s—and spent about twenty bucks. Then I took a train to the River East Theater on Illinois Street, where I met up with my wife to catch Doctor Strange, which was the first movie we’d seen together on the big screen since The Force Awakens. Afterward, we headed home just in time to put our daughter, who had spent the day with her grandparents, to bed. And if I lay out the context in such detail, it’s because I have a feeling that this is how most people in this country go to the movies. After a young adulthood in which I turned up at my local cineplex or art house theater at least once a week to see whatever blockbuster or critical darling was currently in the reviews, along with countless revivals, I’ve settled down into a routine in which I’m more likely to see two or three movies each year with my daughter and a couple of others for myself. This places me squarely in the mainstream of most moviegoers: according to a recent survey, the average American sees five movies a year, and I seem likely to hit that number exactly.

Which is both remarkable and kind of unsurprising. Hollywood releases about six hundred movies every year, a significant percentage of which are trying to appeal to as many demographic quadrants as possible. Yet even The Force Awakens, which sold over a hundred million tickets domestically, was seen by something less than a third of all Americans, even before you take multiple viewings into account. To convince the average adult to go to the movies five times in a single calendar year, you need a wide range of product, only a fraction of which is likely to entice any given individual to buy a ticket. Inevitably, however, the people who write professionally about the movies from both the artistic and business angles are inclined to try to make sense of the slate as a whole. Film critics may review two or three movies every week and go to even more—and they have to see everything, not just what appeals to their own tastes. As I learned during my own stint as a working critic, it’s a situation that has a way of altering your expectations: you realize how many movies are simply mediocre and forgettable, and you start to relish anything out of the ordinary, however misguided it might be. Needless to say, this isn’t how your typical moviegoer sees it. Someone who watches a hundred and fifty movies every year for free might as well belong to a different species from someone who pays to see fewer than five, but they have no choice but to try to understand each other, at least if we’re going to take criticism seriously from either side.

Tilda Swinton and Benedict Cumberbatch in Doctor Strange

So what does this have to do with Doctor Strange? Quite a lot, I think. I had originally hoped to write about it here last week, before the election made it hard to think about anything else, and there was a time when I wasn’t even sure whether I’d devote a post to it at all. Yet I’ve become intrigued precisely by the way it has faded in my imagination. In the moment, I liked it a lot. It stars five actors whom I’m happy to see in anything, and it actually gives two or three of them something interesting to do. When I broke it down in my head, its scenes fell into three categories. About a third were watchable in the usual Marvel way, which takes pride in being pretty good, but not great; another third achieved something like high camp; and the last third were genuinely visionary, with some of the most striking visual effects I’ve ever seen. There are scenes in Doctor Strange that get as close as a movie possibly can to the look and feel of a dream, with elaborate geometric patterns and cityscapes that break down and reform themselves before our eyes. It left me wondering how they did it. But it didn’t stick in my head in the way that Inception, its obvious inspiration, still does. In part, it’s because it uses digital rather than practical effects: an homage to Joseph Gordon-Levitt’s famous hallway fight scene only reminds us of how much more effective—and respectful of gravity—it was to stage it in camera. And even the most amazing sequences are chases or showdowns that amount to interchangeable components. The story halts for them, and they could be inserted at any point into any version of the script.

As a result, it left me with a highlight reel of memories that is basically identical to the trailer. But a movie that was wholly as weird and as distinctive as the best scenes in Doctor Strange would never have made it into theaters. It would be fundamentally out of keeping with the basic premise of the Marvel Universe, which is that no one movie can stick out from the rest, and nothing can occur that is so meaningful that it interferes with the smooth production of films being shot simultaneously by other directors. The story, ideally, should be about as little as possible, while still creating the illusion that the stakes are infinite—which leads inexorably to diminishing returns. (When you read the early space opera stories of writers like John W. Campbell, you realize that once the heroes can casually span entire galaxies, it means that nothing matters whatsoever. And the same thing happens in the Marvel films.) Doctor Strange works because it keeps its weirdness hermetically sealed off from the rest: as long as we’re watching those scenes, we’re transported into a freakier, more exhilarating film, only to be returned to the safe beats of the formula as quickly and antiseptically as possible. There’s nothing wrong with the screenplay, except to the extent that there’s something wrong with every script written according to the usual specifications. The result has flashes of something extraordinary, but it’s scaled back for the audience members who see only five movies a year. It’s big and distinctive enough to assure you that you’ve gotten your money’s worth, but not so unusual that it makes you question what you bought with it. It’s Benedict Cumberbatch with an American accent. And it’s exactly as good as that sounds.

Written by nevalalee

November 15, 2016 at 8:27 am

The strange loop of Westworld

leave a comment »

The maze in Westworld

In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:

This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.

I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.

Inception

And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.

Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:

What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.

This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.

I can dream, can’t I?

with 6 comments

Inception

For years, I’ve been daydreaming about a piece of fan fiction that I’d love to write, although I doubt I’ll ever get a chance to do it. Let’s call it The Carousel. It’s a midquel to Inception, which means that it takes place during the events of the original movie—in this case, after Cobb has assembled his team for the mind heist, but before they’ve actually gone into Fischer’s head. (There’s nothing in the film itself to rule this out: it’s unclear how much time passes after Saito approaches them with the assignment.) Cobb is concerned about Ariadne’s lack of experience, so he proposes that they practice first with a quick, straightforward job. It’s a commission from a striking, mysterious woman in her fifties who wants them to enter her aging father’s dreams to discover the secrets of his past. She is, of course, Sally Draper from Mad Men. The rest of the story follows the team as they invade Don’s mind, burrowing into his memories of his life at Sterling Cooper and the women he loved and lost, and probing ever deeper toward the dark heart of the man who was once known as Dick Whitman. We’d see Arthur and Ariadne trying to blend in at the office holiday party, or maybe Eames going undercover in Korea. And when they emerge from Don’s brain at last, with or without the answers that Sally wants, they’ve all been subtly changed, and they’re ready to go after Fischer. If nothing else, it explains why they’re still wearing those suits.

Alas, I don’t think I’ll ever write this story, mostly because I know I can’t give it the energy and attention it deserves. After I got the idea for the crossover, I decided to put it off until Mad Men finished its run, which would allow me to draw on Don’s full backstory, but the longer I waited, the more obvious it became that I couldn’t justify the investment of time it required. For one thing, I’d want to write it up as a full novel, and to do it justice, I’d have to go back and watch all seven seasons of the series, looking for places in which I could insert Cobb’s team into the background, à la Back to the Future Part II. I’d also want to revisit Inception itself to see if there were any plot holes or contradictions I could explain in the process. In short, it would be a lot of work for a story that I’m not sure anybody else would read, or particularly want to see. But I seem to have incepted myself with it, because I can’t get it out of my head. As with most fanfic, there’s an element of wish fulfillment involved: it allows me to spend a little more time with characters I probably won’t see ever again. Mad Men ended so beautifully that any continuation—like the Sally Draper spinoff series that was pitched in all seriousness at AMC—would only undermine its legacy. And Inception is one of the few recent blockbusters that deliberately makes a sequel impossible, despite the occasional rumblings that we hear along those lines. It won’t happen. But this is why fanfic exists.

Jon Hamm on Mad Men

In the meantime, I’ll sometimes try to scratch that itch by reading a novel or short story and mentally casting all the characters with faces from Mad Men. It’s a habit that I picked up years ago, when I first read Arthur Hailey’s Hotel, and I’ve done it since with Airport and a few of John D. MacDonald’s novels. (I still think that Jon Hamm would make a perfect Travis McGee.) And the show maps onto George O. Smith’s stories about the space station Venus Equilateral almost too well. I’ll often do it when reading a story that is best approached as a period piece, thanks either to the author’s intentions or to the passage of time. Picturing Don, Joan, and the rest at least allows me to keep the clothes and hairstyles straight, which is a more significant factor than it might first appear: a book like John Updike’s Couples reads altogether differently when you realize that all of the women would have been dressed like Betty Draper. In other cases, it amounts to a hybrid form of fanfic, enabling the kind of dream casting that still makes me wish, say, for a miniseries version of The Corrections starring the cast of Arrested Development—which just makes me want to read that novel again with those actors in mind, just as I recently went back to Red Dragon while picturing Hugh Dancy as Will. It’s a harmless game, and it can bring out elements of a story that I might have overlooked, just as the casting of a particular movie star in a film can clarify a character in ways that a screenwriter can’t.

And this is just a variation on what happens inside all our heads when we read a novel. Only half of the work is done by the writer on the page; the other half occurs in the reader’s brain, which populates the novel with faces, settings, and images that the author might never have envisioned. What I see when I read a story is drastically different from what appears in your mind’s eye, and we have no way of comparing them directly. (That said, an adaptation can lock certain elements into place for many readers, so that their imaginations run more or less in parallel. Ten years ago, no two fans saw the characters from A Song of Ice and Fire in quite the same way, but thanks to Game of Thrones, I suspect that a lot of readers now just picture Peter Dinklage and Emilia Clarke, as if a wave function had collapsed into exactly one eigenstate.) The fact that fanfic bridges that gap instantaneously, so that we can immediately see all of our favorite characters, is a large part of its appeal—and the main reason why it’s a flawed school for writers who are still learning their craft. Creating believable characters from scratch is the single hardest aspect of writing, and fanfic allows you to skip that crucial step. Aspiring writers should be wary of it for the same reason that the playwright Willy Russell avoids listening to music or drinking wine while he works: “I think both those things seduce you into thinking that the feelings engendered by the wine or music are present in your work.” That’s true of fanfic, too, and it’s why I’ll probably never end up writing The Carousel. But I can dream, can’t I?

“He had played his part admirably…”

leave a comment »

"Laszlo, the bosun of the megayacht..."

Note: This post is the forty-first installment in my author’s commentary for Eternal Empire, covering Chapter 40. You can read the previous installments here.

A few weeks ago, I briefly discussed the notorious scene in The Dark Knight Rises in which Bruce Wayne reappears—without any explanation whatsoever—in Gotham City. Bane’s henchmen, you might recall, have blown up all the bridges and sealed off the area to the military and law enforcement, and the entire plot hinges on the city’s absolute isolation. Bruce, in turn, has just escaped from a foreign prison, and although its location is left deliberately unspecified, it sure seems like it was in a different hemisphere. Yet what must have been a journey of thousands of miles and a daring incursion is handled in the space of a single cut: Bruce simply shows up, and there isn’t even a line of dialogue acknowledging how he got there. Not surprisingly, this hiatus has inspired a lot of discussion online, with most explanations boiling down to “He’s Batman.” If asked, Christopher Nolan might reply that the specifics don’t really matter, and that the viewer’s attention is properly focused elsewhere, a point that the writer John Gardner once made with reference to Hamlet:

We naturally ask how it is that, when shipped off to what is meant to be his death, the usually indecisive prince manages to hoist his enemies with their own petard—an event that takes place off stage and, at least in the surviving text, gets no real explanation. If pressed, Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to…

Gardner concludes: “The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.” And while this might seem to apply equally well to The Dark Knight Rises, it doesn’t really hold water. The absence of an explanation did yank many of us out of the movie, however briefly, and it took us a minute to settle back in. Any explanation at all would have been better than this, and it could have been conveyed in less than a sentence. It isn’t an issue of plausibility, but of narrative flow. You could say that Bruce’s return to the city ought to be omitted, in the same way a director like Kurosawa mercilessly cuts all transitional moments: when you just need to get a character from Point A to Point B, it’s best to trim the journey as much as you can. In this instance, however, Nolan erred too much on one side, at least in the eyes of many viewers. And it’s a reminder that the rules of storytelling are all about context. You’ve got to judge each problem on its own terms and figure out the solution that makes the most sense in each case.

"He had played his part admirably..."

What’s really fascinating is how frequently Nolan himself seems to struggle with this issue. In terms of sheer technical proficiency, I’d rank him near the top of the list of all working directors, but if he has one flaw as a filmmaker, aside from his lack of humor, it’s his persistent difficulty in finding the right balance between action and exposition. Much of Inception, which is one of my ten favorite movies of all time, consists of the characters breathlessly explaining the plot to one another, and it more or less works. But he also spends much of Interstellar trying with mixed success to figure out how much to tell us about the science involved, leading to scenes like the one in which Dr. Romilly explains the wormhole to Cooper seemingly moments before they enter it. And Nolan is oddly prone to neglecting obligatory beats that the audience needs to assemble the story in their heads, as when Batman appears to abandon a room of innocent party guests to the Joker in The Dark Knight. You could say that such lapses simply reflect the complexity of the stories that Nolan wants to tell, and you might be right. But David Fincher, who is Nolan’s only peer among active directors, tells stories of comparable or greater complexity—indeed, they’re often about their own complexity—and we’re rarely lost or confused. And if I’m hard on Nolan about this, it’s only a reflection of how difficult such issues can be, when even the best mainstream director of his generation has trouble working out how much information the audience needs.

It all boils down to Thomas Pynchon’s arch aside in Gravity’s Rainbow: “You will want cause and effect. All right.” And knowing how much cause will yield the effect you need is a problem that every storyteller has to confront on a regular basis. Chapter 40 of Eternal Empire provides a good example. For the last hundred pages, the novel has been building toward the moment when Ilya sneaks onto the heavily guarded yacht at Yalta. There’s no question that he’s going to do it; otherwise, everything leading up to it would seem like a ridiculous tease. The mechanics of how he gets aboard don’t really matter, but I also couldn’t avoid the issue, or else readers would rightly object. All I needed was a solution that was reasonably plausible and that could be covered in a few pages. As it happens, the previous scene ends with this exchange between Maddy and Ilya: “But you can’t just expect to walk on board.” “That’s exactly what I intend to do.” When I typed those lines, I didn’t know what Ilya had in mind, but I knew at once that they pointed at the kind of simplicity that the story needed, at least at this point in the novel. (If it came later in the plot, as part of the climax, it might have been more elaborate.) So I came up with a short sequence in which Ilya impersonates a dockwalker looking for work on the yacht, cleverly ingratiates himself with the bosun, and slips below when Maddy provides a convenient distraction. It’s a cute scene—maybe a little too cute, in fact, for this particular novel. But it works exactly as well as it should. Ilya is on board. We get just enough cause and effect. And now we can move on to the really good stuff to come…

The list of a lifetime

leave a comment »

Star Trek II: The Wrath of Khan

I miss Roger Ebert for a lot of reasons, but I always loved how fully he occupied the role of the celebrity critic while expanding it into something more. “Two thumbs up” has become a way of dismissing an entire category of film criticism, and Ebert was as responsible for its rise as anyone else, although he can hardly be blamed for his imitators. Yet he wouldn’t have been nearly as good at it—and he was damned good, especially when paired with Gene Siskel—if it hadn’t been built on a foundation of shrewdness, taste, and common sense that came through in every print review he wrote. He knew that a rating system was necessary, if only to give shape to his discussions with Gene, but he was also aware of its limitations. (For proof, you need only turn to his classic review of the Adam Sandler remake of The Longest Yard, which transforms, unexpectedly, into an extended essay on the absurdity of reconciling a thoughtful approach to criticism with “that vertical thumb.”) Read any critic for any length of time, whether it’s Pauline Kael or David Thomson or James Wood, and you start to see the whole business of ranking works of art, whether with thumbs or with words, as both utterly important and inherently ridiculous. Ebert understood this profoundly.

The same was true of the other major tool of the mainstream critic: the list. Making lists of the best or worst movies, like handing out awards, turns an art form into a horse race, but it’s also a necessary evil. A critic wants to be a valued guide, but more often, he ends up serving as a signpost, pointing up the road toward an interesting vista while hoping that we’ll take in other sights along the way. Lists are the most useful pointers we have, especially for viewers who are encountering the full variety of movies for the first time, and they’ve played an enormous role in my own life. And when you read Ebert’s essay on preparing his final list for the Sight & Sound poll, you sense both the melancholy nature of the task and his awareness of the power it holds. Ebert knows that adding a movie to his list naturally draws attention to it, and he pointedly includes a single “propaganda” title—here it’s Malick’s Tree of Life—to encourage viewers to seek it out. Since every addition requires a removal, he clarifies his feelings on this as well:

Once any film has ever appeared on my [Sight & Sound] list, I consider it canonized. Notorious or Gates of Heaven, for example, are still two of the ten best films of all time, no matter what a subsequent list says.

In short, he approaches the list as a game, but a serious one, and he knows that pointing one viewer toward Aguirre or The General makes all of it worthwhile.

Russell Crowe and Guy Pearce in L.A. Confidential

I thought of his example repeatedly when I revised my list of my ten favorite movies. Four years had gone by since my last series of posts on the subject, and the passage of time had brought a bit of reshuffling and a pair of replacements: L.A. Confidential and Star Trek II: The Wrath of Khan had given way to Vertigo and Inception. And while it’s probably a mistake to view it as a zero-sum game, it’s hard not to see these films as commenting on one another. L.A. Confidential remains, as I said long ago, my favorite of all recent Hollywood movies, but it’s a film that invests its genre with greater fluency and complexity without challenging the rules on a deeper level, while Vertigo takes the basic outline of a sleek romantic thriller and blows it to smithereens. As much as I love them both, there’s no question in my mind as to which one achieves more. The contest between Inception and Wrath of Khan is harder to judge, and I’m not sure that the latter isn’t ultimately richer and more rewarding. But I wanted to write about Inception ever so slightly more, and after this weekend’s handwringing over the future of original ideas in movies, I have a hunch that its example is going to look even more precious with time. Inception hardly needs my help to draw attention to it, but to the extent that I had a propaganda choice this time around, it was this one.

Otherwise, my method in ranking these films was a simple one. I asked myself which movie I’d save first—solely for my own pleasure—if the last movie warehouse in the world were on fire. The answer was The Red Shoes. Next would be Blue Velvet, then Chungking Express, and so on down the line. Looking at the final roster, I don’t think I’d make any changes. Like Ebert, who kept La Dolce Vita on his list because of how it reflected the arc of his own life, I’m aware that much of the result is a veiled autobiography: Blue Velvet, in particular, galvanized me as a teenager as few other movies have, and part of the reason I rank it so highly is to acknowledge that specific debt. Other films are here largely because of the personal associations they evoke. Yet any movie that encapsulates an entire period in my life, out of all the films I was watching then, has to be extraordinary by definition: it isn’t just a matter of timing, at least not if it lasts. (You could even say that a great movie, like Vertigo, is one that convinces many different viewers that it’s secretly about them.) Ebert knew that there was no contradiction in embracing The Tree of Life as both the largest cosmic statement since 2001 and an agonizingly specific evocation of his own childhood. Any list, like any critic, lives in two worlds, and each half gains meaning from the other. And when I think of my own list and the choices it made, I can only quote Ebert one last time: “To add a title, I must remove one. Which film can I do without? Not a single one.”

My ten great movies #10: Inception

with 3 comments

Inception

Note: Four years ago, I published a series of posts here about my ten favorite movies. Since then, the list has evolved, as all such rankings do, with a few new titles and a reshuffling of the survivors, so it seems like as good a time as any to revisit it now.

Five years after its release, when we think of Inception, what we’re likely to remember first—aside from its considerable merits as entertainment—is its apparent complexity. With five or more levels of reality and a set of rules being explained to us, as well as to the characters, in parallel with breathless action, it’s no wonder that its one big laugh comes at Ariadne’s bewildered question: “Whose subconscious are we going into?” It’s a line that gives us permission to be lost. Yet it’s all far less confusing than it might have been, thanks largely to the work of editor Lee Smith, whose lack of an Oscar nomination, in retrospect, seems like an even greater scandal than Nolan’s snub as Best Director. This is one of the most comprehensively organized movies ever made. Yet a lot of credit is also due to Nolan’s script, and in particular to the shrewd choices it makes about where to walk back its own complications. As I’ve noted before, once the premise has been established, the action unfolds more or less as we’ve been told it will: there isn’t the third-act twist or betrayal that similar heist movies, or even Memento, have taught us to expect. Another nudge would cause it all to collapse.

It’s also in part for the sake of reducing clutter that the dream worlds themselves tend to be starkly realistic, while remaining beautiful and striking. A director like Terry Gilliam might have turned each level into a riot of production design, and although the movie’s relative lack of surrealism has been taken as a flaw, it’s really more of a strategy for keeping the clean lines of the story distinct. The same applies to the characters, who, with the exception of Cobb, are defined mostly by their roles in the action. Yet they’re curiously compelling, perhaps because we respond so instinctively to stories of heists and elaborate teamwork. I admire Interstellar, but I can’t say I need to spend another three hours in the company of its characters, while Inception leaves me wanting more. This is also because its premise is so rich: it hints at countless possible stories, but turns itself into a closed circle that denies any prospect of a sequel. (It’s worth noting, too, how ingenious the device of the totem really is, with the massive superstructure of one of the largest movies ever made coming to rest on the axis of a single trembling top.) And it’s that unresolved tension, between a universe of possibilities and a remorseless cut to black, that gives us the material for so many dreams.

Tomorrow: The greatest story in movies. 

Written by nevalalee

May 11, 2015 at 8:27 am

The poster problem

leave a comment »

Avengers: Age of Ultron

Three years ago, while reviewing The Avengers soon after its opening weekend, I made the following remarks, which seem to have held up fairly well:

This is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many…so that a lot of the film, probably too much, is spent slotting all the components into place.

If the early reactions to Age of Ultron are any indication, I could copy and paste this text and make it the centerpiece of a review of any Avengers movie, past or future. This isn’t to say that the latest installment—which I haven’t seen—might not be fine in its way. But even the franchise’s fans, of which I’m not really one, seem to admit that much of it consists of Whedon dealing with all those moving parts, and the extent of your enjoyment depends largely on how well you feel he pulls it off.

Whedon himself has indicated that he has less control over the process than he’d like. In a recent interview with Mental Floss, he says:

But it’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant. And now I find myself with a huge crew of people and, although I’m not as bloodthirsty as some people like to pretend, I think it’s disingenuous to say we’re going to fight this great battle, but there’s not going to be any loss. So my feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.

Which, when you think about it, is a startling statement to hear from one of Hollywood’s most powerful directors. But it accurately describes the situation. Any Avengers movie will always feel less like a story in itself than like a kind of anomalous weather pattern formed at the meeting point of several huge fronts: the plot, such as it is, emerges in the transition zone, and it’s dwarfed by the masses of air behind it. Marvel has made a specialty of exceeding audience expectations just ever so slightly, and given the gigantic marketing pressures involved, it’s a marvel that it works as well as it does.

Inception

It’s fair to ask, in fact, whether any movie with that poster—with no fewer than eight names above the title, most belonging to current or potential franchise bearers—could ever be more than an exercise in crowd control. But there’s a telling counterexample, and it looks, as I’ve said elsewhere, increasingly impressive with time: Christopher Nolan’s Inception. As the years pass, Inception remains a model movie in many respects, but particularly when it comes to the problem of managing narrative complexity. Nolan picks his battles in fascinating ways: he’s telling a nested story with five or more levels of reality, and like Thomas Pynchon, he selectively simplifies the material wherever he can. There’s the fact, for instance, that once the logic of the plot has been explained, it unfolds more or less as we expect, without the twist or third-act betrayal that we’ve been trained to anticipate in most heist movies. The characters, with the exception of Cobb, are defined largely by their surfaces, with a specified role and a few identifying traits. Yet they don’t come off as thin or underdeveloped, and although the poster for Inception is even more packed than that for Age of Ultron, with nine names above the title, we don’t feel that the movie is scrambling to find room for everyone.

And a glance at the cast lists of these movies goes a long way toward explaining why. The Avengers has about fifty speaking parts; Age of Ultron has sixty; and Inception, incredibly, has only fifteen or so. Inception is, in fact, a remarkably underpopulated movie: aside from its leading actors, only a handful of other faces ever appear. Yet we don’t particularly notice this while watching. In all likelihood, there’s a threshold number of characters necessary for a movie to seem fully peopled—and to provide for enough interesting pairings—and any further increase doesn’t change our perception of the whole. If that’s the case, then it’s another shrewd simplification by Nolan, who gives us exactly the number of characters we need and no more. The Avengers movies operate on a different scale, of course: a movie full of superheroes needs some ordinary people for contrast, and there’s a greater need for extras when the stage is as big as the universe. (On paper, anyway. In practice, the stakes in a movie like this are always going to remain something of an abstraction, since we have eight more installments waiting in the wings.) But if Whedon had been more ruthless at paring down his cast at the margins, we might have ended up with a series of films that seemed, paradoxically, larger: each hero could have expanded to fill the space he or she deserved, rather than occupying one corner of a masterpiece of Photoshop.

Written by nevalalee

April 29, 2015 at 8:44 am

An unfinished decade

leave a comment »

Joaquin Phoenix in The Master

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What movie from our best films of the decade so far list doesn’t deserve to be on there?”

Toward the end of the eighties, Premiere Magazine conducted a poll of critics, directors, writers, and industry insiders to select the best films of the previous decade. The winners, in order of the number of votes received, were Raging Bull, Wings of Desire, E.T., Blue Velvet, Hannah and Her Sisters, Platoon, Fanny and Alexander, Shoah, Who Framed Roger Rabbit, and Do the Right Thing, with The Road Warrior, Local Hero, and Terms of Endearment falling just outside the top ten. I had to look up the list to retype it here, but I also could have reconstructed much of it from memory: a battered copy of Premiere’s paperback home video guide—which seems to have vanished from existence, along with its parent magazine, based on my inability, after five minutes of futile searching, to even locate the title online—was one of my constant companions as I started exploring movies more seriously in high school. And if the list contains a few headscratchers, that shouldn’t be surprising: the poll was held a few months before the eighties were technically even over, which isn’t close to enough time for a canon to settle into a consensus.

So how would an updated ranking look? The closest thing we have to a more recent evaluation is the latest Sight & Sound critics’ poll of the best films ever made. Pulling out only the movies from the eighties, the top films are Shoah, Raging Bull, Blade Runner, Blue Velvet, Fanny and Alexander, A City of Sadness, Do the Right Thing, L’Argent, The Shining, and My Neighbor Totoro, followed closely by Come and See, Distant Voices Still Lives, and Once Upon a Time in America. There’s a degree of overlap here, and Raging Bull was already all but canonized when the earlier survey took place, but Wings of Desire, which once came in second, is nowhere in sight, its position taken by a movie—Blade Runner—that didn’t even factor into the earlier conversation. The Shining received the vote of just a single critic in the Premiere poll, and at the time it was held, My Neighbor Totoro wouldn’t be widely seen outside Japan for another three years. Still, if there’s a consistent pattern, it’s hard to see, aside from the obvious point that it takes a while for collective opinion to stabilize. Time is the most remorseless, and accurate, critic of them all.

Inception

And carving up movies by decade is an especially haphazard undertaking. A decade is an arbitrary division, much more so than a single year, in which the movies naturally engage in a kind of accidental dialogue. It’s hard to see the release date of Raging Bull as anything more than a quirk of the calendar: it’s undeniably the last great movie of the seventies. You could say much the same of The Shining. And there’s pressure to make any such list conform to our idea of what a given decade was about. The eighties, at least at the time, were seen as a moment in which the auteurism of the prior decade was supplanted by a blockbuster mentality, encouraged, as Tony Kushner would have it, by an atmosphere of reactionary politics, but of course the truth is more complicated. Blue Velvet harks back to the fifties, but the division at its heart feels like a product of Reaganism, and the belated ascent of Blade Runner is an acknowledgment of the possibilities of art in the era of Star Wars. (As an offhand observation, I’d say that we find it easier to characterize decades if their first years happen to coincide with a presidential election. As a culture, we know what the sixties, eighties, and aughts were “like” far more than the seventies or nineties.)

So we should be skeptical of the surprising number of recent attempts to rank works of art when the decade in question is barely halfway over. This week alone, The A.V. Club did it for movies, while The Oyster Review did it for books, and even if we discount the fact that we have five more years of art to anticipate, such lists are interesting mostly in the possibilities they suggest for later reconsideration. (The top choices at The A.V. Club were The Master, A Separation, The Tree of Life, Frances Ha, and The Act of Killing, and looking over the rest of the list, about half of which I've seen, I'd have to say that the only selection that really puzzled me was Haywire.) As a culture, we may be past the point where a consensus favorite is even possible: I'm not sure if any one movie occupies the same position for the aughts that Raging Bull did for the eighties. If I can venture one modest prediction, though, it's that Inception will look increasingly impressive as time goes on, for much the same reason as Blade Runner does: it's our best recent example of an intensely personal vision thriving within the commercial constraints of the era in which it was made. Great movies are timeless, but also of their time, in ways that can be hard to sort out until much later. And that's true of critics and viewers, too.

Stellar mass

leave a comment »

Interstellar

Note: This post does its best to avoid spoilers for Interstellar. I hope to have a more detailed consideration up next week.

Halfway through the first showing of Interstellar at the huge IMAX theater at Chicago’s Navy Pier, the screen abruptly went black. At a pivotal moment, the picture cut out first, followed immediately by the sound, and it took the audience a second to realize that the film had broken. Over the five minutes or so that followed, as we waited for the movie to resume, I had time to reflect on the sheer physicality of the technology involved. As this nifty featurette points out, a full print of Interstellar weighs six hundred pounds, mounted on a six-foot platter, and just getting it to move smoothly through the projector gate presents considerable logistical challenges, as we found out yesterday. (The film itself is so large that there isn’t room on the platter for any previews or extraneous features: it’s the first movie I’ve ever seen that simply started at the scheduled time, without any tedious preliminaries, and its closing credits are startlingly short.) According to Glenn Newland, the senior director of operations at IMAX, the company started making calls eighteen months ago to theater owners who were converting from film to digital, saying, in effect: Please hold on to that projector. You’re going to need it.

And they were right. I’ve noted before that if Christopher Nolan has indelibly associated himself with the IMAX format, that’s no accident. Nolan’s intuition about his large-scale medium seems to inform the narrative choices he makes: he senses, for instance, that plunging across a field of corn can be as visually thrilling as a journey through a wormhole or the skyline of Gotham City. Watching it, I got the impression that Nolan is drawn to IMAX as a kind of corrective to his own naturally hermetic style of storytelling: the big technical problems that the format imposes force him to live out in the world, not simply in his own head. And if the resulting image is nine times larger than that of conventional celluloid, that squares well with his approach to screenwriting, which packs each story with enough ideas for nine ordinary movies. Interstellar sometimes groans under the weight of its own ambitions; it lacks the clean lines provided by the heist plot of Inception or the superhero formula of his Batman films. It wants to be a popcorn movie, a visionary epic, a family story, and a scientifically rigorous adventure that takes a serious approach to relativity and time dilation, and it succeeds about two-thirds of the time.

Christopher Nolan on the set of Interstellar

Given the loftiness of its aims, that’s not too bad. Yet it might have worked even better if it had taken a cue from the director whose influence it struggles so hard to escape. Interstellar is haunted by 2001 in nearly every frame, from small, elegant touches, like the way a single cut is used to cover a vast stretch of time—in this case, the two-year journey from Earth to Saturn—to the largest of plot points. Like Kubrick’s film, it pauses in its evocation of vast cosmic vistas for a self-contained interlude of intimate, messy drama, which in both cases seems designed to remind us that humanity, or what it creates, can’t escape its most primitive impulses for self-preservation. Yet it also suffers a little in the comparison. Kubrick was shrewd enough to understand that a movie showing mankind in its true place in the universe had no room for ordinary human plots, and if his characters seem so drained of personality, it’s only a strategy for eliminating irrelevant distractions. Nolan wants to have it all, so he ends up with a film in which the emotional pieces sit uneasily alongside the spectacle, jostling for space when they should have had all the cosmos at their disposal.

Like most of Nolan’s recent blockbuster films, Interstellar engages in a complicated triangulation between purity of vision and commercial appeal, and the strain sometimes shows. It suffers, though much less glaringly, from the same tendency as Prometheus, in which characters stand around a spacecraft discussing information, like what the hell a wormhole is, that should have probably been covered long before takeoff. And while it may ultimately stand as Nolan’s most personal film—it was delivered to theaters under the fake title Flora’s Letter, which is named after his daughter—its monologues on the transcendent power of love make a less convincing statement than the visual wonders on display. (All praise and credit, by the way, are due to Matthew McConaughey, who carries an imperfectly conceived character with all the grace and authority he brought to True Detective, which also found him musing over the existence of dimensions beyond our own.) For all its flaws, though, it still stands as a rebuke to more cautious entertainments, a major work from a director who hardly seems capable of anything else. In an age of massless movies, it exerts a gravitational pull all its own, and if it were any larger, the theater wouldn’t be able to hold it.

Written by nevalalee

November 6, 2014 at 8:30 am

The lost art of the extended take

with 8 comments

Karen Allen in Raiders of the Lost Ark

For Christmas, I got my wife a copy of The Wes Anderson Collection by Matt Zoller Seitz, which is one of those ideal presents that the giver buys for the recipient because he secretly wants it for himself—I’ve spent at least as much time browsing through it as she has. It’s a beautiful book of interviews with a fascinating subject, and I suspect that it will provide a lot of material for this blog. Today, though, I’d like to focus on one short exchange, which occurs during a discussion of Anderson’s use of extended tracking shots. Seitz points to the drinking contest in Raiders of the Lost Ark as an example of a great director subtly shooting a long scene in a single take without cuts, and shrewdly notes that our knowledge that the action is unfolding in real time subliminally increases the suspense. Anderson agrees: “You’re not only waiting to see who’s going to get knocked out with the liquor; you’re waiting to see who’s going to screw up the take.” Elsewhere, Seitz has written of how the way the scene was shot adds “a second, subtle layer of tension to an already snappy scene…our subliminal awareness that we’re seeing a filmed live performance, and our sporting interest in seeing how long they can keep it going.”

This is a beautiful notion, because it exemplifies a quality that many of my favorite films share: the fictional story that the movie is telling shades imperceptibly into the factual story of how the movie itself was made, which unfolds in parallel to the main action, both invisibly and right in front of our eyes. It’s something like Truffaut’s statement that a movie should simultaneously express “an idea of life and an idea of cinema,” but it’s less about any specific philosophical idea than a sense that the narrative that the movie presents to us is a metaphor for its own creation. We see this in a movie like Citizen Kane, in which it’s hard not to read the youthful excitement of Kane’s early days at the Inquirer as a portrait of Orson Welles arriving on the RKO lot, and its later, disillusioned passages as a weird prefiguring of what would happen to Welles decades down the line; or even a movie like Inception, in which the roles of the participants in the mind heist correspond to those of the team behind the camera—the director, the producer, the production designer—and the star looks a little like Chris Nolan himself. (Someone, possibly me, should really make a slideshow on how directors tend to cast leading roles with their own doubles, as Anderson often does as well.)

Gravity

And the ultimate expression of the marriage between the filmed story and the story of its creation is the extended shot. It's a moment in which the movie we're watching fuses uncannily with its own behind-the-scenes documentary: for a minute or two, we're on the set, watching the action at the director's side, and the result is charged with the excitement of live performance. If every cut, as Godard says, is a lie, a continuous take brings us as close to the truth—or at least to a clever simulacrum of it—as the movies can manage. It doesn't need to be overtly flashy, either: I've never seen a better use of an extended take than in the party scene in 4 Months, 3 Weeks, and 2 Days, in which the camera remains stationary for an entire reel. But there's also a childlike pleasure in seeing filmmakers taking a big risk and getting away with it. You see this in the massively choreographed long takes, involving dozens or hundreds of players, in movies as different as Absolute Beginners, Boogie Nights, and Hard Boiled. And if the hallway fight in Inception ranks among the most thrilling sequences of the decade, it's because we're witnessing something astonishing as it must have appeared that day on the set, with Joseph Gordon-Levitt getting battered by the walls of that rotating corridor.

So it’s worth taking a moment to remember that it’s not the long take itself that matters, but the fact that it puts us in the filmmaker’s shoes, which we lose when an extended take is the result of digital trickery. I’m as big a fan as any of the opening shot of Gravity, which recently made my updated list of the greatest movie openings of all time, but there’s no escaping the fact that we’re seeing something that has been invisibly stitched together over many different days of filming, and nearly everything in sight has been constructed through visual effects. This doesn’t make it any less miraculous: along with Life of Pi, it marks a turning point, at least for me, in which digital effects finally live up to their promise of giving us something that can’t be distinguished from reality. But it’s a triumph of vision, planning, and conceptual audacity, without the extra frisson that arises from the sustained tightrope act of an extended shot done in the camera. As time goes by, it will become easier to create this sort of effect from multiple takes, as Cuarón himself did so brilliantly in Children of Men. But it can’t compare to the conspiratorial tension we get from a true tracking shot, done with the full possibility of a disastrous mistake, in which the movies, so often crafted from tricks and illusions, really do seem to defy gravity.

Written by nevalalee

December 26, 2013 at 9:10 am

The best closing shots in film

leave a comment »

Lawrence of Arabia

Note: Since I’m taking a deserved break for the holidays, I’m reposting a couple of my favorite entries from early in this blog’s run. This post was originally published, in a slightly different form, on January 13, 2011. Visual spoilers follow. Cover your eyes!

As I’ve noted before, the last line of a novel is almost always of interest, but the last line of a movie generally isn’t. It isn’t hard to understand why: movies are primarily a visual medium, and there’s a sense in which even the most brilliant dialogue can often seem beside the point. And as much the writer in me wants to believe otherwise, audiences don’t go to the movies to listen to words: they go to look at pictures.

Perhaps inevitably, then, there are significantly more great closing shots in film than there are great curtain lines. Indeed, the last shot of nearly every great film is memorable, so the list of finalists can easily expand into the dozens. Here, though, in no particular order, are twelve of my favorites.

The power of clichés

with 4 comments

Brian Eno

Over the last few weeks, I've become fascinated with Brian Eno's Oblique Strategies. I've always been drawn to the creative possibilities of randomness, and this is a particularly interesting example: in its original form, it's a deck of cards, designed to be drawn from at random, each of which contains a single short aphorism, paradox, or suggestion intended to help break creative blocks. The tone of the aphorisms ranges from practical to gnomic to cheeky—"Overtly resist change," "Turn it upside down," "Is the tuning appropriate?"—but their overall intention is to gently disrupt the approach you've been taking toward the problem at hand, which often involves inverting your assumptions. This morning, for instance, when I drew a random card from the excellent online version, the result was: "Use clichés." At first glance, this seems like strange counsel, since most of us try to follow William Safire's advice to avoid clichés like the plague. In reality, though, it's a useful reminder that clichés do have their place, at least for an artist who has the skill and experience to deploy them correctly.

A cliché, by definition, is a unit of language or narrative that is already familiar to the reader, often to the point of losing all meaning. At their worst, clichés shut down thought by substituting a stereotyped formula for actual engagement with the subject. Still, there are times when this kind of conceptual invisibility can be useful. Songwriters, in particular, know that clichés can be an invaluable way of managing complexity within a piece of music, which often incorporates lulls or repetition as a courtesy to the listener. Paul Simon says it best:

So when I begin, I usually improvise a melody and sing words—and often those words are just clichés. If it is an old songwriting cliché, most of the time I throw it away, but sometimes I keep it, because they’re nice to have. They’re familiar. They’re like a breather for the listener. You can stop wondering or thinking for a little while and just float along with the music.

This kind of pause is one of the subtlest of all artistic tools: it provides a moment of consolidation, allowing the listener—or reader—to process the information presented so far. When we hear or read a cliché, we don’t need to pay attention to it, and that license to relax can be crucial in a work of art that is otherwise dense and challenging.

Paul Simon

This is simply a particular case of a larger point I've made elsewhere, which is that not every page of a story can be pitched at the same level of complexity or intensity. With few exceptions, even the most compressed narratives need to periodically rise and fall, both to give the reader a break and to provide a contrast or baseline for more dramatic moments. As the blogger Mike Meginnis has pointed out, this is one reason that we find flat, cartoonish characters in the fiction of Thomas Pynchon: any attempt to create conventionally plausible personalities when the bounds of complexity are being pushed in every other direction would quickly become unmanageable. And I've pointed out before that the plot of a movie like Inception needs to be simpler than it seems at first glance: the characters are mostly defined by type, without any real surprises after they've been introduced, and once the premise has been established, the plot unfolds in a fairly straightforward way. Christopher Nolan is particularly shrewd at using the familiar tropes of the story he's telling—the thriller, the comic book movie, the heist film—to ground us on one level while challenging us on others, which is one reason why I embedded a conventional procedural story at the heart of The Icon Thief.

If there’s one place where clichés don’t work, however, it’s in the creation of character. Given the arguments above, it might seem fine to use stereotypes or stock characters in the supporting cast, which allows the reader to tune them out in favor of the more important players, but in practice, this approach can easily backfire. Simple characters have their place, but it’s best to convey this through clean, uncomplicated motivations: characters who fall too easily into familiar categories often reflect a failure of craft or diligence on the author’s part, and they tend to cloud the story—by substituting a list of stock behaviors for clear objectives—rather than to clarify it. And this applies just as much to attempts to avoid clichés by turning them on their heads. In an excellent list of rules for writing science fiction and fantasy, the author Terry Bisson notes: “Racial and sexual stereotypes are (still) default SF. Avoiding them takes more than reversals.” It isn’t enough, in other words, to make your lead female character really good at archery. Which only hints at the most important point of all: as Niels Bohr said, the opposite of a great truth is another great truth, and the opposite of a cliché is, well, another cliché.

Written by nevalalee

April 23, 2013 at 8:37 am

The problem of narrative complexity

with 5 comments

David Foster Wallace

Earlier this month, faced with a break between projects, I began reading Infinite Jest for the first time. If you’re anything like me, this is a book you’ve been regarding with apprehension for a while now—I bought my copy five or six years ago, and it’s followed me through at least three moves without being opened beyond the first page. At the moment, I’m a couple of hundred pages in, and although I’m enjoying it, I’m also glad I waited: Wallace is tremendously original, but he also pushes against his predecessors, particularly Pynchon, in fascinating ways, and I’m better equipped to engage him now than I would have been earlier on. The fact that I’ve published two novels in the meantime also helps. As a writer, I’m endlessly fascinated by the problem of managing complexity—of giving a reader enough intermediate rewards to justify the demands the author makes—and Wallace handles this beautifully. Dave Eggers, in the introduction to the edition I’m reading now, does a nice job of summing it up:

A Wallace reader gets the impression of being in a room with a very talkative and brilliant uncle or cousin who, just when he’s about to push it too far, to try our patience with too much detail, has the good sense to throw in a good lowbrow joke.

And the ability to balance payoff with frustration is a quality shared by many of our greatest novels. It's relatively easy to write an impenetrable book that tries the reader's patience, just as it's easy to create a difficult video game that drives players up the wall, but parceling out small satisfactions to balance out the hard parts takes craft and experience. Mike Meginnis of Uncanny Valley makes a similar point in an excellent blog post about the narrative lessons of video games. While discussing the problem of rules and game mechanics, he writes:

In short, while it might seem that richness suggests excess and maximal inclusion, we actually need to be selective about the elements we include, or the novel will not be rich so much as an incomprehensible blur, a smear of language. Think about the very real limitations of Pynchon as a novelist: many complain about his flat characters and slapstick humor, but without those elements to manage the text and simplify it, his already dangerously complex fiction would become unreadable.

Pynchon, of course, casts a huge shadow over Wallace—sometimes literally, as when two characters in Infinite Jest contemplate their vast silhouettes while standing on a mountain range, as another pair does in Gravity’s Rainbow. And I’m curious to see how Wallace, who seems much more interested than Pynchon in creating plausible human beings, deals with this particular problem.

Inception

The problem of managing complexity is one that has come up on this blog several times, notably in my discussion of the work of Christopher Nolan: Inception's characters, however appealing, are basically flat, and the action is surprisingly straightforward once we've accepted the premise. Otherwise, the movie would fall apart from trying to push complexity in more than one direction at once. Even works that we don't normally consider accessible to a casual reader often incorporate elements of selection or order into their design. The Homeric parallels in Joyce's Ulysses are sometimes dismissed as an irrelevant trick—Borges, in particular, didn't find them interesting—but they're very helpful for a reader trying to cut a path through the novel for the first time. When Joyce dispensed with that device, the result was Finnegans Wake, a novel greatly admired and rarely read. That's why encyclopedic fictions, from The Divine Comedy to Moby-Dick, tend to be organized around a journey or other familiar structure, which gives the reader a compass and map to navigate the authorial wilderness.

On a more modest level, I’ve frequently found myself doing this in my own work. I’ve mentioned before that I wanted one of the three narrative strands in The Icon Thief to be a police procedural, which, with its familiar beats and elements, would serve as a kind of thread to pull the reader past some of the book’s complexities. More generally, this is the real purpose of plot. Kurt Vonnegut, who was right about almost everything, says as much in one of those writing aphorisms that I never tire of quoting:

I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading.

The emphasis is mine. Plot is really a way of easing the reader into that greatest of imaginative leaps, which all stories, whatever their ambitions, have in common: the illusion that these events are really taking place, and that characters who never existed are worthy of our attention and sympathy. Plot, structure, and other incidental pleasures are what keep the reader nourished while the real work of the story is taking place. If we take it for granted, it’s because it’s a trick that most storytellers learned a long time ago. But the closer we look at its apparent simplicity, the sooner we realize that, well, it’s complicated.

Making the impossible plausible

with 2 comments

Kim Novak and James Stewart in Vertigo

Ideally, all stories should consist of a series of events that arise organically from the characters and their decisions, based on a rigorous understanding of how the world really works. In practice, and especially in genre fiction, it doesn't always happen that way. Sometimes a writer just has a plot development that he really wants to write, and isn't entirely sure how to get there from here. Frequently he'll construct a story out of a number of unrelated ideas, and needs to cobble them together in a way that will seem inevitable after the fact. And sometimes he'll simply paint himself into a corner, or realize that he's overlooked a monstrous plot hole, and wants to extricate himself in a way that leaves his dignity—and most of the earlier material—intact. A purist might say that the author should throw the story out and start again, proceeding more honestly from first principles, but a working writer doesn't always have that luxury. Better, I think, to find a way of keeping the parts that work and smoothing over the connective bits that seem implausible or unconvincing, while keeping the reader immersed in the fictional dream. And in the spirit of faking it until you make it, I offer the following suggestions:

1. Make it a fait accompli. As I’ve mentioned before, a reader is much more likely to accept a farfetched narrative development if the characters take it for granted. Usually, this means putting the weakest link in your story offstage. My favorite example is from Some Like It Hot, an incredibly contrived movie that shrewdly refuses to show the most crucial moment in the entire plot: instead of giving us a scene in which the main characters decide to go on the run in drag, it just cuts to the two of them already in skirts, rushing across the platform to catch a train to Florida. The lesson, clearly, is that if something in your story is obviously impossible, it’s better to pretend that it’s already happened. And the best strategy of all is to push the most implausible element of your story outside the boundaries of the plot itself, so it’s already in place before the story begins, which is what I’ve previously called the anthropic principle of fiction. If the viewer doesn’t see something happen, it requires an additional mental effort to rewind the story to object to it, and by then, the plot and characters have already moved on. It’s best to make like Jack Lemmon in heels, and just run with it.

Tony Curtis and Jack Lemmon in Some Like It Hot

2. Tell, don’t show. Normally, we’re trained to depict a turning point in the action as vividly as possible, and are taught that it’s bad form to describe an important moment indirectly or leave it offstage. When it comes to a weak point in the plot, however, that sort of scrutiny can only raise questions. It’s smarter, instead, to break a cardinal rule of fiction and get it out of the way as unobtrusively as possible. An implausible conversation, for instance, might best be rendered as indirect dialogue, leaving readers to fill in a more convincing version themselves. And if you can’t dramatize something in a credible fashion, it might be best to summarize it, in the way dramatists use the arrival of a messenger to convey developments that would be impossible to stage, although it’s best to keep this sort of thing as short as you can. There’s a particularly gorgeous example in the novel The Silence of the Lambs. Thomas Harris shows us Hannibal Lecter’s escape from federal security in loving detail, but when it comes to the most astonishing element of his getaway—the fact that he peels off another man’s face and wears it like a mask—he lets Jack Crawford describe it after the fact in a few terse sentences. It’s still hard to buy, but it’s more acceptable than if he’d allowed us to question the action as it unfolded.

3. Use misdirection. The secret of sleight of hand is that the audience’s eye is naturally drawn to action, humor, and color, allowing the performer to conduct outrageous manipulations in plain sight. Similarly, when a story is engaging enough from moment to moment, the reader is less likely to object to inconsistencies and plot holes. A film like Inception, for instance—which is probably my favorite movie of the last fifteen years—is riddled with logical problems, and even the basic workings of its premise aren’t entirely consistent, but we’re having too much fun to care. In some ways, this is the most important point of all: when a work of art is entertaining, involving, and emotionally true, we’re more likely to forgive the moments when the plot creaks. Some of our greatest books and movies, like Vertigo, are dazzling precisely because they expend a huge amount of effort to convince us of premises that, if the artist proceeded with uncompromising logic, would never make it past a rough draft. Just remember, as Aristotle pointed out, that a convincing impossibility is preferable to an unconvincing possibility, and take it from there.

Written by nevalalee

February 1, 2013 at 9:50 am

Looper and the secret of good science fiction

with 4 comments

There are a lot of things to recommend about Looper, the excellent new science-fiction thriller from writer and director Rian Johnson, but one of my favorite elements is the movie’s time machine. It looks something like an industrial washer-dryer, and we only see it for a few seconds, housed in a dingy warehouse somewhere in China. To use it, you just shove someone inside, and he comes out the other end at a specific location thirty years in the past. None of the characters seem especially interested in knowing how it works, any more than we’d be curious about, say, the mechanics of our local subway—and this is exactly how it should be. Like Inception, which never really explains its dream invasion technology, Looper takes its biggest imaginative leap for granted, which accounts for a lot of its brainy but grounded appeal. (Actually, to be perfectly accurate, time travel is only the second-biggest imaginative leap in the movie…but I can’t say anything more without giving the plot away.)

This is how science fiction ought to be: less science, more fiction. I don’t know what the writing process behind Looper was like, but I imagine that Johnson received a fair amount of pressure from outside readers to spell out this information in greater detail—studio executives love exposition—and managed to resist it. (Evidently, Johnson shot, or at least conceived, a special-effects sequence depicting the process of time travel, with the help of Primer director Shane Carruth, but none of this seems to have survived in the final cut.) Instead, he takes time travel as a given and uses it to tell a complicated but always lucid story that cleverly teases out the potential of its premise. I’m a sucker for time travel movies with even a modicum of ambition—I even liked Déjà Vu—and Looper deserves a lot of credit for presenting its paradoxes without holding the audience’s hand. It’s hard to overstate how difficult this is, and one of the movie’s great virtues is that it makes it look so easy.

This is, in short, a very smart screenplay, and it’s one that I expect to cite approvingly at various points on this blog. Among other things, it provides one of the best recent examples of the anthropic principle of fiction, by casually introducing telekinesis as a minor plot point—certain characters can move small objects with their minds, but only at the level of a parlor trick—in order for it to pay off down the line in a major way. It doesn’t indulge in stylistic flourishes for their own sake, but it’s more than capable of big formal conceptions when necessary, as in one dazzling montage that follows one possible timeline over the course of three decades. It quietly develops two persuasive futures without making a point of it, and gives us an unusually interesting supporting cast. (I especially liked Jeff Daniels in the role of a man from the future, whose knowledge of coming events is rivaled only by that of Will McAvoy.) And it’s also ready to make its leads unsympathetic, as when the character played by Bruce Willis makes an agonizing choice that few other movies would be willing to follow to its logical conclusion.

If there’s one small disappointment that prevents Looper from becoming a stone classic out of the gate, it’s that its action isn’t quite as inventive as the story surrounding it. There’s nothing that says an innovative science-fiction thriller is required to deliver sensational action, but when you look at the short list of recent movies that have pushed the envelope in the genre—The Matrix, Minority Report, Children of Men, and Inception—you often find writers and directors who are just as eager to show us something new on a visceral level as to tell us a mind-bending story. Looper doesn’t seem as committed to redefining its boundaries in all directions, and its chases and gunfights are all fairly routine. (Its most memorable action beat is a direct lift from The Fury, but not remotely as effective.) Still, that shouldn’t minimize what Johnson has accomplished: he’s set a lot of challenges for himself, met nearly all of them, and come up with one of the two or three best movies I’ve seen all year.

Written by nevalalee

October 1, 2012 at 9:59 am

“A few moments earlier, on the other side of the estate…”

leave a comment »

(Note: This post is the nineteenth installment in my author’s commentary for The Icon Thief, covering Chapter 18. You can read the earlier installments here.)

Heist stories are fun for many reasons, but a lot of their appeal comes from the sense that they’re veiled allegories for the act of storytelling itself. We see this clearly in a movie like Inception, in which the various players can be interpreted as corresponding to analogous roles behind the camera—Cobb is the director, Saito the producer, Ariadne the set designer, Eames the primary actor, and Arthur is, I don’t know, the line producer, while Fischer, the mark, is a surrogate for the audience itself. (For what it’s worth, Christopher Nolan has stated that any such allegory was an unconscious one, although he seems to have embraced it after the fact.) Even in a novel, which is produced by a crew of one, there’s something in the structure of a heist that evokes a writer’s tools of the trade. It involves disguise, misdirection, perfect timing, and a ticking clock. If all goes well, it’s a well-oiled machine, and the target doesn’t even know that he’s been taken, at least not until later, when he goes back and puts together the pieces. And it’s no surprise that the heists contrived by writers, who spend most of their time constructing implausible machines, tend to be much more elaborate than their counterparts in the real world.

When I realized that I wanted to put a heist at the center of The Icon Thief, I was tickled by the opportunity, as well as somewhat daunted by the challenge. On the bright side, I had a lot of models to follow, so cobbling together a reasonable heist, in itself, was a fairly straightforward proposition. The trouble, of course, is that nearly everything in the heist genre has been done before. Every year seems to bring another movie centered on an impregnable safe or mansion, with a resourceful team of thieves—or screenwriters—determined to get inside. Audiences have seen it all. And I knew from early on that I wanted to make this heist a realistic one, without any laser grids or pressure-sensitive floors. I wanted the specifics to be clever, but not outside the means of a smart thief operating with limited resources. (A movie like Ocean's 11, as entertaining as it may be, raises the question of why a group of criminals with access to such funding and technology would bother to steal for a living.) As a result, when I began to plot out the heist that begins to unfold in Chapter 18, I had a clear set of goals, but I wasn't quite sure what form it would take.

The obvious place to begin was with the target itself. Consequently, I spent a memorable afternoon with a friend in the Hamptons, walking along Gin Lane, peeking over hedges, and generally acting as suspiciously as possible. The house that I describe here is a real mansion with more or less the physical setting that appears in the novel, with a mammoth hedge blocking it from the road, but a relatively accessible way in from the ocean side, where the property goes all the way down to the beach. I quickly decided that I wanted my thief to escape out the back way, onto the sand, where his getaway car would be waiting. On the way in, however, I wanted him to drive right through the gate. The crews in pickup trucks that I saw doing maintenance at many of these houses suggested one potential solution. And while I can’t quite remember how I came up with the final idea—a mid-engine pickup with an empty space under the hood large enough to allow two men to hide inside, undiscovered by security—I knew at once, when it occurred to me, that I’d found my way in.

The rest amounted to simple narrative mechanics. Following the anthropic principle of fiction that I mentioned earlier this week, I knew that I had to introduce the pickup early on, at least in the background, to make its ultimate use seem like less of a stretch—hence Sharkovsky's enthusiasm for trophy trucks, which pops up at several points earlier in the novel. This chapter also includes one of the rare scenes told from the point of view of someone other than one of the central characters, since I wanted to put the reader in the shoes of a security guard who checks the truck thoroughly before letting it through the front gate, but neglects to look under the hood. The result is one of the novel's more gimmicky moments, but I think it works. (Whether the arrangement that I describe in the book would actually function in real life is another matter, but at least it's not entirely implausible, which by the standards of the genre is more than enough.) Sometimes I wonder if it's too gimmicky, but that's one of the pleasures of suspense: I can honor the heist genre with a quick nod in its direction, then move on as realistically as I can. And this heist is far from over…

Written by nevalalee

September 28, 2012 at 9:50 am

Christopher Nolan and the maze of storytelling

with one comment

The release of the final trailer for The Dark Knight Rises gives me as good an excuse as any to talk once more about the work of Christopher Nolan, who, as I’ve said before, is the contemporary director who fills me with the most awe. Nolan has spent the past ten years pushing narrative complexity on the screenplay level as far as it will go while also mastering every aspect of large-scale blockbuster filmmaking, and along the way, he’s made some of the most commercially successful films of the decade while retaining a sensibility that remains uniquely his own. In particular, he returns repeatedly to issues of storytelling, and especially to the theme of how artists, for all their intelligence and preparation, can find themselves lost in their own labyrinths. Many works of art are ultimately about the process of their own creation, of course, but to a greater extent than usual, Nolan has subtly given us a portrait of the director himself—meticulous, resourceful, but also strangely ambivalent toward the use of his own considerable talents.

Yesterday, I referred to my notes toward a novel as urgent communications between my past and future selves, "a la Memento," but it was only after typing that sentence that I realized how accurate it really is. Leonard Shelby, the amnesiac played by Guy Pearce, is really a surrogate for the screenwriter: he's thrust into the middle of a story, without any context, and has to piece together not just what comes next, but what happened before. His notes, his visual aids, and especially the magnificent chart he hangs on his motel room wall are variations of the tools that a writer uses to keep himself oriented during a complex project—including, notably, Memento itself. It isn't hard to imagine Nolan and his brother Jonathan, who wrote the original story on which the screenplay is based, using similar charts to keep track of their insanely intricate narrative, with a protagonist who finally turns his own body into a sort of corkboard, only to end up stranded in his own delusions.

This theme is explored repeatedly in Nolan’s subsequent films—notably The Prestige, in which the script’s endless talk about magic and sleight of hand is really a way of preparing us for the trick the movie is trying to play on the audience—but it reaches its fullest form in Inception. If Memento is a portrait of the independent screenwriter, lonely, paranoid, and surrounded by fragments of his own stories, Inception is an allegory for blockbuster moviemaking, with a central figure clearly based on the director himself. Many viewers have noted the rather startling visual similarity between Nolan and his hero, and it’s easy to assign roles to each of the major characters: Cobb is the director, Saito the producer, Ariadne the art director, all working toward the same goal as that of the movie itself—to transport the viewer into a reality where the strangest things seem inevitable. While Nolan has claimed that such an allegory wasn’t intentional, Inception couldn’t have been conceived, at least not in its current form, by a man who hadn’t made several huge movies. And at the end, we’re given the sense that the artist himself has been caught in a web of his own design.

In this light, Nolan’s Batman movies start to seem like his least personal work, which is probably true, but his sensibility comes through here as well. Batman Begins has an art director’s fascination with how things are really made—like Batman’s cowl, assembled from parts from China and Singapore—and The Dark Knight takes the figure of the director as antihero to its limit. The more we watch it, the more Nolan seems to uneasily identify, not with Batman, but with the Joker, the organized, methodical, nearly omniscient toymaker who can only express himself through violence. If the wintry, elegiac tone of our early glimpses of The Dark Knight Rises is any indication, Nolan seems ready to move beyond this, much as Francis Coppola—also fond of directorial metaphors in his work—came to both to identify with Michael Corleone and to dislike the vision of the world he had expressed in The Godfather. And if Nolan evolves in similar ways, it implies that the most interesting phase of his career is yet to come.

Written by nevalalee

May 2, 2012 at 9:45 am
