Alec Nevala-Lee

Thoughts on art, culture, and the writing life.

Posts Tagged ‘Christopher Nolan’

The test of tone

with 2 comments

Brendan Gleeson and Colin Farrell in In Bruges

Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.

The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so your reaction is often how Roger Ebert felt upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.

Hugh Dancy on Hannibal

As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.

At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the pleasures of following Hannibal this season has been watching the show acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history for it to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. I haven’t had a chance to catch Fargo on FX, but I’m curious about it, because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.

Written by nevalalee

April 22, 2014 at 9:22 am

The Tone Ranger

leave a comment »

Armie Hammer and Johnny Depp in The Lone Ranger

Last night, I watched The Lone Ranger. Given the fact that I haven’t yet seen 12 Years a Slave, Captain Phillips, or Before Midnight, this might seem like an odd choice. In my defense, I can only plead that on those rare evenings when my wife is out of the house, I usually seize the opportunity to watch something that I don’t think she’ll enjoy—the last time around it was Battle Royale. I’ve also been intrigued by The Lone Ranger ever since it flamed out in spectacular fashion last summer. Regular readers will know that I have a weakness for flops, and everything I’d read made me think that this was the kind of fascinating studio mess that I find impossible to resist. Quentin Tarantino’s guarded endorsement counted for a lot as well, and we’re already seeing the first rumblings of a revisionist take that sees the film as a neglected treasure. I wouldn’t go quite so far; it has significant problems, and I’m not surprised that the initial reaction was so underwhelming. But I liked it a lot all the same. It’s an engaging, sometimes funny, occasionally exciting movie with more invention and ambition than your average franchise installment, and I’d sooner watch its climactic train chase again than, say, most of The Avengers.

And what interests me the most is its most problematic element, which is the range of tones it encompasses. The Lone Ranger isn’t content just to be a Western; on some level, it wants to be all Westerns, quoting freely from Dead Man and Once Upon a Time in the West while also indulging in slapstick, adventure, gruesome violence, hints of the supernatural, and even moments of tragedy. It’s a revenge narrative by way of Blazing Saddles, and it’s no surprise that the result is all over the map. Part of this may be due to the sheer scale of the production—when someone gives you $200 million to make a Western, you may as well throw everything you can into the pot—but it’s also a reflection of the sensibilities involved. Director Gore Verbinski and screenwriters Ted Elliott and Terry Rossio had collaborated earlier, of course, on the Pirates of the Caribbean franchise, which gained a lot of mileage from a similar stylistic mishmash, though with drastically diminishing returns. And Verbinski at his best has the talent to pull it off: he combines the eye of Michael Bay with a real knack for comedy, and I predicted years ago that he’d win an Oscar one day. (He eventually did, for Rango.)

Gore Verbinski on the set of The Lone Ranger

But playing with tone is a dangerous thing, as we see in the later Pirates films, and The Lone Ranger only gets maybe eighty percent of the way to pulling it off. Watching it, I was reminded of what the screenwriter Tony Gilroy says in his contribution to William Goldman’s Which Lie Did I Tell? Gilroy starts by listing examples of movies that experiment with tone, both good (Dr. Strangelove, The Princess Bride) and bad (Batman and Robin, Year of the Comet), and concludes:

But tone? Tone scares me…Why? Because when it goes wrong it just sucks out loud. I think the audience—the reader—I think they make some critical decisions in the opening movements of a film. How deeply do I invest myself here? How much fun can I have? Should I be consciously referencing the rest of my life during the next two hours, or is this an experience I need to surrender to? Are you asking for my heart or my head or both? Am I rooting for the hero or the movie? Just how many pounds of disbelief are you gonna ask me to suspend before this is through?

The Lone Ranger tramples on all these questions, asking us to contemplate the slaughter of Comanches a few minutes before burying our heroes up to their necks in a nest of scorpions, and the fact that it holds together even as well as it does is a testament both to the skill of the filmmakers and the power of a strong visual style. If nothing else, it looks fantastic, which helps us over some of the rough spots, although not all of them.

And it’s perhaps no accident that William Goldman’s first great discovery of a new tone came in Butch Cassidy and the Sundance Kid. It’s possible that there’s something about the Western that encourages this kind of experimentation: all it needs is a few men and horses, and the genre has been so commercially weakened in recent years that filmmakers have the freedom to try whatever they think might work. It’s true that The Lone Ranger works best in its last forty minutes, when The William Tell Overture blasts over the soundtrack and it seems content to return to its roots as a cliffhanging serial, but when you compare even its most misguided digressions to the relentless sameness of tone in a Liam Neeson thriller or a Bourne knockoff, it feels weirdly like a step forward. (Even Christopher Nolan, a director I admire immensely, has trouble operating outside of a narrow, fundamentally serious tonal range—it’s his one great shortcoming as a storyteller.) Going to the movies every summer would be more fun in general if more megabudgeted blockbusters looked and felt like The Lone Ranger, and its failure means that we’re more likely to see the opposite.

Writing what you never knew you needed

with one comment

Moira Shearer in The Red Shoes

I’ve said more than once that whenever I start a new writing project, I’m trying to end up with a story that I want to read. That’s the kind of thing writers tend to say when asked why they’re drawn to certain types of material, and in a limited sense, it’s true. When I look back at my body of work, it’s clear that it reflects my own particular tastes in fiction: I like tight, detailed narratives with an emphasis on plot and unusual ideas, and for the most part, that’s the kind of story I’ve written. To the extent that the outcome ever surprises me, it’s usually because of the subject matter—I’ll often decide to write a story about a world I know little about, trusting in research and brainstorming to take me into unexpected places. That element of the unknown goes a long way toward keeping the process interesting, and one of the trickiest parts of being a writer is balancing the desire to express one’s own personality with the need to discover something new. The result, if I’ve done it right, is a story that contains a touch of the unanticipated while also looking more or less like the unwritten work I had in mind. Or, as the artist Carl Andre puts it: “A creative person is a person who simply has a desire…to add something to the world that’s not there yet, and goes about arranging for that to happen.”

But there’s an inherent shortcoming to this approach, which lies in the fact that the works of art that matter the most to us often show us something we never knew we needed. When I think about the movies I love, for instance, they tend to be films that blindsided me completely, either with their stories themselves or with the way in which they were told. Knowing what I know about myself, it doesn’t come as a surprise that I’d enjoy the hell out of a movie like Gravity or Inception, but I never would have expected that my favorite movie of all time would turn out to be The Red Shoes, or that I’d passionately love recent films as different as Once, In Bruges, and Certified Copy. These are movies that snuck into my heart, rather than selling me in advance on their intentions, and I feel all the more grateful because they modestly expanded my sense of the possible. As much as I admire a director like Christopher Nolan, there’s no question that he’s primarily adept at delivering exactly the kind of movie that I think I want: a big, expensive, formally ambitious entertainment with just enough complexity to set it apart from the work of other skilled popular filmmakers. And while Nolan’s career has been extraordinary, it’s of a different order entirely from that of, say, Wong Kar-Wai, who at his best made small, messy, gorgeous movies that I never could have imagined on my own.

William Shimell and Juliette Binoche in Certified Copy

The same is true of fiction. Looking back over the list of my own favorite novels, surprisingly few resemble the stories I’ve tried to write myself. I love these books because they come from places that I haven’t explored firsthand, whether it’s the sustained performance of a massive novel of ideas like The Magic Mountain or a bejeweled toy like Dictionary of the Khazars. When it comes to novels that stick more closely to the categories that I understand from the inside, like The Day of the Jackal or The Silence of the Lambs, my appreciation is a little different: it’s a respect for craft, for the flawless execution of a genre I know well, and although nothing can diminish my admiration for these books, it’s altogether different from the feeling I get from a novel that comes to us as a fantastic mythical beast, or as a dispatch from some heretofore unexplored country. And it doesn’t need to be deliberately difficult or obscure. Books from Catch-22 to The Time Traveler’s Wife have left me with the sense that I’ve finished reading something that nobody else, least of all me, could have pulled off. (It’s also no accident that it took me a long time to get around to many of the books I’ve mentioned above. More than even the most difficult movies, few of which demand more than two or three hours of our attention, a novel that doesn’t resemble the ones we’ve read before demands a considerable leap of faith.)

That said, I don’t know if it’s possible for writers to feel that way about their own work, especially not for something the size of a novel, in which any flashes of outside inspiration need to share space with months or years of continuous effort. (A short story or poem, which can be conceived and written in a more compressed window of time, is more likely to retain some of that initial strangeness.) But it does imply that writing only the kinds of stories we already like goes only part of the way toward fulfilling our deepest artistic needs. A reader who spends his or her life reading only one kind of book—romance, fantasy, science fiction—ends up with a limited imaginative palate, and a big part of our literary education comes from striking out into books that might seem unfamiliar or uninviting. For writers, this means following a story wherever it takes us, giving up some measure of control, and even deliberately pushing forward into areas of writing that we don’t fully understand, trusting that we’ll find something new and worthwhile along the way. Like all ventures into the unknown, it carries a degree of risk, and we may find that we’ve invested time and energy that we can’t recover into a story that was never meant to be. But it’s far more dangerous to never take that risk in the first place.

Written by nevalalee

January 29, 2014 at 9:56 am

The lost art of the extended take

with 8 comments

Karen Allen in Raiders of the Lost Ark

For Christmas, I got my wife a copy of The Wes Anderson Collection by Matt Zoller Seitz, which is one of those ideal presents that the giver buys for the recipient because he secretly wants it for himself—I’ve spent at least as much time browsing through it as she has. It’s a beautiful book of interviews with a fascinating subject, and I suspect that it will provide a lot of material for this blog. Today, though, I’d like to focus on one short exchange, which occurs during a discussion of Anderson’s use of extended tracking shots. Seitz points to the drinking contest in Raiders of the Lost Ark as an example of a great director subtly shooting a long scene in a single take without cuts, and shrewdly notes that our knowledge that the action is unfolding in real time subliminally increases the suspense. Anderson agrees: “You’re not only waiting to see who’s going to get knocked out with the liquor; you’re waiting to see who’s going to screw up the take.” Elsewhere, Seitz has written of how the way the scene was shot adds “a second, subtle layer of tension to an already snappy scene…our subliminal awareness that we’re seeing a filmed live performance, and our sporting interest in seeing how long they can keep it going.”

This is a beautiful notion, because it exemplifies a quality that many of my favorite films share: the fictional story that the movie is telling shades imperceptibly into the factual story of how the movie itself was made, which unfolds in parallel to the main action, both invisibly and right in front of our eyes. It’s something like Truffaut’s statement that a movie should simultaneously express “an idea of life and an idea of cinema,” but it’s less about any specific philosophical idea than a sense that the narrative that the movie presents to us is a metaphor for its own creation. We see this in a movie like Citizen Kane, in which it’s hard not to read the youthful excitement of Kane’s early days at the Inquirer as a portrait of Orson Welles arriving on the RKO lot, and its later, disillusioned passages as a weird prefiguring of what would happen to Welles decades down the line; or even a movie like Inception, in which the roles of the participants in the mind heist correspond to those of the team behind the camera—the director, the producer, the production designer—and the star looks a little like Chris Nolan himself. (Someone, possibly me, should really make a slideshow on how directors tend to cast leading roles with their own doubles, as Anderson often does as well.)

Gravity

And the ultimate expression of the marriage between the filmed story and the story of its creation is the extended shot. It’s a moment in which the movie we’re watching fuses uncannily with its own behind-the-scenes documentary: for a minute or two, we’re on the set, watching the action at the director’s side, and the result is charged with the excitement of live performance. If every cut, as Godard says, is a lie, a continuous take brings us as close to the truth—or at least to a clever simulacrum of it—as the movies can manage. It doesn’t need to be overtly flashy, either: I’ve never seen a better use of an extended take than in the party scene in 4 Months, 3 Weeks, and 2 Days, in which the camera remains stationary for an entire reel. But there’s also a childlike pleasure in seeing filmmakers taking a big risk and getting away with it. You see this in the massively choreographed long takes, involving dozens or hundreds of players, in movies as different as Absolute Beginners, Boogie Nights, and Hard Boiled. And if the hallway fight in Inception ranks among the most thrilling sequences of the decade, it’s because we’re witnessing something astonishing as it must have appeared that day on the set, with Joseph Gordon-Levitt getting battered by the walls of that rotating corridor.

So it’s worth taking a moment to remember that it’s not the long take itself that matters, but the fact that it puts us in the filmmaker’s shoes, which we lose when an extended take is the result of digital trickery. I’m as big a fan as any of the opening shot of Gravity, which recently made my updated list of the greatest movie openings of all time, but there’s no escaping the fact that we’re seeing something that has been invisibly stitched together over many different days of filming, and nearly everything in sight has been constructed through visual effects. This doesn’t make it any less miraculous: along with Life of Pi, it marks a turning point, at least for me, in which digital effects finally live up to their promise of giving us something that can’t be distinguished from reality. But it’s a triumph of vision, planning, and conceptual audacity, without the extra frisson that arises from the sustained tightrope act of an extended shot done in the camera. As time goes by, it will become easier to create this sort of effect from multiple takes, as Cuarón himself did so brilliantly in Children of Men. But it can’t compare to the conspiratorial tension we get from a true tracking shot, done with the full possibility of a disastrous mistake, in which the movies, so often crafted from tricks and illusions, really do seem to defy gravity.

Written by nevalalee

December 26, 2013 at 9:10 am

Man and supermen

leave a comment »

Man of Steel

I’m starting to come to terms with an uncomfortable realization: I don’t much like The Avengers. Watching it again recently on Netflix, I was impressed by how fluidly it constructs an engaging movie out of so many prefabricated parts, but I couldn’t help noticing how arbitrary much of it seems. Much of the second act, in particular, feels like it’s killing time, and nothing seems all that essential: it clips along nicely, but the action scenes follow one another without building, and the stakes never feel especially high, even as the fate of the world hangs in the balance. And I don’t think this is Joss Whedon’s fault. He comes up with an entertaining package, but he’s caught between the need to play with all the toys he’s been given and the obligation to deliver them intact to their next three movies. Each hero has his or her own franchise where the real story development takes place, so The Avengers begins to play like a sideshow, rather than the main event it could have been. This is a story about these characters, not the story, and for all its color and energy, it’s a movie devoted to preserving the status quo. (Even its most memorable moment seems to have been retconned out of existence by the upcoming Agents of S.H.I.E.L.D.)

And while it may seem pointless to worry about this now, I think it’s worth asking what kind of comic book movies we really want, now that it seems that they’re going to dominate every summer for the foreseeable future. I’ve been pondering this even more since finally seeing Man of Steel, which I liked a lot. It has huge problems, above all the fact that its vision of Superman never quite comes into focus: by isolating him from his supporting cast for much of the movie, it blurs his identity to the point where major turning points, like his decision to embrace his role as a hero, flit by almost unnoticed. Yet once it ditches its awkward flashback structure, the movie starts to work, and its last hour has a real sense of awe, scale, and danger. And I’m looking forward to the inevitable sequel, even if it remains unclear if Henry Cavill—much less Zack Snyder or Christopher Nolan—can give the scenes set at the Daily Planet the necessary zest. At their best, the Superman films evoke a line of classic newspaper comedies that extends back to His Girl Friday and even Citizen Kane, and it’s in his ability to both wear the suit and occupy the skin of Clark Kent that Christopher Reeve is most sorely missed.

Joss Whedon on the set of The Avengers

If nothing else, Man of Steel at least has a point of view about its material, however clouded it might be, which is exactly what most of the Marvel Universe movies are lacking. At this point, when dazzling special effects can be taken for granted, what we need more than anything is a perspective toward these heroes that doesn’t feel as if it were dictated solely by a marketing department. Marvel itself doesn’t have much of an incentive to change its way of doing business: it’s earned a ton of money with this approach, and these movies have made a lot of people happy. But I’d still rather watch Chris Nolan’s Batman films, or even an insanity like Watchmen or Ang Lee’s Hulk, than yet another impersonal raid on the Marvel toy chest. Whedon himself is more than capable of imposing an idiosyncratic take on his projects, and even though it only intermittently comes through in The Avengers itself, I’m hopeful that its success will allow him to express himself more clearly in the future—which is one reason why I’m looking forward to Agents of S.H.I.E.L.D., which seems more geared toward his strengths.

And although I love Nolan’s take on the material, it doesn’t need to be dark, or even particularly ambitious. For an illustration, we need look no further than Captain America, which increasingly seems to me like the best of the Marvel movies. Joe Johnston’s Spielberg imitation is the most credible we’ve seen in a long time—even better, in many ways, than Spielberg himself has managed recently with similar material—and you can sense his joy at being given a chance to make his own Raiders knockoff. Watching it again last night, even on the small screen, I was utterly charmed by almost every frame. It’s a goof, but charged with huge affection toward its sources, and I suspect that it will hold up better over time than anyone could have anticipated. Unfortunately, it already feels like an anomaly. Much of its appeal is due to the period setting, which we’ve already lost for the sequel, and it looks like we’ve seen the last of Hugo Weaving’s Red Skull, who may well turn out to be the most memorable villain the Marvel movies will ever see. Marvel’s future is unlikely to be anything other than hugely profitable for all concerned, but it’s grown increasingly less interesting.

Written by nevalalee

July 9, 2013 at 8:54 am

The power of clichés

with 4 comments

Brian Eno

Over the last few weeks, I’ve become fascinated with Brian Eno’s Oblique Strategies. I’ve always been drawn to the creative possibilities of randomness, and this is a particularly interesting example: in its original form, it’s a deck of cards, designed to be drawn from at random, each of which contains a single short aphorism, paradox, or suggestion intended to help break creative blocks. The tone of the aphorisms ranges from practical to gnomic to cheeky—”Overtly resist change,” “Turn it upside down,” “Is the tuning appropriate?”—but their overall intention is to gently disrupt the approach you’ve been taking toward the problem at hand, which often involves inverting your assumptions. This morning, for instance, when I drew a random card from the excellent online version, the result was: “Use clichés.” At first glance, this seems like strange advice, since most of us try to follow William Safire’s advice to avoid clichés like the plague. In reality, though, it’s a useful reminder that clichés do have their place, at least for an artist who has the skill and experience to deploy them correctly.

A cliché, by definition, is a unit of language or narrative that is already familiar to the reader, often to the point of losing all meaning. At their worst, clichés shut down thought by substituting a stereotyped formula for actual engagement with the subject. Still, there are times when this kind of conceptual invisibility can be useful. Songwriters, in particular, know that clichés can be an invaluable way of managing complexity within a piece of music, which often incorporates lulls or repetition as a courtesy to the listener. Paul Simon says it best:

So when I begin, I usually improvise a melody and sing words—and often those words are just clichés. If it is an old songwriting cliché, most of the time I throw it away, but sometimes I keep it, because they’re nice to have. They’re familiar. They’re like a breather for the listener. You can stop wondering or thinking for a little while and just float along with the music.

This kind of pause is one of the subtlest of all artistic tools: it provides a moment of consolidation, allowing the listener—or reader—to process the information presented so far. When we hear or read a cliché, we don’t need to pay attention to it, and that license to relax can be crucial in a work of art that is otherwise dense and challenging.

Paul Simon

This is simply a particular case of a larger point I’ve made elsewhere, which is that not every page of a story can be pitched at the same level of complexity or intensity. With few exceptions, even the most compressed narratives need to periodically rise and fall, both to give the reader a break and to provide a contrast or baseline for more dramatic moments. As the blogger Mike Meginnis has pointed out, this is one reason that we find flat, cartoonish characters in the fiction of Thomas Pynchon: any attempt to create conventionally plausible personalities when the bounds of complexity are being pushed in every other direction would quickly become unmanageable. And I’ve pointed out before that the plot of a movie like Inception needs to be simpler than it seems at first glance: the characters are mostly defined by type, without any real surprises after they’ve been introduced, and once the premise has been established, the plot unfolds in a fairly straightforward way. Christopher Nolan is particularly shrewd at using the familiar tropes of the story he’s telling—the thriller, the comic book movie, the heist film—to ground us on one level while challenging us on others, which is one reason why I embedded a conventional procedural story at the heart of The Icon Thief.

If there’s one place where clichés don’t work, however, it’s in the creation of character. Given the arguments above, it might seem fine to use stereotypes or stock characters in the supporting cast, which allows the reader to tune them out in favor of the more important players, but in practice, this approach can easily backfire. Simple characters have their place, but it’s best to convey this through clean, uncomplicated motivations: characters who fall too easily into familiar categories often reflect a failure of craft or diligence on the author’s part, and they tend to cloud the story—by substituting a list of stock behaviors for clear objectives—rather than to clarify it. And this applies just as much to attempts to avoid clichés by turning them on their heads. In an excellent list of rules for writing science fiction and fantasy, the author Terry Bisson notes: “Racial and sexual stereotypes are (still) default SF. Avoiding them takes more than reversals.” It isn’t enough, in other words, to make your lead female character really good at archery. Which only hints at the most important point of all: as Niels Bohr said, the opposite of a great truth is another great truth, and the opposite of a cliché is, well, another cliché.

Written by nevalalee

April 23, 2013 at 8:37 am

The problem of narrative complexity

with 5 comments

David Foster Wallace

Earlier this month, faced with a break between projects, I began reading Infinite Jest for the first time. If you’re anything like me, this is a book you’ve been regarding with apprehension for a while now—I bought my copy five or six years ago, and it’s followed me through at least three moves without being opened beyond the first page. At the moment, I’m a couple of hundred pages in, and although I’m enjoying it, I’m also glad I waited: Wallace is tremendously original, but he also pushes against his predecessors, particularly Pynchon, in fascinating ways, and I’m better equipped to engage him now than I would have been earlier on. The fact that I’ve published two novels in the meantime also helps. As a writer, I’m endlessly fascinated by the problem of managing complexity—of giving a reader enough intermediate rewards to justify the demands the author makes—and Wallace handles this beautifully. Dave Eggers, in the introduction to the edition I’m reading now, does a nice job of summing it up:

A Wallace reader gets the impression of being in a room with a very talkative and brilliant uncle or cousin who, just when he’s about to push it too far, to try our patience with too much detail, has the good sense to throw in a good lowbrow joke.

And the ability to balance payoff with frustration is a quality shared by many of our greatest novels. It’s relatively easy to write an impenetrable book that tries the reader’s patience, just as it’s easy to create a difficult video game that drives players up the wall, but parceling out small satisfactions to balance out the hard parts takes craft and experience. Mike Meginnis of Uncanny Valley makes a similar point in an excellent blog post about the narrative lessons of video games. While discussing the problem of rules and game mechanics, he writes:

In short, while it might seem that richness suggests excess and maximal inclusion, we actually need to be selective about the elements we include, or the novel will not be rich so much as an incomprehensible blur, a smear of language. Think about the very real limitations of Pynchon as a novelist: many complain about his flat characters and slapstick humor, but without those elements to manage the text and simplify it, his already dangerously complex fiction would become unreadable.

Pynchon, of course, casts a huge shadow over Wallace—sometimes literally, as when two characters in Infinite Jest contemplate their vast silhouettes while standing on a mountain range, as another pair does in Gravity’s Rainbow. And I’m curious to see how Wallace, who seems much more interested than Pynchon in creating plausible human beings, deals with this particular problem.

Inception

The problem of managing complexity is one that has come up on this blog several times, notably in my discussion of the work of Christopher Nolan: Inception’s characters, however appealing, are basically flat, and the action is surprisingly straightforward once we’ve accepted the premise. Otherwise, the movie would fall apart from trying to push complexity in more than one direction at once. Even works that we don’t normally consider accessible to a casual reader often incorporate elements of selection or order into their design. The Homeric parallels in Joyce’s Ulysses are sometimes dismissed as an irrelevant trick—Borges, in particular, didn’t find them interesting—but they’re very helpful for a reader trying to cut a path through the novel for the first time. When Joyce dispensed with that device, the result was Finnegans Wake, a novel greatly admired and rarely read. That’s why encyclopedic fictions, from The Divine Comedy to Moby-Dick, tend to be organized around a journey or other familiar structure, which gives the reader a compass and map to navigate the authorial wilderness.

On a more modest level, I’ve frequently found myself doing this in my own work. I’ve mentioned before that I wanted one of the three narrative strands in The Icon Thief to be a police procedural, which, with its familiar beats and elements, would serve as a kind of thread to pull the reader past some of the book’s complexities. More generally, this is the real purpose of plot. Kurt Vonnegut, who was right about almost everything, says as much in one of those writing aphorisms that I never tire of quoting:

I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading.

The emphasis is mine. Plot is really a way of easing the reader into that greatest of imaginative leaps, which all stories, whatever their ambitions, have in common: the illusion that these events are really taking place, and that characters who never existed are worthy of our attention and sympathy. Plot, structure, and other incidental pleasures are what keep the reader nourished while the real work of the story is taking place. If we take it for granted, it’s because it’s a trick that most storytellers learned a long time ago. But the closer we look at its apparent simplicity, the sooner we realize that, well, it’s complicated.

Christopher Nolan on the art of the reaction shot

with one comment

Director Christopher Nolan

Jordan Goldberg: In closing, what would you guys say you’ve learned through this experience?

Christopher Nolan: I’ve learned to get more reaction shots. [All laugh.] I’ve learned you can never have too many reaction shots to something extraordinary. Just on a technical level. In order to portray an extraordinary figure in an ordinary world, you have to really invest in the reality of the ordinary and in the reactions of people to him. That, to me, was what was fun about taking on this character because it hadn’t been done before. He is such an extraordinary figure, but if you can believe in the world he’s in, you can really enjoy that extraordinariness and that theatricality.

The Dark Knight Trilogy: The Complete Screenplays

Written by nevalalee

December 8, 2012 at 9:50 am

“A few moments earlier, on the other side of the estate…”

leave a comment »

(Note: This post is the nineteenth installment in my author’s commentary for The Icon Thief, covering Chapter 18. You can read the earlier installments here.)

Heist stories are fun for many reasons, but a lot of their appeal comes from the sense that they’re veiled allegories for the act of storytelling itself. We see this clearly in a movie like Inception, in which the various players can be interpreted as corresponding to analogous roles behind the camera—Cobb is the director, Saito the producer, Ariadne the set designer, Eames the primary actor, and Arthur is, I don’t know, the line producer, while Fischer, the mark, is a surrogate for the audience itself. (For what it’s worth, Christopher Nolan has stated that any such allegory was an unconscious one, although he seems to have embraced it after the fact.) Even in a novel, which is produced by a crew of one, there’s something in the structure of a heist that evokes a writer’s tools of the trade. It involves disguise, misdirection, perfect timing, and a ticking clock. If all goes well, it’s a well-oiled machine, and the target doesn’t even know that he’s been taken, at least not until later, when he goes back and puts together the pieces. And it’s no surprise that the heists contrived by writers, who spend most of their time constructing implausible machines, tend to be much more elaborate than their counterparts in the real world.

When I realized that I wanted to put a heist at the center of The Icon Thief, I was tickled by the opportunity, as well as somewhat daunted by the challenge. On the bright side, I had a lot of models to follow, so cobbling together a reasonable heist, in itself, was a fairly straightforward proposition. The trouble, of course, is that nearly everything in the heist genre has been done before. Every year seems to bring another movie centered on an impregnable safe or mansion, with a resourceful team of thieves—or screenwriters—determined to get inside. Audiences have seen it all. And I knew from early on that I wanted to make this heist a realistic one, without any laser grids or pressure-sensitive floors. I wanted the specifics to be clever, but not outside the means of a smart thief operating with limited resources. (A movie like Ocean’s 11, as entertaining as it may be, raises the question of why a group of criminals with access to such funding and technology would bother to steal for a living.) As a result, when I began to plot out the heist that begins to unfold in Chapter 18, I had a clear set of goals, but I wasn’t quite sure what form it would take.

The obvious place to begin was with the target itself. Consequently, I spent a memorable afternoon with a friend in the Hamptons, walking along Gin Lane, peeking over hedges, and generally acting as suspiciously as possible. The house that I describe here is a real mansion with more or less the physical setting that appears in the novel, with a mammoth hedge blocking it from the road, but a relatively accessible way in from the ocean side, where the property goes all the way down to the beach. I quickly decided that I wanted my thief to escape out the back way, onto the sand, where his getaway car would be waiting. On the way in, however, I wanted him to drive right through the gate. The crews in pickup trucks that I saw doing maintenance at many of these houses suggested one potential solution. And while I can’t quite remember how I came up with the final idea—a mid-engine pickup with an empty space under the hood large enough to allow two men to hide inside, undiscovered by security—I knew at once, when it occurred to me, that I’d found my way in.

The rest amounted to simple narrative mechanics. Following the anthropic principle of fiction that I mentioned earlier this week, I knew that I had to introduce the pickup early on, at least in the background, to make its ultimate use seem like less of a stretch—hence Sharkovsky’s enthusiasm for trophy trucks, which pops up at several points earlier in the novel. This chapter also includes one of the rare scenes told from the point of view of someone other than one of the central characters, since I wanted to put the reader in the shoes of a security guard who checks the truck thoroughly before letting it through the front gate, but neglects to look under the hood. The result is one of the novel’s more gimmicky moments, but I think it works. (Whether the arrangement that I describe in the book would actually function in real life is another matter, but at least it’s not entirely implausible, which by the standards of the genre is more than enough.) Sometimes I wonder if it’s too gimmicky, but that’s one of the pleasures of suspense: I can honor the heist genre with a quick nod in its direction, then move on as realistically as I can. And this heist is far from over…

Written by nevalalee

September 28, 2012 at 9:50 am

Crossing the digital divide

leave a comment »

On Saturday, my wife and I went to the Siskel Center in Chicago to see the engaging new documentary Side by Side, which focuses on the recent shift toward digital filmmaking and its implications for movies as a whole. Despite some soporific narration by producer and interviewer Keanu Reeves—who is not a man who should ever be allowed to do voiceover—this is a smart, interesting film that treats us to a dazzling range of perspectives, many of them from artists I’ve discussed repeatedly on this blog: David Lynch, Christopher Nolan, David Fincher, George Lucas, Steven Soderbergh, Lars von Trier, and the indispensable Walter Murch, not to mention Martin Scorsese, James Cameron, Michael Ballhaus, Robert Rodriguez, the Wachowskis, and many more. And while the interviewees come down on various sides of the digital issue—Rodriguez is probably the most unapologetic defender, Nolan the greatest skeptic—there’s one clear message: digital filmmaking is here to stay, and movies will never be the same.

If there’s one thread that runs through the entire movie, it’s the tradeoffs that come when you trade an expensive, cumbersome, highly challenging medium for something considerably cheaper and easier. At first glance, the benefits are enormous: you can run the camera for as long as you like for next to nothing, allowing you to capture more material, and the relatively small size of digital cameras lets you bring them places and achieve effects that might have been impossible before. Digital photography allows for greater control over technical details like color correction; makes editing far less difficult, at least on a practical level; and offers access to advanced tools to filmmakers with limited budgets. Yet there are tradeoffs as well. Film is still capable of visual glories that digital can’t match, and it’s curious that a movie that features Nolan and his genius cinematographer Wally Pfister lacks a single mention of IMAX. (Despite the multiplicity of voices here, I would have loved to have heard from Brad Bird, who became famous working in an exclusively digital medium but still chose IMAX to film much of Mission: Impossible—Ghost Protocol.)

Still, as the movie demonstrates, resolution and image quality for digital video are advancing at an exponential rate, and within the next ten years or so, it’s possible that we won’t notice the difference between digital photography and even the highest-resolution images available on film. Even then, however, something vital threatens to be lost. As Greta Gerwig, of all people, points out, when there’s real film running through the camera, everyone on set takes the moment very seriously, an intensity that tends to be diminished when video is cheap. The end of constraints comes at the cost of a certain kind of serendipity: as Anne V. Coates, the editor of Lawrence of Arabia, reveals, the greatest cut in the history of movies was originally meant as a dissolve, but was discovered by accident in the editing room. And as both David Lynch and producer Lorenzo di Bonaventura note, the increased availability of digital filmmaking doesn’t necessarily mean that we’ll see a greater number of good movies. In fact, the opposite is more likely to be true, as digital technology lowers the barriers to entry for artists who may not be ready to release movies in the first place—the cinematic equivalent of Kindle publishing.

The answer, clearly, is that we need to continue to impose constraints even as we’re liberated by new technology. That sense of intensity that Gerwig mentions is something that directors can still create, but only if they consciously choose to do so. As I’ve argued before, with a nod to Walter Murch, it’s important to find analog moments in a digital world, by intentionally slowing down the process, using pen and paper, and embracing randomness and restriction whenever possible. Most of all, we need to find time to render, to acknowledge that even when digital technology cuts the production schedule in half, there’s still a necessary period in which works of art must be given time to ripen. David Lynch says he’s done with film, and he’s earned the right to make movies in any way he likes. But when I look at Inland Empire, I see an extraordinary movie that could have been far greater—and central to my own life—if, like Blue Velvet, it had been cut from three hours down to two. Digital technology makes it possible to avoid these hard choices. But that doesn’t mean we should.

Thoughts on a Dark Knight

leave a comment »

Let’s talk about scale. For much of the past decade, the major movie studios have waged a losing battle to keep audiences in theaters, while competing with the vast array of more convenient entertainment options available at home. Hollywood’s traditional response to the threat of new media has always been to offer greater spectacle, these days in the form of IMAX or 3D, with an additional surcharge, of course. But as the new formats bring us closer to the action, computerized effects push us further away. No matter how beautifully rendered a digital landscape may be, it’s still strangely airless and sterile, with a sense that we’re being given a view of more megapixels, not a window on the world. Even so immersive a film as Avatar ultimately keeps us at arm’s length: Pandora is a universe unto itself, yes, but it still sits comfortably on a hard drive at Weta. And for all their size and expense, most recent attempts to create this kind of immersion, from John Carter to The Avengers, fail to understand the truth about spectacle: large-scale formats are most exciting when they give us a vision of a real, tangible, photographed world.

This is why The Dark Knight Rises is such a landmark. Christopher Nolan, who cited the films of David Lean as an influence in Batman Begins, understands that the real appeal of the great Hollywood epics in VistaVision and Cinerama was the startling clarity and scope of the world they presented. It’s the kind of thing that can only be achieved on location, with practical effects, real stunts, aerial photography, and a cast of thousands. The Dark Knight Rises is packed with digital effects, but we’re never aware of them. Instead, we’re in the presence of a director luxuriating in the huge panoramic effects that IMAX affords—with image, with music, with sound—when trained on the right material on real city streets. As a result, it feels big in a way that no other movie has in a long time. Brad Bird achieved some of the same effect in Mission: Impossible—Ghost Protocol, but while Bird invited us to marvel at his surfaces, Nolan wants to plunge us into a world he’s created, and he uses the medium as it was meant to be used: to tell a rich, dense story about an entire city.

Even more than The Dark Knight, this final installment makes it clear that Nolan’s twin obsessions with epic filmmaking and narrative complexity aren’t two different impulses, but opposite sides of the same coin: the massive IMAX screen, which surrounds us with images of staggering detail, is the visual equivalent of what Nolan is trying to do with the stories he tells. One thinks of The Last Judgment, of Bruegel, of Bosch. And his narrative skills have only improved with time. The Dark Knight had a great script, but it occasionally seemed to strain under the weight of its ideas, until it came off as two hugely eventful movies packed into one. The new movie doesn’t quite reach the heights of its predecessor, but it’s also more confident and assured: we’re sucked in at once and held rapt for two hours and forty minutes. And Nolan seems to have gotten over his ambivalence about the character of Batman himself. He’s always been shy about the Batsuit, which served as a kinky reminder of the story’s comic book origins, but here, he keeps Bruce Wayne vulnerable and unmasked for as long as possible, until he becomes more of a hero than ever before.

This is, in short, something close to a masterpiece—not just a worthy conclusion to the best series of comic book movies ever made, but the year’s first really great studio film. And yet I do have one big complaint. I’ve spoken before about Hollywood’s weird obsession with secrets, in which it refuses to disclose simple information about a movie for no other reason than a fetish over secrecy for its own sake, when in fact the film itself has no interesting surprises. (See: Prometheus and Super 8.) The same impulse often applies to casting rumors. For The Dark Knight Rises, the studio adamantly refused to confirm who Anne Hathaway would be playing, despite it being fairly obvious, and did the same with the characters played by Tom Hardy and Joseph Gordon-Levitt. Yet even at the earliest point in the film’s production, it was made very clear that a certain character was going to be appearing in the film—thus ruining the movie’s one big surprise. In short, Hollywood has no idea what a secret is: it routinely hides information to no purpose, but then, when it really counts for once, it reveals it in a way that utterly destroys the filmmaker’s intentions. And there’s no other living director whose intentions deserve greater respect and admiration.

Whither Whedon?

leave a comment »

Over the weekend, along with everyone else in the Northern Hemisphere, my wife and I saw The Avengers. I’m not going to bother with a formal review, since there are plenty to go around, and in any case, if you haven’t already seen it, your mind is probably made up either way. I’ll just say that while I enjoyed it, this is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many, as well as a rather dull villain—far better if they had gone with the Red Skull of Captain America—so that a lot of the film, probably too much, is spent slotting all the components into place.

Still, once everything clicks, it moves along efficiently, if not always coherently, and it’s a bright, shiny toy for the eyes, certainly compared to the dreary Thor. It doesn’t force us to rethink what this genre is capable of doing, as The Dark Knight did, but it’s a movie that delivers exactly what audiences want, and perhaps a touch more, which is more than enough to earn the highest opening weekend in history. And this, more than anything else, puts its director in a peculiar position. Joss Whedon has made a career out of seeming to work against considerable obstacles, and never quite succeeding, except in the eyes of his devoted fans. Buffy switched networks; Firefly was canceled before its time; Dollhouse struggled on for two seasons in the face of considerable interference. All of his projects carry a wistful sense of what might have been, and throughout it all, Whedon has been his own best character, unfailingly insightful in interviews, gracious, funny, and brave, the underdog whose side he has always so eloquently taken.

So what happens when the underdog becomes responsible for a record-shattering blockbuster? The Avengers isn’t all that interesting as a movie—far less so than The Cabin in the Woods—but it’s fascinating as a portent of things to come. Whedon has delivered the kind of big popular success that can usually be cashed in for the equivalent of one free movie with unlimited studio resources, as if all the holes in his frequent shopper’s card had finally been punched. For most of his career, at least since Buffy, Whedon has had everything—charm, talent, an incredibly avid fanbase—except the one thing that a creative type needs to survive in Hollywood: power. Now, abruptly, he has oodles of it, obtained in the only way possible, by making an ungodly amount of money for a major studio. Which means that he’s suddenly in a position, real or imaginary, to make every fanboy’s dreams come true.

The question is what he intends to do with it. Unlike Christopher Nolan, he isn’t a director who seems to gain personal satisfaction from deepening and heightening someone else’s material, so The Avengers 2 doesn’t seem like the best use of his talents. Personally, I hope he pulls a Gary Ross, takes the money, and runs. He could probably make another Firefly movie, although that doesn’t seem likely at this point. He could make Goners. He could pick up an ailing franchise with fewer moving parts and do wonderful things with it—I hear that Green Lantern is available. Or, perhaps, he’ll surprise us. The Avengers isn’t a bad film, but it gives us only occasional glimpses of the full Whedon, peeking out from between those glossy toys, and those hints make you hunger for a big movie that he could control from beginning to end. For most of his career, fans have been wondering what he’d do with the full resources and freedom he’d long been denied—even as he seemed to thrive on the struggle. And if he’s as smart and brave as he’s always seemed, he won’t wait long to show us.

Written by nevalalee

May 7, 2012 at 9:50 am

Christopher Nolan and the maze of storytelling

with one comment

The release of the final trailer for The Dark Knight Rises gives me as good an excuse as any to talk once more about the work of Christopher Nolan, who, as I’ve said before, is the contemporary director who fills me with the most awe. Nolan has spent the past ten years pushing narrative complexity on the screenplay level as far as it will go while also mastering every aspect of large-scale blockbuster filmmaking, and along the way, he’s made some of the most commercially successful films of the decade while retaining a sensibility that remains uniquely his own. In particular, he returns repeatedly to issues of storytelling, and especially to the theme of how artists, for all their intelligence and preparation, can find themselves lost in their own labyrinths. Many works of art are ultimately about the process of their own creation, of course, but to a greater extent than usual, Nolan has subtly given us a portrait of the director himself—meticulous, resourceful, but also strangely ambivalent toward the use of his own considerable talents.

Yesterday, I referred to my notes toward a novel as urgent communications between my past and future selves, “a la Memento,” but it was only after typing that sentence that I realized how accurate it really is. Leonard Shelby, the amnesiac played by Guy Pearce, is really a surrogate for the screenwriter: he’s thrust into the middle of a story, without any context, and has to piece together not just what comes next, but what happened before. His notes, his visual aids, and especially the magnificent chart he hangs on his motel room wall are variations of the tools that a writer uses to keep himself oriented during a complex project—including, notably, Memento itself. It isn’t hard to imagine Nolan and his brother Jonathan, who wrote the original story on which the screenplay is based, using similar charts to keep track of their insanely intricate narrative, with a protagonist who finally turns his own body into a sort of corkboard, only to end up stranded in his own delusions.

This theme is explored repeatedly in Nolan’s subsequent films—notably The Prestige, in which the script’s endless talk about magic and sleight of hand is really a way of preparing us for the trick the movie is trying to play on the audience—but it reaches its fullest form in Inception. If Memento is a portrait of the independent screenwriter, lonely, paranoid, and surrounded by fragments of his own stories, Inception is an allegory for blockbuster moviemaking, with a central figure clearly based on the director himself. Many viewers have noted the rather startling visual similarity between Nolan and his hero, and it’s easy to assign roles to each of the major characters: Cobb is the director, Saito the producer, Ariadne the art director, all working toward the same goal as that of the movie itself—to transport the viewer into a reality where the strangest things seem inevitable. While Nolan has claimed that such an allegory wasn’t intentional, Inception couldn’t have been conceived, at least not in its current form, by a man who hadn’t made several huge movies. And at the end, we’re given the sense that the artist himself has been caught in a web of his own design.

In this light, Nolan’s Batman movies start to seem like his least personal work, which is probably true, but his sensibility comes through here as well. Batman Begins has an art director’s fascination with how things are really made—like Batman’s cowl, assembled from parts from China and Singapore—and The Dark Knight takes the figure of the director as antihero to its limit. The more we watch it, the more Nolan seems to uneasily identify, not with Batman, but with the Joker, the organized, methodical, nearly omniscient toymaker who can only express himself through violence. If the wintry, elegiac tone of our early glimpses of The Dark Knight Rises is any indication, Nolan seems ready to move beyond this, much as Francis Coppola—also fond of directorial metaphors in his work—came both to identify with Michael Corleone and to dislike the vision of the world he had expressed in The Godfather. And if Nolan evolves in similar ways, it implies that the most interesting phase of his career is yet to come.

Written by nevalalee

May 2, 2012 at 9:45 am

American exceptionalism

leave a comment »

I didn’t want to see Captain America. The trailer wasn’t great, Joe Johnston wasn’t exactly my idea of a dream director, and most of all, I was getting a little tired of superheroes. The fact that we’ve seen four major comic book adaptations this summer alone wasn’t the only reason. Ten years ago, a movie like Spider-Man felt like a cultural event, a movie that I’d been waiting decades to see. Today, they’ve become the norm, to the point where a movie that isn’t driven by digital effects and an existing comic book property seems strangely exotic. At worst, such movies come off as the cynical cash grabs that, frankly, most of them are, a trend epitomized by Green Lantern, a would-be marketing bonanza so calculated that an A.V. Club headline summed it up as “Superhero movies are popular right now. Here’s another one.”

Which is why it gives me no small pleasure to report that Captain America is a pretty good movie, and in ways that seem utterly reproducible. This isn’t a film like The Dark Knight, which seems like an increasingly isolated case of a genius director being given all the resources he needed to make a singular masterpiece. Captain America is more the work of talented journeymen, guys who like what they do and are reasonably skilled at it, and who care enough to give the audience a good time—presumably with the kind of movie that they’d enjoy seeing themselves. Joe Johnston is no Chris Nolan, but in his own way, he does an even more credible Spielberg imitation than the J.J. Abrams of Super 8, and to more of a purpose. If it’s ultimately a cash grab—as its closing minutes make excruciatingly clear—it’s also full-blooded and lovingly rendered.

As a result, it’s probably the comic book movie I enjoyed most this year. While it doesn’t have the icy elegance of X-Men: First Class, it has a better script (credited to Christopher Markus and Stephen McFeely), and it’s far superior to the muddled, halfhearted, and overpraised Thor. Part of this is because it’s the only recent superhero movie to manage a credible supervillain: in retrospect, Hugo Weaving’s Red Skull doesn’t do much more than strut around, but he’s still mostly glorious. It’s also one of the rare modern comic book movies that remembers that the audience might still like to see some occasional action. As Thor failed to understand, special effects alone aren’t enough: I’ve had my mind blown too many times before. Yet it’s still fun to see an expertly staged action scene that arises organically from the story, and Captain America has a good handful of those, at a time when I’ve almost forgotten what it was like to see one.

What Captain America does, then, isn’t rocket science: it’s what you’d expect from any big studio movie, done with a modicum of care, aiming to appeal to the largest possible audience. So why aren’t there more movies like this? Perhaps because it’s harder to do than it looks: for one thing, it requires a decent script, which, more than anything else, is the limiting factor in a movie’s quality, and can’t be fixed by throwing money at it. The more movies I see, the more I respect mainstream entertainment that tries to be more than disposable, an effort that can seem quixotic in an industry where Pirates of the Caribbean: On Stranger Tides earns a billion dollars worldwide. Like it or not, movies are going to look increasingly like this, which is why it’s a good idea to welcome quality wherever we find it. Because it isn’t enough for a superhero to be super anymore; he also needs to be special.

Transformers: Death of the Author

with 5 comments

Never has a city been more lovingly destroyed on camera than Chicago in Transformers: Dark of the Moon. By the time the movie reached its crowning orgy of destruction, my wife and I had been staring at the screen for close to ninety minutes, along with an enthusiastic crowd in the IMAX theater at Navy Pier. My wife had seen much of the movie being shot on Michigan Avenue, just up the street from her office at the Tribune Tower, and I think it was with a sort of grim amusement, or satisfaction, that she watched her own building crumble to pieces as an alien robot spacecraft crashed into its beautiful gothic buttresses. It’s an image that merits barely five seconds in the movie’s final hour, which devastates most of downtown Chicago in gorgeous, even sensual detail, but it still struck me as a pivotal moment in our personal experience of the movies. (And hasn’t the Tribune already suffered enough?)

Like its immediate predecessor, Transformers 3 is generally pretty lousy. (I actually liked the first one, which had the advantage of comparative novelty, as well as a genuinely nimble comic performance by Shia LaBeouf that both director and star have struggled to recreate ever since.) As a story, it’s ridiculous; as a perfunctory attempt at a coherent narrative, it’s vaguely insulting. It’s also staggeringly beautiful. For the first ten minutes, in particular, the IMAX screen becomes a transparent window onto the universe, delivering the kind of transcendent experience, with its view of millions of miles, that even Avatar couldn’t fully provide. And even after its nonstop visual and auditory assault has taken its toll on your senses, it still gives new meaning to the phrase “all the money is there on the screen.” Here, it feels like the cash used to render just one jaw-dropping frame could have been used to pay down much of the national debt.

As I watched Dark of the Moon, or rather was pummeled into submission by it, I had the nagging feeling that Armond White’s notoriously glowing review of Revenge of the Fallen deserved some kind of reappraisal. At the time, White was dismissed, not without reason, as a troll, for issuing such pronouncements as “In the history of motion pictures, Bay has created the best canted angles—ever.” And yet I don’t think he was trolling, or even entirely wrong: it’s just that he was one movie too early. Michael Bay’s genius, and I use this word deliberately, is visible in every shot of Dark of the Moon, but it’s weirdly overdeveloped in just one direction. Bay is like one of those strange extinct animals that got caught in an evolutionary arms race until they became all horns, claws, or teeth. While a director like Christopher Nolan continues to develop along every parameter of storytelling, Bay is nothing but a massive eye: cold, brilliant, and indifferent to story or feeling. And it’s pointless to deny his talents, as ridiculously squandered as they might be.

So what exactly am I saying here? To steal a phrase from Roger Ebert’s review of The Life Aquatic, I can’t recommend Transformers: Dark of the Moon, but I would not for one second discourage you from seeing it—provided that you shell out fifteen dollars or more for the real experience in IMAX 3-D, which Bay has lovingly bullied the nation’s projectionists into properly presenting. (On video, I suspect that you might have the same reaction that my wife and I did when we rented Revenge of the Fallen: within forty minutes, both of us had our laptops out.) It’s an objectively terrible movie that, subjectively, I can’t get out of my head. As an author, I’m horrified by it: it’s a reminder of how useless, or disposable, writers can be. I won’t go as far as to say that it’s a vision of my own obsolescence, or that I believe that the robots are our future. But at this point in history, the burden is on writers to demonstrate that we’re necessary. And the momentum isn’t on our side.

Source Code and the state of modern science fiction

leave a comment »

On Saturday, my wife and I finally saw Source Code, the new science fiction thriller directed by Moon‘s Duncan Jones. I liked Moon a lot, but wasn’t sure what to expect from his latest film, and was pleasantly surprised when it turned out to be the best new movie I’ve seen this year. Admittedly, this is rather faint praise—by any measure, this has been a slow three months for moviegoers. And Source Code has its share of problems. It unfolds almost perfectly for more than an hour, then gets mired in an ending that tries, not entirely successfully, to be emotionally resonant and tie up all its loose ends, testing the audience’s patience at the worst possible time. Still, I really enjoyed it. The story draws you in viscerally and is logically consistent, at least up to a point, and amounts to a rare example of real science fiction in a mainstream Hollywood movie.

By “real” science fiction, of course, I don’t mean that the science is plausible. The science in Source Code is cheerfully absurd, explained with a bit of handwaving about quantum mechanics and parabolic calculus, but the movie is unusual in having the courage to follow a tantalizing premise—what if you could repeatedly inhabit the mind of a dead man eight minutes before he died?—through most of its possible variations. This is what the best science fiction does: it starts with an outlandish idea and follows it relentlessly through all its implications, while never violating the rules that the story has established. And one of the subtlest pleasures of Ben Ripley’s screenplay for Source Code lies in its gradual reveal of what the rules actually are. (If anything, I wish I’d known less about the story before entering the theater.)

This may sound like a modest accomplishment, but it’s actually extraordinarily rare. Most of what we call science fiction in film is thinly veiled fantasy with a technological sheen. A movie like Avatar could be set almost anywhere—the futuristic trappings are incidental to a story that could have been lifted from any western or war movie. (Walter Murch even suggests that George Lucas based the plot of Star Wars on the work he did developing Apocalypse Now.) Star Trek was often a show about ideas, but its big-screen incarnation is much more about action and spectacle: Wrath of Khan, which I think is the best science fiction film ever made, has been aptly described as Horatio Hornblower in space. And many of the greatest sci-fi movies—Children of Men, Blade Runner, Brazil—are more about creating the look and feel of a speculative future than any sense of how it might actually work.

And this is exactly how it should be. Movies, after all, aren’t especially good at conveying ideas; a short story, or even an episode of a television show, is a much better vehicle for working out a clever premise than a feature film. Because movies are primarily about action, character, and image, it isn’t surprising that Hollywood has appropriated certain elements of science fiction and left the rest behind. What’s heartening about Source Code, especially so soon after the breakthrough of Inception, is how it harnesses its fairly ingenious premise to a story that works as pure entertainment. There’s something deeply satisfying about seeing the high and low aspects of the genre joined so seamlessly, and it requires a peculiar set of skills on the part of the director, who needs to be both fluent with action and committed to ideas. Chris Nolan is one; Duncan Jones, I’m excited to say, looks very much like another.

The singular destiny of David Fincher

with 4 comments

The most extraordinary thing about last night’s Academy Awards, which were otherwise inexplicably awkward, was the idea that in today’s Hollywood, five men like David Fincher, David O. Russell, Darren Aronofsky, and Joel and Ethan Coen could be competing for Best Director, with only the unstoppable force of Tom Hooper and The King’s Speech excluding Christopher Nolan from the final slot on that list. It was perhaps inevitable that Hooper would end up playing the spoiler, but despite the outcome, the sight of so many unpredictable, talented, and relatively young directors in one room was enough to make me feel lucky for the chance to watch their careers unfold—and that includes Hooper, as long as last night’s coronation doesn’t lull him into premature complacency. (His next big project, an adaptation of Les Misérables, doesn’t bode especially well.)

That said, David Fincher deserved to win. And one day he will. Of all the directors on that list, he’s the one who seems most capable of making a major movie that can stand with the greatest American films, which is something that I never would have guessed even five years ago. For a long time, Fincher struck me as the most erratic of technical perfectionists, at least as far as my own tastes were concerned: before The Social Network, he had made one of my favorite movies (Zodiac); one of my least favorite (Fight Club); one that was good, but limited (Seven); and several that I can barely remember (The Game, Panic Room, and the rest). But as of last night, he seems capable of anything—aside from the ambitious dead end of Benjamin Button, which only proves that Fincher needs to stay away from conventional prestige projects.

Because the crucial thing about Fincher is that his technical proficiency is the least interesting or distinctive thing about him. The world is full of directors who can do marvelous things with digital video, who know how to choreograph physical and verbal violence, and who display a fanatic’s obsession with art direction, sound, and special effects. What sets Fincher apart is his willingness, which even Nolan lacks, to lavish these considerable resources on small, surprising stories. Many of my favorite movies, from Ikiru to The Insider, are the result of a great director training his gifts on subjects that might seem better suited for television. The Social Network, which grows deeper and sadder the more often I watch it, belongs proudly to that tradition. And I have a feeling that an Oscar would have made it much harder for Fincher to continue along that path.

A win last night might also have calcified Fincher’s perfectionist habits into mere self-indulgence, which is a risk that will never entirely go away. Fincher has repeatedly demonstrated his ability to elicit fine performances from his actors, but his approach to filmmaking, with its countless takes, has more often been an emotional dead end for directors. In On Directing Film, David Mamet sums up the traditional case against multiple takes:

I’ve seen directors do as many as sixty takes of a shot. Now, any director who’s watched dailies knows that after the third or fourth take he can’t remember the first; and on the set, when shooting the tenth take, you can’t remember the purpose of the scene. And after shooting the twelfth, you can’t remember why you were born. Why do directors, then, shoot this many takes? Because they don’t know what they want to take a picture of. And they’re frightened.

Fincher, of course, is more likely to ask for a hundred takes of a shot than sixty. So far, the results speak for themselves: The Social Network and Zodiac are two of the most beautifully acted ensemble movies of the last decade. They’re so good, in fact, that they’ve singlehandedly forced me to rethink my own feelings about multiple takes in the digital era. In the old days, when film stock was too expensive to be kept running for long, the need to stop and restart the camera after every take quickly sucked all the energy out of a set. Now that videotape is essentially free, multiple takes become more of a chance to play and explore, and can result in acting of impressive nuance and subtlety. (In a recent post, David Bordwell does a nice job of highlighting how good Jesse Eisenberg’s performance in The Social Network really is.) But they’re only useful if the director remains hungry enough to channel these takes into unforgettable stories. An Oscar, I suspect, would have taken much of that hunger away.

My gut feeling, after last night, is that if Fincher continues to grow, his potential is limitless. Over the past few years, he has already matured from a director who, early on, seemed interested in design above all else to an artist whose technique is constantly in the service of story, as well as an authentic interest in his characters and the worlds they inhabit. This mixture of humanism (but not sentimentality) and technical virtuosity is precious and rare, and it’s enough to put Fincher at the head of his generation of filmmakers, as long as he continues to follow his gift into surprising places. At first glance, The Girl With the Dragon Tattoo seems like a step back, but at least it affords the range of tones and locations that he needs. And if last night’s loss forces him to search all the more urgently for great material, then perhaps we’re all better off in the end.

Guillermo’s Labyrinth

with 2 comments

Daniel Zalewski’s recent New Yorker piece on Guillermo del Toro, director of Pan’s Labyrinth and the Hellboy movies, is the most engaging profile I’ve read of any filmmaker in a long time. Much of this is due to the fact that del Toro himself is such an engaging character: enthusiastic and overweight, he’s part auteur and part fanboy, living in a house packed with ghouls and monsters, including many of the maquettes from his own movies. And the article itself is equally packed with insights into the creative process. On creature design:

Del Toro thinks that monsters should appear transformed when viewed from a fresh angle, lest the audience lose a sense of awe. Defining silhouettes is the first step in good monster design, he said. “Then you start playing with movement. The next element of design is color. And then finally—finally—comes detail. A lot of people go the other way, and just pile up a lot of detail.”

On Ray Harryhausen:

“He used to say, ‘Whenever you think of a creature, think of a lion—how a lion can be absolutely malignant or benign, majestic, depending on what it’s doing. If your creature cannot be in repose, then it’s a bad design.’”

And in an aside that might double as del Toro’s personal philosophy:

“In emotional genres, you cannot advocate good taste as an argument.”

Reading this article makes me freshly mourn the fact that del Toro won’t be directing The Hobbit. I like Peter Jackson well enough, but part of me feels that if del Toro had been allowed to apply his practical, physical approach to such a famous property—much as Christopher Nolan did with the effects in Inception—the history of popular filmmaking might have been different. As it stands, I can only hope that Universal gives the green light to del Toro’s adaptation of At the Mountains of Madness, a prospect that fills me with equal parts joy and eldritch terror. Judging from what I’ve heard so far, it sounds like del Toro is planning to make the monster movie to end all monster movies. Let’s all hope that he gets the chance.

The best movies of the year

leave a comment »

First, the bad news. This was a terrible year for movies. Some combination of recessionary cutbacks, the delayed effects of the writers’ strike, and a determination to convert every imaginable movie to muddy 3D resulted in stretches of up to two or three months when multiplexes were basically a wasteland. And even if this cinematic dead zone turns out to be temporary, it’s hard not to see it as karmic comeuppance for the Academy’s recent decision to bump the number of Best Picture nominees to ten, an act of desperation that is looking more misguided with every passing day. Still, there were some very good movies released this year, including one that ranks among the best I’ve ever seen. It’s almost enough to make me think that this year was better than it actually was:

1. Inception. After a decade of extraordinary productivity, Christopher Nolan is beginning to look like nothing so much as two great directors working as one: the first is obsessed with pushing the bounds of filmic complexity on the narrative level, while the other has devoted himself to mastering every aspect of modern blockbuster filmmaking. Inception is the ultimate result of this paradoxical partnership: it’s one of those rare movies in which every aspect of the production—acting, story, visual effects, art direction, stunts, music, editing, even costume design—is both immediately exhilarating and endlessly rewarding to contemplate. I only wish there were more of it.

2. Toy Story 3. I was hard on this movie yesterday, so let’s set the record straight: this is the best Pixar film since Finding Nemo, and one of the finest animated movies ever made. It’s touching, exciting, thematically rich, and very funny, with an enormous cast of characters—both existing and new—who are so engaging that I’m sad we won’t have a chance to see them in other stories. (Fanfic, as usual, is ready to come to the rescue.) It’s enough to make me wish that I were ten years younger, just so I could have grown up with these toys—and movies—on my playroom shelves.

3. The Social Network. Over the past few years, David Fincher has gone from being a stylish but chilly visual perfectionist to a director who can seemingly do anything. Zodiac was the best movie ever made about serial killers and journalism, as well as the best Bay Area picture since Vertigo; The Social Network, in turn, is the best Harvard movie of all time, as well as a layered, trashy story of money and friendship, with an Aaron Sorkin script that manages to evoke both John Hughes and Citizen Kane. It’s almost enough to make me excited about The Girl With the Dragon Tattoo.

4. Exit Through the Gift Shop. Even more than Inception, this was the best film of the year for inspiring endless heated debate. Months later, I’m still not sure what to think about the strange case of Banksy and Mr. Brainwash, which is some combination of cautionary tale, Horatio Alger story, fascinating reportage, and practical joke. I do know that it’s impossible to watch it without questioning your deepest assumptions about art, commerce, and the nature of documentary filmmaking. And even if it’s something of a put-on, which I think at least part of it is, it’s still the best movie of its kind since F for Fake.

5. The Ghost Writer. Roman Polanski’s modest but wickedly sophisticated thriller is a reminder that a movie doesn’t need to be big to be memorable. The ingredients couldn’t be simpler: a tight story, an impeccable cast (aside from Kim Cattrall’s distractingly plummy British accent), and an isolated house on the beach. The result is one of the great places in the movies, as real as Hannibal Lecter’s cell or the detective’s office in The Usual Suspects. By the end, we feel as if we could find our way around this house on our own, and the people inside it—especially the devastating Olivia Williams—have taken up residence in our dreams.

6. Fair Game. Aside from a pair of appealingly nuanced performances by Naomi Watts (as Valerie Plame) and Sean Penn (as Joseph Wilson), Fair Game doesn’t even try to be balanced: it’s a story of complex good against incredible, mustache-twirling evil, which would be objectionable from a narrative perspective if it weren’t so close to the truth. At its best, it’s reminiscent of The Insider, both in its sense of outrage and in the massive technical skill that it lavishes on intimate spaces. It’s impossible to watch it without being swept up again by renewed indignation.

7. The Town. True, it’s slightly confused about its main character, who comes off as more of a sociopath than the film wants to admit, and I have problems with the last ten minutes, in which Ben Affleck, as both director and star, slips from an admirable objectivity into a strange sort of self-regard. Still, for most of its length, this is a terrific movie, with one of the best supporting casts in years—notably Jeremy Renner, Rebecca Hall, Jon Hamm, and the late Pete Postlethwaite. The result is a genre piece that is both surprisingly layered and hugely entertaining, with a fine sense of Boston atmosphere.

8. The Secret in Their Eyes. Technically, this Argentine movie—which won the Academy Award for Best Foreign Language Film—came out last year, but I’d feel irresponsible if I didn’t include it here. Like The Lives of Others, which it superficially resembles, it’s one of those foreign films, aware of but unimpressed by the conventions of Hollywood, that seem so rich and full of life that they pass beyond genre: it’s funny, romantic, and unbearably tense, and contains one of the most virtuoso action sequences this side of Children of Men. I don’t know what to call it, but I love it.

9. Scott Pilgrim vs. The World. A week doesn’t go by in which I don’t think fondly of Knives Chau, Scott Pilgrim’s hapless but unexpectedly resourceful Chinese-Canadian love interest. The film in which Knives finds herself is equally adorable: it has enough wit and invention for three ordinary movies, and it’s one of the few comedies of recent years that knows what to do with Michael Cera. It’s something of a mess, and its eagerness to please can be exhausting, but it still contains more delights per reel than any number of tidier films.

10. The American. Despite opening at the top of the box office over Labor Day weekend, this odd, nearly perfect little movie was mostly hated or dismissed by audiences soon after its release. The crucial thing is to adjust your expectations: despite what the commercials say, this isn’t a thriller so much as a loving portrait of a craftsman—in this case, an assassin—at work, as well as a visual essay on such important subjects as the Italian countryside, a woman’s naked body, and George Clooney’s face. It’s perilously close to ridiculous, but until its ludicrous final shot, it casts its own kind of peculiar spell.

Honorable mention goes to Winter’s Bone, A Prophet, Tangled, and How to Train Your Dragon, as well as to parts of The Kids Are All Right, The King’s Speech, and even Black Swan, which really deserves a category of its own. (As for Tron: Legacy, well, the less said about that, the better.)

In praise of David Thomson

with 3 comments

The publication of the fifth edition of David Thomson’s New Biographical Dictionary of Film, the best book ever written on the movies, is cause for celebration, and an excuse for me to talk about one of the weirdest books in all of literature. Thomson is a controversial figure, and for good reason: his film writing isn’t conventional criticism so much as a single huge work of fiction, with Thomson himself as both protagonist and nemesis. It isn’t a coincidence that one of Thomson’s earliest books was a biography of Laurence Sterne, author of Tristram Shandy: his entire career can be read as one long Shandean exercise, in which Thomson, as a fictional character in his own work, is cheerfully willing to come off as something of a creep, as long as it illuminates our reasons for going to the movies.

First, a word about the book’s shortcomings. As in previous editions, instead of revising the entries for living subjects in their entirety, Thomson simply adds another paragraph or two to the existing filmographies, so that the book seems to grow by accretion, like a coral reef. This leads to inconsistencies in tone within individual articles, and also to factual mistakes when the entry hasn’t been updated recently enough—like the article on George Lucas, for instance, in which the latter two Star Wars prequels still evidently lie in the future. And the book is full of the kind of errors that occur when one tries to keep up, in print, with the vagaries of movie production—as when it credits David O. Russell with the nonexistent Nailed and omits The Fighter. (Now that this information is readily available online, Thomson should really just delete all of the detailed filmographies in the next edition, which would cut the book’s size by a quarter or more.)

And then, of course, there are Thomson’s own opinions, which are contrarian in a way that can often seem perverse. He’s lukewarm on Kurosawa, very hard on Kubrick (The Shining is the only movie he admires), and thinks that Christopher Nolan’s work “has already become progressively less interesting.” He thinks that The Wrestler is “a wretched, interminable film,” but he loves Nine. He displays next to no interest in animation or international cinema. There’s something to be outraged about on nearly every page, which is probably why the Dictionary averages barely more than three stars from reviewers on Amazon. And if you’re the sort of person who thinks that a critic whose opinions differ from your own must be corrupt, crazy, or incompetent—as many of Roger Ebert’s correspondents apparently do—then you should stay far, far away from Thomson, who goes out of his way to infuriate even his most passionate defenders.

Yet Thomson’s perversity is part of his charm. Edmund Wilson once playfully speculated that George Saintsbury, the great English critic, invented his own Toryism “in the same way that a dramatist or novelist arranges contrasting elements,” and there are times when I suspect that Thomson is doing the same thing. And it’s impossible not to be challenged and stirred by his opinions. There is a way, after all, in which Kurosawa is a more limited director than Ozu—although I know which one I ultimately prefer. Kubrick’s alienation from humanity would have crippled any director who was not Kubrick. Until The Dark Knight and Inception, Nolan’s movies were, indeed, something of a retreat from the promise of Memento. And for each moment of temporary insanity on Thomson’s part, you get something equally transcendent. Here he is on Orson Welles, for example, in a paragraph that has forever changed how I watch Citizen Kane:

Kane is less about William Randolph Hearst—a humorless, anxious man—than a portrait and prediction of Welles himself…As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent…Kane is Welles, just as every apparent point of view in the film is warmed by Kane’s own memories, as if the entire film were his dream in the instant before death.

On Spielberg and Schindler’s List:

Schindler’s List is the most moving film I have ever seen. This does not mean it is faultless. To take just one point: the reddening of one little girl’s coat in a black-and-white film strikes me as a mistake, and a sign of how calculating a director Spielberg is. For the calculations reveal themselves in these few errors that escape. I don’t really believe in Spielberg as an artist…But Schindler’s List is like an earthquake in a culture of gardens. And it helps persuade this viewer that cinema—or American film—is not a place for artists. It is a world for producers, for showmen, and Schindlers.

And, wonderfully, on what is perhaps my own favorite bad movie of all time:

Yet in truth, I think Kevin [Spacey] himself is the biggest experiment, and to substantiate that one has only to call to the stand Beyond the Sea, written, produced and directed by Kev and with himself as Bobby Darin. The result is intoxicating, one of the really great dreadful films ever made, worthy of an annual Beyond the Sea award (why not give it on Oscar night?), as well as clinching evidence that this man is mad. Anything could happen.

The result, as I note above, is a massive Proustian novel in which nearly every major figure in the history of film plays a role. (Thomson has already written a novel, Suspects, that does this more explicitly, and his book-length study of Nicole Kidman is manifestly a novel in disguise.) Reading the Dictionary, which is as addictive as Wikipedia or TV Tropes, is like diving headfirst into a vast ocean, and trying to see how deep you can go before coming up for air. Although if it really is a novel, it’s less like Proust than like Pale Fire, in which Thomson plays the role of Kinbote, and every article seems to hint darkly at some monstrous underlying truth. (In that light, even the book’s mistakes seem to carry a larger meaning. What does it mean, for instance, that Thomson’s brilliant article on Heath Ledger, in which he muses on “the brief purchasing power” of fame, was “inadvertently dropped” from the fifth edition?)

And what monstrous truth does the Dictionary conceal? It’s the same truth, which applies as much to Thomson himself as it does to you and me, as the one that he spells out, unforgettably, at the end of Rosebud, his study of Orson Welles:

So film perhaps had made a wasted life?
One has to do something.
