Archive for the ‘Movies’ Category
A lot of people thought my work was very tedious, and it can be if you look at it from that point of view, but I never looked upon it as tedious…They don’t know the joy of seeing the film come back and what you had in your mind is on film.
“There is a major but very difficult realization that needs to be reached about [Cary] Grant—difficult, that is, for many people who like to think they take the art of film seriously,” David Thomson writes in The New Biographical Dictionary of Film, before going on to make a persuasive argument that Grant “was the best and most important actor in the history of the cinema.” There’s a similarly difficult realization that needs to be reached about Tom Cruise, which is that for better or worse, over the last quarter of a century, he’s been the best movie star we have, and one of the best we’ve ever had. Not the best actor, certainly, or even the one, like Clooney, who most embodies our ideas of what a star should be, but simply the one who gave us the most good reasons to go to the movies for more than twenty years. I love film deeply, and I’ve thought about it more than any sane person probably should, and I have no trouble confessing that for most of my adult life, Cruise and his movies have given me more pleasure than the work of any other actor or director.
And yet it wasn’t until I realized that I loved his movies that I really started to take notice of him in his own right. We’re usually drawn to stars because of the qualities they embody, but in Cruise’s case, I became a fan—and remain a huge one—because I belatedly noticed that whenever I bought a ticket to a movie with his name above the title, I generally had a hell of a good time. That hasn’t always been true in recent years, and while some might say that his movies have taken a hit because Cruise’s own public image has been tarnished, I’d argue that the causal arrow runs the other way. Cruise has always functioned less as a traditional movie star than as a sort of seal of quality: a guarantee that we’ll be treated to a film that provides everything that the money, talent, and resources of a major studio can deliver. As a result, whenever the movies in which he appears become less interesting, Cruise himself grows less attractive. Left to his own devices, he can’t rescue Lions for Lambs or Knight and Day, but if he gives us a big, impersonal toy like Mission: Impossible—Ghost Protocol, all is forgiven.
It’s worth emphasizing how strange this is. We tend to think of movie stars as supernatural beings who can elevate mediocre material by their mere presence, but Cruise is more of a handsome, professional void, a running man around whom good to great movies have assembled themselves with remarkable consistency. In fact, he’s more of a great producer and packager of talent who happens to occupy the body of a star who can also get movies made. Hollywood consists of many ascending circles of power, in which each level has more of it than the one below, but when judged by its only real measure—the ability to give a film a green light—true power has traditionally resided with a handful of major stars. What sets Cruise apart from the rest is that he’s used his stardom to work with many of the great filmmakers of his time (Kubrick, Scorsese, Spielberg, Coppola, Mann, Stone, De Palma, Anderson) and a host of inspired journeymen, and he’s been largely responsible for the ascent of such talents as J.J. Abrams and Brad Bird. If this sort of thing were easy, we’d see it more often. And the fact that he did it for more than two decades speaks volumes about his intelligence, shrewdness, and ambition.
Recently, he’s faltered a bit, but his choices, good or bad, are still fascinating, especially as his aura continues to enrich his material with memories of his earlier roles, a process that goes at least as far back as Eyes Wide Shut. I haven’t seen Oblivion, but over the weekend, I caught Jack Reacher, a nifty but profoundly odd and implausible genre movie that runs off Cruise like a battery. (It’s actually much more of a star vehicle than Ghost Protocol, in which Cruise himself tended to get lost among all the wonders on display.) While most leading men strive to make it all seem easy, much of the appeal of watching Cruise lies in how hard this boy wonder of fifty seems to push himself in every frame, as if he still has everything to prove. Other stars may embody wit, cool, elegance, or masculinity, but Cruise is the emblem of the man who wills himself into existence, both on and off the screen, and sustains the world around him through sheer focus and energy. Real or not, it’s a seductive vision, or illusion, for those of us blessed with less certainty. As Taffy Brodesser-Akner says this week in The New York Times Magazine: “Who has ever worked so hard for our pleasure?”
It’s fair to say that I’ve spent more time discussing Hannibal Lecter here than any other character in literature. This is a blog about writing, after all, and Lecter’s example is as good as case studies get, since it serves as both a model and a cautionary tale. The man we meet in The Silence of the Lambs, and to a lesser extent Red Dragon, is arguably the most compelling character to come out of the popular fiction of the last thirty years. Barely a decade elapsed before his most memorable cinematic appearance topped the list of AFI’s heroes and villains, which is astonishing for a role with less than twenty minutes of screen time. At his best, Thomas Harris is a suspense novelist of stunning intelligence and resourcefulness, and he’s written three novels that absolutely deserve to be ranked among the finest in the genre, as well as a flawed fourth book full of remarkable moments—although the fifth is best left unmentioned. But to a large extent, his reputation rests entirely on the creation of one character, and it’s defined his career to a degree that I don’t think he ever expected.
Of course, Harris himself was finally unable to keep Lecter under control, and if his prolonged silence is any indication, it seems that he’s gathering his energies for something else. This is all speculation, of course; Harris is notoriously private, and he’s never been anything but a slow, painstaking writer. But he’s also a man who wrote Hannibal Rising largely to avoid seeing his character fall into other hands, and I believe he’s intelligent enough to sense that the result is by far his weakest book. Hence the surprise of Hannibal, the NBC series that invents entirely new backstories for many of Harris’s most famous characters, all without the author’s involvement. I can’t say for sure what inspired Harris to relinquish control, and for all I know, there could be complicated rights issues involved. But I’d like to believe that Harris recognizes that he’s already sucked this particular vein dry, and is ready, at last, to move on. I’ve said before that an entirely new suspense novel from Harris would be the literary event of the year, possibly the decade, and I still hold out hope that we’ll see it.
As for Hannibal itself, I’m not sure how I feel. I watched the premiere last week, and plan to tune in again tonight, if only to catch a welcome glimpse of Gillian Anderson. It’s a well-crafted show, and there’s a lot of talent on both sides of the camera, but it also sets problems for itself that it may not be able to solve. Back when Red Dragon was first published, the figure of Will Graham, a profiler who willed himself into crime scenes to the point where he saw them play out through the killer’s eyes, may have been novel, but by now, we’ve seen variations on this character so many times that we’re already tuning out, no matter how hard the show works to make the results visually exciting. Even more problematic is the casting of Mads Mikkelsen as Lecter. Mikkelsen is a fine actor, but his cold eyes and angular face make it hard for him to convey the character’s supposed charm, much less pass himself off as one of the leading lights of Baltimore society. He all but advertises that he’s the bad guy, which will only make his relationship with Graham increasingly implausible as the series continues.
But it’s really the premise itself that risks making the show unsustainable. Lecter needs to be in his cell, because he’s much less compelling for what he is than for what he was. His qualities as an epicure, a man of culture, and a social darling are all important facts to establish, but they only gain meaning from their absence: Lecter fascinates us once all these things have been taken away, leaving only a cold, flawless brain behind a pane of bulletproof glass, and what both Hannibal and the novel of the same name demonstrate is that it isn’t especially interesting to watch the old Lecter go about his business. (If Harris’s novel is any indication, he spends most of his time shopping.) If the show runs for long enough, it will eventually end up back where it needs to be, but it doesn’t do itself any favors by starting so far back in the timeline. As Lecter himself might say, a television series ought to start from first principles. And as it stands, it’s going to be a very long time before we see Hannibal back where he belongs.
I’d say a day rarely goes by when I don’t think about Stanley Kubrick, but recently, he’s been on my mind even more than usual, thanks to the appearance in my life of two wonderful pieces of Kubrickiana. The first is the gorgeous Taschen study of Napoleon, the film Kubrick modestly hoped would be “the greatest movie ever made,” including the director’s own notes, voluminous research materials, reference photographs, and his original script. The other is Room 237, the fascinating if somewhat shapeless new documentary that details five elaborate interpretations of The Shining, with its interviewees convinced that the movie is really a veiled allegory, among other things, for the Holocaust, the genocide of Native Americans, or the role Kubrick played in faking the Apollo moon landing. And although the two works might not seem to have much in common, aside from the figure at the center, they cast a surprising amount of light, both separately and together, on the artist who seemed more committed than any other director to advancing the entire medium.
Room 237 deliberately leaves its viewers with more questions than answers, and there are times when we’d like a little more information about some of the passionate, articulate, slightly unhinged voices on the soundtrack, but it’s still a huge pleasure to watch. If nothing else, it’s a welcome excuse to revisit The Shining, one of the richest of all American movies, as the documentary picks it apart with the obsessiveness of a conspiracy theorist rewinding the Zapruder film. It gives us freeze frames, enlargements, diagrams, and entire scenes played in slow motion or reverse, until the film starts to resemble one of those visual puzzles that Nabokov evokes at the end of Speak, Memory: Find What the Sailor Has Hidden. The guiding principle of these readings is that nothing is a mistake, and that even seemingly innocent continuity errors can convey a secret meaning. Even I’d never noticed that the color of Jack’s typewriter changes over the course of the movie, that the pattern on the carpet beneath Danny’s feet reverses itself between two shots, or that Stuart Ullman’s office includes an impossible window.
And it’s worth asking why Kubrick, and The Shining in particular, has encouraged such diverse—and occasionally insane—interpretations. For one possible answer, we can turn to Taschen’s Napoleon, which convincingly documents that Kubrick really was as obsessive as he seemed: even before the project had been officially approved by any studio, he had an army of assistants and researchers compiling visual references, combing through archives, and assembling a card catalog that documented what every character was doing on every day of Napoleon’s life. When one of the interviewees in Room 237 claims that Kubrick deployed a team to spend months researching every aspect of Colorado history, we aren’t given much supporting evidence, but it’s certainly plausible. The interviewees argue, not without reason, that Kubrick was a perfectionist who shot miles of film, did countless takes, and wanted to control every aspect of a movie’s production, so that even the fact that a chair in the background disappears from one shot to the next can be taken as a deliberate choice.
But I think there’s a deeper point to be made here about the nature of Kubrick’s process, and of artistic endeavor in general, beyond what particular significance we read into The Shining. (For what it’s worth, I think there’s a good case to be made that the film deliberately incorporates symbolism from American history, although less for the sake of a clear message than as a way of enhancing its richness and texture, and I’m not sure how seriously to take that can of Calumet baking powder.) Kubrick understood that the point of meticulous control, paradoxically, is to make a movie that can strike up its own reality in the inner life of the viewer, independent of the artist’s intentions. The Shining retains its fascination, after so many other films have been forgotten, because the intricacy of its construction, far from limiting its possible readings, creates a sort of playground—or labyrinth—that rewards endless exploration. The real impossible window is one that the work opens up because the rest of it has been so deliberately constructed. And while I’m not sure what Kubrick would have made of the elaborate games of Room 237, I think he’d be the first to grant that only a very dull movie is all work and no play.
Somewhere in my parents’ home, there’s a book with both its front and back covers missing. When it first fell into my hands, it was brand new, and I would have been about eight years old, which I remember because I can still see exactly where it stood on the bookcase in our old house. The strange thing is that it wasn’t on a shelf I could reach: either someone took it down for me or I made a point of retrieving it myself, and it’s been so long that I’m not sure which was the case. All I know is that for the next ten years, it was rarely out of my sight, and throughout the most formative decade of my life, it was probably the book I read the most. Even now, I know much of it by heart, and I’ll occasionally find its phrases and rhythms appearing in my work, like fragments of my own memories. It was Roger Ebert’s Movie Home Companion, and when I look back now, I realize that it wasn’t just the book that first introduced me to the movies—which would be legacy enough—but one that made me think for the first time about journalism, criticism, and countless other aspects of the world and culture around me.
I’ve written at greater length about Ebert’s role in my life here and here, and I won’t repeat myself. I never had a chance to tell him in person how much he meant to me, although I’d like to think that he saw what I wrote here, and he certainly heard much the same thing from countless other writers and movie lovers. Still, the fact that I never met Ebert, despite having lived the last few years of my life in Chicago, will always remain a profound regret, although I’m very grateful that I got to see him in person at the celebration of his favorite film music at the Chicago Symphony Orchestra. For a while, Ebert and Gene Siskel were my two favorite guys on television, and I can still hum the opening theme for At the Movies, which was always a high point of my week. I’ll never forget where I was when I learned that Gene Siskel had died, and I’m sure I’ll remember where I was when I heard that Ebert was gone. (To give you a sense of how big a part of my life Ebert was, my wife called me with the news from work, and a college friend emailed later that day to say she was thinking of me.)
If there’s a silver lining to Ebert’s death, it’s that it gives us a sense of how deeply he influenced a whole generation of writers and critics. Will Leitch’s bittersweet remembrance in Deadspin, which recounts how he benefited from Ebert’s example and generosity, then foolishly threw it all away, is essential reading. But the words that linger with me the most are those of Scott Tobias of The A.V. Club, which reflect my own feelings to an almost frightening extent:
Cinema is a river with many tributaries, and I’m sure I’m not alone among movie-crazy teenagers in the ‘80s in using Roger Ebert’s Movie Home Companion as the boat downstream. You go through all the four-star reviews. You see Taxi Driver, and then of course you have to see Raging Bull, and then every other Martin Scorsese picture that sits on the video shelf. (And then you get into the movies that influenced Scorsese, which is a lifetime in itself.) You argue with him, you glean insights in the things you watch, you learn an entire new way of thinking, talking, and writing about the movies. And you never stop watching. You never stop debating. You have a companion for life, even now that his is over.
“The old man was around for a long time,” Ebert says of John Wayne in The Shootist, and although Ebert was only in his thirties when he wrote those words, the same could be said about his own career. Ebert was the one who first taught me that, at his best, a critic is sort of an island of stability, staying at the same desk for forty years to regard a changing world through a very particular lens, until his body of work says as much about the decades through which he lived as about the movies themselves. Ebert once seemed more stable—and certainly more substantial—than most, and at his prime, it was hard to believe that he would ever be gone. Toward the end, of course, this changed. Yet it’s in the last act of his life that his influence will be the most profound: he proved that criticism, a trade that has often been denigrated and dismissed, can give us the tools to face the fact of our own mortality with honor. At the end of his life, Ebert seemed reduced to little more than his words and, remarkably, his thumb, as if his most famous trademark had really been a mysterious preparation for a time when it would be all that remained. And in the end, his words were enough.
If you’ve been reading this blog for any period of time, you’ve no doubt gathered that I like David Mamet. While I generally agree with Lawrence Weschler that Walter Murch is the smartest person in America, there was a time in my life when I would have ranked Mamet at least a close second. I’ve learned a tremendous amount about craft from his essays, interviews, and commentary tracks, and in particular, his little book On Directing Film is the most useful guide to storytelling I’ve ever seen. As I’ve mentioned before, I discovered it at a point when I thought I’d figured out the writing process to my own satisfaction, so reading it was a little like having an efficiency expert visit your business for a day and set you straight regarding best practices. I encountered it too late for it to have any real influence on The Icon Thief, but it was a major reason I was able to get City of Exiles from conception to finished draft in under a year, and it’s since become an indispensable part of my approach to writing. I try to read it again every six months or so, especially when I’m starting a new project, and I’m still amazed by its level of insight and practicality.
Yet there’s a shadow side to Mamet’s intelligence and mastery. It’s taken me a long time to figure out what it is, and I’ve been thinking about it a lot recently, ever since seeing Mamet’s latest movie, Phil Spector, which aired over the weekend on HBO. Like all of his films, it’s watchable, full of good dialogue, and admirably streamlined: it clocks in at just over ninety minutes, and there isn’t an ounce of fat on the screenplay. All the same, it feels weirdly like half a movie, or a brilliant sketch of something better, which is true of nearly all of Mamet’s work as a director. I haven’t seen a movie of his I didn’t like—I even enjoyed Redbelt—but there’s something clinical and detached about his style that leaves even his best films feeling a little thin. And the more I think about it, the more it seems like an inevitable consequence of his approach to craft. Mamet’s method is as rigorous as mathematics: you figure out the sequence of objectives for each character, then craft the scene and the individual shots to convey this information as simply as possible. Hence his beloved story about Stanislavsky:
Stanislavsky was once having dinner with a steamboat captain on the Volga River and Stanislavsky said, “How is it that among all the major and minor paths of the Volga River, which are so many and so dangerous, you manage to always steer the boat safely?” And the captain said, “I stick to the channel; it’s marked.”
If nothing else, Mamet’s movies stick to the channel, and his philosophy as a director has always been that you shouldn’t stray much to either side. Most famously, he believes that if a script has been properly written, the actors just need to say the lines clearly and without inflection, and the words themselves will do the work—although if Phil Spector is any indication, even Mamet can’t always get this from Pacino. This approach to storytelling is unimpeachably correct, and if you’re going to imitate any director, you can’t go too far wrong by following Mamet: at worst, you’ll end up with a first draft that is mechanical but basically efficient, which is far from the worst that can happen. (As T.S. Eliot says in one of his essays, a poet who imitates Dante will wind up with a boring poem, but someone who imitates Shakespeare is likely to make a fool of himself.) But Mamet has essentially transformed himself into a director who delivers brilliant, clean, unimpeachable first drafts. And it’s no accident that the best movies based on his work—which I’d argue are The Untouchables and Glengarry Glen Ross—were made by other directors.
And we’ve seen much the same progression in Mamet’s prose, which has devolved from the wit and lucidity of On Directing Film to something crabbed, aphoristic, and airless. Bambi Vs. Godzilla contains five or six pages that include some of the best storytelling advice imaginable—if you’re curious, it’s in the chapter “The Wisdom of the Ancients”—surrounded by material so tight and hermetic that reading it becomes physically enervating. The same is true, sadly, of Three Uses of the Knife, and I’m too discouraged to even try The Secret Knowledge. Which is just a reminder, as if we needed one, of the pitfalls of genius. Mamet remains the most intelligent living writer I know, and when it comes to the nuts and bolts of craft, he’s right about almost everything. But being consistently right for forty years can be dangerous in itself. Mamet is very good at what he does, and unlike a lot of artists, he knows the reasons why. But logic and craft can take you only so far, at least without a willingness to embrace the possibility of failure or foolishness. Mamet, like most smart men, simply can’t take that risk. And although he’s still the best there is at sticking to the channel, there’s a chance that a lot of viewers will simply decide to change it.
Structure means knowing where you’re going; making sure you don’t meander about. Some great films have been made by meandering people, like Terrence Malick and Robert Altman, but it’s not as well done today and I don’t recommend it. I’m a structure nut. I actually make charts. Where are the jokes? The thrills? The romance? Who knows what, and when? You need these things to happen at the right times, and that’s what you build your structure around: the way you want your audience to feel. Charts, graphs, coloured pens, anything that means you don’t go in blind is useful.
Some of you are probably reading this post at Starbucks. Maybe you didn’t really need a coffee, but wanted to relax and catch up on your email in a pleasant, convenient environment where you could rent a comfortable chair for the price of a beverage. What you wanted, in short, was a third place—a location that wasn’t your home or office, but where you could unwind for half an hour among a few other regulars. In The Great Good Place, Ray Oldenburg describes the third place as a free or inexpensive location, ideally serving food and drink, where people can meet, have conversations, or sit quietly on their own. Starbucks is well aware of the power of the third place—founder Howard Schultz mentions it repeatedly in his book—and has taken pains to turn itself into a destination where people like to spend time even if they don’t particularly need to be caffeinated. Coffeehouses, as it happens, have often played such a role: the insurance company Lloyd’s of London began as a coffeehouse where sailors, ship owners, and merchants could talk shop. And third places are an essential part of building communities and social relationships.
That’s true of fiction, too. Recently, my wife and I have been watching a lot of The Vampire Diaries, and we’re endlessly amused by the fact that all the main characters spend most of their free time at the Mystic Grill. (At least when they aren’t attending yet another picnic, clam bake, or sock hop at the mayor’s mansion.) At times, the fact that everyone in Mystic Falls seems to end up at the Grill, regardless of age, social status, or supernatural orientation, verges on the surreal—among other things, it seems to be an obligatory stop for any visiting vampire or werewolf passing through town. Yet nearly every television show has its own equivalent of the Grill, a local hangout where the characters can interact and run into one another outside their homes and workplaces. Friends had Central Perk; Beverly Hills 90210 had The Peach Pit; Seinfeld had its famous diner; and Cheers had, well, Cheers. From a budgetary perspective, it makes sense: a single standing set can serve as a backdrop for scenes that don’t require any particular location, and a restaurant or bar offers plenty of convenient business for the actors and director.
But it also serves a more subtle narrative purpose. The screenwriter Terry Rossio says somewhere on his excellent Wordplay blog—I can’t find the specific post—that it can be a good idea for a movie’s characters to return periodically to the same familiar spot, a narrative home base that grounds the story and allows it to develop a sense of place, rather than jumping from one new location to another. From a storytelling perspective, this saves a lot of time: instead of having to introduce a setting the viewer hasn’t seen before, you can sit the characters at their usual table and get down to the business at hand. Done properly, a third place becomes invisible. This is why it works best when the story focuses on the same handful of characters, who might naturally have a favorite place to meet, rather than the Vampire Diaries approach, in which so many different characters drift through the Grill at one point or another that it seems like the only restaurant in town. And at its best, the third place becomes a place we’d like to visit, perhaps out of the hope that we’ll see our favorite characters seated in the corner.
And at its heart, the usefulness of the third place expresses a crucial point about fiction. It’s important to vary the story’s setting, and a series of chapters set in the same office or police station are bound to start feeling a little repetitive. But if the scenes we stage there have interest and truth, the magic of the location starts to build: each scene in Hannibal Lecter’s cell, or at the lunch counter in Chungking Express, trembles with the resonance of the scenes we’ve seen there before, and this only happens if the location recurs. A third place in fiction, like a coffeehouse in real life, gains meaning from the interactions that unfold there, and a place described in just a few lines can start to seem more real than the houses in which we’ve lived. (The great example here is 221B Baker Street, which doesn’t quite qualify as a third place, but which has prompted fans to build detailed reconstructions based on a handful of tantalizing paragraphs.) So if you’re writing a story and there isn’t a third place for the characters to interact and dream, you might want to think about adding one. After all, there’s a reason that Starbucks is everywhere.
A movie, or any work of art, isn’t complete until someone sees it. Even the most modest studio film these days represents about two hundred years of collective work from the cast and crew, and when the result of their labor is projected on a screen in a darkened room, where it can shape and channel the emotions of a theater full of strangers, surprising things can happen. In Behind the Seen, Walter Murch compares this phenomenon to that of an old-fashioned radio tube, which takes a powerful but simple electrical current and combines it with a weak but coherent signal to transform it, say, into Beethoven’s Ninth Symphony. A similar thing happens to an audience in a theater:
The power—the energy—isn’t coming from the film. It’s coming from the collective lives and emotional world of the audience. Say it’s a big theater—you have a thousand people there, and the average age of that audience is 25. You have 25,000 years, three times recorded history, sitting in the audience. That’s a tremendously powerful but unorganized force that is looking for coherence.
And the mark of a great movie is one that takes up an unexpected life, for better or worse, once it meets the undirected power of a large popular audience.
I’ve been thinking about this ever since finally seeing Zero Dark Thirty, which I think is unquestionably the movie of the year. (If I were to repost my list of the year’s best films, it would occupy the top slot, just ahead of The Dark Knight Rises and Life of Pi.) It’s an incredible work, focused, complex but always clear, and directed with remarkable assurance by Kathryn Bigelow, who tells an often convoluted story, but never allows the eye to wander. Yet it’s a film that seems likely to be defined by the controversy over its depiction of torture. This isn’t the place to respond to such concerns in detail, except to note that Bigelow and writer Mark Boal have already argued their own case better than anyone else. But it seems to me that many of the commentators who see the movie as an implicit endorsement of torture—”No waterboarding, no Bin Laden,” as Frank Bruni writes—are reading something into it that ignores the subtleties of the film’s own structure, which begins with enhanced interrogation and then moves beyond it.
But it’s a testament to the skill and intelligence of Bigelow, Boal, and their collaborators that they’ve given us a movie that serves as a blank slate, on which viewers can project their own fears and concerns. Zero Dark Thirty doesn’t tell us what to think, and although some, like Andrew Sullivan, have taken this as an abdication of artistic responsibility, it’s really an example of the art of film at its height. It’s a movie for adults. So, in very different ways, are Lincoln and Django Unchained, which is why I’m not surprised by the slew of opinion pieces about the lack of “agency” in the black characters in Lincoln, or whether Django is really a story about a slave being saved by a white man. Such responses tell us more about the viewers than the movies themselves, and that’s fine—but we also need to recognize that movies that can evoke and sustain such questions are ultimately more interesting than films like Argo or Les Misérables, which reassure us at every turn about what we’re supposed to be feeling.
Needless to say, the Oscars have rarely rewarded this kind of ambiguity, which may be why Zero Dark Thirty had to content itself with a shared award for Best Sound Editing. And both Argo and Les Misérables are very good movies. But it takes remarkable skill and commitment to tell stories like this—and in particular, to give us all the satisfactions we crave from more conventional entertainment while also pushing forward into something darker. (That’s why many of our greatest, most problematic works of fiction tend to come from artists who have proven equally adept at constructing beautiful toys: Bigelow could never have made Zero Dark Thirty if she hadn’t already made Point Break.) When we’re sitting in the dark, looking for coherence, we’re at our most vulnerable, and when we’re faced with a movie that pushes our buttons while leaving us unsettled by its larger implications, it’s tempting to reduce it to something we can easily grasp. But in a medium that depends so much on the resonance between a work and its viewers, such films demand courage not just in the artist, but from the audience as well.
Who is the author of a movie? As with most collaborative art forms, it’s a question that gets trickier the closer you look at it. A screenwriter like William Goldman might argue that the author of an original screenplay deserves most of the credit, which seems reasonable until we remember that most stories are radically altered, often by anonymous hands, from initial draft to shooting script, and equally significant changes may take place in the editing room. An editor like Walter Murch plays an enormous role in finding the final structure and rhythm of a movie, but here, again, there’s a huge range of potential influence, from editors who simply reflect the director’s wishes to artists like Ralph Rosenblum, who discover a shape for a movie while working on their own. Then there’s the producer, who often shepherds the entire process from the initial idea to the marketing campaign, oversees major creative and hiring decisions, and is there to pick up the Best Picture award on Oscar night. This says nothing of the cinematographer, art director, sound designer, composer, and others, who can enormously influence our final experience of a film, or the actors, including a star whose involvement may have been crucial to securing funding in the first place, and who is often the only face the public associates with the finished product.
Usually, of course, we tend to think of the director as the final author of a film. This is a surprisingly recent development—the auteur theory as we know it wasn’t developed until the early fifties—and it’s often been criticized as unfair to the many other talents whose work is essential to filmmaking. Yet the auteur theory, while inherently undemocratic, is like democracy in at least one way: it’s the worst theory of film, except for all the other theories that have been tried. And it’s the only way I can explain a movie like Silver Linings Playbook. It’s funny, beautifully acted, and ultimately very touching, yet the screenplay is manipulative, contrived, sometimes superficial, and doesn’t always escape the trap of smug, affected quirkiness. Watching it, I realized that you could shoot the same screenplay, word for word, with most of the same actors, but in the hands of a different director, the result would be unwatchable. And the only explanation for its ultimate effectiveness is that David O. Russell, who also adapted the screenplay from Matthew Quick’s novel, is a director who understands his own strengths and limitations, and particularly his ability to make specific, subtle choices that can take otherwise routine material and make it compelling.
Because a director’s ultimate role is that of someone who makes choices, thousands of them, starting in preproduction and continuing on an hourly basis until the final cut is delivered, covering everything from the color of the wallpaper to which of three possible endings to use. As David Mamet says of his experience directing House of Games:
[The crew] came over to ask me my opinion regularly, not because of any talent on my part, or because of any expertise I had demonstrated, but because the film is a hierarchy and it was my job to do one part of it: to provide an aesthetic overview, and to be able to express that overview in simple, practicable terms—more light on her face, less light on her face; the car in the background, no car in the background.
This may sound straightforward, but anyone who has confronted the endless series of choices that a work of fiction presents, even for a writer working in solitude, knows that it’s the hardest thing in the world, especially when conducted in public, with thousands of dollars at risk of being lost with even the smallest delay. Those choices, as much as the individual talents of the creative team involved, are what give a film its flavor and individuality. Nearly everyone involved in movies on the studio level, from the color timers to the supporting cast, is there because they’ve reached the peak of their profession, but it means nothing if their gifts are squandered or misdirected.
And some of the most crucial choices that a director can make are, by nature, invisible. It’s no exaggeration to say that the best performance in the world can be turned into the worst by a deliberate selection of bad takes in the editing room, and Russell’s approach to Silver Linings Playbook is a reminder of how subtle this process can be:
There’s an extreme version we shot that’s very dark. You know, we had to cover it several different ways on a 33-day schedule. And the De Niro character was written harsher or warmer…So you have to be careful with it and it took a lot of careful work in the editing room with Jay Cassidy to calibrate it.
Which is why the role of a director, even after all this time, remains so mysterious. It’s about calibration, or finding the right balance and tone for elements that can be combined in an infinite number of ways—which is why it seems to go wrong more often than it goes right, despite all the talent involved. Novelists do much the same thing: every word represents a choice, as does the direction of the plot, the actions of the characters, and even the decision of which story to write in the first place. Craft is about learning to plan as much of this process as possible in advance, while developing enough intuition and experience to make smart choices in the moment when confronted by the unexpected. And when a director, or any artist, can bring this sort of craft to bear under pressure, when it’s needed the most, it’s then that he deserves to be called an author.
I have a friend who hates Reservoir Dogs. He’s willing to grant that some of Quentin Tarantino’s other movies have merit, but refuses to rewatch this particular film, mostly on account of its violence—which, he says, he found increasingly hard to take after he had children. I can understand what he means. In the case of my own daughter, I’m still working out what kinds of media she’ll be watching at what age, and while I definitely plan to introduce Beatrix to the joys of Pulp Fiction and the two movies about her namesake at the right time, I might give Reservoir Dogs a pass. I liked it plenty when I first saw it, but I haven’t been tempted to revisit it in a long time, and these days, I think of it mostly as an inventive and resourceful debut that paved the way for the astonishing career to come. (The recent Vanity Fair oral history of the making of Pulp Fiction just serves as a reminder of how deeply influential Tarantino has been, even as his influences and innovations are absorbed into invisibility by the culture as a whole.)
And although I understand my friend’s point about the violence in Reservoir Dogs, what lingers with me, weirdly, is Tarantino’s restraint. Take the movie’s most notorious sequence. When I think of it today, what I remember is not so much the violence as two amazingly assured shots. The first is the moment when the camera turns aside as Mr. Blond prepares to hack off the cop’s ear, tracking away to focus on a nondescript corner of the room as we listen to the screams coming from just offscreen. It’s a startlingly subjective camera move, as striking in its way as the moment in Taxi Driver when Scorsese pans away from Bickle’s telephone rejection from Betsy, and reflects Tarantino’s understanding that such things are more effective when left to the imagination. Even better is the shot immediately afterward, when Mr. Blond leaves the warehouse, crosses a peaceful street in silence, retrieves a gas can from his car, and returns, all in a single unbroken take that ends back in the room where “Stuck in the Middle With You” is still playing. Mike D’Angelo of The A.V. Club has sung this shot’s praises, and it’s one that still knocks me out, more than fifteen years after I first saw it.
Given this kind of filmic grace, which Tarantino had in spades before he even turned thirty, it’s instructive to turn to Django Unchained, which I finally caught over the weekend. (I liked it a lot, by the way, although it strikes me as one of his less essential movies, somewhere above Death Proof and below Jackie Brown.) Django has also aroused controversy over its violence, and while I wouldn’t want to argue that it isn’t a violent movie, here, too, I’m more struck by its restraint than anything else. This is partly because it’s the first movie in which Tarantino hasn’t done deliberate violence to the medium of storytelling itself: the plot proceeds in a linear fashion, without any of the structural games we find in his previous work, and the boundary between good and evil is much more clearly delineated than usual. Even if we hadn’t been clued in by the fact that audiences, for the most part, seem to be embracing the movie, there isn’t a lot of doubt about how this particular revenge story will conclude. And although Tarantino doesn’t shy away from the blood squibs in his climactic shootouts, he’s even more careful here in his use of violence than usual.
Django Unchained takes place in a violent time, with plenty of human misery inherent to the story, but it doesn’t linger over scenes of cruelty and torture. Tarantino gives us these moments in flashes, just long enough to lock them in the mind’s eye, and doesn’t deal with sexual violence at all, except by implication. Which doesn’t mean he shies away from the implications of the material. The film’s most memorable scene is the long monologue by Samuel L. Jackson—who gives what I think is the supporting performance of the year—in which he coolly explains how a living death in the mines, to which slaves are routinely condemned, is far more cruel than any torture Django’s captors could invent. Tarantino knows the difference between the violence of history and that of escapism, and it’s fascinating to see a film in which they exist so casually side by side. Sometimes his canniness goes a little too far: when Django engages in one killing that might make him seem unsympathetic, he instructs the bystanders to tell the victim goodbye, and when he fires, the body is jerked offscreen by what can only be a stagehand with a length of piano wire, leaving it conveniently out of sight for the rest of the scene. It’s a cheap gag, but done with the artistry that separates Tarantino, not just from his imitators, but from his precursors. And like it or not, that’s the mark of a master.
I’ve always been fascinated by Kevin Spacey. This is an actor with less genetic charisma than any other leading man I can name—he’s neither handsome enough for conventional star parts nor physically distinctive enough to be a striking supporting player—but his intelligence and craft have resulted in some of the most indelible performances of the latter half of the nineties, and beyond. I don’t think any other living actor can claim a run as good as Seven, The Usual Suspects, L.A. Confidential, and American Beauty, not to mention Beyond the Sea, which I’m convinced is one of the great bad movies of all time, deserving, as David Thomson has noted, of an award given annually in its name. There’s a preening, endearing vanity behind Spacey’s nondescript looks that emerges whenever he’s asked to sing, which he does very well, or do one of his uncanny impersonations. He’s a showoff trapped in an everyman’s body, and although I don’t think he’s ever given a truly uncalculated or uninhibited performance, he’s also provided me with more pleasure as a moviegoer over the years than most actors with more conventional endowments.
And he’s the perfect lead for House of Cards, the weirdly compelling political drama that premiered over the weekend on Netflix. Spacey always seems to be in a kind of conspiratorial huddle with the audience, even if he’s only conning us in the end, and as the scheming majority whip Frank Underwood, he isn’t above giving the lens itself a wink, and occasionally an extended monologue to comment on the action. If the show were more realistically plotted, this would be distracting, but a realistic look at power politics isn’t quite what this series has in mind. Underwood is a master manipulator, but everyone around him is so gullible, including the supposed Washington operators with whom he interacts, that it’s as if he’s read the script notes for the next thirteen episodes. The fetching Kate Mara does what she can in the role of an ambitious metro reporter, but her rapid rise, once Underwood starts feeding her information, is more Brenda Starr than Bob Woodward. It should play worse than it does, but if there’s anyone who can carry this sort of thing, it’s Spacey, who clearly relishes the chance to have the camera to himself, and knows how to sell arch lines like “I love her like sharks love blood.”
And I kind of love it, too. House of Cards is remarkably unsubtle in its writing, but benefits from considerable subtlety in its art direction, photography, and sound design. Every frame glows with the burnished yet chilly digital look that David Fincher, who directed the first two episodes, has long since perfected, and the compositions are both clinical and playful: instead of the long tracking shots of The West Wing, we’re treated to a vision of power as one glossy tableau after another. The sets and locations are lovingly detailed—even if my wife observed that no real newsroom kitchen has that much free bread—and we’re given plenty of time to drink them in, with a pace that some viewers have criticized as being too slow, but which suits the balance and polish of the images on the screen. The result is a television series that looks and feels more like a movie than any I’ve ever seen, and its elegance goes a long way toward addressing its narrative shortcomings. (It’s also presented in an unusual aspect ratio, slightly narrower than the standard 16:9 size, which I suspect represents a compromise between the anamorphic format that Fincher prefers and the demands of a show destined to be viewed primarily on widescreen televisions.)
And for all the hype over the fact that the series is being released in one big chunk, rather than parceled out in weekly installments, I have a hunch that its real influence will be in its look and tone, rather than its delivery system. I’ve only seen the first two episodes, and although I intend to watch the rest soon, it doesn’t strike me as the kind of densely plotted show that demands to be devoured in a few epic viewing sessions: all the conventions of serialized storytelling are here, but mostly for the sake of appearances. I’ve written before about the challenges of constructing shapely long-form narratives in television, in which a show can be canceled after two episodes or run for years, and although the Netflix model presents one possible solution to the problem, my initial impression is that it leads to a sort of complacency: subplots are introduced without any particular urgency, with the implication of a payoff somewhere down the line, where a series produced under greater ratings pressure might feel more of a need to justify itself moment to moment. House of Cards is secure, even occasionally a little smug, in the fact of its own survival. I’m enjoying it tremendously, but I can’t help but feel that it might have been a stronger show, if less lovingly crafted, if, to borrow the title from another Kevin Spacey movie, it had been forced to swim with the sharks.
Ideally, all stories should consist of a series of events that arise organically from the characters and their decisions, based on a rigorous understanding of how the world really works. In practice, and especially in genre fiction, it doesn’t always happen that way. Sometimes a writer just has a plot development that he really wants to write, and isn’t entirely sure how to get there from here. Frequently he’ll construct a story out of a number of unrelated ideas, and needs to cobble them together in a way that will seem inevitable after the fact. And sometimes he’ll simply paint himself into a corner, or realize that he’s overlooked a monstrous plot hole, and wants to extricate himself in a way that leaves his dignity—and most of the earlier material—intact. A purist might say that the author should throw the story out and start again, proceeding more honestly from first principles, but a working writer doesn’t always have that luxury. Better, I think, to find a way of keeping the parts that work and smoothing over the connective bits that seem implausible or unconvincing, while keeping the reader immersed in the fictional dream. And in the spirit of faking it until you make it, I offer the following suggestions:
1. Make it a fait accompli. As I’ve mentioned before, a reader is much more likely to accept a farfetched narrative development if the characters take it for granted. Usually, this means putting the weakest link in your story offstage. My favorite example is from Some Like It Hot, an incredibly contrived movie that shrewdly refuses to show the most crucial moment in the entire plot: instead of giving us a scene in which the main characters decide to go on the run in drag, it just cuts to the two of them already in skirts, rushing across the platform to catch a train to Florida. The lesson, clearly, is that if something in your story is obviously impossible, it’s better to pretend that it’s already happened. And the best strategy of all is to push the most implausible element of your story outside the boundaries of the plot itself, so it’s already in place before the story begins, which is what I’ve previously called the anthropic principle of fiction. If the viewer doesn’t see something happen, it requires an additional mental effort to rewind the story to object to it, and by then, the plot and characters have already moved on. It’s best to make like Jack Lemmon in heels, and just run with it.
2. Tell, don’t show. Normally, we’re trained to depict a turning point in the action as vividly as possible, and are taught that it’s bad form to describe an important moment indirectly or leave it offstage. When it comes to a weak point in the plot, however, that sort of scrutiny can only raise questions. It’s smarter, instead, to break a cardinal rule of fiction and get it out of the way as unobtrusively as possible. An implausible conversation, for instance, might best be rendered as indirect dialogue, leaving readers to fill in a more convincing version themselves. And if you can’t dramatize something in a credible fashion, it might be best to summarize it, in the way dramatists use the arrival of a messenger to convey developments that would be impossible to stage, although it’s best to keep this sort of thing as short as you can. There’s a particularly gorgeous example in the novel The Silence of the Lambs. Thomas Harris shows us Hannibal Lecter’s escape from federal security in loving detail, but when it comes to the most astonishing element of his getaway—the fact that he peels off another man’s face and wears it like a mask—he lets Jack Crawford describe it after the fact in a few terse sentences. It’s still hard to buy, but it’s more acceptable than if he’d allowed us to question the action as it unfolded.
3. Use misdirection. The secret of sleight of hand is that the audience’s eye is naturally drawn to action, humor, and color, allowing the performer to conduct outrageous manipulations in plain sight. Similarly, when a story is engaging enough from moment to moment, the reader is less likely to object to inconsistencies and plot holes. A film like Inception, for instance—which is probably my favorite movie of the last fifteen years—is riddled with logical problems, and even the basic workings of its premise aren’t entirely consistent, but we’re having too much fun to care. In some ways, this is the most important point of all: when a work of art is entertaining, involving, and emotionally true, we’re more likely to forgive the moments when the plot creaks. Some of our greatest books and movies, like Vertigo, are dazzling precisely because they expend a huge amount of effort to convince us of premises that, if the artist proceeded with uncompromising logic, would never make it past a rough draft. Just remember, as Aristotle pointed out, that a convincing impossibility is preferable to an unconvincing possibility, and take it from there.
5. Looper. As Rian Johnson’s online commentary for the movie makes clear, this is the ultimate rarity: a labor of love, developed over the course of a decade, that is immediately accessible and exciting, and which knows how to tell a complicated story in quick, economical strokes. The montage that follows one character’s life over three decades may be the year’s single most bravura sequence, and although Johnson isn’t quite as good at shooting action as he is at conceiving a twisty plot, that’s a minor flaw in an otherwise remarkably assured and singular movie. In a year in which Prometheus and John Carter confused effects with storytelling, this was a small masterpiece of grounded science fiction, and should stand as an example for many filmmakers to come.
4. Flight. In some ways, this was the movie that made me most hopeful for the future of Hollywood. It isn’t an independent film or the work of a visionary auteur: rather, it’s a solid mainstream picture, based on an ambitious original screenplay, with a major star and superb director willing to tackle thorny, uncomfortable issues of character and ethical choice. Above all, it’s a slick, entertaining movie for adults that puts technology at the service of a story that comes across as old-fashioned in its belief in narrative, performance, and big moral themes. The fact that it was brought in on a modest budget and enjoyed considerable popular and critical acclaim only underlines that this is the kind of movie that studios can, and should, be making all the time.
3. The Master. At this point, Paul Thomas Anderson has evolved into a director of such peculiar, hermetic intensity that it would almost be more surprising if he delivered a movie that wasn’t so elliptical, mysterious, and deeply strange. Yet for all its apparent shapelessness, it delivers more scenes, moments, and images that linger in my memory than any other movie I’ve seen all year, as rendered by Mihai Mălaimare, Jr.’s gorgeous cinematography, which was scandalously denied an Oscar nomination. Anchoring it all is the ravaged presence of Joaquin Phoenix, who gives what I emphatically believe is the year’s best performance: secretive, violent, and tender, with an Easter Island face that speaks more eloquently than any dialogue ever could.
2. Life of Pi. The year’s most technically astounding movie is bound to be diminished on the small screen, but its achievement goes far beyond the most lifelike special effects I’ve ever seen. As a director, Ang Lee is both hungry for new challenges and capable of doing almost anything, and he indulges in a great deal of delicious trickery—changing the aspect ratio, playing with the visual possibilities that 3D affords—without losing sight of the story’s underlying humanity. The ending is heartbreaking and inevitable, and cuts deeper than it seems at first glance: this isn’t just a love letter to the possibilities of digital filmmaking, but a meditation on the meaning and morality of storytelling itself.
1. The Dark Knight Rises. Take this, if you will, as the testimony of an avowed Christopher Nolan fanboy, but even after the inevitable backlash and nitpicking—which I don’t think is altogether warranted—I still think that this is the movie of the year. Much of its power is inseparable from the beauties of the IMAX format, which Nolan and his collaborators employ as it has never been used before, to tell the epic story of an entire city on a massive scale. Even on Blu-ray, however, its pleasures remain considerable: Bane’s voice still rumbles menacingly, and it has a more shapely, satisfying story than any of its predecessors, to the point where I’d argue that it’s a stronger picture than The Dark Knight—which makes it the best comic book movie ever made. And at a moment when superheroes seem to outnumber ordinary mortals at the multiplex, it’s an achievement that I suspect will only look better with time.
Note: For an explanation of some of this list’s more glaring omissions, please see here.
10. The Raid: Redemption. In a year in which the issue of media violence returned to dominate the national conversation, this was the most violent movie of all, with more than an hour and a half of the most graphic combat and bloodshed imaginable. Yet it’s curiously thrilling, a member of a long line of martial-arts movies that space out scenes of bone-crunching combat with the regularity of dance numbers in a musical. At times, it’s more exhausting than exhilarating, with huge reserves of energy and invention devoted to the barest of B-movie storylines, but it still finds time for displays of old-fashioned charisma—in the form of future superstar Iko Uwais—and even a cops-and-gangsters plot with a few satisfying payoffs. There’s an American remake on the horizon, but I’ll only see it if they cast all the principal parts with the stars of The Departed.
9. Skyfall. It isn’t quite on the same level as Casino Royale, which remains the best of all the Bond movies, but director Sam Mendes still manages to assemble the most striking series of images around the idea of Bond that the series has ever seen. Its major weakness is its villain, who is introduced in memorable fashion but whose plan turns out to be depressingly uninteresting, and it fumbles a number of big moments, notably the revelation of Naomie Harris’s true identity. Still, this is a big, satisfying entertainment that finally completes the most protracted reboot in recent cinematic history, and even as it ties a bow on the franchise, it honors its past, thanks in large part to its dynamite opening credits and theme song, which I find myself humming on a daily basis.
8. Moonrise Kingdom. One of Wes Anderson’s greatest strengths has always been his insight into the inner life of children—or of adults who behave like overgrown kids—and in twelve-year-old Sam and Suzy, he’s finally found the perfect pair he’s been seeking for his entire career. None of the adults, aside from Bob Balaban’s narrator, are drawn with the same level of vividness or affection, but perhaps it doesn’t matter: I see myself in these kids, and it’s clear that Anderson does as well. As always, his work is lavish with gags and visual puns, but what sticks with you is its tone of melancholy sweetness, and I won’t soon forget the image of those three brothers, in their pajamas, gathered around a Fisher Price turntable to listen to The Young Person’s Guide to the Orchestra. (It also has my favorite line reading of the year: “Where’s my record player?”)
7. The Cabin in the Woods. Of the two films from Joss Whedon’s miracle year, I suspect that this one will last the longest, since it’s the kind of movie that seems destined to be rediscovered by successive generations of passionate fans. It’s a savage deconstruction of slasher clichés—and arguably pursues the “zombie redneck torture family” trope a bit too monotonously—but also a love letter to the possibility of film, and reminds us how timid most movies really are. Above all, as a film that needs to be seen with as little advance knowledge as possible, it’s a short object lesson on the nature of surprise, and on how mechanical shocks have largely taken the place of the real thing. It’s likely to become a movie, like Psycho or Citizen Kane, in which the twists have passed into cultural currency, so if it’s still unspoiled for you, you owe it to yourself to see it now.
6. Wreck-It Ralph. Of all recent animated films, this is the one—far more than the wretched Brave, a movie I dislike all the more as time goes on—that makes me hopeful about the future of the medium. It’s an unabashedly mainstream movie, designed to appeal to all quadrants, with jokes that alternate between ingenious and obvious, but it’s also fun, colorful, tremendously appealing, and blessed with a script that keeps surprising us on the levels of both plot and character. Like Toy Story or Who Framed Roger Rabbit, it takes a premise that could easily have turned into a commercial for itself and transforms it into something touching, weird, and undefinable. And it’s even better when paired with the wonderful short Paperman, which blends traditional and computer animation with a sense of grace that points the way forward for an entire art form.
Tomorrow: My top five movies of the year.
For as long as I can remember, the movies have been a huge part of my life. Growing up, I raided my parents’ videocassette collection on a regular basis, and may have been the only eleven-year-old in my fifth grade class whose favorite movie was 2001: A Space Odyssey. In high school, I was lucky enough to live only a short train ride away from the wonderful UC Theater in Berkeley, and spent many a wonderful weekend there taking in a double feature. (The only time I ever cut class on purpose was to catch an afternoon screening of Last Tango in Paris.) I continued this tradition in college, with countless visits to the Brattle and the Harvard Film Archive, and in New York I had an embarrassment of riches at the Film Forum, Lincoln Center, Landmark Sunshine, the Ziegfeld, and even the Angelika, despite its awful seats, screens, and location above a rumbling subway line. Chicago, meanwhile, offered the Music Box, Landmark Century, and many others. As a result, for the past fifteen years, I’ve probably averaged a movie a week, and sometimes more.
And although I’ve tried to make my mark in a different sort of art form, I’ve learned a tremendous amount as a storyteller from the movies, to the point where I sometimes feel, to misquote Ishmael, that the local movie theater was my Yale College and my Harvard. Part of the reason is that the movies allow us to experience a wide range of styles and subjects more quickly than a lifetime of reading: I can watch most of the movies on the Sight & Sound poll in the time it takes me to read War and Peace. The movies have omnivorously stolen whatever useful tricks were available from the other arts, and have raided the literary corpus for stories, often transforming them in fascinating ways. Of course, there’s some danger in taking the lessons of cinema too literally: a novel isn’t a movie, and both forms are capable of effects that can’t be achieved in the other. As a novelist, I have far more control over the finished product than I would as a director or screenwriter. But there’s no doubt that the play of my imagination on the page has been deeply shaped by my love of such filmmakers as Kubrick and the Archers, to the point where I can only echo John Irving: “When I feel like being a director, I write a novel.”
Which is why one of the hardest adjustments I’ve had to make as the father of a newborn baby centers on the fact that I’ll no longer be able to go to the movies as often as I’d like. For someone who has long been used to seeing the latest releases on opening weekend, and plenty of art house and revival movies on a regular basis, this is a real shock. The last movie I saw on the big screen was The Hobbit, and I’m not sure when I’ll have the chance to catch another. This wouldn’t be as big of a deal if Beatrix had happened to arrive, say, in early February, when there isn’t much worth seeing in any case. But as luck would have it, she was born at the height of Oscar season, which means I haven’t been able to see a wide range of movies that I otherwise would have caught on opening day: I haven’t seen Django Unchained or Silver Linings Playbook or Zero Dark Thirty or Amour or Les Misérables or even This is 40. I owe Christopher McQuarrie a personal apology for failing to at least check out Jack Reacher. And in a year that was already shaping up to be one of the best for popular filmmaking in a long time, it’s a loss that I feel deeply.
Nevertheless, beginning tomorrow, I’ll be counting down my ten favorite movies of the year, as I’ve done every year since starting this blog, despite the fact that the list will contain a number of startling omissions. At first, I was tempted to skip this year’s ranking, or to hold off on the outside chance that I’d at least see a couple of the movies mentioned above before Oscar night. At the moment, this doesn’t seem likely, so I’m going ahead with what can only be seen as an incomplete pool of contenders. Yet even if you arbitrarily cut the movie year off in the middle of December, as I’ve effectively done, you’re still left with an extraordinary year for cinema, and especially for big popular movies—a better year, in some ways, than either of the two I’ve covered here in the past. As such, it was perhaps the best year imaginable for me to say goodbye to cinema, at least for now: I’ve missed a lot, but I feel blessed to have seen the movies I did. The ones I’ve been forced to omit will still be waiting for me when the time comes, even if I end up watching them months from now, at home, with a baby in my arms. And when I put it that way, it doesn’t sound so bad at all.
Quentin Tarantino was right to be mad. Last week, in an interview with the journalist Krishnan Guru-Murthy of Channel 4 News, Tarantino reacted testily when asked for his thoughts on the cultural impact of violence in the movies: “Don’t ask me a question like that. I’m not biting. I refuse your question…You can’t make me dance to your tune. I’m not a monkey.” And although Tarantino ultimately comes off, as he often does in his press appearances, as a bit of a dick, it’s hard to blame him. For most of his career, he’s found himself at the center of the debate over cinematic violence, despite the fact that most of his films, Kill Bill notwithstanding, aren’t nearly as violent as their reputations would imply. A movie like Pulp Fiction contains only a few seconds of actual violence, as opposed to the nonstop killing we see in many mainstream action films, so Tarantino’s irritation at being asked such questions again isn’t hard to understand. Yet while I don’t much feel like entering that particular discussion either, I think it’s worth asking why the same handful of works and artists are repeatedly invoked as illustrations of violence in the media, even as countless other, even more violent movies are quickly forgotten.
The statement by Wayne LaPierre of the NRA in the aftermath of the Sandy Hook massacre was stupefying on many levels, but especially with regard to the movies he mentioned, which were limited to “blood-soaked slasher films like American Psycho and Natural Born Killers.” Setting aside the fact that neither is a slasher film, or even particularly graphic in its onscreen violence, it’s a little odd that the more recent of the two movies he decries is more than a decade old, when dozens of objectively more violent films have been released in the meantime. Clearly, these movies, both of which I admire with reservations, aren’t disposable or forgettable: they’re ambitious, stylish, problematic films that implicate us as much as the characters, and many viewers still haven’t gotten over it. Most audiences, it seems, handle cinematic bloodshed in much the same way as I’ve noted they deal with surprises. They don’t mind being surprised, or shown graphic violence, in the context of a genre they understand, but when their assumptions about a work of art are called into question—or if it makes them uncomfortable—they feel what Pauline Kael, Tarantino’s favorite movie critic, observed all those years ago about Bonnie and Clyde:
Though we may dismiss the attacks with “What good movie doesn’t give some offense?,” the fact that it is generally only good movies that provoke attacks by many people suggests that the innocuousness of most of our movies is accepted with such complacence that when an American movie reaches people, when it makes them react, some of them think there must be something the matter with it—perhaps a law should be passed against it.
Of course, the kind of violence that really shakes and infuriates an audience is very rare, which is why LaPierre had to reach so far back in time for his examples. For most works of art, violence functions for the artist much as smoking does for actors. The reason there’s so much smoking in movies isn’t that Hollywood is determined to glamorize tobacco use, or is somehow in the pocket of the cigarette companies, but that smoking is a tremendously useful tool for performers, who are always looking for something to do with their hands: it gives them a wide range of ways to emphasize lines or emotional beats, and no comparable bit of business has managed to take its place. Similarly, violence is a proven, replicable way of provoking a reaction from the audience, and it doesn’t require much skill to pull off. Suspense in itself is tremendously hard to achieve, but putting a pistol in a character’s hand is easy, and in an art form starved for reliable tricks, it isn’t surprising that filmmakers often turn to violence for dramatic effects. Movies don’t glorify violence; they glorify the narrative jolts that violence can provide. When a movie resorts to periodic bursts of violence to keep the audience awake, it’s simply following Raymond Chandler’s dictum: “When in doubt, have two guys come through the door with guns.”
And I’m no exception. I’ve noted before how I inserted a violent scene into the first part of The Icon Thief because I felt that the story was lacking a necessary action beat, and I’ve often found myself parceling out moments of violence throughout my novels—which tend to have a pretty high body count—as if laying in dance numbers in a musical. I feel justified in doing this because it’s one of the conventions of suspense fiction, and I’d like to believe that the violence in my novels is at least inventive, powerful, and integral to the plot. But I have misgivings about it as well, if only because I see all too clearly how violence can become a crutch, a way of artificially propping up a story that lacks organic excitement. Here, as in everything else, it all comes down to craft. When I think about the works of art that will be experienced by my daughter Beatrix—who, after all, was named after a character in a Tarantino movie—I find that I’m less worried about her seeing violent films than about her settling for movies that use violence as a substitute for craftsmanship. The problem isn’t violent movies; the problem is bad movies of any kind. And the only way to discourage mindless violence is to honor those artists who use it mindfully and well.