Posts Tagged ‘Martin Scorsese’
Writing the vegetables
In the huge interview with Empire that I recommended earlier this week, Christopher McQuarrie shares a story from the editing of Mission: Impossible—Rogue Nation. McQuarrie and Tom Cruise had assembled a rough cut of the entire movie, and it wasn’t playing well. To be fair, it never does, especially when it includes a lot of unfinished visual effects, but what they were seeing left them particularly depressed, and after watching the first half, they walked outside to get some air and brace themselves for the rest. (McQuarrie refers to it as a “Cut me, Mick” moment, and anyone who has dreaded going back to a troubled project can probably relate.) McQuarrie describes what happened next:
We went back in and sat down and Eddie [Hamilton] had cut together a big chunk of the second half of the movie. And we got to the moment—no music in it, nothing, total rough cut—and [Ilsa] said: “Come away with me.” Tom and I looked at each other, and we’re like, “Do you feel that? That kind of worked! That was actually good!” And then there was the scene in the safe house when they’re all fighting with each other, and that was working. All of a sudden, we were looking at it and going, “You know, all the vegetables of the movie are actually tracking. They’re actually playing really well. It’s all the action that’s not worked out yet.”
McQuarrie quickly moves on, but the notion of a story’s “vegetables”—the scenes that exist to get from one high point to another—stuck with me, along with the idea that you can evaluate a work in progress by keeping an eye on those interstitial scenes.
On some level, this seems to run contrary to one of the central tenets of storytelling, which is that if you nail the big moments and don’t actively screw anything up, the rest will take care of itself. (As Howard Hawks put it: “A good movie is three great scenes and no bad scenes.”) And in practice, viewers or readers will forgive almost anything if a story delivers when it counts. But the vegetables are important, too—to facilitate the climaxes, as worthwhile scenes in themselves, and as a kind of index of the whole. I’ve noted elsewhere that the famous moments that we remember rely on the surrounding material to have an impact. Revealingly, such scenes rarely, if ever, come at the very beginning, which is when writers feel the most pressure to start off with a bang. That only underscores the extent to which the big moments depend on context and preparation. The pattern holds throughout the story. A novel or movie that consists of just one high point after another is likely to be exhausting, while one that conceives of itself as a delivery system for awesome moments may fall flat whenever something amazing isn’t happening. To some extent, this is a matter of personal taste. I gave up on Game of Thrones in part because of its tendency to sag between character deaths, while I never got tired of Mad Men, which was made up of countless tiny but riveting choices that gained power from their cumulative impact. The most reasonable approach, unless you’re Matthew Weiner, is a deliberate balance in which the quieter scenes enable the more conventionally exciting sequences. The vegetables may not be the main attraction, but they play the same role in a story that aromatics like onions and garlic do in cooking. They add flavor and bind the rest together.
The vegetables can also be tasty in themselves. A few weeks ago, I finally saw Hamilton onstage, and my big takeaway was how good the second act is—it’s just one great song after another. Yet on paper, it also consists mostly of vegetables, with characters talking about politics or setting up information that will pay off later on. You can see this clearly in “Take a Break,” a purely functional song that exists solely to establish the fact that Hamilton is away from his family, but is so lovingly written and performed that it becomes a showstopper. Even better is “The Election of 1800,” which just moves the political pieces around, but thrills me to no end. (I love it in part because it reminds me of Evita, which is nothing but vegetables, but so cleverly delivered that we don’t even notice. And neither musical could exist, at least not at this level of success, if they hadn’t found solutions to the problem of treating politics in song.) You may not notice such functional scenes on your first encounter, or even your tenth, but the more you listen to a soundtrack or watch a movie, the more they stand out. They’re often the ones that I end up revisiting the most, in part because they can’t take our attention for granted, so they have to exist at a high level of craft. I’ve read the novel The Silence of the Lambs maybe ten times, but the one chapter that I never tire of reading is the one in which Clarice Starling searches the storage unit that might hold the key to an unsolved murder. It really only exists to get the plot to the next stage, but Harris enriches it with countless lovely touches, like how the resourceful Clarice fixes a stuck lock with a few drops of oil from a dipstick, or how she uses the jack from her car to lever up the rusty door. And you really start to appreciate this sort of scene when you notice its total absence from Hannibal Rising.
For a writer, the best thing about vegetables, as well as a potential pitfall, is that you can always find ways of improving them, which isn’t always true of the big moments. Novelists may not be in the same position as filmmakers who have to wait for special effects to be rendered, but if you’ve ever written a novel, you know that you eventually stop seeing the scenes that made you want to write it in the first place. You’ve read them so many times that they become invisible, and it can be hard to look past your preconceptions to see what’s actually on the page. With purely functional scenes, it’s easy to retain your detachment, and you can keep tinkering with them even when you lack the energy to tackle larger issues. Ideally, the vegetables can even serve as a gauge of quality, as they did with McQuarrie and Cruise: if the small stuff is working, there’s reason to hope that the big stuff is, too. But proportionality also matters, and endless fiddling with minor details can blind you to a scene’s true importance. (Martin Scorsese threatened to take his name off Raging Bull because he couldn’t hear a background character ordering a Cutty Sark in a bar.) Fretting too much over the vegetables can turn into procrastination, or a form of avoidance. As Carl Richards of the New York Times points out, it’s when you’re looking for excuses to avoid moving to the next stage that you seize on finicky little items: “What color should the logo be?” “I can’t find an agent.” “It could use another round of edits.” “I’m not sure what font to use.” That’s when the vegetables tend to call to you the most. The best approach is to harness this impulse to polish the small parts until they shine, while keeping it under control so that you don’t lose sight of the overall picture. Vegetables in a story are good for you. But you don’t want to neglect the meat.
The steady hand
Forty years ago, the cinematographer Garrett Brown invented the Steadicam. It was a stabilizer attached to a harness that allowed a camera operator, walking on foot or riding in a vehicle, to shoot the kind of smooth footage that had previously only been possible using a dolly. Before long, it had revolutionized the way in which both movies and television were shot, and not always in the most obvious ways. When we think of the Steadicam, we’re likely to remember virtuoso extended takes like the Copacabana sequence in Goodfellas, but it can also be a valuable tool even when we aren’t supposed to notice it. As the legendary Robert Elswit said recently to the New York Times:
“To me, it’s not a specialty item,” he said. “It’s usually there all the time.” The results, he added, are sometimes “not even necessarily recognizable as a Steadicam shot. You just use it to get something done in a simple way.”
Like digital video, the Steadicam has had a leveling influence on the movies. Scenes that might have been too expensive, complicated, or time-consuming to set up in the conventional manner can be done on the fly, which has opened up possibilities both for innovative stylists and for filmmakers who are struggling to get their stories made at all.
Not surprisingly, there are skeptics. In On Directing Film, which I think is the best book on storytelling I’ve ever read, David Mamet argues that it’s a mistake to think of a movie as a documentary record of what the protagonist does, and he continues:
The Steadicam (a hand-held camera), like many another technological miracle, has done injury; it has injured American movies, because it makes it so easy to follow the protagonist around, one no longer has to think, “What is the shot?” or “Where should I put the camera?” One thinks, instead, “I can shoot the whole thing in the morning.”
This conflicts with Mamet’s approach to structuring a plot, which hinges on dividing each scene into individual beats that can be expressed in purely visual terms. It’s a method that emerges naturally from the discipline of selecting shots and cutting them together, and it’s the kind of hard work that we’re often tempted to avoid. As Mamet adds in a footnote: “The Steadicam is no more capable of aiding in the creation of a good movie than the computer is in the writing of a good novel—both are labor-saving devices, which simplify and so make more attractive the mindless aspects of creative endeavor.” The casual use of the Steadicam seduces directors into conceiving of the action in terms of “little plays,” rather than in fundamental narrative units, and it removes some of the necessity of disciplined thinking beforehand.
But it isn’t until toward the end of the book that Mamet delivers his most ringing condemnation of what the Steadicam represents:
“Wouldn’t it be nice,” one might say, “if we could get this hall here, really around the corner from that door there; or to get that door here to really be the door that opens on the staircase to that door there? So we could just move the camera from one to the next?”
It took me a great deal of effort and still takes me a great deal and will continue to take me a great deal of effort to answer the question thusly: no, not only is it not important to have those objects literally contiguous; it is important to fight against this desire, because fighting it reinforces an understanding of the essential nature of film, which is that it is made of disparate shots, cut together. It’s a door, it’s a hall, it’s a blah-blah. Put the camera “there” and photograph, as simply as possible, that object. If we don’t understand that we both can and must cut the shots together, we are sneakily falling victim to the mistaken theory of the Steadicam.
This might all sound grumpy and abstract, but it isn’t. Take Birdman. You might well love Birdman—plenty of viewers evidently did—but I think it provides a devastating confirmation of Mamet’s point. By playing as a single, seemingly continuous shot, it robs itself of the ability to tell the story with cuts, and it inadvertently serves as an advertisement of how most good movies come together in the editing room. It’s an audacious experiment that never needs to be tried again. And it wouldn’t exist at all if it weren’t for the Steadicam.
But the Steadicam can also be a thing of beauty. I don’t want to discourage its use by filmmakers for whom it means the difference between making a movie under budget and never making it at all, as long as they don’t forget to think hard about all of the constituent parts of the story. There’s also a place for the bravura long take, especially when it depends on our awareness of the unfaked passage of time, as in the opening of Touch of Evil—a long take, made without benefit of a Steadicam, that runs the risk of looking less astonishing today because technology has made this sort of thing so much easier. And there’s even room for the occasional long take that exists only to wow us. De Palma has a fantastic one in Raising Cain, which I watched again recently, that deserves to be ranked among the greats. At its best, it can make the filmmaker’s audacity inseparable from the emotional core of the scene, as David Thomson observes of Goodfellas: “The terrific, serpentine, Steadicam tracking shot by which Henry Hill and his girl enter the Copacabana by the back exit is not just his attempt to impress her but Scorsese’s urge to stagger us and himself with bravura cinema.” The best example of all is The Shining, with its tracking shots of Danny pedaling his Big Wheel down the deserted corridors of the Overlook. It’s showy, but it also expresses the movie’s basic horror, as Danny is inexorably drawn to the revelation of his father’s true nature. (And it’s worth noting that much of its effectiveness is due to the sound design, with the alternation of the wheels against the carpet and floor, which is one of those artistic insights that never grows dated.) The Steadicam is a tool like any other, which means that it can be misused. It can be wonderful, too. But it requires a steady hand behind the camera.
Hollywood confidential
Curtis Hanson, who died earlier this week, directed one movie that I expect to revisit endlessly for the rest of my life, and a bunch of others that I’m not sure I’ll ever watch again. Yet it’s those other films, rather than his one undisputed masterpiece, that fascinate me the most. L.A. Confidential—which I think is one of the three or four best movies made in my lifetime—would be enough to secure any director’s legacy, and you couldn’t have blamed Hanson for trying to follow up that great success with more of the same. Instead, he delivered a series of quirky, shaggy stories that followed no discernible pattern, aside from an apparent determination to strike out in a new direction every time: Wonder Boys, 8 Mile, In Her Shoes, Lucky You, Too Big to Fail, and Chasing Mavericks. I’ve seen them all, except for the last, which Hanson had to quit halfway through after his health problems made it impossible for him to continue. I’ve liked every single one of them, even Lucky You, which made about as minimal an impression on the world as any recent film from a major director. And what I admire the most about the back half of Hanson’s career is its insistence that a filmmaker’s choice of projects can form a kind of parallel narrative, unfolding invisibly in the silences and blank spaces between the movies themselves.
There comes a point in the life of every director, in fact, when each new film is freighted with a significance that wasn’t there in the early days. Watching Bridge of Spies recently, I felt heavy with the knowledge that Spielberg won’t be around forever. We don’t know how many more movies he’ll make, but it’s probably more than five and fewer than ten. As a result, there’s a visible opportunity cost attached to each one, and a year of Spielberg’s time feels more precious now than it did in the eighties. This sort of pressure becomes even more perceptible after a director has experienced a definitive triumph in the genre for which he or she is best known. After Goodfellas, Martin Scorsese seemed anxious to explore new kinds of narrative, and the result—the string of movies that included The Age of Innocence, Kundun, Bringing Out the Dead, and Hugo—was sometimes mixed in quality, but endlessly intriguing in its implications. Years ago, David Thomson wrote of Scorsese: “His search for new subjects is absorbing and important.” You could say much the same of Ridley Scott, Clint Eastwood, or any number of other aging, prolific directors with the commercial clout to pick their own material. In another thirty years or so, I expect that we’ll be saying much the same thing about David Fincher and Christopher Nolan. (If a director is less productive and more deliberate, his unfinished projects can end up carrying more mythic weight than most movies that actually get made, as we’re still seeing with Stanley Kubrick.)
Hanson’s example is a peculiar one because his choices were the subject of intense curiosity, at least from me, at a much earlier stage than usual. This is in part because L.A. Confidential is a movie of such clarity, confidence, and technical ability that it seemed to herald a director who could do just about anything. In a way, it did—but not in a manner that anyone could have anticipated. Hanson’s subsequent choices could come off as eccentric, and not after the fashion of Steven Soderbergh, who settled into a pattern of one for himself, one for the masses. The movies after Wonder Boys are the work of a man who was eager to reach a large popular audience, but not in the sense his fans were expecting, and with a writerly, almost novelistic approach that frustrated any attempt to pin him down to a particular brand. It’s likely that this was also a reflection of how hard it is to make a modestly budgeted movie for grownups, and Hanson’s filmography may have been shaped mostly by what projects he was able to finance. (This also accounts for the confusing career of his collaborator Brian Helgeland, who drifted after L.A. Confidential in ways that make Hanson seem obsessively focused.) His IMDb page was littered with the remains of ideas, like an abortive adaptation of The Crimson Petal and the White, that he was never able to get off the ground. His greatest accomplishment, I suspect, was to make the accidents of a life in Hollywood seem like the result of his own solitary sensibilities.
Yet we’re still left with the boundless gift of L.A. Confidential, which I’ve elsewhere noted is the movie that has had the greatest impact on my writing life. (My three published novels are basically triangulations between L.A. Confidential, Foucault’s Pendulum, and The Day of the Jackal, with touches of Thomas Harris and The X-Files, but it was Hanson, even more than James Ellroy, who first taught me the pleasures of a triple plot.) It has as many great scenes as The Godfather, and as deep a bench of memorable performances, and it’s the last really complicated story that a studio ever allowed itself. When you look at the shine of its images and the density of its screenplay, you realize that its real descendants can be found in the golden age of television, although it accomplishes more in two and a half hours than most prestige dramas can pull off in ten episodes. It’s a masterpiece of organization that still allows itself to breathe, and it keeps an attractive gloss of cynicism while remaining profoundly humane. I’m watching it again as I write this, and I’m relieved to find that it seems ageless: it’s startling to realize that it was released nearly two decades ago, and that a high school student discovering it now will feel much as I did when I saw Chinatown. When it first came out, I was almost tempted to undervalue it because it went down so easily, and it took me a few years to recognize that it was everything I’d ever wanted in a movie. And it still is—even if Hanson himself always seemed conscious of its limitations, and restless in his longing to do more.
My alternative canon #5: The Last Temptation of Christ
Note: I’ve often discussed my favorite movies on this blog, but I also love films that are relatively overlooked or unappreciated. Over the next week and a half, I’ll be looking at some of the neglected gems, problem pictures, and flawed masterpieces that have shaped my inner life, and which might have become part of the standard cinematic canon if the circumstances had been just a little bit different. You can read the previous installments here.
With the passage of time, most of the great scandals of film history start to feel positively quaint, but I don’t think there’s any doubt that if The Last Temptation of Christ were released again today, it would be the most controversial movie of its year. Even if you were to subtract its most obviously inflammatory scenes—the early sequence of Jesus as a crossmaker, the fantasy of his marriage to Mary Magdalene—you’d be left with a work of art that commits the ultimate sin of religious cinema: it engages the message of Jesus on its own terms, rather than as a series of sedate picture postcards. As studies like The Five Gospels and The Acts of Jesus make clear, one of the few things we can say for sure about Jesus of Nazareth is that many of those around him believed that he was insane, and when we watch Willem Dafoe in the title role, we can begin to remember why. This isn’t to say that I necessarily regard Scorsese’s, or Kazantzakis’s, vision as historically accurate: the idea of Jesus as a failed revolutionary who finally came to terms with his divinity makes for a nice three-act structure, but I’m not sure if it’s sustained by a close reading of the gospels. But the movie’s agonized effort to reimagine the most familiar story in the western tradition is unbelievably important. It’s the only Biblical movie I’ve ever seen that tries to stage these events as if they were happening for the first time, and the experience of watching it forces us, at every turn, to confront the strangeness of what it might mean to be both fully human and fully divine. The movie never doubts the divinity of Jesus: it’s Jesus himself who does.
And the fact that this film exists at all is something of a miracle. It was Scorsese’s second attempt to adapt Kazantzakis’s novel, and you can tell that it was shot on a shoestring. If it succeeds far more often than we’d have any right to expect, it’s thanks largely to the script by Paul Schrader, which is the best he ever wrote. (Among other things, it’s often genuinely funny, which is incredible in itself.) It’s full of fine performances, including a nice little cameo by Irvin Kershner, but my favorite is Harvey Keitel as Judas Iscariot, a role that is inevitably charged by our knowledge of the actor’s history with his director: in the scene in which the aging Judas accuses Jesus of having abandoned his mission, Keitel asked to deliver the speech to Scorsese, who was lying just out of the frame. It may not be my favorite Scorsese movie—these days, it’s a tossup between Taxi Driver, Casino, and The Departed—but it’s the one that continues to mean the most to me. I’ve watched it many times, and it rarely fails to move me to tears, although never in the same place twice. These days, the moment that haunts me the most comes after a beautiful young angel has taken Jesus down from the cross, inviting him to look at the world with fresh eyes: “Maybe you’ll find this hard to believe, but sometimes we angels look down on men and envy you. Really envy you.” The angel, of course, turns out to be Satan. And the movie’s central accomplishment is that it makes the last temptation, with its vision of an ordinary life, seem very tempting indeed, which only reminds us of the courage required for any man to reject it for something more.
Cutty Sark and the semicolon
In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:
By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”
I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that most authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God in the Sistine Chapel.
And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:
“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”
It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.
But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here’s one of my favorite stories about the making of Raging Bull:
One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.
And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.
But there’s another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was jailed for stabbing his second wife Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, “little crossed communications from some wistful outpost of my mind,” that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:
I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.
Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.
My ten great movies #1: The Red Shoes
Like all great films, but much more so, The Red Shoes—which I think is the greatest movie ever made—works on two levels, as both a story of life and a story of film. As the latter, it’s simply the most inventive movie ever made in Technicolor, second only to Citizen Kane in its abundance of tricks and flourishes. These range from small cinematic jokes (like its use of the scrolling title Forty-five minutes later, subsequently borrowed by Scorsese in The Aviator, to indicate the passage of time within a single shot) to effects of unforgettable emotional power (like the empty spotlight on the stage in the final scene). It’s the definitive work by a pair of filmmakers who had spent the previous decade on an unparalleled streak, making more great films in ten years than five ordinary directors could produce in an entire career. And The Red Shoes was the movie they had been building toward all along, because along with everything else, it’s the best film we have about the artistic process itself.
And even here, it works on multiple levels. As a depiction of life at a ballet company, it may not be as realistic as it seems—Moira Shearer, among others, has dismissed it as pure fantasy—but it feels real, and it remains the most romantic depiction of creative collaboration yet captured on film. (It inspired countless careers in dance, and certainly inspired me to care deeply about ballet, an art form toward which I’d been completely indifferent before seeing this movie.) And as an allegory, it’s unsurpassed: Lermontov’s cruelty toward Vicky is really a dramatization of the dialogue between art and practicality that takes place inside every artist’s head. This may be why The Red Shoes is so important to me now: from the moment I first saw it, it’s been one of my ten favorite films, but over the years, and especially after I decided to become a writer, my love for it has increased beyond what I feel toward almost any other work of art. Yet Vicky’s final words still haunt me, as does Lermontov’s offhand remark, which stands as a permanent warning, and enticement, to artists of all kinds: “The red shoes are never tired.”
Birds of a feather
A while back, for the book Inventory by The A.V. Club, the director Paul Thomas Anderson shared his list of “Two movies that without fail or question will make me stop dead in my tracks and watch them all the way to the very end, no matter what else is happening or needs to get done.” The films were The Birdcage and The Shining. His second choice probably won’t raise many eyebrows—The Shining’s fingerprints are all over his work, particularly There Will Be Blood—but the first one might give us pause. Yet when I watched it over the weekend, I had no trouble seeing why Anderson finds it so appealing. There’s the astonishing opening shot, for instance, which zooms across the waters of South Beach and continues in an unbroken movement into the club where Robin Williams is greeting patrons and overseeing his floor show of drag queens. Among other things, it’s impossible not to see it as an influence on the opening tracking shot of Boogie Nights, which would come out the following year. (The cinematographer here, incidentally, was Emmanuel Lubezki, who would go on to do spectacular work for the likes of Terrence Malick and Alfonso Cuarón and win an Oscar for his indispensable contributions to Gravity.)
After almost twenty years, it’s fair to say that The Birdcage holds up as an unexpectedly rich, sophisticated slice of filmmaking. Like many of Anderson’s own films, it has a deep bench of supporting players anchored by a generous lead performance: I felt like watching it primarily as a reminder of how good Robin Williams could be with the right direction and material, and what stands out the most is his willingness to dial down his natural showiness to highlight the more flamboyant performances taking place on all sides. He’s essentially playing the straight man—well, sort of—to Nathan Lane and Hank Azaria, but his restrained energy and intelligence give all the actors around him an additional kick. Not surprisingly, for a movie directed by Mike Nichols from a script by Elaine May, it’s often subversively clever, like a Woody Allen film disguised as a studio crowdpleaser. Lane’s very first line is a reference to The Red Shoes, and the film is packed with nods to gay culture (like the way Lane’s show begins with the opening notes of “The Man Who Got Away,” à la Judy at Carnegie Hall) that probably went over the heads of much of its audience. But I don’t think I would have watched it nearly as attentively or affectionately without the clue from Anderson.
And Anderson clearly knew what he was doing. Whenever you’re asked to provide a list of your favorite movies or other works of art, there are several competing impulses at play: you’re torn between providing a list of major milestones, the films that speak to you personally, or simply the ones that you enjoy the most. There’s also an awareness that a surprising choice can be notable in its own right. After composing his final list for the Sight and Sound poll of the greatest movies of all time, Roger Ebert wrote:
Apart from any other motive for putting a movie title on a list like this, there is always the motive of propaganda: Critics add a title hoping to draw attention to it, and encourage others to see it. For 2012, I suppose [The Tree of Life] is my propaganda title.
Whether or not Anderson was thinking explicitly in these terms, there’s no question in my mind that he listed The Birdcage so prominently as a way of highlighting it in the reader’s mind. This is a great movie, he seems to be saying, that you may not have sufficiently appreciated, and listing it here without comment does more to lock it in the memory than any number of words of critical analysis.
That’s the real pleasure—and value—of lists like this, which otherwise can start to seem like pointless parlor games. We don’t learn much from the debates over whether Vertigo really deserves to be ranked above Citizen Kane, but it can be enlightening to discover that Quentin Tarantino’s favorite films include titles like The Bad News Bears, Dazed and Confused, Rolling Thunder, and Pretty Maids All in a Row. (Going through the Sight and Sound lists of great directors is like a miniature education in itself: after seeing that both Martin Scorsese and Francis Ford Coppola named Andrzej Wajda’s Ashes and Diamonds in their top ten, there’s no way I can avoid seeing this movie.) Once we’ve worked our way through the established canon, as determined by a sober critical consensus, the next step ought to be seeking out the movies that people we admire have singled out for love, especially when they take us down unexplored byways. After watching one movie through Anderson’s eyes, I wish he’d tossed out a few more titles, but maybe it’s best that he left us with those two. And the next time The Birdcage comes up on television, it’ll stop me dead in my tracks.
The riffs of Wall Street
Over the weekend, I watched The Wolf of Wall Street for the second time, and I came away with two thoughts: 1) I like this movie one hell of a lot. 2) It still feels about twenty minutes too long. And unlike Casino—a propulsive three-hour epic that I wouldn’t know where to trim—it’s easy to identify the scenes where the movie grows slack. Most of them, unfortunately, revolve around Jonah Hill, an actor whose performances I enjoy and who works mightily in the service of an unwieldy enterprise. Hill is a massively energetic presence and an unparalleled comic riffer, and Scorsese appears to have fallen in love with his talents to the point of grandfatherly indulgence. The scene in which Hill’s character delivers a briefcase of cash to Jon Bernthal, for instance, seems to go on forever at a point in the story when momentum is at a premium, mostly so Hill can deliver two or three inventively obscene tirades. It’s amusing, but it would have been just as good, or better, at half the length. And while for all I know, the entire scene might exist word for word in Terence Winter’s script, it certainly feels like an exercise in creative improvisation, and it caused me to reflect about the shifting role of improv in film, both in Scorsese’s work and in the movies as a whole.
Improv has been a part of cinema, in one way or another, since the days of silent film, and directors have often leaned on actors who were capable of providing great material on demand. (Sigourney Weaver says as much in Esquire’s recent oral history of Ghostbusters: “Bill [Murray] was kind of expected to come up with brilliant things that weren’t in the script, like day after day after day. Ivan [Reitman] would say, ‘All right, Bill, we need something here.’”) Given the expense of physical celluloid and repeated setups, though, it wasn’t simply a matter of allowing performers to riff on camera, as many viewers assume. More frequently, an actor would arrive on set with unscripted material that he or she had worked out privately or in rehearsal, and the version that ended up in the finished scene was something that had already gone through several rounds of thought and revision. Things began to change with the widespread availability of excellent digital cameras and the willingness of directors like Judd Apatow to let actors play off one another in real time, since tape was cheap enough to run nonstop at marginal additional cost. When the results are culled and chiseled down in the editing room, they can be spectacular, like catching lightning in a bottle, and the approach has begun to influence movies like The Wolf of Wall Street, which was shot on conventional film.
Like all good tricks, though, the improv approach gets tired after a while, and Apatow’s movies since Funny People and This Is 40 have offered diminishing returns. The trouble lies in a fundamental disconnect between improv, character, and situation. A line may be hilarious in the moment, but if the riffs don’t build into something that enhances our understanding of the people involved, they start to feel exhausting—or, worst of all, like the work of actors treading water in hopes of a laugh. We aren’t watching a story, but a collection of notions, and they’re at their weakest when they’re the most interchangeable. Hill’s riffs in The Wolf of Wall Street are funny, sure, but they’re only variations on the hyperaggressive, pointedly offensive rants that he’s delivered in countless other movies. By the time the film is over, we still aren’t entirely sure who his character is; there are intriguing hints of his weird personal life early on, but they’re mostly discarded, and the movie is too busy to provide us with anything like a payoff. Many of Hill’s big scenes consist of him hitting the same two or three beats in succession, and for a movie that is already overlong, I can’t help but wish that Scorsese and Thelma Schoonmaker had kept Hill’s one best take and saved the rest for the special features.
Of course, many of the most memorable lines and moments in Scorsese’s own filmography have arisen from improvisation—“You talkin’ to me?” in Taxi Driver, “I’m a clown? I amuse you?” in Goodfellas, the confrontation between De Niro and Pesci while fixing the television in Raging Bull—so it’s hard to blame him for returning to the same well. Yet comparing Hill’s work to those earlier scenes only exposes its emptiness. “You talkin’ to me?” and “I’m a clown?” are unforgettable because they tell us something about the characters that wasn’t there in the script, while The Wolf of Wall Street only tells us how inventively profane Jonah Hill can be. And this isn’t Hill’s fault; he’s doing what he can with an underwritten part, and he’s working with a director who seems more willing to linger on scenes that would have been pared down in the past. We see hints of this in The Departed, in which Scorsese allows Nicholson to ham it up endlessly—imitating a rat, smashing a fly and eating it—in his scene at the restaurant with DiCaprio, but there, at least, it’s a set of lunatic grace notes for a character that the screenplay has already constructed with care. Improv has its place in movies, especially in the arms race of modern comedy, which is increasingly expected to deliver laughs without a pause. But as in so many other respects, The Wolf of Wall Street is a warning about the dangers of excess.
The kindest cut
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “Has an ‘uncensored’ version of a familiar entertainment ever scandalized you?”
Earlier this year, there was a brief online furor over a report that Martin Scorsese had cut a few minutes of footage from The Wolf of Wall Street to get its rating down from an NC-17 to an R. Looking back, the initial response seems overblown—if there’s one thing that Wolf doesn’t need, it’s more graphic sex—but it’s easy to understand the reaction. Scorsese is both our most acclaimed living filmmaker and something like a national treasure, and he should presumably be allowed to release his movie in whatever form he sees fit. In the past, Scorsese’s struggles with the ratings board have resulted in some genuine losses: the original bloodbath that concludes Taxi Driver was desaturated in postproduction to avoid an X rating, and although the version we have plays just fine, I still wish we could see the vivid colors that the cinematographer Michael Chapman wistfully describes. Yet there’s also a part of me that believes that there’s a place for a system that requires filmmakers to pull ever so slightly back from their original intentions. Like it or not, less is often more, and sometimes it takes an arbitrary, borderline annoying set of cultural watchdogs to enforce that discipline, even at the cost of a frame or two.
This isn’t meant as a defense of the MPAA rating system, which is badly damaged: it ignores violence but panics at the slightest hint of sex, and it permanently destroyed our chances of a viable cinema for adults in the United States by its bungled rollout of the NC-17 rating. (As Roger Ebert pointed out at the time, it was a mistake to simply substitute the NC-17 for the X, which only transferred the existing stigma to a new category: the real solution would have been to insert a new A rating between R and X, allowing for adult content that fell short of outright pornography. Unfortunately, the revised system was allowed to stand, and there isn’t much of an incentive in this country for anyone to make a change.) But I’d also argue that the ratings have their place, within limits. We often end up with a more interesting cinema when directors are forced to work around the restrictions, pushing them to the limits of permissibility, than if they’re simply given a free pass. It wasn’t what the ratings board had in mind, but just as the Hays Code indirectly shaped the conventions of noir, you could argue that American movies have benefited from their puritanical streak—not in the blandness of the mainstream, but at the edges, where smart, subversive filmmakers skewed the rules in ways the censors never intended.
And it’s often the most imaginative and formally inexhaustible directors who benefit the most from such shackles. I’d rather watch Psycho again than Frenzy, and you can make a strong case that David Lynch—who at his best is the most interesting director of my lifetime—works better under constraints. I’ve written elsewhere of how Lynch was contractually obligated to produce a cut of Blue Velvet that was under two hours, which he and editor Duwayne Dunham delivered down to the minute. The result is nothing less than my favorite American movie, and although it lost close to an hour of footage in the process, the sacrifice was a crucial one: the deleted scenes featured on the recent Blu-ray release are fascinating, often wonderful, but including them would have left us with a movie that most of us would have been glad to watch once, like Inland Empire, rather than one I’ve wanted to experience again and again. Since then, Lynch has moved on, and his most recent work, shot on digital video without any eye to commercial appeal, seems designed to avoid any constraints whatsoever. And he’s earned the right. But I don’t know if he’ll ever make another movie like Blue Velvet.
Lynch also clearly benefited from the thematic constraints enforced by television. Twin Peaks gained much of its power from the fact that it had to operate within broadcast standards, and it was endlessly evocative precisely because it left so much to implication. (The difference between the original series and Fire Walk With Me is that between the intensity of restraint and its opposite.) Much the same is true of Mulholland Dr., the first two acts of which were originally a television pilot. And Wild at Heart, at least to my eyes, was actively improved by its television cut. When I first saw it, back when it was a real event for me to catch a movie like this on a broadcast channel, I loved it—it was sweet, sinister, colorful, and charged with perverse romance. A few years later, when I caught a screening of the full version at the late and lamented UC Theater in Berkeley, I was surprised to discover how much less I enjoyed it: it was uglier, more indulgent, and ultimately less true to its own conception. This is all very subjective, of course, but I still believe that the television cut retained most of what I love about Lynch while paring away the worst of his excesses. In its existing form, it feels ever more like a footnote, while the television cut is a minor masterpiece that I’d love to see again now. I only wish that I’d taped it.
The likability fallacy
As I’ve mentioned elsewhere, I’m at a point in my life—it’s called “fatherhood”—in which I can see maybe three or four films in theaters every year. My wife and I saw The Hobbit the week before our daughter was born, and since then, our moviegoing has been restricted to a handful of big event movies: Star Trek Into Darkness, Man of Steel, Gravity. In general, my criteria for whether a movie is worth catching on the big screen are fairly simple. It needs to be something that would be considerably reduced on television, which applies particularly to a film like Gravity: I loved it, and I plan to watch it again and again, but its impact won’t be nearly the same at home. Reviews count, as well as my own intangible excitement over a franchise, and beyond that, I tend to go with directors whose work has impressed in the past, which is why I know that the one movie I’ll definitely be seeing next year is Chris Nolan’s Interstellar. In other words, after a lifetime of seeking out strange and challenging movies in theaters, I’ve turned into something like a studio’s idea of the mainstream moviegoer, who tends to prefer known quantities to interesting gambles, and is happy to catch the rest on video. You can complain all you like about Hollywood’s reliance on sequels, remakes, and established properties, but when I look at my own choices as a movie lover with a limited amount of time, I can’t say it’s entirely wrong.
But if there’s a bright side to all this, it’s that it allows me to treat myself as a kind of guinea pig: I can take a hard look at my newfound conservatism as a moviegoer with what remains of my old analytical eye. So much of how Hollywood operates is based on a few basic premises about what audiences want, and as I’ve become less adventurous as a viewer, I’ve gotten a better sense of how accurate those assumptions—presumably based on endless focus group testing and box office analysis—really are. And I’ve come to some surprising conclusions. I’ve found, for instance, that star power alone isn’t enough to get me out of the house: I’m an unabashed Tom Cruise fan, but I still waited for Oblivion to arrive at Redbox. I don’t need a happy ending to feel that I’ve gotten my money’s worth, as long as a darker conclusion is honestly earned. And the one that I can’t repeat often enough is this: I’m not worried about whether I’m going to “like” the characters. Studios are famously concerned about how likable their characters are, and they get nervous about any project in which the lead comes off as unsympathetic. Industry observers tend to think in the same way. As a writer for Time Out recently said of the trailer for The Wolf of Wall Street: “Why should we give a damn about these self-absorbed, money-grubbing Armani-clad cretins and spend our money and time learning about their lives?”
Well, to put it mildly, I can think of a few reasons why, and they’re strong enough that The Wolf of Wall Street is the next, and probably last, movie this year that I expect will get me into theaters. Spending three hours in the company of an Armani-clad cretin seen through the eyes of Martin Scorsese strikes me as a great use of my money and time, and while I can’t speak for the rest of the world, the movie we’ve glimpsed so far looks sensational. Part of this, of course, is because Scorsese has proven himself so capable of engaging us in the lives of unlikable characters. I don’t think there’s a sympathetic face to be seen throughout all of Casino, one of the most compulsively watchable movies of all time, and Scorsese has always seemed more comfortable in the heads of the flawed and unredeemable: it’s the difference between Goodfellas and Kundun, or Raging Bull and Hugo, and even a sleek machine like Cape Fear comes off as an experiment in how thoroughly he can grip us without a likable figure in sight. But there’s a larger principle at work here, too. Scorsese, by consensus, operates at a consistently higher level than any other filmmaker of his generation, and if he’s drawn to such flawed characters, this probably tells us less about him personally than about the fact that his craft is powerful enough to get away with it. Likability wouldn’t be a factor if all movies were this good.
In other words, any fears over the protagonist’s likability are really an admission that something else is going wrong, either in story or execution: the audience doesn’t care about the characters not because they aren’t sympathetic enough, but because it hasn’t been given a reason to be invested on a deeper level. Trying to imbue the hero in a meaningless story with more likable qualities is like changing the drapes while the house is on fire, but unfortunately, it’s often all the studio can understand. As Shane Black notes in the excellent interview collection Tales From the Script:
Movie stars are gonna give you your best ideas, because they’re the opposite of development people. Development people are always saying, “How can the character be more likable?” Meanwhile, the actor’s saying, “I don’t want to be likable.” You know, they give you crazy things like, “I wanna eat spaghetti with my hands.” Crazy’s great. Anything but this sort of likable guy that everyone at the studio insists they should play.
“Make him more likable,” like “raising the stakes,” is a development executive’s dream note: it doesn’t require any knowledge of the craft of storytelling, and you won’t get fired for suggesting it. But let’s not mistake it for anything more. I don’t want my characters to be likable; I want them to be interesting. And if the characters, or the story around them, are interesting enough, it might even get me out of the house.
Quote of the Day
Cinema is a matter of what’s in the frame and what’s out.
—Martin Scorsese
Hugo and the ghost of Michael Powell
Martin Scorsese’s Hugo opens with an image that has long been central to this director’s work: a boy looking through a window at the world outside. As most fans know, this image is autobiographical—Scorsese’s asthma kept him indoors for much of his childhood, forcing him to view the world from afar—and although the boy at the window this time isn’t the young Henry Hill, staring longingly at the gangsters across the street, but Hugo Cabret, gazing out over a CGI wonderland of Paris in the 1930s, the change of scenery shouldn’t blind us to the fact that this is Scorsese’s most personal film since Goodfellas. It’s a curious movie: far from his best work, yet ultimately entrancing, for reasons that have less to do with its considerable technical merits than with its romantic notion of what the arts, especially cinema, can mean to one person over the course of his or her life. In particular, it’s about what movies mean to Scorsese, and to convey this, he employs no fewer than three fictional surrogates, often where you least expect them.
At first glance, of course, it’s the technological aspects that command our attention. Scorsese is clearly tickled to be working with a large budget and in three dimensions, and Hugo is one of the best arguments I’ve yet seen for 3D as something more than just a fad. Unlike Avatar, which largely unfolds in an airless, if gorgeous, universe of special effects, Hugo takes particular pleasure in small touches of reality: steam, ash, the particles of dust on a real set. Its 3D is less a gimmick than a way of immersing us in a new world, aided immeasurably by Robert Richardson’s cinematography and Dante Ferretti’s production design, and the result is captivating from the very first frame. And while the same isn’t quite true of the plot—Scorsese seems rather indifferent to some of the beats of the children’s book he’s adapting, and the first half hour is especially lumpy—the story eventually becomes absorbing as well, thanks largely to the invisible figure at its heart: the English filmmaker Michael Powell.
The action of Hugo, and this is a minor spoiler, revolves in great part around the director Georges Méliès, whom Hugo discovers, now neglected and depressed, operating a toy shop at Montparnasse Station. Later, Hugo introduces him to a film scholar, an enthusiastic student of Méliès’s work, who goes on to unearth and restore many of his lost films. And while the plot closely parallels that of Brian Selznick’s original novel, it isn’t hard to see what drew Scorsese to the story: it’s basically a fabulous recasting of his own relationship with Michael Powell, whose films he loved as a child, and whose life he finally entered after establishing himself as a director and student of film in his own right. Like Méliès, Powell, once hugely popular, was overlooked for decades, during what should have been the most productive years of his career—in Powell’s case, after the disastrous release of the controversial Peeping Tom. And Scorsese played a major role in his rediscovery, leading the way in recent years in the restoration of his major works, beginning with The Red Shoes. (It’s even possible to see a hint of Thelma Schoonmaker, Scorsese’s editor and Powell’s wife, in Méliès’s wife Jeanne d’Alcy, played here by Helen McCrory.)
As a result, Powell’s ghost hovers like a protective spirit above much of Hugo. (Among the many small references to the work of the Archers: in the film’s closing scene, Méliès, played by Ben Kingsley, wears the same white tie and tails as Lermontov at the end of The Red Shoes.) And Scorsese himself appears in three guises: as the young Hugo; as the movie scholar and Méliès fan René Tabard (nicely played by Michael Stuhlbarg); and, most interestingly, as Méliès himself. Scorsese is obviously far more interested in Méliès than in much of the surrounding story, and it’s hard not to read the final scene, as Méliès receives the Legion of Honor, in light of Scorsese’s string of late career awards. And while Scorsese has been far from neglected, he knows how it feels: he once feared that Raging Bull would be his last movie, and spent much of the 1980s in a relative wilderness. Like all artists, Scorsese has had moments, at one point or another, when he feared that his work had been in vain. If a film like Hugo is any indication, his legacy is secure.
Turn off, tune out, drop in
For most of the past decade, I’ve been wearing white headphones. I got my first iPod nine years ago, when I was a senior in college, and at the time, I thought it was the most beautiful thing I’d ever seen. (Today, it looks like a big brick of lucite, but that’s another story.) I’ve updated my music player twice since then, and there’s rarely been a day when I didn’t put on those white earbuds. I drive only very rarely and walk or take public transit almost everywhere around Chicago, as I did when I was living in Boston and New York, so the iPod and its successors have always been a big part of my life. But now, reluctantly, I’m starting to let it go. And I’m writing this post partly as a way of reminding myself why.
I’d been thinking about taking the headphones off for a long time, but it was only last week, when I saw the documentary Public Speaking, that I decided to do something about it. Public Speaking is Martin Scorsese’s loving portrait of occasional writer and professional raconteur Fran Lebowitz. (On her legendary writer’s block: “It’s more of a writer’s blockade.”) Lebowitz doesn’t own a cell phone, a BlackBerry, or a computer, and seems vaguely puzzled by those who do. In the film, while miming someone texting furiously, she notes that when you’re down there, on your mobile device, you’re nowhere else, including wherever you happen to be. And much of Lebowitz’s own brilliance and charm comes from her intense engagement with her surroundings.
None of this is exactly groundbreaking, of course, but for whatever reason, it crystallized something in my own mind. For a while, I’ve been obsessed by the fact that every moment in a writer’s life is, potentially, a time that can be used for creation. A writer can’t be working all the time, of course—that way lies madness—but much of the art of surviving as an artist is knowing how to exploit what stray moments of creativity we’re given. Many of my best ideas have popped spontaneously into my head, as I’ve said in the past, while shaving, or while doing otherwise mindless chores like washing the dishes. I’ve quoted Woody Allen on this point before, but because it’s some of the most useful writing advice I know, I’ll quote him again, from Eric Lax’s great Conversations with Woody Allen:
I never like to let any time go unused. When I walk somewhere in the morning, I still plan what I’m going to think about, which problem I’m going to tackle. I may say, This morning I’m going to think of titles. When I get in the shower in the morning, I try to use that time. So much of my time is spent thinking because that’s the only way to attack these writing problems.
And walking alone, as Colin Fletcher and others have realized, is perhaps the best time for thinking. I’ve rarely had to deal with a plot problem that couldn’t be solved, all but unconsciously, by a short walk to the grocery store. And yet here’s the thing: when my iPod is playing, it doesn’t work. Music, I’m increasingly convinced, anesthetizes the right side of the brain. Sometimes it can help your mind drift and relax, which can lead to insight as well, but for the most part, it’s an excuse to avoid leaving yourself open to ideas—which is unacceptable when you’re counting on those ideas to survive. So from now on, whenever I go out, I’m leaving the headphones at home. Not all the time, perhaps: there are times when I just need to hear, I don’t know, “Blue Monday.” But for the most part, for the first time in years, I’m going to try and listen to my thoughts.
Von Trier’s obstructions
As you see [filmmaking] makes me into a clown. And that happens to everyone—just look at Orson Welles or look at even people like Truffaut. They have become clowns.
—Werner Herzog, in Werner Herzog Eats His Shoe
The news that Lars von Trier has been expelled from Cannes for his decidedly ill-advised remarks is depressing in more ways than one, although I can’t fault the festival for its decision. I don’t think that von Trier is really a Nazi sympathizer; I think he’s a provocateur who picked the wrong time and place to make a string of increasingly terrible jokes. But the fact that he ended up in such a situation in the first place raises questions of its own about the limitations of the provocateur’s life. Von Trier, who used to be something of a hero of mine, has always been testing his audiences, but there’s a difference between a director who pushes the bounds of taste out of some inner compulsion, and one who is simply going through the motions. Von Trier, it seems, has gradually become the latter.
There was a time when I thought that von Trier was one of the major directors of the decade, along with Wong Kar-Wai, and I don’t think I was entirely wrong. Dancer in the Dark is still the last great movie musical, a remarkable instance of a star and director putting their soul and sanity on the line for the sake of a film, and a rebuke to directors who subject their audiences to an emotional ordeal without demanding the same of themselves. Just as impressive was The Five Obstructions, von Trier’s oddly lovable experiment with the director Jørgen Leth, which remains the best cinematic essay available on the power of constraints. (Von Trier had recently announced a remake with Martin Scorsese as the test subject, a prospect that made me almost giddy with joy. I’d be curious to see if this is still happening, in light of von Trier’s recent troubles.)
But the cracks soon began to show. I greatly admired Dogville, which was a major work of art by any definition, but it lacked the crucial sense that von Trier was staking his own soul on the outcome: he was outside the movie, indifferent, paring his nails, and everything was as neat as mathematics. At the time, I thought it might be the only movie of its year that I would still remember a decade later, but now I can barely recall anything about it, and don’t have much inclination to watch it again. I tried very hard to get through Manderlay and gave up halfway through—Bryce Dallas Howard’s performance, through no fault of her own, might be the most annoying I’ve ever seen. And I still haven’t watched Antichrist, less out of indifference than because my wife has no interest in seeing it. (One of these days, I’ll rent it while she’s out of town, which will be a fun weekend.)
And now we have the Cannes imbroglio, which only serves as a reminder that every director—indeed, every artist—ultimately becomes a caricature of himself, in ways that only reveal what was already there. That was true of Orson Welles, who in his old age fully became the gracious ham and confidence trickster he had always been, except more so, in ways that enhance our understanding of him as a young man. The same will be true, I’m afraid, of von Trier. The spectacle that he presented is even less flattering when we try to imagine the same words being said by Herzog, or even someone like Michael Haneke—men who are provocateurs, yes, but only as an expression of their deepest feelings about the world, something that is no longer true of von Trier, if it ever was. Von Trier, clearly, was just joking. But he revealed much more about himself than if he were trying to be serious.
Hayao Miyazaki and the future of animation
Yesterday was the seventieth birthday of Japanese filmmaker Hayao Miyazaki, the director of Spirited Away, which makes this as appropriate a time as any to ask whether Miyazaki might be, in fact, the greatest living director in any medium. He certainly presents a strong case. My own short list, based solely on ongoing quality of output rather than the strength of past successes, includes Martin Scorsese, Wong Kar-Wai, and Errol Morris, but after some disappointing recent work from those three, Miyazaki remains the only one who seems incapable of delivering anything less than a masterpiece. And he’s also going to be the hardest to replace.
Why is that? Trying to pin down what makes Miyazaki so special is hard for the same reason that it’s challenging to analyze any great work of children’s fiction: it takes the fun out of it. I’m superstitiously opposed to trying to figure out how the Alice books work, for example, in a way that I’m not for Joyce or Nabokov. Similarly, the prospect of taking apart a Miyazaki movie makes me worry that I’ll come off as a spoilsport—or, worse, that the magic will somehow disappear. That’s one reason why I ration out my viewings of Ponyo, one of the most magical movies ever made, so carefully. And it’s why I’m going to tread cautiously here. But it’s still possible to hint at some of the qualities that set Miyazaki apart from even the greatest animators.
The difference, and I apologize in advance for my evasiveness, comes down to a quality of spirit. Miyazaki is as technically skilled as any animator in history, of course, but his craft would mean little without his compassion, and what I might also call his eccentricity. Miyazaki has a highly personal attachment to the Japanese countryside—his depiction of the satoyama is much of what makes My Neighbor Totoro so charming—as well as to the inner lives of small children, especially girls. He knows how children think, look, and behave, which shapes both his characters and their surrounding movies. His films can seem as capricious and odd as the stories that very young children tell to themselves, so that Spirited Away feels both beguilingly strange and like a story that you’ve always known and only recently rediscovered.
Which is why Miyazaki is greater than Pixar. Don’t get me wrong: Pixar has had an amazing run, but it’s a singularly corporate excellence. The craft, humor, and love of storytelling that we see in the best Pixar movies feel learned, rather than intuitive; it’s the work of a Silicon Valley company teaching itself to be compassionate. Even the interest in children, which is very real, seems like it has been deliberately cultivated. Pixar, I suspect, is run by men who love animation for its own sake, and who care about children only incidentally, which was also true of Walt Disney himself. (If they could make animated movies solely for adults, I think they would, as the career trajectory of Brad Bird seems to indicate. If nothing else, it would make it easier for them to win an Oscar for Best Picture.)
By contrast, the best Miyazaki movies, like the Alice books, are made for children without a hint of condescension, or any sense that children are anything but the best audience in the world. And as traditional animation is replaced by monsters of CGI that can cost $200 million or more, I’m afraid that this quality will grow increasingly rare. We’ve already seen a loss of personality that can’t be recovered: it’s impossible to be entirely original, not to mention eccentric, with so much money on the line. The result, at best, is a technically marvelous movie that seems to have been crafted by committee, even if it’s a committee of geniuses. Toy Story 3 is a masterpiece, and not good enough.
Miyazaki is seventy now, and judging from Ponyo, he’s still at the top of his game. I hope he keeps making movies for a long time to come. Because it’s unclear if the world of animation, as it currently exists, will ever produce anyone quite like him again.
My fifty essential movies
Yesterday I posted a list of my fifty essential books—that is, the fifty books that I would keep if I were deprived of all others. When I tried to do the same for movies, I found that the task was slightly easier, if only because I had fewer titles to choose from. (In both cases, I’ve tried to limit myself to books and movies that I actually own.) The result, as before, is a portrait of myself as expressed in other people’s works of art—which, in the end, may be the most accurate kind of self-portrait there is.
As usual, there are a few caveats. I’ve tried to be as honest as possible. This means omitting some of the very best movies of all time—The Rules of the Game and Tokyo Story, for instance—that I admire enormously but encountered too late for them to burrow into my subconscious. There’s an obvious preference for entertainment over art, as is generally the case in a home video library. And many of the movies named below might be ranked differently, or left out altogether, on another day (or hour). As of today, January 5, 2011, here’s how the canon looks to me:
1. The Red Shoes (d. Michael Powell and Emeric Pressburger)
2. Chungking Express (d. Wong Kar-Wai)
3. Blue Velvet (d. David Lynch)
4. Casablanca (d. Michael Curtiz)
5. The Third Man (d. Carol Reed)
6. Eyes Wide Shut (d. Stanley Kubrick)
7. L.A. Confidential (d. Curtis Hanson)
8. Seven Samurai (d. Akira Kurosawa)
9. Star Trek II: The Wrath of Khan (d. Nicholas Meyer)
10. Citizen Kane (d. Orson Welles)
11. Vertigo (d. Alfred Hitchcock)
12. Indiana Jones and the Last Crusade (d. Steven Spielberg)
13. Lawrence of Arabia (d. David Lean)
14. The Shining (d. Stanley Kubrick)
15. A Canterbury Tale (d. Michael Powell and Emeric Pressburger)
16. The Empire Strikes Back (d. Irvin Kershner)
17. The Last Temptation of Christ (d. Martin Scorsese)
18. Inception (d. Christopher Nolan)
19. The Silence of the Lambs (d. Jonathan Demme)
20. Spellbound (d. Jeffrey Blitz)
21. Mary Poppins (d. Robert Stevenson)
22. 2001: A Space Odyssey (d. Stanley Kubrick)
23. The Godfather (d. Francis Ford Coppola)
24. Spirited Away (d. Hayao Miyazaki)
25. Casino Royale (d. Martin Campbell)
26. Fast, Cheap & Out of Control (d. Errol Morris)
27. JFK (d. Oliver Stone)
28. Barry Lyndon (d. Stanley Kubrick)
29. Miller’s Crossing (d. Joel and Ethan Coen)
30. Sleeping Beauty (d. Clyde Geronimi)
31. Psycho (d. Alfred Hitchcock)
32. Kill Bill Vol. 1 and 2 (d. Quentin Tarantino)
33. The Untouchables (d. Brian De Palma)
34. Raiders of the Lost Ark (d. Steven Spielberg)
35. The Dark Knight (d. Christopher Nolan)
36. Last Tango in Paris (d. Bernardo Bertolucci)
37. Children of Men (d. Alfonso Cuarón)
38. The Departed (d. Martin Scorsese)
39. The Godfather Part II (d. Francis Ford Coppola)
40. Crumb (d. Terry Zwigoff)
41. The Searchers (d. John Ford)
42. The Usual Suspects (d. Bryan Singer)
43. The Long Goodbye (d. Robert Altman)
44. Zodiac (d. David Fincher)
45. The Life and Death of Colonel Blimp (d. Michael Powell and Emeric Pressburger)
46. Boogie Nights (d. Paul Thomas Anderson)
47. Taxi Driver (d. Martin Scorsese)
48. The Limey (d. Steven Soderbergh)
49. Dancer in the Dark (d. Lars von Trier)
50. Pink Floyd The Wall (d. Alan Parker)
Random observations: I had to look up the names of two of the directors (for Spellbound and Sleeping Beauty). Up until a few minutes ago, the last place on this list was occupied by The Life Aquatic With Steve Zissou, which I had to drop after realizing that I’d left out Last Tango in Paris. I allowed myself more than one movie per director, with the largest number of slots occupied by Kubrick (four), Powell and Pressburger (three), and Scorsese (three). And I’m slightly surprised to find that my three favorite movies of the last decade are evidently Spellbound, Spirited Away, and Casino Royale.
Sharp observers might be able to guess which film occupies the top spot in the list of my favorite movies of the past year, which I’m hoping to post later this week. And in any case, if you have a Netflix account that you aren’t using, well, hopefully this will give you a few ideas.
“A terrible possibility began to gather in her mind…”
Note: This post is the twentieth installment in my author’s commentary for Eternal Empire, covering Chapter 21. You can read the previous installments here.
Casino Royale is my favorite Bond film, and one of the most entertaining movies I’ve ever seen: it’s the one installment in the franchise that I never tire of watching, and it’s fun just to think about. But there’s a single moment toward the end that always struck me as a head-scratcher. After Bond wins the big poker tournament, defeating the villainous Le Chiffre, he and Vesper celebrate with a late dinner and cocktails in the restaurant at the titular casino. Vesper gets a text message, checks it, and says that Mathis—their local contact, played by the indispensable Giancarlo Giannini—needs to see her. She leaves. Bond sits there for a minute alone, then mutters to himself, reflectively: “Mathis…” A second later, he’s on his feet, and he dashes outside just in time to see Vesper being herded into a car by a couple of thugs. He sets off in pursuit, and we’re quickly plunged into a crazy chase, a surprise reversal, a crash, and the most memorable torture scene in the entire series. It isn’t for another twenty breathless minutes, in fact, that Bond, recovering afterward in the hospital, explains how he realized that Mathis was a traitor: he was the only one who could have told Le Chiffre that Bond had discovered his poker tell. Mathis is dispatched with a stun gun to the solar plexus, and that’s that.
But it all raises a few questions, to the point where it actively distracted me on my first couple of viewings. We’ll leave aside the fact that Mathis isn’t actually the mole: as Bond realizes too late, Vesper was the one who tipped off Le Chiffre. Mathis is ultimately exonerated, although this point is revealed so casually, in a line of throwaway dialogue, that most viewers could be forgiven for missing it. More to the point, we’re never given any indication of Bond’s thought process before he jumps to the conclusion that Mathis betrayed them. Usually, this kind of “Oh, crap” moment is triggered by a clue, or a bunch of them, that the audience and the character in question put together at the same time, as we see most memorably in The Usual Suspects. Here, the reasoning is left deliberately opaque, and the gap between Bond’s sudden brainstorm and its explanation is so long—and so crowded with action—that any connective thread is lost. This isn’t a fatal flaw, and it doesn’t impair our enjoyment of what follows. But it’s striking that the blue-chip screenwriting team of Neal Purvis, Robert Wade, and Paul Haggis evidently decided that all we needed was the dawning realization in Bond’s eyes, without giving us any indication of what caused it. (It wasn’t a choice made in the editing room, either: the original script follows exactly the same sequence of beats.)
This interests me because it reflects the kind of shorthand that such stories often use when covering familiar territory. We’ve all seen movies that move from A to B to C, where A is a clue, B is the hero’s eureka moment, and C is the explanation. Casino Royale omits A altogether and relegates C to the status of a footnote, so the middle factor—the light that goes off in Bond’s head—is all we have left. It all but advertises the fact that A and C are basically irrelevant, or could be replaced by any number of arbitrary components: all that matters is the effect they have on Bond. Which only works if you assume that the audience is sophisticated enough to recognize the trope and fill in the blanks on its own. (It reminds me a little of an observation that Pauline Kael made about Raging Bull, in which Scorsese uses a single vivid scene to represent what would have been an entire montage in another movie: “Probably for him it stands for the series.”) It’s revealing, too, that it appears here, in a movie that is otherwise more than happy to spin long chains of plot points. An “Oh, crap” moment depends on the film being ever so slightly ahead of the audience, and Casino Royale neatly circumvents the challenge by giving viewers no information whatsoever that might allow them to anticipate the next move.
And while I’m probably reading too much into it, or making conscious what really would have been an intuitive choice by the writers, it also feels like an acknowledgment of how artificial such moments of insight can be. It all depends on the hero seeing a pattern that had been there all along, and to keep the solution from being too obvious, we often see our protagonist making an enormous inductive leap based on the flimsiest possible evidence. There’s a moment much like this in Chapter 21 of Eternal Empire. Wolfe has just been told that Ilya, who has been held without talking for months at Belmarsh Prison, has suddenly agreed to cooperate with the authorities, and that he’s due for a hearing that day at the Central Criminal Court in London. Meanwhile, Vasylenko, his former mentor, is slated to attend a separate appeal that morning. The coincidence of the two court appearances being scheduled at the same time, along with the fact that Ilya and Vasylenko will be transported on the same prison van, allows Wolfe to conclude that they’re planning to escape. That single germ of suspicion is enough to send her racing out of the office, sending her chair rolling backward—the procedural equivalent of the cloud of dust that the Road Runner leaves in his wake. Is this moment plausible? No more or less than Bond’s. Which is another way of saying that it’s exactly as plausible as it needs to be…
Written by nevalalee
June 4, 2015 at 9:15 am
Posted in Books, Writing
Tagged with Casino Royale, Eternal Empire commentary, Martin Scorsese, Neal Purvis, Paul Haggis, Pauline Kael, Raging Bull, Robert Wade