Posts Tagged ‘David Thomson’
“There is a major but very difficult realization that needs to be reached about [Cary] Grant—difficult, that is, for many people who like to think they take the art of film seriously,” David Thomson writes in The New Biographical Dictionary of Film, before going on to make a persuasive argument that Grant “was the best and most important actor in the history of the cinema.” There’s a similarly difficult realization that needs to be reached about Tom Cruise, which is that for better or worse, over the last quarter of a century, he’s been the best movie star we have, and one of the best we’ve ever had. Not the best actor, certainly, or even the one, like Clooney, who most embodies our ideas of what a star should be, but simply the one who gave us the most good reasons to go to the movies for more than twenty years. I love film deeply, and I’ve thought about it more than any sane person probably should, and I have no trouble confessing that for most of my adult life, Cruise and his movies have given me more pleasure than the work of any other actor or director.
And yet it wasn’t until I realized that I loved his movies that I really started to take notice of him in his own right. We’re usually drawn to stars because of the qualities they embody, but in Cruise’s case, I became a fan—and remain a huge one—because I belatedly noticed that whenever I bought a ticket to a movie with his name above the title, I generally had a hell of a good time. That hasn’t always been true in recent years, and while some might say that his movies have taken a hit because Cruise’s own public image has been tarnished, I’d argue that the causal arrow runs the other way. Cruise has always functioned less as a traditional movie star than as a sort of seal of quality: a guarantee that we’ll be treated to a film that provides everything that the money, talent, and resources of a major studio can deliver. As a result, whenever the movies in which he appears become less interesting, Cruise himself grows less attractive. Left to his own devices, he can’t rescue Lions for Lambs or Knight and Day, but if he gives us a big, impersonal toy like Mission: Impossible—Ghost Protocol, all is forgiven.
It’s worth emphasizing how strange this is. We tend to think of movie stars as supernatural beings who can elevate mediocre material by their mere presence, but Cruise is more of a handsome, professional void, a running man around whom good to great movies have assembled themselves with remarkable consistency. In fact, he’s closer to a great producer and packager of talent who happens to occupy the body of a star who can also get movies made. Hollywood consists of many ascending circles of power, in which each level has more of it than the one below, but when judged by its only real measure—the ability to give a film a green light—true power has traditionally resided with a handful of major stars. What sets Cruise apart from the rest is that he’s used his stardom to work with many of the great filmmakers of his time (Kubrick, Scorsese, Spielberg, Coppola, Mann, Stone, De Palma, Anderson) and a host of inspired journeymen, and he’s been largely responsible for the ascent of such talents as J.J. Abrams and Brad Bird. If this sort of thing were easy, we’d see it more often. And the fact that he did it for more than two decades speaks volumes about his intelligence, shrewdness, and ambition.
Recently, he’s faltered a bit, but his choices, good or bad, are still fascinating, especially as his aura continues to enrich his material with memories of his earlier roles, a process that goes at least as far back as Eyes Wide Shut. I haven’t seen Oblivion, but over the weekend, I caught Jack Reacher, a nifty but profoundly odd and implausible genre movie that runs off Cruise like a battery. (It’s actually much more of a star vehicle than Ghost Protocol, in which Cruise himself tended to get lost among all the wonders on display.) While most leading men strive to make it all seem easy, much of the appeal of watching Cruise lies in how hard this boy wonder of fifty seems to push himself in every frame, as if he still has everything to prove. Other stars may embody wit, cool, elegance, or masculinity, but Cruise is the emblem of the man who wills himself into existence, both on and off the screen, and sustains the world around him through sheer focus and energy. Real or not, it’s a seductive vision, or illusion, for those of us blessed with less certainty. As Taffy Brodesser-Akner says this week in The New York Times Magazine: “Who has ever worked so hard for our pleasure?”
I’ve always been fascinated by Kevin Spacey. This is an actor with less genetic charisma than any other leading man I can name—he’s neither handsome enough for conventional star parts nor physically distinctive enough to be a striking supporting player—but his intelligence and craft have resulted in some of the most indelible performances of the latter half of the nineties, and beyond. I don’t think any other living actor can claim a run as good as Seven, The Usual Suspects, L.A. Confidential, and American Beauty, not to mention Beyond the Sea, which I’m convinced is one of the great bad movies of all time, deserving, as David Thomson has noted, of an award given annually in its name. There’s a preening, endearing vanity behind Spacey’s nondescript looks that emerges whenever he’s asked to sing, which he does very well, or do one of his uncanny impersonations. He’s a showoff trapped in an everyman’s body, and although I don’t think he’s ever given a truly uncalculated or uninhibited performance, he’s also provided me with more pleasure as a moviegoer over the years than most actors with more conventional endowments.
And he’s the perfect lead for House of Cards, the weirdly compelling political drama that premiered over the weekend on Netflix. Spacey always seems to be in a kind of conspiratorial huddle with the audience, even if he’s only conning us in the end, and as the scheming majority whip Frank Underwood, he isn’t above giving the lens itself a wink, and occasionally an extended monologue to comment on the action. If the show were more realistically plotted, this would be distracting, but a realistic look at power politics isn’t quite what this series has in mind. Underwood is a master manipulator, but everyone around him is so gullible, including the supposed Washington operators with whom he interacts, that it’s as if he’s read the script notes for the next thirteen episodes. The fetching Kate Mara does what she can in the role of an ambitious metro reporter, but her rapid rise, once Underwood starts feeding her information, is more Brenda Starr than Bob Woodward. It should play worse than it does, but if there’s anyone who can carry this sort of thing, it’s Spacey, who clearly relishes the chance to have the camera to himself, and knows how to sell arch lines like “I love her like sharks love blood.”
And I kind of love it, too. House of Cards is remarkably unsubtle in its writing, but benefits from considerable subtlety in its art direction, photography, and sound design. Every frame glows with the burnished yet chilly digital look that David Fincher, who directed the first two episodes, has long since perfected, and the compositions are both clinical and playful: instead of the long tracking shots of The West Wing, we’re treated to a vision of power as one glossy tableau after another. The sets and locations are lovingly detailed—even if my wife observed that no real newsroom kitchen has that much free bread—and we’re given plenty of time to drink them in, with a pace that some viewers have criticized as being too slow, but which suits the balance and polish of the images on the screen. The result is a television series that looks and feels more like a movie than any I’ve ever seen, and its elegance goes a long way toward addressing its narrative shortcomings. (It’s also presented in an unusual aspect ratio, slightly wider than the standard 16:9 size, which I suspect represents a compromise between the anamorphic format that Fincher prefers and the demands of a show destined to be viewed primarily on widescreen televisions.)
And for all the hype over the fact that the series is being released in one big chunk, rather than parceled out in weekly installments, I have a hunch that its real influence will be in its look and tone, rather than its delivery system. I’ve only seen the first two episodes, and although I intend to watch the rest soon, it doesn’t strike me as the kind of densely plotted show that demands to be devoured in a few epic viewing sessions: all the conventions of serialized storytelling are here, but mostly for the sake of appearances. I’ve written before about the challenges of constructing shapely long-form narratives in television, in which a show can be canceled after two episodes or run for years, and although the Netflix model presents one possible solution to the problem, my initial impression is that it leads to a sort of complacency: subplots are introduced without any particular urgency, with the implication of a payoff somewhere down the line, where a series produced under greater ratings pressure might feel more of a need to justify itself moment to moment. House of Cards is secure, even occasionally a little smug, in the fact of its own survival. I’m enjoying it tremendously, but I can’t help but feel that it might have been a stronger show, if less lovingly crafted, if, to borrow the title from another Kevin Spacey movie, it had been forced to swim with the sharks.
Any consideration of Ang Lee’s Life of Pi needs to begin with the point that, objectively speaking, this may be the most visually astonishing movie ever made. Yet it’s likely that many, if not most, viewers will come away with a limited sense of the film’s accomplishments. This is a movie that, for a solid hour or more, consists of a single sustained visual effect, in which every shot has been created for us out of almost nothing, but at first glance, it doesn’t feel that way. Indeed, it sometimes seems more like a small, intimate chamber piece, a two-hander that just happens to be about a boy and his tiger. Except for a limited number of shots, however, that tiger isn’t real, a point that seems to have eluded more than a few reviewers. It is, in fact, the most lifelike special effect I’ve ever seen in a movie, and the result is both totally miraculous and strangely invisible: this isn’t a tiger constantly showing off how tigerish it can be, but a living, breathing animal that we simply accept as part of the fabric of the story. (I suspect that Borges, who was obsessed by tigers of his imagination, would have found this movie both fascinating and problematic.)
As a result, I have a hunch that Life of Pi may lose the Oscar for Best Visual Effects to a showier but less accomplished movie, like Prometheus, much as 2001 didn’t win the award for makeup in the year of Planet of the Apes, allegedly because, as Arthur C. Clarke has claimed, the voters failed to realize that the monkeys weren’t real. And I can’t entirely blame them. As I watched Life of Pi, I had to constantly remind myself that I was witnessing a bravura display of visual effects, even as Lee and his collaborators seemed determined to conceal their wizardry as much as possible. I’ve noted more than once that in movies like Jurassic Park or Terminator 2, the special effects still hold up magnificently, because those few precious minutes of footage were the result of years of thought and care. These days, computer effects have become so routine that even the most spectacular examples of digital mayhem, as in The Avengers, start to look like cartoons, so it’s heartening to find a movie, and a director, willing to lavish that kind of old-fashioned attention on more than an hour’s worth of visual magic.
But when trick effects become so seamless that artifice can no longer be distinguished from reality, it’s also something of a loss. Artifice for its own sake, when pursued with the same kind of love as perfect realism, can be a joy, as in Joe Wright’s recent adaptation of Anna Karenina. The movie has its problems, but I think that if I’d seen it twenty years ago, it would have instantly become one of my favorites. I went through a phase in my early teens when I was fascinated by movies that gloried in their artifice, either to pay homage to the films of an earlier era, like Martin Scorsese’s New York, New York, or to push forward into a stylized world of their own, like Coppola’s One From the Heart. With its stage sets and images of model trains moving through tabletop snowscapes, Anna Karenina is a movie that embraces its artificiality, like Coppola’s Dracula or the late movies of Powell and Pressburger. In fact, it’s arguably the most ambitious recent attempt to make what Michael Powell has called “the composed film,” in which every element has been planned by the director in advance.
Of course, this can lead to its own set of pitfalls, and if Anna Karenina has one major flaw, it’s that its actors are rarely allowed to find lives for their characters apart from the production design. (Indeed, Keira Knightley’s performance depends entirely on costuming and makeup to dramatize Anna’s descent: as David Thomson has observed elsewhere, Knightley “is still more credible as a faintly animated photographer’s model than as an actress.”) Finding the right balance between artifice and realism, as Powell and Pressburger did in their best films—along with Welles, Kubrick, and Hitchcock—is the province of our greatest directors, and such moments are the ones, as a moviegoer, that I treasure above all others. Hence my love for the sequence in Life of Pi in which the screen briefly elongates to Cinemascope proportions, allowing a swarm of flying fish not just to come right out at the audience, but to spill over the edges of the frame. It goes by in the blink of an eye, and Lee notes that most viewers don’t even notice it, but it’s a thrilling example of what a great director does best: giving us something that not only reproduces reality, but advances on it—at least if we’re willing to watch carefully.
Comedy, as we all know intuitively, is largely built on threes. It often shows the same thing three times with slight variations, followed by a kicker at the end, which is why so many jokes are built around three different nationalities, religions, or professions, like those about the mathematician, the physicist, and the engineer. There’s the famous comedy triple, in which two items set up a pattern, followed by a third that serves as a punchline. (There are countless examples, but I’ve always liked this one from The Simpsons: “Well, little girl, I’ve had a lot of jobs in my day: whale hunter, seal clubber, president of the Fox network…”) A similar rule applies to magic, which depends on the basic pattern of setup, development, and surprise climax. In Magic and Showmanship, Henning Nelms describes a trick in which a color-changing fan is used to magically dye handkerchiefs different colors, and then says:
Commercial color-changing fans can display four different hues. But this is bad showmanship. Dyeing one is trivial. Dyeing two arouses interest. Dyeing three provides your climax. There is no reason to add an anticlimax simply because you are prepared to do so.
So why is the number three so powerful? For the same reason that one point is just a point, two points is a line, and three points, suddenly, is structure. Our brains are wired to look for patterns, and it takes three items to confirm or deny that a pattern exists—and it can be very satisfying either to be given the payoff we’ve been expecting or to be shown how cleverly we’ve been misled. Writing about The Godfather, David Thomson speaks of “the sinister charm of action foreseen, spelled out, and finally delivered,” as when Michael kills Sollozzo and McCloskey. “It is a killing in which we are his accomplices,” Thomson says, and three is the minimum number of story points required for the reader to actively conspire in the narrative. This is why most of our stories, from jokes to fairy tales to novels, still consist of a beginning, a middle, and an end. (Or, as Philip Larkin puts it, “a beginning, a muddle, and an end.”)
This also applies to a story’s constituent parts. Narratives tend to have a sort of fractal structure: an individual chapter or scene will often have the same three-act structure as the story as a whole. This often applies to the movie scenes we tend to remember most vividly, which are structured as miniature plays—think of Holly’s first meeting with Harry in The Third Man. My own novels and stories are usually structured in three acts, to the point where I use numbered sections even in short novelettes, and that applies to individual chapters as well. When I’m outlining a chapter, I’m generally thinking in threes, even before I know what will happen: I’ve learned from experience that three story beats is a strong foundation on which to build a chapter, for the same reason that a tripod needs three legs to stand, so I always make sure that the chapter falls into three roughly similar parts, at least in the first draft.
And yet here’s the funny thing: when it comes to the final draft of a chapter, the first and third parts often don’t need to be there. I’ve spoken before about the importance of writing the middle—that is, of cutting the opening and closing sections of a chapter and jumping from the middle of one scene to the next—and I’ve often noticed that rough drafts spend too much time moving toward and away from the real center of interest. In short, the rule of three is invaluable for structuring a first draft, but in the final version, much of it can be thrown away. In my experience, it’s best to reserve the full three-act treatment for big, climactic scenes, while for transitional chapters or sequences, usually only the middle is necessary. The reader can fill in the first and last parts on his or her own—but only if they’ve been written and cut in the first place. They’re still there, but they’re invisible. And that’s how you use the rule of three.
Earlier this month, in his rather unenthusiastic review of the new musical Nice Work If You Can Get It, Hilton Als wrote of star Matthew Broderick, who, for all his other talents, is manifestly not a dancer: “His dancing should be a physical equivalent of Rex Harrison’s speaking his songs in [My Fair Lady]: self-assured and brilliant in its use of the performer’s limitations.” It’s a nice comparison, and indeed, Rex Harrison is one of the most triumphant examples in the history of entertainment of a performer turning his limitations into something uniquely his own. (If I could go back in time to see only one musical, it would be the original Broadway production of My Fair Lady, starring Harrison and the young Julie Andrews.) And while most of us rightly strive to overcome our limitations, it can also be useful to find ways of turning them into advantages, or at least to find roles for which we’re naturally suited, shortcomings and all.
Years of writing have taught me that I have at least two major limitations as a novelist (although my readers can probably think of more). The first is that my style of writing is essentially serious. I don’t think it’s solemn, necessarily, and I’d like to think that my fiction shows some wit in its construction and execution. But I’m not a naturally funny writer, and I’m in awe of authors like P.G. Wodehouse, Douglas Adams, or even Joss Whedon, whose sense of humor is inseparable from their way of regarding the world. The Icon Thief contains maybe three jokes, and I’m inordinately proud of all of them, just because they don’t come naturally. This isn’t to say that I’m a humorless or dour person, but that being funny in print is really hard, and it’s a skill set that I don’t seem to have, at least not in fiction. And while I’d like to develop this quality, if only to increase my range of available subjects and moods, I expect that it’s always going to be pretty limited.
My other big limitation is that I only seem capable of writing stories in which something is always happening. The Icon Thief and its sequels are stuffed with plot and incident, largely because I’m not sure what I’d do if the action slowed down. In this, I’m probably influenced by the movies I love. In his essay on Yasujiro Ozu, David Thomson writes:
[S]o many American films are pledged to the energy that “breaks out.” Our stories promote the hope of escape, of beginning again, of beneficial disruptions. One can see that energy—hopeful, and often damaging, but always romantic—in films as diverse as The Searchers, Citizen Kane, Mr. Smith Goes to Washington, Run of the Arrow, Rebel Without a Cause, Vertigo, Bonnie and Clyde, Greed, and The Fountainhead. No matter how such stories end, explosive energy is endorsed…Our films are spirals of wish fulfillment, pleas for envy, the hustle to get on with the pursuit of happiness.
As a result, whenever I write a page in which nothing happens, I get nervous. This isn’t the worst problem for a mainstream novelist to have, but like my essential seriousness, it limits my ability to tell certain kinds of stories. (This may be why I’m so impressed by the work of, say, Nicholson Baker, who writes brilliantly funny novels in which almost nothing takes place.)
So what do I do? I do what Rex Harrison did: I look for material where my limitations can be mistaken for strengths. In short, I write suspense fiction, which tends to be forgiving of essential seriousness—it’s hard to find a funny line in any of Thomas Harris or Frederick Forsyth, for example—and of restless, compulsive action, all executed within a fairly narrow range of tone. When I write in other genres, like science fiction, I basically approach the story as if I were still writing suspense, which, luckily, happens to be a fairly adaptable mode. And while I’ll always continue to push myself as a writer, and hope to eventually expand my tonal and emotional range, I’m glad that I’ve found at least one place where my limitations feel at home, and where they can occasionally flower forth into full song. For everything else, I’m content just to speak to the music.
When you’re watching a film [as an editor], somebody has to deal with the film in complete ignorance of how it actually got made, because that is the way it is going to be seen. I try to keep myself as removed as possible, because I’m the ombudsman of the audience, looking out for their best interests. If you are the director and it took you an incredible amount of time and anguish to get a particular shot, you might invest that shot with more importance than it really has. It has to carry the burden of the effort that it took to get it. On the other hand, if I as the editor am not aware of that burden, I might look at the shot and think there’s nothing special about it. And occasionally I might be right. On the other hand, a shot that was grabbed just before lunch when everyone was having an argument: the director might dismiss it. Whereas I would say, “Ooh, in the right context, this shot could be magical.”
Yesterday, I finally received the promised delivery of twenty-five contractual copies of The Icon Thief, which took me overnight from regarding my few author’s copies as the most precious things in the world to having more copies of my book than I’ll ever need. I’m not alone in this, of course: every author I’ve ever met has had a box or two of his or her own books on hand, or a whole bookcase taken up with that single title, like the Da Vinci Code shelf at your local thrift store. (As David Thomson says of his insane, widely derided, and oddly compelling study of Nicole Kidman: “I have fortress walls made of it.” Similarly, the biographer Michael Holroyd describes seeing “a long tall corridor that had been built entirely out of unsold copies of my books,” which he calls “an impressive, an undeniable spectacle.”)
So what do authors do with their own books? Ideally, if you’re of a certain temperament, you want to end up with a library like that of Isaac Asimov, who initially kept all the editions of his books, including translations, but finally ran out of room for anything but the English-language originals (and had to throw away the non-Asimov pages from magazines in which his work appeared). All the same, there’s a limit to the amount of space you have for your own work, at least until you can sell all of it to the University of Texas. Any prolific author will inevitably end up with more books than he needs, and may be tempted to shout to his publisher, like James Thurber in his story “File and Forget”: “I don’t want any more copies of my book. I don’t want any more copies of my book. I don’t want any more copies of my book.”
Of course, the best thing to do with spare copies of one’s own book is to send them around to various influential readers. Even the greatest authors have done this, as we see in a letter by Charles Darwin to Thomas Huxley:
Can you tell me of any good and speculative foreigners to whom it would be worth while to send copies of my book, on the ‘Origin of Species’?…I should like to send a few copies about, but how many I can afford I know not yet till I hear what price Murray affixes.
Emerson, among countless others, made sure that copies of his books were sent to all the important New York editors, listing each one by name, while Aleister Crowley eventually took over the job of selling the unsold copies of his books himself, and indignantly noted, against the rumors circulating in London, that his decision to leave his publisher “had nothing at all to do with the strangling of any woman.” (I know that this last story is a little off-topic, but I couldn’t resist.)
As for my own copies, I’ll keep a few around the house, one to read, one for the archives in a Mylar bag, and a couple of spares for emergencies. I owe copies to a number of people thanked in the acknowledgments, including those who kindly read earlier versions of the novel. As for the rest, they’ll probably end up in various hands, maybe even yours, if I ever get around to figuring out some kind of giveaway. (But don’t let that stop you from buying your own copy, just in case.) In the meantime, though, it’s nice to see them all lined up in one place, before they wander off to make their way in the world, and in this respect, if no other, I feel a little like Thomas Wolfe, who stared at copies of his first book in a store window so intently that somebody called the police.
Although my life has since taken me in a rather different direction, for a long time, I was convinced that I wanted to be a film critic. My first paying job as a writer was cranking out movie reviews, at fifty dollars a pop, for a now-defunct college website, a gig that happily coincided with the best year for movies in my lifetime. Later, I spent the summer of 2001 writing capsule reviews for the San Francisco Bay Guardian, during a somewhat less distinguished era for film—my most memorable experience was interviewing Kevin Smith about Jay and Silent Bob Strike Back. After college, I tried to get work as a film critic in New York, only to quickly realize that reviewing movies for a print publication is one of the cushier jobs around, meaning that most critics don’t leave the position until they retire or die, and when they do, there’s usually someone in the office—often the television reporter—already waiting in the wings.
In the years since, the proliferation of pop cultural sites on the Internet has led to a mixed renaissance for critics of all kinds: there are more professional reviewers than ever before, but their influence has been correspondingly diluted. Critics have always been distrusted by artists, of course, but these days, they get it from both sides: for every working critic, there are a thousand commenters convinced that they can do a better job, and the rest of us are often swayed less by the opinions of individual writers than the consensus on Rotten Tomatoes, which is a shame. At its best, a critic’s body of work is a substantial accomplishment in its own right, and personalities as dissimilar as those of Pauline Kael, Roger Ebert, and David Thomson—speaking only of film, which is the area I know best—have created lasting legacies in print and online. And while the critical profession is still in a period of transition, the elements of great criticism haven’t changed since the days of James Agee, or even Samuel Johnson.
So what makes a good critic? Knowledge of the field, yes; enthusiasm for art, most definitely. (A critic without underlying affection for his chosen medium, or who sees it only as an excuse for snark, isn’t good for much of anything.) Above all else, good criticism requires a curious mixture of the objective and the subjective. A critic needs to be objective enough to evaluate a work of art on its own terms—to review the work that the creator wanted to make, not the one that the critic wishes had been made instead—while also acknowledging that all good reviews are essentially autobiographical. Ebert has noted that his own criticism is written in the first person, and the most enduring critics are those who write, not as an authority delivering opinions from up on high, but as someone speaking to an intelligent friend. As a result, the collected works of critics like Ebert and Kael are the closest things we have these days to books that seem like living men or women, like Montaigne’s essays or The Anatomy of Melancholy. “Cut these words,” as Emerson said of Montaigne, “and they would bleed.”
Surveying the current crop of writers on the arts, my sense is that while we have many gifted critics, most of them fall short in one way or another. A critic like Anthony Lane, for all his intelligence, tends to treat the subject under consideration as an excuse for an arch bon mot (as with Star Trek: First Contact: “If you thought the Borg were bad, just wait till you meet the McEnroe.”) And while his wit can be devastating when aimed at the right target—The Da Vinci Code, for instance, or the occupants of the New York Times bestseller list—it often betrays both too much self-regard and a lack of respect for the work itself. On the literary side, James Wood has a similar problem: he’s a skilled parodist and mimic, but surely not every review obliges him to show off with one of his self-consciously clever pastiches. (If I were Chang-rae Lee, I’d still be mad about this.) The writers of the A.V. Club are more my style: in their pop cultural coverage, especially of television, they’ve struck a nice balance between enthusiasm, autobiography, and reader engagement. But I’m always looking for more. Which critics do you like?
For most of the past decade, the Kubrick film on this list would have been Eyes Wide Shut, and while my love for that movie remains undiminished—I think it’s Kubrick’s most humane and emotionally complex work, and endlessly inventive in ways that most viewers tend to underestimate—it’s clear now that The Shining is more central to my experience of the movies. I realized this only recently, after seeing it at midnight earlier this year at the Music Box in Chicago, but this is still a film that has been growing in my estimation for a long time. The crucial factor, perhaps unsurprisingly, was my decision to become a writer. Because while there have been a lot of movies about novelists, The Shining is by far our greatest storehouse of images about the inside of a writer’s head. “You’ve always been the caretaker,” Grady’s ghost says to blocked writer Jack Torrance, and his personality suffuses every frame of the movie whose uneasy center he occupies.
The visual, aural, and visceral experience of The Shining is so overwhelming that there’s no need to describe it here. Instead, I’d like to talk about the performances, which are the richest that Kubrick—often underrated in his handling of actors—ever managed to elicit. At one point, I thought that the film’s only major flaw was that it was impossible to imagine Jack Nicholson and Shelley Duvall as a married couple, but I’m no longer sure about this: there are marriages this strange and mismatched, and the glimpses of their relationship early in the movie are depressingly plausible. As David Thomson was among the first to point out, Nicholson is great when he plays crazy, but he’s also strangely tender in his few quiet scenes with his son. And Duvall gives what is simply one of the major female performances in the history of movies, even if we suspect, after hearing of the hundreds of takes she was forced to endure, that something more than mere acting was involved.
Tomorrow: The triumph of the studio system.
The recent release of Brian Kellow’s biography A Life in the Dark and the Library of America anthology The Age of Movies has led to a resurgence of interest in the career of Pauline Kael. Yet Kael never really went away, at least not for those of us who spend most of our waking hours—and you know who you are—reading about pop culture online. Maud Newton, writing in the New York Times, once credited, or blamed, David Foster Wallace for creating the ironic, slangy tone of modern blogs, but three decades earlier, Kael had forever shaped the way we talk about the movies, and, by extension, everything else we care about. Peel back the prose of any top critic on Rotten Tomatoes and you’ll find Kael peeking out from underneath, as writers mimic her snap judgments and rapid turns of phrase while often missing the depths that these surface flourishes concealed.
And while Kael is deservedly remembered for championing the cinema of the sixties and seventies, her lasting legacy is likely to be that of a stylist. I don’t think she’s the best or most insightful film critic of all time; for that honor, I’d nominate David Thomson, although I know he’s the man many movie lovers love to hate. As far as my own personal love of the movies is concerned, I owe the most to Roger Ebert. But Kael’s voice was the most distinctive of all the great film critics, and it’s been jangling in my head for decades. Phrases from her reviews nestle themselves into the corners of your brain, forever changing the way you think of the films under discussion, like her take on Altman’s visual flourishes in The Long Goodbye: “They’re like ribbons tying up the whole history of movies.” Even today, I can recite her enraptured description of the ending of The Fury, which Bret Easton Ellis cheerfully ripped off in his blurb for House of Leaves, almost by heart:
This finale—a parody of Antonioni’s apocalyptic vision at the close of Zabriskie Point—is the greatest finish for any villain ever. One can imagine Welles, Peckinpah, Scorsese, and Spielberg still stunned, bowing to the ground, choking with laughter.
But the trouble with Kael as a role model is that her breathless style, in the absence of a larger philosophy of film, can sometimes cover up the lack of deeper understanding, and, at its worst, turn into something alarmingly like trolling. Kael’s reviews as a whole can be nuanced, but her individual sentences (“The greatest finish for any villain ever”) rarely occupy any middle ground. Imitating Kael on the sentence level only feeds our current tendency, as a culture of online commenters, to believe that everything deserves either five stars or none. This all-or-nothing approach has been discussed before, notably in an excellent Crosstalk at The A.V. Club, but it’s worth noting that Kael is its unlikely godmother. And if that’s the case, then her influence is vaster than even her greatest admirers acknowledge: her style touches everything we write about the arts, both online and in traditional media, down to this very blog post.
Which makes it all the more important to remember that Kael’s style was the expression of a genuine love of movies. Kael could be cruel to movies she disliked, as in her famously savage (and not entirely inaccurate) dismissal of Raging Bull: “What am I doing here watching these two dumb fucks?” But underpinning it all was a fanatical belief in what movies could do, and a determination that they live up to the standards set by other works of art, which is a quality that many of her imitators lack. Kael was a lot of things, but she wasn’t ironic, and her style was less about showing off than a way of getting her readers to feel the same intense emotions that she did—and, of course, to watch the movies themselves. I’ve sought out countless films just so I could read Kael’s reviews of them, and I know I’m not alone in this. And while I’m not sure if Kael would approve, I suspect that she’d at least be glad I was watching the movies she loved so much.
No other work of art is so central to my love of movies as the last forty minutes of Vertigo. There are movies I admire more, but none I find as emotionally devastating, or as endless in its implications. It’s full of classic moments and images, some of which take several viewings to fully understand, but I may as well start with the most famous: the scene at the Hotel Empire, which you can watch here if you must, culminates in what is simply the greatest shot in the history of cinema. As the camera pans around Stewart and Novak, with Bernard Herrmann’s unforgettable score swelling in the background, we’re as close as we’ll ever get to the reasons we watch movies in the first place, in a sort of gorgeous rhapsody on love, art, and death. As Roger Ebert writes: “This shot, in its psychological, artistic and technical complexity, may be the one time in his entire career that Alfred Hitchcock completely revealed himself, in all of his passion and sadness.”
Not surprisingly, I’ve been obsessed with this movie for a long time. After hearing about it for years, I finally saw it in college, in a study carrel at Lamont Library, watching it on videocassette with headphones on a tiny television set. Later that evening, I came down with some kind of fever, and spent most of the night tossing and turning, convinced that the events of the movie had somehow been part of my own life. The second time I saw it was on the big screen, at the late lamented UC Theater in Berkeley, and the experience there was equally wrenching. And I’m not the only one who responds to it this way: when I showed it to one of my college roommates, he ended up in a fetal position. Vertigo tells us things about art and life, and how we’re driven to transform ourselves and others, that few other works have managed to express. As David Thomson notes, it’s a movie that grabs and haunts the viewer, especially for certain sensibilities:
It’s a test case. If you are moved by this film, you are a creature of cinema. But if you are alarmed by its implausibility, its hysteria, its cruelty—well, there are novels.
Watching it again with my wife last night, the implausibility, the hysteria, and the cruelty were all on clear display. It isn’t a perfect movie, although it has long stretches of icy perfection: the plot sometimes creaks, especially in the first half, and the dialogue scenes often feel like part of a lesser film. But all these concerns are swept away by the extraordinary third act, which may be my favorite in any work of art. I’ve noted before how the original novel keeps the big revelation for the very end, while the film puts it almost forty minutes earlier, shifting points of view and dividing the viewer’s loyalties in the process. It’s a brilliant change—arguably no other creative decision in any movie adaptation has had a greater impact—and it turns the movie from an elegant curiosity into something indescribably beautiful, and painful. The more I watch it, the more I’m convinced that no other American film is so staggeringly complex in its final emotional resonance.
It’s no accident, then, that I’ve been revising and rewriting Vertigo in my head for much of my life. After seeing it in college, I spent the better part of that summer working on a story that would fuse Vertigo with another great American film of startling depths: John Ford’s The Searchers. The project, to put it mildly, was more than I could handle, and I never came close to finishing it, but the vestiges can still be seen in the names of two important characters in The Icon Thief: Maddy and Ethan. Since then, Vertigo has remained a personal and professional touchstone, a movie that I’m constantly revisiting and engaging, for reasons that I can’t always explain. All I know is that no matter how many times I see it, ninety minutes into this remarkable movie, when Novak turns to the camera and the screen goes red, I’m sucked in and can’t escape—not any more than she can.
People have been reading in bed ever since there were books, if not beds, but the essential idea of the bedside book was perhaps first articulated by Thackeray, who wrote in his essay “On Two Children in Black”:
Montaigne and Howell’s Letters are my bedside books. If I wake at night, I have one or other of them to prattle me to sleep again. They talk about themselves forever, and don’t weary me. I like to hear them tell their old stories over and over again. I read them in the dozy hours, and only half remember them.
Despite its informal tone, this strikes me as an important moment in the history of literary criticism, because it describes a kind of reading that we all intuitively recognize. Our libraries are filled with one kind of book, our nightstands another, and although most bedside books have certain things in common, above all else, they’re a reflection of the reader’s personality. In some ways, what we read just before going to bed, or in the middle of the night, expresses more about who we are than the books we display for others—or ourselves—during the day.
So what makes a good bedside book? Ideally, given its specialized role, it should be a book that you can pick up casually and put down after a couple of minutes. As such, bedside books tend to have a miscellaneous quality: they’re often collections of short pieces, anthologies, or essays, rather than sustained arguments or narratives. They’re also books that you can open at random in hopes of finding something interesting. As a result, they might be books that you’ve read before and enjoy revisiting, or reference books with entries that don’t need to be read in any particular order. Not surprisingly, there’s a lot of overlap between the bedside book and the bathroom book—although you may want to keep them in two separate stacks.
Apart from these considerations, the ideal bedside book tends to be whatever else you’re reading at the time, so there are often two levels of books on the nightstand. The pictures shown here, of my own bedside table, are uncharacteristically tidy: usually, along with the more or less permanent occupants, there’s another pile of books I’m currently reading. Since my move, though, I’ve had to reconstruct my own bedside library from scratch, so what you see here is something of an idealized version of my nightstand. Note, too, that these pictures are missing the best bedside book in the world, William S. Baring-Gould’s original Annotated Sherlock Holmes, which has been promoted, or apotheosized, to a permanent position on my desk.
Instead, we have Leslie Klinger’s more recent New Annotated Sherlock Holmes, which is a charmer in its own right, along with Baring-Gould’s Annotated Mother Goose. We also have books on film, including David Thomson’s Biographical Dictionary and Have You Seen? and Pauline Kael’s 5001 Nights at the Movies; anthologies, including The Limits of Art and the incomparable Zen in English Literature and Oriental Classics; and, of course, books specifically designed to be read in bed, notably J. Bryan III’s Hodgepodge, Frank Muir’s Irreverent and Thoroughly Incomplete Social History of Almost Everything, and The People’s Almanac. These last three are resolutely old-school, but if you want something more contemporary and twee, Schott’s Original Miscellany will probably do.
The rest of the books reflect my own interests and tastes: A Pattern Language, one of the great books in the world, which I’m reading again as I settle into my new house; World Tales by Idries Shah; Dilys Winn’s classic Murder Ink; Sondheim’s Finishing the Hat, which I’m going to finish one of these days; Bill Simmons’s Book of Basketball, which is great bedside reading even if you aren’t a sports fan; The Essential Jesus by John Dominic Crossan; and the two volumes of Isaac Asimov’s original autobiography. (There should also be a copy of The Whole Earth Catalog here somewhere, along with The PreHistory of the Far Side, but these are still packed away.) And, of course, the iPad. You might think that the latter would make the rest obsolete, but that isn’t the case. Even after all this time, there’s something about reading a book in bed that technology can’t match, especially late at night, in Thackeray’s dozy hours.
It took me a long time to love Citizen Kane. When I first saw this most famous of all movies, which was finally released last week on a gorgeous Blu-ray, I was maybe ten years old, and already steeped, believe it or not, in the culture of such movie lists as the Sight & Sound poll. (I got an early start at being an obsessive film snob.) And my first viewing of Kane, which I knew had been universally acclaimed as the best film of all time, came as something of a shock. Looking back, I think my biggest issue was with the film’s insistent humor, since I had assumed that all great art had to be deadly serious. Xanadu and its brooding shadows were fine, but when we got to the moment when the stagehand holds his nose at Susan Alexander’s operatic debut, I didn’t know what to think. What kind of masterpiece was this, anyway?
Needless to say, in the years since, this sense of fun has become one of my favorite things about Kane, as it was for Pauline Kael and so many others. Like Hamlet, with its ghosts and swordfights, Kane is both popular and sublime, and it’s one of the first movies to directly communicate to the audience the director’s joy in his craft—the sense that a movie studio was “the biggest electric train set a boy ever had.” As Kael points out in “Raising Kane,” the movie is almost a series of blackout sketches, full of tricks and gags, and that underlying pleasure still comes through, especially in the earlier newspaper scenes, which feel like a glimpse of the RKO set itself: the Inquirer, with its exhausted but grateful staff, becomes a dream of all creative collaboration, the warmest memory in a movie that ends with the line “I think it would be fun to run a newspaper.”
And yet, as I’ve grown older, I’m also struck by the undercurrent of sadness and loss, which prompted David Thomson to say, in Rosebud: “This is the most moving picture ever made…Or ever will be.” More than any other film, Kane grows with time, both in the context of film history and in its viewers’ own lives. For one thing, it’s hard to watch it now without seeing it as a prophetic version of what would happen to Orson Welles himself, still only twenty-five and little more than a baby in the few scenes in which he appears with his own face. Welles was a greater man than Kane, but he was already preparing his own warehouse of memories, that incredible mass of stories, myths, and unfinished projects that he carried with him like an invisible Xanadu. Of all great directors, only Coppola—with the ghosts of Zoetrope and the Corleones lingering at the Rubicon estate—can claim to be so haunted.
But Kane isn’t really about Welles himself; it’s about all of us. There’s a reason why such disparate figures as Charles Schulz and Ted Turner have seen themselves in this story: among other things, it’s our best movie about youth and aging. Now that I’ve long since passed the age at which Welles made this film, I’m convinced that there’s no way I could fully appreciate it until now: when you’re twenty-five, the movie seems like a goad, or an exemplar, and it’s only when you’re a little older that you notice its preemptive nostalgia for the promise of youth already lost. I expect that the movie will continue to evolve and show different aspects as I get older, a hall of mirrors, like the one Kane walks through in his very last appearance. It’s an inspiration and a warning, a labyrinth without a center, as Borges writes. And yet running that newspaper still seems like so much fun.
Last night, not long after I mentioned The Lord of the Rings in my discussion of the future of storytelling, my wife and I found ourselves at Ravinia Park in Chicago, where we saw The Fellowship of the Ring with a full orchestra and choir performing Howard Shore’s famous score. An excited crowd had packed itself into the pavilion and lawn, and looking around, I was reminded of the true definition of a four-quadrant movie, which has nothing to do with marketing and everything to do with how it fires an audience’s imagination. “Three generations of any family,” David Thomson has drily noted, “could see [The Lord of the Rings] at the same time, in emotional comfort.” And it’s true. For one thing, I’m pretty sure that there were grandchildren in attendance last night who had not yet been born when the movie came out almost ten years ago.
And whatever its other qualities, the movie works. It still looks great, and the special effects, if not miraculous, do a fine job of serving the narrative and performances. And while I’m personally of the opinion that Peter Jackson never quite figured out the right tone for his material until The Return of the King, Fellowship still has the strongest story in the trilogy. There’s something inexpressibly satisfying about seeing the pieces of the epic falling into place, as the Fellowship is gathered, tested, and finally scattered. The other two movies have their moments, and Return of the King in particular is a masterpiece, but I’m guessing that when most viewers think back to their favorite scenes, whether they’re casual fans or Tolkien obsessives, this is the installment that first comes to mind. And the individual moments haven’t lost any of their power: when Aragorn beheads the Uruk-hai at the end, for instance, the entire auditorium erupted in cheers, drowning out the orchestra.
There are small problems here and there. Jackson’s treatment of Saruman’s army verges on Sam Raimi-style horror, and not in a good way; he occasionally botches big moments, like Galadriel’s speech, with overuse of special effects; and there’s a little too much slapstick in the Shire. All of these qualities would be progressively improved over the course of the trilogy, and to my relief, I found that the acting was strong from the very beginning. Now that we’ve come to know these actors so well, it’s important to remember that many of them were unknowns or doubtful quantities at the time, and in many cases, their performances have been enriched in retrospect. It’s hard to watch Orlando Bloom, for instance, without seeing something comic in Legolas’s unblinking intensity, while Viggo Mortensen, who once came off as miscast, now seems ideal as Aragorn. Throughout it all, Ian McKellen’s Gandalf remains the film’s perfect calm center—it’s a performance that looks even better as the years go by.
Watching the film again with an audience, for the first time in almost a decade, reminded me of how movies serve as markers in our own lives. When I first saw Fellowship of the Ring, I was a college senior; now I’m married and about to get my first mortgage. Movies, too, have changed. It would be premature to say that this kind of film now seems old-fashioned, with Deathly Hallows having done a commendable job with a rather different franchise, and the two parts of the Hobbit still on the way. Yet with Universal canceling The Dark Tower, directors like Guillermo del Toro unable to finance their dream projects, and the likes of Andy Hendrickson running the show at Disney, one senses a certain lack of the will that led New Line and Peter Jackson to risk so much on this trilogy. Thankfully, though, they did. And the movies are permanently richer as a result.
Writers are hired and fired from movies all the time, but few departures were more widely reported than Frank Darabont’s exit from Indiana Jones and the Kingdom of the Crystal Skull. Darabont himself has expressed amazement that the media cared so much: “Where were you guys when that other script four years ago went in the shitter? You weren’t paying attention because it wasn’t Spielberg, and it wasn’t Lucas, and it wasn’t Indiana Jones.” But it was hard not to care, especially when the movie itself turned out to be such a disappointment. For all its other problems, the story was especially weak, and it was common knowledge that Darabont had written a draft that Spielberg loved, but Lucas rejected. (As I’ve said before, Hollywood is the kind of place where the man who wrote The Shawshank Redemption is getting script notes from the guy who wrote Attack of the Clones.)
So it became almost an article of faith that the Darabont version would have resulted in a much better movie. And yet Darabont’s Indiana Jones and the City of the Gods, which I finally read over the weekend, isn’t all that great either. It’s incrementally more interesting than the final version, with some nice action scenes and a much better understanding of the relationship between Indy and Marion. There’s a pleasant air of intrigue and a few inspired double-crosses (which makes the insipid “triple agent” of the final version all the more infuriating). But the machinery of the plot takes a long time to get going, the central adventure never quite takes hold, and I missed Cate Blanchett’s Irina Spalko, if not Shia LaBeouf’s Mutt. If I had been Lucas, I probably would have asked for a rewrite as well. But the real takeaway is that no rewrite could have made up for the shakiness of the underlying conception.
The trouble is that in any version, the crystal skull simply isn’t an interesting artifact. Darabont himself seems slightly bored by it, and doesn’t bother explaining what it does or why it matters until the script is halfway over. Even in the last act, when we finally enter the City of the Gods, we aren’t quite sure what the big deal is. Compared to a movie like Last Crusade, which had a wonderful screenplay by Jeffrey Boam that made the emotional stakes exceptionally clear, it’s hard to forgive this kind of narrative confusion, especially when the payoff is so underwhelming. (Its treatment in the final version of the script, as written by David Koepp, is even less satisfying: instead of searching for the skull, most of the movie is devoted to putting it back where it came from, which isn’t the best way to build narrative momentum.)
Of course, you could argue that the artifact is less important than the man pursuing it: Temple of Doom, after all, is essentially about the recovery of some sacred rocks. But City of the Gods is an uncomfortable reminder that we aren’t interested in the things Indy does because we like Indiana Jones; we like Indiana Jones because he does interesting things. Without a decent plot, he becomes the Harrison Ford of the past decade, the man David Thomson accurately saw as a “limited, anxious actor” with little interest in charming the audience. Given the right material, Ford can be wonderful, but he was never an actor who could elevate a film simply with his own presence. He needed Indy as much as Indy needed him. And neither Darabont nor his successors, alas, could ever quite figure out how to bring Indy back.
A few months ago, after greatly enjoying The Conversations, Michael Ondaatje’s delightful book-length interview with Walter Murch, I decided to read Ondaatje’s The English Patient for the first time. I went through it very slowly, only a handful of pages each day, in parallel with my own work on the sequel to The Icon Thief. Upon finishing it last week, I was deeply impressed, not just by the writing, which had drawn me to the book in the first place, but also by the novel’s structural ingenuity—derived, Ondaatje says, from a long process of rewriting and revision—and the richness of its research. This is one of the few novels where detailed historical background has been integrated seamlessly into the poetry of the story itself, and it reflects a real, uniquely novelistic curiosity about other times and places. It’s a great book.
Reading The English Patient also made me want to check out the movie, which I hadn’t seen in more than a decade, when I watched it as part of a special screening for a college course. I recalled admiring it, although in a rather detached way, and found that I didn’t remember much about the story, aside from a few moments and images (and the phrase “suprasternal notch”). But I sensed it would be worth revisiting, both because I’d just finished the book and because I’ve become deeply interested, over the past few years, in the career of editor Walter Murch. Murch is one of film’s last true polymaths, an enormously intelligent man who just happened to settle into editing and sound design, and The English Patient, for which he won two Oscars (including the first ever awarded for a digitally edited movie), is a landmark in his career. It was with a great deal of interest, then, that I watched the film again last night.
First, the good news. The adaptation, by director Anthony Minghella, is very intelligently done. It was probably impossible to film Ondaatje’s full story, with its impressionistic collage of lives and memories, in any kind of commercially viable way, so the decision was wisely made to focus on the central romantic episode, the doomed love affair between Almásy (Ralph Fiennes) and Katherine Clifton (Kristin Scott Thomas). Doing so involved inventing a lot of new, explicitly cinematic material, some satisfying (the car crash and sandstorm in the desert), some less so (Almásy’s melodramatic escape from the prison train). The film also makes the stakes more personal: the mission of Caravaggio (Willem Dafoe) is less about simple fact-finding, as it was in the book, than about revenge. And the new ending, with Almásy silently asking Hana (Juliette Binoche) to end his life, gives the film a sense of resolution that the book deliberately lacks.
These changes, while extensive, are smartly done, and they respect the book while acknowledging its limitations as source material. As Roger Ebert points out in his review of Apocalypse Now, another milestone in Murch’s career, movies aren’t very good at conveying abstract ideas, but they’re great for showing us “the look of a battle, the expression on a face, the mood of a country.” On this level, The English Patient sustains comparison with the works of David Lean, with a greater interest in women, and remains, as David Thomson says, “one of the most deeply textured of films.” Murch’s work, in particular, is astonishing, and the level of craft on display here is very impressive.
Yet the pieces don’t quite come together. The novel’s tentative, intellectual nature, which the adaptation doesn’t try to match, infects the movie all the same. It feels like an art film that has willed itself into being an epic romance, when in fact the great epic romances need to be a little vulgar—just look at Gone With the Wind. Doomed romances may obsess their participants in real life, but in fiction, seen from the outside, they can seem silly or absurd. The English Patient understands a great deal about the craft of the romantic epic, the genre in which it has chosen to plant itself, but nothing of its absurdity. In the end, it’s just too intelligent, too beautifully made, to move us on more than an abstract level. It’s a heroic effort; I just wish it were something a little more, or a lot less.
Last night, my wife and I watched the great documentary Hearts of Darkness: A Filmmaker’s Apocalypse, which will hopefully bring my resurgent fascination with Apocalypse Now to a close, at least for the moment. (Which is something my wife is probably glad to hear.) And yet I’m still not quite sure why this movie, so extraordinary and yet so flawed, seized my imagination so forcefully again, when it had been at least ten years since I saw it in any form. Part of it, obviously, was learning about Walter Murch’s fascinating editing process in the book The Conversations, but I think it’s also because this movie represents an audacity and willingness to take risks that has largely passed out of fashion, and which I’m trying to recover in my own work, albeit at a much more modest scale.
For those of us who were too young, or unborn, to remember when this movie came out, here’s the short version. Francis Coppola, coming off the great success of the two Godfather movies, decides to make Apocalypse Now, from a script by John Milius, as the first movie by his nascent Zoetrope Studios, even though he isn’t sure about the ending. Instead of the small, guerrilla-style movie that other potential directors, including George Lucas, had envisioned, Coppola elects to make a big, commercial war movie “in the tradition of Irwin Allen,” as he says in Hearts of Darkness. He pays the most important actor in the world, Marlon Brando, three million dollars for three weeks of filming. The entire Philippine air force is placed at his disposal. He goes off into the jungle, along with his entire family and a huge production team—and then what?
Well, he goes deeper. He throws out the original ending, fires his lead actor (Harvey Keitel, who was replaced with Martin Sheen after filming had already begun), and puts millions of dollars of his own money on the line. When Brando arrives, hugely overweight and unable to perform the role as written, the rest of the production is put on hold as they indulge in days of filmed improvisations, searching for a way out of their narrative bind. Coppola is convinced that the movie will be a failure, yet seems to bet everything on the hope that his own audacity will carry him through. And it works. The movie opens years behind schedule and grossly over budget, but it’s a huge hit. It wins many awards and is named one of the greatest movies of all time. Coppola survives. (It isn’t until a couple of years later, with One From the Heart, that he meets his real downfall, not in the jungle but in his own backyard.)
This is an astonishing story, and one that is unlikely ever to repeat itself. (Only Michael Bay gets that kind of money these days.) And yet, for all its excesses, the story has universal resonance. Coppola is the quintessential director, even more than Welles. His life reads like the perfect summation of the New Hollywood: he began in cheap quickies for the Roger Corman factory, became an Academy Award-winning screenwriter, created two of the greatest and most popular movies in history, became rich enough almost to be a studio in himself, gambled it all, won, gambled it all again, lost, spent a decade or more in the wilderness, and now presides over a vineyard, his own personal film projects, and the most extraordinary family in American movies. (Any family that includes Sofia Coppola, Jason Schwartzman, and Nicolas Cage is in a class by itself.)
So what are the lessons here? Looking at Coppola, I’m reminded of what Goethe said about Napoleon: “The story of Napoleon produces on me an impression like that produced by the Revelation of Saint John the Divine. We all feel there must be something more in it, but we do not know what.” And that’s how I feel about St. Francis of the Troubles, as David Thomson so aptly calls him. No director—not Lucas, not Spielberg, not Scorsese—has risked or accomplished more. If Zoetrope had survived in the form for which it had been intended, the history of movies might have been different. Instead, it’s a mirage, a dream, like Kane’s Xanadu. All that remains is Coppola’s voice, so intimate in his commentary tracks, warm, conversational, and charged with regret, inviting us to imagine what might have been.
Since yesterday’s posting on The Shining and Apocalypse Now, I’ve been thinking a lot about Stanley Kubrick and Francis Ford Coppola, who arguably had the two greatest careers in the past half century of American film. There have been other great directors, of course, but what sets Kubrick and Coppola apart is a matter of scale: each had a golden age—for Coppola, less than a decade, while for Kubrick, it lasted more than thirty years—when they were given massive budgets, studio resources, and creative control to make intensely, almost obsessively personal movies. The results are among the pillars of world cinema: aside from the two movies mentioned above, this period gave us the Godfather films, 2001: A Space Odyssey, A Clockwork Orange, and more.
And yet these two men are also very different, both in craft and temperament. I’ve been listening to Coppola’s commentary tracks for the better part of a week now, and it’s hard to imagine a warmer, more inviting, almost grandfatherly presence—but even the most superficial look at his career reveals a streak of all but suicidal darkness. As David Thomson puts it:
[Coppola] tries to be everything for everyone; yet that furious effort may mask some inner emptiness. For he is very gregarious and very withdrawn, the life and soul of some parties, and a depressive. He is Sonny and Michael Corleone, for sure, but there are traces of Fredo, too—and he is at his best when secretly telling a part of his own story, or working out his fearful fantasies.
Kubrick, in some respects, is the opposite: a superficially cold and clinical director, deeply pessimistic about the human condition, who nonetheless was able to work happily and with almost complete creative freedom for the better part of his career. His films are often dark, but there’s also an abiding sense of a director tickled by the chance to play with such wonderful toys—whether the spaceships of 2001 or the fantastically detailed dream set of New York in Eyes Wide Shut. Coppola, by contrast, never seems entirely content unless the film stock is watered with his own blood.
These differences are also reflected in their approaches to filmmaking. Coppola and Kubrick made some of the most visually ravishing movies of all time, but the similarities end there. Kubrick was controlling and precise—one assumes that every moment had been worked out in advance in script and storyboard—while Coppola seemed willing to follow the inner life of the movie wherever it led, whether through actors, the input of valued collaborators like Walter Murch, or the insane workings of chance or fate. This allowed him to make astonishing discoveries on set or in the editing room, but it also led to ridiculous situations like the ending of Apocalypse Now, where he paid Marlon Brando three million dollars to spend three weeks in the Philippines, but didn’t know what would happen when he got there. (And as the last scenes of the movie imply, he never did entirely figure it out.)
So what do these men have to tell us? Kubrick’s career is arguably greater: while you can debate the merits of the individual movies, there’s no doubt that he continued to make major films over the course of four decades. Coppola, alas, had eight miraculous years in which he changed film forever, and everything since has been one long, frustrating, sometimes enchanting footnote (even if, like me, you love his Dracula and One From the Heart). It’s possible that Coppola, who spent such a long time in bankruptcy after his delirious dreams had passed, wishes he’d been more like Kubrick the clinician. And yet Coppola is the one who seems to have the most lessons for the rest of us. He’s the model of all true artists and directors: technically astounding, deeply humane, driven to find something personal in the most unlikely subjects, visionary, loyal, sometimes crazy, and finally, it seems, content. We’re all Coppola’s children. Kubrick, for all his genius, is nothing but Kubrick.