Posts Tagged ‘Walter Murch’
“I believe that nothing completely satisfies an imaginative writer,” wrote Frederick Locker-Lampson, “but copious and continuous amounts of unmitigated praise, always provided it is accompanied by a large and increasing sale of his works.” I’d like to say that this is a humorous exaggeration, but really, it’s pretty much the truth. Writers, by nature, are insecure creatures: they’ve chosen a trade that offers few visible rewards for years on end, often in the face of justified skepticism from their family and friends, and even those who make it into print generally only do so after much rejection. Once they’ve been published, they’re likely to find themselves confronted with an entirely different set of problems: the fact that their work is freely available to public opinion leaves them perpetually skinless, to use Walter Murch’s memorable phrase, and these days, a writer who wants to obsess over sales figures and reviews can do so in real time, a prospect that might have made even Locker-Lampson’s head explode.
And the amount of information available to contemporary writers only magnifies their natural tendency to emphasize bad news over good. No matter how well things might be going in other respects, there’s always a lukewarm reader review, a dip in sales rank, or a list of award nominees that glaringly omits the writer’s own name. Worst of all is what I like to think of as the Colonel Cathcart complex, in which a writer can’t be altogether happy if there’s another author out there somewhere, his age or younger, who is doing ever so slightly better in the same general field. Few writers, no matter how emotionally healthy they might be in other respects, can bring themselves to view their own success in absolute terms: it’s always the relative measure that stings. Which is really just a particularly ingenious way of guaranteeing that no writer can ever be entirely content. “Writers seldom wish other writers well,” Saul Bellow says, in a slightly softened version of Gore Vidal’s more pointed observation: “Every time a friend succeeds, I die a little.”
None of these observations are new, of course; even if they hadn’t been confirmed by other writers, they’re facts that any writer can verify just by consulting his own feelings whenever another record-breaking advance or movie deal is announced. And all the evidence implies that such dissatisfaction is a permanent part of the writing life. If you had a laboratory in which you could assemble a perfect writer, one whose career followed a perfect trajectory—early acclaim yielding to massive mainstream success and a second golden period in old age—you’d end up with Philip Roth, whose unhappiness with his own life’s work is a matter of record. But the most terrifying truth of all is that these feelings aren’t an undesirable side effect of a writer’s existence, but an essential element of it. Any writer who survives to produce more than a few good books is a creature who has been forced to evolve under considerable environmental pressure, and the one common trait that lies beneath all great careers is the refusal to be satisfied.
In an ideal world, this kind of professional envy would concentrate solely on matters of art: it’s natural and presumably healthy to want to write better books than any of one’s peers. (Like most writers, I’d like to believe that if the books I wrote already existed, I’d be content just to read them, and leave the hard work to someone else.) Yet this obsession with the quality of one’s craft shades naturally into the less positive characteristics that are equally central to a writer’s identity. A writer is like a show dog who has been bred for certain desirable characteristics that happen to go hand in hand with chronic, sometimes crippling problems, like a Pekingese whose flat face leads to trouble breathing, or a Great Dane with hip dysplasia. For writers, the desirable qualities are perfectionism and obsession with craft; the side effects, sadly, are insecurity and jealousy. As far as treating the condition goes, a steady drip of praise and good sales is one answer; drugs and alcohol are another; but the best cure, inevitably, is work, as Norman Mailer once said with regard to his own bad reviews:
[They] put iron into my heart again, and rage…and so one had to mend, and put on the armor, and go to war, go out to war again, and try to hew huge strokes with the only broadsword God ever gave you, a glimpse of something like Almighty prose.
I hate getting notes. I’m well aware that they’re an important part of the writing process, and I’ve learned from hard experience that you ignore them at your peril, but still, the moment before I open a letter full of editorial comments is always an uneasy one. The great editor Walter Murch, talking about movie preview screenings, describes feeling “skinless” beforehand, and that’s as good a characterization as I can imagine. You’ve spent weeks or months living with a story, and by now, you’ve been so thoroughly exposed to every word that you take it all for granted—but that feeling goes away as soon as an outsider presumes to give you feedback. At this point, I have a trusted circle of readers whose opinions I seek out for every novel I write, but even now, whenever I’m about to look at what they’ve actually said, I’m already telling myself that these are only suggestions, and maybe even preparing myself to pick and choose which notes to really take seriously.
The short answer, at least when it comes to notes from someone with a direct stake in the novel, like an editor or an agent, is that you should listen to every goddamned one. And I say this as much from an artistic as a pragmatic perspective. You may feel that their comments miss the point, that they’ve overlooked important subtleties in the story, that the changes they’re suggesting would irrevocably alter the fine web of narrative you’ve constructed. But I’ve invariably found that there are always ways to address an intelligent reader’s specific concerns while maintaining the heart of what you want to express. To admit anything less would be to confess to a failure of craft or nerve. Sometimes you’ll need to split the difference, or give certain comments more weight than others. But I rarely feel satisfied until I can look back at an editorial letter and confirm that I’ve crossed every last point off the list. (And as an aside, I should note that if an agent tells you that a novel needs to be cut, he’s almost certainly right.)
When it comes to other readers, you can exercise greater discretion. Of my usual circle, a few have been recruited primarily to check the text for obvious factual or continuity errors, or to give me their overall impressions rather than a detailed response. When one of them comes back with a question about how a certain line is worded, I often ignore it: at this point in my life, I’m reasonably secure in my basic writing skills, and if I think a sentence works, I’m likely to keep it. But not always. If the fix is an easy one, and I don’t feel strongly one way or the other, I’ll sometimes make the change. After all, I can always go back and restore it. In practice, however, I tend to forget, and in the end, the change is absorbed imperceptibly into the larger text. It helps, of course, that I’ve chosen my readers carefully, that I’ve worked with them in the past, and that I’m reasonably confident that they’ll heed T.S. Eliot’s sage advice: “An editor should tell the author his writing is better than it is. Not a lot better, a little better.”
Which is the most important point of all. You need to choose your readers wisely, not just for the quality of the feedback they provide, but for the trust they inspire. And not every potential reader will qualify. It won’t be true of everyone in your writer’s group, or in your short fiction class, or on the board of your college literary magazine. Part of being a writer is knowing which readers merit your unqualified respect, and giving it to them once they’ve earned it. When you look back at the famously combative correspondence between Raymond Carver and his editor Gordon Lish, you can sense the initial positive emotions begin to shake, then sour, then boil over:
Now, I’m afraid, mortally afraid, I feel it, that if the book were to be published as it is in its present edited form, I may never write another story…I think I had best pull out, Gordon, before it goes any further. I realize I stand every chance of losing your love and friendship over this. But I strongly feel I stand every chance of losing my soul and my mental health over it, if I don’t take that risk.
Yet Carver stuck it out, and we’re all the better for it. But not every reader is worthy of such trust, as much as they should all strive to deserve it. Tomorrow, I’m going to delve into an even more difficult topic: what to do when someone asks for your thoughts on a story.
There are no rules in screenwriting, as we all know, but one of them is this: you must never ever open your first draft screenplay with a courtroom scene.
—William Goldman, Which Lie Did I Tell?
He’s right. At first, a courtroom scene might seem like a decent opening for a movie. It satisfies the crucial requirement, as laid out usefully by screenwriter Terry Rossio, that every scene in a script be built around a clearly identifiable situation—and there’s nothing more familiar than a courtroom. We know the location, the players, the rules of engagement, and as a result, it gives us a convenient vehicle for generating suspense or drama. The sticking point, the pitfall that makes it impossible to use this as an opening scene, is the huge cast it involves. As Goldman points out, starting a screenplay in court involves laying out multiple characters in quick succession, and after we’ve been introduced to “Melvin Marshall, a bulldog in the courtroom” and “the legendary Tommy ‘the Hat’ Marino” and “Judge Eric Wildenstein himself,” our eyes start to glaze over. In a movie, this kind of scene works fine—we can use the faces of the actors to tell them apart. But in a printed screenplay, or a novel, all these names just blur together. Prose fiction is good at a lot of things, but one of its weaker points, especially at the start of a story, is introducing a large cast in a short period of time without confusing or annoying the reader.
Most good authors seem to understand this, but it’s one of the most common mistakes I find in beginning fiction. When I was reading submissions for my college literary magazine, almost without exception, I’d read the first paragraph of a new story, pause, and then read it over again, because the author was introducing too much information at once. There’s the protagonist, Gerald, and his sister, Sarah, talking about a third person, Horatio, whom we haven’t met yet, and they’re in the kitchen and it’s somewhere in Delaware and maybe there’s some kind of a war, and although I’ve been given a lot of material, I don’t have a single narrative thread to follow. Readers can handle a lot of complexity, but not when it’s deployed in one big lump. And while this sort of problem is much less common in professional short stories that have gone through an editor or two, it’s surprisingly common in science fiction. A lot of the stories in Analog, for instance, begin with a page that makes my head hurt, as we’re introduced to an exotic setting and some advanced technology and a bunch of alien names, and while certain readers seem to enjoy the process of puzzling out what the story is trying to say, I’m not among them.
The best thing a writer can do is begin by focusing on a single character with a clearly defined objective, and then gradually expand the narrative from there. You can, if you like, give us two characters in conflict, but no more than that, at least not until we’ve been adequately grounded in the players we’ve seen so far. Three is definitely a crowd. While editing the sound for THX-1138, Walter Murch discovered that when two characters were walking on screen, he had to carefully sync the sound of their footsteps to the movement of their bodies, but when there were three or more, he could lay the footsteps in anywhere—it was impossible for the audience to match the sound of individual steps to what was on the screen. This made his job easier, but it also led him to conclude that audiences, in general, have trouble keeping track of more than three elements at once. And this applies to more than just sound. Metcalfe’s Law tells us that the value of a social network—like a cast of characters—is proportional to the square of the number of players, and while this complexity can be wonderful when it comes to the overall shape of a story, when presented to us all at once, our natural response is to become frustrated and bored. Presenting the characters one at a time, and giving them clear objectives, is the smartest way to avoid this.
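Murch’s rule of three has a simple combinatorial basis: as Metcalfe’s Law suggests, the number of potential relationships in a cast grows roughly with the square of its size. A minimal sketch of how quickly those pairwise relationships pile up (the function name here is my own, purely for illustration):

```python
def pairwise_relationships(cast_size):
    """Number of distinct character pairs: n choose 2, which grows ~n^2/2."""
    return cast_size * (cast_size - 1) // 2

for n in [1, 2, 3, 5, 10]:
    print(f"{n} characters -> {pairwise_relationships(n)} relationships")
```

Two characters give the reader a single relationship to track; three already give three, and ten give forty-five — which squares neatly with Murch’s finding that three is where an audience starts to lose the thread.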
And although movies and television are significantly better than prose fiction at presenting us with a large cast, the best of them approach the problem in the same way. As I’ve mentioned before, there’s no better introduction to an enormous cast than the opening scene of The Godfather, which does precisely what I’m advocating here: it starts with an extended close-up of a minor character, Amerigo Bonasera, and allows him to fully explain his situation before cutting to Don Corleone’s response. Later, at the wedding, we’re introduced to each of the major characters in turn, and each is defined by a clear problem or objective. As the movie progresses, these characters will acquire staggering complexities—but it’s that first, simple introduction that locks each of them into place. A similar process occurs in the pilot for Cheers, in which the regular characters enter one at a time until the show’s world is fully populated. By establishing the characters gradually and clarifying their relationships one by one, you’ll prepare the reader or the audience for the complications to follow. Once all the characters have been introduced, you can take full advantage of the possibilities that a large ensemble presents. But don’t do it all at once.
Movies are made in the editing room. It’s a cliché, but it’s also true: you can shoot the best raw footage in the world, but if it doesn’t cut together, the movie isn’t going to work. Beyond the basic responsibilities of maintaining continuity and spatial coherence, the editor is largely responsible for shaping a film’s narrative momentum, streamlining and clarifying the story, and making sure it runs the proper length. And sometimes the editor’s role goes even further. As Charles Koppelman writes in Behind the Seen:
[Walter] Murch says it’s common in editing, and normally easy, to steer scenes five or ten degrees in either direction from their intended course. Shading intensity, favoring a character, softening a moment—that’s “the bread and butter of film editing,” as he calls it. “It also seems that flipping the polarity of a scene—going completely the opposite way from where things were originally intended—is something relatively easy to do in film editing.”
And although there are countless famous cases of movies being radically rewritten in the editing room, like Ralph Rosenblum’s brilliant reshaping of Annie Hall, a casual comparison between the published screenplays and the finished versions of most great movies reveals that crucial changes are being made all the time. To pick just one example: the closing montage of words and images at the end of The Usual Suspects, which gives the entire movie much of its power, is totally absent in the script, and a lot of the credit here needs to be given to editor John Ottman. And smaller, less flashy examples are visible everywhere you look.
At first glance, it might seem as if a novelist is in a somewhat different position. A film editor is constrained by the material at hand, and although in certain cases he may have some input when it comes to expensive reshoots, for the most part, he has no choice but to make do with the footage that results from principal photography, which can be massaged and reconceived, but only to some extent, with the help of clever cutting, wild lines, and lucky discoveries in the slate piece. (The slate piece, as I’ve mentioned before, is the second or two of stray film left at the beginning of a take, before the actors have even begun to speak. Mamet likes to talk about finding important bits of footage in this “accidental, extra, hidden piece of information,” and he isn’t lying—the evocative, ominous shots of empty corridors in the hospital scene in The Godfather, for instance, were salvaged from just such a source.) A novelist, by contrast, can always write new material to fill in the gaps or save an otherwise unworkable scene, and it doesn’t cost anything except time and sanity. In reality, however, it isn’t quite that easy. The mental state required for writing a first draft is very different from that of revision, and while writers, in theory, benefit from an unlimited range of possibilities, in practice, they often find themselves spending most of their time trying to rework the material that they already have.
This is why I’ve become increasingly convinced that writing is revision, and in particular, it’s about cutting and restructuring, especially with regard to reducing length. Fortunately, this is one area, and possibly the only area, in which writers have it easier now than ever before. In The Elements of Style, E.B. White writes:
Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.
There’s something appealing about the image of a writer literally cutting his work using scissors and tape, and it’s possible that there’s something tactile in the process that would lead to happy accidents—which makes me want to try it sometime. These days, however, it’s so easy to cut and restructure files in Word that it seems insane for a writer not to take full advantage of the opportunity. Like editing a movie in Final Cut Pro, it’s nondestructive: you can try anything out and reverse it with a keyboard shortcut. You can cut as much as you like and restore it with ease, as long as you’ve taken the precaution of saving a new version with every round of revision. And I’ve learned that if it occurs to you that something could be cut, it should be. Nine times out of ten, once that initial change has been made, you won’t even remember what was there before—and if, five or ten rereadings later, you find that you still miss it, it’s a simple matter to restore what used to be there.
And almost invariably, the shorter and more focused the story becomes, the better it gets. Not only is cutting a story as much as possible the best trick I know, in some ways, it’s the only trick I know. When I look back at my own published work, I naturally divide it into several categories, based on how happy I am with the finished result. At the top are the stories—The Icon Thief, “The Boneless One,” and a handful of others—that I don’t think I’d change much at all, followed by a bunch that I’d like to revise, and a couple that I wish hadn’t seen print in their current form. Without exception, my regrets are always the same: I wish I’d cut it further. The conception is sound, the writing is fine, but there are a few scenes that go on too long. And although it’s impossible to know how you’ll feel about one of your stories a year or two down the line, I almost always wish I’d made additional cuts. That’s why, as I begin the final push on Eternal Empire, I’m cutting even more savagely than my critical eye might prefer, trying to think in terms of how I’ll feel ten months from now, when the novel is published. (The divergence between my present and future selves reminds me a little of the gap between Nate Silver’s “now-cast” and his election day forecast, which will finally converge on November 6.) I don’t know what my future self will think of this novel. But I can almost guarantee that he’ll wish that I’d cut a little more.
Recently, as I prepare to make the last round of cuts and revisions to my third novel, I’ve been reading one of my favorite books, Charles Koppelman’s Behind the Seen. The book’s rather cumbersome subtitle is How Walter Murch Edited Cold Mountain Using Apple’s Final Cut Pro and What This Means for Cinema, and while this may not sound like a page-turner to most people, it’s one of the five or six best books on film I know. As I’ve made clear before, Walter Murch—the man whom David Thomson describes as “the scholar, gentleman, and superb craftsman of modern film,” and whom Lawrence Weschler calls, more simply, “the smartest man in America”—is one of my heroes, and for those who are interested in narrative and technical craft of any kind, this book is a treasure trove. Yet here’s the thing: I don’t much care for Cold Mountain itself. I watched it dutifully when I first read the book, and although I’ve since revisited Koppelman’s account of Murch’s editing process countless times, nothing of the actual movie has lingered in my memory. I was startled last night, for instance, to realize that Philip Seymour Hoffman plays an important supporting character: his performance, like the rest of the movie, has simply melted away.
This paradox grows all the stronger when we examine the rest of Murch’s filmography. The English Patient, as I’ve said elsewhere, is an intelligent movie of impressive texture and skill, and Murch deserved the two Oscars he won for it. But as with Cold Mountain, I can barely remember anything about it, with only a handful of images left behind even after two viewings. I couldn’t get more than halfway through Hemingway & Gellhorn, despite being fascinated by Murch’s account of his work on it at last year’s Chicago Humanities Festival. Murch has worked as a sound designer on many great movies, above all Apocalypse Now, but when it comes to his primary work as an editor, his only unqualified masterpiece remains The Conversation. (As strange as it sounds, of all the movies that he’s edited, the one I enjoy the most is probably The Godfather Part III.) I have no doubt that Murch approached all these projects with the same care, diligence, and ingenuity that shines through all of his published work and interviews, but in movie after movie, that last extra piece of inspiration, the one that might have given a film a permanent place in my imagination, just isn’t there.
Part of this may be due to the inherent limitations of an editor’s role, since even the most inventive and resourceful editor is ultimately constrained by the material at hand and the quality of his collaborators. But I prefer to think of it, in a larger sense, as a warning about the limits of technique. Movies, for the most part, are technically wonderful, and they’ve been advancing along all the dimensions of craft—cinematography, sound, art direction—since the invention of the medium. Progress in art is never linear, but with respect to craft, progress is continuous and ongoing, with each generation adding to its predecessor’s bag of tricks, and as a result, movies look and sound better now than they ever have before. Moreover, nearly without exception, professionals in film are good at their jobs. Even the directors we love to hate, like Michael Bay, arrived at their position after a fierce process of natural selection, and in the end, only the most tremendously talented and driven artists survive. (Bay, alas, has one of the greatest eyes in movies.) Not everyone can be as articulate or intelligent as Murch, but for the most part, movies these days, on a technical level, are the product of loving craftsmanship.
So why are most movies so bad? It has nothing to do with technique, and everything to do with the factors that even the greatest craftsmen can’t entirely control. When you look at a student project from any of our major film schools, the technical aspects—the lighting, the camerawork, even the acting—are generally excellent. It’s the stories that aren’t very good. For all the tricks that storytellers have accumulated and shared over a century of making movies, decent scripts are either tantalizingly elusive or destroyed along the way by the hands of studio executives—which is one role in the movie business where talent does not tend to rise to the top. And the proof is everywhere, from John Carter on down. If there’s one movie artist who rivals Murch for his intelligence, good advice, and willingness to discuss aspects of his craft, it’s screenwriter William Goldman, who hasn’t written a movie since Dreamcatcher. Technique only gets you so far; the rest is a mystery. And even Murch understands this. On the wall of his editing studio, we’re told, hangs a brass “B.” Koppelman explains what it means: “Work hard to get the best grade you can—in this world, a B is all that is humanly attainable…Getting an A? That depends on good timing and the whims of the gods.”
Last week, after a short break, I went back and reread the rough draft of Eternal Empire, my third novel, and immediately had something close to a panic attack. I was surprised by this, because my initial read, right after finishing the draft, was highly positive—I thought it had the potential to be the best novel I’d ever written. The second time around, however, I could hardly find anything right with it: it seemed too slow, too padded, and above all too long. Looking at it more objectively, I could tell that the structure was ultimately sound, and I knew intellectually, if not viscerally, that the set pieces and story points were all good. I hadn’t constructed this novel haphazardly; I’d approached it with a solid plan. (As David Mamet says: “The more time you have invested, and the more of yourself you have invested in the plan, the more secure you will feel in the face of terror.”) All the same, I was left with a problem: the book was at least 15% too long, after close to the same amount had already been cut from the previous draft, and I had just over four weeks to fix it.
What I’m about to describe is going to sound slightly insane, but please bear with me. I began by going through my printed draft with a pencil and crossing out anything I could. For the most part, I wasn’t so much reading the chapters, which I knew fairly well by that point, as regarding them with the eye of a sculptor: I was cutting paragraphs that seemed too long, unbroken chunks of exposition, lengthy speeches, anything that looked like it was taking up too much space. If I had two long paragraphs in a row, I asked myself if what they were saying could be better expressed in one, and nearly every time, the answer was yes. And I paid particular attention to the beginning and end of each scene, looking for ways to get into the scene later and leave earlier, as well as cutting anything that seemed purely transitional, which can be as simple as starting with two characters already in a room instead of out in the hallway. Every now and then, I’d create a PDF of the draft and flip through it rapidly on my laptop, looking for moments when a chapter seemed to run a page or two longer than I was expecting, working mostly by intuition.
This may seem like a strange way of operating, but it’s not so different from what a film editor like Walter Murch does when he views a movie at high speed or with the sound turned down: I’m not worrying about the details, but focusing on big structural elements, which often express themselves visually on the page. Robert Louis Stevenson says somewhere that all the words on a well-written page should look more or less the same, and to my mind, that’s also true of paragraphs. I’m not saying that every paragraph should be the same length, but that there’s a basic rhythm of description, action, and dialogue that I try to hit on a consistent basis, which is visually apparent at a glance. After all, when you’re browsing through a novel in a bookstore, you aren’t necessarily reading the words: you’re looking at the page to see whether it resembles your personal standard of readability. We all have a different sweet spot, but it’s one that we can intuitively recognize, once we’ve read enough books we like. And even when we’re reading a novel for real, we tend to approach the words on a page with a different state of mind when we see, out of the corner of one eye, that the chapter is about to end—a subliminal factor that doesn’t exist in film.
Personally, I’m convinced that this kind of high-level, predominantly visual approach to editing has a real impact on the experience of a reader who is encountering the story for the first time, moment by moment. And although this shouldn’t be the only editing approach a writer uses, it’s a valuable one, especially at the early stages of the editing phase, when you’re crossing out pages wholesale and focusing on the big picture. There will be plenty of time for granularity later, and if you find, on rereading, that you’ve accidentally cut out something important, you can always restore it. (This, incidentally, is why it’s important to save a new version of your manuscript with each major iteration of editing.) In my own case, by the time I’d finished this part of the process, I found that I’d cut close to 10,000 words from a draft that had already gone through one round of extensive cutting. Still, the memory of that first, awful read-through was a vivid one, and to get the manuscript down to what I thought was a reasonable length, I had to resort to the opposite approach. Tomorrow, I’m going to describe how I cut the next few thousand words, with the help of a well-designed spreadsheet.
On Saturday, my wife and I went to the Siskel Center in Chicago to see the engaging new documentary Side by Side, which focuses on the recent shift toward digital filmmaking and its implications for movies as a whole. Despite some soporific narration by producer and interviewer Keanu Reeves—who is not a man who should ever be allowed to do voiceover—this is a smart, interesting film that treats us to a dazzling range of perspectives, many of them from artists I’ve discussed repeatedly on this blog: David Lynch, Christopher Nolan, David Fincher, George Lucas, Steven Soderbergh, Lars von Trier, and the indispensable Walter Murch, not to mention Martin Scorsese, James Cameron, Michael Ballhaus, Robert Rodriguez, the Wachowskis, and many more. And while the interviewees come down on various sides of the digital issue—Rodriguez is probably the most unapologetic defender, Nolan the greatest skeptic—there’s one clear message: digital filmmaking is here to stay, and movies will never be the same.
If there’s one thread that runs through the entire movie, it’s the tradeoffs that come when you trade an expensive, cumbersome, highly challenging medium for something considerably cheaper and easier. At first glance, the benefits are enormous: you can run the camera for as long as you like for next to nothing, allowing you to capture more material, and the relatively small size of digital cameras lets you bring them places and achieve effects that might have been impossible before. Digital photography allows for greater control over technical details like color correction; makes editing far less difficult, at least on a practical level; and offers access to advanced tools to filmmakers with limited budgets. Yet there are tradeoffs as well. Film is still capable of visual glories that digital can’t match, and it’s curious that a movie that features Nolan and his genius cinematographer Wally Pfister lacks a single mention of IMAX. (Despite the multiplicity of voices here, I would have loved to have heard from Brad Bird, who became famous working in an exclusively digital medium but still chose IMAX to film much of Mission: Impossible—Ghost Protocol.)
Still, as the movie demonstrates, resolution and image quality for digital video are advancing at an exponential rate, and within the next ten years or so, it’s possible that we won’t notice the difference between digital photography and even the highest-resolution images available on film. Even then, however, something vital threatens to be lost. As Greta Gerwig, of all people, points out, when there’s real film running through the camera, everyone on set takes the moment very seriously, an intensity that tends to be diminished when video is cheap. The end of constraints comes at the cost of a certain kind of serendipity: as Anne V. Coates, the editor of Lawrence of Arabia, reveals, the greatest cut in the history of movies was originally meant as a dissolve, but was discovered by accident in the editing room. And as both David Lynch and producer Lorenzo di Bonaventura note, the increased availability of digital filmmaking doesn’t necessarily mean that we’ll see a greater number of good movies. In fact, the opposite is more likely to be true, as digital technology lowers the barriers to entry for artists who may not be ready to release movies in the first place—the cinematic equivalent of Kindle publishing.
The answer, clearly, is that we need to continue to impose constraints even as we’re liberated by new technology. That sense of intensity that Gerwig mentions is something that directors can still create, but only if they consciously choose to do so. As I’ve argued before, with a nod to Walter Murch, it’s important to find analog moments in a digital world, by intentionally slowing down the process, using pen and paper, and embracing randomness and restriction whenever possible. Most of all, we need to find time to render, to acknowledge that even when digital technology cuts the production schedule in half, there’s still a necessary period in which works of art must be given time to ripen. David Lynch says he’s done with film, and he’s earned the right to make movies in any way he likes. But when I look at Inland Empire, I see an extraordinary movie that could have been far greater—and central to my own life—if, like Blue Velvet, it had been cut from three hours down to two. Digital technology makes it possible to avoid these hard choices. But that doesn’t mean we should.
Last year, in an interview with The 99 Percent, Francis Ford Coppola offered up a piece of advice that I’ve been turning over in my mind ever since. When asked for the most useful advice he’d give a student, Coppola said:
The first thing you do when you take a piece of paper is always put the date on it, the month, the day, and where it is. Because every idea that you put on paper is useful to you. By putting the date on it as a habit, when you look for what you wrote down in your notes, you will be desperate to know that it happened in April in 1972 and it was in Paris and already it begins to be useful. One of the most important tools that a filmmaker has are his/her notes.
This may seem like a small thing, but we should pay close attention, because here, for once, is a piece of creative advice that is practical, immediately applicable, and utterly important. The more time goes on, the more I come to agree with Coppola that an artist’s notes are crucial, and one of the first things any writer needs to figure out is a system for dealing with the countless pieces of paper that any novel automatically generates.
Simply put, this is a bookkeeping problem of enormous difficulty, and every writer will come up with his or her own solution. In my own case, a novel like The Icon Thief will usually end up producing something like two thousand separate pieces of paper—index cards, notebook pages, and various random scraps and jottings, all of it accumulated over the intense work of a year or more. And being able to keep track of this material is essential. It’s basically impossible for me to hold the shape of an entire novel in my head at once, so I’m completely dependent on my notes. If I don’t write something down, or if I lose it, it’s quite possible that I’ll forget it entirely. In the end, my pile of notes comes to seem like an extension of my brain—or an urgent means of communication, a la Memento, between my past and future selves—which means that all this paper needs to be treated with particular care, even as it continues to multiply.
As a result, I’ve had to implement a system for what one of my friends compares to inventory management—a means of keeping track, at least in a rough sense, of what’s there at any given time. My system has evolved a great deal over the past couple of years, but at the moment, here’s how it looks:
- A notebook for writing down big, fairly permanent pieces of information about an unwritten story—its premise, its major plot points, and any areas that require further study. This is especially important when you may not get to a novel for a long time. In my current notebook, for instance, I had a page devoted to notes for The Scythian well over a year before I started work on the novel itself.
- An informal card system, using the business cards I mentioned earlier, on which I jot down smaller plot points, beats for specific scenes, and other things as they occur to me. Such items can end up almost anywhere in the finished story, so it’s important that they be sortable. These start as a big stack in a designated corner of my desk, and end up in separate piles for each chapter, usually on the floor.
- Finally, a series of text files on my MacBook that contain slightly more systematic notes, especially detailed research on particular topics.
The result has a sort of jury-rigged feel to it, but it seems to work. And its somewhat ad hoc nature is a big part of its usefulness—because a novelist can’t be too organized. It would be a mistake, for instance, to do everything in text files: it might be easier and more convenient, but it would lose some of the serendipity that comes from seeing cards and notes thrown together at random, in which the juxtaposition of two otherwise unrelated ideas will sometimes lead to an insight. I’ve found that it’s also a good habit to take as many notes by hand as possible, as my hero Walter Murch—who has worked with Coppola on some of his most famous movies—does for his scene cards. And finally, if you possibly can, take Coppola’s advice and date each page. I don’t always do this, but I should, and so should you. Years from now, when you’re looking back with wonder at the pile of notes that somehow turned into a novel, you’ll be glad you did.
When you’re watching a film [as an editor], somebody has to deal with the film in complete ignorance of how it actually got made, because that is the way it is going to be seen. I try to keep myself as removed as possible, because I’m the ombudsman of the audience, looking out for their best interests. If you are the director and it took you an incredible amount of time and anguish to get a particular shot, you might invest that shot with more importance than it really has. It has to carry the burden of the effort that it took to get it. On the other hand, if I as the editor am not aware of that burden, I might look at the shot and think there’s nothing special about it. And occasionally I might be right. On the other hand, a shot that was grabbed just before lunch when everyone was having an argument: the director might dismiss it. Whereas I would say, “Ooh, in the right context, this shot could be magical.”
Done, as a certain social media company likes to remind us, is better than perfect. Not everything we do can be flawless in every respect, and in many cases, it’s better to take shortcuts where possible in order to focus on what really matters. Yesterday, I quoted the historian Arnold Hauser on the pragmatism of Shakespeare, who took certain creative approaches “only because they represented the most simple, convenient, and quickest solution of a difficulty to which the dramatist did not find it worth his while to devote any further trouble.” This Olympian ability to zero in on what counts, rather than becoming distracted by side issues, simply magnifies one of the qualities that we find in nearly all great popular writers: the willingness to use a shortcut, or even blatant sleight of hand, to get from one point in a story to another, and the understanding that such measures not only don’t detract from the quality of the work, but may even add to its richness and unpredictability.
One obvious example is the use of stock characters, which remain as useful today as they were in Shakespeare’s time. Stereotypes have no place among one’s leads, of course, but it’s hard to think of a complex work of art, from the Iliad to Downton Abbey, that doesn’t rely, to some extent, on stock types to people its world. For one thing, it saves time. Here’s Roger Ebert on The Godfather:
Although The Godfather is a long, minutely detailed movie of some three hours, there naturally isn’t time to go into the backgrounds and identities of [all the] characters…Coppola and producer Al Ruddy skirt this problem with understated typecasting. As the Irish cop, for example, they simply slide in Sterling Hayden and let the character go about his business.
Every writer sometimes finds it necessary, when pressed for time, to let a stock character go about his business without further introduction. And while most of us don’t have the chance to “simply slide in Sterling Hayden”—if only we could!—there’s no reason to feel guilty about using stock types in small parts. If anything, a fully rounded character in an insignificant part can be a flaw in the narrative: if we mention that the clerk at the hotel where the hero is staying is named Bill, for instance, this sets up an expectation in the reader’s mind that Bill will return in some significant way. If he doesn’t, it’s an unnecessary distraction, when an anonymous clerk would simply have been accepted as part of the fabric of the story.
The same rule holds for many of those clichés or conventions that can annoy attentive readers, as chronicled exhaustively on TV Tropes. We all have our own private list of the amusing ways in which fiction diverges from reality: the fact that the hero can always find a parking space, for instance, or inevitably has the correct change for a taxi. As William Goldman points out in Which Lie Did I Tell?, however, all these conventions have something in common: they’re all about speed. They save time. And they allow the story to get on with it, gliding past what isn’t important to focus on what is. And although such shortcuts may irritate nitpickers after the fact, if we’re caught up in the story, we don’t care. Once again, it’s about knowing what matters, as in editor Walter Murch’s famous Rule of Six, in which everything from continuity to visual logic is subordinated to the emotion of the moment. Because emotion is the one place where shortcuts can’t be taken.
And if the emotional aspects of a story are sound, the remaining shortcuts can create pockets of space for the reader’s imagination to explore. This is true of nearly all great works of art, which, on closer examination, resemble Citizen Kane, which evokes the vast spaces of Xanadu with a bare set, lighting, and a few simple props, and in the process creates a place that feels much more real to us than all the lavishly detailed sets of Cleopatra. In a similar way, Conan Doyle manifestly didn’t care about the continuity of small details in the Sherlock Holmes stories, like the location of Watson’s wound, which only allowed his readers to furnish the rest of the world on their own. A work of art in which every detail has been determined by the writer, if it sees the light of day at all, will often seem airless and uninviting. But if the writer takes the right kind of shortcuts, not only will he reach his own destination, but the reader will come along for the ride.
For most writers, working too hard is the least of their problems, but sometimes it’s necessary to slow down. In this respect, I’m a bigger offender than most. As regular readers will know, I’m a member of the cult of productivity: I believe that in order to write well, you need to write a lot, and I take pride in the fact that I can reliably crank out a few pages on demand. (Although not without the preliminary work of brainstorming, researching, and outlining, which effectively triples my writing time, without even counting revision.) Yet as I start the process of outlining The Scythian, I’m repeatedly reminded of the fact that it’s occasionally good to pause, look around, and see where you are. Because it’s in the moments between sessions of furious activity, when no visible work is being done, that some of our most important insights take place.
In the old days, writers found plenty of occasions to pause during the day, simply because their materials demanded it. You had quills to cut, inkwells to fill, or, later, typewriter ribbons to replace. (Not to mention figuring out how to reboot WordPerfect.) These tasks were tedious, but they also provided useful intervals of downtime. I never get tired of quoting these lines from Behind the Seen about the great film editor Walter Murch, who found moments of surprising introspection on an old-fashioned editing machine:
As Murch often points out, the simple act of having to rewind film on a flatbed editing machine gave him the chance to see footage in other contexts (high-speed, reverse) that could reveal a look, a gesture, or a completely forgotten shot. Likewise, the few moments he had to spend waiting for a reel to rewind injected a blank space into the process during which he could simply let his mind wander into subconscious areas.
These days, of course, with modern editing systems and word processing programs, such blank spaces have become harder to find. (Although it’s likely that later generations will look back with amazement on how we managed to get so much work done without the benefit of neural implants.) And while Word still crashes from time to time—in my case, for some reason, whenever I try to use the highlighting tool—that isn’t a substitute for more regular pauses.
In fact, I suspect that many of the brainstorming tools used by writers, including myself, are actually veiled ways of slowing down the creative process, which allows the two hemispheres of the brain to fall into line. Mind maps are a great example. I’ve found that mind maps drawn by hand are infinitely more useful than those made with a computer program, simply because they take longer to make. When I’m seated with a pad of cheap paper, letting my pen wander across the page, I have no choice but to slow down and let my thoughts wander at the same pace as the physical act of writing. As a result, when I’m reviewing the action of the scene I’m outlining, I find myself drilling deeper into individual moments, when I might have hurried past them if I were typing lines into a text box. The activity itself doesn’t really matter: the important thing is to ruminate for an hour or so at a fairly slow speed. Drawing a mind map conveniently gives my eye and hand something to do while my brain does the work.
Other writers will find their own ways of inserting a pause into the creative process. Often just the act of getting up from one’s desk, walking around the room, and doing a few chores—although nothing mentally taxing—will allow the brain to relax. I’ve spoken before of how shaving is the perfect activity for this sort of thing, and I’m not the only one. Here’s Laurence Sterne, author of Tristram Shandy, on dealing with writer’s block:
For if a pinch of snuff, or a stride or two across the room will not do the business for me—I take a razor at once; and having tried the edge of it upon the palm of my hand, without further ceremony, except that of first lathering my beard, I shave it off.
Woody Allen, as I’ve noted before, takes a shower or a walk in the park, and I’ll often get ideas while doing the dishes. Just about anything, in fact, can be used to insert a pause into one’s routine—except going online. Not every writer needs to go as far as Jonathan Franzen, who glued an Ethernet cable into his laptop and broke it off, but it’s worth remembering that nearly all the time you spend online could be more profitably used somewhere else, even if that means doing nothing at all. Which raises the question, of course, of why you’re even reading this post…but lucky for you, I’m done.
Authors have a reputation, sometimes richly deserved, of being stodgy and resistant to change, but nearly every writer I know is grateful for modern technology. Writing a novel is simply less of a pain, on a physical level, than ever before. Yet slowness, time, and silence are still crucial components of the creative process, and with the acceleration of all forms of communication, it’s sometimes necessary to deliberately slow things down. I’ve spoken before of my suspicion that it takes about a year of sitting in a chair to produce any novel, no matter how fast your computer lets you type, so you tend to make up the difference with many small revisions, which are less important in themselves than in the time they grant you to mull over the larger work. Similarly, it’s important for most artists to insert pockets of slowness into their daily routine. Today, I want to focus on how this applies to three crucial areas: how we move, how we write, and how we read.
One of the best ways to slow things down is to walk, rather than drive, whenever possible. In Charles Koppelman’s Behind the Seen, Walter Murch notes that during the editing of Cold Mountain, he was grateful for the chance to walk to work every day, which gave him an extra half hour to think. For my own part, after managing to walk or take public transit everywhere for years, my recent move to the suburbs means that I’m now driving on a regular basis, and I can already feel the loss. Driving, especially in the city, just isn’t a good time for contemplation. Taking the train is better, and walking is best of all. I’ve stated elsewhere that I’ve rarely encountered a plot problem that couldn’t be solved by a walk to the grocery store—which isn’t the case when I drive there. And I can’t imagine a writer whose work habits wouldn’t be improved by a short daily walk. (But please leave the headphones at home.)
It’s also useful to honor the simple act of taking pen to paper. Once again, Murch points the way: as an editor, he’s made use of all kinds of technological innovations, but he still begins each project by spending two days preparing handwritten scene cards, cutting the card stock into “odd little shapes” and coding elements of the movie with different colors. It’s a cataloging tool, but there’s also something meditative about doing this work by hand. Similarly, while I sometimes use a text file to organize my initial thoughts about a project, ultimately, I almost always turn to physical cards. And while I don’t think I’ll ever handwrite an entire novel, I find ways of incorporating pen and paper into the process whenever possible—in notebooks, in mind maps, and in the hundreds of small scraps I use to jot down ideas. I could use software for all of this, and some writers do, but it just wouldn’t be the same.
Finally, perhaps the most useful habit of all is to persist in reading real books. It comes down to the issue, which I mentioned yesterday, of technology giving you what you want, but not necessarily what you need. A website or electronic book can take you directly to the right page or allow you to search the text instantly, but it’s often in the act of flipping through a physical book, or wandering through a library, that you find your next big idea. (There’s also something about running a photocopier, I find, that allows interesting thoughts to creep in.) A few weeks ago, on eBay, I bought a trove of back issues of Discover magazine, which I often use for story ideas, despite the fact that all of the articles are available online. Why? Because while the web is great for research, it isn’t the best place for dreaming. And as the pace of digital innovation grows ever more rapid, it’s important to slow things down when possible—because dreaming, in the end, is an analog activity.
One of the most frustrating and challenging moments in any writer’s life is when you know where you are and where you want to go, but have no idea how to get there. In fact, there are times when I feel like one of the underpants gnomes in the celebrated episode of South Park. You’re writing a story, and you have some good ideas for the beginning and the end, but the part in the middle is a mystery. This unknown element can be as small as the distance between two minor plot points or as large as the entire second act, but in all cases, the essential problem is the same. All you need is something to get from point A to point C, and, ideally, it should be brilliant.
This situation is a familiar one for writers of mystery and suspense fiction. A good mystery novel should come off as a perfect puzzle, in which every element was carefully premeditated and laid in beforehand, but in practice, large gaps are often left by the author to be filled in later. In Writing the Novel, Lawrence Block relates that while writing the first installment in his popular series of Bernie Rhodenbarr mysteries, he got within two or three chapters of the ending before finally figuring out who the villain was, thanks to a chance remark by a friend. “I had to do some rewriting to tie off all the loose ends,” Block notes, “but the book worked out fine.” I have a feeling that most mystery novelists could tell similar stories. And as long as the result looks preordained, it’s perfectly okay.
I’ve encountered similar issues all the time in my own writing, even though I outline like crazy. With The Icon Thief, I knew from early on in the process where the story would end: in the Philadelphia Museum of Art, with my main character standing before the closed door leading into Étant Donnés. How to get her there, however, remained a problem for a long time, and it wasn’t until I had written more than a third of the novel that I managed to come up with a solution. Similarly, in House of Passages, there’s a moment when I knew that a character had to make a series of brilliant deductions to advance to the next stage of the plot. But what? I could see the blank space where they would go, but not the deductions themselves, and like Block, I ended up going back and laying in most of my clues after the fact.
And yet this is one of the great pleasures of writing. I’ve previously quoted Walter Murch on the fact that you don’t want to answer all of the questions posed by a work of art at its earliest stages. In fact, you should hope that serious questions remain unanswered until the very end. In any artistic pursuit, once you’ve reached a certain level of competence, there’s always the risk that you’ll become bored or complacent. The best way to avoid this is by deliberately leaving problems for yourself to solve, trusting that luck, intuition and skill will carry you through. Almost invariably, they do—or at least well enough so that, with the proper adjustments, nobody will ever notice the seams. In the process, you’ll grow as a writer. And maybe, in the end, you’ll even profit.
Now that I’ve reached the home stretch on the sequel to The Icon Thief, I’ll need to turn my attention shortly to the next stage in the process: cutting the manuscript. I’m contracted to deliver a novel in the neighborhood of 100,000 words, which means that at the moment, my first draft is at least twenty percent too long. This is mostly on purpose—there’s nothing wrong with having some extra material at the beginning, as long as you’re planning to fulfill Stephen King’s dictum—but in practice, getting a draft down to that desired length can be a bit of a challenge. With that in mind, I thought I’d pull together some of my favorite maxims on cutting, more for my own reference than anything else:
1. Burn the first reel. This is one of David Mamet’s favorite principles, but it goes back at least as far as Frank Capra’s memoir The Name Above the Title, in which he recounts how he saved Lost Horizon by burning the first two reels. (Capra wasn’t kidding, either. He writes: “I ran up to the cutting rooms, took those blasted first two reels in my hot little hands, ran to the ever-burning big black incinerator—and threw them into the fire.”) Whatever the source, the advice remains sound: in a first draft, writers and directors tend to spend a lot of time easing into the story, when audiences benefit most from being thrown right into the action. The moral? Cut exposition and open with your most dramatic scene.
2. Jump from middle to middle. This takes the previous maxim, which governs the structure of the story as a whole, and applies it to the level of individual scenes or chapters. Early on, writers often take their time building to the heart of a scene, then backing out again, which tends to kill the momentum. Instead of a neat beginning, middle, and end for each chapter, just write the middle. And as I’ve said before, if a sequence of episodes is dragging, try cutting the first and last paragraphs of each scene. In terms of its immediate, often startling effectiveness, this may be the single most useful writing trick I know. (For extra credit, check out Robert Parrish’s wonderful account, courtesy of Walter Murch, of how a similar trick was used to save the original film version of All the King’s Men.)
3. When in doubt, cut it out. If you don’t think you need a chapter, a scene, or a line, you’re almost certainly right. For The Icon Thief, I had to cut like a maniac—the original draft was over 180,000 words long, and when you factor in incidental material and subsequent chapters that were written and discarded, I cut close to an entire page for every one I kept. When I look back at it now, though, I can’t remember any of the cuts I made. A cut may seem painful at the time, but it’s surprising how quickly nonessential material disappears down the memory hole. If, months later, you find that you remember and miss it, it may be necessary to restore the missing paragraphs, but this almost never happens. And it’s far more likely, when you finally see your work in print, that you’ll regret the cuts you should have made.
A few months ago, after greatly enjoying The Conversations, Michael Ondaatje’s delightful book-length interview with Walter Murch, I decided to read Ondaatje’s The English Patient for the first time. I went through it very slowly, only a handful of pages each day, in parallel with my own work on the sequel to The Icon Thief. Upon finishing it last week, I was deeply impressed, not just by the writing, which had drawn me to the book in the first place, but also by the novel’s structural ingenuity—derived, Ondaatje says, from a long process of rewriting and revision—and the richness of its research. This is one of the few novels where detailed historical background has been integrated seamlessly into the poetry of the story itself, and it reflects a real, uniquely novelistic curiosity about other times and places. It’s a great book.
Reading The English Patient also made me want to check out the movie, which I hadn’t seen in more than a decade, when I watched it as part of a special screening for a college course. I recalled admiring it, although in a rather detached way, and found that I didn’t remember much about the story, aside from a few moments and images (and the phrase “suprasternal notch”). But I sensed it would be worth revisiting, both because I’d just finished the book and because I’ve become deeply interested, over the past few years, in the career of editor Walter Murch. Murch is one of film’s last true polymaths, an enormously intelligent man who just happened to settle into editing and sound design, and The English Patient, for which he won two Oscars (including the first ever awarded for a digitally edited movie), is a landmark in his career. It was with a great deal of interest, then, that I watched the film again last night.
First, the good news. The adaptation, by director Anthony Minghella, is very intelligently done. It was probably impossible to film Ondaatje’s full story, with its impressionistic collage of lives and memories, in any kind of commercially viable way, so the decision was wisely made to focus on the central romantic episode, the doomed love affair between Almásy (Ralph Fiennes) and Katharine Clifton (Kristin Scott Thomas). Doing so involved inventing a lot of new, explicitly cinematic material, some satisfying (the car crash and sandstorm in the desert), some less so (Almásy’s melodramatic escape from the prison train). The film also makes the stakes more personal: the mission of Caravaggio (Willem Dafoe) is less about simple fact-finding, as it was in the book, than about revenge. And the new ending, with Almásy silently asking Hana (Juliette Binoche) to end his life, gives the film a sense of resolution that the book deliberately lacks.
These changes, while extensive, are smartly done, and they respect the book while acknowledging its limitations as source material. As Roger Ebert points out in his review of Apocalypse Now, another milestone in Murch’s career, movies aren’t very good at conveying abstract ideas, but they’re great for showing us “the look of a battle, the expression on a face, the mood of a country.” On this level, The English Patient sustains comparison with the works of David Lean, with a greater interest in women, and remains, as David Thomson says, “one of the most deeply textured of films.” Murch’s work, in particular, is astonishing, and the level of craft on display here is very impressive.
Yet the pieces don’t quite come together. The novel’s tentative, intellectual nature, which the adaptation doesn’t try to match, infects the movie as well. It feels like an art film that has willed itself into being an epic romance, when in fact the great epic romances need to be a little vulgar—just look at Gone With the Wind. Doomed romances may obsess their participants in real life, but in fiction, seen from the outside, they can seem silly or absurd. The English Patient understands a great deal about the craft of the romantic epic, the genre in which it has chosen to plant itself, but nothing of its absurdity. In the end, it’s just too intelligent, too beautifully made, to move us on more than an abstract level. It’s a heroic effort; I just wish it were something a little more, or a lot less.
On Saturday, my wife and I finally saw Source Code, the new science fiction thriller directed by Moon’s Duncan Jones. I liked Moon a lot, but wasn’t sure what to expect from his latest film, and was pleasantly surprised when it turned out to be the best new movie I’ve seen this year. Admittedly, this is rather faint praise—by any measure, this has been a slow three months for moviegoers. And Source Code has its share of problems. It unfolds almost perfectly for more than an hour, then gets mired in an ending that tries, not entirely successfully, to be emotionally resonant and tie up all its loose ends, testing the audience’s patience at the worst possible time. Still, I really enjoyed it. The story draws you in viscerally and is logically consistent, at least up to a point, and amounts to a rare example of real science fiction in a mainstream Hollywood movie.
By “real” science fiction, of course, I don’t mean that the science is plausible. The science in Source Code is cheerfully absurd, explained with a bit of handwaving about quantum mechanics and parabolic calculus, but the movie is unusual in having the courage to follow a tantalizing premise—what if you could repeatedly inhabit the mind of a dead man eight minutes before he died?—through most of its possible variations. This is what the best science fiction does: it starts with an outlandish idea and follows it relentlessly through all its implications, while never violating the rules that the story has established. And one of the subtlest pleasures of Ben Ripley’s screenplay for Source Code lies in its gradual reveal of what the rules actually are. (If anything, I wish I’d known less about the story before entering the theater.)
This may sound like a modest accomplishment, but it’s actually extraordinarily rare. Most of what we call science fiction in film is thinly veiled fantasy with a technological sheen. A movie like Avatar could be set almost anywhere—the futuristic trappings are incidental to a story that could have been lifted from any western or war movie. (Walter Murch even suggests that George Lucas based the plot of Star Wars on the work he did developing Apocalypse Now.) Star Trek was often a show about ideas, but its big-screen incarnation is much more about action and spectacle: Wrath of Khan, which I think is the best science fiction film ever made, has been aptly described as Horatio Hornblower in space. And many of the greatest sci-fi movies—Children of Men, Blade Runner, Brazil—are more about creating the look and feel of a speculative future than any sense of how it might actually work.
And this is exactly how it should be. Movies, after all, aren’t especially good at conveying ideas; a short story, or even an episode of a television show, is a much better vehicle for working out a clever premise than a feature film. Because movies are primarily about action, character, and image, it isn’t surprising that Hollywood has appropriated certain elements of science fiction and left the rest behind. What’s heartening about Source Code, especially so soon after the breakthrough of Inception, is how it harnesses its fairly ingenious premise to a story that works as pure entertainment. There’s something deeply satisfying about seeing the high and low aspects of the genre joined so seamlessly, and it requires a peculiar set of skills on the part of the director, who needs to be both fluent with action and committed to ideas. Chris Nolan is one; Duncan Jones, I’m excited to say, looks very much like another.
Last night, my wife and I watched the great documentary Hearts of Darkness: A Filmmaker’s Apocalypse, which will hopefully bring my resurgent fascination with Apocalypse Now to a close, at least for the moment. (Which is something my wife is probably glad to hear.) And yet I’m still not quite sure why this movie, so extraordinary and yet so flawed, seized my imagination so forcefully again, when it had been at least ten years since I saw it in any form. Part of it, obviously, was learning about Walter Murch’s fascinating editing process in the book The Conversations, but I think it’s also because this movie represents an audacity and willingness to take risks that has largely passed out of fashion, and which I’m trying to recover in my own work, albeit on a much more modest scale.
For those of us who were too young, or unborn, to remember when this movie came out, here’s the short version. Francis Coppola, coming off the great success of the two Godfather movies, decides to make Apocalypse Now, from a script by John Milius, as the first movie by his nascent Zoetrope Studios, even though he isn’t sure about the ending. Instead of the small, guerrilla-style movie that other potential directors, including George Lucas, had envisioned, Coppola elects to make a big, commercial war movie “in the tradition of Irwin Allen,” as he says in Hearts of Darkness. He pays the most important actor in the world, Marlon Brando, three million dollars for three weeks of filming. The entire Philippine air force is placed at his disposal. He goes off into the jungle, along with his entire family and a huge production team—and then what?
Well, he goes deeper. He throws out the original ending, fires his lead actor (Harvey Keitel, who was replaced with Martin Sheen after filming had already begun), and puts millions of dollars of his own money on the line. When Brando arrives, hugely overweight and unable to perform the role as written, the rest of the production is put on hold as they indulge in days of filmed improvisations, searching for a way out of their narrative bind. Coppola is convinced that the movie will be a failure, yet seems to bet everything on the hope that his own audacity will carry him through. And it works. The movie opens years behind schedule and grossly over budget, but it’s a huge hit. It wins many awards and is named one of the greatest movies of all time. Coppola survives. (It isn’t until a couple of years later, with One From the Heart, that he meets his real downfall, not in the jungle but in his own backyard.)
This is an astonishing story, and one that is unlikely ever to repeat itself. (Only Michael Bay gets that kind of money these days.) And yet, for all its excesses, the story has universal resonance. Coppola is the quintessential director, even more than Welles. His life reads like the perfect summation of the New Hollywood: he began in cheap quickies for the Roger Corman factory, became an Academy Award-winning screenwriter, created two of the greatest and most popular movies in history, became rich enough almost to be a studio in himself, gambled it all, won, gambled it all again, lost, spent a decade or more in the wilderness, and now presides over a vineyard, his own personal film projects, and the most extraordinary family in American movies. (Any family that includes Sofia Coppola, Jason Schwartzman, and Nicolas Cage is in a class by itself.)
So what are the lessons here? Looking at Coppola, I’m reminded of what Goethe said about Napoleon: “The story of Napoleon produces on me an impression like that produced by the Revelation of Saint John the Divine. We all feel there must be something more in it, but we do not know what.” And that’s how I feel about St. Francis of the Troubles, as David Thomson so aptly calls him. No director—not Lucas, not Spielberg, not Scorsese—has risked or accomplished more. If Zoetrope had survived in the form for which it had been intended, the history of movies might have been different. Instead, it’s a mirage, a dream, like Kane’s Xanadu. All that remains is Coppola’s voice, so intimate in his commentary tracks, warm, conversational, and charged with regret, inviting us to imagine what might have been.
Since yesterday’s posting on The Shining and Apocalypse Now, I’ve been thinking a lot about Stanley Kubrick and Francis Ford Coppola, who arguably had the two greatest careers in the past half century of American film. There have been other great directors, of course, but what sets Kubrick and Coppola apart is a matter of scale: each had a golden age—for Coppola, less than a decade, while for Kubrick, it lasted more than thirty years—when they were given massive budgets, studio resources, and creative control to make intensely, almost obsessively personal movies. The results are among the pillars of world cinema: aside from the two movies mentioned above, this period gave us the Godfather films, 2001: A Space Odyssey, A Clockwork Orange, and more.
And yet these two men are also very different, both in craft and temperament. I’ve been listening to Coppola’s commentary tracks for the better part of a week now, and it’s hard to imagine a warmer, more inviting, almost grandfatherly presence—but even the most superficial look at his career reveals a streak of all but suicidal darkness. As David Thomson puts it:
[Coppola] tries to be everything for everyone; yet that furious effort may mask some inner emptiness. For he is very gregarious and very withdrawn, the life and soul of some parties, and a depressive. He is Sonny and Michael Corleone, for sure, but there are traces of Fredo, too—and he is at his best when secretly telling a part of his own story, or working out his fearful fantasies.
Kubrick, in some respects, is the opposite: a superficially cold and clinical director, deeply pessimistic about the human condition, who nonetheless was able to work happily and with almost complete creative freedom for the better part of his career. His films are often dark, but there’s also an abiding sense of a director tickled by the chance to play with such wonderful toys—whether the spaceships of 2001 or the fantastically detailed dream set of New York in Eyes Wide Shut. Coppola, by contrast, never seems entirely content unless the film stock is watered with his own blood.
These differences are also reflected in their approaches to filmmaking. Coppola and Kubrick have made some of the most visually ravishing movies of all time, but the similarities end there. Kubrick was controlling and precise—one assumes that every moment has been worked out in advance in script and storyboard—while Coppola seemed willing to follow the inner life of the movie wherever it led, whether through actors, the input of valued collaborators like Walter Murch, or the insane workings of chance or fate. This allowed him to make astonishing discoveries on set or in the editing room, but it also led to ridiculous situations like the ending of Apocalypse Now, where he paid Marlon Brando three million dollars to spend three weeks in the Philippines, but didn’t know what would happen when Brando arrived. (And as the last scenes of the movie imply, he never did entirely figure it out.)
So what do these men have to tell us? Kubrick’s career is arguably greater: while you can debate the merits of the individual movies, there’s no doubt that he continued to make major films over the course of four decades. Coppola, alas, had eight miraculous years where he changed film forever, and everything since has been one long, frustrating, sometimes enchanting footnote (even if, like me, you love his Dracula and One From the Heart). It’s possible that Coppola, who spent such a long time in bankruptcy after his delirious dreams had passed, wishes he’d been more like Kubrick the clinician. And yet Coppola is the one who seems to have the most lessons for the rest of us. He’s the model of all true artists and directors: technically astounding, deeply humane, driven to find something personal in the most unlikely subjects, visionary, loyal, sometimes crazy, and finally, it seems, content. We’re all Coppola’s children. Kubrick, for all his genius, is nothing but Kubrick.
Today’s quote of the day comes from a fascinating interview with the poet Gary Snyder, which I came across yesterday after seeing it mentioned in Robert and Michèle Root-Bernstein’s stimulating book Sparks of Genius. The part of the interview that caught my eye goes as follows:
Say you wanted to be a poet, and you saw a man that you recognized as a master mechanic or a great cook. You would do better, for yourself as a poet, to study under that man than to study under another poet who was not a master, that you didn’t recognize as a master.
Snyder goes on to give a specific example:
I use the term master mechanic because I know a master mechanic, Rod Coburn. Whenever I spend any time with him, I learn something from him…About everything. But I see it in terms of my craft as a poet. I learn about my craft as a poet. I learn about what it really takes to be a craftsman, what it really means to be committed, what it really means to work.
Which struck me for a number of reasons. As a writer, I’ve always been conscious of the fact that much of what I’ve learned about the creative process comes from the work of nonliterary artists. Regular readers of this blog know how much I’ve learned about writing and editing from David Mamet and Walter Murch. My approach to my own work owes as much to The Mystery of Picasso or the video games of Shigeru Miyamoto as to John Gardner’s Art of Fiction. More recently, Stephen Sondheim’s Finishing the Hat, with its detailed descriptions of the lyricist’s craft, has been an endless source of instruction and encouragement.
The point of all this, I think, is that it’s easy to get caught up in the conventions of the craft—whether it’s fiction, poetry, art, or something else entirely—that you know best. Studying other forms of art is one way, and perhaps the best, of knocking yourself out of your usual assumptions. And I don’t think I’m alone in this. I recently came across an interview with cartoonist Daniel Clowes in which he explained how his work in film (including Ghost World and Art School Confidential) has influenced the way he plans his comics:
To me, the most useful experience in working in “the film industry” has been watching and learning the editing process. You can write whatever you want and try to film whatever you want, but the whole thing really happens in that editing room. How do you edit comics? If you do them in a certain way, the standard way, it’s basically impossible. That’s what led me to this approach of breaking my stories into segments that all have a beginning and end on one, two, three pages. This makes it much easier to shift things around, to rearrange parts of the story sequence.
And the best way to put lessons from other media to work, as Snyder points out, is to study the masters. This week, if time permits, I’m going to be talking about a handful of artists in other media—music, comics, film, and television—that have influenced the way I approach my own writing.