Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

The act of cutting

In a recent article in The New Yorker on Ernest Hemingway, Adam Gopnik evocatively writes: “The heart of his style was not abbreviation but amputation; not simplicity but mystery.” He explains:

Again and again, he creates his effects by striking out what would seem to be essential material. In “Big Two-Hearted River,” Nick’s complicated European experience—or the way that fishing is sanity-preserving for Nick, the damaged veteran—is conveyed clearly in the first version, and left apparent only as implication in the published second version. In a draft of the heartbreaking early story “Hills Like White Elephants,” about a man talking his girlfriend into having an abortion, Hemingway twice uses the words “three of us.” This is the woman’s essential desire, to become three rather than two. But Hemingway strikes both instances from the finished story, so the key image remains as ghostly subtext within the sentences. We feel the missing “three,” but we don’t read it.

Gopnik concludes: “The art comes from scissoring out his natural garrulousness, and the mystery is made by what was elided. Reading through draft and then finished story, one is repeatedly stunned by the meticulous rightness of his elisions.” Following Hemingway’s own lead, Gopnik compares his practice to that of Cézanne, but it’s also reminiscent of Shakespeare, who frequently omits key information from his source material while leaving the other elements intact. Ambiguity, as I’ve noted here before, emerges from a network of specifics with one crucial piece removed.

Over the last two weeks, I’ve been ruthlessly cutting the first draft of my book, which has left me highly conscious of the effects that can come out of compression. In his fascinating notebooks, which I quoted here yesterday, Samuel Butler writes: “I have always found compressing, cutting out, and tersifying a passage suggests more than anything else does. Things pruned off in this way are like the heads of a hydra, two grow for every one that is lopped off.” This squares with my experience, and it reflects how much of good writing depends on juxtaposition. By cutting, you bring the remaining pieces closer together, which allows them to resonate. Butler then makes a very interesting point:

If a writer will go on the principle of stopping everywhere and anywhere to put down his notes, as the true painter will stop anywhere and everywhere to sketch, he will be able to cut down his works liberally. He will become prodigal not of writing—any fool can be this—but of omission. You become brief because you have more things to say than time to say them in. One of the chief arts is that of knowing what to neglect and the more talk increases the more necessary does this art become.

I love this passage because it reveals how two of my favorite activities—taking notes and cutting—are secretly the same thing. On some level, writing is about keeping the good stuff and removing as much of the rest as possible. The best ideas are likely to occur spontaneously when you’re doing something unrelated, which is why you need to write them down as soon as they come to you. When you’re sitting at your desk, you have little choice but to write mechanically in hopes that something good will happen. And in the act of cutting, the two converge.

Cutting can be a creative act in itself, which is why you sometimes need to force yourself to do it, even when you’d rather not. You occasionally see a distinction drawn between the additive and subtractive arts, but most works partake of both at different stages, each of which confers benefits of its own. In Behind the Seen, Charles Koppelman says of editing a movie in postproduction:

The orientation over the last six months has been one of accumulation, a building-up of material. Now the engines are suddenly thrown into full reverse. The enterprise will head in the opposite direction, shedding material as expeditiously as possible.

We shouldn’t disregard how challenging that mental switch can be. It’s why an editor like Walter Murch rarely visits the set, which allows him to maintain a kind of Apollonian detachment from the Dionysian process of filmmaking: he doesn’t want to be dissuaded from the need to cut a scene by the knowledge of how hard it was to make it. Writers and other artists working alone don’t have that luxury, and it can be difficult to work yourself up to the point where you’re ready to cut a section that took a long time to write. Time creates its own sort of psychological distance, which is why you’re often advised to put aside the draft for a few weeks, or even longer, before starting to revise it. (Zadie Smith writes deflatingly: “A year or more is ideal—but even three months will do.”) That isn’t always possible, and sometimes the best compromise is to work briefly on another project, like a short story. A change is as good as a rest, and in this case, you’re trying to transform into your future self as soon as possible, which will allow you to perform clinical surgery on the past.

The result is a lot like the old joke: you start with a block of marble, and you cut away everything that doesn’t look like an elephant. When I began to trim my manuscript, I set myself the slightly arbitrary goal of reducing it, at this stage, by thirty percent, guided by the editing rule that I mentioned here a month ago:

Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat.

There’s no particular reason why the same percentage should hold for a book as well as a film, but I’ve found that it’s about right. Really, though, it could have been just about any number, as long as it gave me a clear numerical goal at which to aim, and as long as it hurt a little. It’s sort of like physical exercise. If you want to lose weight, the best way is to eat less, and if you want to write a short book, ideally, you’d avoid writing too much in the first place. But the act of cutting, like exercise, has rewards of its own. As Elie Wiesel famously said: “There is a difference between a book of two hundred pages from the very beginning, and a book of two hundred pages which is the result of an original eight hundred pages. The six hundred pages are there. Only you don’t see them.” And the best indication that you’re on the right track is when it becomes physically painful. As Hemingway writes in A Farewell to Arms: “The world breaks everyone and afterward many are strong at the broken places.” That’s also true of books.

Written by nevalalee

June 29, 2017 at 8:38 am

You are here

Adam Driver in Star Wars: The Force Awakens

Remember when you were watching Star Wars: The Force Awakens and Adam Driver took off his mask, and you thought you were looking at some kind of advanced alien? You don’t? That’s strange, because it says you did, right here in Anthony Lane’s review in The New Yorker:

So well is Driver cast against type here that evil may turn out to be his type, and so extraordinary are his features, long and quiveringly gaunt, that even when he removes his headpiece you still believe that you’re gazing at some form of advanced alien.

I’m picking on Lane a little here, because the use of the second person is so common in movie reviews and other types of criticism—including this blog—that we hardly notice it, any more than we notice the “we” in this very sentence. Film criticism, like any form of writing, evolves its own language, and that insinuating “you,” as if your impressions had melded seamlessly with the critic’s, is one of its favorite conventions. (Manohla Dargis, for instance, writes in her New York Times review of the same film: “It also has appealingly imperfect men and women whose blunders and victories, decency and goofiness remind you that a pop mythology like Star Wars needs more than old gods to sustain it.”) But who is this “you,” exactly? And why has it started to irk me so much?

The second person has been used by critics for a long time, but in its current form, it almost certainly goes back to Pauline Kael, who employed it in the service of images or insights that could have occurred to no other brain on the planet, as when she wrote of Madeline Kahn in Young Frankenstein: “When you look at her, you see a water bed at just the right temperature.” This tic of Kael’s has been noted and derided for almost four decades, going back to Renata Adler’s memorable takedown in the early eighties, in which she called it “the intrusive ‘you’” and noted shrewdly: “But ‘you’ is most often Ms. Kael’s ‘I,’ or a member or prospective member of her ‘we.’” Adam Gopnik later said: “It wasn’t her making all those judgments. It was the Pop Audience there beside her.” And “the second-person address” clearly bugged Louis Menand, too, although his dislike of it was somewhat undermined by the fact that he internalized it so completely:

James Agee, in his brief service as movie critic of The Nation, reviewed many nondescript and now long-forgotten pictures; but as soon as you finish reading one of his pieces, you want to read it again, just to see how he did it…You know what you think about Bonnie and Clyde by now, though, and so [Kael’s] insights have lost their freshness. On the other hand, she is a large part of the reason you think as you do.

Pauline Kael

Kael’s style was so influential—I hear echoes of it in almost everything I write—that it’s no surprise that her intrusive “you” has been unconsciously absorbed by the generations of film critics that followed. If it bothers you as it does me, you can quietly replace it throughout with “I” without losing much in the way of meaning. But that’s part of the problem. The “you” of film criticism conceals a neurotic distrust of the first person that prevents critics from honoring their opinions as their own. Kael said that she used “you” because she didn’t like “one,” which is fair enough, but there’s also nothing wrong with “I,” which she wasn’t shy about using elsewhere. To a large extent, Kael was forging her own language, and I’m willing to forgive that “you,” along with so much else, because of the oceanic force of the sensibility to which it was attached. But separating the second person from Kael’s unique voice and turning it into a crutch to be indiscriminately employed by critics everywhere yields a more troubling result. It becomes a tactic that distances the writer slightly from his or her own judgments, creating an impression of objectivity and paradoxical intimacy that has no business in a serious review. Frame these observations in “I,” and the critic would feel more of an obligation to own them and make sense of them; stick them in a convenient “you,” and each becomes just one more insight to be tossed off, as if the critic had merely watched it unfold in your brain and were recording it without comment.

Obviously, there’s nothing wrong with wanting to avoid the first person in certain kinds of writing. It rarely has a place in serious reportage, for instance, despite the efforts of countless aspiring gonzo journalists who try to do what Norman Mailer, Hunter S. Thompson, and only a handful of others have ever done well. (It can even plague otherwise gifted writers: I was looking forward to Ben Lerner’s recent New Yorker piece about art conservation, but I couldn’t get past his insistent use of the first person.) But that “I” absolutely belongs in criticism, which is fundamentally a record of a specific viewer, listener, or reader’s impressions of his or her encounter with a piece of art. All great critics, whether they use that “you” or not, are aware of this, and it can be painful to read a review by an inexperienced writer that labors hard to seem “objective.” But if our best critics so often fall into the “you” trap, it’s a sign that even they aren’t entirely comfortable with giving us all of themselves, and I’ve started to see it as a tiny betrayal—meaningful or not—of what ought to be the critic’s intensely personal engagement with the work. And if it’s only a tic or a trick, then we sacrifice nothing by losing it. Replace that “you” with “I” throughout, making whatever other adjustments seem necessary, and the result is heightened and clarified, with a much better sense of who was really sitting there in the dark, feeling emotions that no other human being would ever feel in quite the same way.
