Posts Tagged ‘The Elements of Style’
The allure of unknowing
Although there is no substitute for merit in writing, clarity comes closest to being one. Even to a writer who is being intentionally obscure or wild of tongue we can say, “Be obscure clearly! Be wild of tongue in a way we can understand!”
—E.B. White, The Elements of Style
Last night, while watching the new X-Files episode “Ghouli,” which I actually sort of liked, I found myself pondering the ageless question of why this series is so uneven. It isn’t as if I haven’t wondered about this before. Even during the show’s golden years, which I’d roughly describe as its first five seasons, it was hard not to be struck by how often a classic installment was followed a week later by one that was insultingly bad. (This might explain the otherwise inexplicable reference in last week’s “The Lost Art of Forehead Sweat” to “Teso dos Bichos,” a terrible episode memorable mostly for interrupting the strongest run that the series ever had. As Reggie says: “Guys, if this turns out to be killer cats, I’m going to be very disappointed.”) Part of this may be due to the fact that I’ve watched so many episodes of this show, which had me tuning in every week for years, but I don’t think that it’s just my imagination. Most series operate within a fairly narrow range of quality, with occasional outliers in both directions, but the worst episodes of The X-Files are bad in ways that don’t apply to your average procedural. They aren’t simply boring or routine, but confusing, filled with illogical behavior by the main characters, ugly, and incoherent. There are also wild swings within individual episodes, like “Ghouli” itself, which goes so quickly from decent to awful to inexplicable to weirdly satisfying that it made me tired to watch it. And while last season proved that there are worse things than mere unevenness—with one big exception, it consisted of nothing but low points—I think it’s still worth asking why this series in particular has always seemed intent on punishing its fans with its sheer inconsistency.
One possible explanation is that The X-Files, despite its two regular leads, was basically an anthology show, which meant that every episode had to start from scratch in establishing a setting, a supporting cast, and even a basic tone. This ability to change the rules from one week to the next was a big part of what made the show exciting, but it also deprived it of the standard safety net—a narrative home base, a few familiar faces in the background—on which most shows unthinkingly rely. It’s a testament to the clarity and flexibility of Chris Carter’s original premise that it ever worked at all, usually thanks to a line or two from Scully, leafing through a folder in the passenger seat of a rental car, to explain why they were driving to a small town in the middle of nowhere. (In fact, this stock opening became less common as the show went on, and it never really found a way to improve on it.) It was also a science fiction and fantasy series, which meant that even the rules of reality tended to change from one installment to another. As a result, much of the first act of every episode was spent orienting the audience, at the cost of valuable screen time that could have been put to other narrative ends. Watching it reminds us of how much other shows can take for granted. In Bambi vs. Godzilla, David Mamet writes: “When you walk into a bar and see a drama on the television, you’ve missed the exposition. Do you have any trouble whatever understanding what’s going on?” That’s true of most dramas, but not necessarily of The X-Files, in which you could sit through an episode from the beginning and still be lost halfway through. You could make a case that this disorientation was part of its appeal, but it wasn’t a feature. It was a bug.
And the most damning criticism that you can advance against The X-Files is that its narrative sins were routinely overlooked or forgiven by its creators because it was supposedly “about” confusion and paranoia. Early on, the myth arose that this was a series that deliberately left its stories unresolved, in contrast to the tidy conclusions of most procedurals. As the critic Rob Tannenbaum wrote in Details back in the late nineties:
What defines The X-Files is the allure of unknowing: Instead of declaring a mystery and solving it by the end of the show, as Columbo and Father Dowling did, Carter has spent five years showing us everything except the truth. He is a high-concept tease who understands an essential psychological dynamic: The less you give, the more people want. Watching The X-Files is almost an interactive venture. It’s incomplete enough to compel viewers to complete the blank parts of the narrative.
This might be true enough of many of the conspiracy episodes, but in the best casefiles, and most of the mediocre ones, there’s really no doubt about what happened. Mulder and Scully might not end up with all of the information, but the viewers usually do, and an episode like “Pusher” or “Ice” is an elegant puzzle without any missing pieces. (Even “Jose Chung’s From Outer Space,” which is explicitly about the failure of definitive explanations, offers a reading of itself that more or less makes sense.) Unfortunately, the blank spaces in the show’s mytharc were also used to excuse errors of clarity and resolution, which in turn encouraged the show to remain messy and unsatisfying for no good reason.
In other words, The X-Files began every episode at an inherent disadvantage, with all of the handicaps of a science fiction anthology show that had to start from nothing each week, as well as a premise that allowed it to explain away its narrative shortcomings as stylistic choices, which wasn’t true of shows like Star Trek or The Twilight Zone. All too often, this was a deadly combination. In an academic study that was published when the show was still on the air, the scholar Jan Delasara writes:
When apprehended consciously, narrative gaps may seem random accidents or continuity errors. Who substitutes the dead dog for Private McAlpin’s corpse in the episode “Fresh Bones?” And why? What did the demon’s first wife remember but not tell her husband in “Terms of Endearment?” Who is conducting the experiment in subliminal suggestion along with chemical phobia enhancement in “Blood?” Is Mulder’s explanation really what’s going on?
Delasara argues that such flaws are the “disturbing gaps and unresolved questions” typical of supernatural horror, but it’s fair to say that in most of these cases, if the writers could have come up with something better, they would have. The X-Files had a brilliant aesthetic that also led to the filming of scripts that never would have been approved on a show that wasn’t expressly about dislocation and the unknown. The result often left me alienated, but probably not in the way that the creators intended. Mulder and Scully might never discover the full truth—but that doesn’t excuse their writers.
The variety show
In this week’s issue of The New York Times Style Magazine, Lin-Manuel Miranda interviews Stephen Sondheim, whom he calls “musical theater’s greatest lyricist.” The two men have known each other for a long time, and Miranda shares a memorable anecdote from their friendship:
Sondheim was one of the first people I told about my idea for a piece about Alexander Hamilton, back in 2008…I’d been hired to write Spanish translations for a Broadway revival of West Side Story, and during our first meeting he asked me what I was working on next. I told him “Alexander Hamilton,” and he threw back his head in laughter and clapped his hands. “That is exactly what you should be doing. No one will expect that from you. How fantastic.” That moment alone, the joy of surprising Sondheim, sustained me through many rough writing nights and missed deadlines. I sent him early drafts of songs over the seven-year development of Hamilton, and his email response was always the same. “Variety, variety, variety, Lin. Don’t let up for a second. Surprise us.”
During their interview, Sondheim expands on the concept of “variety” by describing an Off-Broadway play about “the mad queen of Spain” that he once attended with the playwright Peter Shaffer. When Sondheim wondered why he was so bored by the result, despite its nonstop violence, Shaffer explained: “There’s no surprise.” And Sondheim thought to himself: “Put that on your bathroom mirror.”
“The unexpected, the unexpected, that’s what theater is about,” Sondheim concludes to Miranda. “If you had to patent one thing in the theater, it’s surprise.” This is good advice. Yet when you turn to Sondheim’s own books on the craft of lyric writing, Finishing the Hat and Look, I Made a Hat, you find that he doesn’t devote much space to the notions of variety or surprise at all, at least not explicitly. In fact, at first glance, the rules that he famously sets forth in the preface to both books seem closer to the opposite:
There are only three principles necessary for a lyric writer, all of them familiar truisms. They were not immediately apparent to me when I started writing, but have come into focus via Oscar Hammerstein’s tutoring, Strunk and White’s huge little book The Elements of Style and my own sixty-some years of practicing the craft. I have not always been skilled or diligent enough to follow them as faithfully as I would like, but they underlie everything I’ve ever written. In no particular order, and to be inscribed in stone: Content Dictates Form, Less Is More, God Is in the Details, all in the service of Clarity, without which nothing else matters.
Obviously, these guidelines can be perfectly consistent with the virtues of variety and surprise—you could even say that clarity, simplicity, and attention to detail are what enable lyricists to engage in variety without confusing the listener. But it’s still worth asking why Sondheim emphasizes one set of principles here and another when advising Miranda in private.
When you look through Sondheim’s two books of lyrics, the only reference to “variety” in the index is to the show business magazine of the same name, but references to these notions are scattered throughout both volumes. Writing of Sweeney Todd in Finishing the Hat, Sondheim says: “Having taken the project on, I hoped that I’d be able to manage the argot by limiting myself to the British colloquialisms [playwright Christopher] Bond had used, mingled with the few I knew. There weren’t enough, however, to allow for variety of image, variety of humor, and, most important, variety of rhyme.” He criticizes the “fervent lack of surprise” in the lyrics of his mentor, Oscar Hammerstein, and he writes emphatically in his chapter on Gypsy: “Surprise is the lifeblood of the theater, a thought I’ll expand on later.” For his full statement on the subject, however, you have to turn to Look, I Made a Hat. After sharing his anecdote about attending the play with Shaffer, Sondheim continues:
[Shaffer said that] it had many incidents but no surprise. He didn’t mean surprise plot twists—there were plenty of those—but surprises in character and language. Every action, every moment, every sentence foretold the next one. We, the audience, were consciously or unconsciously a step ahead of the play all evening long, and it was a long evening…[Surprise] comes in many flavors: a plot twist, a passage of dialogue, a character revelation, a note in a melody, a harmonic progression, startling moments in staging, lighting, orchestration, unexpected song cues…all the elements of theater. There are surprises to be had everywhere if you want to spring them, and it behooves you to do so. What’s important is that the play be ahead of the audience, not vice versa. Predictability is the enemy.
So if surprise is “the lifeblood of the theater,” why doesn’t Sondheim include it in the preface as one of his central principles? In his next paragraph, he provides an important clue:
The problem with surprise is that you have to lay out a trail for the audience to follow all the while you’re keeping slightly ahead. You don’t want them to be bored, but neither do you want them to be confused, and unfortunately there are many ways to do both. This applies to songs as well as to plays. You can confuse an audience with language by being overly poetic or verbose, or you can bore them by restating something they know, which inserts a little yawn into the middle of the song. It’s a difficult balancing act.
The only way to achieve this balance is through the principles of simplicity and clarity—which is why Sondheim puts them up front, while saving variety for later. If you advise young writers to go for variety and surprise too soon, you end up with Queen Juana of Castile. It’s only after clarity and all of its boring supporting virtues have been internalized that the writer can tackle variety with discipline and skill. (As T.S. Eliot pointed out, it’s better to imitate Dante than Shakespeare: “If you follow Dante without talent, you will at worst be pedestrian and flat; if you follow Shakespeare or Pope without talent, you will make an utter fool of yourself.” And Samuel Johnson, let’s not forget, thought that the great excellence of Hamlet was its “variety.”) Miranda had clearly mastered the fundamentals, so Sondheim advised him to focus on something more advanced. It worked—one of the most thrilling things about Hamilton is its effortless juxtaposition of styles and tones—but only because its author had long since figured out the basics. And that shouldn’t come as a surprise.
Writing with scissors
Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile on McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)
Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:
The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)
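For readers who think better in code than in index cards, the sorting step that Anderson describes can be reduced to a toy sketch. This is emphatically not McPhee’s actual Structur program, and the function name, codes, and sample scraps are all invented for illustration; it only shows the shape of the method: every scrap carries a thematic code, the card sort fixes the order of the codes, and the scraps are then gathered into batches to be written through one at a time.

```python
# A toy sketch of the workflow Anderson describes: scraps of reporting are
# tagged with codes, an index-card sort decides the sequence of those codes,
# and the scraps are then collected into batches in that sequence.
# All names and data here are invented; this is not McPhee's software.

def batch_notes(scraps, card_order):
    """Group (code, text) scraps into batches, sequenced by the card sort.

    Scraps whose code never made it onto a card are collected but not
    returned, just as unused notes stay in the envelope.
    """
    batches = {code: [] for code in card_order}
    for code, text in scraps:
        batches.setdefault(code, []).append(text)
    # Work "batch by batch," in the order the index cards ended up in.
    return [(code, batches[code]) for code in card_order]

# Invented example scraps from an imaginary river piece.
scraps = [
    ("RAPIDS", "the canoe swings broadside"),
    ("GEOLOGY", "schist under the river bed"),
    ("RAPIDS", "interview with the boatman"),
]
card_order = ["GEOLOGY", "RAPIDS"]

for code, texts in batch_notes(scraps, card_order):
    print(code, texts)
```

The design choice worth noticing is the one McPhee himself emphasizes: once the batches exist, only one batch is “before me” at a time, and the rest of the structure is settled and out of sight.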
Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.
And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:
If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.
This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.
The elements of negation
In The Elements of Style, William Strunk and E.B. White provide the useful precept: “Put statements in positive form. Make definite assertions. Avoid timid, colorless, hesitating, noncommittal language. Use the word not as a means of denial or in antithesis, never as a means of evasion.” After offering a few illustrations for the sake of comparison, such as “He was not very often on time” as opposed to “He usually came late,” they conclude:
All [these] examples show the weakness inherent in the word not. Consciously or unconsciously, the reader is dissatisfied with being told only what is not; he wishes to be told what is. Hence, as a rule, it is better to express even a negative in a positive form.
Along with all the other benefits that come with preferring positives over negatives, there’s the subtle point, which Strunk and White don’t mention explicitly, that it forces the writer to think just a little harder at a time when he or she would probably prefer otherwise. The sentence “Shakespeare does not portray Katharine as a very admirable character, nor does Bianca remain long in memory as an important character in Shakespeare’s works” is both longer and less interesting than “Katharine is disagreeable, Bianca insignificant,” but it’s also easier to write. It’s in that one additional pass, as the writer has to figure out what something is, rather than what it isn’t, that insight tends to happen. All else being equal, the best writing rules are the ones that oblige us to move beyond the obvious answer.
The other problem with negation is that it carries its positive form along with it, like an unwanted ghost or a double exposure. In Philosophical Investigations, Ludwig Wittgenstein writes, with my emphasis: “The feeling is as if the negation of a proposition had to make it true in a certain sense, in order to negate it.” Wittgenstein continues, in an oddly beautiful passage:
“If I say I did not dream last night, still I must know where to look for a dream; that is, the proposition ‘I dreamt,’ applied to this actual situation, may be false, but mustn’t be senseless.”—Does that mean, then, that you did after all feel something, as it were the hint of a dream, which made you aware of the place which a dream would have occupied?
Again: if I say “I have no pain in my arm,” does that mean that I have a shadow of the sensation of pain, which as it were indicates where the pain might be? In what sense does my present painless state contain the possibility of pain?
Or as he puts it a few paragraphs earlier: “A red patch looks different when it is there from when it isn’t there—but language abstracts from this difference, for it speaks of a red patch whether it is there or not.”
When it comes to conveying meaning, this fact has real practical consequences. As The Stanford Encyclopedia of Philosophy notes: “Not only are negative statements (e.g., ‘Paris isn’t the capital of Spain’) generally less informative than affirmatives (‘Paris is the capital of France’), they are morphosyntactically more marked (all languages have negative markers while few have affirmative markers) and psychologically more complex and harder to process.” In a footnote, it adds:
One consequence of the formal markedness asymmetry is that a negative statement embeds its affirmative counterpart within it; when Nixon famously insisted “I am not a crook” or Clinton “I did not have sex with that woman,” the concealed affirmation was more significant than the surface denial. The same asymmetry is exploited in non-denial denials, such as Republican campaign operative Mary Matalin’s disingenuous protest “We’ve never said to the press that Clinton’s a philandering, pot-smoking draft-dodger.”
Politics is the arena where literary style, like sociology, is tested in the real world, which makes it all the more striking to see how often politicians turn to the negative form when forced to issue denials. Like the phrase “Mistakes were made,” the “I am not a crook” statement has become such a cliché that you’d think that they would avoid it, but it still appears regularly—which implies that it fulfills some deep psychological need.
So what kind of need is it? The philosopher Henri Bergson gets close to the heart of the matter, I think, in a very evocative passage in Creative Evolution, which I’ve highlighted in a few places for emphasis:
Negation is not the work of pure mind, I should say of a mind placed before objects and concerned with them alone. When we deny, we give a lesson to others, or it may be to ourselves. We take to task an interlocutor, real or possible, whom we find mistaken and whom we put on his guard. He was affirming something: we tell him he ought to affirm something else (though without specifying the affirmation which must be substituted). There is no longer then, simply, a person and an object; there is, in face of the object, a person speaking to a person, opposing him and aiding him at the same time; there is a beginning of society. Negation aims at some one, and not only, like a purely intellectual operation, at some thing. It is of a pedagogical and social nature. It sets straight or rather warns—the person warned and set straight being, possibly by a kind of doubling, the very person who speaks.
Politicians are an unusual species because so many of their private utterances become public, and their verbal slips, as on the analyst’s couch, are where they reveal the most. Sometimes it feels as if we’ve overheard them talking to themselves. When Nixon said, “People have got to know whether or not their president is a crook,” he was introducing a word into the conversation that hadn’t been there before, because it had already been rattling around in his brain. And when a politician speaks in the negative, it offers us a peek into the secret conversation that he has been having in his head all along: “I am not a crook,” “I did not have sex with that woman,” “I did not collude.”
The gospel of nouns and verbs
That’s not a Bible issue.
“Write with nouns and verbs, not with adjectives and adverbs,” William Strunk, Jr. and E.B. White advise in The Elements of Style. It’s one of the first rules that many aspiring writers hear, and it doesn’t take long to figure out why it works. When you make a point of telling stories and expressing thoughts using tangible nouns and concrete verbs, you quickly find that the result is more vivid, clear, and memorable. It’s an exercise in clarity that amounts to a form of courtesy, not just to the reader, but to yourself. Not every idea can be conveyed in the form of images or actions, but by at least making the effort, you’re more likely to discover the areas where your own thinking is muddled or incomplete. The reverse also holds true. Just as a safety handbook becomes a sabotage manual when you do the opposite of everything it says, The Elements of Style can be used to confuse and mislead, simply by inverting each of its rules into its own negation. By relying on the passive voice, vague language, and empty abstractions, you can make it harder for readers to understand what you’re really saying, or even to think for themselves. As George Orwell knew, such tactics can be used deliberately by governments to discourage critical thinking, and they can also be used unconsciously to avoid uncomfortable truths that we’d prefer not to confront. (My favorite illustration is Vijith Assar’s “An Interactive Guide to Ambiguous Grammar,” which is maybe the single best piece of online content I’ve seen in the last decade.)
And for an example of its potential consequences, you don’t need to look any further than an ongoing experiment that has been underway, in one form or another, for close to two thousand years. It’s called the New Testament. I’ve spoken before of my admiration for The Five Gospels, an ambitious attempt to use modern scholarly tools and consensus to uncover the original core of Jesus’s message. The Jesus Seminar takes a number of approaches to evaluating the authenticity of this material, but one of its most powerful methods comes down to an application of simple common sense. By definition, anything that Jesus said that survived to be written down in the latter half of the first century must have persisted for decades by word of mouth. We can get a rough sense of how that oral tradition might have looked by figuring out, almost from first principles, what kind of material is most likely to be passed down with a minimum of alteration. It tends to consist mostly of short, pithy, self-contained sayings or stories with distinctive ideas, memorable images, or apparent paradoxes. The resulting “database” of parables and aphorisms can be used as a baseline from which we can analyze the rest, and what we find, inevitably, is that the teachings that pass this initial test are concrete, rather than abstract—a gospel of nouns and verbs. You could even say that the whole point of Strunk and White’s rule is to make written prose approximate the vigor and power of spoken language. And the sayings of Jesus that have been transmitted to us intact exemplify a predominantly oral culture at its best.
As the scholars of the Jesus Seminar take pains to point out, identifying certain verses as more likely to have emerged from an oral tradition doesn’t mean that we should ignore the rest. But it’s no exaggeration to say that when we read the gospels with an eye to emphasizing what might plausibly have been recalled by Jesus’s original listeners, we end up with a picture that is startlingly different from what many of us hear in church. For one thing, it’s a message that consists largely of specific actions. Here are some of the sayings that seem most likely to be authentic:
Don’t react violently against the one who is evil: when someone slaps you on your right cheek, turn the other as well. When someone wants to sue you for your shirt, let that person have your coat along with it. Further, when anyone conscripts you for one mile, go an extra mile. Give to everyone who begs from you. Love your enemies.
The Jesus Seminar also identifies verses in which the sentiment appears to have been modified over time to make it more palatable. Matthew, for instance, has “Give to the one who begs from you,” which feels like a softening of Luke’s impossible “Give to everyone who begs from you.” In addition, we end up losing many extended passages of theological exposition that seem unlikely to have been remembered by anyone. Most strikingly, this means giving up nearly all of the Gospel of John, in which Jesus does little else but make claims about himself or expound upon his own nature—a portrait that is inconsistent with both the mechanics of oral transmission and what little we know about Jesus himself.
And I don’t think I’m alone in saying that this gospel is very different from the one that I associate with going to church, which sometimes seems to consist of nothing but metaphysical claims and confessions of belief. This is partially a statistical artifact: the original words of Jesus, whatever they were, account for a very small percentage of the verses in the New Testament. But I think there’s also something more insidious at work. Organized religion embraces abstract language for the same reason that it was incorporated into the gospels in the first place: it makes it easier to live with the underlying message by diluting it beyond recognition, and it excludes outsiders while smoothing over inconvenient issues that might divide the congregation. It’s far easier to meditate on the nature of Christ than to consider the true implications of the words “Sell all your possessions and give the money to the poor.” (One of the first notable schisms within the church, revealingly, was over a choice of adjectives.) Like many forms of institutionalized abstraction, it has real implications for the inner lives of its believers. It makes it possible for millions of Christians to convince themselves that the recent presidential order on refugees is consistent with the values that Jesus explicitly expressed toward the poor, the vulnerable, and the homeless. Franklin Graham, whose own charity is named for the parable that tells us that compassion goes beyond borders, says that it isn’t a biblical issue. Maybe it isn’t, at least not in the subset of the Bible that he has chosen to take to heart. But Orwell had a word for it—doublethink. And Graham would do well to remember the verse that reads: “Why do you call me, ‘Lord, Lord,’ and do not do what I say?”
The art of omission
Over the last couple of years, it has slowly become clear that the series of articles on the writing life that John McPhee is unhurriedly publishing in The New Yorker is one of the modest but indisputable creative events of our time. McPhee has long been regarded as the dean of American nonfiction, and in one essay after another, he has lovingly, amusingly, and unsentimentally unpacked the tricks and secrets of six full decades as a writer and curious character. The fact that these pieces are written from the perspective of a journalist—albeit a preternaturally inventive and sensitive one—makes them even more useful for authors of fiction. Because the point of view has been shifted by ninety degrees, we’re more aware of the common elements shared by all forms of writing: choice of subject, structure, revision, selection, omission. There isn’t a point that McPhee makes here that couldn’t be applied with profit to any form of creative writing, or any kind of artistic effort in general. McPhee isn’t dogmatic, and he frames his advice less as a rulebook than as a string of gentle, sensible suggestions. But the result, when collected at last in the inevitable book, will amount to nothing less than one of the most useful works ever composed on the art of clear writing and thinking, worthy of being placed on the same shelf as The Elements of Style. Strunk and White will always come first, but McPhee has set himself up as their obvious successor.
Take his most recent article, which focuses on the crucial art of omission. McPhee makes many of the same points—although more vividly and memorably—that others have covered before. Writing is cutting; a story should be exactly the length that can be sustained by its material and no more; a rough draft almost always benefits from being trimmed by ten percent. Underlying it all, however, is a deeper philosophical sense of why we omit what we do. McPhee writes:
To cause a reader to see in her mind’s eye an entire autumnal landscape, for example, a writer needs to deliver only a few words and images—such as corn shocks, pheasants, and an early frost. The creative writer leaves white space between chapters or segments of chapters. The creative writer silently articulates the unwritten thought that is present in the white space. Let the reader have the experience. Leave judgment in the eye of the beholder. When you are deciding what to leave out, begin with the author. If you see yourself prancing around between subject and reader, get lost. Give elbow room to the creative reader. In other words, to the extent that this is all about you, leave that out.
Omission, in short, is a strategy for enforcing objectivity, and it obliges the writer to keep an eye out for the nonessential. When you’re trying to cut a story or essay by some arbitrary amount, you often find that the first parts to go are the places where you’ve imposed yourself on the subject. And if you sacrifice a telling fact or detail to preserve one of your own opinions, you’ve probably got bigger problems as a writer.
And the word “arbitrary” in the above paragraph is surprisingly important. Yesterday, I quoted Calvin Trillin on the process of greening at Time, in which makeup editors would return an article to its author with curt instructions to cut five or ten lines. McPhee, who did a lot of greening himself over the years, adds a crucial piece of information: “Time in those days, unlike its rival Newsweek, never assigned a given length but waited for the finished story before fitting it into the magazine.” In other words, the number of lines the writer was asked to cut wasn’t dictated by the content of the story, but by an arbitrary outside factor—in this case, the length and layout of the other articles that happened to be jostling for space in that particular issue. And while we might expect this to do violence to the integrity of the story itself, in practice, it turns out to be the opposite: it’s precisely because the quota of lines to remove is essentially random that the writer is forced to think creatively about how and where to condense. I’ve imposed arbitrary length limitations on just about everything meaningful I’ve ever written, and if anything, I wish I had been even more relentless. (One of the few real advantages of the structural conventions of the modern movie script is that it obliges writers to constantly engage in a kind of greening to hit a certain page count. Sometimes, it can feel like cheating, but it’s also a productive way to sweat down a story, and there isn’t a screenwriter alive who hasn’t experienced the truth of McPhee’s observation: “If you kill a widow, you pick up a whole line.”)
Of course, none of this means that the seemingly nonessential doesn’t have its place. Few essays would be any fun to read if they didn’t include the occasional digression or footnote that covered tangentially related territory, and that applies to McPhee as much as to anyone else. (In fact, his piece on omission concludes with a huge shaggy dog story about General Eisenhower, ending on a delicious punchline that wouldn’t be nearly as effective if McPhee hadn’t built up to it with a full page of apparent trivia.) Every work of art, as McPhee notes elsewhere, arrives at its own rhythms and structure, and an essay that is all business, or a series of breathless news items, is unlikely to inspire much affection. If there’s a point to be made here, though, it’s that digression and looseness are best imposed on the level of the overall work, rather than in the individual sentence: McPhee’s finest essays often seem to wander from one subject to the next as connections occur to the author, but on the level of the individual line or image, they’re rigorously organized. Greening is such a valuable exercise because it targets the parts of a work that can always be boiled down further—transitional sections, places where the text repeats itself, redundancies, moments of indulgence. McPhee compares it to pruning, or to removing freight cars to shorten a train, so that no one, even the author, would know in the end that anything has been removed. And it’s only through greening that you discover the shape that the story wants for itself.
How to debug a novel
The most effective debugging tool is still careful thought, coupled with judiciously placed print statements.
—Brian Kernighan
Revising a novel and debugging a computer program may seem like very different activities, but they have one important thing in common: when something breaks, you usually don’t know why the hell it happened. With a program, at least, it’s often clear when you have a problem. You’ve carefully written a bunch of code—or typed it in verbatim from a textbook—and checked it over for mistakes, but when you compile and run it, you end up with an error message, a bunch of garbage, or nothing at all. When a novel is failing, it isn’t always as obvious. You’ve invested months into telling a particular story, which you wouldn’t have done if you didn’t care about it, but when you take a few weeks off and go back to reread it, the result seems inert: the characters are flat, the events don’t pop the way they did in your imagination, and you find yourself getting bored by your own story before you’re halfway done. What’s required, in short, is debugging. And while coders have access to a range of useful debugging tools, if you’re writing fiction, you’ve got no choice but to do it the old-fashioned way.
The first step in traditional debugging is to carefully read over the code, preferably on paper. Sometimes you find that you’ve made a stupid syntactical mistake—there’s a missing semicolon or closing parenthesis that throws the entire program out of whack. The equivalent in fiction is bad grammar or sloppy sentence construction, which makes even the best stories die on the page. (Here the best guide, as usual, is Strunk and White, and it’s no accident that its example inspired Brian Kernighan and P.J. Plauger to write The Elements of Programming Style.) On a somewhat higher level, you can ask yourself whether the story follows the few objective rules of storytelling in which you have any confidence. These vary from one genre to another, as well as between authors, but most writers eventually figure out a set of their own. Does the protagonist have a logical objective from one moment to the next? Does each scene start as late and end as early as possible? Does every chapter open with a clear situation? These are the building blocks of narrative, and if they don’t work on the lowest level, the story is likely to fall apart on the highest.
Which brings us to the print statement. It’s a trick that most beginning programmers figure out on their own: if a program is breaking, it’s often because one or more variables aren’t receiving the values they should. Modern debuggers have tools for tracking down such problems, but the most basic solution is to insert print statements into the code to display the value of the suspect variables at each stage in the process. It’s a window into the program, allowing you to follow it step by step. And although there isn’t an exact equivalent in writing fiction—in which everything is right there on the page—it can often be useful to pause the story to ask yourself where exactly you stand. As a writer reading over your own work, you have full knowledge of where the story is going, and you know that a slow stretch in Chapter 5 is necessary to set up an exciting sequence in Chapter 6. But a reader doesn’t know this. As difficult as it might be, then, you need to ask yourself what a reader encountering the story for the first time will think of a scene on its own terms. And writing it out often helps. Like a print statement, it’s a snapshot of where the story is right now, which is all the reader—or computer—can be expected to care about.
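For readers who have never tried it, the programming side of the analogy looks something like the following minimal Python sketch. The function name and variables here are hypothetical, invented purely for illustration; the point is the temporary print statements that expose the value of each suspect variable as the computation unfolds:

```python
def total_price(items, tax_rate):
    """Sum up (name, price) pairs and apply tax."""
    subtotal = sum(price for _, price in items)
    # Temporary print statement: a window into the program,
    # showing the value of the suspect variable at this stage.
    print(f"subtotal = {subtotal}")
    total = subtotal * (1 + tax_rate)
    print(f"total = {total}")
    return total

total_price([("book", 10.0), ("pen", 2.0)], 0.05)
```

Each print line is a snapshot of where the program stands at that exact moment—which, as with the reader encountering Chapter 5 for the first time, is all the computer can be expected to care about. Once the bug is found, the print statements come out again, much like the scaffolding notes a writer discards after revision.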
The last technique worth mentioning is the wolf fence algorithm, as first described by Edward Gauss:
There’s one wolf in Alaska, how do you find it? First build a fence down the middle of the state, wait for the wolf to howl, determine which side of the fence it is on. Repeat process on that side only, until you get to the point where you can see the wolf.
In programming, this means subdividing the code until you find the section where the program is failing, then repeating the process until you’ve zeroed in on the problem. Most novelists tend to do this intuitively. When you’re reading over a novel, you start to think of it in large chunks: you know that the sequence from page 150 to 250 works fine, while the first forty pages are giving you trouble, even if you’re not sure how or where. Instead of trying to crack the novel as a whole, it makes sense to focus on the trouble spots, continuing to narrow the scope as you iteratively revise. After the first revision, you find that three out of the five chapters in question seem fine, even though the overall section is still giving you trouble, which implies that the problem is in one of the two chapters remaining. And you repeat as necessary, homing in on something as small as a misconceived page or paragraph, until you’ve found your wolf.
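In code, the wolf fence is just binary search over a span of the program (or the manuscript). Here is a minimal Python sketch of the idea; the `is_broken` predicate is a hypothetical stand-in for whatever check tells you the fault lies somewhere in a given half-open range—in the novelist’s case, rereading that span and finding trouble:

```python
def wolf_fence(is_broken, lo, hi):
    """Narrow the half-open range [lo, hi) down to the single index
    containing the fault, by repeatedly building a fence down the middle
    and listening for the wolf's howl on one side."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if is_broken(lo, mid):
            # The wolf howled on the left side: fence off the right.
            hi = mid
        else:
            # No howl on the left: the wolf must be on the right.
            lo = mid
    return lo
```

Because each pass discards half the remaining territory, even a forty-chapter range takes only a handful of iterations to shrink to a single chapter—which is why the intuitive version of this process works so well for revision.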
To be or not to be?
One of the most daunting aspects of writing good fiction is the sheer number of rules involved. It’s hard enough to write convincing stories about men and women who never lived, but along with developing empathy and imagination, you also need to think hard about dangling participles, misplaced modifiers, and the sequence of tenses, and in the end, a lot of it comes down to intuition. I’ve been writing for so long that I rarely need to pause to wonder about grammar, and I’ve more or less internalized The Elements of Style—although I still try to read it again every year or so. But no matter how much you think you know, there’s always something you’ve missed. It wasn’t until my first severe copy edit, for instance, that I realized that I was totally ignorant of such matters as the difference between “toward” and “towards” or “further” and “farther,” and I’d never worried much about the use of a comma before the conjunction in a dependent clause. (For the record, I think the latter is fine in certain cases, and I’m strongly in favor of splitting infinitives.)
Faced with so many rules and guidelines, though, it’s easy to lose sight of the larger picture. My favorite example is the admonition, which I’ve seen a lot recently, that writers should avoid “to be” verbs. In itself, the advice seems sound: verbs like is, am, was, and are lack specificity, lend themselves to bland constructions, and aren’t as vivid as verbs that convey clear action. In practice, though, the rule can be a bitch to implement. When we’re told to revise to avoid adverbs or eliminate the passive voice, we can usually fix a sentence by cutting a word or restructuring a clause, but eliminating “to be” verbs isn’t something you can do with a simple find and replace. (In theory, you could replace most occurrences with the equivalent of “seemed,” which you’ll often see in Updike, among others, and I’ll often do this to emphasize a particular character’s perspective. But this is a solution that is best used sparingly.) Instead, following the rule often means writing a new sentence entirely, which is something that gives most writers the shivers.
But there’s a more general lesson here, which is that this rule is more diagnostic than prescriptive. When you go back over your work and find a lot of “to be” verbs, it’s really a sign that other faults are present: your writing may be too abstract, too passive, too general. Going back to fix the offending sentences by hand may address the problem in the short term, but really, the only good solution is to cultivate habits of thought that prevent such constructions from appearing in the first place. Focusing on the verbs themselves is a little like treating the symptoms while ignoring the disease, or, more accurately, counting calories while forgetting to exercise. The ultimate objective is to write concretely, to create images in the mind of the reader, and to put an emphasis on clarity and vividness, and the only way to achieve this is to write endlessly, to patiently revise, and to read authors who embody the qualities of soul you admire.
In short, the problem of style needs to be attacked from both directions. The rules of grammar are there for a reason: they’re a means of facilitating communication and making sure that the reader understands what the writer is trying to say. In practice, though, they’re acquired mostly through trial and error, by writing a million bad words along the way, until the writer starts to develop an ear for good language, in the same way a songwriter can improvise a vocal melody without thinking consciously about the theory of music. Attaining that kind of intuition is every writer’s dream, but even when you attack the problem as diligently as you can, you generally find that craft keeps moving the goal posts. And even the most accomplished author still makes mistakes: Norman Mailer spent something like ten years writing Harlot’s Ghost, and overlooked a dangling modifier in the very first sentence, even if he tried to justify it after the fact. But it’s still worth playing the game, as long as we remember that when we worry about “to be” or not “to be,” craft is still the question.
My essential writing books
1. The Elements of Style by William Strunk, Jr. and E.B. White. If I were putting together an essential library of books for an aspiring writer of any kind, The Elements of Style would be first on the list. In recent years, there’s been something of a backlash against Strunk and White’s perceived purism and dogmatism, but the book is still a joy to read, and provides an indispensable baseline for most good writing. It’s true that literature as a whole would be poorer if every writer slavishly followed their advice, say, to omit needless words, as Elif Batuman says in The Possessed: “As if writing were a matter of overcoming bad habits—of omitting needless words.” Yet much of creative writing does boil down to overcoming bad habits, or at least establishing a foundation of tested usage from which the writer only consciously departs. More than fifty years after it was first published, The Elements of Style is still the best foundation we have.
2. The Art of Fiction by John Gardner. I bought this book more than fifteen years ago at a used bookstore in Half Moon Bay, shortly before starting my freshman year in high school. Since then, I’ve reread it, in pieces, a dozen or more times, and I still know much of it by heart. Writing books tend to be either loftily aspirational or fixated on the nuts and bolts of craft, and Gardner’s brilliance is that he tackles both sides in a way that enriches the whole. He has plenty to say on sentence structure, vocabulary, rhythm, and point of view, but he’s equally concerned with warning young writers away from “faults of soul”—frigidity, sentimentality, and mannerism—and reminding them that their work must have interest and truth. Every element of writing, he notes, should be judged by its ability to sustain the fictional dream: the illusion, to the reader, that the events and characters described are really taking place. And everything I’ve written since then has been undertaken with Gardner’s high standards in mind.
3. Writing to Sell by Scott Meredith. I hesitated between this book and Dean Koontz’s Writing Popular Fiction, which I reread endlessly while I was teaching myself how to write, but I’ve since discovered that it cribs much of its practical material from Meredith. Scott Meredith was a legendary literary agent—his clients included Norman Mailer, Arthur C. Clarke, and P.G. Wodehouse—and his approach to writing is diametrically opposed to Gardner’s: his book is basically a practical cookbook on how to write mainstream fiction for a wide audience, with an emphasis on plot, conflict, and readability. The tone can be a little mercenary at times, but it’s all great advice, and it’s more likely than any book I know to teach an author how to write a novel that the reader will finish. (One warning: Meredith’s chapter on literary agents, and in particular his endorsement of the use of reading fees, should be approached with caution.)
4. On Directing Film by David Mamet. I’ve spoken about this book at length before, but if I seem awed by it, it’s because I encountered it at a time in my life when I already thought I’d figured out how to write a novel. At that point, I’d already sold The Icon Thief and a handful of short stories, so reading Mamet’s advice for the first time was a little like a professional baseball player realizing that he could raise his batting average just by making a few minor adjustments to his stance. Mamet’s insistence that every scene be structured around a series of clear objectives for the protagonist may be common sense, but his way of laying it out—notably in a sensational class session at Columbia in which a scene is broken down beat by beat—rocked my world, and I’ve since followed his approach in everything I’ve done. At times, his philosophy of storytelling can be a little arid: any work produced using his rules needs revision, and a touch of John Gardner, to bring it to life. But my first drafts have never been better. It’s so helpful, in fact, that I sometimes hesitate before recommending it, as if I’m giving away a trade secret—but anyway, now you know.
Writing is cutting
Movies are made in the editing room. It’s a cliché, but it’s also true: you can shoot the best raw footage in the world, but if it doesn’t cut together, the movie isn’t going to work. Beyond their basic responsibilities of maintaining continuity and spatial coherence, the editor is largely responsible for shaping a film’s narrative momentum, streamlining and clarifying the story, and making sure it runs the proper length. And sometimes the editor’s role goes even further. As Charles Koppelman writes in Behind the Seen:
[Walter] Murch says it’s common in editing, and normally easy, to steer scenes five or ten degrees in either direction from their intended course. Shading intensity, favoring a character, softening a moment—that’s “the bread and butter of film editing,” as he calls it. “It also seems that flipping the polarity of a scene—going completely the opposite way from where things were originally intended—is something relatively easy to do in film editing.”
And although there are countless famous cases of movies being radically rewritten in the editing room, like Ralph Rosenblum’s brilliant reshaping of Annie Hall, a casual comparison between the published screenplays and the finished versions of most great movies reveals that crucial changes are being made all the time. To pick just one example: the closing montage of words and images at the end of The Usual Suspects, which gives the entire movie much of its power, is totally absent in the script, and a lot of the credit here needs to be given to editor John Ottman. And smaller, less flashy examples are visible everywhere you look.
At first glance, it might seem as if a novelist is in a somewhat different position. A film editor is constrained by the material at hand, and although in certain cases he may have some input when it comes to expensive reshoots, for the most part, he has no choice but to make do with the footage that results from principal photography, which can be massaged and reconceived, but only to some extent, with the help of clever cutting, wild lines, and lucky discoveries in the slate piece. (The slate piece, as I’ve mentioned before, is the second or two of stray film left at the beginning of a take, before the actors have even begun to speak. Mamet likes to talk about finding important bits of footage in this “accidental, extra, hidden piece of information,” and he isn’t lying—the evocative, ominous shots of empty corridors in the hospital scene in The Godfather, for instance, were salvaged from just such a source.) A novelist, by contrast, can always write new material to fill in the gaps or save an otherwise unworkable scene, and it doesn’t cost anything except time and sanity. In reality, however, it isn’t quite that easy. The mental state required for writing a first draft is very different from that of revision, and while writers, in theory, benefit from an unlimited range of possibilities, in practice, they often find themselves spending most of their time trying to rework the material that they already have.
This is why I’ve become increasingly convinced that writing is revision, and in particular, it’s about cutting and restructuring, especially with regard to reducing length. Fortunately, this is one area, and possibly the only area, in which writers have it easier now than ever before. In The Elements of Style, E.B. White writes:
Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.
There’s something appealing about the image of a writer literally cutting his work using scissors and tape, and it’s possible that there’s something tactile in the process that would lead to happy accidents—which makes me want to try it sometime. These days, however, it’s so easy to cut and restructure files in Word that it seems insane for a writer not to take full advantage of the opportunity. Like editing a movie in Final Cut Pro, it’s nondestructive: you can try anything out and reverse it with a keyboard shortcut. You can cut as much as you like and restore it with ease, as long as you’ve taken the precaution of saving a new version with every round of revision. And I’ve learned that if it occurs to you that something could be cut, it should be. Nine times out of ten, once that initial change has been made, you won’t even remember what was there before—and if, five or ten rereadings later, you find that you still miss it, it’s a simple matter to restore what used to be there.
And almost invariably, the shorter and more focused the story becomes, the better it gets. Not only is cutting a story as much as possible the best trick I know, in some ways, it’s the only trick I know. When I look back at my own published work, I naturally divide it into several categories, based on how happy I am with the finished result. At the top are the stories—The Icon Thief, “The Boneless One,” and a handful of others—that I don’t think I’d change much at all, followed by a bunch that I’d like to revise, and a couple that I wish hadn’t seen print in their current form. Without exception, my regrets are always the same: I wish I’d cut it further. The conception is sound, the writing is fine, but there are a few scenes that go on too long. And although it’s impossible to know how you’ll feel about one of your stories a year or two down the line, I almost always wish I’d made additional cuts. That’s why, as I begin the final push on Eternal Empire, I’m cutting even more savagely than my critical eye might prefer, trying to think in terms of how I’ll feel ten months from now, when the novel is published. (The divergence between my present and future selves reminds me a little of the gap between Nate Silver’s “now-cast” and his election day forecast, which will finally converge on November 6.) I don’t know what my future self will think of this novel. But I can almost guarantee that he’ll wish that I’d cut a little more.
Quote of the Day
[A]lthough there is no substitute for merit in writing, clarity comes closest to being one. Even to a writer who is being intentionally obscure or wild of tongue we can say, “Be obscure clearly! Be wild of tongue in a way we can understand!”
Playing the odds
Earlier this week, in my post about revision, I wrote: “If you start revising a novel before you’ve completed a first draft, your chances of finishing it at all are essentially zero.” Later, in the comments, a reader quite reasonably asked if it wasn’t possible to take a slightly less rigid stance—that is, if there was some kind of sensible middle ground of interim revision. The answer, of course, is yes: there are as many approaches to writing as there are writers, and there are certainly some who can finish a novel while diligently revising along the way. The catch, as I see it, is that such writers will always be outnumbered by those who get stuck revising the first few chapters. In short, my warning may not apply to you, but it’s probably safest to assume that it does. Because like most writing advice, it’s really about playing the odds.
Here’s what I mean. Any rational observer would have to conclude that the odds are already stacked against any aspiring writer. I won’t go into the obstacles in detail—the difficulty of finding an agent, the questionable state of modern publishing, the uncertainty of success even after a novel has been released—but it’s safe to say that anyone’s chances of becoming a working writer are fairly slim. As a result, the determined writer is like a smart gambler playing against the house: it may be a losing game, but he still adjusts the odds in his favor whenever he can. Card counters (or even amateur video poker players like me) know that incremental advantages are what make the difference between breaking even and going home broke. The same holds true for writing. It’s such a ridiculously impractical pursuit that it’s necessary to be pragmatic about it whenever possible. And if the odds of our writing a publishable novel are increased by following a few basic rules, we’d be foolish not to consider adjusting our habits accordingly.
Playing the odds is also why I place such emphasis on craft. In her recent book The Possessed, which I’m reading now, writer Elif Batuman questions the whole premise of craft, as drilled into the heads of writers at a thousand workshops:
What did craft ever try to say about the world, the human condition, or the search for meaning? All it had were its negative dictates: “Show, don’t tell”; “Murder your darlings”; “Omit needless words.” As if writing were a matter of overcoming bad habits—of omitting needless words.
Regular readers of this blog can guess how sharply my own opinions differ from Batuman’s—among other things, I do think that a lot of what we call creative writing boils down to overcoming bad habits—and I hope to devote more time soon to the issues she raises. For now, though, I’ll restrict myself to pointing out the obvious, which is that while there’s no guarantee that a writer with good craft will produce great literature, the odds of a writer without craft producing anything readable are vanishingly small. Are there exceptions? Sure. But anyone who hopes to make a living from writing is already betting that he’s going to be the exception to the rule. To worsen the odds by neglecting craft is like blindly discarding your entire hand in hopes of drawing a royal flush. It could happen, but it isn’t likely.
So how do you improve the odds? You begin by working as intelligently as you can: you listen to Strunk and White, you do, in fact, omit needless words, and you don’t revise until the entire first draft is done. (You’ll probably need to make a few mistakes along the way, as I did, before you’re convinced of the wisdom of this approach: the rules of writing, like any kind of philosophy, are only acquired through experience.) Then, once you have enough craft at your disposal, and hopefully a few published credits, you can start to break the rules. After all, if you’re the exceptional writer you hope you are, a little bit of craft—which will adjust the odds in your favor far out of proportion to their actual cost—won’t keep you from finishing the novel you were born to write. So why take the chance?
Stephen Sondheim’s three rules of writing
There are only three principles necessary for a lyric writer, all of them familiar truisms. They were not immediately apparent to me when I started writing, but have come into focus via Oscar Hammerstein’s tutoring, Strunk and White’s huge little book The Elements of Style and my own sixty-some years of practicing the craft. I have not always been skilled or diligent enough to follow them as faithfully as I would like, but they underlie everything I’ve ever written. In no particular order, and to be inscribed in stone:
Content Dictates Form
Less Is More
God Is in the Details
all in the service of
Clarity
without which nothing else matters.
“If she was going to run, it had to be now…”
leave a comment »
Note: This post is the fifty-sixth installment in my author’s commentary for Eternal Empire, covering Chapter 55. You can read the previous installments here.
In general, an author should try to write active protagonists in fiction, for much the same reason that it’s best to use the active voice, rather than the passive, whenever you can. It isn’t invariably the right choice, but it’s better often enough that it makes sense to use it when you’re in doubt—which, when you’re writing a story, is frankly most of the time. In The Elements of Style, Strunk and White list the reasons why the active voice is usually superior: it’s more vigorous and direct, it renders the writing livelier and more emphatic, and it often makes the sentence shorter. It’s a form of insurance that guards against some of the vices to which writers, even experienced ones, are prone to succumbing. There are few stories that wouldn’t benefit from an infusion of force, and since our artistic calculations are always imprecise, a shrewd writer will do what he or she can to err on the side of boldness. This doesn’t mean that the passive voice doesn’t have a place, but John Gardner’s advice in The Art of Fiction, as usual, is on point:
And most of the same arguments apply to active characters. All else being equal, an active hero or villain is more engaging than a passive victim of circumstance, and when you’re figuring out a plot, it’s prudent to construct the events whenever possible so that they emerge from the protagonist’s actions. (Or, even better, to come up with an active, compelling central character and figure out what he or she would logically do next.) This is the secret goal behind the model of storytelling, as expounded most usefully by David Mamet in On Directing Film, that conceives of a plot as a series of objectives, each one paired with a concrete action. It’s designed to maintain narrative clarity, but it also results in characters who want things and who take active measures to attain them. When I follow the slightly mechanical approach of laying out the objectives and actions of a scene, one beat after another, it gives the story a crucial backbone, but it also usually leads to the creation of an interesting character, almost by accident. If nothing else, it forces me to think a little harder, and it ensures that the building blocks of the story itself—which are analogous, but not identical, to the sentences that compose it—are written in the narrative equivalent of the active voice. And just as the active voice is generally preferable to the passive voice, in the absence of any other information, it’s advisable to focus on the active side when you aren’t sure what kind of story you’re writing: in the majority of cases, it’s simply more effective.
Of course, there are times when passivity is an important part of the story, just as the passive voice can be occasionally necessary to convey the ideas that the writer wants to express. The world is full of active and passive personalities, and of people who don’t have control over important aspects of their lives, and there’s a sense in which plots—or genres as a whole—that are built around action leave meaningful stories untold. This is true of the movies as well, as David Thomson memorably observes:
One of the central goals of modernist realism has been to give a voice to characters who would otherwise go unheard, precisely because of their lack of conventional agency. And it’s a problem that comes up even in suspense: a plot often hinges on a character’s lack of power, less as a matter of existential helplessness than because of a confrontation with a formidable antagonist. (A conspiracy novel is essentially about that powerlessness, and it emerged as a subgenre largely as a way to allow suspense to deal with these issues.)
So how do you tell a story, or even write a scene, in which the protagonist is powerless? A good hint comes from Kurt Vonnegut, who wrote: “I don’t praise plots as accurate representations of life, but as ways to keep readers reading. When I used to teach creative writing, I would tell the students to make their characters want something right away—even if it’s only a glass of water. Characters paralyzed by the meaninglessness of modern life still have to drink water from time to time.” This draws a useful distinction, I think, between the two functions of the active mode: as a reflection of reality and as a tool to structure the reader’s experience. You can use it in the latter sense even in stories or scenes in which helplessness is the whole point, just as you can use the active voice to increase the impact of prose that is basically static or abstract. In Chapter 55 of Eternal Empire, for example, Maddy finds herself in as vulnerable a position as can be imagined: she’s in the passenger seat of a car being driven by a woman who, she’s just realized, is her mortal enemy. There isn’t much she can plausibly do to defend herself, but to keep her from becoming entirely passive, I gave her a short list of actions to perform: she checks her pockets for potential weapons, unlocks the door on her side as quietly as she can, and looks through the windshield to get a sense of their location. Most crucially, at the moment when it might be possible to run, she decides to stay where she is. The effect is subtle, but real. Maddy isn’t in control of her situation, but she’s in control of herself, and I think that the reader senses this. And it’s in scenes like this, when the action is at a minimum, that the active mode really pays off…
Written by nevalalee
June 23, 2016 at 8:54 am
Posted in Books, Writing
Tagged with David Mamet, David Thomson, E.B. White, Eternal Empire commentary, John Gardner, On Directing Film, Strunk and White, The Art of Fiction, The Elements of Style, William Strunk Jr.