Posts Tagged ‘John Gardner’
The survivors
Note: This week marks the twenty-fifth anniversary of the release of Very by the Pet Shop Boys. Today I’ll be publishing the last in a series of posts devoted to the legacy of my favorite album of all time.
Every subculture begins as a strategy for survival, although not everyone arrives at the same set of tactics. In the oral history The World Only Spins Forward, the scholar Madison Moore describes one possible approach: “Fabulousness becomes, if I may, a giant fuck you to the norms. People emerge out of that. You emerge because you’re tired of hiding. It’s so much easier to be normal, to fit in, to repress yourself.” Brian Herrera, an assistant professor of theater at Princeton, makes a similar point:
You could see the cues, the winks, ways to tell that someone was gay, and you could read that as speaking to you as a gay male person without ever having to name it. In that register, the realm of the fabulous became one of the ways that you could signal that you were in on the joke, you got the joke, you were in some ways making the joke. People like Sylvester. The Village People. Camp was a building of a vocabulary of critical connoisseurship that was celebratory, that was ours.
In The Art of Fiction, John Gardner refers to writing as a yoga, or a way of life in the world, and you could say much the same thing about the notion of camp, which was invented by men and women who had to develop superhuman capacities of mental and emotional endurance. As Prior Walter says as he hears the sound of beating wings at the end of Millennium Approaches: “My brain is fine, I can handle pressure, I am a gay man and I am used to pressure.”
But not everyone reacts to pressure in the same way. In the passage that I quoted above, Moore continues: “A lot of folks, people who embrace fabulousness, are attacked on the street and feel the need to wear men’s clothing, ‘safe’ clothing, as a way to get from A to B, and then when they get there, they bust out.” Yet there’s something equally compelling about those who hold themselves in reserve. The Pet Shop Boys were defined in the early days by reticence and irony, which were wildly misinterpreted by listeners who took “Opportunities” and “Shopping” at face value. Part of this stance stems from what Nabeel Zuberi, as I noted here yesterday, calls “a repression that is part of that residue of English nationalism’s effect on the body,” but it also reflects something in particular about Neil Tennant. In his landmark interview with Attitude, he set himself pointedly apart from the kind of world that Moore and Herrera evoke:
I’ve never wanted to be part of this separate gay world. I know a lot of people will not appreciate hearing me say that. But when people talk about the gay community in London, for instance, what do they really mean by that? There is a community of interests, particularly around the health issue, but beyond that what is there really? There’s nightclubs, drugs, shopping, PAs by Bad Boys Inc. Well…I’m sorry but that isn’t really how I define myself. I don’t want to belong to some narrow group or ghetto. And I think that if they’re really honest a lot of gay people would say that they felt like that as well.
And no matter how you feel about this, the result was a body of work—at least for its first decade—about survival in plain sight. It was about getting from A to B.
The ensuing web of strategies—the detachment, the reserve, the use of technology to conceal overwhelming emotion—is a big part of why the Pet Shop Boys have always been important to me. I’m not comfortable with labels, but if pressed, I would say that I identify as bisexual, and I’ve never been entirely at home in my own skin. The world that their music creates also speaks to a certain kind of introvert, and more recently, I’ve been struck by its parallels to the science fiction community, in which many of the same qualities were channeled along somewhat different lines. Science fiction appealed strongly from the beginning to readers who saw themselves as outsiders, and with a slight change of label, it offered a secret inner life with affinities to what Stephen Spinella describes in The World Only Spins Forward: “Because it is something that can be masked and hidden, there are issues of a dual nature to your presence. You’re living a double life. There is something fabulous about that. There is something outside the norm of living in that mysterious mindset.” When you walk around the World Science Fiction Convention, you see a few fans at the extreme of fabulousness, along with others, like me, who look more like they might be treating everyday life as a form of cosplay.

Both cultures also have a vested interest in technology. Science fiction has often been more comfortable talking about machines than about people, and Tennant, Lowe, and their contemporaries were drawn for some of the same reasons to the synthesizer. It was private, anonymous, a reaction against the cult of the self in rock music, and it offered forms of expression for people in solitude. As Stephin Merritt puts it in the wonderful song “Foxx and I,” his admiring ode to the original frontman of Ultravox:
Anyone can change into a machine
Girl or white, black or boy
Dull or very strange, into a machine
Come with me…
I’m perfectly aware, of course, of the differences between these two cultures, as well as the forms of exclusion that can develop even within a community of those who identify themselves as outsiders. But they both offer fascinating insights for anyone who cares urgently about the forms that cultural survival can take. (There are countless others, obviously, but these are the two that happen to have been most important to my own life.) I like to think of myself as a rational person, but I’ve recently begun to realize how much of my view of the world was based on wishful thinking, and I’m starting to confront the real possibility that it will continue to get worse for the rest of my life. This only raises the huge unresolved question of how to live under such circumstances, and I’m still trying to figure it out. And while I’m not the first to take refuge in the consolations of art—my favorite books, movies, and albums nearly all emerged from conditions of existential crisis—I feel obliged to point to one possible line of defense that was designed to be overlooked.

In my eyes, Tennant and Lowe’s music exemplifies a certain kind of courage that prefers to go unrecognized. Very marked the point at which those impulses were transmuted into something more liberating, and ever since, the subtext of their early songs has become text, perhaps because their audience now consists largely of the community in which Tennant was never quite sure he wanted to be a member. Some of these later albums are great, and hugely meaningful to me, but it’s the version from Please through Very that sticks with me the most, and which seems to have the most to say to us now. Wryness and understatement may not seem like weapons, but like Auto-Tune, they have their place, and they served their users well enough at a time not unlike our own. The sense of liberation expressed by Very strikes me now as premature, but not wrong. And I hope that I can hear it again one day.
My ten creative books #7: On Directing Film
Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.
When it comes to giving advice on something as inherently unteachable as writing, books on the subject tend to fall into one of three categories. The first treats the writing manual as an extension of the self-help genre, offering what amounts to an extended pep talk that is long on encouragement but short on specifics. A second, more useful approach is to consolidate material on a variety of potential strategies, either through the voices of multiple writers—as George Plimpton did so wonderfully in The Writer’s Chapbook, which assembles the best of the legendary interviews given to The Paris Review—or through the perspective of a writer and teacher, like John Gardner, generous enough to consider the full range of what the art of fiction can be. And the third, exemplified by David Mamet’s On Directing Film, is to lay out a single, highly prescriptive recipe for constructing stories. This last approach might seem unduly severe. Yet after a lifetime of reading what other writers have to say on the subject, Mamet’s little book is still the best I’ve ever found, not just for film, but for fiction and narrative nonfiction as well. On one level, it can serve as a starting point for your own thoughts about how the writing process should look: Mamet provides a strict, almost mathematical set of tools for building a plot from first principles, and even if you disagree with his methods, they clarify your thinking in a way that a more generalized treatment might not. But even if you just take it at face value, it’s still the closest thing I know to a foolproof formula for generating rock-solid first drafts. (If Mamet himself has a flaw as a director, it’s that he often stops there.) In fact, it’s so useful, so lucid, and so reliable that I sometimes feel reluctant to recommend it, as if I were giving away an industrial secret to my competitors.
Mamet’s principles are easy to grasp, but endlessly challenging to follow. You start by figuring out what every scene is about, mostly by asking one question: “What does the protagonist want?” You then divide each scene up into a sequence of beats, consisting of an immediate objective and a logical action that the protagonist takes to achieve it, ideally in a form that can be told in visual terms, without the need for expository dialogue. And you repeat the process until the protagonist succeeds or fails at his or her ultimate objective, at which point the story is over. This may sound straightforward, but as soon as you start forcing yourself to think this way consistently, you discover how tough it can be. Mamet’s book consists of a few simple examples, teased out in a series of discussions at a class he taught at Columbia, and it’s studded with insights that once heard are never forgotten: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” “Here is a tool—choose your shots, beats, scenes, objectives, and always refer to them by the names you chose.” “Keep it simple, stupid, and don’t violate those rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.” “The audience doesn’t want to read a sign; they want to watch a motion picture.” “A good writer gets better only by learning to cut, to remove the ornamental, the descriptive, the narrative, and especially the deeply felt and meaningful.” “Now, why did all those Olympic skaters fall down? The only answer I know is that they hadn’t practiced enough.” And my own personal favorite: “The nail doesn’t have to look like a house; it is not a house. It is a nail. If the house is going to stand, the nail must do the work of a nail. To do the work of the nail, it has to look like a nail.”
My ten creative books #6: The Art of Fiction
Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.
I bought The Art of Fiction by John Gardner nearly a quarter of a century ago, at a used bookstore in Half Moon Bay, California, shortly before starting my freshman year of high school. (On that same afternoon, I picked up a copy of Critical Path by R. Buckminster Fuller, and my family also somehow acquired our first cat, which suggests that my life would be significantly different if that one day were magically erased.) Since then, I’ve read it in pieces a dozen or more times—it’s one of the few books that I’ve brought wherever I’ve moved—and I still know much of it by heart. Writing guides tend to be either loftily aspirational or fixated on the nuts and bolts of craft, and Gardner’s brilliance is that he tackles both sides in a way that enriches the whole. He has plenty to say on sentence structure, vocabulary, rhythm, and point of view, and his illustrations of process are still the most vivid that I’ve ever seen:
The good writer treats each unit individually, developing them one by one. When he’s working on the description of Uncle Fyodor’s store, he does not think about the hold-up men who in a moment will enter it, though he keeps them in the back of his mind. He describes the store, patiently, making it come alive, infusing every smell with Uncle Fyodor’s emotion and personality (his fear of hold-up men, perhaps); he works on the store as if this were simply an exercise, writing as if he had all eternity to finish it, and when the description is perfect—and not too long or short in relation to its function in the story as a whole—he moves on to his story’s next unit.
Yet Gardner is equally concerned with warning young writers away from “faults of soul,” including frigidity, sentimentality, and mannerism, and with reminding them that their work must have interest and truth. Every element of writing, he notes, should be judged by its ability to sustain the fictional dream: the illusion, for the reader, that the events and characters described are really taking place. And everything I’ve written since then has been undertaken with his high standards in mind.
By now, I’ve internalized all of his advice, even if I don’t always follow it, and as a result, when I read his book again now, it’s less as a guide than as a novel in itself, with an archetypal writer—who shouldn’t be confused with Gardner himself—emerging as a character in his own right. For instance:
He begins to brood over what he’s written, reading it over and over, patiently, endlessly, letting his mind wander, sometimes to Picasso or the Great Pyramid, sometimes to the possible philosophical implications of Menelaos’ limp (a detail he introduced by impulse, because it seemed right). Reading in this strange way lines he has known by heart for weeks, he discovers odd tics his unconscious has sent up to him, perhaps curious accidental repetitions of imagery…Just as dreams have meaning, whether or not we can penetrate the meaning, the writer assumes that the accidents in his writing may have significance.
And his offhand observations about other writers have stuck in my head as well. Writing of a possible plot hole in Hamlet, for instance, Gardner offers a view of Shakespeare that I’ve never forgotten:
The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.
Of the countless books that I’ve read on writing, this is still the best, as well as the finest manual of the life of which Gardner writes elsewhere: “Novel-writing is not so much a profession as a yoga, or ‘way,’ an alternative to ordinary life-in-the-world…For those who are authentically called to the profession, spiritual profits are enough.”
The rock that flies
William H. Gass: There is a fundamental divergence about what literature is. I don’t want to subordinate beauty to truth and goodness. John and others have values which they think more important. Beauty, after all, is not very vital for most people. I think it is very important, in the cleanliness of the mind, to know why a particular thing is good. A lot of people judge, to use a crude example, the dinner good because of the amount of calories it has. Well, that is important if you don’t want to gain weight, but what has that got to do with the quality of the food? Moral judgments on art constantly confuse the quality of the food. I would also claim that my view is more catholic. It will allow in as good writers more than this other view will; John lets hardly anybody in the door.
John Gardner: I love Bill’s writing, and I honestly think that Bill is the only writer in America that I would let in the door. For twenty-four years I have been screaming at him, sometimes literally screaming at him, saying, “Bill, you are wasting the greatest genius ever given to America by fiddling around when you could be doing big, important things.” What he can do with language is magnificent, but then he turns it against itself. Our definitions of beauty are different. I think language exists to make a beautiful and powerful apparition. He thinks you can make pretty colored walls with it. That’s unfair. But what I think is beautiful, he would think is not yet sufficiently ornate. The difference is that my 707 will fly and his is too encrusted with gold to get off the ground.
Gass: There is always that danger. But what I really want is to have it sit there solid as a rock and have everybody think it is flying.
The fanfic disposition
Yesterday, I mentioned Roxane Gay’s insightful opinion piece on the proposed HBO series Confederate, which was headlined “I Don’t Want to Watch Slavery Fan Fiction.” I’m still sorting out my own feelings toward this show, an alternate history set in the present day in which the South won the Civil War, but I found myself agreeing with just about everything that Gay writes, particularly when she confesses to her own ambivalence:
As a writer, I never wish to put constraints upon creativity nor do I think anything is off limits to someone simply because of who they are. [Creators] Mr. Benioff and Mr. Weiss are indeed white and they have as much a right to create this reimagining of slavery as anyone. That’s what I’m supposed to say, but it is not at all how I feel.
And I was especially struck by Gay’s comparison of the show’s premise to fanfic. Her essay, which appeared in the New York Times, only uses the phrase “fan fiction” once, linking to a tweet from the critic Pilot Bacon, and while its use in reference to Confederate isn’t literally true—at least not if we define fanfic as a derivative work based on characters or ideas by another author—its connotations are clear. Fairly or not, it encapsulates the notion that David Benioff and D.B. Weiss are appropriating existing images and themes to further their own artistic interests.
Even if we table, for now, the question of whether the criticism is justified, it’s worth looking at the history of the word “fanfic” as a pejorative term. I’ve used it that way here myself, particularly in reference to works of art that amount to authorial wish fulfillment toward the characters, like the epilogue to Ann Patchett’s Bel Canto. (Looking back at my old posts, I see that I even once used it to describe a scene in one of my own novels.) Watching The Hobbit: The Battle of the Five Armies recently with my wife, I commented that certain scenes, like the big fight at Dol Guldur, felt like fanfic, except that Peter Jackson was somehow able to get Cate Blanchett, Ian McKellen, Hugo Weaving, and Christopher Lee to reprise all their old roles. And you often see such comparisons made by critics. Gavia Baker-Whitelaw devoted an entire article on The Daily Dot to the ways in which J.K. Rowling’s Harry Potter and the Cursed Child resembled a work of “badfic,” while Ian Crouch of The New Yorker tried to parse the difference between fanfic and such works as Jean Rhys’s Wide Sargasso Sea:
Fan fiction is surely not a new phenomenon, nor is it an uninteresting one, but it is different in kind and quality from a work like Rhys’s, or, to take a recent example, Cynthia Ozick’s remarkable new novel, Foreign Bodies, which reimagines the particulars of The Ambassadors, by Henry James. Not only do these books interpret texts in the public domain…but they do so with an admirable combination of respect and originality.
As a teenager, I wrote a lot of X-Files fanfic, mostly because I knew that it would give me a readily available audience for the kind of science fiction that I liked, and although I look back on that period in my life with enormous affection—I think about it almost every day—I’m also aware of the limitations that it imposed on my development as a writer. The trouble with fanfic is that it allows you to produce massive amounts of material while systematically avoiding the single hardest element of fiction: the creation of imaginary human beings capable of sustaining our interest and sympathy. It begins in an enviable position, with a cast of characters to which the reader is already emotionally attached. As a result, the writer can easily be left in a state of arrested development, with superb technical skills when it comes to writing about the inner life of existing characters, but little sense of how to do it from scratch. This even holds true when the writer is going back to characters that he or she originally created or realized onscreen. When J.K. Rowling revisits her most famous series or Peter Jackson gives us a fight scene with Elrond and the Ringwraiths, there’s an inescapable sense that all of the heavy lifting took place at an earlier stage. These artists are trading on the affection that we hold toward narrative decisions made years ago, instead of drawing us into the story in the moment. And even when the name on the title page or the director’s credit is the same, readers and viewers can sense when creators are indulging themselves, rather than following the logic of the underlying material.
This all means that fanfic, at its worst, is a code word for a kind of sentimentality, as John Gardner describes it in The Art of Fiction:
If the storyteller appeals to stock response (our love of God or country, our pity for the downtrodden, the presumed warm feelings all decent people have for children and small animals)…then the effect is sentimentality, and no reader who’s experienced the power of real fiction will be pleased by it.
Replace “children and small animals” with Harry Potter and Gandalf, and you have a concise description of how fanfic works, encouraging readers to plow through tens of thousands of words because of the hard work of imaginative empathy that someone else did long ago. When Gay and Bacon compare Confederate to fan fiction, I think that this is what they mean. It isn’t drawing on existing characters, but on a collection of ideas, images, and historical events that carry an overwhelming emotional charge before Benioff and Weiss have written a line. You could argue that countless works of art have done the same thing—the canonical work of Civil War fanfic has got to be Gone With the Wind—but if slavery seems somehow different now, it’s largely because of the timing, as Gay notes: “We do not make art in a vacuum isolated from sociopolitical context. We live in a starkly divided country with a president who is shamefully ill equipped to bridge that divide.” Benioff and Weiss spent years developing their premise, and when they began, they couldn’t have anticipated the environment in which their announcement would be received. I don’t want the project to be canceled, which would have a chilling effect throughout the industry, but they should act as if they’re going to be held to a higher standard. Because they will be.
Cruise and control
Over the last week, I’ve been listening to a long interview that the writer and director Christopher McQuarrie gave to The Empire Film Podcast after the release of Mission: Impossible—Rogue Nation. It’s over two and a half hours long and loaded with insight, but it also has a somewhat different tone when you come to it after the recent debacle of The Mummy. McQuarrie, predictably, has nothing but good words for Tom Cruise, whom he describes as the ultimate producer, with a hand in every aspect of the creative process. Now compare this to the postmortem in Variety:
In the case of The Mummy, one person—Cruise—had an excessive amount of control, according to several people interviewed. The reboot of The Mummy was supposed to be the start of a mega-franchise for Universal Pictures. But instead, it’s become a textbook case of a movie star run amok…Several sources close to the production say that Cruise exerted nearly complete creative oversight on The Mummy, essentially wearing all the hats and dictating even the smallest decisions on the set…Universal, according to sources familiar with the matter, contractually guaranteed Cruise control of most aspects of the project, from script approval to post-production decisions.
To put it another way, between Rogue Nation and The Mummy, absolutely nothing changed. On the one hand, Cruise’s perfectionist tendencies resulted in an excellent piece of work; on the other, they led to a movie that most critics agree is nearly unwatchable. This might seem like a paradox, but I’d prefer to see it as proof that this level of obsessiveness is required to make any movie whatsoever, regardless of the outcome. It may come from a producer or director rather than from the star, but in its absence, complicated projects just don’t get made at all. And the quality of the finished product is the result of factors that are out of even Tom Cruise’s control.
If you work in any creative field, you probably know this already, but the extent to which you’re willing to accept it is often determined by where your role falls in production. At one extreme, you have someone like the editor Walter Murch, who hangs a shiny brass “B” in his office. As Charles Koppelman writes in Behind the Seen:
Ask Walter about it, and he’ll tell you about aiming for a “B.” Work hard to get the best grade you can—in this world, a B is all that is humanly attainable. One can be happy with that. Getting an A? That depends on good timing and the whims of the gods—it’s beyond your control. If you start to think that the gods are smiling, they will take their revenge. Keep your blade sharp. Make as good a film as you know how. It’s an Eastern-oriented philosophy, as expressed by the American writer and philosopher, Ralph Waldo Emerson: “We aim above the mark to hit the mark.”
At the other extreme, you have the star, who has been groomed to attribute everything good in a movie to his or her irreplaceable presence. And it’s no accident that you find these two attitudes at opposite ends of the production process. The light that strikes the star’s face is captured on film that works its way down the chain to the editors, who have little choice but to be pragmatic: they can only work with the footage that they’ve been given, and while they have lots of good tricks for manipulating it, they’re ultimately the ones who deal with what remains after all the fond hopes that went into a film have collided with reality. They know exactly what they do and don’t have. And they’re aware that superhuman technical control doesn’t represent the high end of craft, but the bare minimum required to do useful work.
The screenwriter lies somewhere in the middle. In theory, he’s the one who gets paid to dream, and he isn’t constrained by any outside factors when he’s putting ideas down on the page. This isn’t quite how it works in practice, since there are plenty of externalities to consider, and a screenwriter is often asked to solve problems at every stage in production. And we should be a little skeptical of what they have to say.

Our understanding of cinematic craft is skewed by the fact that writers have traditionally been its most eloquent and entertaining expositors, which provides just one perspective on the making of a movie. One reason is that screenwriters need to be good with words, not just for the script, but for the pitch meeting, which is another sort of performance—and it encourages them to deliver a hard sell for the act of writing itself. Another is that screenwriters have often been critically denigrated in favor of directors, which obliges them to be exceptionally funny, insightful, and forceful when they’re defending the importance of what they do for a living. Finally, there’s a kind of cynicism about the idea of control, which makes it easier to talk about it afterward. No screenplay is ever shot or released as written, which means that screenwriters exist to have their visions betrayed. If you believe that movies are made up largely of the contingent factors that emerge during production, that’s how it should be. But it also leaves screenwriters in a strange place when it comes to questions of control. Terry Rossio says of formatting the script so that the page breaks come at the right spot: “If you find yourself with this sort of obsessive behavior—like coming up with inventive ways to cheat the page count!—then, I think, you’ve got the right kind of attitude to make it in Hollywood.” He’s clearly right. But it’s also the kind of meticulousness that will be seen by only a handful of insiders, before your ideas pass through the hands of a dozen other professionals on the way to taking an unrecognizable form onscreen.
This may be the real reason why the screenwriters who serve as public advocates for craft—William Goldman, Robert Towne, Tony Gilroy, McQuarrie, and a few others—are also the ones with reputations as fixers, coming in at the very end to work on “troubled” shoots, which, as I’ve argued before, describes nearly every studio movie ever. These writers may well be legitimately better than most of their peers at solving problems, or at least they’re perceived that way, which is why they get those assignments. (As McQuarrie recently said to John August, when asked about the use of writers’ rooms on franchises like Transformers: “I believe you can create all of the Transformers stuff you want. You can build out the whole universe…When the rubber hits the road, that’s all going to change. They’re going to call you. They’re going to call me.” And he’s probably correct.) They’re survivors, and they’ve inevitably got good war stories to share. But we’re also more likely to listen to writers whose contributions come at the end of the process, where their obsessiveness can have a visible impact. It allows them to take credit for what worked while implicitly washing their hands of what didn’t, and there’s an element of chance involved here, too: every screenwriter wants to be the last one hired on a movie, but where you end up in that queue has a lot to do with luck and timing. I still find McQuarrie impossible to resist, and I learn more about storytelling from listening to him for ten minutes than by doing anything else. I’ve been talking about his interview so much that my wife joked that it’s my new religion. Well, maybe it is. But given how little anyone can control, it’s closer to what John Gardner says about writing novels: it’s a yoga, a way of life in the world, rather than an end in itself. As McQuarrie himself says to Empire: “Never do anything to effect a result. Do something because you want to do it, or because you have to do it.” And he would know.
A most pitiful ambition
In Magic and Showmanship, which is one of my favorite books on storytelling of any kind, the magician and polymath Henning Nelms sets forth a principle that ought to be remembered by all artists:
An illusion is, by definition, untrue. In every field, we detect untruth by inconsistency. We recognize statements as false when they contradict themselves. An actor who does something which is not in keeping with his role falls out of character, and the spell of the play is broken. If a conjurer’s words and actions fail to match the powers he claims, he pricks the bubble of illusion; he may still entertain his audience with a trick, but he loses the magic of drama. Consistency is the key to conviction.
Nelms adds that consistency is also the key to entertainment, and that it achieves its greatest impact when all of its resources are directed toward the same goal. He continues:
Consistency implies a standard. We cannot merely be consistent; we must be consistent with something. In creating an illusion, our standard is the theme. Once you realize this, you will find that the theme provides a guide to every detail of your presentation. This is a tremendous asset. It answers many questions almost before you can ask them.
And Nelms concludes with a powerful rule: “Plan a routine as if every element of the theme—personalities, phenomena, purpose, and proof—were literally true.”
To some extent, this is simply a restatement of what John Gardner calls “the vivid and continuous fictional dream.” Any lapse or inconsistency will draw viewers or readers out of the performance, and it can be hard to get them back again. As Nelms puts it:
Although the “as if” rule is an inspiring guide, it is also a strict taskmaster. Consistency is essential to any suspension of disbelief. No conviction is so deep that it cannot be destroyed by a discrepancy in the presentation. On the contrary, the more profoundly the spectators are enthralled by a performance, the more likely they are to be jerked back to reality by anything which is not in harmony with the illusion.
Even more usefully, Nelms frames this rule as a courtesy to the magician himself, since it provides a source of information at times when we might otherwise be lost: “It not only helps us to make decisions, but suggests ideas.” He also helpfully observes that it can be more productive, on a creative level, to focus on eliminating discrepancies, rather than on heightening the elements that are already effective:
My whole procedure as a showman is based on a technique of hunting for faults and ruthlessly eliminating them…The good parts of a play or routine take care of themselves. If I see a way to improve them, I do so. But I never worry about them. Instead, I concentrate on spotting and correcting the flaws. These are the places that offer the greatest opportunities for improvement. Hence, they are also the places where time and effort devoted to improvement will produce the greatest results.
On a practical level, Nelms suggests that you write down an outline of the illusion as if it were literally true, and then see where you have to depart from this ideal for technical reasons—which is where you should concentrate your attention to minimize any obvious discrepancies. This all seems like common sense, and if writers and performers sometimes forget this, it’s because they get attached to inconsistencies that provide some other benefit in the short term. Nelms writes:
Many dramas have been ruined by actors who tried to enliven serious scenes by being funny. The spectators laughed at the comedy, but they were bored by the play. The same law holds true for conjuring: No matter how effective an inconsistent part may be, the damage that it does to the routine as a whole more than offsets whatever advantages it may have in itself.
He continues: “Directors and performers alike are so flattered by hearing an audience laugh or exclaim over some line or action that they blind themselves to the harm it does to the play or the illusion.” This tendency is as old as drama itself, as we see in Hamlet’s advice to the players, and it can have a troubling effect on the audience:
A discrepancy may escape conscious notice and still weaken conviction. The suspension of disbelief is a subconscious process. No one says to himself, “If I am to enjoy this performance to the full, I must accept it as true and close my mind to the fact that I know it to be false.” Spectators can be led to adopt this attitude, but they must do so without thinking—and without realizing that they have done anything of the kind.
Which brings us, unfortunately, to Donald Trump. If you’re a progressive who is convinced that the president is trying to put one over on the public, you also have to confront the fact that he isn’t especially good at it. Not only are the discrepancies glaring, but they occur with a clockwork regularity that would be funny if it weren’t so horrifying. After the Washington Post reported that Trump had disclosed classified information—remember that?—to the Russian foreign minister and ambassador, his national security adviser said: “I was in the room. It did not happen.” The next day, Trump tweeted that he “wanted to share” the facts with Russia, as he had “the absolute right to do.” After James Comey was fired, the White House issued a statement saying that Trump had acted on the advice of the Justice Department, which based its recommendation on Comey’s handling of the investigation into Hillary Clinton’s emails. Two days later, Trump contradicted both points in an interview with Lester Holt: “I was going to fire Comey. My decision…In fact, when I decided to just do it, I said to myself, I said: ‘You know, this Russia thing with Trump and Russia is a made-up story.’” And when his staff repeatedly asserted that the refugee order wasn’t a travel ban, only to have Trump insist that it was, it felt like a cutaway gag on Arrested Development. You’ll sometimes see arguments that Trump is a chess master, creating distractions like a magician utilizing the technique of misdirection, which strikes me as a weird form of liberal consolation. (It reminds me of what Cooder the carny says of being grifted by Homer Simpson: “Well, there’s no shame in bein’ beaten by the best.” When his son tries to point out that Homer didn’t seem very smart, Cooder interrupts angrily: “We were beaten by the best.”) But the real answer is close at hand. Let’s look at Hamlet’s speech again:
And let those that play your clowns speak no more than is set down for them, for there be of them that will themselves laugh, to set on some quantity of barren spectators to laugh too, though in the meantime some necessary question of the play be then to be considered. That’s villainous and shows a most pitiful ambition in the fool that uses it.
This may be the best thing ever written about the Trump administration. Trump has been trained for years to go for the easy laugh or the quick reaction from the crowd, and he’ll continue to do so, even as “necessary questions” need to be considered. He’s done pretty well with it so far. And he has a receptive audience that seems willing to tell itself exactly what Nelms thought was impossible: “If I am to enjoy this performance to the full, I must accept it as true and close my mind to the fact that I know it to be false.”
My ten great books #5: Couples
In his discussion of the aesthetic flaw of frigidity in The Art of Fiction, John Gardner says: “When a skillful writer writes a shallow, cynical, merely amusing book about extramarital affairs, he has wandered—with far more harmful effect—into the same unsavory bog.” There’s little doubt in my mind that he’s thinking of John Updike, of whom a very different author, Lawrence Block, states in Writing the Novel: “It’s probably safe to assume that John Updike wrote Couples out of comparable cupidity, but it’s hardly vintage Updike, and the author’s own detachment from it is evident throughout.” Given that this novel was based so closely on the writer’s personal life that it scandalized his circle of friends in Ipswich, it might seem hard to describe it as shallow, cynical, and detached—which doesn’t mean that it can’t be all of these things as well. Couples made Updike rich and famous, and it was clearly conceived as a mainstream novel, but this was less a question of trying to write a bestseller than of shaping it for the cultural position that he hoped it would attain. Updike had already been promised the cover of Time magazine before it came out, and, as he later recalled: “Then they read the book and discovered, I think, that, the higher up it went in the Time hierarchy, the less they liked it.” As Jonathan Franzen did with The Corrections, Updike seems to have known that his next effort was positioned to break through in a huge way, and he engineered it accordingly, casting his obsessions with sex and mortality into a form that would resonate with a wider audience. The back cover of my paperback copy calls it “an intellectual Peyton Place,” and I think that the quote must have pleased him.
I’ve always been fascinated by the moment in the late sixties and early seventies that made it possible for the conventions of modernist realism—particularly its attitudes toward sex—to be appropriated by bestselling writers. The early novels of Stephen King are a key text here, but so, in its way, is Couples, which shows the line of influence running in the other direction. In his determination to write a big book, Updike drew on the structural symmetries of popular fiction, and the result was his most richly organized novel of any kind. Like Mad Men, which takes place in the same era, it draws you in with its superficial pleasures and then invites you to go deeper, although many readers or viewers seem happy to stop at the surface. Gardner fretted about this possibility at length in On Moral Fiction:
[Updike is] a master of symbolic complexity, but one can’t tell his women apart in a book like Couples; his characters’ sexual preoccupations, mostly perverse, are too generously indulged; and the disparity between the surface and sub-surface of his novels is treacherous: to the naive reader (and most readers of popular bestsellers are likely to be naive), a novel like A Month of Sundays seems like a merry, bourgeois-pornographic book…while to the subtler reader, the novel may be wearily if not ambivalently satirical, a sophisticated attack on false religion…Since the irony—the presumably satiric purpose—is nowhere available on the surface…one cannot help feeling misgivings about Updike’s intent.
It’s certainly possible to read Couples, as I often do, purely for entertainment, or as a kind of gossipy cultural reportage. (No other novel tells us more about what it must have really been like to be a member of the upper middle class at the time of the Kennedy assassination.) Yet we’re also implicated by that choice. I own a copy of the first hardcover edition, which I bought, in a symbolic act that might have struck even Updike as a little too on the nose, on the morning of my wedding day. As it turns out, my life resembles it in a lot of the small ways but none of the big ones. But maybe that’s because Updike got there first.
The frigid juicemaker
By now, many of you have probably heard of the sad case of Juicero, the technology startup that developed the world’s most advanced juicer, which retails for hundreds of dollars, only to be rocked by a Bloomberg report that revealed that its juice packs could just as easily be squeezed by hand. At first glance, this seems like another cautionary tale of Silicon Valley design gone wrong, along the lines of the $1,500 toaster oven, but its lessons are slightly more profound. A few days ago, Ben Einstein, a general partner at the venture capital firm Bolt, conducted a teardown of the Juicero Press to figure out why it was so costly, and he came away impressed by its design and construction: his writeup is filled with such phrases as “beautifully molded,” “a complex assembly with great attention to detail,” “painstakingly textured,” and “incredibly complex and beautifully engineered.” At one point, Einstein marvels: “The number, size, complexity and accuracy of these parts is somewhat mind-blowing for a young hardware startup.” The trouble, he points out, is that the cost of such components makes the juicer far more expensive than most consumers are willing to pay, and it could have delivered comparable performance at a lower price by rethinking its design. A Juicero Press uniformly compresses the entire surface of the juice pack, requiring thousands of pounds of force, while a human hand gets much the same result simply by squeezing it unevenly. Einstein concludes:
I have to believe the engineers that built this product looked at other ways of pressing the juice, but if the primary mechanism could apply force in a more focused way it could easily save hundreds of dollars off the shelf price of the product.
As it stands, the engineers at Juicero evidently “went wild,” building a beautifully made and confoundingly expensive product in the hopes that a market for it would somehow materialize. It’s like a juicer designed by Damien Hirst. In a peculiar way, it makes for a refreshing contrast to the usual hardware startup horror story, in which a company’s plans to build the world’s greatest espresso machine run aground on the inconvenient realities of manufacturing and supply chain management. Juicero’s engineers obviously knew what they were doing, at least on a technical level, but their pursuit of great design for its own sake appears to have blinded them to more practical realities. The market for juicers isn’t the same as that for fine watches, and its buyers have different motivations. In the absence of feedback from customers, the engineers went ahead and built a juicer for themselves, loading it with features that even the most discerning of users would either never notice or wouldn’t feel like factoring into the purchase price. In real estate terms, they overimproved it. When my wife and I bought our house six years ago, our realtor warned us against overspending on renovations—you don’t want to invest so much in the property that, if you sell it, you’re forced to list it at a point that doesn’t make sense for your block. The Juicero’s lovingly machined parts and moldings are the equivalent of granite countertops and a master bathroom in a neighborhood where homeowners are more interested in paying a reasonable price for a short walk to the train.
There are two big takeaways here. One is that there’s no such thing as good design or engineering in isolation—you always have to keep the intended user in mind. The other is that aesthetic considerations or technical specifications aren’t sufficient guidelines in themselves, and that they have to be shaped by other constraints to be channeled in productive directions. Elsewhere, I’ve noted that Apple’s cult of thinness seems to be motivated by the search for quantifiable benchmarks that can drive innovation. Lowering the price of its products would be an even better goal, although it isn’t one that Apple seems inclined to pursue. Juicero, to its detriment, doesn’t appear to have been overly concerned with either factor. A juicer that sits on your kitchen counter doesn’t need to be particularly light, and there’s little incentive to pare down the ounces. There clearly wasn’t much of an effort to keep down the price. A third potential source of constraints, and probably the best of all, is careful attention to the consumer, which didn’t happen here, either. As Einstein notes:
Our usual advice to hardware founders is to focus on getting a product to market to test the core assumptions on actual target customers, and then iterate. Instead, Juicero spent $120 million over two years to build a complex supply chain and perfectly engineered product that is too expensive for their target demographic.
Imagine a world where Juicero raised only $10 million and built a product subject to significant constraints. Maybe the Press wouldn’t be so perfectly engineered but it might have fewer features and cost a fraction of the original $699…Suddenly Juicero is incredibly compelling as a product offering, at least to this consumer.
And you don’t need to look hard to find equivalents in other fields. A writer who endlessly revises the same manuscript without seeking comments from readers—or sending it to agents or publishers—is engaging in the same cycle of destructive behavior. In The Art of Fiction, John Gardner talks about artistic frigidity, which he defines as a moral failing that confuses side issues with what really matters. The symptoms are much the same in literature as they are in engineering: “It is sometimes frigidity that leads writers to tinker, more and more obsessively, with form.” Juicero suffered from a kind of technological frigidity, as does its obvious role model, Apple, which seems increasingly obsessed with aesthetic considerations that either have a minimal impact on the user experience or actively undermine it. (We saw this most recently with the Mac Pro, which had a striking cylindrical design that was hard to configure and suffered from heating issues. As engineering chief Craig Federighi admitted: “I think we designed ourselves into a bit of a thermal corner.” And it seems only fitting that Apple’s frigidity led to a problem with heat.) Ordinary companies, or writers, have no choice but to adjust to reality. Deadlines, length limits, and the demands of the market all work together to enforce pragmatic compromises, and if you remain frigid, you die. As the world’s largest tech company, Apple has to actively seek out constraints that will rein in its worst impulses, much as successful writers need to find ways of imposing the same restrictions that existed when they were struggling to break in. As Juicero’s example demonstrates, a company that tries to ignore such considerations from the beginning may never get a chance to prove itself at all. Whether you’re a writer or an engineer, it’s easy to convince yourself that you’re selling juicers, but you’re not. You’re selling the juice.
Quote of the Day
Most men, including men of genius, are not doctrinal. Most of humanity, including the wise, simply muddle through, suspending judgments, making tentative assertions, hopefully snatching what will serve for the moment, groping emotion by emotion toward the grave.
“I know who you are…”
Note: This post is the forty-seventh installment in my author’s commentary for Eternal Empire, covering Chapter 46. You can read the previous installments here.
Freedom in writing is a lot like its equivalent in everyday life. In theory, we’re all free artists of ourselves, as Harold Bloom describes the characters in Shakespeare’s plays, but in practice, we’re hemmed in by choices and decisions that we made years ago—or that other people or larger systems have made for us. Similarly, a novelist, who really does have access to limitless possibilities, inevitably ends up dealing with the kind of creative constriction that Joan Didion describes in an observation to The Paris Review that I never get tired of quoting:
What’s so hard about that first sentence is that you’re stuck with it. Everything else is going to flow out of that sentence. And by the time you’ve laid down the first two sentences, your options are all gone…I think of writing anything at all as a kind of high-wire act. The minute you start putting words on paper you’re eliminating possibilities. Unless you’re Henry James.
The difference between writing and life, of course, is that it’s easier for a writer to start over. No matter how much time or effort you’ve put into a project, you can always throw it away and begin again without anyone being the wiser, which is harder to pull off in the real world. Yet many writers stubbornly insist on sticking to what they’ve written, as if they didn’t have any alternative.
And the kicker is that they’re often perfectly right, at least where the first draft is concerned. Earlier this week, I talked about the finite amount of energy that a writer can allocate to the different stages of the creative process, and the strategies that he or she develops to conserve it. Discarding the material you’ve already written is a good way to sap the limited resources you have. More pragmatically, starting over usually amounts to never finishing the story at all: the most common reason that attempts at a novel peter out halfway through is that the author was unable to live with what he or she had written the day before. A crucial part of becoming a writer lies in learning how to plow ahead despite the shortcomings of the work you’ve done, which often means treating your existing pages as fixed quantities. If nothing else, it’s a helpful strategy for concentrating exclusively on the work at hand: during the first draft, it makes sense to think of what you’ve written so far as inviolate, because otherwise, you’ll be tempted to go back and tinker, when you should be more concerned with the pages you don’t have. Revision requires another mental shift, in which everything is on the table, no matter how much work you’ve invested in it. And that transition—which flips your approach to the rough draft completely on its head—is one that every writer has to master.
But there’s an even more interesting case to be made for preserving the elements you’ve already written, or at least for doing everything you can to work with them before giving up. In the past, I’ve spoken of writing as a collaboration between your past, present, and future selves: it’s too complicated for any one version of you to handle alone, so you set up ways of communicating across time with different incarnations of yourself, aided by good notes. Your existing pages are a message from the past to the present, and they deserve to be taken seriously, since there’s information there that might otherwise be lost. In The Art of Fiction, John Gardner says of a writer working on a story about Helen of Troy:
He begins to brood over what he’s written, reading it over and over, patiently, endlessly, letting his mind wander, sometimes to Picasso or the Great Pyramid, sometimes to the possible philosophical implications of Menelaos’ limp (a detail he introduced by impulse, because it seemed right). Reading in this strange way lines he has known by heart for weeks, he discovers odd tics his unconscious has sent up to him, perhaps curious accidental repetitions of imagery…Just as dreams have meaning, whether or not we can penetrate the meaning, the writer assumes that the accidents in his writing may have significance.
Every writer, I think, will recognize this—but it only works if you trust that your past self knew what he or she was doing. And you’re more likely to come up with useful ideas if you treat that material as irrevocable. Like any constraint, it’s only fruitful if it can’t easily be eluded.
There’s a small but telling example in Eternal Empire of how this works. Way back in Chapter 5, when Maddy visits Tarkovsky’s house for the first time, she sees Nina, the oligarch’s daughter, riding a horse on a polo field, dressed in jodhpurs and a bomber jacket. Nina looks stonily at Maddy for a long second, and then disappears. It provided a nice visual button for the scene, but in order for it to make sense, I had to introduce Nina later on as a character. And I had no idea what role she might play. I had her pop up now and then in the pages that followed, always seen from a distance, just to remind the reader and myself that she still existed. Finally, when I reached Chapter 46—months after writing that first image—I knew that I couldn’t postpone it any longer: from a structural perspective, coming just before the huge set pieces that conclude the novel, it was the last area of calm in which any interaction between Maddy and Nina could take place. I thought more than once about cutting the earlier beat, which only amounted to a few lines, and taking the daughter out of the story entirely. But when I forced the two of them to meet at last, I ended up with what still feels like an essential moment: Nina drops a clue about her father’s plans, as well as about the overall plot, and they feel a brief connection that I knew would pay off in the final act. Without the image of that girl on a horse, which I introduced, as Gardner says, “by impulse, because it seemed right,” none of this would have occurred to me. It took a long time for it to justify itself, but it did. And it wouldn’t have happened at all if I hadn’t acted as if I didn’t have a choice…
“He had played his part admirably…”
Note: This post is the forty-first installment in my author’s commentary for Eternal Empire, covering Chapter 40. You can read the previous installments here.
A few weeks ago, I briefly discussed the notorious scene in The Dark Knight Rises in which Bruce Wayne reappears—without any explanation whatsoever—in Gotham City. Bane’s henchmen, you might recall, have blown up all the bridges and sealed off the area to the military and law enforcement, and the entire plot hinges on the city’s absolute isolation. Bruce, in turn, has just escaped from a foreign prison, and although its location is left deliberately unspecified, it sure seems like it was in a different hemisphere. Yet what must have been a journey of thousands of miles and a daring incursion is handled in the space of a single cut: Bruce simply shows up, and there isn’t even a line of dialogue acknowledging how he got there. Not surprisingly, this hiatus has inspired a lot of discussion online, with most explanations boiling down to “He’s Batman.” If asked, Christopher Nolan might reply that the specifics don’t really matter, and that the viewer’s attention is properly focused elsewhere, a point that the writer John Gardner once made with reference to Hamlet:
We naturally ask how it is that, when shipped off to what is meant to be his death, the usually indecisive prince manages to hoist his enemies with their own petard—an event that takes place off stage and, at least in the surviving text, gets no real explanation. If pressed, Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to…
Gardner concludes: “The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.” And while this might seem to apply equally well to The Dark Knight Rises, it doesn’t really hold water. The absence of an explanation did yank many of us out of the movie, however briefly, and it took us a minute to settle back in. Any explanation at all would have been better than this, and it could have been conveyed in less than a sentence. It isn’t an issue of plausibility, but of narrative flow. You could say that Bruce’s return to the city ought to be omitted, in the same way a director like Kurosawa mercilessly cuts all transitional moments: when you just need to get a character from Point A to Point B, it’s best to trim the journey as much as you can. In this instance, however, Nolan erred too far on the side of omission, at least in the eyes of many viewers. And it’s a reminder that the rules of storytelling are all about context. You’ve got to judge each problem on its own terms and figure out the solution that makes the most sense in each case.
What’s really fascinating is how frequently Nolan himself seems to struggle with this issue. In terms of sheer technical proficiency, I’d rank him near the top of the list of all working directors, but if he has one flaw as a filmmaker, aside from his lack of humor, it’s his persistent difficulty in finding the right balance between action and exposition. Much of Inception, which is one of my ten favorite movies of all time, consists of the characters breathlessly explaining the plot to one another, and it more or less works. But he also spends much of Interstellar trying with mixed success to figure out how much to tell us about the science involved, leading to scenes like the one in which Dr. Romilly explains the wormhole to Cooper seemingly moments before they enter it. And Nolan is oddly prone to neglecting obligatory beats that the audience needs to assemble the story in their heads, as when Batman appears to abandon a room of innocent party guests to the Joker in The Dark Knight. You could say that such lapses simply reflect the complexity of the stories that Nolan wants to tell, and you might be right. But David Fincher, who is Nolan’s only peer among active directors, tells stories of comparable or greater complexity—indeed, they’re often about their own complexity—and we’re rarely lost or confused. And if I’m hard on Nolan about this, it’s only a reflection of how difficult such issues can be, when even the best mainstream director of his generation has trouble working out how much information the audience needs.
It all boils down to Thomas Pynchon’s arch aside in Gravity’s Rainbow: “You will want cause and effect. All right.” And knowing how much cause will yield the effect you need is a problem that every storyteller has to confront on a regular basis. Chapter 40 of Eternal Empire provides a good example. For the last hundred pages, the novel has been building toward the moment when Ilya sneaks onto the heavily guarded yacht at Yalta. There’s no question that he’s going to do it; otherwise, everything leading up to it would seem like a ridiculous tease. The mechanics of how he gets aboard don’t really matter, but I also couldn’t avoid the issue, or else readers would rightly object. All I needed was a solution that was reasonably plausible and that could be covered in a few pages. As it happens, the previous scene ends with this exchange between Maddy and Ilya: “But you can’t just expect to walk on board.” “That’s exactly what I intend to do.” When I typed those lines, I didn’t know what Ilya had in mind, but I knew at once that they pointed at the kind of simplicity that the story needed, at least at this point in the novel. (If it came later in the plot, as part of the climax, it might have been more elaborate.) So I came up with a short sequence in which Ilya impersonates a dockwalker looking for work on the yacht, cleverly ingratiates himself with the bosun, and slips below when Maddy provides a convenient distraction. It’s a cute scene—maybe a little too cute, in fact, for this particular novel. But it works exactly as well as it should. Ilya is on board. We get just enough cause and effect. And now we can move on to the really good stuff to come…
The disintegrating cube
Last week, a video made the rounds of a disastrous attempt to construct a 22×22 Rubik’s Cube. Its creator, who remains thankfully anonymous, states that he spent seven months designing the mechanism, printing out the pieces, and assembling it, and the last ninety minutes of the process were streamed live online. And when he finally finishes and tries to turn it for the first time—well, you can skip to the end. (I don’t think I’ll ever forget how he mutters “We are experiencing massive piece separation,” followed by a shocked silence and finally: “Nope. Nope.” And if you listen carefully, after he exits the frame, you can hear what sounds a lot like something being kicked offscreen.) After the video went viral, one commenter wrote: “This makes me feel better about the last seven months I’ve spent doing absolutely nothing.” Yet it’s hard not to see the fate of the cube as a metaphor for something more. Its creator says at one point that he was inspired to build it by a dream, and it’s actually the second of two attempts, the first of which ended in much the same way. And while I don’t feel any less sorry for him, there’s something to be said for a project that absorbs seven months of your life in challenging, methodical work, regardless of how it turned out. Entropy always wins out in the end, if not always so dramatically. The pleasure that a finished cube affords is minimal compared to the effort it took to make it, and there’s something about its sudden disintegration that strikes me as weirdly ennobling, like a sand painting swept away immediately after its completion.
I happened to watch the video at a time when I was particularly prone to such reflections, because I quietly passed a milestone this weekend: five years ago, I launched this blog, and I’ve posted something every day ever since. If you had told me this back when I began, I probably wouldn’t have believed you, and if anything, it might have dissuaded me from starting. By the most conservative estimate, I’ve posted over a million words, which doesn’t even count close to two thousand quotes of the day. The time I’ve invested here—well over an hour every morning, including weekends—probably could have been spent on something more productive, but I have a hard time imagining what that might have been. It’s not like I haven’t been busy: the five years that coincided with the lifespan of this blog saw me produce a lot of other writing, published and otherwise, as well as my first daughter, and I don’t feel that I neglected any of it. (There does, in fact, seem to be a limit to how much time you can spend writing each day without burning out, and once you’ve hit those four to six hours, you don’t gain much by adding more.) Rather than taking up valuable time that would have been occupied by something else, this blog created an hour of productivity that wasn’t there before. It was carved out of each day from the minutes that I would otherwise have frittered away, just as a few dollars squeezed out of a paycheck and properly invested can lead to a comfortable retirement.
Of course, the trouble with that analogy is that the work has to be its own justification. I’m very happy with this blog and its reception, but if I were giving one piece of advice to someone starting out for the first time, it would be to caution against seeing a blog as being good for anything except for itself. It isn’t something you can reasonably expect to monetize or to drive attention to your other projects. And if I had to explain my reasons for devoting so much time to it on such a regular basis, I’d have trouble coming up with a response. There’s no question that it prompted me to think harder and read more deeply about certain subjects, to cast about broadly for quotes and topics, and to refine the bag of tricks I had for generating ideas on demand. Like any daily ritual, it became a form of discipline. If writing, as John Gardner says, is ultimately a yoga, or a way of life in the world, this blog became the equivalent of my morning devotions. My energies were primarily directed to other kinds of work, often frustratingly undefined, and some of which may never see the light of day. The blog became a kind of consolation on mornings when I struggled elsewhere: a clean, discrete unit of prose that I could publish on my own schedule and on my own terms. I could build it, piece by piece, like a cathedral of toothpicks—or a massive Rubik’s Cube. And even if it fell apart in the end, as all blogs inevitably must, the time I spent on it would have been a worthwhile pursuit for its own sake.
I realize that this sounds a little like a valedictory post, so I should make it clear that I don’t plan on stopping anytime soon. Still, the odds are that this blog is closer to its end than to its beginning. When I started out, my resolve to post every day was a kind of preemptive resistance against the fate of so many other blogs, which cling to life for a few months or years before being abandoned. I didn’t want it to succumb to half measures, so, as with most things in life, I overdid it. Whether or not the result will be of lasting interest seems beside the point: you could say much the same of any writing at all, whether or not it appears between book covers. (And in fact, my quick post on George R.R. Martin and WordStar seems likely to be the single most widely read thing I’ll ever write in my life.) The only real measure of any project’s value—and I include my novels and short stories in this category—is whether it brought me pleasure in the moment, or, to put it another way, whether it allowed me to spend my time in the manner I thought best. For this blog, the answer is emphatically yes, as long as I keep that Rubik’s Cube in mind, looking forward with equanimity to the day that it all seems to disintegrate. It’s no different from anything else; it’s just more obvious. And its value comes from the act of construction. As the scientist Wayne Batteau once said of the three laws of thermodynamics: “You can’t win, you can’t break even, and you can’t get out of the game.” Or, as the critic David Thomson puts it in the final line of Rosebud, his biography of Orson Welles: “One has to do something.”
My great books #9: On Directing Film
Note: I’m counting down my ten favorite works of nonfiction, in order of the publication dates of their first editions, and with an emphasis on books that deserve a wider readership. You can find the earlier installments here.
When it comes to giving advice on something as inherently unteachable as writing, books on the subject tend to fall into one of three categories. The first treats the writing manual as an extension of the self-help genre, offering what amounts to an extended pep talk that is long on encouragement but short on specifics. A second, more useful approach is to consolidate material on a variety of potential strategies, either through the voices of multiple writers—as George Plimpton did so wonderfully in The Writer’s Chapbook, which assembles the best of the legendary interviews given to The Paris Review—or through the perspective of a writer and teacher, like John Gardner, generous enough to consider the full range of what the art of fiction can be. And the third, exemplified by David Mamet’s On Directing Film, is to lay out a single, highly prescriptive recipe for constructing stories. This last approach might seem unduly severe. Yet after a lifetime of reading what other writers have to say on the subject, Mamet’s little book is still the best I’ve ever found, not just for film, but for fiction and narrative nonfiction as well. On one level, it can serve as a starting point for your own thoughts about how the writing process should look: Mamet provides a strict, almost mathematical set of tools for building a plot from first principles, and even if you disagree with his methods, they clarify your thinking in a way that a more generalized treatment might not. But even if you just take it at face value, it’s still the closest thing I know to a foolproof formula for generating rock-solid first drafts. (If Mamet himself has a flaw as a director, it’s that he often stops there.) In fact, it’s so useful, so lucid, and so reliable that I sometimes feel reluctant to recommend it, as if I were giving away an industrial secret to my competitors.
Mamet’s principles are easy to grasp, but endlessly challenging to follow. You start by figuring out what every scene is about, mostly by asking one question: “What does the protagonist want?” You then divide each scene up into a sequence of beats, consisting of an immediate objective and a logical action that the protagonist takes to achieve it, ideally in a form that can be told in visual terms, without the need for expository dialogue. And you repeat the process until the protagonist succeeds or fails at his or her ultimate objective, at which point the story is over. This may sound straightforward, but as soon as you start forcing yourself to think this way consistently, you discover how tough it can be. Mamet’s book consists of a few simple examples, teased out in a series of discussions at a class he taught at Columbia, and it’s studded with insights that once heard are never forgotten: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” “Here is a tool—choose your shots, beats, scenes, objectives, and always refer to them by the names you chose.” “Keep it simple, stupid, and don’t violate those rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.” “The audience doesn’t want to read a sign; they want to watch a motion picture.” “A good writer gets better only by learning to cut, to remove the ornamental, the descriptive, the narrative, and especially the deeply felt and meaningful.” “Now, why did all those Olympic skaters fall down? The only answer I know is that they hadn’t practiced enough.” And my own personal favorite: “The nail doesn’t have to look like a house; it is not a house. It is a nail. If the house is going to stand, the nail must do the work of a nail. To do the work of the nail, it has to look like a nail.”
Surviving the German forest
Recently, I was leafing through Jessica Abel’s Out on the Wire, an updated and expanded version of her classic illustrated guide to radio, when I came across the following story from Radiolab host Jad Abumrad:
The station manager came to me and he said, “Hey, do you want to do an hour on Wagner’s Ring Cycle?” Had I done five minutes of research, I would’ve realized that Wagner’s Ring Cycle is an eighteen-hour cycle of operas that tries to encompass the totality of European art in one work. You got imagery, you got music, you got psychology, it was supposed to be “the work of art that ended art.” I could’ve found this out in thirty seconds, but I didn’t, and so I thought to myself: “Wagner, Wagner, Wagner, I don’t know much about Wagner. But, uh, sure, okay, Wagner, why not.”
Fast-forward a couple months, I had missed four deadlines, I’m on the verge of getting fired, and I haven’t slept for four days. I had the pressure of ideas that I just couldn’t reach, I had the pressure of being a newbie and talking to people who were very sophisticated. And I had the pressure of deadlines that were going “splat!” left, right, and center.
Abumrad concludes: “And we at Radiolab have given this state a name, because it happens quite often. We call it ‘the German forest.’” And it’s a place, I think, where most storytellers find themselves sooner or later. When you begin a project of any size, whether it’s a long essay or a short story or an entire novel, you can feel overwhelmed by the amount of material you have to cover, and one of the hardest parts of the process is translating the inchoate mass of ideas in your head into something that can be consumed in a sequential form. Abumrad doesn’t minimize the difficulties involved, but he notes that wandering through that forest is an essential stage in any creative endeavor:
When I heard the Wagner thing on the radio later, I was like, “Whoa, somewhere in the middle of that trauma, I think I found my voice.” There’s a real correlation between time spent in the German forest and these moments of emergence. And to be clear, the German forest changes. That sense of, the work is just too big to put my head around this, how am I gonna do this, that never changes. But what does change is that the terror gets reframed for you, because now, you’ve made it out a few times. You can see over the treetops, and into the future, to where, there you are, you’re still there, you’re still alive.
What interests me about this the most, though, is that Abumrad—a MacArthur fellow and a very smart guy—is working in a form that has laid down strict rules for managing its material. As I’ve noted elsewhere, because radio poses such unique challenges, it has to be particularly ruthless about sustaining the listener’s attention. In the previous edition of Abel’s book, Ira Glass lays out the formula in a quote that I never tire of repeating:
This is the structure of every story on our program—there’s an anecdote, that is, a sequence of actions where someone says “this happened then this happened then this happened”—and then there’s a moment of reflection about what that sequence means, and then on to the next sequence of actions…Anecdote then reflection, over and over.
Glass frames this structure as a courtesy to the listener, but, more subtly, it’s also there for the sake of the storyteller. It isn’t a map of the forest, exactly, but a compass, or, even better, a set of rules for orienting yourself, and the tricks that survive are the ones that provide value both during the writing process and in the act of reading or listening. You can think of the rules of storytelling as a staircase with the author at one end and the audience at the other, allowing them to meet in the middle. Their primary purpose is to ensure that a project can be brought to completion, but they also allow the finished product to serve its intended purpose, just as the rules of architecture are both a strategy for building a house that won’t fall down halfway through and a blueprint for livable spaces.
And this is a particularly useful way to think about all “rules” of writing or storytelling, particularly plot and structure. Kurt Vonnegut says: “I don’t praise plots as accurate representations of life, but as ways to keep readers reading.” And, he might have added, of keeping writers writing. Similarly, in The Art of Fiction, John Gardner notes that one of the hardest lessons for a writer to learn is how to treat each unit on its own terms:
The good writer treats each unit individually, developing them one by one. When he’s working on the description of Uncle Fyodor’s store, he does not think about the hold-up men who in a moment will enter it, though he keeps them in the back of his mind. He describes the store, patiently, making it come alive, infusing every smell with Uncle Fyodor’s emotion and personality (his fear of hold-up men, perhaps); he works on the store as if this were simply an exercise, writing as if he had all eternity to finish it, and when the description is perfect—and not too long or short in relation to its function in the story as a whole—he moves on to his story’s next unit.
You write a story, as David Mamet likes to say, the same way you write a turkey: one bite at a time. And a few seconds of thought reveal that both the writer and the reader benefit from that approach. You find your way through the forest step by step, just as the reader or listener will, and if you’re lucky, you’ll come to the same conclusion that Abumrad does: “You begin to recognize the German forest for what it is. It’s actually a tool. It’s the place you have to go to hear the next version of yourself.”
Quote of the Day
Sooner or later the writer has no choice but to figure out what he’s doing.
A brand apart
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What individual instances of product placement in movies and television have you found most effective?”
One of the small but consistently troublesome issues that every writer faces is what to do about brand names. We’re surrounded by brands wherever we look, and we casually think and talk about them all the time. In fiction, though, the mention of a specific brand often causes a slight blip in the narrative: we find ourselves asking if the character in question would really be using that product, or why the author introduced it at all, and if it isn’t handled well, it can take us out of the story. Which isn’t to say that such references don’t have their uses. John Gardner puts it well in The Art of Fiction:
The writer, if it suits him, should also know and occasionally use brand names, since they help to characterize. The people who drive Toyotas are not the same people who drive BMWs, and people who brush with Crest are different from those who use Pepsodent or, on the other hand, one of the health-food brands made of eggplant. (In super-realist fiction, brand names are more important than the characters they describe.)
And sometimes the clever deployment of brands can be another weapon in the writer’s arsenal, although it usually only works when the author already possesses a formidable descriptive vocabulary. Nicholson Baker is a master of this, and it doesn’t get any better than Updike in Rabbit Is Rich:
In the bathroom Harry sees that Ronnie uses shaving cream, Gillette Foamy, out of a pressure can, the kind that’s eating up the ozone so our children will fry. And that new kind of razor with the narrow single-edge blade that snaps in and out with a click on the television commercials. Harry can’t see the point, it’s just more waste, he still uses a rusty old two-edge safety razor he bought for $1.99 about seven years ago, and lathers himself with an old imitation badger-bristle on whatever bar of soap is handy…
For the rest of us, though, I’d say that brand names are one of those places where fiction has to retreat slightly from reality in order to preserve the illusion. Just as dialogue in fiction tends to be more direct and concise than it would be in real life, characters should probably refer to specific brands a little less often than they really would. (This is particularly true when it comes to rapidly changing technology, which can date a story immediately.)
In movies and television, a prominently featured brand sets off a different train of thought: we stop paying attention to the story and wonder if we’re looking at deliberate product placement—if there’s even any question at all. Even a show as densely packed as The Vampire Diaries regularly takes a minute to serve up a commercial for the likes of AT&T MiFi, and shows like Community have turned paid brand integration into entire self-mocking subplots, while still accepting the sponsor’s money, which feels like a textbook example of having it both ways. Tony Pace of Subway explains their strategy in simple terms: “We are kind of looking to be an invited guest with a speaking role.” Which is exactly what happened on Community—and since it was reasonably funny, and it allowed the show to skate along for another couple of episodes, I didn’t really care. When it’s handled poorly, though, this ironic, winking form of product placement can be even more grating than the conventional kind. It flatters us into thinking that we’re all in on the joke, although it isn’t hard to imagine cases where corporate sponsorship, embedded so deeply into a show’s fabric, wouldn’t be so cute and innocuous. Even under the best of circumstances, it’s a fake version of irreverence, done on a company’s terms. And if there’s a joke here, it’s probably on us.
Paid or not, product placement works, at least on me, although often in peculiar forms. I drank Heineken for years because of Blue Velvet, and looking around my house, I see all kinds of products or items that I bought to recapture a moment from pop culture, whether it’s the Pantone mug that reminds me of a Magnetic Fields song or the Spyderco knife that carries the Hannibal seal of approval. (I’ve complained elsewhere about the use of snobbish brand names in Thomas Harris, but it’s a beautiful little object, even if I don’t expect to use it exactly as Lecter does.) If it’s kept within bounds, it’s a mostly harmless way of establishing a connection between us and something we love, but it always ends up feeling a little empty. Which may be why brand names sit so uncomfortably in fiction. Brands or corporations use many of the same strategies as art to generate an emotional response, except the former is constantly on message, unambiguous, and designed to further a specific end. It’s no accident that there are so many affinities between advertising and propaganda. A good work of art, by contrast, is ambiguous, open to multiple interpretations, and asks nothing of us aside from an investment of time—which is the opposite of what a brand wants. Fiction and brands are always going to live together, either because they’ve been paid to do so or because it’s an accurate reflection of our world. But we’re more than just consumers. And art, at its best, should remind us of this.
Exorcising the ghosts
Over the weekend, The New York Times Style Magazine ran a fascinating series of short pieces by writers confronting their own early work. (The occasion for the feature is an auction being held at Christie’s next month by PEN American Center, in which seventy-five first editions with annotations by their authors will go up for sale. If I could get just one, it would be David Simon’s copy of Homicide.) The reflections here are full of intriguing insights, one of which I quoted here on Sunday. There’s Philip Roth’s description of the analytic session in Portnoy’s Complaint as “an appropriate vessel” for the kind of uncensored, frequently repellent story he wanted to write—a nice reminder of how a novel’s most distinctive qualities often represent a solution to particular narrative problems. I also liked George Saunders’s account of revisiting his first collection of short stories, which is full of “ghost-phrases” that he was positive were there, but must have been cut along the way. The version of a story that a writer carries in his or her head is an amalgam of variations, with each draft superimposed over the one before, and it sometimes bears little resemblance to what finally ended up in print.
But the comment that stuck with me the most was from Lydia Davis, who writes tightly compressed, elliptical short stories, some of them only a paragraph long. (I’ve only read a few of them, but they’re extraordinary—worthy contributions to a tradition of parables that goes back through Borges and Kafka. Of all contemporary writers whose work I feel I need to study more closely, Davis is near the top, largely because her virtues are so different from mine.) Appropriately enough, her contribution isn’t much longer than most of the stories that inspired it, but it’s been rattling around in my head ever since:
I read a story through again and again, whether it’s a long story or a short one (or a very very short one). If anything bothers me, even very subtly, I reread it many times, consider alternatives, put the story away for a while, read it again. I don’t consider a story finished until nothing bothers me anymore—though there are a few stories that never completely satisfied me but that I felt were good enough to go out in the world as they were. I simply couldn’t think what more I could do to them.
And the line that really gets me is “until nothing bothers me anymore.” On some level, that’s the only standard to which writers ought to hold themselves, as John Gardner notes in The Art of Fiction: “When the amateur writer lets a bad sentence stand in his final draft, though he knows it’s bad, the sin is frigidity.” The trouble, of course, is that revising a story is like trying to catch a trout with your bare hands. Whenever you think you’ve got a grip on it, it slips through, and one change can set off a series of little crises elsewhere in the draft. To switch to another metaphor, it’s like the horseshoe nail that lost the kingdom: revising a word in a sentence can change the rhythm, which throws off the paragraph, and suddenly the entire chapter—or the whole novel—needs to be rethought. And I’m only slightly exaggerating. At the moment, I’m nearing the end of a significant rewrite of my current novel, with a long list of changes big and small, and although most live on the level of the sentence or paragraph, I won’t know how they really play until I sit down tonight and read the whole thing straight through. That read, in turn, will suggest additional changes, meaning that the novel has to be read yet again, and so on and so forth until I collide with my deadline on Friday.
Ideally, each round of changes will be less extensive than the one before, gradually converging, like a function approaching its limit, at the story’s ideal form, or at least something close enough. This seems to be what Davis is describing, and it’s clear that her stories demand nothing less: they’re so condensed and intense, like poetry, that a single wrong word would tear them apart. The problem is that even as the story nears its perfect shape, if it even exists, the author is changing in the meantime: the standards you had when you started may not be the ones you have now, after you’ve been shaped by the work itself. Much of writing consists of managing that threefold relationship between the story, your original intentions, and whatever you’re feeling today. When the process doesn’t go perfectly, which is to say most of the time, you end up with the ghost-phrases that Saunders describes, a mismatch between the story in your head and its published form. Davis seems determined to exorcise those ghosts, and by her own account, she usually succeeds. She wouldn’t be here if she didn’t. And if the rest of us are still haunted by our ghost-phrases, well, we can take heart in the words of Jez Butterworth, who notes that a matter of milliseconds can make the difference between nearly and really—even if the process can start to feel a little like Butterworth’s own script for Edge of Tomorrow. You try, fail, and repeat.
The Sea Captain syndrome
The other day, after recounting the famous story that John Gardner tells about writer’s block in On Becoming a Novelist, I suggested that Gardner’s inability to figure out a small point of his story was really a reflection of deeper uncertainties. He sensed intuitively that he didn’t know the narrative or his characters well enough to move forward, so his mind seized on a tiny, seemingly trivial detail—the question of whether a certain woman would accept an hors d’oeuvre at a party—as a way of stalling the process, thus buying himself an extra week for unconscious reflection. The hors d’oeuvre didn’t matter in itself; it was only the excuse he needed for a necessary break. And this strikes me as being more generally true of writer’s block itself. I’ve talked about writer’s block here before, noting that the best way of dealing with it is by establishing a routine that fools the creative faculty into thinking that something useful is taking place, even if it isn’t. But it may be more accurate to think of writer’s block less as an impersonal scourge than as a condition tied inextricably to the conditions of writing itself, just as an illness can emerge from a breakdown in the body’s homeostasis.
What I’m proposing is that there are two opposing forces that play a role in any creative artist’s life: the urge to produce and the urge to postpone. Both sides are essential, and at their best, they work together. If we didn’t feel driven to get something down on paper, even on our worst days, we wouldn’t do much of anything at all: half of writing consists of meeting quotas or cranking out words when we’d rather be doing anything else. Left to itself, though, that inclination can lead to shoddy work, or, worse, a kind of deception that the writer imposes on both the reader and himself, as fake insight or emotion stands in for the real thing. Hence the importance of postponement—the ability to know when to pull back, or to wait for the second good idea. It’s a principle that governs everything from Walter Murch’s admonition that an artist should leave “a residue of unresolved problems for the next stage” to David Weinberger’s simpler motto “Include and postpone.” David Mamet notes somewhere that the first thing that occurs to the writer is often the first thing that occurs to the audience, too, so an author needs an internal mechanism in place that prevents him from going with a convenient idea simply because it exists.
Under ideal circumstances, these two impulses exist in harmony, pushing against each other so that the writer oscillates between extremes of productivity and idleness. Average them out, and you’ve got a decent writing life. If either tendency starts to take control, however, it can cause real problems. We all know how it feels when the urge to postpone consumes everything else: we spend more time on research, or we suddenly feel the urgent need to reply to a few old emails, and it can leave us paralyzed with inactivity. Yet the urge to produce can be even more dangerous, precisely because it’s so seductive. I’m a pretty good writer; I’ve trained myself to crank out five hundred words in an hour on just about any subject, and I don’t lack for ideas for long. But when I look back at some of my old work, I can see that this kind of facility can be a trap in itself. Whenever I get notes on a draft, for instance, I immediately come up with five different ways of addressing each problem, but just because the answers come easily doesn’t mean they’re correct. And there are times when I’ve realized, in retrospect, that I would have been better off rejecting the first ideas that presented themselves and waiting for something better to come along.
That’s the greatest danger of writer’s block: it’s so painful that we’ll do anything we can to avoid it, even if it means falling into the opposite extreme. I sometimes think of it as Sea Captain syndrome, named after an exchange involving Captain McCallister on The Simpsons, as he presents a proposal to Mr. Burns:
McCallister: “I’ll need three ships and fifty stout men. We’ll sail ’round the Horn and return with spices and silk the likes of which ye have never seen.”
Mr. Burns (angrily): “We’re building a casino!”
McCallister: “Arrr…Can you give me five minutes?”
I’ve spent much of my writing life coming up with five-minute solutions to problems that really should have taken five days—or five weeks—to solve, and it’s been a liability as much as a strength. The healthier approach, which I’m still trying to master, is to regard productivity and postponement as complementary states, the warp and woof out of which the writing life is made. The former feels a lot better than the latter when you’re in the middle of it, but like all artificial highs, you pay for it in the end. Better, perhaps, to see writer’s block, rightly, as a necessary condition to creativity, even if it leaves us saying, as the Sea Captain does elsewhere: “Yarr…I don’t know what I’m doing.”
“If she was going to run, it had to be now…”
leave a comment »
Note: This post is the fifty-sixth installment in my author’s commentary for Eternal Empire, covering Chapter 55. You can read the previous installments here.
In general, an author should try to write active protagonists in fiction, for much the same reason that it’s best to use the active voice, rather than the passive, whenever you can. It isn’t invariably the right choice, but it’s better often enough that it makes sense to use it when you’re in doubt—which, when you’re writing a story, is frankly most of the time. In The Elements of Style, Strunk and White list the reasons why the active voice is usually superior: it’s more vigorous and direct, it renders the writing livelier and more emphatic, and it often makes the sentence shorter. It’s a form of insurance that guards against some of the vices to which writers, even experienced ones, are prone to succumbing. There are few stories that wouldn’t benefit from an infusion of force, and since our artistic calculations are always imprecise, a shrewd writer will do what he or she can to err on the side of boldness. This doesn’t mean that the passive voice doesn’t have a place, but John Gardner’s advice in The Art of Fiction, as usual, is on point.
And most of the same arguments apply to active characters. All else being equal, an active hero or villain is more engaging than a passive victim of circumstance, and when you’re figuring out a plot, it’s prudent to construct the events whenever possible so that they emerge from the protagonist’s actions. (Or, even better, to come up with an active, compelling central character and figure out what he or she would logically do next.) This is the secret goal behind the model of storytelling, as expounded most usefully by David Mamet in On Directing Film, that conceives of a plot as a series of objectives, each one paired with a concrete action. It’s designed to maintain narrative clarity, but it also results in characters who want things and who take active measures to attain them. When I follow the slightly mechanical approach of laying out the objectives and actions of a scene, one beat after another, it gives the story a crucial backbone, but it also usually leads to the creation of an interesting character, almost by accident. If nothing else, it forces me to think a little harder, and it ensures that the building blocks of the story itself—which are analogous, but not identical, to the sentences that compose it—are written in the narrative equivalent of the active voice. And just as the active voice is generally preferable to the passive voice, in the absence of any other information, it’s advisable to focus on the active side when you aren’t sure what kind of story you’re writing: in the majority of cases, it’s simply more effective.
Of course, there are times when passivity is an important part of the story, just as the passive voice can be occasionally necessary to convey the ideas that the writer wants to express. The world is full of active and passive personalities, and of people who don’t have control over important aspects of their lives, and there’s a sense in which plots—or genres as a whole—that are built around action leave meaningful stories untold. This is true of the movies as well, as David Thomson memorably observes:
One of the central goals of modernist realism has been to give a voice to characters who would otherwise go unheard, precisely because of their lack of conventional agency. And it’s a problem that comes up even in suspense: a plot often hinges on a character’s lack of power, less as a matter of existential helplessness than because of a confrontation with a formidable antagonist. (A conspiracy novel is essentially about that powerlessness, and it emerged as a subgenre largely as a way to allow suspense to deal with these issues.)
So how do you tell a story, or even write a scene, in which the protagonist is powerless? A good hint comes from Kurt Vonnegut, who wrote: "I don't praise plots as accurate representations of life, but as ways to keep readers reading. When I used to teach creative writing, I would tell the students to make their characters want something right away—even if it's only a glass of water. Characters paralyzed by the meaninglessness of modern life still have to drink water from time to time." This draws a useful distinction, I think, between the two functions of the active mode: as a reflection of reality and as a tool to structure the reader's experience. You can use it in the latter sense even in stories or scenes in which helplessness is the whole point, just as you can use the active voice to increase the impact of prose that is basically static or abstract. In Chapter 55 of Eternal Empire, for example, Maddy finds herself in as vulnerable a position as can be imagined: she's in the passenger seat of a car being driven by a woman who, she's just realized, is her mortal enemy. There isn't much she can plausibly do to defend herself, but to keep her from becoming entirely passive, I gave her a short list of actions to perform: she checks her pockets for potential weapons, unlocks the door on her side as quietly as she can, and looks through the windshield to get a sense of their location. Most crucially, at the moment when it might be possible to run, she decides to stay where she is. The effect is subtle, but real. Maddy isn't in control of her situation, but she's in control of herself, and I think that the reader senses this. And it's in scenes like this, when the action is at a minimum, that the active mode really pays off…
Written by nevalalee
June 23, 2016 at 8:54 am
Posted in Books, Writing
Tagged with David Mamet, David Thomson, E.B. White, Eternal Empire commentary, John Gardner, On Directing Film, Strunk and White, The Art of Fiction, The Elements of Style, William Strunk Jr.