Writers are generally advised not to repeat themselves. After I’ve finished the rough draft of a story, one of my first orders of business is to go back through the manuscript and fix any passages where I’ve inadvertently repeated the same word in the same sentence, or within a short run of text. Knowing how often you can use a word is a matter of taste and intuition. Some words are so common as to be invisible to the reader, so you can, and should, use the word “said” almost exclusively for dialogue tags, even as the dialogue itself can be varied in other ways. Other words or phrases are so striking that they can’t be used more than once or twice in the course of an entire novel, and I’ll sometimes catch myself maintaining a running count of how often I’ve used a word like “unaccountable.” Then there are the words that fall somewhere in the middle, where they’re useful enough to crop up on a regular basis but catch the reader’s eye to an extent that they shouldn’t be overused. Different writers fall back on different sets of words, and in my case, they tend to be verbs of cognition, like “realized,” or a handful of adverbs that I use entirely too often, like, well, “entirely.”
Whenever I’m sifting through the story like this, part of me wonders whether a reader would even notice. Some of these repetitions jar my ear to a greater extent than they would for someone reading the story more casually: I’ve often revisited these pages something like fifty times, and I’m acutely aware of the shape of each sentence. (Overfamiliarity can have its pitfalls as well, of course: I’m sometimes shocked to discover a glaring repetition in a sentence that I’ve read over and over until I can no longer really see it.) But I encounter this issue often enough in other authors’ books that I know it isn’t just me. Catching an inadvertent repetition in a novel, as when Cormac McCarthy speaks twice in Blood Meridian of something being “footed” to its reflection, has the same effect as an unintentional rhyme: it pulls you momentarily out of the story, wondering if the writer meant to repeat the same word or if he, or his editor, fell asleep at the switch. And a particularly sensitive eye can pick up on repetitions or tics that even an attentive reader might miss. In his otherwise fawning study U & I, Nicholson Baker complains about John Updike’s overuse of the verb “seemed,” which even I, a massive Updike fan, hadn’t noticed until Baker pointed it out.
But repetitions can also be a source of insight, especially when you’re coming to grips with an earlier draft. A writer can learn a lot from the words he habitually overuses. If you find yourself falling back on melodramatic adverbs like “suddenly,” you might want to rethink the tone you’re taking—it’s possible that you’re trying to drum up excitement in a story that lacks inherent dramatic interest. My own overuse of verbs like “realized” might indicate that I’m spending too much time showing characters thinking through a situation, rather than conveying character through action. You can learn even more from longer phrases that reappear by accident. As John Gardner writes in The Art of Fiction, discussing a hypothetical story about Helen of Troy:
Reading…lines he has known by heart for weeks, [the writer] discovers odd tics his unconscious has sent up to him, perhaps curious accidental repetitions of imagery: The brooch Helen threw at Menelaus the writer has described, he discovers, with the same phrase he used in describing, much later, the seal on the message for help being sent to the Trojans’ allies. Why? he wonders. Just as dreams have meaning, whether or not we can penetrate the meaning, the writer assumes that the accidents in his writing may have significance.
And the comparison to dreaming is a shrewd one. “Repetitions are magic keys,” Umberto Eco writes in Foucault’s Pendulum, and although he’s talking about something rather different—a string of sentences randomly generated by a computer—there’s a common element here. When you write a first draft, you’re operating by instinct: you accept the first words that come to mind, rather than laboriously revising the text, because you’re working in a mode closer to the events of the story itself. At its best, it’s something like a dream, and the words we select have a lot in common with the unmediated nature of dream imagery or word association in psychoanalysis. Later, we’ll smooth and polish the surface of the prose, and most of these little infelicities will be ironed away, but it doesn’t hurt to look at them first with the eye of an analyst, or a critic, to see what they reveal. This doesn’t give us license to fall back on the same hackneyed words or phrases, and it doesn’t help a writer who thinks entirely in clichés. But it’s in our slips or mistakes, as Freud knew, that we unconsciously reveal ourselves. Mistakes need to be fixed and repetitions minimized, but it’s still useful to take a moment to ask what they really mean.
When we look at how our minds really work, there’s a strong case to be made that free will isn’t as important as free won’t. We all have a strong intuition that we’re ultimately responsible for how we act and behave, but what we think of as our deliberate actions seem to arise from a stratum of the brain that isn’t normally accessible to conscious thought, and we often unconsciously set ourselves in motion long before we’re aware of making that choice. Benjamin Libet, whose famous experiment cast grave doubts on the idea of volitional control in the first place, proposed an elegant solution to this apparent problem: the role of consciousness is to veto certain proposals from the unconscious mind while approving others, sifting through and evaluating the possible actions with which it’s presented. This veto power, if it exists, necessarily takes place in a very short window of time, something on the order of a tenth of a second. But if you were to slow down the process and translate the results into a concrete, tangible form, you’d have something oddly similar to what an author experiences while writing a novel—except that the character you’re creating is yourself.
Writing fiction is certainly one of the most peculiar pursuits in which a human being can engage, but really, it’s not so different from any other kind of focused human activity. In theory, you have complete freedom to write whatever you want on the blank page, but in practice, it isn’t so straightforward. You can’t write a sentence without being constrained by the conventions of language, by your own abilities, and by your mood when you sit down at your desk. Ideas, both big and small, generally don’t arise from an effort of the will: they appear, mysteriously, from some shadowy part of the brain. Yet a good writer can influence even the factors over which he seems to have little control, less through what he does in the moment than by what he’s done at every moment before. You can improve your craft over time, increasing the range of possible sentences you’re able to write; you can develop habits that will allow you to write in any emotional state; you can even learn to generate ideas on demand. Or you can do none of this. Whichever way you go, though, you’ll find that your small, unconscious artistic choices are really determined by the seeds you’ve planted in the past—which is a lot like how it works in real life.
Of course, that doesn’t bring us any closer to cracking the problem of free will. As Sam Harris might point out, and has, even if we can influence ourselves with our past behavior, that still doesn’t explain what influenced the influences. But there’s another stage in the writing process that does look a lot—at least to me—like the product of conscious choice, and that’s revision. “For artists, writing has always meant, in effect, the art of endless revising,” John Gardner says, and to the extent that a writer’s personality is expressed in his work, it’s in how he chooses to revise. First drafts are the id of the writing life: they’re rough, unconsidered, and as horrifying in their own way as the unwanted thoughts we encounter in dreams or in our less guarded moments. I suspect that the rough drafts of all writers at the same level of experience look more or less the same, which is to say, awful. In revision, though, you find yourself evaluating the choices you made the first time around, deleting the ones that don’t work and refining the ones that do, and the result, however far it may fall short of your intentions, comes as close to a fully considered action as a human being is capable of achieving.
My argument, then, is that free will in art is something that unfolds over time. Each choice we make may be accidental, serendipitous, or random when it first occurs, but it’s in the act of selecting, polishing, and editing, executed over the course of many months, that we start to find true freedom. And that’s the way it works in life, too. I may be the product of influences I can’t control or decisions I made without conscious deliberation when I was much younger, but it’s in my ongoing attempt to revise myself as a person—using the input of those hidden processes while also subjecting them to what feels like a higher level of consideration—that responsibility enters the picture. Free will seems to disappear the more closely we look at it, just as I’d have trouble explaining why I chose one word over another as I typed this sentence, but it emerges once we stand back to look at the system and its evolution as a whole. This kind of revision, with influences on the smallest scale affecting the larger which affects the smaller in turn, is something we all exercise from one minute to the next, and if it isn’t freedom, it’s close enough to make the distinction seem irrelevant. Because if writing is the art of endless revising, that’s true of life as well.
1. The Elements of Style by William Strunk, Jr. and E.B. White. If I were putting together an essential library of books for an aspiring writer of any kind, The Elements of Style would be first on the list. In recent years, there’s been something of a backlash against Strunk and White’s perceived purism and dogmatism, but the book is still a joy to read, and provides an indispensable baseline for most good writing. It’s true that literature as a whole would be poorer if every writer slavishly followed their advice, say, to omit needless words, as Elif Batuman says in The Possessed: “As if writing were a matter of overcoming bad habits—of omitting needless words.” Yet much of creative writing does boil down to overcoming bad habits, or at least establishing a foundation of tested usage from which the writer only consciously departs. More than fifty years after it was first published, The Elements of Style is still the best foundation we have.
2. The Art of Fiction by John Gardner. I bought this book more than fifteen years ago at a used bookstore in Half Moon Bay, shortly before starting my freshman year in high school. Since then, I’ve reread it, in pieces, a dozen or more times, and I still know much of it by heart. Writing books tend to be either loftily aspirational or fixated on the nuts and bolts of craft, and Gardner’s brilliance is that he tackles both sides in a way that enriches the whole. He has plenty to say on sentence structure, vocabulary, rhythm, and point of view, but he’s equally concerned with warning young writers away from “faults of soul”—frigidity, sentimentality, and mannerism—and reminding them that their work must have interest and truth. Every element of writing, he notes, should be judged by its ability to sustain the fictional dream: the illusion, to the reader, that the events and characters described are really taking place. And everything I’ve written since then has been undertaken with Gardner’s high standards in mind.
3. Writing to Sell by Scott Meredith. I hesitated between this book and Dean Koontz’s Writing Popular Fiction, which I reread endlessly while I was teaching myself how to write, but I’ve since discovered that it cribs much of its practical material from Meredith. Scott Meredith was a legendary literary agent—his clients included Norman Mailer, Arthur C. Clarke, and P.G. Wodehouse—and his approach to writing is diametrically opposed to Gardner’s: his book is basically a practical cookbook on how to write mainstream fiction for a wide audience, with an emphasis on plot, conflict, and readability. The tone can be a little mercenary at times, but it’s all great advice, and it’s more likely than any book I know to teach an author how to write a novel that the reader will finish. (One warning: Meredith’s chapter on literary agents, and in particular his endorsement of the use of reading fees, should be approached with caution.)
4. On Directing Film by David Mamet. I’ve spoken about this book at length before, but if I seem awed by it, it’s because I encountered it at a time in my life when I already thought I’d figured out how to write a novel. At that point, I’d already sold The Icon Thief and a handful of short stories, so reading Mamet’s advice for the first time was a little like a professional baseball player realizing that he could raise his batting average just by making a few minor adjustments to his stance. Mamet’s insistence that every scene be structured around a series of clear objectives for the protagonist may be common sense, but his way of laying it out—notably in a sensational class session at Columbia in which a scene is broken down beat by beat—rocked my world, and I’ve since followed his approach in everything I’ve done. At times, his philosophy of storytelling can be a little arid: any work produced using his rules needs revision, and a touch of John Gardner, to bring it to life. But my first drafts have never been better. It’s so helpful, in fact, that I sometimes hesitate before recommending it, as if I’m giving away a trade secret—but anyway, now you know.
I know what you’re thinking: I’ve finally lost it. For most of the last two years, I’ve used this blog to rail against the use of excessive backstory, advising writers to kill it whenever it occurs, preferably with fire. I’ve pointed out that characters in a novel are interesting because of their words, deeds, and decisions over the course of the narrative, not because of whatever they might have been doing or thinking before the story began. I’ve argued that backstory violates the principle that a good story should consist of a series of objectives, and that character is best revealed through action. I’ve pointed out, stealing an observation from the great William Goldman, that heroes must have mystery, and that to explain away a character through digressions into his past or psychology—at least in most forms of popular fiction—only serves to diminish him. And I’ve often referred to examples of characters who become more interesting the less we know about them, like Forsyth’s Jackal, and those who have been progressively ruined by excessive backstory, like Hannibal Lecter.
I still believe all these things. Recently, however, I’ve found myself writing reams of backstory for two different projects. One, Eternal Empire, is the concluding novel of a series that can’t be entirely understood without additional information about the earlier installments, which is something that I didn’t really appreciate until reading over the most recent draft. The other is a long, self-contained novel I’ve been working on for years, and whose protagonist’s actions make somewhat more sense with a slightly more detailed backstory. In each case, I added backstory after the novel was finished, in an attempt to address specific narrative problems, namely a lack of clarity that was preventing readers from getting lost in the story. And although I’ve begun to tactically incorporate backstory where it seems advisable, my earlier convictions haven’t changed. For most writers, I’m convinced that less backstory is preferable to the alternative, and that implication and suggestion are more powerful tools than extended passages of introspection. But there are times, looking back at a story that is otherwise complete, when I’ve found that a few scraps of backstory have their place.
If this seems inconsistent, it’s only because the rules of writing, like most laws, operate under an informal hierarchy, and it’s often worth stretching a minor rule so as to preserve a major one. (Or, as the rabbis say, it’s better to break one sabbath in order to keep many sabbaths.) You can debate which rules are more important than others, but it’s hard to argue with John Gardner’s observation that for most writers, the primary objective is to preserve the fictional dream: the illusion, in the reader’s mind, that these events are actually happening. Anything that tears the reader out of the dream without good reason needs to be examined and, usually, corrected. And one issue that can break the illusion is unintended ambiguity. If the reader puts down the book to wonder about a detail in a character’s past that the author didn’t mean to leave unresolved, it’s probably worth introducing this information, solely for the sake of maintaining momentum. And my reluctance to spell things out has occasionally confused readers in ways I didn’t intend. This led to some trouble with my recent Analog story “The Voices,” and also seems to be an issue in The Icon Thief. (Given the chance, I think I’d insert a few more paragraphs about Duchamp and his place in art history to avoid sending readers to Wikipedia.)
That said, backstory needs to be introduced judiciously, and at the proper point. In particular, it’s often best to save it for a moment when the story can afford to slow down. Such flat moments, which serve as a breather between points of high action, provide a convenient place for filling in the background, as long as it makes sense within the structure of the novel as a whole. The two projects I’m writing now both happen to have a convenient opening in the exact same spot: at the beginning of the second section, which currently picks up immediately from a cliffhanger at the end of the previous chapter. Inserting a flashback here, with the tension of the previous scene unresolved, both extends the suspense and allows me to fill in necessary background in reasonable security that the reader will read on to see what happens next. This sort of thing can be taken too far, of course: I keep such departures as short as possible, afraid that I might conclude what T.E. Lawrence did after rereading a chapter intended as a “flat” in Seven Pillars of Wisdom: “On reflection I agreed…that it was perhaps too successful.” So most of my earlier points still stand—even, or especially, when I’m forced to break them.
“When I was a critic,” writes François Truffaut, “I thought that a successful film had simultaneously to express an idea of the world and an idea of cinema.” I’d argue that this holds true of all works of art, no matter what form they take. If there’s one thing I’ve learned from trying to survive as a working writer, it’s that every book is secretly about the process of its own creation, and the ideas that it tries to express about the world are inextricable from the author’s own experience in writing it. This was certainly the case with City of Exiles. As I’ve said many times before, this is a book about interpretation—about how we read meaning into the world around us and into our own lives—dramatized in the form of two authentic unsolved mysteries: Ezekiel’s vision of the chariot and the incident in the Dyatlov Pass. I combined these two plot threads almost on a whim, drawn intuitively by their thematic and narrative resonance, and did the best I could to embed my solutions in an exciting story about men and women who are also in search of answers, or at least willing to impose them on others. Exile, in this novel, is sometimes literal, but it’s more often a state of existence that my characters carry within themselves, and it’s only now, when I can look back at the book with some detachment, that I understand that this was a story I had to write at that point in my career.
In some ways, I wrote City of Exiles largely to prove to myself that I could. The Icon Thief, like all first novels, was something of a fluke, however diligently pursued: I was writing on my own, without a lot of outside expectations beyond the ones I’d created for myself, and although I’d been writing fiction for most of my life, I was still figuring out basic problems of craft as I went along. My second novel, inevitably, was conceived and written under radically different circumstances. I was being paid to write a book under contract; I had a number of interested parties deeply invested in the outcome; and I was operating under considerable time constraints. It took more than two years to bring my first novel to completion, while the second had just over nine months from synopsis to delivery, which left me with little room for error. As a result, I had to plan it carefully and hope that the final product wasn’t too different from what I’d promised to write. It was a difficult, often taxing experience, but in the end, the novel was startlingly close to the story I’d set out to tell, although there were a number of big surprises along the way. And for the first time, I got a sense of what it really meant to be a working novelist. (It’s no accident that my work on the book coincided with the birth of this blog.)
This struck me, and still does, as the most meaningful discovery I made. When you’re writing your first novel, you’re secretly convinced, and not without reason, that everything will stand or fall on this one book. A second novel, by contrast, implies the future existence of a third, and possibly more, which leads to a very different state of mind. It’s less about any one book than about the idea of working on something or other for the rest of your life, and City of Exiles was the novel where this vision of what my career might be finally fell into place. When I agreed to write it, I didn’t know what the novel would be about, and I had never anticipated writing a series: I just knew that, by the end of the year, it had to exist. The result was a curious mixture of freedom and constraint. The book could be about anything, really, as long as it resembled a sequel to The Icon Thief and brought back certain crucial characters from the first novel. (In fact, Ilya’s return was essentially written into the contract, probably as a formality to ensure that the book I delivered wasn’t completely unlike its predecessor.) Although the finished work hopefully feels all of a piece, it was initially assembled from various components I simply felt like writing about, trusting that they would come together in the right way. It was a test of all I’d learned since writing my first book, and there were times, in the early days, when I felt that I was willing this novel into existence.
But every novel is the result of some combination of willpower and serendipity, and as I continued to write, I found myself learning a great deal about the story along the way. (As I hope to explain further in an eventual author’s commentary, there’s one shocking development that I didn’t anticipate at all when I began writing, and which deeply influenced the plot of the third installment.) And in many ways, I’m prouder of it than of anything else I’ve published. While The Icon Thief reads, accurately, like a highly compressed version of a novel that was originally much longer, City of Exiles feels to me like the work of a novelist who is finally hitting his stride. In the passage quoted above, Truffaut continues: “Today, I demand that a film express either the joy of making cinema or the agony of making cinema. I am not at all interested in anything in between; I am not interested in all those films that do not pulse.” To my eyes, this book pulses with the effort of a writer earnestly committed to figuring out his own craft and what his life as a novelist will be, as much as to solving the problems, sometimes devastating, faced by his characters themselves. It helped me understand, for the first time, what John Gardner means when he describes writing as a way of life in the world. And in the end, the life whose meaning I was discovering, line by line, was my own.
City of Exiles is available now at bookstores everywhere.
Last week, while assembling my list of my twenty favorite writing quotes, I was struck by a statement by Kurt Vonnegut, which I’d read countless times before without really thinking through its implications. The more I reflect on it, however, the more it seems to sum up much of how I feel about narrative plot and structure, to the extent that it deserves a post of its own. Here it is:
I don’t praise plots as accurate representations of life, but as ways to keep readers reading.
This may seem like an obvious point, but it’s also a profound insight that every author ought to keep in mind. Writers go into fiction for any number of reasons, but keeping the reader reading is as close to a universal objective as they come—and plot remains by far the best solution we’ve ever discovered to seizing and holding a reader’s attention. It isn’t about realism or accuracy, but about allowing a novel or story to reach its ultimate goal, which is to be read in its entirety. And if there’s any possible way for a writer to express his ideas and feelings within the constraints of plot, he’d be a fool to do otherwise, just as he’d be a fool to allow his book to be published in an unreadable font, littered with typographical errors, or in any other form that impedes the reader’s engagement with the work itself.
Life is full of stories, but it’s rarely full of plots—that is, of events that have a clear beginning, middle, and end, or any kind of structure at all. Phases in one’s life tend to blur and overlap, and the ending, when it comes, is never as tidy as it is in fiction. As a result, some writers reject plot as inherently unrealistic, and they work hard to develop essentially shapeless or unconventional fictions that more closely mirror the messiness of life. There’s nothing wrong with this, and some of my favorite authors, like Proust, have done remarkable things with minimal plots. (There is a plot in Proust, incidentally, but it’s one that could be adequately covered in a medium-sized novella, rather than spread over seven large volumes.) But to reject plot because it seems unrealistic is to miss the point. Plot, I’m convinced, isn’t so much a flawed simulation of real life as a narrative convention designed to guide the reader to the end of long, complicated fictions, by providing a series of wayposts or guides along the way. Like dialogue markers (“he said,” “she said”) or many of the conventions of realistic fiction, it’s less about realism than readability. And it’s dangerous to underestimate its importance.
In other words, plot—or more generally structure—is something like grammar. The rules of basic English usage are conventions intended to facilitate ease of communication. Some of these rules may seem arbitrary, and they are, but it’s still necessary to have a shared set of standards to maximize clarity, avoid ambiguity, and interpose as few obstacles as possible between the reader and the work itself. Grammar is the etiquette of language. Like etiquette, it’s designed to smooth out interactions and give us a set of guidelines to follow when we aren’t intuitively sure what the right course of action would be, and when in doubt, in absence of a good reason otherwise, it’s smartest to follow it. As John Gardner says in The Art of Fiction, if a writer has attempted a distracting stylistic innovation—like replacing the periods in a story with commas—it’s best to read the passage over many times, asking constantly whether the benefits of the change are outweighed by its potential inaccessibility. And in the majority of cases, it’s often best to work within existing conventions, rather than casting them aside at the risk of seeming frigid or self-indulgent.
And the same thing applies to plot. Plot is a set of conventions designed to accomplish exactly one thing: to get the reader to the end of the novel. A novel that remains unread, or only partially finished, has failed at its only undeniable purpose, which is to present a single organized vision to the reader. Works of nonfiction or certain extraordinary novels may hold a reader’s attention through philosophy, argument, or exceptional writing, but for the most part, the basic tools of creating and preserving interest are as old as storytelling itself: an interesting protagonist, personality expressed through action, and a clear series of objectives and conflicts. (I should also point out that an argument is also a kind of plot, and that even the most abstruse major works of philosophy tend to be invisibly structured to sustain the reader’s curiosity.) And that’s why plot matters. It’s a useful armature or framework for what the writer wants to say, assuming that he’s really interested in keeping his readers to the end. It can, and should be, questioned, undermined, and sometimes rejected, but it can’t be ignored.
It’s hard to believe, but over the past two years, I’ve posted more than six hundred quotes of the day. At first, this was simply supposed to be a way for me to add some new content on a daily basis without going through the trouble of writing a full post, but it ultimately evolved into something rather different. I ran through the obvious quotations fairly quickly, and the hunt for new material has been one of the most rewarding aspects of writing this blog, forcing me to look further afield into disciplines like theater, songwriting, dance, and computer science. Since we’re rapidly approaching this blog’s second anniversary, I thought it might be useful, or at least amusing, to pick out twenty of my own favorites. Some are famous, others less so, but in one way or another they’ve been rattling around in my brain for a long time, and I hope they’ll strike up a spark or two in yours:
Be well-ordered in your life, and as ordinary as a bourgeois, in order to be violent and original in your work.
—Gustave Flaubert
An artist must approach his work in the spirit of the criminal about to commit a crime.
—Attributed to Edgar Degas
The best way to have a good idea is to have a lot of ideas.
—Linus Pauling
Poetry is not a turning loose of emotion, but an escape from emotion; it is not the expression of personality, but an escape from personality. But, of course, only those who have personality and emotions know what it means to want to escape from such things.
—T.S. Eliot
Graphical excellence is that which gives to the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space.
—Edward Tufte
Luck is the residue of design.
—Attributed to Branch Rickey
The first thing you do when you take a piece of paper is always put the date on it, the month, the day, and where it is. Because every idea that you put on paper is useful to you. By putting the date on it as a habit, when you look for what you wrote down in your notes, you will be desperate to know that it happened in April in 1972 and it was in Paris and already it begins to be useful. One of the most important tools that a filmmaker has are his/her notes.
—Francis Ford Coppola
Immature artists imitate. Mature artists steal.
—Attributed to Lionel Trilling
The worst error of the older Shakespeare criticism consisted in regarding all the poet’s means of expression as well-considered, carefully pondered, artistically conditioned solutions and, above all, in trying to explain all the qualities of his characters on the basis of inner psychological motives, whereas, in reality, they have remained very much as Shakespeare found them in his sources, or were chosen only because they represented the most simple, convenient, and quickest solution of a difficulty to which the dramatist did not find it worth his while to devote any further trouble.
—Arnold Hauser
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
Great narrative is not the opposite of cheap narrative: it is soap opera plus.
You must train day and night in order to make quick decisions.
I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading. When I used to teach creative writing, I would tell the students to make their characters want something right away—even if it’s only a glass of water. Characters paralyzed by the meaninglessness of modern life still have to drink water from time to time.
The best question I ask myself is: What would a playwright do?
Mechanical excellence is the only vehicle of genius.
To achieve great things, two things are needed: a plan, and not quite enough time.
—Attributed to Leonard Bernstein
If you have taken the time to learn to write beautiful, rock-firm sentences, if you have mastered evocation of the vivid and continuous dream, if you are generous enough in your personal character to treat imaginary characters and readers fairly, if you have held onto your childhood virtues and have not settled for literary standards much lower than those of the fiction you admire, then the novel you write will eventually be, after the necessary labor of repeated revisions, a novel to be proud of, one that almost certainly someone, sooner or later, will be glad to publish.
If you wrote something for which someone sent you a check, if you cashed the check and it didn’t bounce, and if you then paid the light bill with the money, I consider you talented.
You can’t win, you can’t break even, and you can’t get out of the f—king game.
He wishes he had never entered the funhouse. But he has. Then he wishes he were dead. But he’s not. Therefore he will construct funhouses for others and be their secret operator—though he would rather be among the lovers for whom funhouses are designed.
Finally, the true novelist is the one who doesn’t quit. Novel-writing is not so much a profession as a yoga, or “way,” an alternative to ordinary life-in-the-world. Its benefits are quasi-religious—a changed quality of mind and heart, satisfactions no non-novelist can understand—and its rigors generally bring no profit except to the spirit. For those who are authentically called to the profession, spiritual profits are enough.
As I see it, two lessons can be drawn from the Mike Daisey fiasco: 1. If a story seems too good to be true, it probably is. 2. A “journalist” who makes himself the star of his own story is automatically suspect. This last point is especially worth considering. I’ve spoken before about the importance of detachment toward one’s own work, primarily as a practical matter: the more objective you are, the more likely you are to produce something that will be of interest to others. But there’s an ethical component here as well. Every writer, by definition, has a tendency toward self-centeredness: if we didn’t believe that our own thoughts and feelings, or at least our modes of expression, were exceptionally meaningful, we wouldn’t feel compelled to share them. When properly managed, this need to impose our personalities on the world is what results in most works of art. Left unchecked, it can lead to arrogance, solipsism, and a troubling tendency to insert ourselves into the spotlight. This isn’t just an artistic shortcoming, but a moral one. John Gardner called it frigidity: an inability to see what really counts. And frigidity paired with egotism is a dangerous combination.
Simply put, whenever an author, especially of a supposed work of nonfiction, makes himself the star of a story where he obviously doesn’t belong, it’s a warning sign. This isn’t just because it reveals a lack of perspective—a refusal to subordinate oneself to the real source of interest, which is almost never the author himself—but because it implies that other compromises have been made. Mike Daisey is far from the worst such offender. Consider the case of Greg Mortenson, who put himself at the center of Three Cups of Tea in the most self-flattering way imaginable, and was later revealed not only to have fabricated elements of his story, but to have misused the funds his charity raised as a result. At first glance, the two transgressions might not seem to have much in common, but the root cause is the same: a tendency to place the author’s self and personality above all other considerations. On one level, it led to self-aggrandizing falsehood in a supposed memoir; on another, to a charity that spent much of its money, instead of building schools, on Mortenson’s speaking tours and advertisements for his books.
It’s true that some works of nonfiction benefit from the artist’s presence: I wouldn’t want to take Werner Herzog out of Grizzly Man or Claude Lanzmann out of Shoah. But for the most part, documentaries that place the filmmaker at the center of the action should raise our doubts as viewers. Sometimes it leads to a blurring of the message, as when Michael Moore’s ego overwhelms the valid points he makes. Occasionally, it results in a film like Catfish, in which the blatant self-interest of the filmmakers taints the entire movie. And it’s especially problematic in films that try to tackle complex social issues. (It took me a long time to see past the director’s presence in The Cove, for instance, to accept it as the very good movie it really is. But it would have been even better without the director’s face onscreen.)
One could argue, of course, that all forms of journalism, no matter how objective, are implicitly written in the first person, and that every documentary is shaped by an invisible process of selection and arrangement. Which is true enough. But a real artist expresses himself in his choice of details in the editing room, not by inserting himself distractingly into the frame. We rarely, if ever, see Errol Morris in his own movies, while David Simon—who manifestly does not suffer from a lack of ego—appears in Homicide: A Year on the Killing Streets only in the last couple of pages. These are men with real personalities and sensibilities who express themselves unforgettably in the depiction of other strong personalities in their movies and books. In the end, we care about Morris and Simon because they’ve made us care about other people. They’ve earned the right to interest us in their opinions through the painstaking application of craft, not, like Mortenson or Daisey, with self-promoting fabrication. There will always be exceptions, but in most cases, an artist’s best approach lies in invisibility and detachment. Because in the end, you’re only as interesting as the facts you present.
Intuition is getting a bad rap these days. As both the book and movie of Moneyball have made clear, the intuition of baseball scouts is about as useful as random chance, and the same might be said of stock pickers, political pundits, and all other supposed sources of insight whose usefulness is rarely put to a rigorous test. Intuition, it seems, is really just another word for blind guessing, at least as far as accuracy is concerned. The recent book Thinking, Fast and Slow, by the Nobel Prize-winning psychologist Daniel Kahneman, goes even further, providing countless illustrations of how misleading our intuition can be, and how easily it can be distracted by irrelevant factors. (For example, something as simple as rolling a certain number on a rigged roulette wheel can influence our estimates of, say, how many African countries are in the United Nations. Don’t ask me how or why, but Kahneman’s data speaks for itself.)
And yet it’s hard to give up on intuition entirely. For one thing, it’s faster. I believe it was Julian Jaynes who pointed out that intuition is really just another word for the acceleration of experience: after we’ve been forced to make decisions under similar circumstances a certain number of times, the intermediate logic falls away, and we’re left with what feels like an intuitive response. Play it in slow motion, and all the steps are still there, in infinitesimal form. This kind of intuition strikes me as essentially different from the sort debunked above, and it’s especially useful in the arts, when no amount of statistical analysis can take the place of the small, mysterious judgment calls that every artist makes on a daily basis. In writing, as in everything else, the fundamentals of craft are acquired with difficulty, then gradually internalized, freeing the writer’s conscious mind to deal with unique problems while intuition takes care of the rest. And without such intuitive shortcuts, a long, complex project like a novel would take forever to complete.
Every artist develops this sort of intuition sooner or later, making it possible to skip such intermediate steps. As I’ve noted before, Robert Graves has described it as proleptic or “slantwise” thinking, a form of logic that goes from A to C without pausing for B. All great creative artists have this faculty, and the greater the artist, the more pronounced it becomes. One of the most compelling descriptions of poetic intuition I’ve ever seen comes from John Gardner’s The Art of Fiction, in a brief aside about Shakespeare. Gardner points to the fact that in Hamlet, the normally indecisive prince has no trouble sending Rosencrantz and Guildenstern to their deaths offstage, and with almost no explanation, a detail that strikes some readers as inconsistent. “If pressed,” Gardner writes, “Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to.” Fair enough. But then Gardner continues:
But the explanation I’ve put in Shakespeare’s mouth is probably not the true one. The truth is very likely that almost without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter…Shakespeare’s instinct told him, “Get back to the business between Hamlet and Claudius,” and, sudden as lightning, he was back.
That intuition, “sudden as lightning,” is what every writer hopes to develop. And while none of us have it to the extent that Shakespeare did, it’s always satisfying to see it flash forth, even in a modest way. Earlier this week, while reading through the final version of City of Exiles, I noticed a place where the momentum of the story seemed to flag. I made a note of this, then moved on. Later that day, I was working on something else entirely when I suddenly realized how to fix the problem, which was just a matter of eliminating or tightening a couple of paragraphs. After making these changes, I read the chapter over again, but this was almost a formality: I knew the revisions would work. There’s no way of objectively measuring this, of course, and there were probably other approaches that would have worked as well or better. But intuition provided one possible solution when I needed it. And without many such moments, right or wrong, I’d never finish a novel at all.
Earlier this week, critic John Lucas of the Guardian wrote an article alarmingly headlined “Has plot driven out other kinds of story?” He points to what he calls the resurgence of plot in literary fiction—giving Gary Shteyngart’s Super Sad Love Story [sic] as an example, although he gets the title wrong—and wonders if contemporary fiction, influenced by film, has privileged plot above all other elements. (This seems manifestly untrue, at least on the literary side, but we’ll ignore that for now.) He wonders if Kafka would be published today, conveniently overlooking the fact that most of Kafka’s work wasn’t published at all until after his death. He makes the common but unsubstantiated claim that plotless or unresolved fiction is truer to life than its plotted equivalent, and gently slaps the wrist of novels in which, heaven forbid, “every scene advances the action.” In his conclusion, not surprisingly, he hedges a bit:
Plot, as one of many literary strategies, is fantastic: employed carefully it can lend extraordinary emotional resonance to a text. But we shouldn’t lose sight of the fact that it is not the only pleasure to be derived from great literature.
Lucas’s article isn’t a bad one, but I disagree with almost everything it says. Take the assertion in the second sentence quoted above. I don’t think that anyone, anywhere, has ever claimed that plot is the only pleasure to be derived from great literature. If anything, the opposite is true: people tend to underrate the importance of plot in our greatest writers. There’s a common assumption that Shakespeare, for instance, didn’t care about plot, or wasn’t especially good at it, because he took most of his stories from conventional sources. The fact is, though, he was great at plot, and clearly relished it. The sources of Hamlet or Lear contain only the barest outlines of the story, which Shakespeare ingeniously enriches with incident, character, and structure. His plays have the busiest plots in all of literature, and they’re far more intricate than merely commercial considerations would dictate, which implies that he enjoyed plot for its own sake.
I’ve talked about the merits of plot in a previous post, so I won’t repeat all of my points here. To me, though, plot is a joy, both in my own writing and in the work of others. Plot is both a heightening of reality and a reflection of it: life is full of plots and stories, and the construction of a plot that feels true to life and satisfying as art is one of the most sustained challenges a writer can face. Removing the plot, with its necessary pattern of constraints, leaves the author free to indulge all of his worst impulses, a freedom that few writers have the discipline to survive. Indeed, I’d argue that the greatest thing about plot is its impersonality, even its coldness. In On Directing Film, David Mamet reminds us that a story is moving to the extent that the writer can leave things out, especially what is deeply felt and meaningful. And in the honest construction of a logical, surprising, inevitable plot, there’s very little room for affectation or self-indulgence.
In the end, plot isn’t the enemy; bad plots are—just as we need to guard against bad style, characterization, and theme. No element of fiction is inherently more worthwhile than any other, and attempts to privilege one above all others generally lead to what John Gardner calls frigidity, an elevation of one’s own personality over the demands of the story. Conversely, when all the elements work together, the effect can be overwhelming. A novel like J.M. Coetzee’s Disgrace, which the Guardian’s sister paper recently named the best British, Irish, or Commonwealth novel of the past twenty-five years, is as beautifully plotted as they come, a work in which the structure of the story is inseparable from its deeper themes. For most of us, then, plot is the necessary matrix in which a novel can grow in ways that are true to the fictional dream, not to our own preoccupations. Plot, at its best, is a cure for vanity.
Yesterday, with some trepidation, I read through the partial manuscript of the sequel to The Icon Thief for the first time, going through the entire draft in close to one sitting. At this point, I’ve finished the first two sections of the novel, totaling about 100,000 words—although this will be cut considerably—and still need to write the conclusion, which will be much shorter. It seemed like a good time, then, to pause and take stock of what I have. For the past few months, I’ve been writing with my head down, doing a chapter a day and then moving on immediately to the next. The upshot is that I’ve got much of the story on paper, but still don’t know, precisely, what this novel is about, which is something you discover only after much brooding and rereading. And this is also how you discover the ending.
Ending any story is hard. When you write a novel, in particular, you generally have at least some idea of where the plot is headed, but ultimately the story creates its own momentum, accumulating episodes and images until it tells you, quite forcefully, where it wants to go. That’s why rereading the manuscript at this point is particularly important: images or characters that seem incidental at first glance may suddenly take on new life, until it becomes clear that no ending will be satisfying that doesn’t include some or all of these elements. In The Art of Fiction, John Gardner discusses this closing return of images at great length, going so far as to invent a sort of algebraic notation: let a represent a pair of bloody shoes, b a willow tree, c an orphan home, and so on, each of which conjures up associates of its own, until, at the end of the novel, we get something like this:
I have to admit, though, that I’ve been reading The Art of Fiction for most of my life and still have no idea what this diagram means—which only indicates that the return of images at the end of a novel is a complicated matter indeed. More useful, perhaps, is David Mamet’s concept of the slate piece. I’ve mentioned it before, but basically, in filmmaking, the slate piece is the fragment of film captured at the beginning of a take, just after the slate board has been clapped and withdrawn. Normally this piece of film, taken before the scene starts, is discarded, but sometimes it can be mined for useful footage during the editing process—say, if the director desperately needs a shot of the actor glancing to his left, in order to intercut it with a previously unrelated shot to pace up a scene. In Bambi vs. Godzilla, Mamet writes:
This accidental, extra, hidden piece of information is called the slate piece. And most of moviemaking, as a writer, a director, a designer, is the attempt not to invent but to discover that hidden information—the slate piece—that is already lurking in the film.
To a surprising extent, this is true for a novelist as well, especially as one approaches the end. So I definitely had Mamet’s slate piece in mind as I read over my first draft—which went pretty well, or at least as well as this sort of thing ever can. There’s barely a sentence or paragraph that doesn’t contain something cringeworthy, but as far as I can tell, the overall structure is sound, and it doesn’t seem to require any wholesale changes. The major issue, at the moment, is that it needs to be cut, which is only what I expected—I tend to write my drafts about twenty percent too long, which gives me a lot of leeway when the time comes to revise. I see scenes that need to be compressed, whole pages that need to be removed, chapters that need to be cut by half, which, fortunately, is something I know how to do. The hard part, then, will be finding the ending. This is what I’m going to be focusing on for the next four weeks.
Should an aspiring author write short stories or novels? If you’re getting your MFA, you won’t have much of a choice: as perceptive observers have pointed out elsewhere, the machinery of most writing programs is geared toward the production of short stories, to the point where they often don’t know how to deal with novelists at all. And if you’re trying to write for a living, your choices are equally limited: while it’s very hard to make a living as a novelist, at least it’s still possible, while not even the most prolific author can survive solely through short story sales. But if you’re somewhere between the two extremes—say, just starting to figure out that you want to write, but not sure if you want to make a life of it—the decision to pursue one form or the other is a crucial question with no obvious answer.
First, a basic point: writing short stories is hard. Harder, in some respects, than writing a novel. In most cases, a novel gains its power from the cumulative impact of its episodes, rather than from one particularly strong idea (though there are exceptions), so as long as a writer structures the story properly and creates interesting characters, the reader will be reasonably satisfied, even if the author doesn’t hit the target exactly. With a short story, there’s no room for error. There isn’t a lot of space for a writer to build narrative momentum, which means that the entire story—especially in the mystery and science fiction genres—has to turn on one good idea. And if the idea isn’t a strong one, not only will the reader dislike the story, but it probably won’t be published at all.
At this point, I should confess that I don’t really write short stories—at least not the kind I’ve described above. All the stories that I’ve published or sold have been novelettes, which is a rather different form. John Gardner’s definition of the novella, in The Art of Fiction, is probably the best place to start:
The novella can be defined only as a work shorter than a novel…and both longer and more episodic than a short story. I use the word “episodic” loosely here, meaning only that the novella usually has a series of climaxes, each more intense than the last, though it may be built—and perhaps in fact ought to be built—of one continuous action.
This description fits my own work pretty closely. My novelettes tend to be on the long side for short fiction (10,000-12,000 words is about average), with three distinct acts and a structure loosely based on the conventions of mystery or suspense fiction, though with a scientific twist. They’re novels in miniature, or embryonic novels, and have more in common with their longer equivalents than with the very specific requirements of the short story.
Why have I adopted this hybrid form? For one thing, I write this way because I tend to think naturally in profluent plots, and the novelette is the shortest form in which such a plot can acquire real momentum. For another, the “short stories” I love best are really more like novelettes: nearly all of Conan Doyle’s Sherlock Holmes stories, for instance, fall neatly into three acts and run somewhere in the neighborhood of 10,000 words. As a result, I understand the novelette in ways that I’m not sure I understand the classic short story, and it’s no accident that I’ve had much less success in writing conventional short fiction (though I do have one, “Warning Sign,” coming out in an anthology in a few months).
So where does that leave us with our opening question? As someone who loves both novels and short fiction, I’m glad I’ve done both: each form has taught me different things, and the lessons in one have shaped my work in the other. But though I’m clearly biased here, on a more practical level, I think that a writer might specifically benefit by writing novelettes—not novels, not short stories, but something in between. This approach, as I see it, solves a lot of problems. With practice, you can write a 10,000-word novelette in two or three weeks, as opposed to the year or more it might take to write a novel, and you’ll acquire many of the skills that a longer project demands. Plus you won’t get hung up on the technical perfection required by a publishable 3,000-word short story. Later, if you’re so inclined, you can scale your writing up or down, but in the meantime, there’s something to be said for aiming at the middle.
So where do you start? Tomorrow, using my own story “Kawataro” as an example, I’ll begin to discuss, step by step, how a novelette is made.
There’s an amusing tradition, at least as old as Boccaccio, that Dante wrote the first seven cantos of the Inferno before his exile from Florence, and then took up the story again with Canto VIII, after a gap of months or years in the writing process. To mark the resumption of his work, Dante opens the canto with the words Io dico, seguitando: “I say, continuing…” The story is almost certainly apocryphal, but it’s as good an illustration as I know for the fact that long stretches of inactivity may interrupt a writer’s work on a novel, or any long writing project, but that the final result needs to look as continuous as possible. (Unless, of course, you’re aiming for an impression of discontinuity, which may also be an illusion.)
Gaps can occur in the writing process for all sorts of reasons. Usually, it’s because other obligations of life or work have gotten in the way. Sometimes it’s because you feel inspiration flagging and decide to work on another project for a while instead. Or, most frighteningly, it’s because you’ve hit a wall, don’t know where to go with the story, and feel compelled to set it aside for a long time, possibly forever. (John Gardner was unable to work on one of his novels for months because he couldn’t decide if a certain character would accept a drink offered to her at a cocktail party.) And whatever the reason, when you do go back to work, you’ll often find that it’s hard to pick up again precisely where you left off.
This last problem is one that I’ve often encountered, simply because of the way I approach long writing projects. As I’ve said before, I tend to outline in great detail, but I also like being surprised by the story, and it’s hard to reconcile these two impulses. The only solution I’ve found, which has worked well enough for me so far, has been to outline the novel in installments: I’ll put together a detailed outline for Part I, then write that section of the novel, with only a vague sense of what happens in Part II. Then, once I’ve finished the first section, I’ll repeat the process for the next part. This way, I have the structure I need for each day’s work, but I’ve also retained the possibility of surprise, even if it means going back and heavily revising what I’ve written before.
But how do you pick up the thread of a story after taking such a long time off? In my experience, it helps to do what Dante did, or is alleged to have done: write a page or two tacitly acknowledging that you’re returning to the story after a long absence—a transitional scene, a long description, even a recapitulation of what has happened so far—as long as you revise it into invisibility in a subsequent draft. After all, this is only an extreme version of what happens every day when you’re writing a first draft, much of which consists of transitional material that you need in order to ease yourself into and out of the fictional dream. Nearly all of this stuff, especially at the beginning and end of each scene, will need to be cut. Which is fine. Nobody will ever see it but you. And once its purpose is served, like a military bridge, it can be blown to smithereens. The important thing, the only thing, is to get to the other side.
Today’s quote of the day comes from a fascinating interview with the poet Gary Snyder, which I came across yesterday after seeing it mentioned in Robert and Michèle Root-Bernstein’s stimulating book Sparks of Genius. The part of the interview that caught my eye goes as follows:
Say you wanted to be a poet, and you saw a man that you recognized as a master mechanic or a great cook. You would do better, for yourself as a poet, to study under that man than to study under another poet who was not a master, that you didn’t recognize as a master.
Snyder goes on to give a specific example:
I use the term master mechanic because I know a master mechanic, Rod Coburn. Whenever I spend any time with him, I learn something from him…About everything. But I see it in terms of my craft as a poet. I learn about my craft as a poet. I learn about what it really takes to be a craftsman, what it really means to be committed, what it really means to work.
Which struck me for a number of reasons. As a writer, I’ve always been conscious of the fact that much of what I’ve learned about the creative process comes from the work of nonliterary artists. Regular readers of this blog know how much I’ve learned about writing and editing from David Mamet and Walter Murch. My approach to my own work owes as much to The Mystery of Picasso or the video games of Shigeru Miyamoto as to John Gardner’s Art of Fiction. More recently, Stephen Sondheim’s Finishing the Hat, with its detailed descriptions of the lyricist’s craft, has been an endless source of instruction and encouragement.
The point of all this, I think, is that it’s easy to get caught up in the conventions of the craft—whether it’s fiction, poetry, art, or something else entirely—that you know best. Studying other forms of art is one way, and perhaps the best, of knocking yourself out of your usual assumptions. And I don’t think I’m alone in this. I recently came across an interview with cartoonist Daniel Clowes in which he explained how his work in film (including Ghost World and Art School Confidential) has influenced the way he plans his comics:
To me, the most useful experience in working in “the film industry” has been watching and learning the editing process. You can write whatever you want and try to film whatever you want, but the whole thing really happens in that editing room. How do you edit comics? If you do them in a certain way, the standard way, it’s basically impossible. That’s what led me to this approach of breaking my stories into segments that all have a beginning and end on one, two, three pages. This makes it much easier to shift things around, to rearrange parts of the story sequence.
And the best way to put lessons from other media to work, as Snyder points out, is to study the masters. This week, if time permits, I’m going to be talking about a handful of artists in other media—music, comics, film, and television—that have influenced the way I approach my own writing.
Much of the dialogue one encounters in student fiction, as well as plot, gesture, even setting, comes not from life but from life filtered through TV. Many student writers seem unable to tell their own most important stories—the death of a father, the first disillusionment in love—except in the molds and formulas of TV. One can spot the difference at once because TV is of necessity—given its commercial pressures—false to life.
In the nearly thirty years since Gardner wrote these words, the television landscape has changed dramatically, but it’s worth pointing out that much of what he says here is still true. The basic elements of fiction—emotion, character, theme, even plot—need to come from close observation of life, or even the most skillful novel will eventually ring false. That said, the structure of fiction, and the author’s understanding of the possibilities of the form, don’t need to come from life alone, and probably shouldn’t. To develop a sense of what fiction can do, a writer needs to pay close attention to all types of art, even the nonliterary kind. And over the past few decades, television has expanded the possibilities of narrative in ways that no writer can afford to ignore.
If you think I’m exaggerating, consider a show like The Wire, which tells complex stories involving a vast range of characters, locations, and social issues in ways that aren’t possible in any other medium. The Simpsons, at least in its classic seasons, acquired a richness and velocity that continued to build for years, until it had populated a world that rivaled the real one for density and immediacy. (Like the rest of the Internet, I respond to most situations with a Simpsons quote.) And Mad Men continues to furnish a fictional world of astonishing detail and charm. World-building, it seems, is where television shines: in creating a long-form narrative that begins with a core group of characters and explores them for years, until they can come to seem as real as one’s own family and friends.
Which is why Glee can seem like such a disappointment. Perhaps because the musical is already the archest of genres, the show has always regarded its own medium with an air of detachment, as if the conventions of the after-school special or the high school sitcom were merely a sandbox in which the producers could play. On some level, this is fine: The Simpsons, among many other great shows, has fruitfully treated television as a place for narrative experimentation. But by turning its back on character continuity and refusing to follow any plot for more than a few episodes, Glee is abandoning many of the pleasures that narrative television can provide. Watching the show run out of ideas for its lead characters in less than two seasons simply serves as a reminder of how challenging this kind of storytelling can be.
Mad Men, by contrast, not only gives us characters who take on lives of their own, but consistently lives up to those characters in its acting, writing, and direction. (This is in stark contrast to Glee, where I sense that a lot of the real action is taking place in fanfic.) And its example has changed the way I write. My first novel tells a complicated story with a fairly controlled cast of characters, but Mad Men—in particular, the spellbinding convergence of plots in “Shut the Door, Have a Seat”—reminded me of the possibilities of an expansive cast, which allows characters to pair off and develop in unexpected ways. (The evolution of Christina Hendricks’s Joan from eye candy to second lead is only the most obvious example.) As a result, I’ve tried to cast a wider net with my second novel, using more characters and settings in the hopes that something unusual will arise. Television, strangely, has made me more ambitious. I’d like to think that even John Gardner would approve.
So work on my second novel is coming along pretty well. Research is winding down; location work is finished. I’ve got a fairly good outline for Part I, a sense of the personalities and backgrounds of a dozen important—though still nameless—characters, and…
Hold on. I have a dozen important characters, but aside from a few holdovers from my first book, I haven’t named them yet. And I need to come up with some names soon. I have just over two weeks before I start writing, but even in the meantime, there’s only so much work I can do with signifiers like “best friend” and “ruthless assassin.” (Note: not the same person.) Characters need names before they can really come to life. And it’s often this step, even before the real imaginative work begins, that feels the most frustrating, if only because it seems so important.
Sometimes I use characters from real life, and sometimes I use their real names—when I do, it’s always in celebration of people that I like. Once or twice, as in October Light, I’ve borrowed other people’s fictional characters. Naming is only a problem, of course, when you make the character up. It seems to me that every character—every person—is an embodiment of a very complicated, philosophical way of looking at the world, whether conscious or not. Names can be strong clues to the character’s system. Names are magic. If you name a kid John, he’ll grow up a different kid than if you named him Rudolph.
—John Gardner
I can’t speak to the experience of other writers, but for me, coming up with names for characters becomes more of a nightmare with every story. Unless you’re Thomas Pynchon, who can get away with names like Osbie Feel and Tyrone Slothrop, names need to be distinctive, but not so unusual that they distract the reader; evocative, but natural; easily differentiated from one another; not already possessed by a celebrity or more famous fictional character; and fairly invisible in their origins. (I still haven’t forgiven Michael Crichton for the “Lewis Dodgson” of Jurassic Park.) As a result, it takes me the better part of a day to come up with even ten passable names. And it isn’t going to get any easier: the more stories I write, the more names I use, which means that the pool of possibilities is growing ever smaller.
So what do I do? Whatever works. Sometimes a character will have a particular ethnic or national background, like the seemingly endless parade of Russians in Kamera and its sequel, which provides one possible starting point. (Wikipedia’s lists are very useful, especially now that I no longer have a phone book.) I’ll consult baby name sites, scan my bookshelves, and occasionally name characters after friends or people I admire. And the names are always nudging and jostling one another: I try to avoid giving important characters names that sound similar or begin with the same first letter, for example, which means that a single alteration may require numerous other adjustments.
Is it worth it? Yes and no. It certainly isn’t for the sake of the reader, who isn’t supposed to notice any of this—the best character names, I’m convinced, are invisible. And with few exceptions, I’d guess that even the names that feel inevitable now were, in fact, no better or worse than many alternatives: if Conan Doyle had gone with his first inclination, it’s quite possible that we’d all be fans of Ormond Sacker and Sherrinford Holmes. But for the writer, it’s an excuse to brood and meditate on the essence of each character, even if the result barely attracts the reader’s attention. So I feel well within my rights to overthink it. (Although I’m a little worried about what might happen if I ever have to name a baby.)
Today the AV Club tackles an issue that is very close to my own heart: to what extent can we enjoy art that contradicts our own moral beliefs? The ensuing discussion spans a wide range of works, from Gone With the Wind to the films of Roman Polanski and Mel Gibson, but I’m most intrigued by an unspoken implication: that morally problematic works of art are often more interesting, and powerful, than those that merely confirm our existing points of view. A challenge to our moral convictions, it seems, can yield the same sort of pleasurable dissonance that we get from works that subvert our aesthetic assumptions. The result can be great art, or at least great entertainment.
For me, the quintessential example is 24, a show that I loved for a long time, until it declined precipitously after the end of the fifth season. Before then, it was the best dramatic series on television, and its reactionary politics were inseparable from its appeal. Granted, the show’s politics were more about process than result—nearly every season ended with the exposure of a vast right-wing conspiracy, even if it was inevitably uncovered through massive violations of due process and civil rights—and it seems that the majority of the show’s writers and producers, aside from its creator, were politically liberal to moderate. Still, the question remains: how did they end up writing eight seasons’ worth of stories that routinely endorsed the use of torture?
The answer, I think, is that the writers were remaining true to the rules that the show had established: in a series where the American public is constantly in danger, and where the real-time structure of the show itself rules out the possibility of extended investigations—or even interrogations that last more than five minutes—it’s easier and more efficient to show your characters using torture to uncover information. The logic of torture on 24 wasn’t political, but dramatic. And while we might well debate the consequences of this portrayal on behavior in the real world, there’s no denying that it resulted in compelling television, at least for the first five seasons.
The lesson here, as problematic as it might seem, is that art needs to follow its own premises to their logical conclusion, even if the result takes us into dangerous places. (As Harold Bloom likes to point out, reading Shakespeare will not turn us into better citizens.) And this is merely the flip side of another crucial point, which is that works of art knowingly designed to endorse a particular philosophy are usually awful, no matter where they fall on the political spectrum. At worst, such works are nothing but propaganda; and even at their best, they seem calculated and artificial, rather than honestly derived, however unwillingly, from the author’s own experience. As usual, John Gardner, in The Art of Fiction, says it better than I can:
The question, to pose it one last way, is this: Can an argument manipulated from the start by the writer have the same emotional and intellectual power as an argument to which the writer is forced by his intuition of how life works? Comparisons are odious but instructive: Can a Gulliver’s Travels, however brilliantly executed, ever touch the hem of the garment of a play like King Lear? Or: Why is the Aeneid so markedly inferior to the Iliad?
In my own work, I’ve found that it’s often more productive to deliberately construct a story that contradicts my own beliefs and see where it leads me from there. My novelette “The Last Resort” (Analog, September 2009) is designed to imply sympathy, or even complicity, with ecoterrorism, which certainly goes against my own inclinations. And I’m in the middle of outlining a novel in which the main character is a doubting Mormon whose experiences, at least as I currently conceive the story, actually lead her to become more devout. This sort of thing is harder than writing stories that justify what I already believe, but that’s part of the point. In writing, if not in life, it’s often more useful to do things the hard way.
Runners of the hundred-yard dash do not take off in the same way runners of the marathon do. If the opening pages of a thousand-page novel would serve equally well as the opening pages of a short story, the likelihood is that the novel-opening is wrong.
—John Gardner, The Art of Fiction