My ten creative books #8: The Silent Woman
Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.
For various reasons, there are fewer useful books on the craft of literary nonfiction than there are on writing novels. This may just be a result of market demand, since more people seem to think that they might make good novelists than biographers or journalists. (As W.H. Auden devastatingly notes: “In our age, if a young person is untalented, the odds are in favor of his imagining he wants to write.” And he was probably thinking of aspiring fiction writers.) This is a gap that needs to be filled—I’ve learned firsthand that writing a nonfiction book can be practical and rewarding in itself, and I wish that I’d had more models to follow. In recent years, there have been a number of notable efforts, including Good Prose by Tracy Kidder and Richard Todd and the indispensable Draft No. 4 by John McPhee. But by far the best work on the subject that I’ve found is The Silent Woman: Sylvia Plath and Ted Hughes by Janet Malcolm, which, as I recently noted, is probably the best book of any kind that I’ve read in years. It isn’t a guidebook, and if anything, reading it might dissuade a lot of writers from tackling nonfiction at all. Those who persist, however, are rewarded with a book that has more insights per page into the creative process than almost any other that I can name. To pick just one example at random, here’s Malcolm on the biographer’s use of letters:
Letters are the great fixative of experience. Time erodes feeling. Time creates indifference. Letters prove to us that we once cared. They are the fossils of feeling. This is why biographers prize them so: they are biography’s only conduit to unmediated experience. Everything else the biographer touches is stale, hashed over, told and retold, dubious, unauthentic, suspect. Only when he reads a subject’s letters does the biographer feel he has come fully into his presence, and only when he quotes from the letters does he share with his readers the sense of life retrieved. And he shares something else: the feeling of transgression that comes from reading letters not meant for one’s eyes.
And perhaps the book’s most memorable passage comes after Malcolm visits the home of a minor player in the Sylvia Plath saga, who turns out to be a hoarder. Afterward, it strikes her that the house was “a kind of monstrous allegory of truth,” both in how we look at the world around us and in how we face the problem of writing:
This is the way things are, the place says. This is unmediated actuality, in all its multiplicity, randomness, inconsistency, redundancy, authenticity. Before the magisterial mess…the orderly houses that most of us live in seem meagre and lifeless—as, in the same way, the narratives called biographies pale and shrink in the face of the disorderly actuality that is a life…Each person who sits down to write faces not a blank page but his own vastly overfilled mind. The problem is to clear out most of what is in it, to fill huge plastic garbage bags with a confused jumble of things that have accreted there over the days, months, years of being alive and taking things in through the eyes and ears and heart. The goal is to make a space where a few ideas and images and feelings may be so arranged that the reader will want to linger a while among them, rather than to flee…But this task of housecleaning (of narrating) is not merely arduous; it is dangerous. There is the danger of throwing the wrong things out and keeping the wrong things in; there is the danger of throwing too much out and being left with too bare a house; there is the danger of throwing everything out.
Malcolm concludes: “Once one starts throwing out, it may become hard to stop. It may be better not to start. It may be better to hang onto everything…lest one be left with nothing.” Obviously, she hasn’t listened to her own advice, and we’re all the better for it. But that doesn’t mean that she—or the reader—has to be fine with the outcome.
The stuff of thought
On December 4, 1972, the ocean liner SS Statendam sailed from New York to Florida, where its passengers would witness the launch of Apollo 17, the final manned mission to the moon. The guests on the cruise included Isaac Asimov, Robert A. Heinlein, Frederik Pohl, Theodore Sturgeon, Norman Mailer, Katherine Anne Porter, and the newscaster Hugh Downs. It’s quite a story, and I’ve written about it elsewhere at length. What I’d like to highlight today, though, is what was happening a few miles away on shore, as Tom Wolfe recounts in the introduction to the paperback edition of The Right Stuff:
This book grew out of some ordinary curiosity. What is it, I wondered, that makes a man willing to sit up on top of an enormous Roman candle, such as a Redstone, Atlas, Titan, or Saturn rocket, and wait for someone to light the fuse? I decided on the simplest approach possible. I would ask a few astronauts and find out. So I asked a few in December of 1972 when they gathered at Cape Canaveral to watch the last mission to the moon, Apollo 17. I discovered quickly enough that none of them, no matter how talkative otherwise, was about to answer the question or even linger for more than a few seconds on the subject at the heart of it, which is to say, courage.
Wolfe’s “ordinary curiosity” led him to tackle a project that would consume him for the better part of a decade, driven by his discovery of “a rich and fabulous terrain that, in a literary sense, had remained as dark as the far side of the moon for more than half a century: military flying and the modern American officer corps.”
And my mind sometimes turns to the contrast between Wolfe, trying to get the astronauts to open up about their experiences, and the writers aboard the Statendam. You had Mailer, of course, who had written his own book on the moon, and the result was often extraordinary. It was more about Mailer himself than anything else, though, and during the cruise, he seemed more interested in laying out his theory of the thanatosphere, an invisible region around the moon populated by the spirits of the dead. Then you had such science fiction writers as Heinlein and Asimov, who would occasionally cross paths with real astronauts, but whose fiction was shaped by assumptions about the competent man that had been formed decades earlier. Wolfe decided to go to the source, but even he kept the pulps at the back of his mind. In his introduction, speaking of the trend in military fiction after World War I, he observes:
The only proper protagonist for a tale of war was an enlisted man, and he was to be presented not as a hero but as Everyman, as much a victim of war as any civilian. Any officer above the rank of second lieutenant was to be presented as a martinet or a fool, if not an outright villain, no matter whom he fought for. The old-fashioned tale of prowess and heroism was relegated to second- and third-rate forms of literature, ghostwritten autobiographies, and stories in pulp magazines on the order of Argosy and Bluebook.
Wolfe adds: “Even as late as the 1930s the favorite war stories in the pulps concerned World War I pilots.” And it was to pursue “the drama and psychology” of this mysterious courage in the real world that he wrote The Right Stuff.
The result is a lasting work of literary journalism, as well as one of the most entertaining books ever written, and we owe it to the combination of Wolfe’s instinctive nose for a story and his obsessiveness in following it diligently for years. Last year, in a review of John McPhee’s new collection of essays, Malcolm Harris said dryly: “I would recommend Draft No. 4 to writers and anyone interested in writing, but no one should use it as a professional guide uncritically or they’re liable to starve.” You could say much the same about Wolfe, who looks a lot like the kind of journalist we aren’t likely to see again, in part because the market has changed, but also because this kind of luck can be hard for anyone to sustain over the course of a career. Wolfe hit the jackpot on multiple occasions, but he also spent years on books that nobody read—Back to Blood, his last novel, cost its publisher a hundred dollars for every copy that it sold. (Toward the end, he could even seem out of his depth. It probably isn’t a coincidence that I never read I Am Charlotte Simmons, a novel about “Harvard, Yale, Princeton, Stanford, Duke, and a few other places all rolled into one” that was published a few years after I graduated from college. Wolfe’s insights into undergraduate life, delivered with his customary breathlessness, didn’t seem useful for understanding an experience that I had just undergone, and I’ve never forgotten the critic who suggested that the novel should have been titled I Am Easily Impressed.)
But that’s also the kind of risk required to produce major work. Wolfe’s movement from nonfiction to novels still feels like a loss, and I think that it deprived us of two or three big books of the kind that he could write better than anyone else. (It’s too bad that he never wrote anything about science fiction, which is a subject that could only be grasped by the kind of writer who could produce both The Right Stuff and The Electric Kool-Aid Acid Test.) Yet it isn’t always the monumental achievements that matter. In fact, when I think of what Wolfe has meant to me, it’s his offhand critical comments that have stuck in my head. The short introduction that he wrote to a collection of James M. Cain’s novels, in which he justifiably praised Cain’s “momentum,” has probably had a greater influence on my own style—or at least my aspirations for it—than any other single piece of criticism. His description of Umberto Eco as “a very good example of a writer who leads dozens of young writers into a literary cul-de-sac” is one that I’ll always remember, mostly because he might have been speaking of me. In college, I saw him give a reading once, shortly before the release of the collection Hooking Up. I was struck by his famous white suit, of course, but what I’ll never forget is the moment, just before he began to read, when he reached into his inside pocket and produced a pair of reading glasses—also spotlessly white. It was a perfect punchline, with the touch of the practiced showman, and it endeared Wolfe to me at times when I grew tired of his style and opinions. His voice and his ambition inspired many imitators, but at his best, it was the small stuff that set him apart.
Checks and balances
About a third of the way through my upcoming book, while discussing the May 1941 issue of Astounding Science Fiction, I include the sentence: “The issue also featured Heinlein’s ‘Universe,’ which was based on Campbell’s premise about a lost generation starship.” My copy editor amended this to “a lost-generation starship,” to which I replied: “This isn’t a ‘lost-generation’ starship, but a generation starship that happens to be lost.” And the exchange gave me a pretty good idea for a story that I’ll probably never write. (I don’t really have a plot for it yet, but it would be about Hemingway and Fitzgerald on a trip to Alpha Centauri, and it would be called The Double Sun Also Rises.) But it also reminded me of one of the benefits of a copy edit, which is its unparalleled combination of intense scrutiny and total detachment. I sent drafts of the manuscript to some of the world’s greatest nitpickers, who saved me from horrendous mistakes, and the result wouldn’t be nearly as good without their advice. But there’s also something to be said for engaging the services of a diligent reader who doesn’t have any connection to the subject. I deliberately sought out feedback from a few people who weren’t science fiction fans, just to make sure that the book remained accessible to a wider audience. And the ultimate example is the copy editor, who is retained to provide an impartial consideration of every semicolon without any preconceived notions outside the text. It’s what Heinlein might have had in mind when he invented the Fair Witness, who, when asked the color of a nearby house, would say only: “It’s white on this side.”
But copy editors are human beings, not machines, and they occasionally get their moment in the spotlight. Recently, their primary platform has been The New Yorker, which has been quietly highlighting the work of its copy editors and fact checkers over the last few years. We can trace this tendency back to Between You & Me, a memoir by Mary Norris that drew overdue attention to the craft of copy editing. In “Holy Writ,” a delightful excerpt in the magazine, Norris writes of the supposed objectivity and rigor of her profession: “The popular image of the copy editor is of someone who favors rigid consistency. I don’t usually think of myself that way. But, when pressed, I do find I have strong views about commas.” And she says of their famous detachment:
There is a fancy word for “going beyond your province”: “ultracrepidate.” So much of copy editing is about not going beyond your province. Anti-ultracrepidationism. Writers might think we’re applying rules and sticking it to their prose in order to make it fit some standard, but just as often we’re backing off, making exceptions, or at least trying to find a balance between doing too much and doing too little. A lot of the decisions you have to make as a copy editor are subjective. For instance, an issue that comes up all the time, whether to use “that” or “which,” depends on what the writer means. It’s interpretive, not mechanical—though the answer often boils down to an implicit understanding of commas.
In order to be truly objective, in other words, you have to be a little subjective. Which is equally true of writing as a whole.
You could say much the same of the fact checker, who resembles the copy editor’s equally obsessive cousin. As a rule, books aren’t fact-checked, which is a point that we only seem to remember when the system breaks down. (Astounding was given a legal read, but I was mostly on my own when it came to everything else, and I’m grateful that some of the most potentially contentious material—about L. Ron Hubbard’s writing career—drew on an earlier article that was brilliantly checked by Matthew Giles of Longreads.) As John McPhee recently wrote of the profession:
Any error is everlasting. As Sara [Lippincott] told the journalism students, once an error gets into print it “will live on and on in libraries carefully catalogued, scrupulously indexed…silicon-chipped, deceiving researcher after researcher down through the ages, all of whom will make new errors on the strength of the original errors, and so on and on into an exponential explosion of errata.” With drawn sword, the fact-checker stands at the near end of this bridge. It is, in part, why the job exists and why, in Sara’s words, a publication will believe in “turning a pack of professional skeptics loose on its own galley proofs.”
McPhee continues: “Book publishers prefer to regard fact-checking as the responsibility of authors, which, contractually, comes down to a simple matter of who doesn’t pay for what. If material that has appeared in a fact-checked magazine reappears in a book, the author is not the only beneficiary of the checker’s work. The book publisher has won a free ticket to factual respectability.” And its absence from the publishing process feels like an odd evolutionary vestige of the book industry that ought to be fixed.
As a result of such tributes, the copy editors and fact checkers of The New Yorker have become cultural icons in themselves, and when an error does make it through, it can be mildly shocking. (Last month, the original version of a review by Adam Gopnik casually stated that Andrew Lloyd Webber was the composer of Chess, and although I knew perfectly well that this was wrong, I had to look it up to make sure that I hadn’t strayed over into a parallel universe.) And their emergence at this particular moment may not be an accident. The first installment of “Holy Writ” appeared on February 23, 2015, just a few months before Donald Trump announced that he was running for president, plunging us all into a world in which good grammar and factual accuracy can seem less like matters of common decency than obstacles to be obliterated. Even though the timing was a coincidence, it’s tempting to read our growing appreciation for these unsung heroes as a statement about the importance of the truth itself. As Alyssa Rosenberg writes in the Washington Post:
It’s not surprising that one of the persistent jokes from the Trump era is the suggestion that we’re living in a bad piece of fiction…Pretending we’re all minor characters in a work of fiction can be a way of distancing ourselves from the seeming horror of our time or emphasizing our own feelings of powerlessness, and pointing to “the writers” often helps us deny any responsibility we may have for Trump, whether as voters or as journalists who covered the election. But whatever else we’re doing when we joke about Trump and the swirl of chaos around him as fiction, we’re expressing a wish that this moment will resolve in a narratively and morally comprehensible fashion.
Perhaps we’re also hoping that reality itself will have a fact checker after all, and that the result will make a difference. We don’t know if it will yet. But I’m hopeful that we’ll survive the exponential explosion of errata.
Life on the last mile
In telecommunications, there’s a concept called “the last mile”: the final leg of a network—the one that actually reaches the user’s home, school, or office—is the most difficult and expensive part to build. It’s one thing to construct a massive trunk line, which is basically a huge but relatively straightforward feat of engineering, and quite another to deal with the tangle of equipment, wiring, and specifications on the level of thousands of individual households. More recently, the concept has been extended to public transportation, delivery and distribution services, and other fields that depend on connecting an industrial operation on the largest imaginable scale with specific situations on the retail side. (For instance, Amazon has been trying to cross the last mile through everything from its acquisition of Whole Foods to drone delivery, and the fact that these are seen as alternative approaches to the same problem points to how complicated it really is.) This isn’t just a matter of infrastructure, either, but of the difficulties inherent to any system in which a single pipeline has to split into many smaller branches, whether it’s carrying blood, water, mail, or data. Ninety percent of the wiring can be in that last mile, and success lies less in any overall principles than in the irritating particulars. It has to be solved on the ground, rather than in a design document, and you’ll never be able to anticipate all of the obstacles that you’ll face once those connections start to multiply. It’s literally about the ramifications.
I often feel the same way when it comes to writing. When I think back on how I’ve grown as a writer over the last decade or so, I see clear signs of progress. Thanks mostly to the guidelines that David Mamet presents in On Directing Film, it’s much easier for me to write a decent first draft than it was when I began. I rarely leave anything unfinished; I know how to outline and how to cut; and I’m unlikely to make any huge technical mistakes. In his book Which Lie Did I Tell?, William Goldman says something similar about screenwriting:
Stephen Sondheim once said this: “I cannot write a bad song. You begin it here, build, end there. The words will lay properly on the music so they can be sung, that kind of thing. You may hate it, but it will be a proper song.” I sometimes feel that way about my screenplays. I’ve been doing them for so long now, and I’ve attempted most genres. I know about entering the story as late as possible, entering each scene as late as possible, that kind of thing. You may hate it, but it will be a proper screenplay.
Craft, in other words, can take you most of the way—but it’s the final leg that kills you. As Goldman concludes of his initial pass on the script for Absolute Power: “This first draft was proper as hell—you just didn’t give a shit.” And sooner or later, most writers find that they spend most of their time on that last mile.
Like most other art forms, creative writing can indeed be taught—but only to the point that it still resembles an engineering problem. There are a few basic tricks of structure and technique that will improve almost anyone’s work, much like the skills that you learn in art books like Drawing on the Right Side of the Brain, and that kind of advancement can be enormously satisfying. When it comes to the last mile between you and your desired result, however, many of the rules start to seem useless. You aren’t dealing with the general principles that have gotten you this far, but with problems that arise on the level of individual words or sentences, each one of which needs to be tackled on its own. There’s no way of knowing whether or not you’ve made the right choice until you’ve looked at them all in a row, and even if something seems wrong, you may not know how to fix it. The comforting shape of the outline, which can be assembled in a reasonably logical fashion, is replaced by the chaos of the text, and the fact that you’ve done good work on this level before is no guarantee that you can do it right now. I’ve learned a lot about writing over the years, but to the extent that I’m not yet the writer that I want to be, it lies almost entirely in that last mile, where the ideal remains tantalizingly out of reach.
As a result, I end up revising endlessly, even at a late stage, and although the draft always gets better, it never reaches perfection. After a while, you have to decide that it’s as good as it’s going to get, and then move on to something else—which is why it helps to have a deadline. But you can take comfort in the fact that the last mile affects even the best of us. In a recent New York Times profile of the playwright Tony Kushner, Charles McGrath writes:
What makes Angels in America so complicated to stage is not just Mr. Kushner’s need to supervise everything, but that Perestroika, the second part, is to a certain extent a work in progress and may always be. The first part, Millennium Approaches, was already up and running in the spring of 1991, when, with a deadline looming, Mr. Kushner retreated to a cabin in Northern California and wrote most of Perestroika in a feverish eight-day stint, hardly sleeping and living on junk food. He has been tinkering with it ever since…Even during rehearsal last month he was still cutting, rewriting, restructuring.
If Tony Kushner is still revising Angels in America, it makes me feel a little better about spending my life on that last mile. Or as John McPhee says about knowing when to stop: “What I know is that I can’t do any better; someone else might do better, but that’s all I can do; so I call it done.”
This post has no title
In John McPhee’s excellent new book on writing, Draft No. 4, which I mentioned here the other day, he shares an anecdote about his famous profile of the basketball player Bill Bradley. McPhee was going over a draft with William Shawn, the editor of The New Yorker, “talking three-two zones, blind passes, reverse pivots, and the setting of picks,” when he realized that he had overlooked something important:
For some reason—nerves, what else?—I had forgotten to find a title before submitting the piece. Editors of every ilk seem to think that titles are their prerogative—that they can buy a piece, cut the title off the top, and lay on one of their own. When I was young, this turned my skin pink and caused horripilation. I should add that I encountered such editors almost wholly at magazines other than The New Yorker—Vogue, Holiday, the Saturday Evening Post. The title is an integral part of a piece of writing, and one of the most important parts, and ought not to be written by anyone but the writer of what follows the title. Editors’ habit of replacing an author’s title with one of their own is like a photo of a tourist’s head on the cardboard body of Mao Zedong. But the title missing on the Bill Bradley piece was my oversight. I put no title on the manuscript. Shawn did. He hunted around in the text and found six words spoken by the subject, and when I saw the first New Yorker proof the piece was called “A Sense of Where You Are.”
The dynamic that McPhee describes at other publications still exists today—I’ve occasionally bristled at the titles that have appeared over the articles that I’ve written, which is a small part of the reason that I’ve moved most of my nonfiction onto this blog. (The freelance market also isn’t what it used to be, but that’s a subject for another post.) But a more insidious factor has invaded even the august halls of The New Yorker, and it has nothing to do with the preferences of any particular editor. Opening the most recent issue, for instance, I see that there’s an article by Jia Tolentino titled “Safer Spaces.” On the magazine’s website, it becomes “Is There a Smarter Way to Think About Sexual Assault on Campus?”, with a line at the bottom noting that it appears in the print edition under its alternate title. Joshua Rothman’s “Jambusters” becomes “Why Paper Jams Persist.” A huge piece by David Grann, “The White Darkness,” which seems destined to get optioned for the movies, earns slightly more privileged treatment, and it merely turns into “The White Darkness: A Journey Across Antarctica.” But that’s the exception. When I go back to the previous issue, I find that the same pattern holds true. Michael Chabon’s “The Recipe for Life” is spared, but David Owen’s “The Happiness Button” is retitled “Customer Satisfaction at the Push of a Button,” Rachel Aviv’s “The Death Debate” becomes “What Does It Mean to Die?”, and Ian Frazier’s “Airborne” becomes “The Trippy, High-Speed World of Drone Racing.” Which suggests to me that if McPhee’s piece appeared online today, it would be titled something like “Basketball Player Bill Bradley’s Sense of Where He Is.” And that’s if he were lucky.
The reasoning here isn’t a mystery. Headlines are written these days to maximize clicks and shares, and The New Yorker isn’t immune, even if it sometimes raises an eyebrow. Back in 2014, Maria Konnikova wrote an article for the magazine’s website titled “The Six Things That Make Stories Go Viral Will Amaze, and Maybe Infuriate, You,” in which she explained one aspect of the formula for online headlines: “The presence of a memory-inducing trigger is also important. We share what we’re thinking about—and we think about the things we can remember.” Viral headlines can’t be allusive, make a clever play on words, or depend on an evocative reference—they have to spell everything out. (To build on McPhee’s analogy, it’s less like a tourist’s face on the cardboard body of Mao Zedong than an oversized foam head of Mao himself.) A year later, The New Yorker ran an article by Andrew Marantz on the virality expert Emerson Spartz, and it amazed and maybe infuriated me. I’ve written about this profile elsewhere, but looking it over again now, my eye was caught by these lines:
Much of the company’s success online can be attributed to a proprietary algorithm that it has developed for “headline testing”—a practice that has become standard in the virality industry…Spartz’s algorithm measures which headline is attracting clicks most quickly, and after a few hours, when a statistically significant threshold is reached, the “winning” headline automatically supplants all others. “I’m really, really good at writing headlines,” he told me.
And it’s worth noting that while Marantz’s piece appeared in print as “The Virologist,” in an online search, it pops up as “King of Clickbait.” Even as the magazine gently mocked Spartz, it took his example to heart.
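The mechanics here are simple enough to sketch. What follows is a toy version of the kind of test Marantz describes, with invented names and an ordinary two-proportion z-test standing in for whatever Spartz’s proprietary algorithm actually does: serve the variants at random, tally the clicks, and let the “winning” headline supplant the rest once the numbers clear a threshold.

```python
import math
import random

class HeadlineTest:
    """A toy model of headline testing: serve variants at random,
    tally views and clicks, and promote a winner once the gap in
    click-through rate clears a crude significance threshold."""

    def __init__(self, headlines, z_threshold=1.96, min_views=100):
        self.stats = {h: {"views": 0, "clicks": 0} for h in headlines}
        self.z_threshold = z_threshold  # roughly 95 percent confidence
        self.min_views = min_views      # don't test on tiny samples
        self.winner = None

    def serve(self):
        # Once a winner emerges, it automatically supplants the others.
        if self.winner:
            return self.winner
        return random.choice(list(self.stats))

    def record(self, headline, clicked):
        s = self.stats[headline]
        s["views"] += 1
        s["clicks"] += int(clicked)
        self._check_for_winner()

    def _check_for_winner(self):
        scored = [(s["clicks"] / s["views"], h, s)
                  for h, s in self.stats.items()
                  if s["views"] >= self.min_views]
        if self.winner or len(scored) < 2:
            return
        scored.sort(key=lambda t: t[0], reverse=True)
        (p1, h1, s1), (p2, _, s2) = scored[0], scored[-1]
        # Two-proportion z-test between the best and worst variants.
        pooled = (s1["clicks"] + s2["clicks"]) / (s1["views"] + s2["views"])
        se = math.sqrt(pooled * (1 - pooled)
                       * (1 / s1["views"] + 1 / s2["views"]))
        if se > 0 and (p1 - p2) / se > self.z_threshold:
            self.winner = h1
```

The real systems are doubtless more elaborate, but the basic logic fits in a few dozen lines, which may be the most deflating thing about it.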
None of this is exactly scandalous, but when you think of a title as “an integral part of a piece of writing,” as McPhee does, it’s undeniably sad. There isn’t any one title for an article anymore, and most readers will probably only see its online incarnation. And this isn’t because of an editor’s tastes, but the result of an impersonal set of assumptions imposed on the entire industry. Emerson Spartz got his revenge on The New Yorker—he effectively ended up writing its headlines. And while I can’t blame any media company for doing whatever it can to stay viable, it’s also a real loss. McPhee is right when he says that selecting a title is an important part of the process, and in a perfect world, it would be left up to the writer. (It can even lead to valuable insights in itself. When I was working on my article on the fiction of L. Ron Hubbard, I was casting about randomly for a title when I came up with “Xenu’s Paradox.” I didn’t know what it meant, but it led me to start thinking about the paradoxical aspects of Hubbard’s career, and the result was a line of argument that ended up being integral not just to the article, but to the ensuing book. And I was amazed when it survived intact on Longreads.) When you look at the grindingly literal, unpoetic headlines that currently populate the homepage of The New Yorker, it’s hard not to feel nostalgic for an era in which an editor might nudge a title in the opposite direction. In 1966, when McPhee delivered a long piece on oranges in Florida, William Shawn read it over, focused on a quotation from the poet Andrew Marvell, and called it “Golden Lamps in a Green Night.” McPhee protested, and the article was finally published under the title that he had originally wanted. It was called “Oranges.”
The fictional sentence
Of all the writers of the golden age of science fiction, the one who can be hardest to get your head around is A.E. van Vogt. He isn’t to everyone’s taste—many readers, to quote Alexei and Cory Panshin’s not unadmiring description, find him “foggy, semi-literate, pulpish, and dumb”—but he’s undoubtedly a major figure, and he was second only to Robert A. Heinlein and Isaac Asimov when it came to defining what science fiction became in the late thirties and early forties. (If he isn’t as well known as they are, it’s largely because he was taken out of writing by dianetics at the exact moment that the genre was breaking into the mainstream.) Part of his appeal is that his stories remain compelling and readable despite their borderline incoherence, and he was unusually open about his secret. In the essay “My Life Was My Best Science Fiction Story,” which was originally published in the volume Fantastic Lives, van Vogt wrote:
I learned to write by a system propounded in a book titled The Only Two Ways to Write a Story by John W. Gallishaw (meaning by flashback or in consecutive sequence). Gallishaw had made an in-depth study of successful stories by great authors. He observed that the best of them wrote in what he called “presentation units” of about eight hundred words. Each of these units contained five steps. And every sentence in it was a “fictional sentence.” Which means that it was written either with imagery, or emotion, or suspense, depending on the type of story.
So what did these units look like? Used copies of Gallishaw’s book currently go for well over a hundred dollars online, but van Vogt helpfully summarized the relevant information:
The five steps can be described as follows: 1) Where, and to whom, is it happening? 2) Make clear the scene purpose (What is the immediate problem which confronts the protagonist, and what does it require him to accomplish in this scene?) 3) The interaction with the opposition, as he tries to achieve the scene purpose. 4) Make the reader aware that he either did accomplish the scene purpose, or did not accomplish it. 5) In all the early scenes, whether protagonist did or did not succeed in the scene purpose, establish that things are going to get worse. Now, the next presentation unit-scene begins with: Where is all this taking place. Describe the surroundings, and to whom it is happening. And so forth.
Over the years, this formula was distorted and misunderstood, so that a critic could write something like “Van Vogt admits that he changes the direction of his plot every eight hundred words.” And even when accurately stated, it can come off as bizarre. Yet it’s really nothing more than the principle that every narrative should consist of a series of objectives, which I’ve elsewhere listed among the most useful pieces of writing advice that I know. Significantly, it’s one of the few elements of craft that can be taught and learned by example. Van Vogt learned it from Gallishaw, while I got it from David Mamet’s On Directing Film, and I’ve always seen it as a jewel of wisdom that can be passed in almost apostolic fashion from one writer to another.
When we read van Vogt’s stories, of course, we aren’t conscious of this structure, and if anything, we’re more aware of their apparent lack of form. (As John McPhee writes in his wonderful new book on writing: “Readers are not supposed to notice the structure. It is meant to be about as visible as someone’s bones.”) Yet we still keep reading. It’s that sequence of objectives that keeps us oriented through the centrifugal wildness that we associate with van Vogt’s work—and it shouldn’t come as a surprise that he approached the irrational side as systematically as he did everything else. I’d heard at some point that van Vogt based many of his plots on his dreams, but it wasn’t until I read his essay that I understood what this meant:
When you’re writing, as I was, for one cent a word, and are a slow writer, and the story keeps stopping for hours or days, and your rent is due, you get anxious…I would wake up spontaneously at night, anxious. But I wasn’t aware of the anxiety. I thought about story problems—that was all I noticed then. And so back to sleep I went. In the morning, often there would be an unusual solution. All my best plot twists came in this way…It was not until July 1943 that I suddenly realized what I was doing. That night I got out our alarm clock and moved into the spare bedroom. I set the alarm to ring at one and one-half hours. When it awakened me, I reset the alarm for another one and one-half hours, thought about the problems in the story I was working on—and fell asleep. I did that altogether four times during the night. And in the morning, there was the unusual solution, the strange plot twist…So I had my system for getting to my subconscious mind.
This isn’t all that different from Salvador Dali’s advice on how to take a nap. But the final sentence is the kicker: “During the next seven years, I awakened myself about three hundred nights a year four times a night.” When I read this, I felt a greater sense of kinship with van Vogt than I have with just about any other writer. Much of my life has been spent searching for tools—from mind maps to tarot cards—that can be used to systematically incorporate elements of chance and intuition into what is otherwise a highly structured process. Van Vogt’s approach comes as close as anything I’ve ever seen to the ideal of combining the two on a reliable basis, even if we differ on some of the details. (For instance, I don’t necessarily buy into Gallishaw’s notion that every action taken by the protagonist needs to be opposed, or that the situation needs to continually get worse. As Mamet writes in On Directing Film: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” And that’s often enough.) But it’s oddly appropriate that we find such rules in the work of a writer who frequently came across as chronically disorganized. Van Vogt pushed the limits of form further than any other author of the golden age, and it’s hard to imagine Alfred Bester or Philip K. Dick without him. But I’m sure that there were equally visionary writers who never made it into print because they lacked the discipline, or the technical tricks, to get their ideas under control. Van Vogt’s stories always seem on the verge of flying apart, but the real wonder is that they don’t. And his closing words on the subject are useful ones indeed: “It is well to point out again that these various systems were, at base, just automatic reactions to the writing of science fiction. The left side of the brain got an overdose of fantasizing flow from the right side, and literally had to do something real.”
Writing with scissors
Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile on McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)
Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:
The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)
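Stripped of the scissors and the envelopes, the crucial computer step is just a grouping operation. Purely as an illustration, and emphatically not a reconstruction of what Structur or Kedit actually does, the batching might look something like this, with the codes and scraps invented for the example:

```python
from collections import defaultdict

# Invented sample data: each scrap of reporting carries one of the
# writer's structural codes (a theme, a set piece, a character).
notes = [
    ("GEOLOGY", "Schist and granite at the mouth of the canyon..."),
    ("RAPIDS", "Interview with the boatman about the upset..."),
    ("GEOLOGY", "Notes from the roadcut on the interstate..."),
    ("RAPIDS", "Description of the dory swamping in midstream..."),
]

def batch_by_code(notes):
    """Group every coded scrap under its code, preserving the order
    in which the scraps were originally gathered."""
    batches = defaultdict(list)
    for code, scrap in notes:
        batches[code].append(scrap)
    return batches

# The index-card step stays human: the writer chooses the sequence
# of codes, and the program simply deals the material out in batches.
structure = ["RAPIDS", "GEOLOGY"]
batches = batch_by_code(notes)
for code in structure:
    for scrap in batches[code]:
        print(f"{code}: {scrap}")
```

The interesting work, in other words, lies in choosing the order of the cards on the table; the program only enforces the decision.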
Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.
And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:
If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.
This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.
The art of the index
Earlier this week, as planned, I finished the bulk of the background reading for my book Astounding. I’m far from done with the research process: there are still unanswered questions, gaps that need to be filled, and mysteries that I’m not sure I’ll ever be able to solve. But I have a sense of the territory. I knew going in that I had to cover an immense amount of raw material in a limited amount of time, and from the beginning, I was forced to prioritize and triage based on what I thought would actually end up in the book—which doesn’t mean that there wasn’t still a lot of it. It included all of John W. Campbell’s published novels and stories; something like fifteen thousand pages of unedited correspondence; forty years of back issues of Astounding, Unknown, and Analog; and numerous secondary sources, including interviews, memoirs, and critical studies. I had to do much the same thing with Asimov, Heinlein, and Hubbard, too, but with an important difference: I’m not the first biographer to tackle their lives, so a lot of the structural work had already been done, and I could make educated guesses about what parts would be the most relevant. When it comes to Campbell, however, enormous swaths of his life have never been explored, so I had no choice but to read everything. In the words of editor Alan Hathaway, which I never tire of quoting, I’ve tried to turn every goddamn page. Whenever I see something that might be useful, I make a note of it, trusting that I’ll be able to find it again when I go back to review that section at greater length. Then I have no choice but to move on.
And it’s only recently that I realized that what I’ve been doing, in essence, is preparing an index. We tend to think of indexes as standard features of nonfiction books, and we get annoyed when they aren’t there. (I note with interest that a different John Campbell—a British politician of the nineteenth century, and apparently no relation to mine—proposed that authors who failed to provide an index would be fined and deprived of their copyrights.) In fact, indexes originated as working tools that scholars prepared for themselves, and they tailored them for their individual needs. What I find useful in a book may not interest anybody else, especially if I’m reading with a specific problem in mind, which is why it makes sense for readers to maintain indexes of their own. As Harold Nicolson, another British politician, once said in a commencement speech:
My advice is to go to France, direct from New York to Cherbourg, and to remain there for at least three months, if possible living in a French family. My second piece of advice is always to mark your books and write a personal index for yourself on the flyleaf.
He’s right, of course, and I’ve been doing this for years without thinking about it. Now I’ve started to do it more deliberately, and I’ve gotten into the habit of transcribing those notes into a searchable text file, as an index of indexes that I can use to consolidate my entries and keep the whole mess under control.
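The mechanical side of this is almost trivial, which is part of its appeal. Here is a minimal sketch of the kind of lookup I mean, with a made-up file format (source, location, and label separated by pipes); the point is only that a flat text file, plus a few lines of code, is enough to consolidate years of flyleaf notes:

```python
# Each line of the master file, in an invented format:
#   source | location | label
#   e.g.  Campbell correspondence, box 3 | p. 114 | first mention of dianetics
def search_index(path, term):
    """Return every entry whose label mentions the search term."""
    hits = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = [p.strip() for p in line.split("|", 2)]
            if len(parts) != 3:
                continue  # skip blank or malformed lines
            source, location, label = parts
            if term.lower() in label.lower():
                hits.append((source, location, label))
    return hits

# Usage, assuming the master file exists at this (hypothetical) path:
for source, location, label in search_index("index-of-indexes.txt", "dianetics"):
    print(f"{label} ({source}, {location})")
```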
It’s hard to write about indexes without thinking of a famous chapter in Kurt Vonnegut’s Cat’s Cradle, which is titled “Never Index Your Own Book.” As a professional indexer says to the narrator, evaluating another writer’s index:
“Flattering to the author, insulting to the reader,” she said. “In a hyphenated word,” she observed, with the shrewd amiability of an expert, “‘self-indulgent.’ I’m always embarrassed when I see an index an author has made of his own work…It’s a revealing thing, an author’s index of his own work…It’s a shameless exhibition—to the trained eye.”
I read this passage again recently with greater attention than usual, because the odds are pretty good that I’m going to end up indexing Astounding myself. (Here’s a tidbit that you might not know: if a publisher wants an index, the author has the right to prepare it, but if he declines—or does an unsatisfactory job—the publisher can hire someone else. The cost is deducted from the author’s advance, which means that there’s a decent financial incentive for writers to do the job themselves.) I’m also uncomfortably aware that Vonnegut is correct in saying that you can tell a lot about an author from his index. For an example that’s very close to home, I don’t need to look any further than William H. Patterson’s two-volume biography of Heinlein. Its index tells you a lot about Patterson himself, or at least about how he saw his subject, and I don’t have any doubt that my index will reflect on me.
But I also don’t think that anyone but the author has any business preparing the index. I’ve spent the last eight months compiling an index for a book that doesn’t exist: the unimaginable book that would include all the known details of Campbell’s life in their original form. (If you want to get really deep, you could say that a biography is the index of the man.) It bears the same relation to its sources that a graphical projection does to the original object: it translates it to a two-dimensional surface, losing some of its properties, but becoming considerably more manageable. The reason I’ve put it together, aside from reminding me of where various facts can be found, is to produce a rough sketch of the whole that I can review in its entirety. It condenses the available material into a form that I can reread over a relatively short period of time, which allows for the iterative review process that tells you what a book is really about. As John McPhee said of his notes to The Paris Review: “I read them until they’re coming out my ears.” And this is only possible if you’ve boiled them down to a set of labels. The author is the only one who can decipher it: it’s a coded message he writes to his future self. But when the time comes to prepare an index for the general reader, it invisibly reflects that ideal index that nobody else will ever see. Only the author, who knows both the words on the page and the unseen words that made them possible, can make it. You can sense this in the indexes for books as different as Sir Richard Francis Burton’s Arabian Nights or Douglas R. Hofstadter’s Le Ton Beau de Marot. These indexes live. They tell you a lot—maybe too much—about the author. But that’s exactly as it should be.
Apple and the cult of thinness
Recently, I’ve been in the market for a new computer. After some thought, I’ve settled on an older model of the MacBook Pro, both because of its price and because it’s the last remaining Apple laptop with an optical drive, which I still occasionally use. The experience put me in mind of a cartoon posted yesterday on Reddit, which shows a conversation between an Apple user and a helpful technician: “So what’s this update you’re installing?” “I’m just removing your USB ports.” “Great!” Apple’s obsession with eliminating unsightly ports, as well as any other features that might interfere with a device’s slim profile, has long been derided, and the recent news that the headphone jack might disappear from the next iPhone has struck many users as a bridge too far. Over the last decade or so, Apple has seemed fixated on pursuing thinness and lightness above all else, even though consumers don’t appear to be clamoring for thinner laptops or phones, and devices that are pared down past a certain point suffer in terms of strength and usability. (Apple isn’t alone in this, of course. Last week, I purchased a Sony Blu-ray player to replace the aging hulk I’d been using for the last five years, and although I like the new one, the lack of a built-in display that provides information on what the player is actually doing is a minor but real inconvenience, and it’s so light that I often end up pushing it backward on the television stand when I press the power button. As far as I can tell, there’s no reason why any device that spends its entire life on the same shelf needs to be so small.)
Obviously, I’m not the first person to say this, and in particular, the design gurus Don Norman and Bruce Tognazzini wrote a long, devastating piece for Fast Company last month on Apple’s pursuit of beauty over functionality. But I’d like to venture an alternative explanation for why it has taken this approach. Apple is a huge corporation, and like all large businesses, it needs quantifiable benchmarks to drive innovation. Once any enterprise becomes big enough, qualitative metrics alone don’t cut it: you need something to which you can assign a number. And while you can’t quantify usability, or even beauty, you can quantify thinness and weight. Apple seems to be using the physical size of a device as a proxy for innovative thought about design, which isn’t so different from the strategy that many writers use during the revision process. I’ve written here before about how I sometimes set length limits for stories or individual chapters, and how this kind of writing by numbers forces me to be smarter and more rigorous about my choices. John McPhee says much the same thing in a recent piece in The New Yorker about the exercise of “greening,” as once practiced by Time, which involved cutting an arbitrary number of lines. As Calvin Trillin writes elsewhere: “I was surprised that what I had thought of as a tightly constructed seventy-line story—a story so tightly constructed that it had resisted the inclusion of that maddening leftover fact—was unharmed, or even improved, by greening ten percent of it. The greening I did in Time Edit convinced me that just about any piece I write could be improved if, when it was supposedly ready to hand in, I looked in the mirror and said sternly to myself ‘Green fourteen’ or ‘Green eight.’ And one of these days I’m going to begin doing that.”
Apple appears to have come to a similar conclusion about its devices, which is that by greening away weight and thickness, you end up with other desirable qualities. And it works—but only up to a point. As McPhee observes, greening is supposed to be invisible: “The idea is to remove words in such a manner that no one would notice that anything has been removed.” And once you pass beyond a certain limit, you risk omitting essential elements, as expressed in the book Behind the Seen by Charles Koppelman, which describes the process of the legendary film editor Walter Murch:
Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat. “It can be done,” says Murch, “and I have done it on a number of films that turned out well in the end. But it is tricky, and the outcome is not guaranteed—like open-heart surgery. The patient is put at risk, and the further beyond thirty percent you go, the greater the risk.”
And Apple—which has had a long and productive relationship with Murch, a vocal champion of its Final Cut Pro software—should pay attention. In the past, the emphasis on miniaturization was undoubtedly a force for innovative solutions, but we’ve reached the point where the patient is being endangered by removing features of genuine utility. Murch’s thirty percent factor turns out to describe the situation at Apple eerily well: the earliest models of the MacBook Pro weighed about five and a half pounds, implying that once the weight was reduced below four pounds or so, vital organs would be threatened, which is exactly what happened. (Even more insidiously, the trend has spread into realms where the notion of thinness is entirely abstract, like the fonts that Apple uses for its mobile devices, which, as Norman and Tognazzini point out, are so slender that they’ve become difficult to read.) These changes aren’t driven by consumer demand, but by a corporate culture that has failed to recognize that its old ways of quantifying innovation no longer serve their intended purpose. The means have been confused with the end. Ultimately, I’m still a fan of Apple, and I’m still going to buy that MacBook. I don’t fault it for wanting to quantify its processes: it’s a necessary part of managing creativity on a large scale. But it has to focus on a number other than thickness or weight. What Apple needs is a new internal metric, similarly quantifiable, that reflects something that consumers actually want. There’s one obvious candidate: price. Instead of making everything smaller, Apple could focus on providing the same functionality, beauty, and reliability at lower cost. It would drive innovation just as well as size once did. But given Apple’s history, the chances of that happening seem very slim indeed.
The art of omission
Over the last couple of years, it has slowly become clear that the series of articles on the writing life that John McPhee is unhurriedly publishing in The New Yorker is one of the modest but indisputable creative events of our time. McPhee has long been regarded as the dean of American nonfiction, and in one essay after another, he has lovingly, amusingly, and unsentimentally unpacked the tricks and secrets of six full decades as a writer and curious character. The fact that these pieces are written from the perspective of a journalist—albeit a preternaturally inventive and sensitive one—makes them even more useful for authors of fiction. Because the point of view has been shifted by ninety degrees, we’re more aware of the common elements shared by all forms of writing: choice of subject, structure, revision, selection, omission. There isn’t a point that McPhee makes here that couldn’t be applied with profit to any form of creative writing, or any kind of artistic effort in general. McPhee isn’t dogmatic, and he frames his advice less as a rulebook than as a string of gentle, sensible suggestions. But the result, when collected at last in the inevitable book, will amount to nothing less than one of the most useful works ever composed on the art of clear writing and thinking, worthy of being placed on the same shelf as The Elements of Style. Strunk and White will always come first, but McPhee has set himself up as their obvious successor.
Take his most recent article, which focuses on the crucial art of omission. McPhee makes many of the same points—although more vividly and memorably—that others have covered before. Writing is cutting; a story should be exactly the length that can be sustained by its material and no more; a rough draft almost always benefits from being trimmed by ten percent. Underlying it all, however, is a deeper philosophical sense of why we omit what we do. McPhee writes:
To cause a reader to see in her mind’s eye an entire autumnal landscape, for example, a writer needs to deliver only a few words and images—such as corn shocks, pheasants, and an early frost. The creative writer leaves white space between chapters or segments of chapters. The creative reader silently articulates the unwritten thought that is present in the white space. Let the reader have the experience. Leave judgment in the eye of the beholder. When you are deciding what to leave out, begin with the author. If you see yourself prancing around between subject and reader, get lost. Give elbow room to the creative reader. In other words, to the extent that this is all about you, leave that out.
Omission, in short, is a strategy for enforcing objectivity, and it obliges the writer to keep an eye out for the nonessential. When you’re trying to cut a story or essay by some arbitrary amount, you often find that the first parts to go are the places where you’ve imposed yourself on the subject. And if you sacrifice a telling fact or detail to preserve one of your own opinions, you’ve probably got bigger problems as a writer.
And the word “arbitrary” in the above paragraph is surprisingly important. Yesterday, I quoted Calvin Trillin on the process of greening at Time, in which makeup editors would return an article to its author with curt instructions to cut five or ten lines. McPhee, who did a lot of greening himself over the years, adds a crucial piece of information: “Time in those days, unlike its rival Newsweek, never assigned a given length but waited for the finished story before fitting it into the magazine.” In other words, the number of lines the writer was asked to cut wasn’t dictated by the content of the story, but by an arbitrary outside factor—in this case, the length and layout of the other articles that happened to be jostling for space in that particular issue. And while we might expect this to do violence to the integrity of the story itself, in practice, the opposite turns out to be true: it’s precisely because the quota of lines to remove is essentially random that the writer is forced to think creatively about how and where to condense. I’ve imposed arbitrary length limitations on just about everything meaningful I’ve ever written, and if anything, I wish I had been even more relentless. (One of the few real advantages of the structural conventions of the modern movie script is that they oblige writers to engage constantly in a kind of greening to hit a certain page count. Sometimes, it can feel like cheating, but it’s also a productive way to sweat down a story, and there isn’t a screenwriter alive who hasn’t experienced the truth of McPhee’s observation: “If you kill a widow, you pick up a whole line.”)
Of course, none of this means that the seemingly nonessential doesn’t have its place. Few essays would be any fun to read if they didn’t include the occasional digression or footnote that covered tangentially related territory, and that applies to McPhee as much as to anyone else. (In fact, his piece on omission concludes with a huge shaggy-dog story about General Eisenhower, ending on a delicious punchline that wouldn’t be nearly as effective if McPhee hadn’t built up to it with a full page of apparent trivia.) Every work of art, as McPhee notes elsewhere, arrives at its own rhythms and structure, and an essay that is all business, or a series of breathless news items, is unlikely to inspire much affection. If there’s a point to be made here, though, it’s that digression and looseness are best imposed on the level of the overall work, rather than in the individual sentence: McPhee’s finest essays often seem to wander from one subject to the next as connections occur to the author, but on the level of the individual line or image, they’re rigorously organized. Greening is such a valuable exercise because it targets the parts of a work that can always be boiled down further—transitional sections, places where the text repeats itself, redundancies, moments of indulgence. McPhee compares it to pruning, or to removing freight cars to shorten a train, so that no one, not even the author, would know in the end that anything has been removed. And it’s only through greening that you discover the shape that the story wants for itself.
Quote of the Day
I always say to my [writing] classes that it’s analogous to cooking a dinner. You go to the store and you buy a lot of things. You bring them home and you put them on the kitchen counter, and that’s what you’re going to make your dinner out of. If you’ve got a red pepper over here—it’s not a tomato. You’ve got to deal with what you’ve got.
The Travolta moment
There’s a moment halfway through Jonathan Franzen’s The Corrections when Enid Lambert, the matriarch of the novel’s dysfunctional Midwestern family, visits a doctor on a cruise ship. It’s an important scene—Enid leaves with a handful of antidepressants that will play a big role later in the story—and Franzen lavishes his usual amount of care on the sequence, which runs for a full nine pages. But here’s how he chooses to describe the doctor on his first appearance:
He had a large, somewhat coarse-skinned face like the face of the Italian-American actor people loved, the one who once starred as an angel and another time as a disco dancer.
I adore The Corrections, but this is an embarrassing sentence—one of the worst I’ve ever seen come from the pen of a major novelist. It’s particularly surprising coming from Franzen, who has thought as urgently and probingly as any writer alive about the problem of voice. But it’s also the kind of lapse that turns out to be unexpectedly instructive, precisely because it comes from an author who really ought to know better.
So why does this sentence grate so much? Let’s break down the reasons one at a time:
- Franzen clearly wants to tell us that the doctor looks like John Travolta, but he’s too shy to come out and say so, and he ends up using more than twenty words to convey what could easily have been expressed in two.
- In the process, he’s false to his character. No woman of Enid’s generation and background would have any trouble coming up with Travolta’s name, especially if she were familiar with his role in Michael, of all movies. It’s not like she’s trying to remember, say, Richard Jenkins.
- Worst of all, it takes us out of the story. Instead of focusing on the moment—which happens to be a crucial turning point for Enid’s character—we’re distracted by Franzen’s failure of style.
And the punchline here is that a lesser novelist would simply have said that the doctor looked like Travolta and been done with it. Franzen, an agonizingly smart writer, senses how lazy this is, so he backs away, but not nearly far enough. And the result reads like nothing a recognizable human being would feel or say.
I got to thinking about this after reading John McPhee’s recent New Yorker piece about frames of reference. McPhee’s pet peeve is writers who describe a person’s appearance by leaning on a perceived resemblance to a famous face, as in this example from Ian Frazier: “She looks enough like the late Bea Arthur, the star of the nineteen-seventies sitcom Maude, that it would be negligent not to say so.” Clearly, if you don’t remember how Bea Arthur looked, this description isn’t very useful. And while any such discussion tends to turn into a personal referendum on which references are obvious and which aren’t—McPhee claims he doesn’t know who Gene Wilder is, for instance—his point is a valid one:
If you say someone looks like Tom Cruise—and you let it go at that—you are asking Tom Cruise to do your writing for you. Your description will fail when your reader doesn’t know who Tom Cruise is.
And references that seem obvious now may not feel that way in twenty years. McPhee concludes, reasonably, that if you’re going to compare a character to a celebrity, you need to pay back that borrowed vividness by amplifying it with a line of description of your own, as when Joel Achenbach follows up his reference to Gene Wilder by referring to the subject’s “manic energy.”
When we evaluate Franzen’s Travolta moment in this light, it starts to look even worse. It reminds me a little of the statistician Edward Tufte, who famously declared that graphical excellence gives the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space. In his classic The Visual Display of Quantitative Information, he introduces the concept of the data-ink ratio, defined as the amount of data ink divided by the total ink used to print a statistical graphic. (“Data ink” is the ink in a graph or chart that can’t be erased without a real loss of information.) Ideally, as large a proportion of the ink as possible should be devoted to the presentation of the data, rather than to redundant material. As an example of the ratio at its worst, Tufte reprints a graph from a textbook that erased all the data points while retaining the grid lines, noting drily: “The resulting figure achieves a graphical absolute zero, a null data-ink ratio.” And that’s what Franzen gives us here. In twenty words, he offers no information that the reader isn’t asked to supply on his or her own. To be fair, Franzen is usually better than this. But here, it’s like giving us a female character and saying that she looks like Adele Dazeem.
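To spell out Tufte’s ratio as a formula (the notation here is mine, not his):

\[
\text{data-ink ratio} = \frac{\text{data ink}}{\text{total ink used to print the graphic}}
\]

Since data ink is by definition a subset of the total ink, the ratio can never exceed one; by that measure, the Travolta sentence is all grid lines and no data.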
Rediscovering the dictionary
I’ve never owned a dictionary. Well, that isn’t precisely true. Looking around my bookshelves now, I can see all kinds of specialized dictionaries without leaving my chair, from Hobson-Jobson: The Anglo-Indian Dictionary to Partridge’s Dictionary of Slang and Unconventional English to Brewer’s Dictionary of Phrase and Fable. About a year ago, moreover, I was lucky enough to acquire not just a dictionary, but the dictionary. As much as I love my Compact Oxford English Dictionary, however, it isn’t exactly made for everyday use: the volumes are bulky, the print is too small to read without a magnifying glass, and it’s easy to get lost in it for hours when you’re just trying to look up one word. And as far as a conventional desk dictionary is concerned, I haven’t used one in a long time. My vocabulary is more than adequate for the kind of fiction I’m writing, and whenever I have to check a definition just to be on the safe side, there are plenty of online resources that I can consult with ease. So although I have plenty of other reference books, I just never saw the need for Webster’s.
But I was wrong. Or at least I’m strongly reconsidering my position after reading the latest in John McPhee’s wonderful series of essays on the writing life in The New Yorker. The most recent installment covers a lot of ground—it contains invaluable advice on how to write a rough draft, which McPhee says you should approach as if it were a letter to your mother, and includes a fascinating digression on the history of the magazine’s copy editors—but the real meat of the piece lies here:
With dictionaries, I spend a great deal more time looking up words I know than words I have never heard of—at least ninety-nine to one. The dictionary definitions of words you are trying to replace are far more likely to help you out than a scattershot wad from a thesaurus.
McPhee’s case speaks for itself. He explains, for instance, that he wrote the sentence “The reflection of the sun races through the trees and shoots forth light from the water” after seeing “to shoot forth light” in the dictionary definition of “sparkle.” And after struggling to find a way to describe canoeing, he looked up the definition of the word “sport” and found: “A diversion in the field.” Hence:
A canoe trip has become simply a rite of oneness with certain terrain, a diversion in the field, an act performed not because it is necessary but because there is value in the act itself.
As far as thesauruses go, McPhee calls them “useful things” in their proper place: “The value of a thesaurus is in the assistance it can give you in finding the best possible word for the mission that the word is supposed to fulfill.” In my own case, I tend to use a thesaurus most often in the rewrite, when I’m drilling down more deeply into the meaning of each sentence, and when issues of variety and rhythm start to take greater precedence. I rely mostly on the thesaurus function in Word and on an occasional trip to the excellent free thesauruses available online, where the hyperlinks allow me to skip more easily from one possible synonym to another. And although I recently found myself tempted by a copy of Roget’s at my local thrift store, I expect that I’ll stick to my current routine. (Incidentally, I’ve found that I tend to read thesauruses most obsessively when I’m trying to figure out the title for a novel, which is an exhausting process that needs all the help it can get—I vividly remember going to Thesaurus.com repeatedly on my phone while trying to find a title for what eventually became City of Exiles.)
But McPhee has sold me on the dictionary. After briefly weighing the possibility of picking up McPhee’s own Webster’s Collegiate, I ended up buying a used copy of the American Heritage Dictionary, since I remember it fondly from my own childhood and because it’s the dictionary most warmly recommended by the Whole Earth Catalog, which has never steered me wrong. It’s coming on Tuesday, and after it arrives, I wouldn’t be surprised if it took up a permanent place on my desk, next to my reference copies of my own novels and A Choice of Shakespeare’s Verse by Ted Hughes. Whether or not it will change my style remains to be seen, but it’s still something I wish I’d done years earlier. Dictionaries, as all writers know, are books of magic, and we should consult them as diligently as we would any religious text, an act, like canoeing, performed not because it is necessary but because there is value in the act itself. As Jean Cocteau says: “The greatest masterpiece in literature is only a dictionary out of order.”
The seductions of structure
Learning about a writer’s outlining methods may not be as interesting as reading about his or her sex life, but it exercises a peculiar fascination of its own—at least for other writers. Everyone else probably feels a little like I did while reading Shawn McGrath’s recent appreciation of the beautiful source code behind Doom 3: I understood what he was getting at, but the article itself read like a dispatch from a parallel universe of lexical analyzers and rigid parameters. Still, the rules of good structure are surprisingly constant across disciplines. You don’t want more parts than you need; the parts you do have should be arranged in a logical form; and endless tinkering is usually required before the result has the necessary balance and beauty. And for the most part, the underlying work ought to remain invisible. The structure of a good piece of fiction is something like the structure of a comfortable chair. You don’t necessarily want to think about it while you’re in it, but if the structure has been properly conceived, your brain, or your rear end, will thank you.
In recent weeks, I’ve been lucky enough to read two enjoyable pieces of structure porn. The first is John McPhee’s New Yorker essay on the structure of narrative nonfiction; the second is Aaron Hamburger’s piece in the New York Times on outlining in reverse. McPhee’s article goes into his methods in great, sometimes laborious detail, and there’s something delightful in hearing him sing the praises of his outlining and text editing software. His tools may be computerized, but they only allow him to streamline what he’d always done with a typewriter and scissors:
After reading and rereading the typed notes and then developing the structure and then coding the notes accordingly in the margins and then photocopying the whole of it, I would go at the copied set with the scissors, cutting each sheet into slivers of varying size…One after another, in the course of writing, I would spill out the sets of slivers, arrange them ladderlike on a card table, and refer to them as I manipulated the Underwood.
Regular readers will know that this is the kind of thing I love. Accounts of how a book is written tend to dwell on personal gossip or poetic inspiration, and while such stories can be charming or encouraging, as a working writer, I’d much rather hear more about those slivers of paper.
And the reason I love them so much is that they get close to the heart of writing as a profession, which has surprising affinities with more technical or mechanical trades. Writing a novel, in particular, hinges partially on a few eureka moments, but it also presents daunting organizational and logistical challenges. A huge amount of material needs to be kept under control, and a writer’s brain just isn’t large or flexible enough to handle it all at once. Every author develops his or her own strategies for corralling ideas, and for most of us, it boils down to taking good notes, which I’ve compared elsewhere to messages that I’ve left, à la Memento, for my future self to rediscover. By putting our thoughts on paper—or, as McPhee does, in a computerized database—we make them easier to sort and retrieve. It looks like little more than bookkeeping, but it liberates us. McPhee says it better than I ever could: “If this sounds mechanical, the effect was absolutely the reverse…The procedure eliminated all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.”
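And since writing really does have affinities with the technical trades, here’s a minimal sketch, in Python, of the bookkeeping McPhee describes: notes tagged with structural codes, then spilled out one set at a time. The codes and note texts below are invented for illustration; McPhee, of course, did all of this with typed sheets, marginal codes, and scissors.

```python
# A toy version of McPhee's note-coding: tag each sliver with a
# structural code, then pull out one section's worth at a time.
# The codes and notes here are hypothetical, not McPhee's own.
from collections import defaultdict

notes = [
    ("RAPIDS", "The canoe pivots in the eddy below the ledge."),
    ("GEOLOGY", "Schist and granite interleave along the gorge wall."),
    ("RAPIDS", "The guide laughs before describing the drop."),
    ("PORTAGE", "Black flies thicken at the trailhead."),
]

def spill(notes, code):
    """Return every sliver filed under one structural code."""
    by_code = defaultdict(list)
    for c, text in notes:
        by_code[c].append(text)
    return by_code[code]

# Spread a single section on the card table and ignore the rest.
for sliver in spill(notes, "RAPIDS"):
    print(sliver)
```

The payoff is exactly the corner-painting McPhee mentions: once the notes are coded, only one section’s material is in front of you on a given day.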
This kind of organization can also take place closer to the end of the project, as Hamburger notes in his Times piece. Hamburger says that he dislikes using outlines to plan a writing project, and prefers to work more organically, but also observes that it can be useful to view the resulting material with a more objective, even mathematical eye. What he describes is similar to what I’ve called writing by numbers: you break the story down into individual scenes, count the pages or paragraphs, and see how each piece fits in with the shape of the story as a whole. Such an analysis often reveals hidden weaknesses or asymmetries, and the solution can be as simple as the ten percent rule:
In [some] stories, I found that most of the scenes were roughly equal in length, and so cutting became as easy as an across-the-board budget cut. I dared myself to try to cut ten percent from each scene, and then assessed what was left. Happily, I didn’t always achieve my goal—because let’s face it, writing is not math and never should be. Yet what I learned about my story along the way proved invaluable.
I agree with this wholeheartedly, with one caveat: I believe that writing often is math, although not exclusively, and only as a necessary prop for emotion and intuition. Getting good ideas, as every writer knows, is the easy part. It’s the structure that makes them dance.
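And since writing often is math, here’s a minimal sketch, again in Python, of the across-the-board cut that Hamburger describes. The scene names and paragraph counts are invented, and the real work of revision obviously happens in the prose, not in the arithmetic:

```python
# The ten percent rule as arithmetic: break the story into scenes,
# count the paragraphs, and set a uniform target for trimming.
# Scene names and counts are invented for illustration.
scenes = {"arrival": 40, "dinner party": 55, "quarrel": 38, "departure": 47}

CUT = 0.10  # dare yourself to trim ten percent from every scene

for name, paragraphs in scenes.items():
    target = round(paragraphs * (1 - CUT))
    print(f"{name}: {paragraphs} -> {target} paragraphs")

total = sum(scenes.values())
print(f"whole story: {total} -> {round(total * (1 - CUT))} paragraphs")
```

As Hamburger says, missing the target is fine; the census itself is what teaches you about the story.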
John McPhee’s advice to young writers
The writing impulse seeks its own level and isn’t always given a chance to find it. You can’t make up your mind in Comp Lit class that you’re going to be a Russian novelist. Or even an American novelist. Or a poet. Young writers find out what kinds of writers they are by experiment. If they choose from the outset to practice exclusively a form of writing because it is praised in the classroom or otherwise carries appealing prestige, they are vastly increasing the risk inherent in taking up writing in the first place. It is so easy to misjudge yourself and get stuck in the wrong genre. You avoid that, early on, by writing in every genre. If you are telling yourself you’re a poet, write poems. Write a lot of poems. If fewer than one work out, throw them all away; you’re not a poet. Maybe you’re a novelist. You won’t know until you have written several novels…
I have always thought that Ben Jonson must have had young writers in mind when he said, “Though a man be more prone and able for one kind of writing than another, yet he must exercise all.” Gender aside, I take that to be a message to young writers.