Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘John McPhee’

The stuff of thought

On December 4, 1972, the ocean liner SS Statendam sailed from New York to Florida, where its passengers would witness the launch of Apollo 17, the final manned mission to the moon. The guests on the cruise included Isaac Asimov, Robert A. Heinlein, Frederik Pohl, Theodore Sturgeon, Norman Mailer, Katherine Anne Porter, and the newscaster Hugh Downs. It’s quite a story, and I’ve written about it elsewhere at length. What I’d like to highlight today, though, is what was happening a few miles away on shore, as Tom Wolfe recounts in the introduction to the paperback edition of The Right Stuff:

This book grew out of some ordinary curiosity. What is it, I wondered, that makes a man willing to sit up on top of an enormous Roman candle, such as a Redstone, Atlas, Titan, or Saturn rocket, and wait for someone to light the fuse? I decided on the simplest approach possible. I would ask a few astronauts and find out. So I asked a few in December of 1972 when they gathered at Cape Canaveral to watch the last mission to the moon, Apollo 17. I discovered quickly enough that none of them, no matter how talkative otherwise, was about to answer the question or even linger for more than a few seconds on the subject at the heart of it, which is to say, courage.

Wolfe’s “ordinary curiosity” led him to tackle a project that would consume him for the better part of a decade, driven by his discovery of “a rich and fabulous terrain that, in a literary sense, had remained as dark as the far side of the moon for more than half a century: military flying and the modern American officer corps.”

And my mind sometimes turns to the contrast between Wolfe, trying to get the astronauts to open up about their experiences, and the writers aboard the Statendam. You had Mailer, of course, who had written his own book on the moon, and the result was often extraordinary. It was more about Mailer himself than anything else, though, and during the cruise, he seemed more interested in laying out his theory of the thanatosphere, an invisible region around the moon populated by the spirits of the dead. Then you had such science fiction writers as Heinlein and Asimov, who would occasionally cross paths with real astronauts, but whose fiction was shaped by assumptions about the competent man that had been formed decades earlier. Wolfe decided to go to the source, but even he kept the pulps at the back of his mind. In his introduction, speaking of the trend in military fiction after World War I, he observes:

The only proper protagonist for a tale of war was an enlisted man, and he was to be presented not as a hero but as Everyman, as much a victim of war as any civilian. Any officer above the rank of second lieutenant was to be presented as a martinet or a fool, if not an outright villain, no matter whom he fought for. The old-fashioned tale of prowess and heroism was relegated to second- and third-rate forms of literature, ghostwritten autobiographies, and stories in pulp magazines on the order of Argosy and Bluebook.

Wolfe adds: “Even as late as the 1930s the favorite war stories in the pulps concerned World War I pilots.” And it was to pursue “the drama and psychology” of this mysterious courage in the real world that he wrote The Right Stuff.

The result is a lasting work of literary journalism, as well as one of the most entertaining books ever written, and we owe it to the combination of Wolfe’s instinctive nose for a story and his obsessiveness in following it diligently for years. Last year, in a review of John McPhee’s new collection of essays, Malcolm Harris said dryly: “I would recommend Draft No. 4 to writers and anyone interested in writing, but no one should use it as a professional guide uncritically or they’re liable to starve.” You could say much the same about Wolfe, who looks a lot like the kind of journalist we aren’t likely to see again, in part because the market has changed, but also because this kind of luck can be hard for anyone to sustain over the course of a career. Wolfe hit the jackpot on multiple occasions, but he also spent years on books that nobody read—Back to Blood, his last novel, cost its publisher a hundred dollars for every copy that it sold. (Toward the end, he could even seem out of his depth. It probably isn’t a coincidence that I never read I Am Charlotte Simmons, a novel about “Harvard, Yale, Princeton, Stanford, Duke, and a few other places all rolled into one” that was published a few years after I graduated from college. Wolfe’s insights into undergraduate life, delivered with his customary breathlessness, didn’t seem useful for understanding an experience that I had just undergone, and I’ve never forgotten the critic who suggested that the novel should have been titled I Am Easily Impressed.)

But that’s also the kind of risk required to produce major work. Wolfe’s movement from nonfiction to novels still feels like a loss, and I think that it deprived us of two or three big books of the kind that he could write better than anyone else. (It’s too bad that he never wrote anything about science fiction, which is a subject that could only be grasped by the kind of writer who could produce both The Right Stuff and The Electric Kool-Aid Acid Test.) Yet it isn’t always the monumental achievements that matter. In fact, when I think of what Wolfe has meant to me, it’s his offhand critical comments that have stuck in my head. The short introduction that he wrote to a collection of James M. Cain’s novels, in which he justifiably praised Cain’s “momentum,” has probably had a greater influence on my own style—or at least my aspirations for it—than any other single piece of criticism. His description of Umberto Eco as “a very good example of a writer who leads dozens of young writers into a literary cul-de-sac” is one that I’ll always remember, mostly because he might have been speaking of me. In college, I saw him give a reading once, shortly before the release of the collection Hooking Up. I was struck by his famous white suit, of course, but what I’ll never forget is the moment, just before he began to read, when he reached into his inside pocket and produced a pair of reading glasses—also spotlessly white. It was a perfect punchline, with the touch of the practiced showman, and it endeared Wolfe to me at times when I grew tired of his style and opinions. His voice and his ambition inspired many imitators, but at his best, it was the small stuff that set him apart.

Checks and balances

About a third of the way through my upcoming book, while discussing the May 1941 issue of Astounding Science Fiction, I include the sentence: “The issue also featured Heinlein’s ‘Universe,’ which was based on Campbell’s premise about a lost generation starship.” My copy editor amended this to “a lost-generation starship,” to which I replied: “This isn’t a ‘lost-generation’ starship, but a generation starship that happens to be lost.” And the exchange gave me a pretty good idea for a story that I’ll probably never write. (I don’t really have a plot for it yet, but it would be about Hemingway and Fitzgerald on a trip to Alpha Centauri, and it would be called The Double Sun Also Rises.) But it also reminded me of one of the benefits of a copy edit, which is its unparalleled combination of intense scrutiny and total detachment. I sent drafts of the manuscript to some of the world’s greatest nitpickers, who saved me from horrendous mistakes, and the result wouldn’t be nearly as good without their advice. But there’s also something to be said for engaging the services of a diligent reader who doesn’t have any connection to the subject. I deliberately sought out feedback from a few people who weren’t science fiction fans, just to make sure that it remained accessible to a wider audience. And the ultimate example is the copy editor, who is retained to provide an impartial consideration of every semicolon without any preconceived notions outside the text. It’s what Heinlein might have had in mind when he invented the Fair Witness, who said when asked about the color of a nearby house: “It’s white on this side.”

But copy editors are human beings, not machines, and they occasionally get their moment in the spotlight. Recently, their primary platform has been The New Yorker, which has been quietly highlighting the work of its copy editors and fact checkers over the last few years. We can trace this tendency back to Between You & Me, a memoir by Mary Norris that drew overdue attention to the craft of copy editing. In “Holy Writ,” a delightful excerpt in the magazine, Norris writes of the supposed objectivity and rigor of her profession: “The popular image of the copy editor is of someone who favors rigid consistency. I don’t usually think of myself that way. But, when pressed, I do find I have strong views about commas.” And she says of their famous detachment:

There is a fancy word for “going beyond your province”: “ultracrepidate.” So much of copy editing is about not going beyond your province. Anti-ultracrepidationism. Writers might think we’re applying rules and sticking it to their prose in order to make it fit some standard, but just as often we’re backing off, making exceptions, or at least trying to find a balance between doing too much and doing too little. A lot of the decisions you have to make as a copy editor are subjective. For instance, an issue that comes up all the time, whether to use “that” or “which,” depends on what the writer means. It’s interpretive, not mechanical—though the answer often boils down to an implicit understanding of commas.

In order to be truly objective, in other words, you have to be a little subjective. Which is equally true of writing as a whole.

You could say much the same of the fact checker, who resembles the copy editor’s equally obsessive cousin. As a rule, books aren’t fact-checked, which is a point that we only seem to remember when the system breaks down. (Astounding was given a legal read, but I was mostly on my own when it came to everything else, and I’m grateful that some of the most potentially contentious material—about L. Ron Hubbard’s writing career—drew on an earlier article that was brilliantly checked by Matthew Giles of Longreads.) As John McPhee recently wrote of the profession:

Any error is everlasting. As Sara [Lippincott] told the journalism students, once an error gets into print it “will live on and on in libraries carefully catalogued, scrupulously indexed…silicon-chipped, deceiving researcher after researcher down through the ages, all of whom will make new errors on the strength of the original errors, and so on and on into an exponential explosion of errata.” With drawn sword, the fact-checker stands at the near end of this bridge. It is, in part, why the job exists and why, in Sara’s words, a publication will believe in “turning a pack of professional skeptics loose on its own galley proofs.”

McPhee continues: “Book publishers prefer to regard fact-checking as the responsibility of authors, which, contractually, comes down to a simple matter of who doesn’t pay for what. If material that has appeared in a fact-checked magazine reappears in a book, the author is not the only beneficiary of the checker’s work. The book publisher has won a free ticket to factual respectability.” And its absence from the publishing process feels like an odd evolutionary vestige of the book industry that ought to be fixed.

As a result of such tributes, the copy editors and fact checkers of The New Yorker have become cultural icons in themselves, and when an error does make it through, it can be mildly shocking. (Last month, the original version of a review by Adam Gopnik casually stated that Andrew Lloyd Webber was the composer of Chess, and although I knew perfectly well that this was wrong, I had to look it up to make sure that I hadn’t strayed over into a parallel universe.) And their emergence at this particular moment may not be an accident. The first installment of “Holy Writ” appeared on February 23, 2015, just a few months before Donald Trump announced that he was running for president, plunging us all into a world in which good grammar and factual accuracy can seem less like matters of common decency than obstacles to be obliterated. Even though the timing was a coincidence, it’s tempting to read our growing appreciation for these unsung heroes as a statement about the importance of the truth itself. As Alyssa Rosenberg writes in the Washington Post:

It’s not surprising that one of the persistent jokes from the Trump era is the suggestion that we’re living in a bad piece of fiction…Pretending we’re all minor characters in a work of fiction can be a way of distancing ourselves from the seeming horror of our time or emphasizing our own feelings of powerlessness, and pointing to “the writers” often helps us deny any responsibility we may have for Trump, whether as voters or as journalists who covered the election. But whatever else we’re doing when we joke about Trump and the swirl of chaos around him as fiction, we’re expressing a wish that this moment will resolve in a narratively and morally comprehensible fashion.

Perhaps we’re also hoping that reality itself will have a fact checker after all, and that the result will make a difference. We don’t know if it will yet. But I’m hopeful that we’ll survive the exponential explosion of errata.

Life on the last mile

In telecommunications, there’s a concept called “the last mile,” which holds that the final leg of a network—the one that actually reaches the user’s home, school, or office—is the most difficult and expensive to build. It’s one thing to construct a massive trunkline, which is basically a huge but relatively straightforward feat of engineering, and quite another to deal with the tangle of equipment, wiring, and specifications on the level of thousands of individual households. More recently, the concept has been extended to public transportation, delivery and distribution services, and other fields that depend on connecting an industrial operation on the largest imaginable scale with specific situations on the retail side. (For instance, Amazon has been trying to cross the last mile through everything from its acquisition of Whole Foods to drone delivery, and the fact that these are seen as alternative approaches to the same problem points to how complicated it really is.) This isn’t just a matter of infrastructure, either, but of the difficulties inherent to any system in which a single pipeline has to split into many smaller branches, whether it’s carrying blood, water, mail, or data. Ninety percent of the wiring can be in that last mile, and success lies less in any overall principles than in the irritating particulars. It has to be solved on the ground, rather than in a design document, and you’ll never be able to anticipate all of the obstacles that you’ll face once those connections start to multiply. It’s literally about the ramifications.

I often feel the same way when it comes to writing. When I think back on how I’ve grown as a writer over the last decade or so, I see clear signs of progress. Thanks mostly to the guidelines that David Mamet presents in On Directing Film, it’s much easier for me to write a decent first draft than it was when I began. I rarely leave anything unfinished; I know how to outline and how to cut; and I’m unlikely to make any huge technical mistakes. In his book Which Lie Did I Tell?, William Goldman says something similar about screenwriting:

Stephen Sondheim once said this: “I cannot write a bad song. You begin it here, build, end there. The words will lay properly on the music so they can be sung, that kind of thing. You may hate it, but it will be a proper song.” I sometimes feel that way about my screenplays. I’ve been doing them for so long now, and I’ve attempted most genres. I know about entering the story as late as possible, entering each scene as late as possible, that kind of thing. You may hate it, but it will be a proper screenplay.

Craft, in other words, can take you most of the way—but it’s the final leg that kills you. As Goldman concludes of his initial pass on the script for Absolute Power: “This first draft was proper as hell—you just didn’t give a shit.” And sooner or later, most writers find that they spend most of their time on that last mile.

Like most other art forms, creative writing can indeed be taught—but only to the point that it still resembles an engineering problem. There are a few basic tricks of structure and technique that will improve almost anyone’s work, much like the skills that you learn in art books like Drawing on the Right Side of the Brain, and that kind of advancement can be enormously satisfying. When it comes to the last mile between you and your desired result, however, many of the rules start to seem useless. You aren’t dealing with the general principles that have gotten you this far, but with problems that arise on the level of individual words or sentences, each one of which needs to be tackled on its own. There’s no way of knowing whether or not you’ve made the right choice until you’ve looked at them all in a row, and even if something seems wrong, you may not know how to fix it. The comforting shape of the outline, which can be assembled in a reasonably logical fashion, is replaced by the chaos of the text, and the fact that you’ve done good work on this level before is no guarantee that you can do it right now. I’ve learned a lot about writing over the years, but to the extent that I’m not yet the writer that I want to be, it lies almost entirely in that last mile, where the ideal remains tantalizingly out of reach.

As a result, I end up revising endlessly, even at a late stage, and although the draft always gets better, it never reaches perfection. After a while, you have to decide that it’s as good as it’s going to get, and then move on to something else—which is why it helps to have a deadline. But you can take comfort in the fact that the last mile affects even the best of us. In a recent New York Times profile of the playwright Tony Kushner, Charles McGrath writes:

What makes Angels in America so complicated to stage is not just Mr. Kushner’s need to supervise everything, but that Perestroika, the second part, is to a certain extent a work in progress and may always be. The first part, Millennium Approaches, was already up and running in the spring of 1991, when, with a deadline looming, Mr. Kushner retreated to a cabin in Northern California and wrote most of Perestroika in a feverish eight-day stint, hardly sleeping and living on junk food. He has been tinkering with it ever since…Even during rehearsal last month he was still cutting, rewriting, restructuring.

If Tony Kushner is still revising Angels in America, it makes me feel a little better about spending my life on that last mile. Or as John McPhee says about knowing when to stop: “What I know is that I can’t do any better; someone else might do better, but that’s all I can do; so I call it done.”

This post has no title

In John McPhee’s excellent new book on writing, Draft No. 4, which I mentioned here the other day, he shares an anecdote about his famous profile of the basketball player Bill Bradley. McPhee was going over a draft with William Shawn, the editor of The New Yorker, “talking three-two zones, blind passes, reverse pivots, and the setting of picks,” when he realized that he had overlooked something important:

For some reason—nerves, what else?—I had forgotten to find a title before submitting the piece. Editors of every ilk seem to think that titles are their prerogative—that they can buy a piece, cut the title off the top, and lay on one of their own. When I was young, this turned my skin pink and caused horripilation. I should add that I encountered such editors almost wholly at magazines other than The New Yorker—Vogue, Holiday, the Saturday Evening Post. The title is an integral part of a piece of writing, and one of the most important parts, and ought not to be written by anyone but the writer of what follows the title. Editors’ habit of replacing an author’s title with one of their own is like a photo of a tourist’s head on the cardboard body of Mao Zedong. But the title missing on the Bill Bradley piece was my oversight. I put no title on the manuscript. Shawn did. He hunted around in the text and found six words spoken by the subject, and when I saw the first New Yorker proof the piece was called “A Sense of Where You Are.”

The dynamic that McPhee describes at other publications still exists today—I’ve occasionally bristled at the titles that have appeared over the articles that I’ve written, which is a small part of the reason that I’ve moved most of my nonfiction onto this blog. (The freelance market also isn’t what it used to be, but that’s a subject for another post.) But a more insidious factor has invaded even the august halls of The New Yorker, and it has nothing to do with the preferences of any particular editor. Opening the most recent issue, for instance, I see that there’s an article by Jia Tolentino titled “Safer Spaces.” On the magazine’s website, it becomes “Is There a Smarter Way to Think About Sexual Assault on Campus?”, with a line at the bottom noting that it appears in the print edition under its alternate title. Joshua Rothman’s “Jambusters” becomes “Why Paper Jams Persist.” A huge piece by David Grann, “The White Darkness,” which seems destined to get optioned for the movies, earns slightly more privileged treatment, and it merely turns into “The White Darkness: A Journey Across Antarctica.” But that’s the exception. When I go back to the previous issue, I find that the same pattern holds true. Michael Chabon’s “The Recipe for Life” is spared, but David Owen’s “The Happiness Button” is retitled “Customer Satisfaction at the Push of a Button,” Rachel Aviv’s “The Death Debate” becomes “What Does It Mean to Die?”, and Ian Frazier’s “Airborne” becomes “The Trippy, High-Speed World of Drone Racing.” Which suggests to me that if McPhee’s piece appeared online today, it would be titled something like “Basketball Player Bill Bradley’s Sense of Where He Is.” And that’s if he were lucky.

The reasoning here isn’t a mystery. Headlines are written these days to maximize clicks and shares, and The New Yorker isn’t immune, even if it sometimes raises an eyebrow. Back in 2014, Maria Konnikova wrote an article for the magazine’s website titled “The Six Things That Make Stories Go Viral Will Amaze, and Maybe Infuriate, You,” in which she explained one aspect of the formula for online headlines: “The presence of a memory-inducing trigger is also important. We share what we’re thinking about—and we think about the things we can remember.” Viral headlines can’t be allusive, make a clever play on words, or depend on an evocative reference—they have to spell everything out. (To build on McPhee’s analogy, it’s less like a tourist’s face on the cardboard body of Mao Zedong than an oversized foam head of Mao himself.) A year later, The New Yorker ran an article by Andrew Marantz on the virality expert Emerson Spartz, and it amazed and maybe infuriated me. I’ve written about this profile elsewhere, but looking it over again now, my eye was caught by these lines:

Much of the company’s success online can be attributed to a proprietary algorithm that it has developed for “headline testing”—a practice that has become standard in the virality industry…Spartz’s algorithm measures which headline is attracting clicks most quickly, and after a few hours, when a statistically significant threshold is reached, the “winning” headline automatically supplants all others. “I’m really, really good at writing headlines,” he told me.

And it’s worth noting that while Marantz’s piece appeared in print as “The Virologist,” in an online search, it pops up as “King of Clickbait.” Even as the magazine gently mocked Spartz, it took his example to heart.
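
Out of curiosity, the mechanics behind this kind of headline testing are easy enough to sketch. What follows is only a toy version of the general statistical approach—a standard two-proportion z-test with invented click counts—and not Spartz’s proprietary algorithm, whose details were never disclosed:

```python
import math

def z_test_headlines(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is headline A's click rate significantly
    higher (or lower) than headline B's?"""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_a - p_b) / se

# Invented numbers: two headlines for the same story after a few hours.
z = z_test_headlines(clicks_a=310, views_a=10_000, clicks_b=240, views_b=10_000)
if abs(z) > 1.96:  # roughly the 5% significance threshold
    print("Winner:", "A" if z > 0 else "B")  # the winner supplants the other
else:
    print("No statistically significant winner yet; keep testing.")
```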

None of this is exactly scandalous, but when you think of a title as “an integral part of a piece of writing,” as McPhee does, it’s undeniably sad. There isn’t any one title for an article anymore, and most readers will probably only see its online incarnation. And this isn’t because of an editor’s tastes, but the result of an impersonal set of assumptions imposed on the entire industry. Emerson Spartz got his revenge on The New Yorker—he effectively ended up writing its headlines. And while I can’t blame any media company for doing whatever it can to stay viable, it’s also a real loss. McPhee is right when he says that selecting a title is an important part of the process, and in a perfect world, it would be left up to the writer. (It can even lead to valuable insights in itself. When I was working on my article on the fiction of L. Ron Hubbard, I was casting about randomly for a title when I came up with “Xenu’s Paradox.” I didn’t know what it meant, but it led me to start thinking about the paradoxical aspects of Hubbard’s career, and the result was a line of argument that ended up being integral not just to the article, but to the ensuing book. And I was amazed when it survived intact on Longreads.) When you look at the grindingly literal, unpoetic headlines that currently populate the homepage of The New Yorker, it’s hard not to feel nostalgic for an era in which an editor might nudge a title in the opposite direction. In 1966, when McPhee delivered a long piece on oranges in Florida, William Shawn read it over, focused on a quotation from the poet Andrew Marvell, and called it “Golden Lamps in a Green Night.” McPhee protested, and the article was finally published under the title that he had originally wanted. It was called “Oranges.”

The fictional sentence

Of all the writers of the golden age of science fiction, the one who can be hardest to get your head around is A.E. van Vogt. He isn’t to everyone’s taste—many readers, to quote Alexei and Cory Panshin’s not unadmiring description, find him “foggy, semi-literate, pulpish, and dumb”—but he’s undoubtedly a major figure, and he was second only to Robert A. Heinlein and Isaac Asimov when it came to defining what science fiction became in the late thirties and early forties. (If he isn’t as well known as they are, it’s largely because he was taken out of writing by dianetics at the exact moment that the genre was breaking into the mainstream.) Part of his appeal is that his stories remain compelling and readable despite their borderline incoherence, and he was unusually open about his secret. In the essay “My Life Was My Best Science Fiction Story,” which was originally published in the volume Fantastic Lives, van Vogt wrote:

I learned to write by a system propounded in a book titled The Only Two Ways to Write a Story by John W. Gallishaw (meaning by flashback or in consecutive sequence). Gallishaw had made an in-depth study of successful stories by great authors. He observed that the best of them wrote in what he called “presentation units” of about eight hundred words. Each of these units contained five steps. And every sentence in it was a “fictional sentence.” Which means that it was written either with imagery, or emotion, or suspense, depending on the type of story.

So what did these units look like? Used copies of Gallishaw’s book currently go for well over a hundred dollars online, but van Vogt helpfully summarized the relevant information:

The five steps can be described as follows: 1) Where, and to whom, is it happening? 2) Make clear the scene purpose (What is the immediate problem which confronts the protagonist, and what does it require him to accomplish in this scene?) 3) The interaction with the opposition, as he tries to achieve the scene purpose. 4) Make the reader aware that he either did accomplish the scene purpose, or did not accomplish it. 5) In all the early scenes, whether protagonist did or did not succeed in the scene purpose, establish that things are going to get worse. Now, the next presentation unit-scene begins with: Where is all this taking place. Describe the surroundings, and to whom it is happening. And so forth.

Over the years, this formula was distorted and misunderstood, so that a critic could write something like “Van Vogt admits that he changes the direction of his plot every eight hundred words.” And even when accurately stated, it can come off as bizarre. Yet it’s really nothing more than the principle that every narrative should consist of a series of objectives, which I’ve elsewhere listed among the most useful pieces of writing advice that I know. Significantly, it’s one of the few elements of craft that can be taught and learned by example. Van Vogt learned it from Gallishaw, while I got it from David Mamet’s On Directing Film, and I’ve always seen it as a jewel of wisdom that can be passed in almost apostolic fashion from one writer to another.

When we read van Vogt’s stories, of course, we aren’t conscious of this structure, and if anything, we’re more aware of their apparent lack of form. (As John McPhee writes in his wonderful new book on writing: “Readers are not supposed to notice the structure. It is meant to be about as visible as someone’s bones.”) Yet we still keep reading. It’s that sequence of objectives that keeps us oriented through the centrifugal wildness that we associate with van Vogt’s work—and it shouldn’t come as a surprise that he approached the irrational side as systematically as he did everything else. I’d heard at some point that van Vogt based many of his plots on his dreams, but it wasn’t until I read his essay that I understood what this meant:

When you’re writing, as I was, for one cent a word, and are a slow writer, and the story keeps stopping for hours or days, and your rent is due, you get anxious…I would wake up spontaneously at night, anxious. But I wasn’t aware of the anxiety. I thought about story problems—that was all I noticed then. And so back to sleep I went. In the morning, often there would be an unusual solution. All my best plot twists came in this way…It was not until July 1943 that I suddenly realized what I was doing. That night I got out our alarm clock and moved into the spare bedroom. I set the alarm to ring at one and one-half hours. When it awakened me, I reset the alarm for another one and one-half hours, thought about the problems in the story I was working on—and fell asleep. I did that altogether four times during the night. And in the morning, there was the unusual solution, the strange plot twist…So I had my system for getting to my subconscious mind.

This isn’t all that different from Salvador Dali’s advice on how to take a nap. But the final sentence is the kicker: “During the next seven years, I awakened myself about three hundred nights a year four times a night.” When I read this, I felt a greater sense of kinship with van Vogt than I have with just about any other writer. Much of my life has been spent searching for tools—from mind maps to tarot cards—that can be used to systematically incorporate elements of chance and intuition into what is otherwise a highly structured process. Van Vogt’s approach comes as close as anything I’ve ever seen to the ideal of combining the two on a reliable basis, even if we differ on some of the details. (For instance, I don’t necessarily buy into Gallishaw’s notion that every action taken by the protagonist needs to be opposed, or that the situation needs to continually get worse. As Mamet writes in On Directing Film: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” And that’s often enough.) But it’s oddly appropriate that we find such rules in the work of a writer who frequently came across as chronically disorganized. Van Vogt pushed the limits of form further than any other author of the golden age, and it’s hard to imagine Alfred Bester or Philip K. Dick without him. But I’m sure that there were equally visionary writers who never made it into print because they lacked the discipline, or the technical tricks, to get their ideas under control. Van Vogt’s stories always seem on the verge of flying apart, but the real wonder is that they don’t. And his closing words on the subject are useful ones indeed: “It is well to point out again that these various systems were, at base, just automatic reactions to the writing of science fiction. The left side of the brain got an overdose of fantasizing flow from the right side, and literally had to do something real.”

How to be useful

In his recent review in The New Yorker of a new collection of short stories by Susan Sontag, the critic Tobi Haslett quotes its author’s explanation of why she wrote her classic book Illness as Metaphor: “I wanted to be useful.” I was struck enough by this statement to look up the full version, in which Sontag explains how she approached the literary challenge of addressing her own experience with cancer:

I didn’t think it would be useful—and I wanted to be useful—to tell yet one more story in the first person of how someone learned that she or he had cancer, wept, struggled, was comforted, suffered, took courage…though mine was also that story. A narrative, it seemed to me, would be less useful than an idea…And so I wrote my book, wrote it very quickly, spurred by evangelical zeal as well as anxiety about how much time I had left to do any living or writing in. My aim was to alleviate unnecessary suffering…My purpose was, above all, practical.

This is a remarkable way to look at any book, and it emerged both from Sontag’s own illness and from her awareness of her peculiar position in the culture of her time, as Haslett notes: “Slung between aesthetics and politics, beauty and justice, sensuous extravagance and leftist commitment, Sontag sometimes found herself contemplating the obliteration of her role as public advocate-cum-arbiter of taste. To be serious was to stake a belief in attention—but, in a world that demands action, could attention be enough?”

Sontag’s situation may seem remote from that of most authors, but it’s a problem that every author faces when he or she decides to tackle a book, which usually amounts to a call for attention over action. We write for all kinds of reasons, some more admirable than others, and selecting one idea or project over another comes down to prioritizing such factors as our personal interests, commercial potential, and what we want to think about for a year or more of our lives. But as time goes by, I’ve found that Sontag’s test—that the work be useful—is about as sensible a criterion as any. I’ve had good and bad luck in both cases, but as a rule, whenever I’ve tried to be useful to others, I’ve done well, and whenever I haven’t, I’ve failed. Being useful doesn’t necessarily mean providing practical information or advice, although that’s a fine reason to write a book, but rather writing something that would have value even if you weren’t the one whose name was on the cover, simply because it deserves to exist. You often don’t know until long after you start if a project meets that standard, and it might even be a mistake to consciously pursue it. The best approach, in the end, might simply be to develop a lot of ideas in the hope that some small fraction will survive. I still frequently write just for my own pleasure, out of personal vanity, or for the desire to see something in print, but it only lasts if the result is also useful, so it’s worth at least keeping it in mind as a kind of sieve for deciding between alternatives. As Lin-Manuel Miranda once put it to Grantland, in words that have never ceased to resound in my head: “What’s the thing that’s not in the world that should be in the world?”

One of my favorite examples is the writer Euell Gibbons, who otherwise might seem less like Susan Sontag than any human being imaginable. As John McPhee writes in a short reminiscence, “The Forager,” in the New York Times:

Euell had begun learning about wild and edible vegetation when he was a small boy in the Red River Valley. Later, in the dust‐bowl era, his family moved to central New Mexico. They lived in a semi‐dugout and almost starved there. His father left in a desperate search for work. The food supply diminished until all that was left were a few pinto beans and a single egg, which no one would eat. Euell, then teen‐aged and one of four children, took a knapsack one morning and left for the horizon mountains. He came back with puffball mushrooms, piñon nuts, and fruits of the yellow prickly pear. For nearly a month, the family lived wholly on what he provided, and he saved their lives. “Wild food has meant different things to me at different times,” he said to me once. “Right then it was a means of salvation, a way to keep from dying.”

In years that followed, Euell worked as a cowboy. He pulled cotton. He was for a long time a hobo. He worked in a shipyard. He combed beaches. The longest period during which he lived almost exclusively on wild food was five years. All the while, across decades, he wished to be a writer. He produced long pieces of fiction and he had no luck…He passed the age of fifty with virtually nothing published. He saw himself as a total failure, and he had no difficulty discerning that others tended to agree.

What happened next defied all expectation. McPhee writes: “Finally, after listening to the advice of a literary agent, he sat down to try to combine his interests. He knew his subject first- and second-hand; he knew it backward to the botanies of the tribes. And now he told everybody else how to gather and prepare wild food.” The result was the book Stalking the Wild Asparagus, which became the first in a bestselling series. At times, Gibbons didn’t seem to know how to handle his own success, as McPhee recalls:

He would live to be widely misassessed. His books gave him all the money he would ever need. The deep poverty of his other years was not forgotten, though, and he took to going around with a minimum of $1,500 in his pocket, because with any less there, he said, he felt insecure. Whatever he felt, it was enough to cause him, in his last years, to appear on television munching Grape-Nuts—hard crumbs ground from tough bread—and, in doing so, he obscured his accomplishments behind a veil of commercial personality. He became a household figure of a cartoon sort. People laughed when they heard his name. All too suddenly, he stood for what he did not stand for.

But Gibbons also deserves to be remembered as a man who finally understood and embraced the admonition that a writer be useful. McPhee concludes: “He was a man who knew the wild in a way that no one else in this time has even marginally approached. Having brought his knowledge to print, he died the writer he wished to be.” Gibbons and Sontag might not have had much in common—it’s difficult to imagine them even having a conversation—but they both confronted the same question: “What book should I write?” And we all might have better luck with the answer if we ask ourselves instead: “What have I done to survive?”

Writing with scissors

Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile of McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)

Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:

The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)
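
For what it’s worth, the batching step that Anderson describes is simple enough to express in a few lines of code. This is only a toy sketch of the underlying idea—gathering coded scraps into one batch per code—with invented codes and notes standing in for real reporting; McPhee’s actual tools were Kedit and his custom program Structur:

```python
from collections import defaultdict

# Invented scraps of "reporting," each tagged with a structural code.
notes = [
    ("RAPIDS", "The river narrowed and the water went white."),
    ("CAMP", "They pitched the tents above the gravel bar."),
    ("RAPIDS", "The raft spun once and shipped a wave."),
]

def batch_by_code(notes):
    """Gather every scrap under its structural code, preserving order."""
    batches = defaultdict(list)
    for code, scrap in notes:
        batches[code].append(scrap)
    return batches

# One organized batch per code, ready to be worked through sequentially.
for code, scraps in batch_by_code(notes).items():
    print(code)
    for scrap in scraps:
        print("  -", scrap)
```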

Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.

And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:

If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.

This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.
