Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Nicholson Baker’

Updike’s ladder

with one comment

Note: I’m taking the day off, so I’m republishing a post that originally appeared, in a slightly different form, on September 13, 2017.

Last year, the author Anjali Enjeti published an article in The Atlantic titled “Why I’m Still Trying to Get a Book Deal After Ten Years.” If just reading those words makes your palms sweat and puts your heart through a few sympathy palpitations, congratulations—you’re a writer. No matter where you might be in your career, or what length of time you mentally insert into that headline, you can probably relate to what Enjeti writes:

Ten years ago, while sitting at my computer in my sparsely furnished office, I sent my first email to a literary agent. The message included a query letter—a brief synopsis describing the personal-essay collection I’d been working on for the past six years, as well as a short bio about myself. As my third child kicked from inside my pregnant belly, I fantasized about what would come next: a request from the agent to see my book proposal, followed by a dream phone call offering me representation. If all went well, I’d be on my way to becoming a published author by the time my oldest child started first grade.

“Things didn’t go as planned,” Enjeti says dryly, noting that after landing and leaving two agents, she’s been left with six unpublished manuscripts and little else to show for it. She goes on to share the stories of other writers in the same situation, including Michael Bourne of Poets & Writers, who accurately calls the submission process “a slow mauling of my psyche.” And Enjeti wonders: “So after sixteen years of writing books and ten years of failing to find a publisher, why do I keep trying? I ask myself this every day.”

It’s a good question. As it happens, I first encountered her article while reading the authoritative biography Updike by Adam Begley, which chronicles a literary career that amounts to the exact opposite of the ones described above. Begley’s account of John Updike’s first acceptance from The New Yorker—just months after his graduation from Harvard—is like lifestyle porn for writers:

He never forgot the moment when he retrieved the envelope from the mailbox at the end of the drive, the same mailbox that had yielded so many rejection slips, both his and his mother’s: “I felt, standing and reading the good news in the midsummer pink dusk of the stony road beside a field of waving weeds, born as a professional writer.” To extend the metaphor…the actual labor was brief and painless: he passed from unpublished college student to valued contributor in less than two months.

If you’re a writer of any kind, you’re probably biting your hand right now. And I haven’t even gotten to what happened to Updike shortly afterward:

A letter from Katharine White [of The New Yorker] dated September 15, 1954 and addressed to “John H. Updike, General Delivery, Oxford,” proposed that he sign a “first-reading agreement,” a scheme devised for the “most valued and most constant contributors.” Up to this point, he had only one story accepted, along with some light verse. White acknowledged that it was “rather unusual” for the magazine to make this kind of offer to a contributor “of such short standing,” but she and Maxwell and Shawn took into consideration the volume of his submissions…and their overall quality and suitability, and decided that this clever, hard-working young man showed exceptional promise.

Updike was twenty-two years old. Even now, more than half a century later and with his early promise more than fulfilled, it’s hard to read this account without hating him a little. Norman Mailer—whose debut novel, The Naked and the Dead, appeared when he was twenty-five—didn’t pull any punches in “Some Children of the Goddess,” an essay on his contemporaries that was published in Esquire in 1963: “[Updike’s] reputation has traveled in convoy up the Avenue of the Establishment, The New York Times Book Review, blowing sirens like a motorcycle caravan, the professional muse of The New Yorker sitting in the Cadillac, membership cards to the right Fellowships in his pocket.” Even Begley, his biographer, acknowledges the singular nature of his subject’s rise:

It’s worth pausing here to marvel at the unrelieved smoothness of his professional path…Among the other twentieth-century American writers who made a splash before their thirtieth birthday…none piled up accomplishments in as orderly a fashion as Updike, or with as little fuss…This frictionless success has sometimes been held against him. His vast oeuvre materialized with suspiciously little visible effort. Where there’s no struggle, can there be real art? The Romantic notion of the tortured poet has left us with a mild prejudice against the idea of art produced in a calm, rational, workmanlike manner (as he put it, “on a healthy basis of regularity and avoidance of strain”), but that’s precisely how Updike got his start.

Begley doesn’t mention that the phrase “regularity and avoidance of strain” is actually meant to evoke the act of defecation, but even this provides us with an odd picture of writerly contentment. As Dick Hallorann says in The Shining, the best movie about writing ever made: “You got to keep regular, if you want to be happy.”

If there’s a larger theme here, it’s that the sheer productivity and variety of Updike’s career—with its reliable production of uniform hardcover editions over the course of five decades—are inseparable from the “orderly” circumstances of his rise. Updike never lacked a prestigious venue for his talents, which allowed him to focus on being prolific. Writers whose publication history remains volatile and unpredictable, even after they’ve seen print, don’t always have the luxury of being so unruffled, and it can affect their work in ways that are almost subliminal. (A writer can’t survive ten years of chasing after a book deal without spending the entire time convinced that he or she is on the verge of a breakthrough, anticipating an ending that never comes, which may partially account for the prevalence in literary fiction of frustration and unresolved narratives. It also explains why it helps to be privileged enough to fail for years.) The short answer to Begley’s question is that struggle is good for a writer, but so is success, and you take what you can get, even as you’re transformed by it. I think on a monthly basis of what Nicholson Baker writes of Updike in his tribute U and I:

I compared my awkward public self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!

We’re all on that ladder, including Enjeti, who I’m pleased to note finally scored her book deal—she has an essay collection in the works from the University of Georgia Press. Some are on their way up, some are headed down, and some are stuck for years on the same rung. But you never get anywhere if you don’t try to climb.

Updike’s ladder

with 2 comments

In the latest issue of The Atlantic, the author Anjali Enjeti has an article titled “Why I’m Still Trying to Get a Book Deal After Ten Years.” If just reading those words makes your palms sweat and puts your heart through a few sympathy palpitations, congratulations—you’re a writer. No matter where you might be in your career, or what length of time you can mentally insert into that headline, you can probably relate to Enjeti when she writes:

Ten years ago, while sitting at my computer in my sparsely furnished office, I sent my first email to a literary agent. The message included a query letter—a brief synopsis describing the personal-essay collection I’d been working on for the past six years, as well as a short bio about myself. As my third child kicked from inside my pregnant belly, I fantasized about what would come next: a request from the agent to see my book proposal, followed by a dream phone call offering me representation. If all went well, I’d be on my way to becoming a published author by the time my oldest child started first grade.

“Things didn’t go as planned,” Enjeti says drily, noting that after landing and leaving two agents, she’s been left with six unpublished manuscripts and little else to show for it. She goes on to share the stories of other writers in the same situation, including Michael Bourne of Poets & Writers, who accurately calls the submission process “a slow mauling of my psyche.” And Enjeti wonders: “So after sixteen years of writing books and ten years of failing to find a publisher, why do I keep trying? I ask myself this every day.”

It’s a good question. As it happens, I came across her article while reading the biography Updike by Adam Begley, which chronicles a literary career that amounts to the exact opposite of the ones described above. Begley’s account of John Updike’s first acceptance from The New Yorker—just months after his graduation from Harvard—is like lifestyle porn for writers:

He never forgot the moment when he retrieved the envelope from the mailbox at the end of the drive, the same mailbox that had yielded so many rejection slips, both his and his mother’s: “I felt, standing and reading the good news in the midsummer pink dusk of the stony road beside a field of waving weeds, born as a professional writer.” To extend the metaphor…the actual labor was brief and painless: he passed from unpublished college student to valued contributor in less than two months.

If you’re a writer of any kind, you’re probably biting your hand right now. And I haven’t even gotten to what happened to Updike shortly afterward:

A letter from Katharine White [of The New Yorker] dated September 15, 1954 and addressed to “John H. Updike, General Delivery, Oxford,” proposed that he sign a “first-reading agreement,” a scheme devised for the “most valued and most constant contributors.” Up to this point, he had only one story accepted, along with some light verse. White acknowledged that it was “rather unusual” for the magazine to make this kind of offer to a contributor “of such short standing,” but she and Maxwell and Shawn took into consideration the volume of his submissions…and their overall quality and suitability, and decided that this clever, hard-working young man showed exceptional promise.

Updike was twenty-two years old. Even now, more than half a century later and with his early promise more than fulfilled, it’s hard to read this account without hating him a little. Norman Mailer—whose debut novel, The Naked and the Dead, appeared when he was twenty-five—didn’t pull any punches in “Some Children of the Goddess,” an essay on his contemporaries that was published in Esquire in 1963: “[Updike’s] reputation has traveled in convoy up the Avenue of the Establishment, The New York Times Book Review, blowing sirens like a motorcycle caravan, the professional muse of The New Yorker sitting in the Cadillac, membership cards to the right Fellowships in his pocket.” And Begley, his biographer, acknowledges the singular nature of his subject’s rise:

It’s worth pausing here to marvel at the unrelieved smoothness of his professional path…Among the other twentieth-century American writers who made a splash before their thirtieth birthday…none piled up accomplishments in as orderly a fashion as Updike, or with as little fuss…This frictionless success has sometimes been held against him. His vast oeuvre materialized with suspiciously little visible effort. Where there’s no struggle, can there be real art? The Romantic notion of the tortured poet has left us with a mild prejudice against the idea of art produced in a calm, rational, workmanlike manner (as he put it, “on a healthy basis of regularity and avoidance of strain”), but that’s precisely how Updike got his start.

Begley doesn’t mention that the phrase “regularity and avoidance of strain” is actually meant to evoke the act of defecation, but even this provides us with an odd picture of writerly contentment. As Dick Hallorann says in The Shining, the best movie about writing ever made: “You got to keep regular, if you want to be happy.”

If there’s a larger theme here, it’s that the qualities that we associate with Updike’s career—with its reliable production of uniform hardcover editions over the course of five decades—are inseparable from the “orderly” circumstances of his rise. Updike never lacked a prestigious venue for his talents, which allowed him to focus on being productive. Writers whose publication history remains volatile and unpredictable, even after they’ve seen print, don’t always have the luxury of being so unruffled, and it can affect their work in ways that are almost subliminal. (A writer can’t survive ten years of waiting for a book deal without spending the entire time convinced that he or she is on the verge of a breakthrough, anticipating an ending that never comes, which may partially explain the literary world’s fondness for frustration and unresolved narratives.) The short answer to Begley’s question is that struggle is good for a writer, but so is success, and you take what you can get, even as you’re transformed by it. I seem to think on a monthly basis of what Nicholson Baker writes of Updike in his tribute U and I:

I compared my awkward public self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!

We’re all on that ladder. Some are on their way up, some are headed down, and some are stuck for years on the same rung. But you never get anywhere if you don’t try to climb.

The weight of lumber

with 4 comments

In my discussion yesterday of huge scholarly projects that expanded to take up the lives of their authors, I deliberately left out one name. Arnold J. Toynbee was a British historian and author of the twelve volumes of A Study of History, the most ambitious attempt to date at a universal theory of the rise and fall of civilizations. Toynbee has intrigued me for as long as I can remember, but he’s a little different from such superficially similar figures as Joseph Needham and Donald Knuth. For one thing, he actually finished his magnum opus, and even though it took decades, he more or less stuck to the plan of the work that he published in the first installment, which was an achievement in itself. He also differed from the other two in reaching a wide popular audience. Thousands of sets of his book were sold, and it became a bestseller in its two-volume abridgment by D.C. Somervell. It inspired countless essays and thick tomes of commentary, argument, and response—and then, strangely, it simply went away. Toynbee’s name has all but disappeared from mainstream and academic consideration, maybe because his ideas were too abstruse for one and too grandiose for the other, and if he’s recognized today at all, it’s probably because of the mysterious Toynbee tiles. (One possible successor is the psychohistory of the Foundation series, which has obvious affinities to his work, although Isaac Asimov played down the connection. He read the first half of A Study of History in 1944, borrowing the volumes one at a time from L. Sprague de Camp, and recalled: “There are some people who, on reading my Foundation series, are sure that it was influenced basically by Toynbee. They are only partly right. The first four stories were written before I had read Toynbee. ‘Dead Hand,’ however, was indeed influenced by it.”)

At the Newberry Library Book Fair last week, I hesitated over buying a complete set of Toynbee, and by the time I made up my mind and went back to get it, it was gone—which is the kind of mistake that can haunt me for the rest of my life. As a practical matter, though, I have all the Toynbee I’ll ever need: I already own the introductory volume of A Study of History and the Somervell abridgment, and it’s frankly hard to imagine reading anything else. But I did pick up the twelfth and last volume, Reconsiderations, published seven years after the rest, which might be the most interesting of them all. It’s basically Toynbee’s reply to his critics in over seven hundred pages of small type, in the hardcover equivalent of a writer responding to all the online comments on his work one by one. Toynbee seems to have read every review of his book, and he sets out to engage them all, including a miscellaneous section of over eighty pages simply called Ad Hominem. It’s a prickly, fascinating work that is probably more interesting than the books that inspired it, and one passage in particular caught my eye:

One of my critics has compared earlier volumes of this book to a “palace” in which “the rooms…are over-furnished to the point of resembling a dealer’s warehouse.” This reviewer must also be a thought-reader; for I have often thought of myself as a man moving old furniture about. For centuries these lovely things had been lying neglected in the lumber-rooms and attics. They had been piled in there higgledy-piggledy, in utter disorder, and had been crammed so tight that nobody could even squeeze his way in to look at them and find out whether they were of any value. In the course of ages they had been accumulating there—unwanted rejects from a score of country houses. This unworthy treatment of these precious pieces came to trouble me more and more; for I knew that they were not really junk; I knew that they were heirlooms, and these so rare and fine that they were not just provincial curiosities; they were the common heritage of anyone who had any capacity for appreciating beauty in Man’s handiwork.

In speaking of “lumber-rooms and attics,” Toynbee is harking back to a long literary tradition of comparing the mind itself to a lumber-room, which originally meant a spare room in a house full of unused furniture and other junk. I owe this knowledge to Nicholson Baker’s famous essay “Lumber,” reprinted in his collection The Size of Thoughts, in which he traces the phrase’s rise and fall, in a miniature version of what Toynbee tries to do for entire civilizations. Baker claims to have chosen the word “lumber” essentially at random, writing in his introduction: “Now feels like a good time to pick a word or a phrase, something short, and go after it, using the available equipment of intellectual retrieval, to see where we get…It should be representatively out of the way; it should have seen better days. Once or twice in the past it briefly enjoyed the status of a minor cliché, but now, for one reason or another, it is ignored or forgotten.” This might be a description of A Study of History itself—and yet, remarkably, Baker doesn’t mention the passage that I’ve quoted here. I assume that this is because he wasn’t aware of it, since it fits in beautifully with the rest of his argument. The dread of the mind becoming a lumber-room, crammed with useless odds and ends, is primarily a fear of intellectuals, as expressed by their patron saint Sherlock Holmes:

I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it. Now the skillful workman is very careful indeed as to what he takes into his brain-attic…It is a mistake to think that this little room has elastic walls and can distend to any extent.

Baker explains: “This is a form of the great scholarly worry—a worry which hydroptically book-thirsty poets like Donne, Johnson, Gray, Southey, and Coleridge all felt at times—the fear that too much learning will eventually turn even an original mind into a large, putty-colored regional storage facility of mislabeled and leaking chemical drums.”

Toynbee’s solution to the problem of mental lumber, like that of Needham and Knuth, was simply to pull it out of his brain and put it down on paper, even if it took three decades and twelve volumes. It’s hard not to be stirred by his description of his efforts:

At last I found that I could not bear this shocking situation any longer, so I set my own hand to a back-breaking job. I began to drag out the pieces, one by one, and to arrange them in the hall. I could not pretend to form a final judgement on the order in which they should be placed. Indeed, there never could be a final judgement on this, for a number of attractive different orders could be imagined, each of them the right order from some particular point of view. The first thing to be done was to get as many of the pieces as possible out into the open and to assemble them in some order or other. If once I had them parked down in the hall, I could see how they looked and could shift them and re-shift them at my leisure. Perhaps I should not have the leisure; perhaps the preliminary job of extracting these treasures from the lumber-rooms and attics would turn out to be as much as I could manage with my single pair of hands. If so, this would not matter; for there would be plenty of time afterwards for other people to rearrange the pieces, and, no doubt, they would be doing this again and again as they studied them more closely and came to know more about them than would ever be known by me.

It’s through arrangement and publication that lumber becomes precious again, and from personal experience, I know how hard it can be to relinquish information that has been laboriously brought to light. But part of the process is knowing when to stop. As Baker, a less systematic but equally provocative thinker, concludes:

I have poked through verbal burial mounds, I have overemphasized minor borrowings, I have placed myself deep in the debt of every accessible work of reference, and I have overquoted and overquibbled—of course I have: that is what always happens when you pay a visit to the longbeards’ dusty chamber…All the pages I have flipped and copied and underlined will turn gray again and pull back into the shadows, and have no bearing on one another. Lumber becomes treasure only temporarily, through study, and then it lapses into lumber again. Books open, and then they close.

A visit to the chainmaker

leave a comment »

In the landmark study The Symbolist Movement in Literature by the critic Arthur Symons, there’s a short chapter titled “A Note on Zola’s Method.” Even if you’ve never gotten around to reading Émile Zola—and I confess that I haven’t—it’s an essay that every writer should take to heart. After describing the research that Zola devoted to his novel L’Assommoir, Symons launches a brutal attack on the value of this kind of work:

[Zola] observes with immense persistence, but his observation, after all, is only that of the man in the street; it is simply carried into detail, deliberately…And so much of it all is purely unnecessary, has no interest in itself and no connection with the story: the precise details of Lorilleux’s chainmaking, bristling with technical terms…Goujet’s forge, and the machinery in the shed next door; and just how you cut out zinc with a large pair of scissors.

We’ve all read stories in which the writer feels obliged to include every last bit of research, and Symons’s judgment of this impulse is deservedly harsh:

To find out in a slang dictionary that a filthy idea can be expressed by an ingeniously filthy phrase…is not a great feat, or, on purely artistic grounds, altogether desirable. To go to a chainmaker and learn the trade name of the various kinds of chain which he manufactures, and of the instruments with which he manufactures them, is not an elaborate process, or one which can be said to pay you for the little trouble which it no doubt takes. And it is not well to be too certain after all that Zola is always perfectly accurate in his use of all this manifold knowledge.

And the most punishing comparison is yet to come: “My main contention is that Zola’s general use of words is, to be quite frank, somewhat ineffectual. He tries to do what Flaubert did, without Flaubert’s tools, and without the craftsman’s hand at the back of the tools. His fingers are too thick; they leave a blurred line. If you want merely weight, a certain kind of force, you get it; but no more.” It’s the difference, Symons observes, between the tedious accumulation of detail, in hopes that its sheer weight will somehow make the scene real, and the one perfect image that will ignite a reader’s imagination:

[Zola] cannot leave well alone; he cannot omit; he will not take the most obvious fact for granted…He tells us particularly that a room is composed of four walls, that a table stands on its four legs. And he does not appear to see the difference between doing that and doing as Flaubert does, namely, selecting precisely the detail out of all others which renders or consorts with the scene in hand, and giving that detail with an ingenious exactness.

By way of illustration, Symons quotes the moment in Madame Bovary in which Charles turns away at the exact moment that his first wife dies, which, he notes, “indicates to us, at the very opening of the book, just the character of the man about whom we are to read so much.” And he finishes with a devastating remark that deserves to be ranked alongside Mark Twain’s classic demolition of James Fenimore Cooper: “Zola would have taken at least two pages to say that, and, after all, he would not have said it.”

Flaubert, of course, is usually seen as the one shining example of a writer whose love of research enhanced his artistry, rather than diminishing it. In his takedown of a very different book, Allan Folsom’s thriller The Day After Tomorrow, the critic Anthony Lane cites one typical sentence—“Two hundred European cities have bus links with Frankfurt”—and adds:

When Flaubert studied ancient Carthage for Salammbô, or the particulars of medieval falconry for “The Legend of St. Julien Hospitalier,” he was furnishing and feathering a world that had already taken shape within his mind; when Allan Folsom looks at bus timetables, his book just gets a little longer.

Even Flaubert’s apparent mistakes, on closer examination, turn out to be controlled by an almost inhuman attentiveness. In his novel Flaubert’s Parrot, Julian Barnes quotes a line from the literary critic Enid Starkie: “Flaubert does not build up his characters, as did Balzac, by objective, external description; in fact, so careless is he of their outward appearance that on one occasion he gives Emma brown eyes; on another deep black eyes; and on another blue eyes.” When the narrator, who shouldn’t be confused with Barnes himself, goes back to the text, he finds that Flaubert, in fact, describes Emma’s eyes with meticulous precision. In their first appearance, he writes: “In so far as she was beautiful, this beauty lay in her eyes: although they were brown, they would appear black because of her lashes.” A little later on: “They were black when she was in shadow and dark blue in full daylight.” And just after her seduction, as Emma looks in the mirror: “Her eyes had never been so large, so black, nor contained such depth.” Barnes’s narrator concludes: “It would be interesting to compare the time spent by Flaubert making sure that his heroine had the rare and difficult eyes of a tragic adulteress with the time spent by Dr. Starkie in carelessly selling him short.”

This level of diligent observation is a universe apart from the mechanical gathering of detail, and there’s no question that writers should aim for the former, not the latter. But to some extent, we all pay visits to the chainmaker—that is, we conduct research aimed at furnishing our stories with material that we can’t get from personal experience. Sometimes we even get this information from books. (Tolstoy seems to have derived all of the information about the Freemasons in War and Peace from his reading, which scandalizes some critics, as if they’ve caught him in an embarrassing breach of etiquette.) If an author’s personality is strong enough, it can transmute this material into something more. John Updike turned this into a calling card, moving methodically through a series of adulterous white male protagonists who were distinguished mostly by their different jobs. In U and I, Nicholson Baker tries to call this a flaw: “He gives each of his male characters a profession, and then he has him think in metaphors drawn from that profession. That’s not right.” But after approvingly quoting one of the metaphors that emerge from the process, Baker changes his mind:

Without Updike’s determination to get some measure of control over his constant instinct to fling outward with a simile by filtering his correspondences through the characters’ offstage fictional professions, he would probably not have come up with this nice little thing, dropped as it is into the middle of a paragraph.

I like that phrase “measure of control,” which gets at the real point of research. It isn’t to pad out the story, but to channel it along lines that wouldn’t have occurred to the author otherwise. Research can turn into a set of chains in itself. But after all the work is done, the writer should be able to say, like Dylan Thomas in “Fern Hill”: “I sang in my chains like the sea.”

How to rest

with 4 comments

As a practical matter, there appears to be a limit to how long a novelist can work on any given day while still remaining productive. Anecdotally, the maximum effective period seems to fall somewhere in the range of four to six hours, which leaves some writers with a lot of time to kill. In a recent essay for The New Yorker, Gary Shteyngart writes:

I believe that a novelist should write for no more than four hours a day, after which returns truly diminish; this, of course, leaves many hours for idle play and contemplation. Usually, such a schedule results in alcoholism, but sometimes a hobby comes along, especially in middle age.

In Shteyngart’s case, the hobby took the form of a fascination with fine watches, to the point where he was spending thousands of dollars on his obsession every year. This isn’t a confession designed to elicit much sympathy from others—especially when he observes that spending $4,137.25 on a watch means throwing away “roughly 4.3 writing days”—but I’d like to believe that he chose a deliberately provocative symbol of wasted time. Most novelists have day jobs, with all their writing squeezed into the few spare moments that remain, so to say that writers have hours of idleness at their disposal, complete with that casual “of course,” implies an unthinking acceptance of a privilege that only a handful of authors ever attain. Shteyngart, I think, is smarter than this, and he may simply be using the luxury watch as an emblem of how precious each minute can be for writers for whom time itself hasn’t become devalued.

But let’s assume that you’re lucky enough to write for a living, and that your familial or social obligations are restricted enough to leave you with over half the day to spend as you see fit. What can you do with all those leisure hours? Alcoholism, as Shteyngart notes, is an attractive possibility, but perhaps you want to invest your time in an activity that enhances your professional life. Georg von Békésy, the Hungarian biophysicist, thought along similar lines, as his biographer Floyd Ratliff relates:

His first idea about how to excel as a scientist was simply to work hard and long hours, but he realized that his colleagues were working just as hard and just as long. So he decided instead to follow the old rule: sleep eight hours, work eight hours, and rest eight hours. But Békésy put a “Hungarian twist” on this, too. There are many ways to rest, and he reasoned that perhaps he could work in some way that would improve his judgment, and thus improve his work. The study of art, in which he already had a strong interest, seemed to offer this possibility…By turning his attention daily from science to art, Békésy refreshed his mind and sharpened his faculties.

This determination to turn even one’s free time into a form of self-improvement seems almost inhuman. (His “old rule” reminds me of the similar advice that Ursula K. LeGuin offers in The Left Hand of Darkness: “When action grows unprofitable, gather information; when information grows unprofitable, sleep.”) But I think that Békésy was also onto something when he sought out a hobby that provided a contrast to what he was doing for a living. A change, as the saying goes, is as good as a rest.

In fact, you could say that there are two types of hobbies, although they aren’t mutually exclusive. There are hobbies that are orthogonal to the rest of our lives, activating parts of the mind or personality that otherwise go unused, or providing a soothing mechanical respite from the nervous act of brainwork—think of Churchill and his bricklaying. Alternatively, they can channel our professional urges into a contained, orderly form that provides a kind of release. Ayn Rand, of all people, wrote perceptively about stamp collecting:

Stamp collecting is a hobby for busy, purposeful, ambitious people…because, in pattern, it has the essential elements of a career, but transposed to a clearly delimited, intensely private world…In stamp collecting, one experiences the rare pleasure of independent action without irrelevant burdens or impositions.

In my case, this blog amounts to a sort of hobby, and I keep at it for both reasons. It’s a form of writing, so it provides me with an outlet for those energies, but it also allows me to think about subjects that aren’t directly connected to my work. The process is oddly refreshing—I often feel more awake and alert after I’ve spent an hour writing a post, as if I’ve been practicing my scales on the piano—and it saves an hour from being wasted in unaccountable ways. This may be why many people are drawn to hobbies that leave you with a visible result in the end, whether it’s a blog post, a stamp collection, or a brick wall.

But there’s also something to be said for doing nothing. If you’ve devoted four hours—or whatever amount seems reasonable—to work that you love, you’ve earned the right to spend your remaining time however you like. As Sir Walter Scott wrote in a letter to a friend:

And long ere dinner time, I have
Full eight close pages wrote;
What, duty, hast thou now to crave?
Well done, Sir Walter Scott!

At the end of the day, I often feel like watching television, and the show I pick serves as an index to how tired I am. If I’m relatively energized, I can sit through a prestige drama; if I’m more drained, I’ll suggest a show along the lines of Riverdale; and if I can barely see straight, I’ll put on a special feature from my Lord of the Rings box set, which is my equivalent of comfort food. And you can see this impulse in far more illustrious careers. Ludwig Wittgenstein, who thought harder than anyone else of his century, liked to relax by watching cowboy movies. The degree to which he felt obliged to unplug is a measure of how much he drove himself, and in the absence of other vices, this was as good a way of decompressing as any. It prompted Nicholson Baker to write: “[Wittgenstein] would go every afternoon to watch gunfights and arrows through the chest for hours at a time. Can you take seriously a person’s theory of language when you know that he was delighted by the woodenness and tedium of cowboy movies?” To which I can only respond: “Absolutely.”

Written by nevalalee

April 5, 2017 at 9:36 am

The excerpt opinion

leave a comment »

“It’s the rare writer who cannot have sentences lifted from his work,” Norman Mailer once wrote. What he meant is that if a reviewer is eager to find something to mock, dismiss, or pick apart, any interesting book will provide plenty of ammunition. On a simple level of craft, it’s hard for most authors to sustain a high pitch of technical proficiency in every line, and if you want to make a novelist seem boring or ordinary, you can just focus on the sentences that fall between the high points. In his famously savage takedown of Thomas Harris’s Hannibal, Martin Amis quotes another reviewer who raved: “There is not a single ugly or dead sentence.” Amis then acidly observes:

Hannibal is a genre novel, and all genre novels contain dead sentences—unless you feel the throb of life in such periods as “Tommaso put the lid back on the cooler” or “Eric Pickford answered” or “Pazzi worked like a man possessed” or “Margot laughed in spite of herself” or “Bob Sneed broke the silence.”

Amis knows that this is a cheap shot, and he glories in it. But it isn’t so different from what critics do when they list the awful sentences from a current bestseller or nominate lines for the Bad Sex in Fiction Award. I laugh at this along with anyone else, but I also wince a little, because there are few authors alive who aren’t vulnerable to that sort of treatment. As G.K. Chesterton pointed out: “You could compile the worst book in the world entirely out of selected passages from the best writers in the world.”

This is even more true of authors who take considerable stylistic or thematic risks, which usually result in individual sentences that seem crazy or, worse, silly. The fear of seeming ridiculous is what prevents a lot of writers from taking chances, and it isn’t always unjustified. An ambitious novel opens itself up to savaging from all sides, precisely because it provides so much material that can be turned against the author when taken out of context. And it doesn’t need to be malicious, either: even objective or actively sympathetic critics can be seduced by the ease with which a writer can be excerpted to make a case. I’ve become increasingly daunted by the prospect of distilling the work of Robert A. Heinlein, for example, because his career was so long, varied, and often intentionally provocative that you can find sentences to support any argument about him that you want to make. (It doesn’t help that his politics evolved drastically over time, and they probably would have undergone several more transformations if he had lived for longer.) This isn’t to say that his opinions aren’t a fair target for criticism, but any reasonable understanding of who Heinlein was and what he believed—which I’m still trying to sort out for myself—can’t be conveyed by a handful of cherry-picked quotations. Literary biography is useful primarily to the extent that it can lay out a writer’s life in an orderly fashion, providing a frame that tells us something about the work that we wouldn’t know by encountering it out of order. But even that involves a process of selection, as does everything else about a biography. The biographer’s project isn’t essentially different from that of a working critic or reviewer: it just takes place on a larger scale.

And it’s worth noting that prolific critics themselves are particularly susceptible to this kind of treatment. When Renata Adler described Pauline Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless,” any devotee of Kael’s work had to disagree—but it was also impossible to deny that there was plenty of evidence for the prosecution. If you’re determined to hate Roger Ebert, you just have to search for the reviews in which his opinions, written on deadline, weren’t sufficiently in line with the conclusions reached by posterity, as when he unforgivably gave only three stars to The Godfather Part II. And there isn’t a single page in the work of David Thomson, who is probably the most interesting movie critic who ever lived, that couldn’t be mined for outrageous, idiotic, or infuriating statements. I still remember a review on The A.V. Club of How to Watch a Movie that quoted lines like this:

Tell me a story, we beg as children, while wanting so many other things. Story will put off sleep (or extinction) and the child’s organism hardly trusts the habit of waking yet.

And this:

You came into this book under deceptive promises (mine) and false hopes (yours). You believed we might make decisive progress in the matter of how to watch a movie. So be it, but this was a ruse to make you look at life.

The reviewer quoted these sentences as examples of the book’s deficiencies, and they were duly excoriated in the comments. But anyone who has really read Thomson knows that such statements are part of the package, and removing them would also deny most of what makes him so fun, perverse, and valuable.

So what’s a responsible reviewer to do? We could start, maybe, by quoting longer or complete sections, rather than sentences in isolation, and by providing more context when we offer up just a line or two. We can also respect an author’s feelings, explicit or otherwise, about what sections are actually important. In the passage I mentioned at the beginning of this post, which is about John Updike, Mailer goes on to quote a few sentences from Rabbit, Run, and he adds:

The first quotation is taken from the first five sentences of the book, the second is on the next-to-last page, and the third is nothing less than the last three sentences of the novel. The beginning and end of a novel are usually worked over. They are the index to taste in the writer.

That’s a pretty good rule, and it ensures that the critic is discussing something reasonably close to what the writer intended to say. Best of all, we can approach the problem of excerpting with a kind of joy in the hunt: the search for the slice of a work that will stand as a synecdoche of the whole. In the book U & I, which is also about Updike, Nicholson Baker writes about the “standardized ID phrase” and “the aphoristic consensus” and “the jingle we will have to fight past at some point in the future” to see a writer clearly again, just as fans of Joyce have to do their best to forget about “the ineluctable modality of the visible” and “yes I said yes I will Yes.” For a living author, that repository of familiar quotations is constantly in flux, and reviewers might approach their work with a greater sense of responsibility if they realized that they were playing a part in creating it—one tiny excerpt at a time.

Anthologies of interest

leave a comment »

If you really want to influence readers, don’t be an author—be an anthologist. Anthologies are among the earliest books that most of us read: the collections of fairy tales and poems we’re given as children, followed by the textbooks of stories we’re assigned in grade school, mark our first general exposure to literature of any kind, and all of those selections have been chosen for us by another human being, at least in theory. (These days, textbooks are more likely to be cobbled together by committee, drawing primarily on the work of their predecessors.) Later in life, when we pick up paperback anthologies for our own reading pleasure, it’s out of an unconscious desire to replicate or extend that education. The world of literature is so vast that it seems too large for any one reader to navigate alone. We depend on curators to cull it for us, singling out the essential nuggets from the disposable fluff that every healthy culture produces in such great quantities. The result, we hope, will be a sampling accurate enough to allow us to understand the whole, and for most of us, it comes to define it. But it’s really something else altogether. Even if we assume a perfect anthologist gifted enough to truly present us with “the best,” judging a culture or a genre from its masterpieces alone delivers a skewed picture. How much better was the best from the rest? Does it really reflect the experience of a reader at the time, who had to figure out what was good, bad, or mediocre without any assistance from the outside? As Nicholson Baker writes in his novel The Anthologist: “Anthology knowledge isn’t real knowledge. You have to read the unchosen poems to understand the chosen ones.”

I’ve been thinking about anthologies a lot recently, mostly because of the daunting amount of reading I need to do for Astounding. Obviously, I need to read as much as I can of the science fiction and fantasy that John W. Campbell, Isaac Asimov, Robert A. Heinlein, and L. Ron Hubbard wrote, and I expect to get pretty close to that goal by the time the book is finished. But what about the rest? I can’t make critical judgments about their work without a sense of what else was happening at the same time, and my reading up to this point in my life has been fannish but unsystematic, leaving me with considerable gaps in my understanding of the genre. As I’ve mentioned before, I have a complete collection of Astounding Science Fiction, but I can’t possibly read all of it, and it doesn’t even include what was going on in the other magazines. Predictably, then, I’ve turned to anthologies to fill in the blanks. Earlier this year, I put together a reading list for myself, drawing mostly on a shelf’s worth of classic short story collections. These include the three volumes of The Science Fiction Hall of Fame; The Astounding Science Fiction Anthology, which was edited by Campbell himself; The Astounding-Analog Reader; Analog’s Golden Anniversary Anthology; Analog Readers’ Choice; Adventures in Time and Space; The Road to Science Fiction; The Golden Age of Science Fiction; and various other “best of” lists and reader polls. The result is a list of nearly five hundred novels and stories, ranging in length from a few pages to massive tomes like Battlefield Earth, and at the moment, I’m about two thirds of the way through.

Of course, this approach has obvious limitations. It ends up focusing mostly on Astounding, at least through the early fifties, so it doesn’t tell me much about what was going on in Amazing or Thrilling Wonder or the countless other pulp magazines that once flooded the newsstands. There’s very little from before the golden age. It’s almost exclusively in the English language, and particularly from American authors—although I’m willing to accept this shortcoming, since it reflects the milieu in which my four major figures emerged. Stories of limited aesthetic interest but considerable historical significance, like Cleve Cartmill’s “Deadline,” tend to fall through the cracks. And the result probably doesn’t have much in common with the experience of a reader who was buying these magazines from one month to the next. But it’s a beginning, and in some ways, it’s better than it sounds. In trying to read these stories more or less in the order in which they appeared, I’m creating an alternate version of myself who was born in, say, 1920, and was exposed to science fiction at the age when I was most likely to be influenced by it. In practice, what I end up with isn’t so much the inner life of that bright twelve-year-old as the memories of that same reader thirty years down the line. Memory naturally filters what we read, leaving the stories that made the greatest impression on us at first glance, the ones that only gradually revealed their power, and a few that have stuck around for no discernible reason, aside from where we were in our lives when we first encountered them. And I’m hopeful that the subset of science fiction stories I’ve been reading will provide the same sort of background noise for the book I’m writing that my half-remembered reservoir of fiction does in my everyday life.

Needless to say, very little of what I’m reading now will end up explicitly in the book: given the nature of a work like this, I doubt I’ll have a chance to discuss more than a handful of stories that weren’t written by my central four authors. But I wouldn’t be doing this at all if I didn’t feel that the experience would change me, and how I think, in ways that will be reflected in every line. This is true even, or especially, if I forget much of what I read. In his story “Incest,” John Updike uses the phrase “vast, dying sea”—a description that Nicholson Baker quotes with approval in U & I—to evoke all the poetry that his main character has forgotten over the years. We all have a similar sea inside of us, collected and neglected by our internal anthologist, who operates when we aren’t aware of it. The anthologies we all carry in our brains differ markedly from one another, even more so than the tables of contents of the anthologies in print. (One of the nice things about the anthologies I’m reading is how little they overlap: only a few stories, like Asimov’s “Nightfall,” appear in more than two, which indicates how flexible, varied, and mutable the canon of science fiction really is.) An anthologist is the custodian of a genre’s past for the sake of the future: as time goes by, aside from a handful of books and authors that everyone is expected to read, anthologies are our only conduit for transmitting the memory of what a literature used to be, at least for the majority of readers. The same can be said of the reader’s own imperfect memory, which preserves, through a sort of memetic natural selection, the bits and pieces of the tradition that he or she needs. We can’t all be writers, or even perfect readers. But we’re all anthologists at heart.

Written by nevalalee

July 20, 2016 at 8:37 am

The power of the page

leave a comment »

Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 22, 2014.

Over the weekend, I found myself contemplating two very different figures from the history of American letters. The first is the bestselling nonfiction author Laura Hillenbrand, whose lifelong struggle with chronic fatigue syndrome compelled her to research and write Seabiscuit and Unbroken while remaining largely confined to her house for the last quarter of a century. (Wil S. Hylton’s piece on Hillenbrand in The New York Times Magazine is absolutely worth a read—it’s the best author profile I’ve seen in a long time.) The other is the inventor and engineer Buckminster Fuller, whose life was as itinerant as Hillenbrand’s is stationary. There’s a page in E.J. Applewhite’s Cosmic Fishing, his genial look at his collaboration with Fuller on the magnum opus Synergetics, that simply reprints Fuller’s travel schedule for a representative two weeks in March: he flies from Philadelphia to Denver to Minneapolis to Miami to Washington to Harrisburg to Toronto, attending conferences and giving talks, to the point where it’s hard to see how he found time to get anything else done. Writing a coherent book, in particular, seemed like the least of his concerns; as Applewhite notes, Fuller’s natural element was the seminar, which allowed him to spin complicated webs of ideas in real time for appreciative listeners, and one of the greatest challenges of producing Synergetics lay in harnessing that energy in a form that could be contained within two covers.

At first glance, Hillenbrand and Fuller might seem to have nothing in common. One is a meticulous journalist, historian, and storyteller; the other a prodigy of worldly activity who was often reluctant to put his ideas down in any systematic way. But if they meet anywhere, it’s on the printed page—and I mean this literally. Hylton’s profile of Hillenbrand is full of fascinating details, but my favorite passage describes how her constant vertigo has left her unable to study works on microfilm. Instead, she buys and reads original newspapers, which, in turn, has influenced the kinds of stories she tells:

Hillenbrand told me that when the newspaper arrived, she found herself engrossed in the trivia of the period—the classified ads, the gossip page, the size and tone of headlines. Because she was not hunched over a microfilm viewer in the shimmering fluorescent basement of a research library, she was free to let her eye linger on obscure details.

There are shades here of Nicholson Baker, who became so concerned over the destruction of library archives of vintage newspapers that he bought a literal ton of them with his life savings, and ended up writing an entire book, the controversial Human Smoke, based on his experience of reading press coverage of the events leading up to World War II day by day. And the serendipity that these old papers afforded was central to Hillenbrand’s career: she first stumbled across the story of Louie Zamperini, the subject of Unbroken, on the opposite side of a clipping she was reading about Seabiscuit.

Fuller was similarly energized by the act of encountering ideas in printed form, with the significant difference that the words, in this case, were his own. Applewhite devotes a full chapter to Fuller’s wholesale revision of Synergetics after the printed galleys—the nearly finished proofs of the typeset book itself—had been delivered by their publisher. Authors aren’t supposed to make extensive rewrites in the galley stage; it’s so expensive to reset the text that writers pay for any major changes out of their own pockets. But Fuller enthusiastically went to town, reworking entire sections of the book in the margins, at a personal cost of something like $3,500 in 1975 dollars. And Applewhite’s explanation for this impulse is what caught my eye:

Galleys galvanize Fuller partly because of the large visual component of his imagination. The effect is reflexive: his imagination is triggered by what the eye frames in front of him. It was the same with manuscript pages: he never liked to turn them over or continue to another sheet. Page = unit of thought. So his mind was retriggered with every galley and its quite arbitrary increment of thought from the composing process.

The key phrase here is “quite arbitrary.” A sequence of pages—whether in a newspaper or in a galley proof—is an arbitrary grid laid on a sequence of ideas. Where the page break falls, or what ends up on the opposite side, is largely a matter of chance. And for both Fuller and Hillenbrand, the physical page itself becomes a carrier of information. It’s serendipitous, random, but no less real.

And it makes me reflect on what we give up when pages, as tangible objects, pass out of our lives. We talk casually about “web pages,” but they aren’t quite the same thing: now that many websites, including this one, offer visitors an infinite scroll, the effect is less like reading a book than like navigating the spool of paper that Kerouac used to write On the Road. Occasionally, a web page’s endlessness can be turned into a message in itself, as in the Clickhole blog post “The Time I Spent on a Commercial Whaling Ship Totally Changed My Perspective on the World,” which turns out to contain the full text of Moby-Dick. More often, though, we end up with a wall of text that destroys any possibility of accidental juxtaposition or structure. I’m not advocating a return to the practice of arbitrarily dividing up long articles into multiple pages, which is usually just an excuse to generate additional clicks. But the primacy of the page—with its arbitrary slice or junction of content—reminds us of why it’s still sometimes best to browse through a physical newspaper or magazine, or to look at your own work in printed form. At a time when we all have access to the same world of information, something as trivial as a page break or an accidental pairing of ideas can be the source of insights that have occurred to no one else. And the first step might be as simple as looking at something on paper.

Trading places

leave a comment »

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What famous person’s life would you want to assume?”

“Celebrity,” John Updike once wrote, “is a mask that eats into the face.” And Updike would have known, having been one of the most famous—and the most envied—literary novelists of his generation, with a career that seemed to consist of nothing but the serene annual production of poems, stories, essays, and hardcovers that, with their dust jackets removed, turned out to have been bound and designed as a uniform edition. From the very beginning, Updike was already thinking about how his complete works would look on library shelves. That remarkable equanimity made an impression on the writer Nicholson Baker, who wrote in his book U & I:

I compared my awkward self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!

Plenty of writers, young or old, might have wanted to switch places with Updike, although the first rule of inhabiting someone else’s life is that you don’t want to be a writer. (The Updike we see in Adam Begley’s recent biography comes across as more unruffled than most, but all those extramarital affairs in Ipswich must have been exhausting.) Writing might seem like an attractive kind of celebrity: you can inspire fierce devotion in a small community of fans while remaining safely anonymous in a restaurant or airport. You don’t even need to go as far as Thomas Pynchon: how many of us could really pick Michael Chabon or Don DeLillo or Cormac McCarthy out of a crowd? Yet that kind of seclusion carries a psychological toll as well, and I suspect that the daily life of any author, no matter how rich or acclaimed, looks much the same as any other. If you want to know what it’s like to be old, Malcolm Cowley wrote: “Put cotton in your ears and pebbles in your shoes. Pull on rubber gloves. Smear Vaseline over your glasses, and there you have it: instant old age.” And if you want to know what it’s like to be a novelist, you can fill a room with books and papers, go inside, close the door, and stay there for as long as possible while doing absolutely nothing that an outside observer would find interesting. Ninety percent of a writer’s working life looks more or less like that.

Werner Herzog Eats His Shoe

What kind of celebrity, then, do you really want to be? If celebrity is a mask, as Updike says, it might be best to make it explicit. Being a member of Daft Punk, say, would allow you to bask in the adulation of a stadium show, then remove your helmet and take the bus back to your hotel without any risk of being recognized. The mask doesn’t need to be literal, either: I have a feeling that Lady Gaga could dress down in a hoodie and ponytail and order a latte at any Starbucks in the country without being mobbed. The trouble, of course, with taking on the identity of a total unknown—Banksy, for instance—is that you’re buying the equivalent of a pig in a poke: you just don’t know what you’re getting. Ideally, you’d switch places with a celebrity whose life has been exhaustively chronicled, either by himself or others, so that there aren’t any unpleasant surprises. It’s probably best to also go with someone slightly advanced in years: as Solon says in Herodotus, you don’t really know how happy someone’s life is until it’s over, and the next best thing would be a person whose legacy seems more or less fixed. (There are dangers there, too, as Bill Cosby knows.) And maybe you want someone with a rich trove of memories of a life spent courting risk and uncertainty, but who has since mellowed into something slightly more stable, with the aura of those past accomplishments still intact.

You also want someone with the kind of career that attracts devoted collaborators, which is the only kind of artistic wealth that really counts. But you don't want too much fame or power, both of which can become traps in themselves. In many respects, then, what you'd want is something close to the life of half and half that Lin Yutang described so beautifully: "A man living in half-fame and semi-obscurity." Take it too far, though, and you start to inch away from whatever we call celebrity these days. (Only in today's world can an otherwise thoughtful profile of Brie Larson talk about her "relative anonymity.") And there are times when a touch of recognition in public can be a welcome boost to your ego, as it is for Sally Field in Soapdish, as long as you're accosted by people with the same basic mindset, rather than those who just recognize you from Instagram. You want, in short, to be someone who can do pretty much what he likes, but less because of material resources than because of a personality that makes the impossible happen. You want to be someone who can tell an interviewer: "Throughout my life I have been able to do what I truly love, which is more valuable than any cash you could throw at me…So long as I have a roof over my head, something to read and something to eat, all is fine…What makes me so rich is that I am welcomed almost everywhere." You want to be Werner Herzog.

A brand apart

with 5 comments

Kyle MacLachlan in Blue Velvet

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What individual instances of product placement in movies and television have you found most effective?”

One of the small but consistently troublesome issues that every writer faces is what to do about brand names. We're surrounded by brands wherever we look, and we casually think and talk about them all the time. In fiction, though, the mention of a specific brand often causes a slight blip in the narrative: we find ourselves asking if the character in question would really be using that product, or why the author introduced it at all, and if it isn't handled well, it can take us out of the story. Which isn't to say that such references don't have their uses. John Gardner puts it well in The Art of Fiction:

The writer, if it suits him, should also know and occasionally use brand names, since they help to characterize. The people who drive Toyotas are not the same people who drive BMWs, and people who brush with Crest are different from those who use Pepsodent or, on the other hand, one of the health-food brands made of eggplant. (In super-realist fiction, brand names are more important than the characters they describe.)

And sometimes the clever deployment of brands can be another weapon in the writer's arsenal, although it usually only works when the author already possesses a formidable descriptive vocabulary. Nicholson Baker is a master of this, and it doesn't get any better than Updike in Rabbit Is Rich:

In the bathroom Harry sees that Ronnie uses shaving cream, Gillette Foamy, out of a pressure can, the kind that’s eating up the ozone so our children will fry. And that new kind of razor with the narrow single-edge blade that snaps in and out with a click on the television commercials. Harry can’t see the point, it’s just more waste, he still uses a rusty old two-edge safety razor he bought for $1.99 about seven years ago, and lathers himself with an old imitation badger-bristle on whatever bar of soap is handy…

For the rest of us, though, I’d say that brand names are one of those places where fiction has to retreat slightly from reality in order to preserve the illusion. Just as dialogue in fiction tends to be more direct and concise than it would be in real life, characters should probably refer to specific brands a little less often than they really would. (This is particularly true when it comes to rapidly changing technology, which can date a story immediately.)

Danny Pudi and Alison Brie on Community

In movies and television, a prominently featured brand sets off a different train of thought: we stop paying attention to the story and wonder if we’re looking at deliberate product placement—if there’s even any question at all. Even a show as densely packed as The Vampire Diaries regularly takes a minute to serve up a commercial for the likes of AT&T MiFi, and shows like Community have turned paid brand integration into entire self-mocking subplots, while still accepting the sponsor’s money, which feels like a textbook example of having it both ways. Tony Pace of Subway explains their strategy in simple terms: “We are kind of looking to be an invited guest with a speaking role.” Which is exactly what happened on Community—and since it was reasonably funny, and it allowed the show to skate along for another couple of episodes, I didn’t really care. When it’s handled poorly, though, this ironic, winking form of product placement can be even more grating than the conventional kind. It flatters us into thinking that we’re all in on the joke, although it isn’t hard to imagine cases where corporate sponsorship, embedded so deeply into a show’s fabric, wouldn’t be so cute and innocuous. Even under the best of circumstances, it’s a fake version of irreverence, done on a company’s terms. And if there’s a joke here, it’s probably on us.

Paid or not, product placement works, at least on me, although often in peculiar forms. I drank Heineken for years because of Blue Velvet, and looking around my house, I see all kinds of products or items that I bought to recapture a moment from pop culture, whether it's the Pantone mug that reminds me of a Magnetic Fields song or the Spyderco knife that carries the Hannibal seal of approval. (I've complained elsewhere about the use of snobbish brand names in Thomas Harris, but it's a beautiful little object, even if I don't expect to use it exactly as Lecter does.) If it's kept within bounds, it's a mostly harmless way of establishing a connection between us and something we love, but it always ends up feeling a little empty. Which may be why brand names sit so uncomfortably in fiction. Brands or corporations use many of the same strategies as art to generate an emotional response, except that they are constantly on message, unambiguous, and designed to further a specific end. It's no accident that there are so many affinities between advertising and propaganda. A good work of art, by contrast, is ambiguous, open to multiple interpretations, and asks nothing of us aside from an investment of time—which is the opposite of what a brand wants. Fiction and brands are always going to live together, either because they've been paid to do so or because it's an accurate reflection of our world. But we're more than just consumers. And art, at its best, should remind us of this.

The power of the page

leave a comment »

Laura Hillenbrand

Over the weekend, I found myself contemplating two very different figures from the history of American letters. The first is the bestselling nonfiction author Laura Hillenbrand, whose lifelong struggle with chronic fatigue syndrome compelled her to research and write Seabiscuit and Unbroken while remaining largely confined to her house for the last quarter of a century. (Wil S. Hylton's piece on Hillenbrand in The New York Times Magazine is absolutely worth a read—it's the best author profile I've seen in a long time.) The other is the inventor and engineer Buckminster Fuller, whose life was as itinerant as Hillenbrand's is stationary. There's a page in E.J. Applewhite's Cosmic Fishing, his genial look at his collaboration with Fuller on the magnum opus Synergetics, that simply reprints Fuller's travel schedule for a representative two weeks in March: he flies from Philadelphia to Denver to Minneapolis to Miami to Washington to Harrisburg to Toronto, attending conferences and giving talks, to the point where it's hard to see how he found time to get anything else done. Writing a coherent book, in particular, seemed like the least of his concerns; as Applewhite notes, Fuller's natural element was the seminar, which allowed him to spin complicated webs of ideas in real time for appreciative listeners, and one of the greatest challenges of producing Synergetics lay in harnessing that energy in a form that could be contained within two covers.

At first glance, Hillenbrand and Fuller might seem to have nothing in common. One is a meticulous journalist, historian, and storyteller; the other a prodigy of worldly activity who was often reluctant to put his ideas down in any systematic way. But if they meet anywhere, it's on the printed page—and I mean this literally. Hylton's profile of Hillenbrand is full of fascinating details, but my favorite passage describes how her constant vertigo has left her unable to study works on microfilm. Instead, she buys and reads original newspapers, which, in turn, has influenced the kinds of stories she tells:

Hillenbrand told me that when the newspaper arrived, she found herself engrossed in the trivia of the period—the classified ads, the gossip page, the size and tone of headlines. Because she was not hunched over a microfilm viewer in the shimmering fluorescent basement of a research library, she was free to let her eye linger on obscure details.

There are shades here of Nicholson Baker, who became so concerned over the destruction of library archives of vintage newspapers that he bought a literal ton of them with his life savings, and ended up writing an entire book, the controversial Human Smoke, based on his experience of reading press coverage of the events leading up to World War II day by day. And the serendipity that these old papers afforded was central to Hillenbrand's career: she first stumbled across the story of Louie Zamperini, the subject of Unbroken, on the opposite side of a clipping she was reading about Seabiscuit.

Buckminster Fuller

Fuller was similarly energized by the act of encountering ideas in printed form, with the significant difference that the words, in this case, were his own. Applewhite devotes a full chapter to Fuller’s wholesale revision of Synergetics after the printed galleys—the nearly finished proofs of the typeset book itself—had been delivered by their publisher. Authors aren’t supposed to make extensive rewrites in the galley stage; it’s so expensive to reset the text that writers pay for any major changes out of their own pockets. But Fuller enthusiastically went to town, reworking entire sections of the book in the margins, at a personal cost of something like $3,500 in 1975 dollars. And Applewhite’s explanation for this impulse is what caught my eye:

Galleys galvanize Fuller partly because of the large visual component of his imagination. The effect is reflexive: his imagination is triggered by what the eye frames in front of him. It was the same with manuscript pages: he never liked to turn them over or continue to another sheet. Page = unit of thought. So his mind was retriggered with every galley and its quite arbitrary increment of thought from the composing process.

The key word here is “quite arbitrary.” A sequence of pages—whether in a newspaper or in a galley proof—is an arbitrary grid laid on a sequence of ideas. Where the page break falls, or what ends up on the opposite side, is largely a matter of chance. And for both Fuller and Hillenbrand, the physical page itself becomes a carrier of information. It’s serendipitous, random, but no less real.

And it makes me reflect on what we give up when pages, as tangible objects, pass out of our lives. We talk casually about “web pages,” but they aren’t quite the same thing: now that many websites, including this one, offer visitors an infinite scroll, the effect is less like reading a book than like navigating the spool of paper that Kerouac used to write On the Road. Occasionally, a web page’s endlessness can be turned into a message in itself, as in the Clickhole blog post “The Time I Spent on a Commercial Whaling Ship Totally Changed My Perspective on the World,” which turns out to contain the full text of Moby-Dick. More often, though, we end up with a wall of text that destroys any possibility of accidental juxtaposition or structure. I’m not advocating a return to the practice of arbitrarily dividing up long articles into multiple pages, which is usually just an excuse to generate additional clicks. But the primacy of the page—with its arbitrary slice or junction of content—reminds us of why it’s still sometimes best to browse through a physical newspaper or magazine, or to look at your own work in printed form. At a time when we all have access to the same world of information, something as trivial as a page break or an accidental pairing of ideas can be the source of insights that have occurred to no one else. And the first step might be as simple as looking at something on paper.

What is poetry like?

with one comment

Vladimir Mayakovsky

Poetry is like mining for radium. The output an ounce, the labor a year.

Vladimir Mayakovsky

Poetry is like making a joke. If you get one word wrong at the end of a joke, you’ve lost the whole thing.

W.S. Merwin

Your teacher says that poetry is like an exquisite and towering pagoda that appears at the snap of the fingers or like the twelve towers of the five cities of the immortals that ephemerally exist at the edge of heaven. I do not agree. To use a metaphor, poetry is like building a house out of tiles, glazed bricks, wood, and stone—he must put them all together, one by one, on solid ground.

Shih Jun-chang

Wallace Stevens

Poetry is like prayer in that it is most effective in solitude and in the times of solitude, as, for example, in earliest morning.

Wallace Stevens

Poetry is like a panther: it delights the eye; but against any attempt to enslave it, it may wreak revenge.

Walter Kaufmann

Many a fair precept in poetry is like a seeming demonstration in the mathematics, very specious in the diagram, but failing in the mechanic operation.

John Dryden

Nicholson Baker

Poetry is like math or chess or music—it requires a slightly freaky misshapen brain, and those kinds of brains don’t last.

Nicholson Baker

Writing a poem is like getting a short-term contract from God. You get this one done and if you do a good job, then maybe another contract will come along.

David Bottoms

Writing poetry is like writing history—talent, learning, and understanding in suitable proportion.

Yuan Mei

P.D. James

Poetry is like religion: sometimes the vision is immediate and almost frightening in its intensity; sometimes it is reached with difficulty, giving intimations only, and those confused and partial.

P.D. James

Writing a poem is like solving for X in an equation.

—Attributed to W.H. Auden by Robert Earl Hayden

Poetry is like being alive twice.

Robert Hass

The evolution of art

leave a comment »

Neil deGrasse Tyson

If you’ve been watching Cosmos: A Spacetime Odyssey as faithfully as I have, you probably came away from last night’s episode with a newfound appreciation for the wonders of natural selection. Darwinian evolution, as Daniel Dennett likes to point out, is probably the single best idea anyone ever had, and it’s since been applied to fields far beyond those of biology. The notion that ideas and abstract concepts, for instance, are subject to selection pressure—both within the human mind and in the larger world beyond—is a familiar one, and every writer knows how it feels. Life is full of story ideas, and the means by which one or another wins out is a mysterious one, with selection often taking place below the level of conscious thought. Even once you’ve started a story, it can go in any number of directions, with the author selecting and discarding variations based on their perceived rightness, a process that happens all over again once the story is released into the wild. (The publishing industry is a battleground for survival of a particularly ruthless kind.) And if you want to harness the power of evolution in your own work, it’s not a bad idea to take a few cues from nature itself:

1. Move through a series of useful intermediate steps. My favorite part of last night’s Cosmos episode was its takedown of the argument, common among proponents of intelligent design, that an organ like the human eye is too complex to have evolved by chance. It may be true that half an eye isn’t very useful, but an approximation of an eye certainly is, and there’s a beautiful sequence illustrating how the eye evolved from a series of intermediate stages, each useful in itself: a cluster of a few photosensitive proteins develops gradually into a depression with an aperture and finally an eye with a lens. And this is a striking analogy to how the creative process works. Half a story isn’t any more useful than half an eye, but a finished rough draft—one that takes the entire narrative from beginning to end, however imperfectly—is both a sketch of the whole and a template that can be refined through successive revisions. And it isn’t until you’ve got something that holds together on its own provisional terms that you can start to make it better. (This is part of the reason why I always start with a detailed outline, which is the roughest version of the complete story that can possibly exist.)

Portrait of Charles Darwin by George Richmond

2. Introduce a little randomness. Natural selection proceeds as a succession of accidents, with random mutations in the genetic code that usually lead nowhere, but occasionally result in a useful adaptation. The process of writing a story can't work in quite the same way: unlike nature, we're writing with an end in mind, and most of us start with a plan. Even with such a teleological approach, however, it's still possible to embrace a productive element of chance. I've described my own methods in detail, but every author will develop his or her own strategies for making raids on the random. Nicholson Baker, for instance, used a random number generator to reorder the chapters in his novel The Anthologist, and although he discarded most of the results, it led to a handful of promising juxtapositions that were preserved in the final draft. Even if you aren't as systematic about it, you'll soon find that every finished novel represents a compromise between the vision that the author had at the beginning and the unpredictable variations that the process introduced. And it's essential to be able to depart from the plan enough to incorporate the unexpected—and to test it diligently against the alternatives. (A rough sketch of that kind of shuffle appears after the third point below.)

3. Give it time. If the diversity and ingenuity of the adaptations that nature creates can sometimes seem unimaginable, it's because natural selection operates over millions of years. Time, scale, and variation can do remarkable things. Artists, unfortunately, don't have that luxury: we can only write one version of a story at a time, and we only have a few months or years to get it done. Even on that reduced level, though, time is crucial. I've spoken elsewhere about the importance of rendering in the creative process, and the fact that most novels need a year or so to percolate in the author's mind, no matter how fast we can write. There's a simple explanation: most of us only have so many good ideas at any one time, and if we can extend the period of writing, we'll increase our chances of finding an idea that can be applied to the problem at hand. It's possible to take this too far, of course, and there always comes a time when the draft needs to be sent out to meet its fate. But even a break of a few weeks can have a positive effect, especially if we've turned our attention to other projects in the meantime. And when we do finally go back to the work we've set aside, we'll often find that it has evolved in our absence, when we weren't even aware of it. Now that's some intelligent design.
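For anyone who wants to try Baker's experiment on their own draft, here is what that kind of shuffle might look like in code. It's a minimal sketch, not Baker's actual workflow, and the directory and file names are hypothetical stand-ins for the "chunklets" of a manuscript:

```python
# A minimal, purely illustrative sketch in the spirit of Baker's Random.org
# experiment: put the sections of a draft into a random order and print the
# shuffled sequence, so you can scan it for accidental but promising
# juxtapositions. The directory and file names are hypothetical.
import random
from pathlib import Path

def shuffled_reading_order(draft_dir, seed=None):
    """Return the draft's section files in a random order."""
    sections = sorted(Path(draft_dir).glob("*.txt"))  # e.g. chunk_001.txt, chunk_002.txt
    rng = random.Random(seed)  # pass a seed to make a lucky shuffle reproducible
    order = list(sections)
    rng.shuffle(order)
    return order

if __name__ == "__main__":
    for position, path in enumerate(shuffled_reading_order("draft_chunks", seed=42), start=1):
        print(f"{position:3d}. {path.name}")
```

Most of the orders it produces will be nonsense, which is the point: you keep the one or two conjunctions that surprise you and throw the rest away.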

Written by nevalalee

March 17, 2014 at 9:57 am

How to repeat yourself

with 2 comments

John Gardner

Writers are generally advised not to repeat themselves. After I’ve finished the rough draft of a story, one of my first orders of business is to go back through the manuscript and fix any passages where I’ve inadvertently repeated the same word in the same sentence, or within a short run of text. Knowing how often you can use a word is a matter of taste and intuition. Some words are so common as to be invisible to the reader, so you can, and should, use the word “said” exclusively throughout a story, even as dialogue can usually be varied in other ways. Other words or phrases are so striking that they can’t be used more than once or twice in the course of an entire novel, and I’ll sometimes catch myself maintaining a running count of how often I’ve used a word like “unaccountable.” Then there are the words that fall somewhere in the middle, where they’re useful enough to crop up on a regular basis but catch the reader’s eye to an extent that they shouldn’t be overused. Different writers fall back on different sets of words, and in my case, they tend to be verbs of cognition, like “realized,” or a handful of adverbs that I use entirely too often, like, well, “entirely.”

Whenever I’m sifting through the story like this, part of me wonders whether a reader would even notice. Some of these repetitions jar my ear to a greater extent than they would for someone reading the story more casually: I’ve often revisited these pages something like fifty times, and I’m acutely aware of the shape of each sentence. (Overfamiliarity can have its pitfalls as well, of course: I’m sometimes shocked to discover a glaring repetition in a sentence that I’ve read over and over until I can no longer really see it.) But I encounter this issue often enough in other authors’ books that I know it isn’t just me. Catching an inadvertent repetition in a novel, as when Cormac McCarthy speaks twice in Blood Meridian of something being “footed” to its reflection, has the same effect as an unintentional rhyme: it pulls you momentarily out of the story, wondering if the writer meant to repeat the same word or if he, or his editor, fell asleep at the switch. And a particularly sensitive eye can pick up on repetitions or tics that even an attentive reader might miss. In his otherwise fawning study U & I,  Nicholson Baker complains about John Updike’s overuse of the verb “seemed,” which even I, a massive Updike fan, hadn’t noticed until Baker pointed it out.

Nicholson Baker

But repetitions can also be a source of insight, especially when you’re coming to grips with an earlier draft. A writer can learn a lot from the words he habitually overuses. If you find yourself falling back on melodramatic adverbs like “suddenly,” you might want to rethink the tone you’re taking—it’s possible that you’re trying to drum up excitement in a story that lacks inherent dramatic interest. My own overuse of verbs like “realized” might indicate that I’m spending too much time showing characters thinking through a situation, rather than conveying character through action. You can learn even more from longer phrases that reappear by accident. As John Gardner writes in The Art of Fiction, discussing a hypothetical story about Helen of Troy:

Reading…lines he has known by heart for weeks, [the writer] discovers odd tics his unconscious has sent up to him, perhaps curious accidental repetitions of imagery: The brooch Helen threw at Menelaus the writer has described, he discovers, with the same phrase he used in describing, much later, the seal on the message for help being sent to the Trojans’ allies. Why? he wonders. Just as dreams have meaning, whether or not we can penetrate the meaning, the writer assumes that the accidents in his writing may have significance.

And the comparison to dreaming is a shrewd one. "Repetitions are magic keys," Umberto Eco writes in Foucault's Pendulum, and although he's talking about something rather different—a string of sentences randomly generated by a computer—there's a common element here. When you write a first draft, you're operating by instinct: you accept the first words that come to mind, rather than laboriously revising the text, because you're working in a mode closer to the events of the story itself. At its best, it's something like a dream, and the words we select have a lot in common with the unmediated nature of dream imagery or word association in psychoanalysis. Later, we'll smooth and polish the surface of the prose, and most of these little infelicities will be ironed away, but it doesn't hurt to look at them first with the eye of an analyst, or a critic, to see what they reveal. None of this gives us license to fall back on the same hackneyed words or phrases, and it doesn't help a writer who thinks entirely in clichés. But it's in our slips or mistakes, as Freud knew, that we unconsciously reveal ourselves. Mistakes need to be fixed and repetitions minimized, but it's still useful to take a moment to ask what they really mean.

Written by nevalalee

November 18, 2013 at 8:39 am

The treacherous craft of Aaron Sorkin

leave a comment »

When I consider Aaron Sorkin and the weirdly watchable train wreck that is The Newsroom, I’m reminded of something that Norman Mailer once said about craft: “I think of it as being like a Saint Bernard with that little bottle of brandy under his neck. Whenever you get into trouble, craft can keep you warm long enough to be rescued. Of course, this is exactly what keeps good novelists from becoming great novelists.” Craft, in other words, becomes a kind of intellectual sleight of hand, a way of disguising bad thinking or more fundamental narrative problems, when a writer of lesser facility might have been forced to deal more honestly with the true implications of his material. Mailer cites Robert Penn Warren as an example:

Robert Penn Warren might have written a major novel if he hadn’t just that little extra bit of craft to get him out of all the trouble in All the King’s Men…And his plot degenerated into a slam-bang mix of exits and entrances, confrontations, tragedies, quick wits and woe. But he was really forcing an escape from the problem.

Which, if you think about it, sounds a lot like The Newsroom, which so often confuses manic action and the rapid-fire exchange of factoids with drama and witty repartee. It's a frustrating, often outright terrible show, and yet I find myself watching it with increasing fascination, because it achieves the level of badness that can only be attained with the aid of remarkable craft. Sorkin is a man of enormous talent, but in his best work, he's been aided and restrained by other strong creative voices. The Newsroom gives us Sorkin uncut, without the guiding hand he needs to hold him back from his worst impulses, and the result tells us a lot not just about Sorkin, but about the nature and limitations of a certain kind of drama. Because watching this show forces us to confront what David Thomson, speaking about David Mamet, has called "the time-killing aridness in brilliant situations, crackling talk, and magnificent acting."

That sort of “crackling talk” is a skill that can be learned over time, and Sorkin, who has written hundreds of hours of television, theater, and film, has had more practice doing it than just about anyone else. As a recent supercut made clear, he also tends to return repeatedly to the same verbal tics and phrases (“Well, that was predictable”). Yet this only reflects how good he really is. Sorkin is a machine for creating great dialogue, and like all insanely productive creative professionals, he likes to fall back on the same tricks, which he generates almost unconsciously. If he’d slaved over a line to make it work, he wouldn’t have used it again, but the fact that these lines reappear so often implies that they came easily. As Nicholson Baker says in U and I of John Updike’s reuse of certain images in his novels: “He liked it enough to consent to it when it appeared in a street scene the first time, and yet he didn’t like it well enough for his memory to warn him off a second placement.” And that’s the mark of a writer of almost supernatural felicity.

Yet it also conceals deeper problems of substance, as well as a disturbing lack of real ideas. As Sorkin recently said to Terry Gross: "I phonetically create the sound of smart people talking to each other." And what The Newsroom demonstrates is that Sorkin's blessed ability with dialogue has left him underdeveloped in other respects, a shortcoming that seems especially visible now. If you write wonderful words for actors to say, this can conceal any number of other limitations, sometimes for years, but eventually the mask starts to slip. Sorkin is a verbal genius, with the Oscar and Emmys to show for it, but without good collaborators, his gift tends to ripen and rot. What Sorkin needs, clearly, is a strong creative force to push against, which David Fincher provided with The Social Network and Thomas Schlamme and John Wells did on The West Wing—although his recent purge of many members of his writing staff makes it doubtful that this will happen soon. But I hope it does. Because otherwise, the show will continue to waste its great potential, and a legion of viewers can only say: "Well, that was predictable."

What would Rex Harrison do?

with 4 comments

Earlier this month, in his rather unenthusiastic review of the new musical Nice Work if You Can Get It, Hilton Als wrote of star Matthew Broderick, who, for all his other talents, is manifestly not a dancer: “His dancing should be a physical equivalent of Rex Harrison’s speaking his songs in [My Fair Lady]: self-assured and brilliant in its use of the performer’s limitations.” It’s a nice comparison, and indeed, Rex Harrison is one of the most triumphant examples in the history of entertainment of a performer turning his limitations into something uniquely his own. (If I could go back in time to see only one musical, it would be the original Broadway production of My Fair Lady, starring Harrison and the young Julie Andrews.) And while most of us rightly strive to overcome our limitations, it can also be useful to find ways of turning them into advantages, or at least to find roles for which we’re naturally suited, shortcomings and all.

Years of writing have taught me that I have at least two major limitations as a novelist (although my readers can probably think of more). The first is that my style of writing is essentially serious. I don’t think it’s solemn, necessarily, and I’d like to think that my fiction shows some wit in its construction and execution. But I’m not a naturally funny writer, and I’m in awe of authors like P.G. Wodehouse, Douglas Adams, or even Joss Whedon, whose sense of humor is inseparable from their way of regarding the world. The Icon Thief contains maybe three jokes, and I’m inordinately proud of all of them, just because they don’t come naturally. This isn’t to say that I’m a humorless or dour person, but that being funny in print is really hard, and it’s a skill set that I don’t seem to have, at least not in fiction. And while I’d like to develop this quality, if only to increase my range of available subjects and moods, I expect that it’s always going to be pretty limited.

My other big limitation is that I only seem capable of writing stories in which something is always happening. The Icon Thief and its sequels are stuffed with plot and incident, largely because I’m not sure what I’d do if the action slowed down. In this, I’m probably influenced by the movies I love. In his essay on Yasujiro Ozu, David Thomson writes:

[S]o many American films are pledged to the energy that “breaks out.” Our stories promote the hope of escape, of beginning again, of beneficial disruptions. One can see that energy—hopeful, and often damaging, but always romantic—in films as diverse as The Searchers, Citizen Kane, Mr. Smith Goes to Washington, Run of the Arrow, Rebel Without a Cause, Vertigo, Bonnie and Clyde, Greed, and The Fountainhead. No matter how such stories end, explosive energy is endorsed…Our films are spirals of wish fulfillment, pleas for envy, the hustle to get on with the pursuit of happiness.

As a result, whenever I write a page in which nothing happens, I get nervous. This isn’t the worst problem for a mainstream novelist to have, but like my essential seriousness, it limits my ability to tell certain kinds of stories. (This may be why I’m so impressed by the work of, say, Nicholson Baker, who writes brilliantly funny novels in which almost nothing takes place.)

So what do I do? I do what Rex Harrison did: I look for material where my limitations can be mistaken for strengths. In short, I write suspense fiction, which tends to be forgiving of essential seriousness—it's hard to find a funny line in anything by Thomas Harris or Frederick Forsyth, for example—and of restless, compulsive action, all executed within a fairly narrow range of tone. When I write in other genres, like science fiction, I basically approach the story as if I were still writing suspense, which, luckily, happens to be a fairly adaptable mode. And while I'll always continue to push myself as a writer, and hope to eventually expand my tonal and emotional range, I'm glad that I've found at least one place where my limitations feel at home, and where they can occasionally flower forth into full song. For everything else, I'm content just to speak to the music.

How to write like your grandmother

with 3 comments

Yesterday, I made the radical observation that everyone’s grandmother tends to be a good cook. (I also can’t resist the chance to quote, completely out of context, one of my favorite lines from Bertolucci’s The Dreamers: “Other people’s parents are always nicer than our own, and yet for some reason, our grandparents are always nicer than other people’s.”) It isn’t hard to figure out why: by the time most of us are old enough to really notice what our grandparents are like, they’ve had a head start of something like fifty years to find their way around a kitchen. By definition, barring some kind of time travel—which never goes well for grandparents—we aren’t around to see what our grandmother was like in her twenties or thirties. And I think most of us would be startled to see how little she had figured out, about cooking or anything else, well into middle age.

And that’s true of art as well. One of the curious facts about art is that nearly all of a major artist’s works fall into oblivion, with only a few left standing in libraries or anthologies. In general, although not always, these are works of the creator’s most mature period, which means that we see artists at their most developed, like our grandparents, and with a similar lack of context. In his amusing novel The Anthologist, Nicholson Baker points out how this is true of poetry:

What does it mean to be a great poet? It means that you wrote one or two great poems. Or great parts of poems. That’s all it means. Don’t try to picture the waste or it will alarm you…Out of hundreds of poems two or three are really good. Maybe four or five. Six tops.

The same holds true for novels, movies, paintings, and any other medium you can name: we're left with a handful of the good stuff, and the rest tends to disappear. And for an artist, this can be simultaneously daunting and liberating—most of what you produce will be forgotten, but if you can generate one masterpiece along the way, it won't matter.

For the rest of us, however, it can be risky to draw conclusions from those remaining works, especially when it comes to making choices about how to plan our own artistic lives. What we see, in our libraries and museums and movie collections, are a handful of end results—and not even all of them, but a selection of the best—with the earlier stages either invisible or accessible only to real enthusiasts. As a result, we tend to imitate the wrong things: we copy the product, but not the process. We try to paint like Picasso without remembering that Picasso not only started by painting like Raphael, but often went through the same procedure, layer by layer, in many of his works. Or in literature, we imitate the result of a long artistic and personal process and end up writing bad Hemingway.

Fortunately, in art, we have the chance to time travel in a way we can't in our own lives. Aside from a few exceptional cases, we don't have access to discarded drafts, but we can always go back to early published work and see how an artist ended up where he did, and, more importantly, why. And along the way, we're reminded that it's impossible to separate a masterpiece, if we're interested in doing good work ourselves, from the larger process that generated it. Here's Nicholson Baker again:

All the middling poems they write are necessary to form a raised mulch bed or nest for the great poems and to prove to the world that they labored diligently and in good faith for some years at their calling. In other words, they can’t just dash off one or two great poems and then stop. That won’t work…But it’s perfectly okay, in fact it’s typical, if ninety-five percent of the poems they write aren’t great. Because they never are.

And the same is true of your grandmother. Ninety-five percent of the meals she made probably weren’t all that great, but luckily for us, they were clustered disproportionately toward the beginning, so only your grandfather knows for sure. But she did it every day, and she got better. So keep cooking. Your grandchildren, and your readers, will thank you for it.

Written by nevalalee

April 25, 2012 at 10:23 am

Posted in Writing


Nicholson Baker on randomness

leave a comment »

There was also the random-number stage [while writing The Anthologist]. That took a few months. I had all these pieces—probably five hundred, six hundred pages of writing—and I got very constricted. What do I start with? So I decided that each chunklet should be assigned a random number from a random-number generator at a Web site called Random.org. Immediately I could see that the new artificial order was totally wrong. The rational side of me revolted against this horrendous scrambling. I fought back and hacked and slashed and crawled my way back to the order the book needed to be in. But sometimes the randomization forced some good conjunctions—there were some sequences that survived.

Nicholson Baker, to the Paris Review

Written by nevalalee

October 9, 2011 at 8:58 am

Quote of the Day

with 4 comments

There is no good word for stomach; just as there is no good word for girlfriend. Stomach is to girlfriend as belly is to lover, and as abdomen is to consort, and as middle is to petite amie.

—Nicholson Baker, The Mezzanine

Written by nevalalee

January 14, 2011 at 8:00 am

The Legend of Miyamoto

leave a comment »

For reasons known only to itself, The New Yorker has evidently decided that the best way to write about video games is to assign these stories to writers who emphatically have no gaming experience. This approach, which wouldn’t be tolerated for any other art form, high or low, has already resulted in this notorious article by Nicholson Baker—one of my favorite living writers, but clearly unequipped to say anything interesting about Red Dead Redemption. And now we have Nick Paumgarten’s disappointing profile of Shigeru Miyamoto, which is a huge missed opportunity, in more ways than one.

Miyamoto, the creator of the Mario and Zelda franchises and the greatest video game designer of all time, has often been compared to Walt Disney, an accolade he shares with his fellow genius Hayao Miyazaki. (Miyamoto and Miyazaki also share a deep nostalgia for the forests and villages of rural Japan, an abiding affection that shows up throughout their work.) Miyamoto is an artist, a storyteller, an engineer, and a visionary, and he’s exactly the sort of creative force that the readers of The New Yorker ought to know more about. The fact that Paumgarten scored only a brief interview with Miyamoto, which he pads out to feature length with pages of unenlightening digressions, is only the most disappointing thing about the profile. A single glimpse of one of Miyamoto’s sketches for Zelda would be more interesting than anything on display here.

Still, there are a few moments worth mentioning. Here’s Miyamoto on calibrating the difficulty of a game, and how important it is to incorporate quiet moments alongside every challenge:

A lot of the so-called action games are not made that way…All the time, players are forced to do their utmost. If they are challenged to the limit, is it really fun for them?…[In Miyamoto’s own games] you are constantly providing the players with a new challenge, but at the same time providing them with some stages or some occasions where they can simply, repeatedly, do something again and again. And that can be a joy.

This is especially good advice for writers in genres, such as suspense, that place a premium on intensity. A few strategically timed breaks in the action, which give the reader a moment of breathing room, can make the rest of the novel read much more quickly. The key, as Miyamoto knows, is putting yourself in the position of a person approaching a work of art for the first time:

I always remind myself, when it comes to a game I’m developing, that I’m the perfect, skillful player. I can manipulate all this controller stuff. So sometimes I ask the younger game creators to try playing the games they are making by switching their left and right hands. In that way, they can understand how inexperienced the first-timer is.

Similarly, once a writer has internalized the plot of a novel, it can be hard to see it with fresh eyes. One solution is to set the book aside for a month and read it again once the memory of the story has faded. Another approach, which I’ve done a few times, is to read a sequence of chapters in reverse, or at random, which often reveals problems or repetitions that I wouldn’t have noticed otherwise.

Finally, here’s Paumgarten on one of my favorite topics, the importance of constraints as a creative tool:

Mario, [Miyamoto’s] most famous creation, owes his appearance to the technological limitations of the first Donkey Kong game. The primitive graphics—there were hardly enough pixels to approximate a human form—compelled Miyamoto to give Mario white gloves and red overalls (so that you could see his arms swing), a big bushy mustache and a red hat (to hide the fact that engineers couldn’t yet do mouths or hair that moved), and a big head (to exaggerate his collisions). Form has always followed functionality. The problem now, if you want to call it one, is the degree of functionality. [Italics mine.]

This is a nice, crucial point. And it applies to more than video games. The limitations that made Mario so distinctive are the same ones that led to the look of Mickey Mouse, among so many other stars of early animation. One problem with the recent availability of beautifully rendered computer graphics is that character design is becoming a lost art. Even the best recent Pixar, Disney, and DreamWorks films have suffered from this: they can render every hair on a character’s head, but can’t make the character itself a memorable one. (Kung Fu Panda may be the last computer-animated movie with really distinctive character designs.)

So are video games art? Paumgarten glances at the subject only briefly, but with all due respect to Roger Ebert, there’s no doubt in my mind that the best video games are indeed art. At least, that’s the only explanation I have for something like Super Mario Galaxy, which is one of the few recent works, in any medium, that has filled me with something like my childhood envy for those who get to spend their lives telling stories. (The J.J. Abrams reboot of Star Trek is another.) Miyamoto’s great skill, as the article reminds us, is to bring us back to the best moments of our childhood. And while not all art needs to aspire to this, the world definitely needs art that does.
