Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘zadie smith’

Quote of the Day

[Fred Astaire] is transcendent. When he dances a question proposes itself: what if a body moved like this through the world? But it is only a rhetorical, fantastical question, for no bodies move like Astaire, no, we only move like him in our dreams…He is “poetry in motion.” His movements are so removed from ours that he sets a limit on our own ambitions. Nobody hopes or expects to dance like Astaire, just as nobody really expects to write like Nabokov.

Zadie Smith, “Dance Lessons for Writers”

Written by nevalalee

February 23, 2018 at 7:30 am

Quote of the Day

A writer should not undervalue any tool of her trade just because she finds it easier to use than the others. As you get older you learn not to look a gift horse in the mouth.

Zadie Smith, in The Guardian

Written by nevalalee

September 29, 2017 at 7:30 am

The act of cutting

In a recent article in The New Yorker on Ernest Hemingway, Adam Gopnik evocatively writes: “The heart of his style was not abbreviation but amputation; not simplicity but mystery.” He explains:

Again and again, he creates his effects by striking out what would seem to be essential material. In “Big Two-Hearted River,” Nick’s complicated European experience—or the way that fishing is sanity-preserving for Nick, the damaged veteran—is conveyed clearly in the first version, and left apparent only as implication in the published second version. In a draft of the heartbreaking early story “Hills Like White Elephants,” about a man talking his girlfriend into having an abortion, Hemingway twice uses the words “three of us.” This is the woman’s essential desire, to become three rather than two. But Hemingway strikes both instances from the finished story, so the key image remains as ghostly subtext within the sentences. We feel the missing “three,” but we don’t read it.

Gopnik concludes: “The art comes from scissoring out his natural garrulousness, and the mystery is made by what was elided. Reading through draft and then finished story, one is repeatedly stunned by the meticulous rightness of his elisions.” Following Hemingway’s own lead, Gopnik compares his practice to that of Cézanne, but it’s also reminiscent of Shakespeare, who frequently omits key information from his source material while leaving the other elements intact. Ambiguity, as I’ve noted here before, emerges from a network of specifics with one crucial piece removed.

Over the last two weeks, I’ve been ruthlessly cutting the first draft of my book, leaving me highly conscious of the effects that can come out of compression. In his fascinating notebooks, which I quoted here yesterday, Samuel Butler writes: “I have always found compressing, cutting out, and tersifying a passage suggests more than anything else does. Things pruned off in this way are like the heads of the hydra, two grow for every one that is lopped off.” This squares with my experience, and it reflects how so much of good writing depends on juxtaposition. By cutting, you’re bringing the remaining pieces closer together, which allows them to resonate. Butler then makes a very interesting point:

If a writer will go on the principle of stopping everywhere and anywhere to put down his notes, as the true painter will stop anywhere and everywhere to sketch, he will be able to cut down his works liberally. He will become prodigal not of writing—any fool can be this—but of omission. You become brief because you have more things to say than time to say them in. One of the chief arts is that of knowing what to neglect and the more talk increases the more necessary does this art become.

I love this passage because it reveals how two of my favorite activities—taking notes and cutting—are secretly the same thing. On some level, writing is about keeping the good stuff and removing as much of the rest as possible. The best ideas are likely to occur spontaneously when you’re doing something unrelated, which is why you need to write them down as soon as they come to you. When you’re sitting at your desk, you have little choice but to write mechanically in hopes that something good will happen. And in the act of cutting, the two converge.

Cutting can be a creative act in itself, which is why you sometimes need to force yourself to do it, even when you’d rather not. You occasionally see a distinction drawn between the additive and subtractive arts, but any work often partakes of both at various stages, which confer different benefits. In Behind the Seen, Charles Koppelman says of editing a movie in postproduction:

The orientation over the last six months has been one of accumulation, a building-up of material. Now the engines are suddenly thrown into full reverse. The enterprise will head in the opposite direction, shedding material as expeditiously as possible.

We shouldn’t disregard how challenging that mental switch can be. It’s why an editor like Walter Murch rarely visits the set, which allows him to maintain a kind of Apollonian detachment from the Dionysian process of filmmaking: he doesn’t want to be dissuaded from the need to cut a scene by the knowledge of how hard it was to make it. Writers and other artists working alone don’t have that luxury, and it can be difficult to work yourself up to the point where you’re ready to cut a section that took a long time to write. Time creates its own sort of psychological distance, which is why you’re often advised to put aside the draft for a few weeks, or even longer, before starting to revise it. (Zadie Smith writes deflatingly: “A year or more is ideal—but even three months will do.”) That isn’t always possible, and sometimes the best compromise is to work briefly on another project, like a short story. A change is as good as a rest, and in this case, you’re trying to transform into your future self as soon as possible, which will allow you to perform clinical surgery on the past.

The result is a lot like the old joke: you start with a block of marble, and you cut away everything that doesn’t look like an elephant. When I began to trim my manuscript, I set myself the slightly arbitrary goal of reducing it, at this stage, by thirty percent, guided by the editing rule that I mentioned here a month ago:

Murch also has his eye on what he calls the “thirty percent factor”—a rule of thumb he developed that deals with the relationship between the length of the film and the “core content” of the story. In general, thirty percent of a first assembly can be trimmed away without affecting the essential features of the script: all characters, action, story beats will be preserved and probably, like a good stew, enhanced by the reduction in bulk. But passing beyond the thirty percent barrier can usually be accomplished only by major structural alterations: the reduction or elimination of a character, or whole sequences—removing vital organs rather than trimming fat.

There’s no particular reason why the same percentage should hold for a book as well as a film, but I’ve found that it’s about right. (It also applies to other fields, like consumer electronics.) Really, though, it could have been just about any number, as long as it gave me a clear numerical goal at which to aim, and as long as it hurt a little. It’s sort of like physical exercise. If you want to lose weight, the best way is to eat less, and if you want to write a short book, ideally, you’d avoid writing too much in the first place. But the act of cutting, like exercise, has rewards of its own. As Elie Wiesel famously said: “There is a difference between a book of two hundred pages from the very beginning, and a book of two hundred pages which is the result of an original eight hundred pages. The six hundred pages are there. Only you don’t see them.” And the best indication that you’re on the right track is when it becomes physically painful. As Hemingway writes in A Farewell to Arms: “The world breaks everyone and afterward many are strong at the broken places.” That’s also true of books.

Written by nevalalee

June 29, 2017 at 8:38 am

The book of laughter and forgetting

In her autobiography, Agatha Christie makes a confession that might strike those of us who haven’t written more than sixty novels as rather strange:

Murder at the Vicarage was published in 1930, but I cannot remember where, when, or how I wrote it, why I came to write it, or even what suggested to me that I should select a new character—Miss Marple—to act as the sleuth in the story.

Christie says the same thing about a novel that followed two years later: “Peril at End House was another of my books which left so little impression on me that I cannot even remember writing it.” In On Writing, Stephen King makes a similar admission: “There’s one novel, Cujo, that I barely remember writing at all. I don’t say that with pride or shame, only with a vague sense of sorrow and loss. I like that book. I wish I could remember enjoying the good parts as I put them down on the page.” To be fair, Christie and King were monstrously prolific, and in both cases, there may have been other factors involved—Christie had suffered from a “fugue state” several years earlier in which she disappeared for ten days without explanation, while King was drinking heavily and using drugs. But even novelists with more mundane lifestyles have reported a similar kind of amnesia. On rereading her novel The Autograph Man, which she bought on an impulse at an airport, Zadie Smith recounts: “The book was genuinely strange to me; there were whole pages I didn’t recognize, didn’t remember writing.”

I find these testimonials oddly reassuring, because they tell me that I’m not alone. Recently, I realized that I couldn’t remember how I came up with one of the most important characters in the trilogy of novels that began with The Icon Thief. If I tried, I could probably reconstruct it, and I’ve even written a whole author’s commentary devoted to preserving this kind of information. But it’s still troubling. I’ve published only three novels, the most recent of which appeared less than four years ago, but I don’t think I could tell you much about them today. This is partially due to the fact that I don’t like reading my old work: in the essay that I quoted above, Smith refers to the “nausea” that overcomes her when she looks back at her books, as well as “a feeling of fraudulence,” and I think most authors can relate to that revulsion. Yet it doesn’t entirely account for how little I remember. In the moment, writing a novel feels unbelievably hard, and it consists of so many discrete choices that I’ve even used it as an argument in favor of the existence of free will—but afterward, it seems to evaporate completely. Which just means that it’s like everything else in life, except that it leaves a more tangible trace of itself behind. In his memoir Self-Consciousness, John Updike writes:

That we age and leave behind this litter of dead, unrecoverable selves is both unbearable and the commonest thing in the world—it happens to everybody…In the dark one truly feels that immense sliding, that turning of the vast earth into darkness and eternal cold, taking with it all the furniture and scenery, and the bright distractions and warm touches, of our lives.

Not only can’t I recall much about writing The Icon Thief, but when I look at pictures of my daughter as a baby, from just two or three years ago, I can barely seem to remember that, either. I’d laugh about it, but it also makes me very sad.

And I suspect that a lot of parents would report the same phenomenon. Part of this is because we tend to have children at an age when time already seems to pass more quickly, but there’s also something else involved. It’s generally agreed that forgetting plays an important role in memory. In a paper first published in 1970, the psychologist Robert A. Bjork argued that forgetting is a way of minimizing interference between old and new experiences:

When people voice complaints about their memory, they invariably assume that the problem is one of insufficient retention of information. In a very real sense, however, the problem may be at least partly a matter of insufficient or inefficient forgetting. If one views the human cognitive apparatus as an ongoing information-handling system, it is clear that some mechanism to update the system, to keep the system current, is crucial…The positive function of any such forgetting mechanism is to prevent information no longer needed from interfering with the handling of current information.

Bjork went on to provide an example that seems more resonant the more I think about it:

Consider the information processing task faced by the typical short-order cook. He must process one by one…a series of orders that have high interorder similarity. Once he is through with “scramble two, crisp bacon, and an English,” his later processing of similar but not identical orders can only suffer to the degree that he has not, in effect, discarded “scramble two, crisp bacon, and an English.”

The crucial phrase here, I think, is “interorder similarity.” It’s the everyday things that we tend to forget first. I have trouble reconstructing my daily routine from earlier periods in my life, like what I ate for breakfast in my twenties, but exceptional events, like travel to foreign countries, remain relatively vivid. There’s nothing odd about the idea that unusual or striking memories would persist more strongly, but you could also turn that argument on its head: the days that were more or less the same as the ones that followed are more likely to be discarded because they interfere with surrounding information. This allows us to focus on the problems of each day without distraction, but over time, it can turn entire years into a blur. That’s certainly true of writing novels, in which the sameness of each day’s work allows for those rare moments in which inspiration takes place. (It’s noteworthy that both Christie and King were genre novelists who reworked the same conventions over the course of many books. You could also say the same thing about many “literary” authors like Updike, whose novels tend to blend together. And I’d be curious to know if a writer whose style and themes change radically between novels, like David Mitchell or Mark Helprin, would have a different perspective.) Writing a novel, like raising a baby, can also be unpleasant, and perhaps this selective amnesia is what fools us into trying it again. Smith writes of The Autograph Man: “Between that book and me there now exists a sort of blank truce, neither pleasant nor unpleasant.” Sometimes you have to make a similar kind of truce with the past to go on living, and forgetfulness is where it begins. As Hercule Poirot would say, it’s a matter of little grey cells, and we can’t expect to hold onto them forever.

Written by nevalalee

May 4, 2017 at 9:05 am

“History often had plans of its own…”

"According to legend..."

Note: This post is the sixteenth installment in my author’s commentary for Eternal Empire, covering Chapter 17. You can read the previous installments here.

“A genre is hardening,” the literary critic James Wood wrote fifteen years ago, in his enormously influential New Republic essay “Hysterical Realism.” It’s the set of conventions, he observed, that we see in so many big, ambitious novels published in the last few decades: they’re crammed with plot and information, and they often take a greater interest in how social and political systems work than in the inner lives of their own characters. Dickens provides the original model, with Pynchon setting the standard, followed by the likes of Rushdie, Wallace, and DeLillo. Wood quotes Zadie Smith, who says that she’s concerned with “ideas and themes that I can tie together—problem-solving from other places and worlds,” and who goes on to state:

[It’s not the writer’s job] to tell us how somebody feels about something, it’s to tell us how the world works…These are guys who know a great deal about the world. They understand macro-microeconomics, the way the Internet works, math, philosophy, but…they’re still people who know something about the street, about family, love, sex, whatever. That is an incredibly fruitful combination. If you can get the balance right. And I don’t think any of us have quite yet, but hopefully one of us will.

Wood, as the title of his essay implies, isn’t a fan. He notes, accurately, that this kind of “realism” can serve as an evasion of reality itself: it allows writers to retreat, fashionably, from the unglamorous consideration of the genuine emotions of real men and women. And even if you’re determined to work within that genre, the challenge, as Smith says, is balance. An ambitious literary novel these days is expected to move between two or more registers: the everyday interactions of its characters and the larger social context—meticulously researched and imagined—in which the human story takes place. Shifting between these levels is a hard technical problem, and we can feel the strain even in good novels. In Smith’s White Teeth, Wood sees “an instructive squabble…between these two literary modes,” and a book like The Corrections gains much of its interest from the tension between these kinds of storytelling. Jonathan Franzen, who is as smart a writer as they come, has as much trouble as anyone with managing those transitions: all too often, we end up with passages that read, as Norman Mailer puts it, like “first-rate magazine pieces, but no better.” But in a really fine example of the form, like Joseph O’Neill’s Netherland, the social concerns emerge so organically from the story that it’s hard to tell where one leaves off and the other begins.

"History often had plans of its own..."

What’s funny, of course, is that genre novelists have been dealing with these issues for a long time, and literary fiction is only now taking up the challenge. Science fiction or fantasy, for instance, is invariably set in an unfamiliar world, the rules of which need to be conveyed seamlessly within the action, and one of the first problems any thoughtful writer confronts is how to establish this background in an unobtrusive way. It also affects historical fiction, or even suspense, which often takes place in a realm far removed from the reader’s experience. And the bad examples—in which the story grinds to a halt as the author explains the workings of interstellar travel or the political situation in his warring kingdoms—aren’t so different from the moments in which hysterical realism abandons its characters for a treatise on geopolitical trade. The difference is that it’s our own world that these novels are describing, as if the authors were alien journalists encountering it for the first time. That kind of fictional reportage can be valuable: at its best, it forces us to see the world around us with new eyes, or discloses patterns that have lurked there unseen. But literary fiction, which was able to stick to a narrowly focused register for so long, is still figuring out what the best genre novelists have been doing for decades.

So what does this have to do with Eternal Empire? Like many suspense novels, it devotes ample space to filling in background—on the British prison system, the security services, and the world of oligarchs and gangsters—that few readers could be expected to know firsthand. It also follows a template, established by the first two books in the series, of engaging with history and religion, which creates another level of story in which it has to dip from time to time. I devoted a lot of effort, possibly too much, to integrating those digressions in ways that seemed natural, and it wasn’t always easy. In Chapter 17, for instance, I include a page of material about the Khazars, the enigmatic tribe of Central Asian horsemen that disappeared shortly after their unprecedented conversion to Judaism. The Khazars aren’t essential to the story; they serve primarily as a kind of sustained analogy for Ilya’s inward journey, to a degree that isn’t clear until the end. I realized early on that it would be asking too much of the reader to deliver all of this material at once, so I carved it up into three or four shorter sections, each of which represented a self-contained stage, and inserted them at points in which Ilya’s own thoughts or situation provided a natural transition. (They also serve, more practically, to create a pause in the action where such a delay seemed useful.) The result sometimes resembles the “squabble” that Wood sees in more literary novels. But the problem of moving between two worlds is one that most writers, like Ilya, will have to confront sooner or later…

Written by nevalalee

April 23, 2015 at 9:56 am

What do you care what other people think?

We’re often told that we shouldn’t care about what other people think, but of course, we’re mindful of this all the time, and sometimes it leads to better behavior, in ways both large and small. When I’m noodling around on the ukulele, I find that my performance gets more focused when I imagine myself playing for an imaginary audience. Whenever I make an investment decision, I ask myself whether John Bogle—or, more accurately, the obsessively frugal index investors on the Bogleheads forum—would approve. More generally, when I stand back to look at my life, I often think about how it would seem to someone observing from the outside. I’m not sure who this hypothetical observer would be; perhaps, to take a page from Matthew McConaughey’s Oscar speech, it’s myself ten years from now. It’s a small thing, but I’d like to believe that it makes me slightly more civilized in my everyday actions. The existentialists believed that we should act as if what we did set the example for the rest of mankind, which only paraphrases what Kant said two centuries earlier: “Live your life as though your every act were to become a universal law.”

Of course, that’s an impossibly high standard to maintain, so it’s usually enough to think in terms of one person, living or dead, real or imaginary, whose approval we’d like to earn. In writing, this takes the form of an ideal reader to whom all of our work is addressed, and I suspect that nearly every writer does this, whether consciously or not. In some ways, there’s no more fundamental decision in a writer’s life than the question of what reader you’re trying to impress. It shapes the projects you tackle and the style you employ, and it even influences some of your larger life decisions, like whether you want to end up in Iowa or New York. In practice, you’ll find yourself writing with an eye to real individuals with an ability to directly influence the outcome: trusted readers, prospective agents, busy editors. Over time, though, our ideal reader starts to resemble a composite of all these people, or a version of a particular person in our lives who may never see the draft we’re working on now. Ideally, this hypothetical reader should be benevolent but also a little scary, and the standards he or she sets for us should be at least somewhat higher than the ones we’d be willing to settle for ourselves.

Sometimes, our imaginary reader is another author whose work we admire, which can set insurmountable standards of its own: if we’re constantly wondering, as the critic James Wood says somewhere, what Flaubert would think of the sentence we’re writing, most of us wouldn’t get past the first paragraph. More commonly, this voice is often the product of the author’s own life story. In my own fiction, on the largest scale, I’m trying to live up to the standard that I set for myself when I was a child, back when nothing seemed more magical than the prospect of telling stories for a living. On a more granular level, I find that I’m often writing with an eye to the first writer who ever gave me useful feedback on a story. (I won’t mention him by name, but you can read more about him here.) Back when I was starting out, he read several of my stories and covered the pages with merciless notes and corrections, and although the process was draining, I’m convinced that it allowed me to get published five years earlier than I otherwise would. One of the stories he read, “Inversus,” was my first sale to Analog, and I don’t think it would have sold at all in its unedited form—which might well have discouraged me from pursuing that audience at all.

As a result, whenever I go over a draft, I’m frequently asking myself what he would think. It forces me to be harder on myself than I otherwise would: I’ll sometimes cross out entire pages and cut others to the bone, knowing that he’d react to what was currently there with a marginal question mark or even just a simple “No.” Of course, I’m really listening to my own inner voice, which has quietly taken on the qualities of the editors and readers I’ve come to respect. It’s a voice that is rightfully skeptical of everything it sees—as both Samuel Butler and Zadie Smith have pointed out, it’s a good habit to look over your work as if it were being read by an enemy—and I don’t think it would work nearly as well if I didn’t think of it as something external to me. I turn it off as much as I can during the first draft, but crank it up during the rewrite, when there’s no danger of fear or anxiety preventing me from at least finishing a manuscript. And although I try not to read published work with that voice, since there’s no changing what is already in print, I still sometimes sense it shaking its head when I go back to revisit a story, asking me: “Is that really what you wanted to say?”

Written by nevalalee

March 4, 2014 at 8:42 am

The importance of irrational optimism

An editor should tell the author his writing is better than it is. Not a lot better, a little better.

T.S. Eliot

Looking back at recent posts, one of the themes I seem to hit repeatedly is the importance of objectivity. When you’re working on a novel or short story, you need to view it as coldly as possible, trying to see it, as Zadie Smith reminds us, through the eyes of an enemy. Otherwise, you’ll never be able to make the hard choice to cut a favorite line or scene, to radically restructure the plot, or even to abandon a project altogether. Objectivity, then, is one of the greatest strengths a writer can possess—with one big exception. Because in order to make the decision to be a writer in the first place, and wholeheartedly devote yourself to the writing life, the last thing you should be is objective. A writer’s state of mind, when first starting out, needs to be one of irrational optimism, because if we were totally objective about it, most of us wouldn’t become writers at all.

It’s safe to say that no one with a completely realistic temperament would ever dream of becoming a professional writer, let alone a writer of fiction. The odds of success have never been high, but these days, they’re objectively steeper than ever. First, you need to write, and finish, a good book—and that goal itself can often seem dauntingly out of reach. Next, you need to find an agent, and after that, a publisher who is willing to put real money on the line. Then, even if you’ve gotten that far, you need to navigate a hugely competitive market, with thousands of new novels published every year, not to mention what everyone agrees is a historically challenging moment for publishing of any kind. There’s a reason why the percentage of published writers who make a living solely through fiction is vanishingly small. And by definition, the odds of becoming one of those authors yourself are even more negligible.

And yet I don’t think there’s any writer, no matter how objective in other ways, who doesn’t secretly think: I will be the one who makes it. Certainly that was true of me. While I won’t say that I wouldn’t have tried to become a writer at all if I’d known exactly what was in store, it would have given me pause. Looking back, I’m a little embarrassed at how confident, even arrogant, I was five years ago. Most writers would probably say the same thing. But here’s my point: for a writer, this sort of unwarranted optimism is essential. It’s the only thing that could possibly entice an otherwise rational person to become a novelist, or to enter any kind of creative field. Novices always overestimate their chances of success, and some will be bitterly disappointed, but none of them would have gotten anywhere if they’d accurately judged the odds. Paul Graham, the programmer, investor, and charming essayist, makes a similar point:

One reason the young sometimes succeed where the old fail is that they don’t realize how incompetent they are. This lets them do a kind of deficit spending. When they first start working on something, they overrate their achievements. But that gives them confidence to keep working, and their performance improves. Whereas someone clearer-eyed would see their initial incompetence for what it was, and perhaps be discouraged from continuing.

In other words, irrational enthusiasm can sometimes confer a selective advantage. If a mother didn’t have an irrational attachment to her own children, she’d smother them in the cradle. Similarly, if a young writer wasn’t convinced that he was much better than he really was, he’d never work long enough at his craft to become as good as he could be. The fact is, most young novelists aren’t very good, and even the best are generally producing little more than resourceful pastiches of more experienced authors—which, in itself, can be enough for a career. It takes years of objective practice to find an original voice, but only irrational optimism, and an inflated regard for one’s own potential, can carry a writer to the point where it pays off. Achieving this balance between optimism and objectivity is one of the hardest things for any kind of artist, but it’s essential. Because even the coldest, most objective writer needs to believe, for the sake of his own survival, that he is also, somehow, the exception to the rule.

Written by nevalalee

November 2, 2011 at 10:01 am

Posted in Writing
