Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘James Joyce’

The apostolic succession

Ever since I began working as a biographer—which is one of the few acceptable ways of earning a living as a private eye of culture—I’ve naturally become interested in what other writers have had to say on the subject. My favorite example, as I’ve noted here before, is Janet Malcolm’s The Silent Woman, which isn’t just the best book that I’ve read on the art of biography, but one of the best that I’ve read about anything. James Atlas’s The Shadow in the Garden offers an engaging look at the profession from the inside, even if you sometimes get the sense that Atlas wrote it mostly to settle a few old scores relating to his biography of Saul Bellow. And there are certain loose, baggy monsters of the form that can’t help but comment on their own monstrousness. A book like The Life of Graham Greene by Norman Sherry functions both as a straight work of scholarship and as a bizarre meditation on its own creation, and by the last volume, the two elements become so unbalanced that you’re forced to confront the underlying strangeness of the whole biographical enterprise. Such hybrid books, which read like unwitting enactments of Nabokov’s Pale Fire, tend to have three qualities in common. One is the biographer’s extensive use of the first person, which allows him to insert himself into the narrative like a shadowy supporting player. Another is the inordinate amount of time or wordage devoted to the project, which usually occupies multiple decades or volumes. And the last, which should probably serve as a warning, is that this tendency is often most pronounced when the biographer is investigating the life of another living writer, which leads to insidious problems of identification, admiration, and resentment. As Sherry said of his biography of Greene to the New York Times: “I almost destroyed myself. By the time I had finished, my life had been taken from me.”

Which brings us to Anthony Burgess by Roger Lewis, which combines all of these ingredients into one of the strangest books I’ve ever seen. It first caught my eye over a decade ago, with its striking cover inspired by Philip Castle’s poster for A Clockwork Orange, but I’m glad that I’m only reading it now, when perhaps I have a better understanding of the emotions that it expresses. After describing his first encounter as a young man with Burgess, whom he compares to a baboon with “vampiral” red eyes, Lewis writes:

My need to know about Burgess twenty years ago: what lack or absence in me was being compensated for? I was youthful, full of ambition and ideals; he was a constellation, larger than life-size, a writer’s writer, crammed with allusions. He was, as Carlyle said of Danton, “a gigantic mass of ostentation,” and the piratical swagger was alluring and I had an abiding affinity with it. The facets which you are taken in by when you are young—the languages, the apparent wide knowledge—genuine academics and professionals, people in the know, see it as so nonsensical, it’s beneath them to contradict Burgess’s bluster. His success came from impressing people who didn’t quite know better; he was left alone by those who did. He fell into that gap, and made a fortune for himself.

If it isn’t abundantly clear by this point, Lewis goes on to explain that his feelings have curdled toward his old mentor, whom he later describes as a “pretentious prick” and a “complete fucking fool.” But Lewis also adds incongruously: “Twenty years on from my days as a student prince, if I’m allegedly repudiating the lion of my late adolescence, it’s no doubt because deep down I continue to feel close to him.”

Not surprisingly, many reviewers regarded the book as an act of “character assassination,” as Blake Morrison put it in The Guardian, or a case study in the pathology of hero worship. But the tangled lines of influence are even weirder than they seem. Lewis’s real mentor wasn’t Burgess, but Richard Ellmann, his thesis adviser, the biographer of James Joyce and Oscar Wilde who is generally regarded as the greatest modern practitioner of literary biography. He played a similar role in the life of none other than James Atlas, who devotes many pages to Ellmann in The Shadow in the Garden, writing of his first encounter with the man who agreed to supervise his work at Oxford: “Steven [sic] Dedalus had stumbled upon his Leopold Bloom.” In a lengthy footnote on the very first page of Anthony Burgess, Lewis uses almost identical language to describe their relationship:

Ellmann was my supervisor (though he didn’t do much supervising) for a doctoral dissertation on Ezra Pound, of which I wrote not one word. We became friends and used to dine lavishly at the Randolph…We were both aware of a Bloom/Dedalus dynamic in our relationship. I was immensely cocky and callow, Ellmann wholly lacked the Oxford way of people being interested in each other only for their own advantage.

It was probably impossible to be mentored by Richard Ellmann, of all people, without thinking of the surrogate father and son of Ulysses, but in Lewis’s case, the Joycean labyrinth was even more twisted—because it was through Ellmann that Lewis met Burgess in the first place. His biography opens with an account of the evening of May 7, 1985, when Ellmann and Lewis picked up Burgess at a train station and gave him a ride to Oxford: “We all went to find Ellmann’s rusty, seldom-washed car…Ellmann took us through the city, turning corners by mounting the kerb, grazing bollards and scattering cyclists.” And all the while, Lewis informs us, Burgess had been “murmuring to Ellmann about Joyce.”

And it gets even stranger. One of Ellmann’s other students was the biographer Henry Hart, who later wrote an essay on his mentor titled “Richard Ellmann’s Oxford Blues.” Hart is also the author of the biography James Dickey: The World as a Lie, another book full of mixed feelings toward its self-mythologizing subject, of whom he writes: “To my great relief, Dickey expressed little animosity toward my project. But he obviously had worries, the main one being the way I would address the romanticized versions of his life that he had aired so free-spiritedly in conversations and publications.” Hart addresses these problems in depth, as the full title of the book indicates. (The subtitle, he claims, was Dickey’s idea.) And I’m fascinated by how Richard Ellmann, the author of perhaps the most acclaimed literary biography of all time, produced three separate protégés whose work—Atlas on Bellow, Hart on Dickey, Lewis on Burgess—all but explodes with ambivalence toward their subjects, their own ambitions, and the whole notion of biography itself. Thinking of Ellmann and his literary progeny, I’m reminded, as many of them undoubtedly were, of Stephen Dedalus’s famous speech in the library scene in Ulysses:

A father, Stephen said, battling against hopelessness, is a necessary evil…Fatherhood, in the sense of conscious begetting, is unknown to man. It is a mystical estate, an apostolic succession, from only begetter to only begotten…Paternity may be a legal fiction. Who is the father of any son that any son should love him or he any son?

That uneasy succession, which assumes unpredictable shapes in its passage from one generation to another, must be as difficult for biographers as for anyone else. And Ellmann may well have had other students whose names I don’t know yet. There’s obviously a good story here. Somebody should write a book about it.

The knock at the door

Once or twice [James Joyce] dictated a bit of Finnegans Wake to [Samuel] Beckett, though dictation did not work very well for him; in the middle of one such session there was a knock at the door which Beckett didn’t hear. Joyce said, “Come in,” and Beckett wrote it down. Afterwards he read back what he had written and Joyce said, “What’s that ‘Come in?’” “Yes, you said that,” said Beckett. Joyce thought for a moment, then said, “Let it stand.” He was quite willing to accept coincidence as his collaborator.

Richard Ellmann, James Joyce

Written by nevalalee

April 1, 2017 at 7:30 am

The cyborg on the page

In an excellent anthology of his short stories, the author Joe Haldeman describes an exercise that he used to give to his students at M.I.T., where he taught a course on science fiction for many years. Reading it, I found myself wishing—for just about the first time ever—that I could have taken that class. Here’s what Haldeman says:

For this assignment, I gave each student a random number between 8 and 188, which corresponded to page numbers in the excellent sourcebook The Science in Science Fiction, by Peter Nicholls, with David Langford and Brian Stableford. They had to come up with a story using that scientific device or principle. I further restricted them by saying they had to use a story structure from one of the stories in our textbook The Science Fiction Hall of Fame, edited by Robert Silverberg. The point of the assignment was partly to demonstrate that art thrives under restrictions. (It was also to give them a starting point; many had never written fiction before, and a blank page or screen is a terrible thing.)

Haldeman notes that he always does his own assignments, at least to demonstrate the concept for a couple of pages, and that in this case, he was given the word “cyborg” and the structure of Daniel Keyes’s “Flowers for Algernon.” The result was a solid short story, “More Than the Sum of His Parts,” which was later published in Playboy.

Not surprisingly, I love this idea, for reasons that longtime readers of this blog will probably be able to guess. Constraints, as Haldeman observes, are where fiction flourishes. This is partially because of the aforementioned tyranny of the blank page: any starting point, even a totally random one, is better than nothing at all, and a premise that is generated by chance can be more stimulating than one of great personal significance. (When you’re trying to write about something important to you, you’re often too intimidated by the possibilities to start, while it’s easy to get started on a premise that has been handed to you for free. As Trump might put it, what have you got to lose?) There’s also the fact that a kind of synergy results when you pair a story structure with a concept: the dialogue between form and content yields ideas that neither one could have generated in isolation. Nearly every story I’ve ever written has resulted from a pairing of two or more notions, and I’ve developed a fairly reliable intuition about which combinations will be the most fruitful. But I haven’t really experimented with structure in the same way, which is why this exercise is so useful. When I brought it up with Haldeman, he said that the assignment is designed to make students think of form as a tool—or a toy—that can be explored and enjoyed independently of plot, which is a point subtle enough that a lot of writers, including me, never get around to playing with it. But when I take my scheduled break in a couple of months to work out a new story, I’m going to give it a try.

It’s revealing, too, that the story that Haldeman uses to illustrate his point is about a cyborg, since that’s what we’re really talking about here—a mixture of artificial and organic parts that theoretically forms a single viable organism. (In the actual story, it doesn’t turn out well.) Sometimes you start with a few components from off the shelf, or an assortment of discrete pieces of information, and once you start to combine them, they knit themselves together with newly grown tissue. In other cases, you begin with something more natural, like the chain of logical events that follow from a dramatic situation, and then add parts as needed. And incorporating a bit of randomness at an early stage results in solutions that never would have occurred to you otherwise. There’s a famous design exercise in which students are told to draw the human body in a state of movement, and then to construct an apparatus that will support the body in that position. At the end, the teacher points out that they’ve been designing furniture. That’s how writing works, too. Writers are frequently drawn to metaphors from carpentry, as when Gabriel García Marquez compares writing to making a table, or when José Saramago says that any chair he makes has to have four stable feet. But the result is more interesting when you don’t think in terms of making a table or a chair, but of creating a support system that will hold up the bodies you’ve set in motion. A cyborg carries his essential furniture with him at all times, stripped down to its purest functional form. And that’s also true of a story.

If every story is a cyborg, there’s also a range of approaches to how visible the parts should be. Some wear their artificial components openly, like Locutus of Borg, so that the result is a style in itself, while others keep their enhancements hidden. A book like Joyce’s Ulysses, with its endless experiments and pastiches in form, looks like a manufacturer’s catalog, or a fashion spread in which the same handful of models show off various possible outfits. I don’t recall offhand if Joyce assigned the various epic episodes, literary styles, and symbols to the chapters of Ulysses at random, but I’d like to believe that he did, simply because it’s such a pragmatic tool: “Let the bridge blow up,” Joyce once said, “provided I have got my troops across.” Sometimes the writer takes pleasure in making the joints between the pieces as invisible as possible, and sometimes it’s more fun to play up the artifice, or even to encourage the reader to spot the references—although a little of this goes a long way. It’s a matter of taste, which is another reason why the use of randomness at an early stage can be a good thing: the more detached you are from the big conceptual blocks of the plot, the more likely you are to make the right decisions when it comes to the details. If you’re the kind of writer who wants to crank out a story a week for a year, as Ray Bradbury once advised, Haldeman’s exercise is invaluable. (As Bradbury says: “I dare any young writer to write fifty-two stories that are all bad.”) I wouldn’t want to take the same approach for every story, since there comes a point at which the author himself starts to resemble a machine. But when used wisely, it’s a nice reminder that every story is more than the sum of its parts.

Written by nevalalee

August 25, 2016 at 8:56 am

Quote of the Day

I am really one of the greatest engineers, if not the greatest, in the world besides being a musicmaker, philosophist and heaps of other things. All the engines I know are wrong. Simplicity. I am making an engine with only one wheel. No spokes of course. The wheel is a perfect square.

James Joyce, in a letter to Harriet Shaw Weaver

Written by nevalalee

August 11, 2015 at 6:59 am

Introduction to finality

Yesterday, while I was out of the house, my wife asked our daughter: “Do you want your veggie snacks in a bowl?” My daughter, who is two years old, replied: “Sure!” I wasn’t there to see it, but when I got back, I was assured by all involved that it was hilarious. Then, this morning, in response to another question, my daughter said: “Sure!” Without thinking twice, I said: “That’s a good callback, honey. Is that your new catchphrase?” Which made me realize how often I talk about myself and those around me as if we were on a television show. We’ve always used terms from art and literature to describe the structure of our own lives: when we talk about “starting a new chapter” or “turning the page,” we’re implicitly comparing ourselves to the characters in novels. Even a phrase like “midlife crisis” is indebted, almost without knowing it, to the language of literary criticism. It was perhaps inevitable, then, that we’d also appropriate the grammar of television, which is the art form that has the most in common with the way our lives tend to unfold. When I moved to New York after college, I thought of myself as the star of a spinoff featuring a breakout character from the original series, a supporting player who ended up being my roommate. And a friend once told me that he felt that the show jumped the shark after I got a job in finance. (He wasn’t wrong.)

Which goes a long way toward explaining why Community has exercised such a hold over the imaginations of its viewers. For the past six years, it hasn’t always been the funniest sitcom around, or the most consistent, or even the most creative. But it’s the show that thought most urgently about the ways in which we use television to understand ourselves. For most of the show’s run, these themes centered on the character of Abed, but as last night’s season finale—which feels an awful lot like it ought to be the last episode of the entire series—clearly demonstrated, their real impact was on Jeff. Community could sometimes be understood as a dialogue between Abed and Jeff, with one insisting on seeing events in terms of narrative conventions while the other brought him down to earth, but in the end, Jeff comes to see these tropes as a way of making sense of his own feelings of loss. We’re aware, of course, that these people are characters on a television series, which is why Abed’s commentary on the action was often dismissed as a winking nod to the audience. But it wouldn’t be so powerful, so compelling, and ultimately so moving if we didn’t also sense that seeing ourselves through that lens, at least occasionally, is as sane a way as any of giving a shape to the shapelessness of our lives.

Community may not endure as a lasting work of our culture—it’s more than enough that it was a great sitcom about half the time—but it’s part of a long tradition of stories that offer us metaphors drawn from their own artistic devices. Most beautifully, we have Shakespeare’s seven ages of man, which are framed as acts in a play. (Shakespeare returned to such images with a regularity that implies that he saw such comparisons as more than figures of speech: “Life’s but a walking shadow, a poor player / That struts and frets his hour upon the stage…” “These our actors, / As I foretold you, were all spirits and / Are melted into air, into thin air…”) Joyce used The Odyssey to provide a framework for his characters’ lives, as Proust did, more subtly, with The Thousand and One Nights. These are all strategies for structuring a literary work, but we wouldn’t respond to them so profoundly if they didn’t also reflect how we felt about ourselves. You could even say that fiction takes the form of a shapely sequence of causal events, at least in the western tradition, because we see our lives in much the same way. When you stand back, everyone’s life looks more or less the same, even as they differ in the details, and as we grow older, we see how much we’re only repeating patterns that others before us have laid down.

This might seem like a lot of pressure to place on a show that included a self-conscious fart joke—with repeated callbacks—in its final episode. But it’s the only way I can explain why Community ended up meaning more to me than any other sitcom since the golden age of The Simpsons. The latter show also ended up defining our lives as completely as any work of art can, mostly because its sheer density and longevity allowed it to provide a reference point to every conceivable situation. Community took a clever, almost Borgesian shortcut by explicitly making itself its own subject, and on some weird level, it benefited from the cast changes, creator firings, cancellations, and unexpected revivals that put its viewers through the wringer almost from the start. It was a show that was unable to take anything for granted, no more than any of us can, and even if it sometimes strained to keep itself going through its many incarnations, it felt like a message to those of us who struggle to impose a similar order on our own lives. Life, like a television show on the brink, has to deal with complications that weren’t part of the plan. If those ups and downs pushed Community into darker and stranger places, it’s a reminder that life gains much of its meaning, not from our conscious intentions, but as an emergent property of the compromises we’re forced to make. And like any television show, it’s defined largely by the fact that it ends.

Written by nevalalee

June 3, 2015 at 9:33 am

The Ive Mind

Like many readers, I spent much of yesterday working my way through Ian Parker’s massive New Yorker profile of Apple designer Jonathan Ive. Over the years, we’ve seen plenty of extended feature pieces on Ive, who somehow manages to preserve his reputation as an intensely private man, but it feels like Parker set out to write the one to end them all: it’s well over fifteen thousand words long, and there were times, as I watched my progress creeping slowly by in the scroll bar, when I felt like I was navigating an infinite loop of my own. (It also closes in that abrupt New Yorker way that takes apparent pride in ending articles at the most arbitrary place possible, as if the writer had suffered a stroke before finishing the last paragraph.) Still, it’s a fine piece, crammed with insights, and I expect that I’ll read it again. I’ve become slightly less enamored of Apple ever since my latest MacBook started to disintegrate a few months after I bought it—by which I mean its screws popped out one by one and its plastic casing began to bubble alarmingly outward—but there’s no doubting Ive’s vision, intelligence, and ability to articulate his ideas.

Like a lot of Apple coverage, Parker’s article builds on the company’s mythology while making occasional stabs at deflating it, with paragraphs of almost pornographic praise alternating with a skeptical sentence or two. (“I noticed that, at this moment in the history of personal technology, Cook still uses notifications in the form of a young woman appearing silently from nowhere to hold a sheet of paper in his line of sight.”) And he’s best not so much at talking about Apple’s culture as at talking about how they talk about it. Here’s my favorite part:

[Ive] linked the studio’s work to NASA’s: like the Apollo program, the creation of Apple products required “invention after invention after invention that you would never be conscious of, but that was necessary to do something that was new.” It was a tic that I came to recognize: self-promotion driven by fear that one’s self-effacement might be taken too literally. Even as Apple objects strive for effortlessness, there’s clearly a hope that the effort required—the “huge degree of care,” the years of investigations into new materials, the months spent enforcing cutting paths in Asian factories—will be acknowledged.

I love this because it neatly encapsulates the neurosis at the heart of so much creative work, from fiction to industrial design. We’re constantly told that we ought to strive for simplicity, and that the finished product, to use one of Ive’s favorite terms, should seem “inevitable.” Yet we’re also anxious that the purity of the result not be confused with the ease of its creation. Writers want readers to accept a novel as a window onto reality while simultaneously noticing the thousands of individual choices and acts of will that went into fashioning it, which is inherently impossible. And it kills us. Writing a novel is a backbreaking process that wants to look as simple as life, and that contradiction goes a long way toward explaining why authors never feel as if they’ve received enough love: the terms of the game that they’ve chosen ensure that most of their work remains invisible. Novels, even mediocre ones, consist of “invention after invention after invention,” a daunting series, as T.H. White noted, of “nouns, verbs, prepositions, adjectives, reeling across the page.” And even when a story all but begs us to admire the brilliance of its construction, we’ll never see more than a fraction of the labor it required.

So what’s a creative artist to do? Well, we can talk endlessly about process, as Ive does, and dream of having a profile in The Paris Review, complete with images of our discarded drafts. Or we can push complexity to the forefront, knowing at least that it will be acknowledged, even if it goes against what we secretly believe about the inevitability of great art. (“The artist, like the God of creation, remains within or behind or beyond or above his handiwork, invisible, refined out of existence, indifferent, paring his fingernails,” James Joyce writes, and yet few other authors have been so insistent that we recognize his choices, even within individual words.) Or, if all else fails, we can rail against critics who seem insufficiently appreciative of how much work is required to make something feel obvious, or who focus on some trivial point while ignoring the agonies that went into a story’s foundations. None of which, of course, prevents us from taking the exact same attitude toward works of art made by others. Ultimately, the only solution is to learn to live with your private store of effort, uncertainty, and compromise, never advertising it or pointing to all your hard work as an excuse when it falls short. Because in the end, the result has to stand on its own, even if it’s the apple of your eye.

Written by nevalalee

February 18, 2015 at 9:50 am

Beethoven, Freud, and the mystery of genius

“The joy of listening to Beethoven is comparable to the pleasure of reading Joyce,” writes Alex Ross in a recent issue of The New Yorker: “The most paranoid, overdetermined interpretation is probably the correct one.” Even as someone whose ear for classical music is underdeveloped compared to his interest in other forms of art, I have to agree. Great artists come in all shapes and sizes, but the rarest of all is the kind whose work can sustain the most meticulous level of scrutiny because we’re aware that every detail is a conscious choice. When we interpret an ordinary book or a poem, our readings are often more a reflection of our own needs than the author’s intentions; even with a writer like Shakespeare, it’s hard to separate the author’s deliberate decisions from the resonances that naturally emerge from so much rich language set into motion. With Beethoven, Joyce, and a handful of others—Dante, Bach, perhaps Nabokov—we have enough information about the creative process to know that little, if anything, has happened by accident. Joyce explicitly designed his work to “keep professors busy for centuries,” and Beethoven composed for a perfect, omniscient audience that he seemed to will into existence.

Or as Colin Wilson puts it: “The message of the symphonies of Beethoven could be summarized: ‘Man is not small; he is just bloody lazy.'” When you read Ross’s perceptive article, which reviews much of the recent scholarship on Beethoven and his life, you’re confronted by the same tension that underlies any great body of work made within historical memory. On the one hand, Beethoven has undergone a kind of artistic deification, and there’s a tradition, dating back to E.T.A. Hoffmann, that there are ideas and emotions being expressed in his music that can’t be matched by any other human production; on the other, there’s the fact that Beethoven was a man like any other, with a messy personal life and his own portion of pettiness, neediness, and doubt. As Ross points out, before Beethoven, critics were accustomed to talk of “genius” as a kind of impersonal quality, but afterward, the concept shifted to that of “a genius,” which changes the terms of the conversation without reducing its underlying mystery. Beethoven’s biography provides tantalizing clues about the origins of his singular greatness—particularly his deafness, which critics tend to associate with his retreat to an isolated, visionary plane—but it leaves us with as many questions as before.

As it happens, I read Ross’s article in parallel with Howard Markel’s An Anatomy of Addiction, which focuses on the early career of another famous resident of Vienna. Freud seems to have been relatively indifferent to music: he mentions Beethoven along with Goethe and Leonardo Da Vinci as “great men” who have produced “splendid creations,” although this feels more like a rhetorical way of filling out a trio than an expression of true appreciation. Otherwise, his relative silence on the subject is revealing in itself: if he wanted to interpret an artist’s work in psychoanalytic terms, Beethoven’s life would have afforded plenty of material, and he didn’t shy from doing the same for Leonardo and Shakespeare. It’s possible that Freud avoided Beethoven because of the same godlike intentionality that makes him so fascinating to listeners and critics. If we’ve gotten into the habit of drawing a distinction between what a creative artist intends and his or her unconscious impulses, it’s largely thanks to Freud himself. Beethoven stands as a repudiation, or at least a strong counterexample, to this approach: however complicated Beethoven may have been as a man, it’s hard to make a case that there was ever a moment when he didn’t know what he was doing.

This may be why Freud’s genius—which was very real—seems less mysterious than Beethoven’s: we know more about Freud’s inner life than just about any other major intellectual, thanks primarily to his own accounts of his dreams and fantasies, and it’s easy to draw a line from his biography to his work. Markel, for instance, focuses on the period of Freud’s cocaine use, and although he stops short of suggesting that all of psychoanalysis can be understood as a product of addiction, as others have, he points out that Freud’s early publications on cocaine represent the first time he publicly mined his own experiences for insight. But of course, there were plenty of bright young Jewish doctors in Vienna in the late nineteenth century, and while many of the ideas behind analysis were already in the air, it was only in Freud that they found the necessary combination of obsessiveness, ambition, and literary brilliance required for their full expression. Freud may have done his best to complicate our ideas of genius by introducing unconscious factors into the equation, but paradoxically, he made his case in a series of peerlessly crafted books and essays, and their status as imaginative literature has only been enhanced by the decline of analysis as a science. Freud doesn’t explain Freud any more than he explains Beethoven. But this doesn’t stop him, or us, from trying.

“And this has something to do with Operation Pepel?”

Note: This post is the forty-first installment in my author’s commentary for City of Exiles, covering Chapter 40. You can read the earlier installments here.

As I’ve written here elsewhere, research in fiction is less about factual accuracy than a way of dreaming. Fiction, like a dream, isn’t assembled out of nothing: it’s an assimilation and combination of elements that we’ve gathered in our everyday lives, in stories we hear from friends, in our reading and consumption of other works of art, and through the conscious investigation of whatever world we’ve decided to explore. This last component is perhaps the most crucial, and probably the least appreciated. Writers vary in the degree of novelistic attention they can bring to their surroundings at any one time, but most of us learn to dial it down: it’s both exhausting and a little unfair to life itself to constantly be mining for material. When we commence work on a project, though, our level of engagement rises correspondingly, to the point where we start seeing clues or messages everywhere we look. Research is really just a way of taking that urge for gleaning or bricolage and making it slightly more systematic, exposing ourselves to as many potential units of narrative as we can at a time when we’re especially tuned to such possibilities.

The primordial function of research—of “furnishing and feathering a world,” in Anthony Lane’s memorable phrase—is especially striking when it comes to details that would never be noticed by the average reader. Few of us would care whether or not the fence at No. 7 Eccles Street could really be climbed by an ordinary man, but for James Joyce, it was important enough for him to write his aunt to confirm it. If we’re thinking only in terms of the effect on readers, this kind of meticulous accuracy can start to seem a little insane, but from the author’s point of view, it makes perfect sense. For most of the time we spend living with a novel, the only reader whose opinion matters is our own, and a lot of research consists of the author convincing himself that the story he’s describing could really have taken place. In order to lose ourselves in the fictional dream, the smallest elements have to seem persuasive to us, and even if a reader couldn’t be expected to know that we’ve fudged or invented a detail that we couldn’t verify elsewhere, we know it, and it subtly affects how deeply we can commit ourselves to the story we’re telling. A reader may never notice a minor dishonesty, but the writer will always remember it.

"And this has something to do with Operation Pepel?"

In my own fiction, I’ve tried to be as accurate as I can even in the smallest things. I keep a calendar of the major events in the story, and I do my best to square it with such matters as railway schedules, museum hours, and the times for sunrise and sunset. (As Robert Louis Stevenson wrote: “And how troublesome the moon is!”) I walk the locations of each scene whenever possible, counting off the steps and figuring out how long it would take a character to get from one point to another, and when I can’t go there in person, I spend a long time on Google Street View. It may seem like a lot of trouble, but it actually saves me work in the long run: being able to select useful details from a mass of existing material supplements the creative work that has to be done, and I’m always happier to take something intact from the real world than to have to invent it from scratch. And I take a kind of perverse pleasure in the knowledge that a reader wouldn’t consciously notice any of it. At best, these details serve as a kind of substratum for the visible events of the story, and tiny things add up to a narrative that is convincing in its broadest strokes. There’s no guarantee that such an approach will work, of course, but it’s hard to make anything work without it.

In City of Exiles, for instance, I briefly mention something called Operation Pepel, which is described as a special operation by Russian intelligence that occurred in Turkey in the sixties. Operation Pepel did, in fact, exist, even if we don’t know much about who was involved or what it was: I encountered it thanks to a passing reference, amounting to less than a sentence, in the monumental The Sword and the Shield by Christopher Andrew and Vasili Mitrokhin. (It caught my eye, incidentally, only because I’d already established that part of the story would center on an historical event involving Turkey, which is just another illustration of how parts of the research process can end up informing one another across far-flung spaces.) Later, I tie Operation Pepel—purely speculatively—to elements of the Soviet poison program, and the details I provide on such historical events as Project Bonfire are as accurate as I can make them. None of this will mean anything even to most specialists in the history of Russia, and I could easily have made up something that would have served just as well. But since I invent so much elsewhere, and so irresponsibly, it felt better to retain as many of the known facts as I could. It may not matter to the reader, but it mattered a lot to me…

Written by nevalalee

July 24, 2014 at 9:44 am

“But the changes reveal more than they intend…”

Note: This post is the thirty-sixth installment in my author’s commentary for City of Exiles, covering Chapter 35. You can read the earlier installments here.

Yesterday, I alluded to the cartographer Arthur H. Robinson’s story of how he developed his famous projection of the globe: he decided on the shapes he wanted for the continents first, then went back to figure out the underlying mathematics. Authors, of course, engage in this kind of inverted reasoning all the time. One of the peculiar things about a novel—and about most kinds of narrative art—is that while, with a few exceptions, it’s designed to be read in a linear fashion, the process of its conception is anything but straightforward. A writer may begin with a particular scene he wants to write, or, more commonly, a handful of such scenes, then assemble a cast of characters and an initial situation that will get him from one objective to the next. He can start with an outrageous plot twist and then, using the anthropic principle of fiction, set up the story so that the final surprise seems inevitable. Or he can take a handful of subjects or ideas he wants to explore and find a story that allows him to talk about them all. Once the process begins, it rarely proceeds straight from start to finish: it moves back and forth, circling back and advancing, and only in revision does the result begin to feel all of a piece.

And I’ve learned that this tension between the nonlinear way a novel is conceived and the directional arrow of the narrative is a central element of creativity. (In many ways, it’s the reverse of visual art: a painting is built up one element at a time, only to be experienced all at once when finished, which leads to productive tensions and discoveries of its own.) In most stories, the range of options open to the characters grows increasingly narrow as the plot advances: the buildup of events and circumstance leaves the protagonist more and more constrained, whether it’s by a web of danger in a thriller or the slow reduction of personal freedom in a more realistic novel. That’s how suspense emerges, covertly or overtly; we read on to see how the characters will maneuver within the limits that the story has imposed. What ought to be less visible is the fact that the author has been operating under similar constraints from the very first page. He has some idea of where the story is going; he knows that certain incidents need to take place, rather than their hypothetical alternatives, to bring the characters to the turning points he’s envisioned; and this knowledge, combined with the need to conceal it, forces him to be more ingenious and resourceful than if he’d simply plowed ahead with no sense of what came next.

"But the changes reveal more than they intend..."

This is why I always set certain rules or goals for myself in advance of preparing a story, and it often helps if they’re a little bit arbitrary. When I started writing City of Exiles, for instance, I decided early on that the vision of Ezekiel would play a role in the plot, even if I didn’t know how. This is partially because I’d wanted to write something on the merkabah—the vision of the four fabulous creatures attending the chariot of God—for a long time, and I knew the material was rich and flexible enough to inform whatever novel I decided to write. More important, though, was my need for some kind of overriding constraint in the first place. Knowing a big element of the novel in advance served as a sort of machine for making choices: certain possibilities would suggest themselves over others, from the highest level to the lowest, and if I ever felt lost or got off track, I had an existing structure to guide me back to where I needed to be. And really, it could have been almost anything; as James Joyce said of the structure of Ulysses, it’s a bridge that can be blown up once the troops have gotten to the other side.  (Not every connective thread is created equal, of course. Using the same approach I’d used for my previous novels, I spent a long time trying to build Eternal Empire around the mystery of the Urim and Thummim, only to find that the logical connections I needed just weren’t there.)

Chapter 35 contains the longest extended discussion of Ezekiel’s vision in the novel so far, as Wolfe pays her second visit to Ilya in prison, and it provides an illustration in miniature of the problems I had to confront throughout the entire story. The material may be interesting in its own right, but if I can’t find ways of tying it back to events in the larger narrative, readers might well wonder what it’s doing here at all. (To be fair, some readers did have this reaction.) At various points in this chapter, you can see me, in the person of Wolfe, trying to bring the discussion back around to what is happening elsewhere in the story. According to the rabbis, Ezekiel’s vision can’t be discussed with a student under forty, and those who analyze the merkabah without the proper preparation run the risk of being burned alive by fire from heaven, which turns it into a metaphor for forbidden knowledge of any kind. And my own theory about the vision’s meaning, which is highly indebted to David J. Halperin’s book The Faces of the Chariot, centers on the idea that elements of the story have been redacted or revised, which points to the acts of deception and erasure practiced by the Russian intelligence services. In the end, Wolfe leaves with a few precious hints, and if she’s able to put them to good use, that’s no accident. The entire story is designed to take her there…

Written by nevalalee

June 19, 2014 at 9:55 am

On the novelist’s couch

Recently, I’ve been thinking a lot about Freud. Psychoanalysis may be a dying science, or religion, with its place in our lives usurped by neurology and medication, but Freud’s influence on the way we talk about ourselves remains as strong as ever, not least because he was a marvelous writer. Harold Bloom aptly includes him in a line of great essayists stretching back to Montaigne, and he’s far and away the most readable and likable of all modern sages. His writings, especially his lectures and case notes, are fascinating, and they’re peppered with remarkable insights, metaphors, and tidbits of humor and practical advice. Bloom has argued convincingly for Freud as a close reader of Shakespeare, however much he might have resisted acknowledging it—he believed until the end of his days that Shakespeare’s plays had really been written by the Earl of Oxford, a conjecture known endearingly as the Looney hypothesis—and he’s as much a prose poet as he is an analytical thinker. Like most geniuses, he’s as interesting in his mistakes as in his successes, and even if you dismiss his core ideas as an ingeniously elaborated fantasy, there’s no denying that he constructed the central mythology of our century. When we talk about the libido, repression, anal retentiveness, the death instinct, we’re speaking in the terms that Freud established.

And I’ve long been struck by the parallels between psychoanalysis and what writers do for a living. Freud’s case studies read like novels, or more accurately like detective stories, with the analyst and the patient navigating through many wild guesses and wrong turns to reach the heart of the mystery. In her classic study Psychoanalysis: The Impossible Profession, Janet Malcolm writes:

In the Dora paper, Freud illustrates the double vision of the patient which the analyst must maintain in order to do his work: he must invent the patient as well as investigate him; he must invest him with the magic of myth and romance as well as reduce him to the pitiful bits and pieces of science and psychopathology. Only thus can the analyst sustain his obsessive interest in another—the fixation of a lover or a criminal investigator—and keep in sight the benign raison d’être of its relentlessness.

To “the fixation of a lover or a criminal investigator,” I might also add “of a writer.” The major figures in a novel can be as unknowable as the patient on the couch, and to sustain the obsession that finishing a book requires, a writer often has to start with an imperfect, idealized version of each character, then grope slowly back toward something more true. (Journalists, as Malcolm has pointed out elsewhere, sometimes find themselves doing the same thing.)

The hard part, for novelists and analysts alike, is balancing this kind of intense engagement with the objectivity required for good fiction or therapy. James Joyce writes that a novelist, “like the God of the creation, remains within or behind or beyond or above his handiwork, invisible, refined out of existence, indifferent, paring his fingernails,” and that’s as fine a description as any of the perfect psychoanalyst, who sits on a chair behind the patient’s couch, pointedly out of sight. It’s worth remembering that psychoanalysis, in its original form, has little in common with the more cuddly brands of therapy that have largely taken its place: the analyst is told to remain detached, impersonal, a blank slate on which the patient can project his or her emotions. At times, the formal nature of this relationship can resemble a kind of clinical cruelty, with earnest debates, for instance, over whether an analyst should express sympathy if a patient tells him that her mother has died. This may seem extreme, but it’s also a way of guarding against the greatest danger of analysis: that transference, in which the patient begins to use the analyst as an object of love or hate, can run the other way. Analysts do fall in love with their patients, as well as patients with their analysts, and the rigors of the psychoanalytic method are designed to anticipate, deflect, and use this.

It’s in the resulting dance between detachment and connection that psychoanalysis most resembles the creative arts. Authors, like analysts, are prone to develop strong feelings toward their characters, and it’s always problematic when a writer falls in love with the wrong person: witness the case of Thomas Harris and Hannibal Lecter—who, as a psychiatrist himself, could have warned his author of the risk he was taking. Here, authors can take a page from their psychoanalytic counterparts, who are encouraged to turn the same detached scrutiny on their own feelings, not for what it says about themselves, but about their patients. In psychoanalysis, everything, including the seemingly irrelevant thoughts and emotions that occur to the analyst during a session, is a clue, and Freud displays the same endless diligence in teasing out their underlying meaning as a good novelist does when dissecting his own feelings about the story he’s writing. Whether anyone is improved by either process is another question entirely, but psychoanalysis, like fiction, knows to be modest in its moral and personal claims. What Freud said of the patient may well be true of the author: “But you will see for yourself that much has been gained if we succeed in turning your hysterical misery into common unhappiness.”

Written by nevalalee

October 25, 2013 at 8:49 am

Luca Brasi flubs his lines, or the joy of happy accidents

During the troubled filming of The Godfather, Lenny Montana, the actor who played the enforcer Luca Brasi, kept blowing his lines. During his big speech with Don Corleone at the wedding—”And may their first child be a masculine child”—Montana, anxious about working with Brando for the first time, began to speak, hesitated, then started over again. It was a blown take, but Coppola liked the effect, which seemed to capture some of the character’s own nervousness. Instead of throwing the shot away, he kept it, and he simply inserted a new scene showing Brasi rehearsing his words just before the meeting. It was a happy accident of the sort that you’ll often find in the work of a director like Coppola, who is more open than most, almost to a fault, to the discoveries that can be made on the set. (A more dramatic example is the moment early in Apocalypse Now when Martin Sheen punches and breaks the mirror in his hotel room, which wasn’t scripted—Sheen cut up his hand pretty badly. And for more instances of how mischance can be incorporated into a film, please see this recent article by Mike D’Angelo of The A.V. Club, as well as the excellent comments, which inspired this post.)

You sometimes see these kinds of happy accidents in print as well, but they’re much less common. One example is this famous story of James Joyce, as told by Richard Ellmann:

Once or twice he dictated a bit of Finnegans Wake to [Samuel] Beckett, though dictation did not work very well for him; in the middle of one such session there was a knock at the door which Beckett didn’t hear. Joyce said, “Come in,” and Beckett wrote it down. Afterwards he read back what he had written and Joyce said, “What’s that ‘Come in?’” “Yes, you said that,” said Beckett. Joyce thought for a moment, then said, “Let it stand.”

Similarly, a chance misprint inspired W.H. Auden to change his line “The poets have names for the sea” to “The ports have names for the sea.” And it’s widely believed that one of the most famous lines in all of English poetry, “Brightness falls from the air,” was also the result of a typo: Thomas Nashe may have really written “Brightness falls from the hair,” which makes more sense in context, but is much less evocative.

Still, it isn’t hard to see why such accidents are more common in film than in print. A novelist or poet can always cross out a line or delete a mistyped word, but a filmmaker is uniquely forced to live with every flubbed take or reading: once you’ve started shooting, there’s no going back, and particularly in the days before digital video, a permanent record exists of each mistake. As a result, you’re more inclined to think hard about whether or not you can use what you have, or if the error will require another costly camera setup. In some ways, all of film amounts to this kind of compromise. You never get quite the footage you want: no matter how carefully you’ve planned the shoot, when the time comes to edit, you’ll find that the actors are standing in the wrong place for one shot to cut cleanly to the next, or that you’re missing a crucial closeup that would clarify the meaning of the scene. It’s part of the craft of good directors—and editors—to cobble together something resembling their original intentions from material that always falls short. Every shot in a movie, in a sense, is a happy accident, and the examples I’ve mentioned above are only the most striking instances of a principle that governs the entire filmmaking process.

And it’s worth thinking about the ways in which artists in other media can learn to expose themselves to such forced serendipity. (I haven’t even mentioned the role it plays in such arts as painting, in which each decision starts to feel similarly irrevocable, at least once you’ve started to apply paint to canvas.) One approach, which I’ve tried in the planning stages of my own work, is to work in as permanent a form as possible: pen on paper, rather than pencil or computer, which means that every wrong turn or mistaken impulse lingers on after you’ve written it. A typewriter, I suspect, might play the same role, and I have a feeling that writers of a previous generation occasionally shaped their sentences to match a mistyped word, rather than going through the trouble of typing the page all over again. Writers are lucky: we have a set of tools of unmatched portability, flexibility, and privacy, and it means that we can deal with any errors at our leisure, at least until they see print. But with every gain, there’s also a loss: in particular, of the kind of intensity and focus that actors describe when real, expensive film is running through the camera. When so much is on the line, you’re more willing to find ways of working with what you’ve been given by chance. And that’s an attitude that every artist could use.

Written by nevalalee

October 8, 2013 at 8:12 am

The problem of narrative complexity

Earlier this month, faced with a break between projects, I began reading Infinite Jest for the first time. If you’re anything like me, this is a book you’ve been regarding with apprehension for a while now—I bought my copy five or six years ago, and it’s followed me through at least three moves without being opened beyond the first page. At the moment, I’m a couple of hundred pages in, and although I’m enjoying it, I’m also glad I waited: Wallace is tremendously original, but he also pushes against his predecessors, particularly Pynchon, in fascinating ways, and I’m better equipped to engage him now than I would have been earlier on. The fact that I’ve published two novels in the meantime also helps. As a writer, I’m endlessly fascinated by the problem of managing complexity—of giving a reader enough intermediate rewards to justify the demands the author makes—and Wallace handles this beautifully. Dave Eggers, in the introduction to the edition I’m reading now, does a nice job of summing it up:

A Wallace reader gets the impression of being in a room with a very talkative and brilliant uncle or cousin who, just when he’s about to push it too far, to try our patience with too much detail, has the good sense to throw in a good lowbrow joke.

And the ability to balance payoff with frustration is a quality shared by many of our greatest novels. It’s relatively easy to write an impenetrable book that tries the reader’s patience, just as it’s easy to create a difficult video game that drives players up the wall, but parceling out small satisfactions to balance out the hard parts takes craft and experience. Mike Meginnis of Uncanny Valley makes a similar point in an excellent blog post about the narrative lessons of video games. While discussing the problem of rules and game mechanics, he writes:

In short, while it might seem that richness suggests excess and maximal inclusion, we actually need to be selective about the elements we include, or the novel will not be rich so much as an incomprehensible blur, a smear of language. Think about the very real limitations of Pynchon as a novelist: many complain about his flat characters and slapstick humor, but without those elements to manage the text and simplify it, his already dangerously complex fiction would become unreadable.

Pynchon, of course, casts a huge shadow over Wallace—sometimes literally, as when two characters in Infinite Jest contemplate their vast silhouettes while standing on a mountain range, as another pair does in Gravity’s Rainbow. And I’m curious to see how Wallace, who seems much more interested than Pynchon in creating plausible human beings, deals with this particular problem.

The problem of managing complexity is one that has come up on this blog several times, notably in my discussion of the work of Christopher Nolan: Inception’s characters, however appealing, are basically flat, and the action is surprisingly straightforward once we’ve accepted the premise. Otherwise, the movie would fall apart from trying to push complexity in more than one direction at once. Even works that we don’t normally consider accessible to a casual reader often incorporate elements of selection or order into their design. The Homeric parallels in Joyce’s Ulysses are sometimes dismissed as an irrelevant trick—Borges, in particular, didn’t find them interesting—but they’re very helpful for a reader trying to cut a path through the novel for the first time. When Joyce dispensed with that device, the result was Finnegans Wake, a novel greatly admired and rarely read. That’s why encyclopedic fictions, from The Divine Comedy to Moby-Dick, tend to be structured around a journey or some other familiar framework, which gives the reader a compass and map to navigate the authorial wilderness.

On a more modest level, I’ve frequently found myself doing this in my own work. I’ve mentioned before that I wanted one of the three narrative strands in The Icon Thief to be a police procedural, which, with its familiar beats and elements, would serve as a kind of thread to pull the reader past some of the book’s complexities. More generally, this is the real purpose of plot. Kurt Vonnegut, who was right about almost everything, says as much in one of those writing aphorisms that I never tire of quoting:

I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading.

The emphasis is mine. Plot is really a way of easing the reader into that greatest of imaginative leaps, which all stories, whatever their ambitions, have in common: the illusion that these events are really taking place, and that characters who never existed are worthy of our attention and sympathy. Plot, structure, and other incidental pleasures are what keep the reader nourished while the real work of the story is taking place. If we take it for granted, it’s because it’s a trick that most storytellers learned a long time ago. But the closer we look at its apparent simplicity, the sooner we realize that, well, it’s complicated.

“The following morning, a few blocks from the courthouse…”

leave a comment »

"The following morning..."

(Note: This post is the thirty-fourth installment in my author’s commentary for The Icon Thief, covering Chapter 33. You can read the earlier installments here.)

One of the great pleasures of planning a new writing project is the chance to do research on location. Writing is such a sedentary pursuit that any excuse to get out of the house is usually welcome, and one of the best ways a writer can spend his time is by exploring new or familiar places with an eye to their dramatic potential. And you don’t need to go far afield to make fascinating discoveries. One thing I’ve learned as a writer is that the observing faculty—the part of the brain that mines the world around you for material—can’t stay switched on all the time: it’s just too exhausting. Ideally, a writer, as Henry James said, should be one on whom nothing is lost, but aside from a few exceptional personalities like Proust or Updike, most of us learn to parcel out our energies, activating that ravenous inner eye only when necessary. In particular, it tends to be most alert when we’re regarding a location with a specific story in mind. And once we’ve made that inward adjustment, the most ordinary places are suddenly bursting with meaning.

In particular, when you’re writing a thriller, you’re often looking for a new way to stage a murder or a chase in a real location. Most suspense novelists ultimately become what Thomas Pynchon calls “aficionados of the chase scene, those who cannot look at the Taj Mahal, the Uffizi, the Statue of Liberty without thinking chase scene, chase scene, wow yeah Douglas Fairbanks scampering across that moon minaret there…” In my own work, I’ve mentally planned heists, killings, and chases in Boston, New York, Philadelphia, London, and elsewhere, a process that usually involves spending hours at a museum, neighborhood, or public building, taking surreptitious photographs and generally acting as suspiciously as possible. On a few occasions, I’ve received stern warnings from security. And although I’ve sometimes been forced to plan scenes from a distance, with the help of guidebooks, photo references, and Google Maps, there’s no substitute for being there on the ground yourself, pacing off the exact route that your hero or villain will follow.

"Leaving the checkpoint, he entered the courthouse..."

And like most forms of research, location work isn’t primarily about factual accuracy, but about furnishing the material for dreams. It’s much more rewarding to write a scene that takes place on a real, particular street, with specific alleys and stairwells and other landmarks for the action, than to set one on a street that exists only in your imagination. A real location, like a standing set, suggests props, story beats, and bits of business that would never occur to you on your own. Later, while you’re writing, you’re free to fudge the details if you must, but it’s better to work within the constraints that the actual location affords. James Joyce knew this when he asked his aunt to verify that an ordinary man could climb over the fence at No. 7 Eccles Street. And if you’re writing a chase scene, you’re more likely to come up with something ingenious or surprising when you notice, say, that none of the exits are conveniently located near the area where the main action takes place, and that your protagonist will need to get past several levels of security in order to make his escape.

This is basically the process that went into Chapter 33 of The Icon Thief, in which Ilya meets Sharkovsky for an exchange at the New York County Courthouse. I chose this landmark because I wanted my characters to meet in a secure location with metal detectors, so that neither one could be armed, and a courthouse seemed like an interesting backdrop. Once the decision was made, I spent the better part of an afternoon hanging out in Foley Square, checking out the surroundings and taking notes on secondary locations I might want to use—the playground, the comfort station, the construction site at the federal building next door—before entering the courthouse itself. Inside, I took notes on security, layout, and architecture, and paid special attention to the placement of the emergency exits. Above all else, I tried to see the building as it would look through Ilya’s eyes. And by the time I was done, I found that the logic of the building had determined the shape of the chapter itself, as well as the two that followed. Because I couldn’t see it, naturally, without thinking of a chase scene…

Written by nevalalee

February 7, 2013 at 9:50 am

The better part of valor

leave a comment »

This morning, I published an essay in The Daily Beast on Karl Rove’s curious affection for the great Argentine author Jorge Luis Borges, a connection that I’ve found intriguing ever since Rove mentioned it two years ago in a Proust questionnaire for Vanity Fair. Borges, as I’ve mentioned before, is one of my favorite writers, and it’s surprising, to say the least, to find myself agreeing with Rove on something so fundamental. It’s also hard to imagine two men who have less in common. While Rove jumped with both feet into a political career, and was cheerfully engaging in dirty tricks before he was out of college, Borges survived the Perón regime largely by keeping his head down, and in later years seemed pointedly detached from events in Argentina. It’s a mistake to think of him as an entirely apolitical writer—few authors of his time wrote more eloquently against the rise of Nazism—but it’s clear that for much of his life, he just wanted to be left alone. As a result, he’s been criticized, and not without reason, for literally turning a blind eye to the atrocities of the Dirty War, claiming that his loss of eyesight made it impossible to read the newspapers.

This policy of avoidance is one that we often see in the greatest writers, who prudently decline to engage in politics, often for reasons of survival. Shakespeare was more than willing, when the occasion demanded it, to serve as the master of revels for the crown, but as Harold Bloom points out, he carefully avoided any treatment of the political controversies of his time, perhaps mindful of the cautionary fate of Christopher Marlowe. Discretion, as Falstaff advises us, is the better part of valor, and also of poetry, at least if the poet wants to settle into a comfortable retirement in Stratford. Dante, Shakespeare’s only peer among Western poets, might seem like an exception to the rule—he certainly didn’t shy away from political attacks—but his most passionate jeremiads were composed far from Florence. “Beyond a doubt he was the wisest, most resolute man of his time,” Erich Auerbach writes. “According to the Platonic principle which is still valid whenever a man is manifestly endowed with the gift of leadership, he was born to rule; however, he did not rule, but led a life of solitary poverty.”

Borges, too, chose exile, spending his declining years overseas, and finally died in Geneva. It’s a pattern that we see repeatedly in the lives of major poets and artists, especially those who emerge from nations with a history of political strife. The great works of encyclopedic fiction, as Edward Mendelson reminds us, tend to be written beyond the borders of the countries they document so vividly: the closing words of Ulysses, the encyclopedia of Dublin, are “Trieste-Zurich-Paris.” This is partly the product of sensible caution, but it’s also a professional necessity. Most creative work is founded on solitude, quiet, and a prudent detachment from the world, and any degree of immersion in politics tends to destroy the delicate thread of thought necessary for artistic production. Even when writers are tempted by worldly power, they’re usually well aware of the consequences. Norman Mailer, writing of his doomed run for mayor of New York, observes of himself, in the third person: “He would never write again if he were Mayor (the job would doubtless strain his talent to extinction) but he would have his hand on the rump of History, and Norman was not without such lust.”

In the end, as Mailer notes acidly, “He came in fourth in a field of five, and politics was behind him.” Which is all for the best—otherwise, we never would have gotten The Executioner’s Song or Of a Fire on the Moon, not to mention Ancient Evenings, which is the sort of foolhardy masterpiece, written over the course of a decade, that could only be written by a man whose political ambitions have been otherwise frustrated. Besides, as I’ve pointed out elsewhere, novelists don’t make good politicians. And their work is often the better for it. In the case of Borges, there’s no question that much of what makes him great—his obsession with ideas, his receptivity to the structures of speculative fiction, his lifelong dialogue with all of world literature—arose from this tactical refusal to engage in politics. Unable or unwilling to criticize the government, he turned instead to a life of ideas, leaving behind a body of extraordinary fiction defined as much by what it leaves out as by what it includes. And I don’t think any sympathetic reader would want it any other way.

The oops file

with 2 comments

After thirty yards, the road curved and the shade trees vanished. To his left, the hedge continued as before. On his right, the houses disappeared, replaced by a pond trimmed with reeds and pitch pines. Ospreys floated on the calm surface of the water.

This description comes from Chapter 6 of The Icon Thief, when Ilya is casing the mansion where he and another thief will shortly stage an elaborate heist, and it strikes me as a nice image, one that clearly evokes the setting, a peaceful neighborhood in the Hamptons. It isn’t flashy, but the writing is efficient and clear. The trouble, unfortunately, is that it contains a mistake, as a reader pointed out to me in a terse email, which read in its entirety: “Ospreys do not rest on the water; they rest in trees (preferably dead ones).” Well, I hope he liked the rest of the book. But I can’t deny that it’s a definite error on my part. In the months since The Icon Thief was first published, I’ve noticed a few factual lapses like this, some of which I’d rather not mention, although I’d like to correct the record to reflect that the woman to whom I refer, in passing, as “a dead patron of the arts” is actually very much alive.

And yet I’m strangely relieved that there aren’t more mistakes. The Icon Thief contains hundreds of factual statements that, even outside the context of the story, can be independently checked, verified, or disproved, and so far, the errors I’ve been told about or seen on my own amount to only a handful. I’ve been especially gratified to hear from a number of readers in the art world, including two experts on Duchamp, who would be more than capable of pointing out any inaccuracies. So far, if they’ve found any serious ones, they’ve been too polite to say so—allowing, of course, for the occasional liberties I’ve taken in the interest of constructing a fictional narrative. (I should also confess that my readers caught a number of similar mistakes before the book was published, which only demonstrates the necessity of subjecting any manuscript to thoughtful critical review.) But I’ve put a lot of effort into making sure, within human reason, that this book is correct in its details, down to points that are likely to elude the attention, or interest, of even the most diligent reader.

In this regard, I was motivated throughout by the example of the ferociously observant readers of the Sherlock Holmes stories. Arthur Conan Doyle was not what we’d call a great researcher, and he had trouble keeping even his own continuity straight. The most delightful aspect of the field of Sherlockian studies is the energy that these readers invest in both fact-checking and justifying any discrepancies they uncover, which include issues ranging from the location of Watson’s wound to the species of the speckled band to whether the weather in London was, in fact, drizzly and gray on a particular morning in 1895. (Sometimes they go a little too far: I’ve gone on record as saying that The Annotated Sherlock Holmes by William S. Baring-Gould is the best book in the world, but if it has one shortcoming, it’s that the editor rearranges the stories into his own eccentric chronology, ignoring narrative logic and character development to order them based on, say, contemporary weather reports.) And whenever I go back to check my own work, it’s with an eye to such a reader: highly intelligent, endlessly skeptical, and blessed with a seemingly unlimited amount of time.

Of course, the odds of my novels ever receiving even a fraction of the attention of the Holmes stories are pretty remote. All the same, the habit of reading your own work with this kind of audience in mind is a useful one. As I’ve noted before, all novels, especially in the suspense genre, tend to use factual information and accuracy in small details as a kind of synecdoche for the credibility of the plot as a whole, and any lapse will throw not just the disputed passage but the entire story into question. Even the tiniest mistake will pull the reader out of the fictional dream. As a result, I’ve found myself checking weather reports for the day on which a certain scene takes place, usually with the assistance of the invaluable Wolfram Alpha, and poring over maps and photographs—or, better yet, visiting locations in person—to make sure the action is plausible, or at least physically possible. While writing Ulysses, James Joyce wrote a letter to his aunt asking her to verify that an ordinary man could climb over the fence at No. 7 Eccles Street, and it’s that kind of diligence toward which we should strive. And all the while, we should remember that, unlike the Navajos, there’s no need for us to weave deliberate flaws into our blankets—they’ll have plenty of flaws of their own.

Written by nevalalee

October 17, 2012 at 10:00 am

Better late than never: The Magic Mountain

leave a comment »

It’s safe to say that out of all the acknowledged masterpieces of twentieth-century literature, Thomas Mann’s The Magic Mountain is the least inviting. Part of this is due to the fact that Mann’s reputation, or his snob chic, has suffered in comparison to Joyce and Proust, at least for the purposes of cocktail party conversation. The smooth surface of Mann’s prose offers fewer enticements to the casual browser: while a glance at the pages of Ulysses suggests a wealth of unexplored treasures, Mann presents only an unbroken succession of dense paragraphs. And there’s no denying that the plot of The Magic Mountain—a young engineer, Hans Castorp, visits a sanitarium in the Alps for a short visit and ends up staying for seven years—doesn’t quite promise nonstop delights, especially when spread across more than seven hundred pages. It may be true, as Mann says in the introduction, that only the exhaustive is truly interesting, but most of us are probably inclined to take him at his word.

And yet The Magic Mountain has always been on my short list of books to read, especially after I picked up the acclaimed John E. Woods translation at the Printer’s Row Lit Fest earlier this year. Finally, last month, I took my copy along with me to China, reasoning that I’d be more likely to finish it if it were the only book I had in my native language in a foreign country. (This wasn’t the first time I’d employed this trick: I’d read Gravity’s Rainbow in Rome and most of Proust in Finland using the same method, and it had always worked pretty well.) Still, I slid The Magic Mountain into my bag less with anticipation than out of a sense of obligation, and with a distinct feeling that I was taking my medicine. Part of me suspected that I would regret the choice, which may have been why I also packed James Clavell’s Noble House—one of the great trashy popular novels—as a backup. And it was only when I was deep in China, in a bus headed to the mountains of Guilin, that I opened my copy of Mann and resignedly began to read.

Inevitably, I was blown away. It’s hard to convincingly describe the pleasures of this book, which seems so dry and forbidding at first glance, but here’s my attempt: this is a really great novel, fascinating, ingenious, and surprisingly dramatic and moving. Mann is clearly a writer who can do almost anything, and while the book is best known for its extended discussions of art, politics, science, religion, and every other topic of interest to turn-of-the-century modernism, Mann takes obvious delight in showing us that he also knows how to generate suspense. The Magic Mountain is a novel of ideas, but it’s also full of extraordinary set pieces—Walpurgis Night, Hans Castorp’s nearly fatal excursion in the snow, the séance, the duel between Naphta and Settembrini—that shamelessly offer all the satisfactions of classic fiction. There’s a reason why Mann, unlike Joyce and Proust, was a bestseller in his own land during his lifetime, and in The Magic Mountain, he does what David Foster Wallace struggled to accomplish in The Pale King: write a novel about boredom that is alive on every page.

It’s always difficult to predict the role that a given novel will play in one’s life. Some make a huge impression, then quickly fade; others grow in one’s imagination over time (as John Crowley’s Little, Big has begun to do with me). It’s safe to say that The Magic Mountain is the best novel I’ve read in at least five years, and it may be even more: a book that will ultimately play a central role in my understanding of the world. I’m in awe of its intelligence, its savage parody of the Bildungsroman, its astonishingly accurate depiction of romantic obsession, and, most surprisingly, its warmth and humor. And as often happens with great books, I seem to have discovered it at just the right moment. It’s hard for me, and I suspect for many readers, not to identify with Hans Castorp, who is twenty-three when the novel begins and thirty when he descends from the magic mountain to his own ironic destiny. Looking back at my twenties, I see more of Hans in myself than I’d like to admit. Where my own Bildungsroman will take me, or any of us, remains to be seen. But I can’t imagine a better guide for the journey than Mann.

Written by nevalalee

December 30, 2011 at 10:13 am

Squashing the semicolon

with 4 comments

Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.

Kurt Vonnegut, A Man Without a Country

I don’t like semicolons. I’ve always been conscious of avoiding them in my published fiction, but I’m not sure I realized the truly comical extent of my aversion until I did a few quick searches in Word. Here, then, are the results of—wait for it—my semicolonoscopy: The Icon Thief, a novel of over 100,000 words and half a million characters, contains a grand total of six semicolons, while its sequel, City of Exiles, which is about the same length, has exactly six as well, which implies that I’m holding disturbingly close to some invisible quota. And in the three novelettes I’ve published in Analog over the past few years, along with two more stories slated to appear in the next six months, there’s exactly one semicolon. (If you’re curious, it’s in “The Boneless One,” on page 88 of the November 2011 issue. A few choice revisions, and I could have called it “The Semicolonless One.”)
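For what it’s worth, you don’t need Word to run this kind of audit: a few lines of Python will do the same count on plain-text exports of the manuscripts. What follows is only a sketch, and the filenames are placeholders rather than my actual files.

from pathlib import Path

# Hypothetical plain-text exports of the two novels.
manuscripts = ["icon_thief.txt", "city_of_exiles.txt"]

for name in manuscripts:
    text = Path(name).read_text(encoding="utf-8")
    words = len(text.split())
    semicolons = text.count(";")
    print(f"{name}: {words} words, {semicolons} semicolons")

The figures above came from Word’s own search counter; the script is just a sanity check for anyone who would rather count outside of Word.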

The really surprising discovery is that this seems to be a relatively recent development. “Inversus,” my first professionally published story, is something of an outlier: it came out in January/February 2004, more than four years before I began making sales on a regular basis, and it contains ten semicolons, or nearly the same number that I’ve since employed in two full novels. Over the last five years, then, as my overall productivity has increased, my use of semicolons has gone down drastically. In itself, the timing isn’t hard to understand: it wasn’t until I began writing for a living, and particularly after I wrote my first novel, that I began to develop a style of my own. And whoever this writer is, he seems to hate semicolons, at least when it comes to fiction. (For what it’s worth, I use semicolons slightly more often in my personal correspondence, as well as on this blog, but I still don’t especially care for them.)

And I’m not entirely sure why. If pressed, I’d say that my dislike of semicolons, and most other forms of punctuation aside from the comma and period, comes from my classical education, in which I spent years reading Latin authors who managed to convey meaning and rhythm through sentence structure alone. These days, writers have a world of possible punctuation at their disposal, but this isn’t necessarily a good thing. One of the best things a writer can do, to build muscle, is to consciously deprive himself of a common tool, while developing other strategies to take its place. The semicolon is essentially a crutch for combining two sentences into one, for the sake of meaning or variety. By eschewing semicolons, I’ve forced myself to achieve these goals in other ways, revising sentences to have rhythm and clarity on the most fundamental level: in the arrangement of the words themselves.

But really, if I’m honest, I have to admit that it isn’t rational at all. Many writers have irrational dislikes of certain kinds of punctuation: George Bernard Shaw thought of apostrophes as “uncouth bacilli,” and James Joyce, as well as many of his pretentious imitators, disliked inverted commas, using a French- or Italian-style quotation dash to indicate dialogue. Other authors, such as Wodehouse and Beckett, have as much of an aversion to semicolons as I do. Such choices can be justified on stylistic grounds, but in my experience, such obsessive decisions are more often personal and idiosyncratic, the result of a writer’s customary isolation. After you’ve spent years of your life staring at the same stack of pages, it takes on an almost physical presence, like a view of your backyard, until such otherwise innocent features as ragged line breaks and ellipses, invisible to casual readers, start to drive you crazy. So if you like semicolons, please keep using them; I only wish that I could do the same.

Quote of the Day

leave a comment »

The reason I dislike Chamber Music as a title is that it is too complacent. I should prefer a title which repudiated the book without altogether disparaging it.

James Joyce, in a letter to Arthur Symons

Written by nevalalee

September 8, 2011 at 7:07 am

Quote of the Day

leave a comment »

I realised that Joyce had gone as far as one could in the direction of knowing more, [being] in control of one’s material. He was always adding to it; you only have to look at his proofs to see that. I realised that my own way was in impoverishment, in lack of knowledge and in taking away, in subtracting rather than in adding.

Samuel Beckett

Written by nevalalee

July 25, 2011 at 7:22 am
