Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “Do you have anybody’s autograph?”
Last weekend, I took part in a local authors program sponsored by the Oak Park Public Library, in which writers from the area were given three minutes each to talk about their work and then hopefully make a few sales. I had a good time and I sold a bunch of copies, which is always a plus. Yet whenever I do an event like this, I’m brought up against the fundamental awkwardness of the interaction when I’m asked to sign a book. For one thing, I can never come up with anything clever to say in the inscription, so I end up scrawling something like “Enjoy!” or “Best wishes!” even to my own family members. (I’ve also realized that when you’re signing all three copies of a trilogy, the buyer starts to get a little impatient by the time you’ve begun dating, inscribing, and signing the third book.) And while I’m always gratified by sales and attention—especially sales—I usually feel like a nocturnal creature that has been dragged, blinking, into the daylight. I became a writer partially because I like hanging out on my own, absorbed in a draft or a pile of research materials, and whenever I’m compelled to be out in the world, it’s as if I’m engaging in a kind of elaborate impersonation.
I assume, though I don’t know for sure, that a lot of other writers feel the same way, even as they’re asked to invest increasing amounts of time in creating a public life that has little in common with what they do for a living. These days, it’s taken for granted that writers will promote themselves with any and all means at their disposal, to the point where even an ordinary desire for privacy starts to seem outré. Here’s the thing about Thomas Pynchon: it’s fun to talk about his “reclusiveness,” as if he were an elusive cryptid like Bigfoot, but by all accounts, he’s an ordinary guy living in New York, with an active social life and a diverse circle of friends. He just doesn’t feel like giving interviews or having his picture published, and that perfectly reasonable stance is so out of line with our expectations that it becomes newsworthy in itself. I’ve always liked what the critic Arthur Salm had to say on the subject:
The man simply chooses not to be a public figure, an attitude that resonates on a frequency so out of phase with that of the prevailing culture that if Pynchon and Paris Hilton were ever to meet—the circumstances, I admit, are beyond imagining—the resulting matter/antimatter explosion would vaporize everything from here to Tau Ceti IV.
It’s possible, of course, that certain authors would love to be public figures, if only they’d get the chance. Yet it’s revealing that when we think of the novelist as a celebrity, our minds go back to Norman Mailer and Gore Vidal feuding on The Dick Cavett Show, and the pool of memories dries up the closer we get to the present. Occasionally, a writer famous for fiction will start to assume a role of greater social importance, but it’s almost always at the expense of his or her work as a novelist: Arundhati Roy hasn’t published a novel in seventeen years. That’s the strange thing about the push toward ever greater levels of exposure: even as writers share more and more of themselves on Twitter, Facebook, and blogs like this, their real role in public life—at least for novelists—grows progressively marginalized. In that light, the aggressive presence of authors on social media feels less like self-promotion than a simulation of the cultural role writers no longer possess, if they ever really did. (And it’s also clear that the skill sets required to write a novel and curate a decent Twitter feed have about as much in common as writing and public speaking, which is to say, next to nothing.)
Deep down, I feel much the same way about readings and signings, which are moments when writers can play at being famous in ways that they haven’t experienced—or are spared from—otherwise. (Obviously, this doesn’t include celebrities who were already famous before their books were published: Lena Dunham’s book tour has about as much in common with your average reading as Cirque du Soleil does with a local black box theater’s production of Hedda Gabler.) If it sounds like I’m overthinking it, well, I probably am. But it doesn’t prevent me from feeling a little uncomfortable whenever I find myself in that kind of encounter, regardless of which side of the autograph table I’m on. The only person I’ve ever asked for an autograph is Walter Murch, and the fact that he was a perfect gentleman about it didn’t make our few seconds of small talk any less awkward. Luckily, that moment occupies only a tiny sliver of the mental real estate devoted to his movies, books, and interviews. To the extent that I feel I know Murch, or anyone, it has less to do with the handshake we shared than with all the time I’ve spent in his virtual company, when neither of us was playing a role, and we were far enough apart for at least one of us to really say something, even if the conversation only ran in one direction.
A prefatory glow, not unlike some benign variety of the aura before an epileptic attack, is something the artist learns to perceive very early in life. This feeling of tickly well-being branches through him like the red and the blue in the picture of a skinned man under Circulation.
—Vladimir Nabokov, on inspiration
I’ve noted elsewhere that I have mixed feelings about the increasing willingness among television shows to abruptly kill off their characters. On the one hand, it discourages audience complacency and raises the stakes if we feel that anyone could die at any moment; on the other, it encourages a kind of all-or-nothing approach to writing stories, and even a sort of laziness. Ninety percent of the time, a show can coast along on fairly conventional storytelling—as Game of Thrones sometimes does—before somebody gets beheaded or shoved in front of a subway train. But it would have been better, or at least more interesting, to create tension and suspense while those characters were still alive. Major deaths should be honestly earned, not just a way to keep the audience awake. At least Game of Thrones knows how to milk such moments for all they’re worth; with a show like The Vampire Diaries, diminishing returns quickly set in when characters are dispatched in every other episode. It cheapens the value of life and personality, and it starts to feel questionable on both narrative and ethical levels.
Of course, I’ve been guilty of this myself, and the way certain character deaths have been incorporated into my novels testifies both to how effective and to how arbitrary this kind of device can be. Ethan’s death in The Icon Thief gets a pass: it’s a striking scene that propels the last third of the story forward, and although it works in terms of momentary shock value, its repercussions continue to define the series until the final book. (The fact that it was a late addition to the story—indeed, it was one of the last things I wrote—hasn’t kept it from feeling inevitable now.) The corresponding scene in City of Exiles, which echoes its predecessor in a lot of ways, is a little harder to defend. It’s a nice, tricky chapter, and I’m still proud of the reversal it pulls, but it feels a bit more like a gimmick, especially because its consequences don’t fully play out until the following novel. From a structural point of view, it works, and it provides a necessary jolt of energy to the story at the right place, but it’s not that far removed from the way a show like 24 will throw in a surprise betrayal when the audience’s attention starts to wander.
Looking back, I have a feeling that my own uneasiness over this moment—as well as the high body count of the novel as a whole—may have led me to spare another character’s life. Toward the end of the process, there was a lot of talk about whether I should kill off Powell. After reading the first draft, my agent was openly in favor of it, and it’s true that things don’t look particularly good for Powell at this point: realistically speaking, it’s hard to imagine that anyone on that airplane could have survived. Much earlier, I’d even toyed with the idea of killing Powell at the end of Part I, which would have made Wolfe’s journey all the more urgent. Of these two possibilities, the latter seemed preferable. A death at the conclusion of the novel wouldn’t have advanced the narrative in any particular fashion; we’re only a few pages from the end anyway, and if the stakes aren’t clear by now, there’s no point in trying to heighten them in retrospect. Killing him earlier would have served a clearer dramatic purpose, but I also would have lost his far more wrenching scene on the plane, which I don’t think would have been nearly as strong without him at its center.
In the end, I let him live, though badly hurt, for a number of different reasons. At the time, I thought that I wanted to preserve the duo of Powell and Wolfe for a potential third novel, although as it turned out, they don’t spend a lot of time together in Eternal Empire, and his role could conceivably have been filled by somebody else. Powell also benefited from my impulse to pull back on the death toll of the plane crash: I didn’t want to kill off Chigorin, mostly because he was transparently based on a real person whose imagined demise I didn’t much feel like exploiting, so most of the other passengers ended up being protected by his character shield. Most of all, I thought that keeping Powell alive would restore a necessary degree of balance to the ending. City of Exiles concludes on something of a down note: Ilya is still in prison, Karvonen’s handler is still at large, and Wolfe still doesn’t know—although the reader does—that the traitor in her organization is someone close to her own heart. Killing off Powell would have left the situation feeling even more hopeless, so I spared him. If this all sounds a little cold and calculated, well, maybe it was. Powell might not have made it, but he escaped thanks to luck, impersonal considerations, and a moment of mercy from the universe. And that’s true of all of us at times…
The author always loads his dice, but he must never let the reader see that he has done so, and by the manipulation of his plot he can engage the reader’s attention so that he does not perceive what violence has been done him.
—W. Somerset Maugham
In this week’s issue of the New York Review of Books, the literary critic Edward Mendelson outs himself as yet another fan of old-school word processors, in this case WordPerfect, which he describes as “the instrument best suited to the way I think when I write.” He goes on to draw a contrast between his favored program, “a mediocrity that’s almost always right,” and Microsoft Word, “a work of genius that’s almost always wrong as an instrument for writing prose,” with its commitment to a platonic ideal of sections and styles that make it all the harder for writers to format a single page. It’s the difference, Mendelson implies, between a mindset that approaches the document from the top down, thinking in terms of templates and overall consistency, and the daily experience of a writer, who engages in direct combat with individual words and sentences, some of which have to be italicized, indented, or otherwise massaged in ways that don’t have anything to do with their neighbors. And as someone who lives comfortably within his own little slice of Word but wants to tear his hair out whenever he strays beyond it, I can’t help but sympathize.
I happened to read Mendelson’s essay with particular interest, because I’m a longtime fan of his work. Mindful Pleasures, the collection of essays he edited on Thomas Pynchon, is one of those books I revisit every few years, and in particular, his piece on encyclopedic fiction has shaped the way I read authors from Dante to Joyce. Pynchon, of course, is a writer with more than a few ideas about how technology affects the way we live and think, and in his conclusion, Mendelson takes a cue from the master:
When I work in Word, for all its luxuriant menus and dazzling prowess, I can’t escape a faint sense of having entered a closed, rule-bound society. When I write in WordPerfect, with all its scruffy, low-tech simplicity, the world seems more open, a place where endings can’t be predicted, where freedom might be real.
There’s more than an echo here of Gravity’s Rainbow, which pits its anarchic, cartoonish personalities against an impersonal conspiracy that finally consumes and assimilates them. And if Pynchon’s fantasy is centered on a rocket cartel that manipulates world events to its own advantage, a writer trying to wrestle a document into shape can sometimes feel like he’s up against an equally faceless enemy.
If Word can be a frustrating tool for writers, it’s because it wasn’t made for anyone in particular, but for “everyone.” As one of the core handful of programs included in the Microsoft Office suite, it’s meant to serve a wide range of functions, from hammering out a high school essay to formatting a rudimentary corporate newsletter. It’s intended to be equally useful to someone who creates a document twice a month and someone who uses it every day, which means that it’s tailored to the needs of precisely nobody. And it was presumably implemented by coders who would rebel against any similar imposition. There’s a reason why so many programmers still live in Emacs and its text-based brethren: they’re simple once you get to know them, they’re deeply customizable, and they let you keep your hands on the keyboard for extended periods of time. Word, by contrast, seems to have been designed for a hypothetical consumer who would rather follow a template than fiddle with each line by hand. This may be true of most casual users, but it’s generally not true of coders—or writers. And Word, like so much other contemporary technology, offers countless options but very little choice.
There are times, obviously, when a standard template can be useful, especially when you’re putting together something like an academic bibliography. Yet there’s a world of difference between really understanding bibliographic style from the inside and trusting blindly to the software, which always needs to be checked by hand, anyway, to catch the errors that inevitably creep in. In the end, though, Word wasn’t made for me; it was made for users who see a word processor as an occasional tool, rather than the environment in which they spend most of their lives. For the rest of us, there are either specialized programs, like Scrivener, or the sliver of Word we’ve managed to colonize. In my post on George R.R. Martin and his use of WordStar—which, somewhat embarrassingly, has turned out to be the most widely read thing I’ve ever written—I note that a writer’s choice of tools is largely determined by habit. I’ve been using Word for two decades, and the first drafts of all my stories are formatted in exactly the way the program imposes, in single-spaced 12-point Times New Roman. I’m so used to how it looks that it fades into invisibility, which is exactly how it should be. The constraints it imposes are still there, but I’ve adapted so I can take them for granted, like a deep-sea fish that would explode if taken closer to the surface, or an animal that has learned to live with gravity.
When an object enters the frame, ensure it’s moving at its peak velocity. This behavior emulates natural movement: a person entering the frame of vision does not begin walking at the edge of the frame but well before it. Similarly, when an object exits the frame, have it maintain its velocity, rather than slowing down as it exits the frame. Easing in when entering and slowing down when exiting draw the user’s attention to that motion, which, in most cases, isn’t the effect you want.
“The joy of listening to Beethoven is comparable to the pleasure of reading Joyce,” writes Alex Ross in a recent issue of The New Yorker: “The most paranoid, overdetermined interpretation is probably the correct one.” Even as someone whose ear for classical music is underdeveloped compared to his interest in other forms of art, I have to agree. Great artists come in all shapes and sizes, but the rarest of all is the kind whose work can sustain the most meticulous level of scrutiny because we’re aware that every detail is a conscious choice. When we interpret an ordinary book or a poem, our readings are often more a reflection of our own needs than the author’s intentions; even with a writer like Shakespeare, it’s hard to separate the author’s deliberate decisions from the resonances that naturally emerge from so much rich language set into motion. With Beethoven, Joyce, and a handful of others—Dante, Bach, perhaps Nabokov—we have enough information about the creative process to know that little, if anything, has happened by accident. Joyce explicitly designed his work to “keep professors busy for centuries,” and Beethoven composed for a perfect, omniscient audience that he seemed to will into existence.
Or as Colin Wilson puts it: “The message of the symphonies of Beethoven could be summarized: ‘Man is not small; he is just bloody lazy.’” When you read Ross’s perceptive article, which reviews much of the recent scholarship on Beethoven and his life, you’re confronted by the same tension that underlies any great body of work made within historical memory. On the one hand, Beethoven has undergone a kind of artistic deification, and there’s a tradition, dating back to E.T.A. Hoffmann, which holds that his music expresses ideas and emotions that can’t be matched by any other human production; on the other, there’s the fact that Beethoven was a man like any other, with a messy personal life and his own portion of pettiness, neediness, and doubt. As Ross points out, before Beethoven, critics were accustomed to talk of “genius” as a kind of impersonal quality, but afterward, the concept shifted to that of “a genius,” which changes the terms of the conversation without reducing its underlying mystery. Beethoven’s biography provides tantalizing clues about the origins of his singular greatness—particularly his deafness, which critics tend to associate with his retreat to an isolated, visionary plane—but it leaves us with as many questions as before.
As it happens, I read Ross’s article in parallel with Howard Markel’s An Anatomy of Addiction, which focuses on the early career of another famous resident of Vienna. Freud seems to have been relatively indifferent to music: he mentions Beethoven along with Goethe and Leonardo da Vinci as “great men” who have produced “splendid creations,” although this feels more like a rhetorical way of filling out a trio than an expression of true appreciation. Otherwise, his relative silence on the subject is revealing in itself: if he wanted to interpret an artist’s work in psychoanalytic terms, Beethoven’s life would have afforded plenty of material, and he didn’t shy from doing the same for Leonardo and Shakespeare. It’s possible that Freud avoided Beethoven because of the same godlike intentionality that makes him so fascinating to listeners and critics. If we’ve gotten into the habit of drawing a distinction between what a creative artist intends and his or her unconscious impulses, it’s largely thanks to Freud himself. Beethoven stands as a repudiation of, or at least a strong counterexample to, this approach: however complicated Beethoven may have been as a man, it’s hard to make a case that there was ever a moment when he didn’t know what he was doing.
This may be why Freud’s genius—which was very real—seems less mysterious than Beethoven’s: we know more about Freud’s inner life than about that of just about any other major intellectual, thanks primarily to his own accounts of his dreams and fantasies, and it’s easy to draw a line from his biography to his work. Markel, for instance, focuses on the period of Freud’s cocaine use, and although he stops short of suggesting that all of psychoanalysis can be understood as a product of addiction, as others have, he points out that Freud’s early publications on cocaine represent the first time he publicly mined his own experiences for insight. But of course, there were plenty of bright young Jewish doctors in Vienna in the late nineteenth century, and while many of the ideas behind analysis were already in the air, it was only in Freud that they found the necessary combination of obsessiveness, ambition, and literary brilliance required for their full expression. Freud may have done his best to complicate our ideas of genius by introducing unconscious factors into the equation, but paradoxically, he made his case in a series of peerlessly crafted books and essays, and their status as imaginative literature has only been enhanced by the decline of analysis as a science. Freud doesn’t explain Freud any more than he explains Beethoven. But this doesn’t stop him, or us, from trying.