In this week’s issue of the New York Review of Books, the literary critic Edward Mendelson outs himself as yet another fan of old-school word processors, in this case WordPerfect, which he describes as “the instrument best suited to the way I think when I write.” He goes on to draw a contrast between his favored program, “a mediocrity that’s almost always right,” and Microsoft Word, “a work of genius that’s almost always wrong as an instrument for writing prose,” with its commitment to a platonic ideal of sections and styles that make it all the harder for writers to format a single page. It’s the difference, Mendelson implies, between a mindset that approaches the document from the top down, thinking in terms of templates and overall consistency, and the daily experience of a writer, who engages in direct combat with individual words and sentences, some of which have to be italicized, indented, or otherwise massaged in ways that don’t have anything to do with their neighbors. And as someone who lives comfortably within his own little slice of Word but wants to tear his hair out whenever he strays beyond it, I can’t help but sympathize.
I happened to read Mendelson’s essay with particular interest, because I’m a longtime fan of his work. Mindful Pleasures, the collection of essays on Thomas Pynchon to which he contributed, is one of those books I revisit every few years, and in particular, his piece on encyclopedic fiction has shaped the way I read authors from Dante to Joyce. Pynchon, of course, is a writer with more than a few ideas about how technology affects the way we live and think, and in his conclusion, Mendelson takes a cue from the master:
When I work in Word, for all its luxuriant menus and dazzling prowess, I can’t escape a faint sense of having entered a closed, rule-bound society. When I write in WordPerfect, with all its scruffy, low-tech simplicity, the world seems more open, a place where endings can’t be predicted, where freedom might be real.
There’s more than an echo here of Gravity’s Rainbow, which pits its anarchic, cartoonish personalities against an impersonal conspiracy that finally consumes and assimilates them. And if Pynchon’s fantasy is centered on a rocket cartel that manipulates world events to its own advantage, a writer trying to wrestle a document into shape can sometimes feel like he’s up against an equally faceless enemy.
If Word can be a frustrating tool for writers, it’s because it wasn’t made for anyone in particular, but for “everyone.” As one of the core handful of programs included in the Microsoft Office suite, it’s meant to serve a wide range of functions, from hammering out a high school essay to formatting a rudimentary corporate newsletter. It’s intended to be equally useful to someone who creates a document twice a month and someone who uses it every day, which means that it’s tailored to the needs of precisely nobody. And it was presumably implemented by coders who would rebel against any similar imposition. There’s a reason why so many programmers still live in Emacs and its text-based brethren: they’re simple once you get to know them, they’re deeply customizable, and they let you keep your hands on the keyboard for extended periods of time. Word, by contrast, seems to have been designed for a hypothetical consumer who would rather follow a template than fiddle with each line by hand. This may be true of most casual users, but it’s generally not true of coders—or writers. And Word, like so much other contemporary technology, offers countless options but very little choice.
There are times, obviously, when a standard template can be useful, especially when you’re putting together something like an academic bibliography. Yet there’s a world of difference between really understanding bibliographic style from the inside and trusting blindly to the software, which always needs to be checked by hand, anyway, to catch the errors that inevitably creep in. In the end, though, Word wasn’t made for me; it was made for users who see a word processor as an occasional tool, rather than the environment in which they spend most of their lives. For the rest of us, there are either specialized programs, like Scrivener, or the sliver of Word we’ve managed to colonize. In my post on George R.R. Martin and his use of WordStar—which, somewhat embarrassingly, has turned out to be the most widely read thing I’ve ever written—I note that a writer’s choice of tools is largely determined by habit. I’ve been using Word for two decades, and the first drafts of all my stories are formatted in exactly the way the program imposes, in single-spaced 12-point Times New Roman. I’m so used to how it looks that it fades into invisibility, which is exactly how it should be. The constraints it imposes are still there, but I’ve adapted so I can take them for granted, like a deep-sea fish that would explode if taken closer to the surface, or an animal that has learned to live with gravity.
When an object enters the frame, ensure it’s moving at its peak velocity. This behavior emulates natural movement: a person entering the frame of vision does not begin walking at the edge of the frame but well before it. Similarly, when an object exits the frame, have it maintain its velocity, rather than slowing down as it exits the frame. Easing in when entering and slowing down when exiting draw the user’s attention to that motion, which, in most cases, isn’t the effect you want.
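To make the rule concrete, here’s a minimal sketch using the standard Web Animations API in TypeScript; the distances and durations are illustrative assumptions, not anything prescribed by the passage above. An ease-out curve carries an element across the frame edge at peak velocity and lets it decelerate only once it’s visible, while an ease-in curve lets a departing element accelerate so that it’s still at full speed when it leaves.

```typescript
// A minimal sketch of the principle above, using the Web Animations API.
// The distances and durations are illustrative assumptions.

// Entrance: "ease-out" starts fast, so the element crosses the frame
// edge at peak velocity and only decelerates once it is on screen.
function enterFrame(el: HTMLElement): Animation {
  return el.animate(
    [
      { transform: "translateX(-100vw)" }, // starts off screen
      { transform: "translateX(0)" },      // comes to rest in view
    ],
    { duration: 300, easing: "ease-out", fill: "forwards" }
  );
}

// Exit: "ease-in" accelerates toward the end, so the element is still
// at full speed when it leaves the frame, rather than slowing down.
function exitFrame(el: HTMLElement): Animation {
  return el.animate(
    [
      { transform: "translateX(0)" },
      { transform: "translateX(100vw)" },  // gone before it can linger
    ],
    { duration: 300, easing: "ease-in", fill: "forwards" }
  );
}
```

Reversing the curves, entering with ease-in or exiting with ease-out, produces exactly the lingering, attention-grabbing motion the passage warns against.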
“The joy of listening to Beethoven is comparable to the pleasure of reading Joyce,” writes Alex Ross in a recent issue of The New Yorker: “The most paranoid, overdetermined interpretation is probably the correct one.” Even as someone whose ear for classical music is underdeveloped compared to his interest in other forms of art, I have to agree. Great artists come in all shapes and sizes, but the rarest of all is the kind whose work can sustain the most meticulous level of scrutiny because we’re aware that every detail is a conscious choice. When we interpret an ordinary book or a poem, our readings are often more a reflection of our own needs than the author’s intentions; even with a writer like Shakespeare, it’s hard to separate the author’s deliberate decisions from the resonances that naturally emerge from so much rich language set into motion. With Beethoven, Joyce, and a handful of others—Dante, Bach, perhaps Nabokov—we have enough information about the creative process to know that little, if anything, has happened by accident. Joyce explicitly designed his work to “keep professors busy for centuries,” and Beethoven composed for a perfect, omniscient audience that he seemed to will into existence.
Or as Colin Wilson puts it: “The message of the symphonies of Beethoven could be summarized: ‘Man is not small; he is just bloody lazy.’” When you read Ross’s perceptive article, which reviews much of the recent scholarship on Beethoven and his life, you’re confronted by the same tension that underlies any great body of work made within historical memory. On the one hand, Beethoven has undergone a kind of artistic deification, and there’s a tradition, dating back to E.T.A. Hoffmann, which holds that his music expresses ideas and emotions that can’t be matched by any other human production; on the other, there’s the fact that Beethoven was a man like any other, with a messy personal life and his own portion of pettiness, neediness, and doubt. As Ross points out, before Beethoven, critics were accustomed to talking of “genius” as a kind of impersonal quality, but afterward, the concept shifted to that of “a genius,” which changes the terms of the conversation without reducing its underlying mystery. Beethoven’s biography provides tantalizing clues about the origins of his singular greatness—particularly his deafness, which critics tend to associate with his retreat to an isolated, visionary plane—but it leaves us with as many questions as before.
As it happens, I read Ross’s article in parallel with Howard Markel’s An Anatomy of Addiction, which focuses on the early career of another famous resident of Vienna. Freud seems to have been relatively indifferent to music: he mentions Beethoven along with Goethe and Leonardo da Vinci as “great men” who have produced “splendid creations,” although this feels more like a rhetorical way of filling out a trio than an expression of true appreciation. Otherwise, his relative silence on the subject is revealing in itself: if he wanted to interpret an artist’s work in psychoanalytic terms, Beethoven’s life would have afforded plenty of material, and he didn’t shy away from doing the same for Leonardo and Shakespeare. It’s possible that Freud avoided Beethoven because of the same godlike intentionality that makes him so fascinating to listeners and critics. If we’ve gotten into the habit of drawing a distinction between what a creative artist intends and his or her unconscious impulses, it’s largely thanks to Freud himself. Beethoven stands as a repudiation of this approach, or at least a strong counterexample to it: however complicated Beethoven may have been as a man, it’s hard to make a case that there was ever a moment when he didn’t know what he was doing.
This may be why Freud’s genius—which was very real—seems less mysterious than Beethoven’s: we know more about Freud’s inner life than just about any other major intellectual, thanks primarily to his own accounts of his dreams and fantasies, and it’s easy to draw a line from his biography to his work. Markel, for instance, focuses on the period of Freud’s cocaine use, and although he stops short of suggesting that all of psychoanalysis can be understood as a product of addiction, as others have, he points out that Freud’s early publications on cocaine represent the first time he publicly mined his own experiences for insight. But of course, there were plenty of bright young Jewish doctors in Vienna in the late nineteenth century, and while many of the ideas behind analysis were already in the air, it was only in Freud that they found the necessary combination of obsessiveness, ambition, and literary brilliance required for their full expression. Freud may have done his best to complicate our ideas of genius by introducing unconscious factors into the equation, but paradoxically, he made his case in a series of peerlessly crafted books and essays, and their status as imaginative literature has only been enhanced by the decline of analysis as a science. Freud doesn’t explain Freud any more than he explains Beethoven. But this doesn’t stop him, or us, from trying.
No building of very small dimensions can be grand, and no building as lofty as the Pyramids or the Colosseum can be mean. The Pyramids are a proof: for what on earth could be viler than a pyramid thirty feet high?
A few days ago, I stumbled across the little item that The Onion ran shortly after the death of Steve Jobs: “Last American Who Knew What The Fuck He Was Doing Dies.” It’s especially amusing to read it now, at a time when the cult of adulation that surrounded Jobs seems to be in partial retreat. These days, it’s impossible to find an article about, say, the upcoming biopic written by Aaron Sorkin without a commenter bringing up all the usual counterarguments: Jobs was fundamentally a repackager and popularizer of other people’s ideas, he was a bully and a bad boss, he hated to share credit, he benefited enormously from luck and good timing, and he pushed a vision of simplicity and elegance that only reduces the user’s freedom of choice. There’s a lot of truth to these points. Yet the fact remains that Jobs did know what he was doing, or at least that he carefully cultivated the illusion that he did, and he left a void in the public imagination that none of his successors have managed to fill. He was fundamentally right about a lot of things for a very long time, and the legacy he left continues to shape our lives, in ways both big and small, one minute after another.
And that Onion headline has been rattling around in my head for most of the week, because I often get the sense I don’t really know what I’m doing, as a writer, as a dad, or as a human being. I do my best to stick to the channel, as Stanislavski would say: I follow the rules I know, maintain good habits, make my lists, and seek out helpful advice wherever I can find it. I have what I think is a realistic sense of my own strengths and weaknesses; I’m a pretty good writer and a pretty good father. But there’s no denying that writing a novel and raising a child are tasks of irreducible complexity, particularly when you’re trying to do both at the same time. Writing, like parenting, imposes a state of constant creative uncertainty: just because you had one good idea or wrote a few decent pages yesterday is no guarantee that you’ll be able to do the same today. If I weren’t fundamentally okay with that, I wouldn’t be here. But there always comes a time when I find myself repeating that line from Calvin and Hobbes I never tire of quoting: “I don’t think I’d have been in such a hurry to reach adulthood if I’d known the whole thing was going to be ad-libbed.”
My only consolation is that I’m not alone. Recently, I’ve been rereading The Magus by John Fowles, a novel that made a huge impression on me when I first encountered it over twenty years ago. In places, it feels uncomfortably like the first work of a young man writing for other young men, but it still comes off as spectacularly assured, which is why it’s all the more striking to read what Fowles has to say about it in his preface:
My strongest memory is of constantly having to abandon drafts because of an inability to describe what I wanted…The Magus remains essentially where a tyro taught himself to write novels—beneath its narrative, a notebook of an exploration, often erring and misconceived, into an unknown land. Even in its final published form it was a far more haphazard and naïvely instinctive work than the more intellectual reader can easily imagine; the hardest blows I had to bear from critics were those that condemned the book as a coldly calculated exercise in fantasy, a cerebral game. But then one of the (incurable) faults of the book was the attempt to conceal the real state of endless flux in which it was written.
Fowles is being consciously self-deprecating, but he hits on a crucial point, which is that most novels are designed to make a story that emerged from countless wrong turns and shots in the dark seem inevitable. In fact, it’s a little like being a parent, or a politician, or the CEO of a major corporation: you need to project an air of authority even if you don’t have the slightest idea whether you’re doing the right thing. (And just as you can’t fully appreciate your own parents until you’ve had a kid of your own, you can’t understand the network of uncertainties underlying even the most accomplished novel until you’ve written a few for yourself.) I’d like to believe that the uncertainties, doubts, and fears that persist throughout the process are a necessary corrective, a way of keeping us humble in the face of challenges that can’t be reduced to a few clear rules. The real danger isn’t being unsure about what comes next; it’s turning into a hedgehog in a world of foxes, convinced that we know the one inarguable truth that applies to every situation. In fiction, that kind of dogmatic certainty leads to formula or propaganda, and we’ve all seen its effects in business, politics, and parenting. It’s better, perhaps, to admit that we’re all faking it until we make it, and that we should be satisfied if we’re right ever so slightly more often than we’re wrong.
The most important thing is to build up a mood, which is productive in liberating your imagination further. I fully believe in “attacking” the paper so that you can bring out what you are visualizing in your mind. I liken it to a dream—even as you are recalling it and attempting to tell someone about it, it starts to evaporate—and you cannot remember the end, so getting the “bare bones” down is crucial. Even three or four lines can work as a stimulus to develop the piece. Sometimes I am knee deep in paper trying to work an idea through, but it comes eventually…
I like to work “big.” I use paper that is a meter high, and I work from the shoulder, not the arm or wrist, so I can get the muscle into the drawing. I like the size and the vigor of the initial drawing, and certainly using pen and ink straight on to the paper. I have needed strong nibs to accommodate the enthusiasm of the approach. You have to draw “in the moment” to capture the impulsiveness of the invention, and to capture what you can. You can add the detail later, but that becomes clearer when you have got the essence of the image.