Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Marcel Proust’

The castle on the keyboard


In March, the graphic artist Susan Kare, who is best known for designing the fonts and icons for the original Apple Macintosh, was awarded a medal of recognition from the professional organization AIGA. It occurred to me to write a post about her work, but when I opened a gallery of her designs, I found myself sidetracked by an unexpected sensation. I felt happy. Looking at those familiar images—the Paintbrush, the Trash Can, even the Bomb—brought me as close as I’ve come in a long time to what Proust describes after taking a bite of the madeleine in the first volume of In Search of Lost Time:

Just as the Japanese amuse themselves by filling a porcelain bowl with water and steeping in it little crumbs of paper which until then are without character or form, but, the moment they become wet, stretch themselves and bend, take on color and distinctive shape, become flowers or houses or people, permanent and recognizable, so in that moment all the flowers in our garden…and the good folk of the village and their little dwellings and the parish church and the whole of Combray and of its surroundings, taking their proper shapes and growing solid, sprang into being, town and gardens alike, from my cup of tea.

In my case, it wasn’t a physical location that blossomed into existence, but a moment in my life that I’ve tried repeatedly to evoke here before. I was in my early teens, which isn’t a great period for anyone, and I can’t say that I was content. But for better or worse, I was becoming whatever I was supposed to be, and throughout much of that process, Kare’s icons provided the inescapable backdrop.

You could argue that nostalgia for computer hardware is a fairly recent phenomenon that will repeat itself in later generations, with children who are thirteen or younger today feeling equally sentimental toward devices that their parents regard with indifference—and you might be right. But I think that Kare’s work is genuinely special in at least two ways. One is that it’s a hallmark of perhaps the last time in history when a personal computer could feel like a beguiling toy, rather than an indispensable but utilitarian part of everyday life. The other is that her icons, with their handmade look and origins, bear the impression of another human being’s personality in ways that would all but disappear within a few years. As Alexandra Lange recounts in a recent profile of Kare:

In 1982, [Kare] was a sculptor and sometime curator when her high-school friend Andy Hertzfeld asked her to create graphics for a new computer that he was working on in California. Kare brought a Grid notebook to her job interview at Apple Computer. On its pages, she had sketched, in pink marker, a series of icons to represent the commands that Hertzfeld’s software would execute. Each square represented a pixel. A pointing finger meant “Paste.” A paintbrush symbolized “MacPaint.” Scissors said “Cut.” Kare told me about this origin moment: “As soon as I started work, Andy Hertzfeld wrote an icon editor and font editor so I could design images and letterforms using the Mac, not paper,” she said. “But I loved the puzzle-like nature of working in sixteen-by-sixteen and thirty-two-by-thirty-two pixel icon grids, and the marriage of craft and metaphor.”

That same icon editor, or one of its successors, was packaged with the Mac that I used, and I vividly remember clicking on that grid myself, shaping the building blocks of the interface in a way that seems hard to imagine now.

And Kare seems to have valued these aspects of her work even at the time. There’s a famous series of photos of her in a cubicle at Apple in 1984, leaning back in her chair with one New Balance sneaker propped against her desk, looking impossibly cool. In one of the pictures, if you zoom in on the shelf of books behind her, it’s possible to make out a few titles, including the first edition of Symbol Sourcebook by Henry Dreyfuss, with an introduction by none other than R. Buckminster Fuller. Kare has spoken highly of this book elsewhere, most notably in an interview with Alex Pang of Stanford, to whom she explained:

One of my favorite parts of the book is its list of hobo signals, that hobos used to contact each other when they were on the road. They look like they’re in chalk on stones…When you’re desperate for an idea—some icons, like the piece of paper, are no problem; but others defy the visual, like “undo”—you look at things like hobo signs. Like this: “Man with a gun lives here.” Now, I can’t say that anything in this book is exactly transported into the Macintosh interface, but I think I got a lot of help from this, just thinking. This kind of symbol appeals to me because it had to be really simple, and clear to a group of people who were not going to be studying these for years in academia. I don’t understand a lot of them—“These people are rich” is a top hat and a triangle—but I always had that at Apple. I still use it, and I’m grateful for it.

And it seems likely that this was the “symbol dictionary” in which Kare discovered the Bowen Knot, a symbol once used to indicate “interesting features” at Swedish campgrounds, which lives on as the Command icon on the Mac.

According to Kare, the Bowen Knot originally represented a castle with four turrets, and if you’re imaginative enough, you can imagine it springing into being from the keys to either side of the space bar, like the village from Proust’s teacup. Like the hobo signs, Kare’s icons are a system of signals left to those who might pass by in the future, and the fact that they’ve managed to survive at Apple in even a limited way is something of a miracle in itself. (As the tech journalist Mike Murphy recently wrote: “For whatever reason, Apple looks and acts far more like a luxury brand than a consumer-technology brand in 2018.” And there isn’t much room in that business for castles or hobo signs.) When you click through the emulated versions of the earliest models of the Macintosh on the Internet Archive, it can feel like a temporary return to those values, or like a visit to a Zen garden. Yet if we only try to recapture it, we miss the point. Toward the end of In Search of Lost Time, Proust experiences a second moment of revelation, when he stumbles in a courtyard and catches himself “on a flagstone lower than the one next it,” which reminds him of a similar sensation that he had once felt at the Baptistry of St. Mark in Venice. And what he says of this flash of insight reminds me of how I feel when I look at the Happy Mac, and all the possibilities that it once seemed to express:

As at the moment when I tasted the madeleine, all my apprehensions about the future, all my intellectual doubts, were dissipated. Those doubts which had assailed me just before, regarding the reality of my literary gifts and even regarding the reality of literature itself were dispersed as though by magic…Merely repeating the movement was useless; but if…I succeeded in recapturing the sensation which accompanied the movement, again the intoxicating and elusive vision softly pervaded me, as though it said, “Grasp me as I float by you, if you can, and try to solve the enigma of happiness I offer you.”

Written by nevalalee

June 15, 2018 at 8:50 am

In the cards


Vladimir Nabokov

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on October 21, 2016.

It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.

Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)

David Mamet

Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?

The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”

Written by nevalalee

February 27, 2018 at 9:00 am

Posted in Writing


The minor key


“What keeps science fiction a minor genre, for all the brilliance of its authors and apparent pertinence of its concerns?” The critic who asked this question was none other than John Updike, in his New Yorker review of David G. Hartwell’s anthology The World Treasury of Science Fiction, which was published at the end of the eighties. Updike immediately responded to his own question with his usual assurance:

The short answer is that each science-fiction story is so busy inventing its environment that little energy is left to be invested in the human subtleties. Ordinarily, “mainstream” fiction snatches what it needs from the contemporary environment and concentrates upon surprising us with details of behavior; science fiction tends to reverse the priorities…It rarely penetrates and involves us the way the best realistic fiction can…”The writer,” Edmund Wilson wrote, “must always find expressions for something which has never yet been exposed, must master a new set of phenomena which has never yet been mastered.” Those rhapsodies, for instance, which Proust delivered upon the then-fresh inventions of the telephone, the automobile, and the airplane point up the larger relativities and magical connections of his great novel, as well as show the new century breaking upon a fin-de-siècle sensibility. The modest increments of fictional “news,” of phenomena whose presentation is unprecedented, have the cumulative weight of true science—a nudging, inching fidelity to human change ultimately far more impressive and momentous than the great glittering leaps of science fiction.

I’ll concede that Updike’s underlying point here is basically correct, and that a lot of science fiction has to spend so much time establishing the premise and the background that it has to shortchange or underplay other important qualities along the way. (At its highest level, this is less a reflection of the author’s limitations than a courtesy to the reader. It’s hard to innovate along every parameter at once, so complex works of speculative fiction as different as Gravity’s Rainbow and Inception need to strategically simplify wherever they can.) But there’s also a hidden fallacy in Updike’s description of science fiction as “a minor genre.” What, exactly, would a “major” genre look like? It’s hard to come up with a definitive list, but if we’re going to limit ourselves to a conception of genre that encompasses science fiction and not, say, modernist realism, we’d probably include fantasy, horror, western, romance, erotica, adventure, mystery, suspense, and historical fiction. When we ask ourselves whether Updike would be likely to consider any of these genres “major,” it’s pretty clear that the answer is no. Every genre, by definition, is minor, at least to many literary critics, which not only renders the distinction meaningless, but raises a host of other questions. If we honestly ask what keeps all genres—although not individual authors—in the minor category, there seem to be three possibilities. Either genre fiction fails to attract or keep major talent; it suffers from various systemic problems of the kind that Updike identified for science fiction; or there’s some other quirk in the way we think about fiction that relegates these genres to a secondary status, regardless of the quality of specific works or writers.

And while all three of these factors may play a role, it’s the third one that seems most plausible. (After all, when you average out the quality of all “literary fiction,” from Updike, Bellow, and Roth down to the work put out by the small presses and magazines, it seems fairly clear that Sturgeon’s Law applies here as much as anywhere else, and ninety percent of everything is crud. And modernist realism, like every category coherent enough to earn its own label, has plenty of clichés of its own.) In particular, if a genre writer is deemed good enough, his or her reward is to be elevated out of it entirely. You clearly see this with such authors as Jorge Luis Borges, perhaps the greatest writer of speculative fiction of the twentieth century, who was plucked out of that category to compete more effectively with Proust, Joyce, and Kafka—the last of whom was arguably also a genre writer who was forcibly promoted to the next level. It means that the genre as a whole can never win. Its best writers are promptly confiscated, freeing up critics to speculate about why it remains “minor.” As Daniel Handler noted in an interview several years ago:

I believe that children’s literature is a genre. I resisted the idea that children’s literature is just anything that children are reading. And I certainly resisted the idea that certain books should get promoted out of children’s literature just because adults are reading them. That idea is enraging too. That’s what happens to any genre, right? First you say, “Margaret Atwood isn’t really a science fiction writer.” Then you say, “There really aren’t any good science fiction writers.” That’s because you promoted them all!

And this pattern isn’t a new one. It’s revealing that Updike quoted Edmund Wilson, who in his essays “Why Do People Read Detective Stories?” and “Who Cares Who Killed Roger Ackroyd?” dismissed the entire mystery genre as minor or worse. Yet when it came to defending his fondness for one author in particular, he fell back on a familiar trick:

I will now confess, in my turn, that, since my first looking into this subject last fall, I have myself become addicted, in spells, to reading myself to sleep with Sherlock Holmes, which I had gone back to, not having looked at it since childhood, in order to see how it compared with Conan Doyle’s latest imitators. I propose, however, to justify my pleasure in rereading Sherlock Holmes on grounds entirely different from those on which the consumers of the current product ordinarily defend their taste. My contention is that Sherlock Holmes is literature on a humble but not ignoble level, whereas the mystery writers most in vogue now are not. The old stories are literature, not because of the conjuring tricks and the puzzles, not because of the lively melodrama, which they have in common with many other detective stories, but by virtue of imagination and style. These are fairy-tales, as Conan Doyle intimated in his preface to his last collection, and they are among the most amusing of fairy-tales and not among the least distinguished.

Strip away the specifics, and the outlines of the argument are clear. Sherlock Holmes is good, and mysteries are bad, so Sherlock Holmes must be something other than mystery fiction. It’s maddening, but from the point of view of a working critic, it makes perfect sense. You get to hold onto the works that you like, while keeping the rest of the genre safely minor—and then you can read yourself happily to sleep.

Childhood’s end


I’ve been thinking a lot recently about my childhood. One of the inciting factors was the movie adaptation of Stephen King’s It, which I enjoyed a great deal when I finally saw it. It’s a blue-chip horror film, with a likable cast and fantastic visuals, and its creators clearly care as much about the original novel as I do. In theory, the shift of its setting to the late eighties should make it even more resonant, since this is a period that I know and remember firsthand. Yet it isn’t quite as effective as it should be, since it only tells the half of the story that focuses on the main characters as children, and most of the book’s power comes from its treatment of memory, childhood, and forgetfulness—which director Andy Muschietti and his collaborators must know perfectly well. Under the circumstances, they’ve done just about the best job imaginable, but they inevitably miss a crucial side of a book that has been a part of my life for decades, even if I was too young to appreciate it on my first reading. I was about twelve years old at the time, which means that I wasn’t in a position to understand its warning that I was doomed to forget much of who I was and what I did. (King’s uncanny ability to evoke his own childhood so vividly speaks as much as anything else to his talents.) As time passes, this is the aspect of the book that impresses me the most, and it’s one that the movie in its current form isn’t able to address. A demonic clown is pretty scary, but not as much as the realization, which isn’t a fantasy at all, that we have to cut ourselves off from much of who we were as children in order to function as adults. And I’m saying this as someone who has remained almost bizarrely faithful to the values that I held when I was ten years old.

In fact, it wouldn’t be farfetched to read Pennywise the Dancing Clown as the terrifying embodiment of the act of forgetting itself. In his memoir Self-Consciousness, John Updike—who is mentioned briefly in It and lends his last name to a supporting character in The Talisman—described this autobiographical amnesia in terms that could serve as an epigraph to King’s novel:

Not only are selves conditional but they die. Each day, we wake slightly altered, and the person we were yesterday is dead. So why, one could say, be afraid of death, when death comes all the time? It is even possible to dislike our old selves, these disposable ancestors of ours. For instance, my high-school self—skinny, scabby, giggly, gabby, frantic to be noticed, tormented enough to be a tormenter, relentlessly pushing his cartoons and posters and noisy jokes and pseudo-sophisticated poems upon the helpless high school—strikes me now as considerably obnoxious, though I owe him a lot.

Updike sounds a lot here like King’s class clown Richie Tozier, and his contempt toward his teenage self is one to which most of us can relate. Yet Updike’s memories of that period seem slightly less vivid than the ones that he explored elsewhere in his fiction. He only rarely mined them for material, even as he squeezed most of his other experiences to the last drop, which implies that even Updike, our greatest noticer, preferred to draw a curtain of charity across himself as an adolescent. And you can hardly blame him.

I was reminded of this by the X-Files episode “The Lost Art of Forehead Sweat,” which is about nothing less than the ways in which we misremember our childhoods, even if this theme is cunningly hidden behind its myriad other layers. At one point, Scully says to Reggie: “None of us remember our high school years with much accuracy.” In context, it seems like an irrelevant remark, but it was evidently important to Darin Morgan, who said to Entertainment Weekly:

When we think back on our memories from our youth, we have a tendency—or at least I do—to imagine my current mindset. Whenever I think about my youth, I’m like, “Why didn’t I do this? Why didn’t I do that?” And then you drive by high school students and you go, “Oh, that’s why I didn’t do it. Because I was a kid.” You tend to think of your adult consciousness, and you take that with you when you’re thinking back on your memories and things you’ve done in the past. Our memories are sometimes not quite accurate.

In “Forehead Sweat,” Morgan expresses this through a weird flashback in which we see Mulder’s adult head superimposed on his preadolescent body, which is a broad visual gag that also gets at something real. We really do seem to recall the past through the lens of our current selves, so we’re naturally mortified by what we find there—which neatly overlooks the point that everything that embarrasses us about our younger years is what allowed us to become what we are now. I often think about this when I look at my daughter, who is so much like me at the age of five that it scares me. And although I want to give her the sort of advice that I wish I’d heard at the time, I know that it’s probably pointless.

Childhood and adolescence are obstacle courses—and occasional horror shows—that we all need to navigate for ourselves, and even if we sometimes feel humiliated when we look back, that’s part of the point. Marcel Proust, who thought more intensely about memory and forgetting than anybody else, put it best in Within a Budding Grove:

There is no man…however wise, who has not at some period of his youth said things, or lived in a way the consciousness of which is so unpleasant to him in later life that he would gladly, if he could, expunge it from his memory. And yet he ought not entirely to regret it, because he cannot be certain that he has indeed become a wise man—so far as it is possible for any of us to be wise—unless he has passed through all the fatuous or unwholesome incarnations by which that ultimate stage must be preceded…We are not provided with wisdom, we must discover it for ourselves, after a journey through the wilderness which no one else can take for us, an effort which no one can spare us, for our wisdom is the point of view from which we come at last to regard the world. The lives that you admire, the attitudes that seem noble to you are not the result of training at home, by a father, or by masters at school, they have sprung from beginnings of a very different order, by reaction from the influence of everything evil or commonplace that prevailed round about them. They represent a struggle and a victory.

I believe this, even if I don’t have much of a choice. My childhood is a blur, but it’s also part of me, and on some level, it never ended. King might be speaking of adolescence itself when he writes in the first sentence of It: “The terror…would not end for another twenty-eight years—if it ever did end.” And I can only echo what Updike wistfully says elsewhere: “I’ve remained all too true to my youthful self.”

The number nine


Note: This post reveals plot details from last night’s episode of Twin Peaks.

One of the central insights of my life as a reader is that certain kinds of narrative are infinitely expansible or contractible. I first started thinking about this in college, when I was struggling to read Homer in Greek. Oral poetry, I discovered, wasn’t memorized, but composed on the fly, aided by the poet’s repertoire of stock lines, formulas, and images that happened to fit the meter. This meant that the overall length of the composition was highly variable. A scene that takes up just a few lines in the Iliad that survives could be expanded into an entire night’s recital, based on what the audience wanted to hear. (For instance, the characters of Crethon and Orsilochus, who appear for only twenty lines in the existing version before being killed by Aeneas, might have been the stars of the evening if the poet happened to be working in Pherae.) That kind of flexibility originated as a practical consequence of the oral form, but it came to affect the aesthetics of the poem itself, which could grow or shrink to accommodate anything that the poet wanted to talk about. Homer uses his metaphors to introduce miniature narratives of human life that don’t otherwise fit into a poem of war, and some amount to self-contained short stories in themselves. Proust operates in much the same way. One observation leads naturally to another, and an emotion or analogy evoked in passing can unfold like a paper flower into three dense pages of reflections. In theory, any novel could be expanded like this, like a hypertext that opens into increasingly deeper levels. In Search of Lost Time happens to be the one book in existence in which all of these flowerings have been preserved, with a plot that could fit into a novella of two hundred unhurried pages.

Something similar appears to have happened with the current season of Twin Peaks, and when you start to think of it in those terms, its structure, which otherwise seems almost perversely shapeless, begins to make more sense. In the initial announcement by Showtime, the revival was said to consist of nine episodes, and Mark Frost even said to Buzzfeed:

If you think back about the first season, if you put the pilot together with the seven that we did, you get nine hours. It just felt like the right number. I’ve always felt the story should take as long as the story takes to tell. That’s what felt right to us.

It was doubled to eighteen after a curious interlude in which David Lynch dropped out of the project, citing budget constraints: “I left because not enough money was offered to do the script the way I felt it needed to be done.” He came back, of course, and shortly thereafter, it was revealed that the length of the season had increased. Yet there was never any indication that either Lynch or Frost had done any additional writing. My personal hunch is that they always had nine episodes of material, and this never changed. What happened is that the second act of the show expanded in the fashion that I’ve described above, creating a long central section that was free to explore countless byways without much concern for the plot. The beginning, and presumably the end, remained more or less as conceived—it was the middle that grew. And a quick look at the structure of the season so far seems to confirm this. The first three episodes, which take Cooper from inside the Black Lodge to slightly before his meeting with his new family in Las Vegas, seemed weird at the time, but now they look positively conventional in terms of how much story they covered. They were followed by three episodes, the Dougie Jones arc, that were expanded beyond recognition. And now that we’ve reached the final three, which account for the third act of the original outline, it makes sense for Cooper to return at last.

If the season had consisted of just those nine episodes, I suspect that more viewers would have been able to get behind it. Even if the second act had doubled in length—giving us a total of twelve installments, of which three would have been devoted to detours and loose ends—I doubt that most fans would have minded. It’s expanding that middle section to four times its size, without any explanation, that lost a lot of people. But it’s clearly the only way that Lynch would have returned. For most of the last decade, Lynch has been contentedly pottering around with odd personal projects, concentrating on painting, music, digital video, and other media that don’t require him to be answerable to anyone but himself. The Twin Peaks revival, after the revised terms had been negotiated with Showtime, allowed him to do this with a larger budget and for a vastly greater audience. Much of this season has felt like Lynch’s private sketchbook or paintbox, allowing him to indulge himself within each episode as long as the invisible scaffolding of the original nine scripts remained. The fact that so much of the strangeness of this season has been visual and nonverbal points to Lynch, rather than Frost, as the driving force on this end. And at its best, it represents something like a reinvention of television, which is the most expandable or compressible medium we have, but which has rarely utilized this quality to its full extent. (There’s an opening here, obviously, for a fan edit that condenses the season down to nine episodes, leaving the first and last three intact while shrinking the middle twelve. It would be an interesting experiment, although I’m not sure I’d want to watch it.)

Of course, this kind of aggressive attack on the structure of the narrative doesn’t come without a cost. In the case of Twin Peaks, the primary casualty has been the Dougie Jones storyline, which has been criticized for three related reasons. The first, and most understandable, is that we’re naturally impatient to get the old Cooper back. Another is that this material was never meant to go on for this long, and it starts to feel a little thin when spread over twelve episodes. And the third is that it prevents Kyle MacLachlan, the ostensible star of the show, from doing what he does best. This last criticism feels like the most valid. MacLachlan has played an enormous role in my life as a moviegoer and television viewer, but he operates within a very narrow range, with what I might inadequately describe as a combination of rectitude, earnestness, and barely concealed eccentricity. (In other words, it’s all but indistinguishable from the public persona of David Lynch himself.) It’s what made his work as Jeffrey in Blue Velvet so moving, and a huge part of the appeal of Twin Peaks lay in placing this character at the center of what looked like a procedural. MacLachlan can also convey innocence and darkness, but by bringing these two traits to the forefront and separating them completely into Dougie and Dark Cooper, the show robs us of the amalgam that makes MacLachlan interesting in the first place. Like many stars, he’s chafed under the constraints of his image, and perhaps he even welcomed the challenges that this season presented—although he may not have known how his performance would look when extended past its original dimensions and cut together with the rest. When Cooper returned last night, it reminded me of how much I’ve missed him. And the fact that we’ll get him for two more episodes, along with everything else that this season has offered us, feels more than ever like a gift.

Written by nevalalee

August 28, 2017 at 9:17 am

My ten great books #2: In Search of Lost Time



The best advice I’ve found for approaching this enormous, daunting book is Roger Shattuck’s observation, in his useful study Proust’s Way, that Marcel Proust’s most immediate precursor is Scheherazade, the legendary author of The Thousand and One Nights. In Search of Lost Time has less in common with the novels that we usually read than with the volumes of myths and fairy tales that we devour in childhood, and it might seem more accessible to the readers who currently find it bewildering if, as Shattuck suggests, it had been titled The Parisian Nights. Proust is a teller of tales, and his work, like Homer’s, is infinitely expansible. An exchange that lasts for a few lines in an oral epic like The Iliad could have been expanded—as it probably was for certain audiences—into an entire evening’s performance, and Homer deploys his metaphors to introduce miniature narratives of human life that don’t otherwise fit into a poem of war. Proust operates in much the same way. One observation leads naturally to another, and an emotion or analogy evoked in passing can unfold like a paper flower into three dense pages of reflections. In theory, any good novel could be expanded like this, like a hypertext that opens into increasingly intimate levels: In Search of Lost Time happens to be the only book in existence in which all of these flowerings have been preserved. Its plot could fit into a novella of two hundred unhurried pages, but we don’t read Proust for the plot, even if he knows more about suspense and surprise than you might expect. His digressions are the journey, and the result is the richest continuous slice of a great writer’s mind that a work of fiction can afford.

And the first thing that you notice about Proust, once you’ve lived in his head for long enough, is that he has essential advice and information to share about everything under the sun. Proust is usually associated with the gargantuan twin themes of memory and time, and although these are crucial threads, they’re only part of a tapestry that gradually expands to cover all human life. At first, it seems a little unfair that our greatest writer on the subject of sexual jealousy should also be a genius at describing, say, a seascape, as well as a mine of insight into such diverse areas as art, class, childhood, travel, death, homosexuality, architecture, poetry, the theater, and how milk looks when it’s about to boil over, while also peopling his work with vivid characters and offering up a huge amount of incidental gossip and social reportage. When you look at it from another angle, though, it seems inevitable. Proust is the king of noticing, and he’s the author who first awakened me to the fact that a major novelist should be able to treat any conceivable topic with the same level of artistic and intellectual acuity. His only rival here is Shakespeare, but with a difference. Plays like Hamlet speak as much in their omissions and silences, leaving us to fill in the gaps. Proust, by contrast, says everything—it’s all there on the page for anyone who wants to unpack it—and you can’t emerge without being subtly changed by the experience. Like Montaigne, Proust gives us words to express thoughts and feelings that we’ve always had, and if you read him deeply enough, you inevitably reach a point where you realize that this novel, which seemed to be about everything else in the world, has been talking about you all along.

Written by nevalalee

May 9, 2017 at 9:00 am

Quote of the Day



You remember the story of the man who believed that he had the Princess of China shut up in a bottle. It was a form of insanity. He was cured of it. But as soon as he ceased to be mad he became merely stupid. There are maladies which we must not seek to cure because they alone protect us from others that are more serious.

Marcel Proust, The Guermantes Way

Written by nevalalee

December 21, 2016 at 7:30 am

In the cards



It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.

Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)


Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?

The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”

Written by nevalalee

October 21, 2016 at 8:21 am

The act of noticing



Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 24, 2014.

Yesterday, while playing with my daughter at the park, I found myself oddly fascinated by the sight of a landscaping crew that was taking down a tree across the street. It’s the kind of scene you encounter on a regular basis in suburbia, but I wound up watching with unusual attention, mostly because I didn’t have much else to do. (I wasn’t alone, either. Any kind of construction work amounts to the greatest show on earth for toddlers, and there ended up being a line of tiny spectators peering through the fence.) Maybe because I’ve been in a novelistic state of mind recently, I focused on details that I’d never noticed before. There’s the way a severed tree limb dangles from the end of the crane almost exactly like a hanged man, as Eco describes it in Foucault’s Pendulum, with its heavy base tracing a second, smaller circle in the air. I noted how a chainsaw in action sprays a fan of fine particles behind it, like a peacock’s tail. And when the woodchipper shoots chips into the back of the truck, a cloud of light golden dust forms above the container, like the soul of the tree ascending.

As I watched, I had the inevitable thought: I should put this into a story. Unfortunately, nothing I’m writing at the moment includes a landscaping scene, and the easiest way to incorporate it would be through some kind of elaborate metaphor, as we often see, at its finest, in Proust. (“As he listened to her words, he found himself reminded of a landscaping crew he had once seen…”) But it made me reflect both on the act of noticing and on the role it plays, or doesn’t, in my own fiction. Most of the time, when I’m writing a story, I’m following the dictates of a carefully constructed plot, and I’ll find myself dealing with a building or a city scene that has imposed itself by necessity on the action: my characters end up at a hospital or a police station, and I strain to find a way to evoke it in a few economical lines that haven’t been written a million times before. Occasionally, this strikes me as a backward way of working. It would be better, it seems, to build the story around locations and situations that I already know I can describe—or which caught my attention in the way that landscaping crew did—rather than scrambling to push out something original under pressure.


In fact, that’s the way a lot of novelists work, particularly on the literary end. One of the striking trends in contemporary fiction is how so much of it doubles as reportage, with miniature New Yorker pieces buried like bonbons within the larger story. This isn’t exactly new: writers from Nabokov to Updike have filled their novels with set pieces that serve, in James Wood’s memorable phrase, as “propaganda on behalf of good noticing.” What sets more recent novels apart is how undigested some of it seems. At times, you can feel the narrative pausing for a page or two as the writer—invariably a talented one, or else these sections wouldn’t survive the editorial process—serves up a chunk of journalistic observation. As Norman Mailer writes, rather unkindly, of Jonathan Franzen:

Everything of novelistic use to him that came up on the Internet seems to have bypassed the higher reaches of his imagination—it is as if he offers us more human experience than he has literally mastered, and this is obvious when we come upon his set pieces on gourmet restaurants or giant cruise ships or modern Lithuania in disarray. Such sections read like first-rate magazine pieces, but no better—they stick to the surface.

This isn’t entirely fair to Franzen, a superb noticer who creates vivid characters even as he auditions for our admiration. But I thought of this again after finishing Joseph O’Neill’s Netherland. It’s a novel I’d wanted to read for years, and I enjoyed it a hell of a lot, while remaining conscious of its constant shifts into what amounts to nonfiction: beautifully written and reported essays on New York, London, The Hague, India, cricket, and just about everything else. It’s a gorgeous book, but it ends up feeling more like a collection of lovingly burnished parts than a cohesive whole, and its acts of noticing occasionally interfere with its ability to invent real interactions for its characters. It was Updike himself, I think, who warned writers against mining their journals for material, and you can see why: it encourages a sort of novelistic bricolage rather than an organic discovery of the action, and the best approach lies somewhere in the middle. And there’s more than one way of telling a story. As I was studying the landscaping crew at the park, my daughter was engaged in a narrative of her own: she ran into her friend Elise, played on the seesaw, and then had to leave abruptly for a diaper change. Or, as Beatrix put it, when I asked about her day: “Park. Elise. Say hi. Seesaw. Poop. Go home.” And I don’t think I can do better than that.

“Pretty clever!”



Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What pop-culture concepts have you found to be ripe for everyday use?”

A few years ago, my wife and I went on a trip to Peru and Bolivia. It was meant as one last stab at adventure travel before kids—which we knew would soon figure in our future—made that kind of vacation impossible, and we’d planned an ambitious itinerary: Machu Picchu, Lake Titicaca, the salt flats of Uyuni. As soon as we landed in Cuzco, though, I was felled by a bout of altitude sickness that made me wonder if we’d have to cancel the whole thing. A few pills and a day of acclimation made me feel stable enough to proceed, but I never got entirely used to it, and before long, I found myself hiking up a hillside in the Lares Valley, my heart jackhammering in my chest like an animal that was trying to escape. To save face, I’d periodically pause on the trail to look around, as if to take in the view, when I was just trying to get my pulse under control. But what got me through it, weirdly, was the thought of Frodo and Samwise slogging their way toward Mount Doom: if a couple of hobbits could do it, then I could climb this hill, too. It was an image to which I clung for the rest of the way, and the fantasy that I was heading toward Mordor sustained me to the top. And later, when I confessed this to my wife, she only smiled and said: “Yeah. I was thinking of the Von Trapp family.”

We relate to pop culture in all kinds of complicated ways, but one of the most curious aspects of that dynamic is how it can motivate us to become slightly better versions of ourselves, even if such exemplars aren’t entirely realistic. (The real Von Trapps didn’t hike over the mountains into Switzerland: they took a train.) Yesterday, Patrick Stewart showed up on Reddit to answer some questions, and if there was one common thread to the comments, it was that Jean-Luc Picard had been a role model, and almost a surrogate father, for many viewers as they were growing up. Just as fairy tales, with their fantasies of power and destiny, allow children to come to terms with their physical vulnerability as they risk a greater engagement with the world, the heroes we admire in books and movies give us an ideal toward which to aspire. As long as we’re aware that complete success is probably unattainable—”We aim above the mark to hit the mark,” as Emerson said—I don’t see anything wrong with this. And such comparisons often cross our minds at the least dignified moments. Whenever I’m struggling to open a package, an image flits through my mind of Daniel Craig as James Bond, and I think to myself: “Bond wouldn’t have trouble opening a bag of pretzels, would he?” It doesn’t make what I’m doing any less annoying, but it usually inspires me to do something marginally more decisive, like getting a pair of scissors.


Pop culture can also provide ways of seeing our own lives from a new perspective, often by putting words to concepts and emotions that we couldn’t articulate before. At its highest level, it can take the form of the recognition that many readers feel when encountering an author like Proust or Montaigne: for long stretches, it feels eerily like we’re reading about ourselves. And a show like Seinfeld has added countless terms to our shared vocabulary of ideas, even if it was by accident, with the writers as surprised as anyone else by what struck a nerve. As writer Peter Mehlman says:

Every line was written just to be funny and to further the plot. But, actually, there was one time that I did think that a certain phrase would become popular. And I was completely wrong. In the “Yada Yada” episode, I really thought it was going to be the “antidentite” line that was going to be the big phrase, and it was not. That line went: “If this wasn’t my son’s wedding day, I’d knock your teeth out, you antidentite bastard.” The man who said it was a dentist. And no one remembers that phrase; it’s the “yada yada yada” line that everyone remembers.

Sometimes a free-floating line will just snag onto an existing feeling and crystallize it, and along with Seinfeld, The Simpsons has been responsible for more such epiphanies than any other series. Elsewhere, I’ve compared the repository of Simpsons quotes that we all seem to carry in our heads to the metaphorical language that Picard encountered in “Darmok,” and there’s no question that it influences the way many of us think about ourselves.

Take “Hurricane Neddy,” which first aired during the show’s eighth season. It probably wouldn’t even make it onto a list of my fifty favorite episodes, but there’s one particular line from it that has been rattling around in my brain ever since. After a hurricane destroys Flanders’s house, the neighborhood joins forces to rebuild it, only to do a spectacularly crappy job. It all leads to the following exchange:

Ned: “The floor feels a little gritty here.”
Moe: “Yeah, we ran out of floorboards there, so we painted the dirt. Pretty clever!”

Those last two words, which Moe delivers with a nudge to Ned’s ribs and an air of self-satisfaction, are ones that I’ve never forgotten. At least once a week, I’ll say to myself, in Moe’s voice: “Pretty clever!” The reason, as far as I can pinpoint it, is that I’m working in a field that calls for me to be “clever” on a regular basis, whether it’s solving a narrative problem, coming up with a twist for a short story, or just figuring out a capper for a blog post. Not all of these ideas are equally clever, and many of them fall into the category of what Frederik Pohl calls “monkey tricks.” But Moe’s delivery, which is so delighted with itself, reminds me both of how ridiculous so much of it is and of the necessity of believing otherwise. Sometimes I’m just painting the dirt, but I couldn’t go on if I wasn’t sort of pleased by it. And if I had to sum up my feelings for my life’s work, for better or worse, it would sound a lot like “Pretty clever!”

Written by nevalalee

August 21, 2015 at 8:34 am

Introduction to finality



Yesterday, while I was out of the house, my wife asked our daughter: “Do you want your veggie snacks in a bowl?” My daughter, who is two years old, replied: “Sure!” I wasn’t there to see it, but when I got back, I was assured by all involved that it was hilarious. Then, this morning, in response to another question, my daughter said: “Sure!” Without thinking twice, I said: “That’s a good callback, honey. Is that your new catchphrase?” Which made me realize how often I talk about myself and those around me as if we were on a television show. We’ve always used terms from art and literature to describe the structure of our own lives: when we talk about “starting a new chapter” or “turning the page,” we’re implicitly comparing ourselves to the characters in novels. Even a phrase like “midlife crisis” is indebted, almost without knowing it, to the language of literary criticism. It was perhaps inevitable, then, that we’d also appropriate the grammar of television, which is the art form that has the most in common with the way our lives tend to unfold. When I moved to New York after college, I thought of myself as the star of a spinoff featuring a breakout character from the original series, a supporting player who ended up being my roommate. And a friend once told me that he felt that the show jumped the shark after I got a job in finance. (He wasn’t wrong.)

Which goes a long way toward explaining why Community has exercised such a hold over the imaginations of its viewers. For the past six years, it hasn’t always been the funniest sitcom around, or the most consistent, or even the most creative. But it’s the show that thought most urgently about the ways in which we use television to understand ourselves. For most of the show’s run, these themes centered on the character of Abed, but as last night’s season finale—which feels an awful lot like it ought to be the last episode of the entire series—clearly demonstrated, their real impact was on Jeff. Community could sometimes be understood as a dialogue between Abed and Jeff, with one insisting on seeing events in terms of narrative conventions while the other brought him down to earth, but in the end, Jeff comes to see these tropes as a way of making sense of his own feelings of loss. We’re aware, of course, that these people are characters on a television series, which is why Abed’s commentary on the action was often dismissed as a winking nod to the audience. But it wouldn’t be so powerful, so compelling, and ultimately so moving if we didn’t also sense that seeing ourselves through that lens, at least occasionally, is as sane a way as any of giving a shape to the shapelessness of our lives.


Community may not endure as a lasting work of our culture—it’s more than enough that it was a great sitcom about half the time—but it’s part of a long tradition of stories that offer us metaphors drawn from their own artistic devices. Most beautifully, we have Shakespeare’s seven ages of man, which are framed as acts in a play. (Shakespeare returned to such images with a regularity that implies that he saw such comparisons as more than figures of speech: “Life’s but a walking shadow, a poor player / That struts and frets his hour upon the stage…” “These our actors, / As I foretold you, were all spirits and / Are melted into air, into thin air…”) Joyce used The Odyssey to provide a framework for his characters’ lives, as Proust did, more subtly, with The Thousand and One Nights. These are all strategies for structuring a literary work, but we wouldn’t respond to them so profoundly if they didn’t also reflect how we felt about ourselves. You could even say that fiction takes the form of a shapely sequence of causal events, at least in the western tradition, because we see our lives in much the same way. When you stand back, everyone’s life looks more or less the same, even as they differ in the details, and as we grow older, we see how much we’re only repeating patterns that others before us have laid down.

This might seem like a lot of pressure to place on a show that included a self-conscious fart joke—with repeated callbacks—in its final episode. But it’s the only way I can explain why Community ended up meaning more to me than any other sitcom since the golden age of The Simpsons. The latter show also ended up defining our lives as completely as any work of art can, mostly because its sheer density and longevity allowed it to provide a reference point to every conceivable situation. Community took a clever, almost Borgesian shortcut by explicitly making itself its own subject, and on some weird level, it benefited from the cast changes, creator firings, cancellations, and unexpected revivals that put its viewers through the wringer almost from the start. It was a show that was unable to take anything for granted, no more than any of us can, and even if it sometimes strained to keep itself going through its many incarnations, it felt like a message to those of us who struggle to impose a similar order on our own lives. Life, like a television show on the brink, has to deal with complications that weren’t part of the plan. If those ups and downs pushed Community into darker and stranger places, it’s a reminder that life gains much of its meaning, not from our conscious intentions, but as an emergent property of the compromises we’re forced to make. And like any television show, it’s defined largely by the fact that it ends.

Written by nevalalee

June 3, 2015 at 9:33 am

“Her face was that of a woman with secrets…”


"She had never considered herself particularly Indian..."

Note: This post is the thirteenth installment in my author’s commentary for Eternal Empire, covering Chapter 14. You can read the previous installments here.

Of all the misconceptions that frustrate aspiring writers, one of the most insidious involves the distinction between flat and round characters. As formulated by E.M. Forster in Aspects of the Novel, a flat character is one that expresses a single, unchanging idea or quality, while a round character has the ability to change or surprise us. One certainly sounds better than the other, and as a result, you’ll often find writers fretting over the fact that one character or another in their stories is flat, or wondering how to construct a suitably round character from scratch, as if it were a matter of submitting the proper design specifications. What all this misses is the fact that Forster’s original categories were descriptive, not prescriptive, and a round character isn’t inherently more desirable than a flat one: as with everything else in writing, it depends on execution and the role a particular character plays in the narrative as a whole. It’s true that Forster concludes by saying: “We must admit that flat people are not in themselves as big achievements as round ones.” But he also prefaces this with three full pages of reasons why flat characters can be useful—or essential—in even the greatest of novels.

So why should we ever prefer a flat character over a round? Forster notes that flat characters often linger in the memory more vividly after the novel is over; they can be brought onstage in full force, rather than being slowly developed; and they’re easily recognizable, which can serve as an organizing principle in a complicated story. (He even says that Russian novels could use more of them.) In the work of writers like Dickens, who gives us pretty much nothing but flat characters, or Proust, who uses almost as many, their interest arises from their interactions with one another and the events of the plot: “He is the idea, and such life as he possesses radiates from its edges and from the scintillations it strikes when other elements in the novel impinge.” If Forster had lived a little later, he might have also mentioned Thomas Pynchon, whose works are populated by caricatures and cartoons whose flatness becomes a kind of strategy for managing the novel’s complexity. Flat characters have their limitations; they’re more appealing when comic than tragic, and they work best when they set off a round character at the center. But most good novels, as Forster observes, contain a mixture of the two: “A novel that is at all complex often requires flat people as well as round, and the outcome of their collisions parallels life more accurately.”

“Her face was that of a woman with secrets…”

And a memorable flat character requires as much work and imagination as one seen in the round. A bad, unconvincing character is sometimes described as “flat,” but the problem isn’t flatness in itself—it’s the lack of energy or ingenuity devoted to rendering that one vivid quality, or the author’s failure to recognize when one or another category of character is required. A bad flat character can be unbearable, but a bad round character tends to dissolve into a big pile of nothing, an empty collection of notions without anything to hold it together, as we see in so much literary fiction. The great ideal is a round, compelling character, but in order to surprise the reader, he or she has to surprise the writer first. And in practice, what this usually means is that a character who was introduced to fill a particular role gradually begins to take on other qualities, not through some kind of magic, but simply as the part is extended through multiple incidents and situations. Sherlock Holmes is fairly flat as first introduced in A Study in Scarlet: he’s extraordinarily memorable, but also the expression of a single idea. It’s only when the element of time is introduced, in the form of a series of stories, that he acquires an inner life. Not every flat character evolves into roundness, but when one does, the result is often more interesting than if it were conceived that way from the ground up.

My own novels contain plenty of flat characters, mostly to fill a necessary function or story point, but the one who turned into something more is Maya Asthana. She began, as most flat characters do, purely as a matter of convenience. Wolfe needed to talk to somebody, so I gave her a friend, and most of her qualities were chosen to make her marginally more vivid in what I thought would be her limited time onstage: I made her South Asian, which was an idea left over from an early conception of Wolfe herself, and I decided that she’d be planning her wedding, since this would provide her with a few easy bits of business that could be introduced without much trouble. But as I’ve mentioned elsewhere, Asthana got caught up in a radical shift in the logic of the novel itself: I needed a mole and a traitor within the agency, and after my original plan turned out to be unworkable, I cast around for someone else to fill that role. Asthana happened to be handy. And by turning her into a villain without changing a word of her initial presentation in City of Exiles, I got something far more intriguing than if I’d had this in mind from the beginning. Chapter 14 of Eternal Empire represents our first extended look at Asthana from the inside, and I like how the characteristics she acquired before I knew her true nature—her vanity, her intelligence, her perfect life with her fiancé—vibrate against what she became. Not every character turns out this way; these novels are filled with minor players content to occupy their roles. But Asthana, lucky for me and unlucky for everyone else, wanted to be more…

The act of noticing


Jonathan Franzen

Yesterday, while playing with my daughter at the park, I found myself oddly fascinated by the sight of a landscaping crew that was taking down a tree across the street. It’s the kind of scene you encounter on a regular basis in suburbia, but I wound up watching with unusual attention, mostly because I didn’t have much else to do. (I wasn’t alone, either. Any kind of construction work amounts to the greatest show on earth for toddlers, and there ended up being a line of tiny spectators peering through the fence.) Maybe because I’ve been in a novelistic state of mind recently, I focused on details that I’d never noticed before. There’s the way a severed tree limb dangles from the end of the crane almost exactly like a hanged man, as Eco describes it in Foucault’s Pendulum, with its heavy base tracing a second, smaller circle in the air. I noted how a chainsaw in action sprays a fan of fine particles behind it, like a peacock’s tail. And when the woodchipper shoots chips into the back of the truck, a cloud of light golden dust forms above the container, like the soul of the tree ascending.

As I watched, I had the inevitable thought: I should put this into a story. Unfortunately, my current novel project doesn’t include a landscaping scene, and the easiest way to incorporate it would be through some kind of elaborate metaphor, as we often see, at its finest, in Proust. (“As he listened to her words, he found himself reminded of a landscaping crew he had once seen…”) But it made me reflect both on the act of noticing and on the role it plays, or doesn’t, in my own fiction. Most of the time, when I’m writing a story, I’m following the dictates of a carefully constructed plot, and I’ll find myself confronted by a building or a city scene that has imposed itself by necessity on the action: my characters end up at a hospital or a police station, and I strain to find a way of evoking it in a few economical lines that haven’t been written a million times before. Occasionally, this strikes me as a backward way of working. It would be better, it seems, to build the story around locations and situations that I already know I can describe—or which caught my attention in the way that landscaping crew did—rather than scrambling to push out something original under pressure.

Joseph O'Neill

In fact, that’s the way a lot of novelists work, particularly on the literary end. One of the striking trends in contemporary fiction is how so much of it doubles as reportage, with miniature New Yorker pieces buried like bonbons within the larger story. This isn’t exactly new: writers from Nabokov to Updike have filled their novels with set pieces that serve, in James Wood’s memorable phrase, as “propaganda on behalf of good noticing.” What sets more recent novels apart is how undigested some of it seems. At times, you can feel the narrative pausing for a page or two as the writer—invariably a talented one, or else these sections wouldn’t survive the editorial process—serves up a chunk of journalistic observation. As Norman Mailer writes, unkindly, of Jonathan Franzen:

Everything of novelistic use to him that came up on the Internet seems to have bypassed the higher reaches of his imagination—it is as if he offers us more human experience than he has literally mastered, and this is obvious when we come upon his set pieces on gourmet restaurants or giant cruise ships or modern Lithuania in disarray. Such sections read like first-rate magazine pieces, but no better—they stick to the surface.

This isn’t entirely fair to Franzen, a superb noticer who creates vivid characters even as he auditions for our admiration. But I thought of this again after finishing Joseph O’Neill’s Netherland this week. It’s a novel I’d wanted to read for years, and I enjoyed it a hell of a lot, while remaining conscious of its constant shifts into what amounts to nonfiction: beautifully written and reported essays on New York, London, The Hague, India, cricket, and just about everything else. It’s a gorgeous book, but it ends up feeling more like a collection of lovingly burnished parts than a cohesive whole, and its acts of noticing occasionally interfere with its ability to invent real interactions for its characters. It was Edmund Wilson, I think, who warned writers against mining their journals for material, and you can see why: it encourages a sort of novelistic bricolage rather than an organic discovery of the action, and the best approach lies somewhere in the middle. And there’s more than one way of telling a story. As I was studying the landscaping crew at the park, my daughter was engaged in a narrative of her own: she ran into her friend Elyse, played on the seesaw, and then had to leave abruptly for a diaper change. Or, as Beatrix put it, when I asked about her day: “Park. Elyse. Say hi. Seesaw. Poop. Go home.” And I don’t think I can do better than that.

In with the old


The Dick Van Dyke Show

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What are your pop cultural dealbreakers?”

Every year, it sometimes seems, a new television series is anointed the greatest show of all time. Oddly enough, it’s usually a show that happens to be airing right now, and which is still some distance away from finishing its run, which you’d think would be a necessary part of evaluating its place in the canon. It isn’t that the acclaim is always undeserved: even before “Ozymandias” and “Felina,” for instance, it was clear that Breaking Bad merited a place near the top of many people’s lists of classic shows. (And I’m not entirely innocent here. I think that Mad Men might be the best dramatic series I’ve ever seen, but I’m going to hold off until the series actually ends before staking out that position.) Part of this is simply an aspect of a larger trend toward hyperbole, which is inevitable given our fragmented media landscape: for a show to stand out among the dozens of hours of excellent television at our disposal these days, it needs to be hyped as the best thing ever. Fans have been doing this for a long time, but serious critics have started to do the same, to the point where if a similar tendency were at work among movies, we’d see a Sight and Sound list where Gravity and Inception had long since toppled Citizen Kane.

Still, there’s another, more disturbing phenomenon operating here, which speaks to a lack of curiosity in—or a sense of being overwhelmed by—the pop culture that emerged before we were born. Television in particular suffers from a loss of institutional memory. There’s just so much of it, with more being produced every day, that we don’t know where to begin, even if we make a point of seeking out older shows. Even as it stands, much of our cultural knowledge arises from accidents of biography, timing, and syndication. I was lucky enough to grow up during the golden era of Nick at Nite, which exposed me to some fantastic television (The Dick Van Dyke Show, Get Smart, Mary Tyler Moore) while leaving me unaware of many others (Andy Griffith, The Twilight Zone, All in the Family). And it’s likely that I suffer from some of the same biases that I see in others. I have a real problem with people who say that they can’t watch black-and-white movies, or three-camera sitcoms, but if I haven’t revisited many of the great series of the 70s, it’s because the videotape they used looks hideous to contemporary eyes, when older shows shot on film still look fantastic.

Aaron Paul on Breaking Bad

But a large portion of our responsibility as thinking adults who care about pop culture in any form lies in overcoming those preconceptions. It’s easy to stick with art that presents itself to us in a way we find immediately recognizable and accessible, but many—perhaps most—worthwhile stories teach us how to encounter them, whether because they’re rooted in the past, look toward the future, or both. Proust takes more effort than Stephen King, but both offer considerable rewards to those willing to seek them out, and this especially applies to works of art that have fallen off the radar. I’m always a little depressed, in a slightly guilty fashion, when I see that a friend’s bookshelves consist only of books published in the last five years or so, most of which might as well have been plucked directly from the front table at Barnes & Noble. Reading is a worthwhile pursuit in its own right, no matter the provenance of the works involved, but I still can’t help feeling that there’s something limiting in sticking to the books that everyone else you know is already buying. (If anything, in recent years, I’ve become unfairly biased toward the old, neglected, and out of print, to the point where I find myself erring in the other direction.)

Of course, we all have our cultural blinders: I could stand to be more aware, say, of the current occupants of the Billboard Hot 100, and I’m more than a little influenced in my choice of what television shows to watch by what people I respect think is cool. But what counts more than the particular books, movies, television, or music you love is that willingness to move beyond the familiar. The gradual expansion of one’s cultural comfort zone is an essential part of becoming a grownup, and while I can understand that reluctance in the young—I vividly remember how it felt to be in high school, when the kind of music you liked seemed like the only thing defining who you were to your peers—adults have no excuse. And it doesn’t exclude the possibility of strong likes and dislikes; in fact, it lays the groundwork, because those tastes are nourished by curiosity and wide experience, rather than easy assumptions. If you’re a curious reader and viewer, I’m going to love talking to you, even if our tastes diverge; if you refuse to move beyond one comfortable slice marked out by the tastemakers around you, we might have trouble having a conversation, even if we agree that Breaking Bad is pretty darned great.

Written by nevalalee

May 9, 2014 at 9:45 am

The mismeasure of man


Yardstick

I like to think of myself as a fairly rational guy, so it might come as a surprise to discover that I deeply dislike the metric system. Yes, it makes units easier to convert, and I understand why it’s more useful and less prone to error in science and engineering. In everyday life, though, it’s sorely lacking, and it forces people to move from a system of measurement that emerged over millennia of common experience to one whose units are essentially arbitrary. The imperial measures are reflections of the human body: an inch is the first joint of the thumb, a yard is either the length of an average pace or the distance from the nose to the tip of the forefinger, and a foot is, well, a foot. The result is intuitive, suited to ordinary needs, and lends itself to approximation, best guesses, and the rough and ready nature of daily life, which is governed literally by rules of thumb. And when the imperial units are discarded, it makes it all that much easier to ignore human scale in our buildings and surroundings, which leads in turn to the impersonal, alienating nature of so much modern architecture and design.

Anyway, that’s the end of that rant, and even if you don’t agree, I hope you’ll grant that there are reasons beyond simple laziness or inertia for wanting to preserve the inch, foot, and mile. (I’m aware that this is an unpopular stance: Reddit has a post this morning making fun of America’s “arbitrary retarded” system of measurement, and although the commenters have risen in defense of both the Fahrenheit scale and writing our dates with the month first, I don’t see a single voice in favor of imperial measures.) If the body has a head, as Gustav Eckstein famously said in his book of the same name, it’s equally true that the head has a body, and it’s inseparable from the way we interact with the world. When we forget this, we run the risk of making larger mistakes that have nothing to do with remembering the number of pints to a gallon. We accommodate ourselves to measurements we’ve imposed, rather than starting with the body and working our way out, and although it may seem chauvinistic for us as a species, it’s just a way of acknowledging that we all experience the world with human eyes and hands.

The author's copy of Proust

That’s true of how we write as well. Poetic meter is intimately connected to the rhythms of circulation and respiration, and even in prose, a sentence is generally the longest stretch of words we can say before taking a breath. If we divide pages into paragraphs, it’s partially to provide a logical structure, but it’s also because our eyes can’t deal with an uninterrupted wall of text. Chapters and other divisions within a story were originally the size of a scroll that could be comfortably held, and more recently, they’ve come to represent a unit that can be easily read within a particular stretch of time, whether it’s a single sitting or an evening of reading aloud. Constraints that arise out of necessity or convenience inevitably acquire syntactic meaning. If we think in sentences or paragraphs, it’s because the necessity of breathing between statements encouraged us to break up what we said into manageable chunks, which also creates a natural pause for consolidation or comprehension. (This may be part of the reason why I’m irrationally opposed to semicolons. We talk in commas, dashes, and periods, and the semicolon is the mark of a text that is only meant to be read.)

You could extend these analogies to most other forms of creative expression, from painting to music and finally to dance, which is the ultimate culmination of an approach to art that bases its assumptions on the form and capabilities of the human body. Within every medium, of course, there are artists who deliberately undermine and challenge our senses of scale, and when we’re confronted with an unnaturally long sentence—whether in an action scene in Thomas Harris or an introspective paragraph by Proust—it changes the way we engage with the material, either because we forget to breathe or because we need to breathe all the more. But this requires an understanding and appreciation of why those proportions are there in the first place, and even for those of us who are largely content to operate within these constraints, it helps to occasionally reacquaint ourselves with where they come from. That’s one reason why writers are often encouraged to read their work aloud: not only does it oblige us to slow down, to pay more attention to rhythm, and to experience our words using a range of senses, but it reconnects us to the fundamental way in which ideas and images are expressed, moment by moment, and one breath at a time.

Written by nevalalee

March 19, 2014 at 9:44 am

Making it long


In Search of Lost Time

Along with giving up movies and music, another consequence of becoming a new father is that I’ve found it increasingly hard to read long novels. Earlier this year, I started Infinite Jest for the first time, but I trailed off after a few hundred pages, not because I wasn’t enjoying it—I liked it a lot—but because it was becoming all but impossible for me to carve adequate reading time out of the limited hours in the day. Since then, I’ve read a lot of nonfiction, mostly for research, and a few shorter novels on the order of John D. MacDonald, but when I look at some of the larger volumes on my bookshelf, I feel a little daunted. I’m not sure when I’m going to have time for Life: A User’s Manual or The Tunnel or The Recognitions or any of the other big novels I bought years ago with every intention of reading them eventually. And although it’s possible that this year will turn out to be a fluke, it’s more likely that my reading life, like so many other things, has undergone a decisive shift. (Even my old trick of reading a big book on vacation may no longer work: it’s hard to balance Underworld in your hands when there’s also a baby strapped to your chest.)

Which is a shame, because I love big novels. This may sound strange coming from a writer who constantly preaches the values of cutting, but I can only report the facts: of the ten favorite novels I discussed here recently, fully half of them—In Search of Lost Time, The Magic Mountain, Gravity’s Rainbow, It, Foucault’s Pendulum—are enormous by any standard. I enjoy long novels for many of the same reasons it’s hard for me to read them these days: their sheer size forces you to give up a significant chunk of your life, and the psychic space they occupy can change the way you think, at least temporarily. When I first read Proust, there were moments when I felt that the events of the novel were objectively more real than anything I was doing at the time, which is something I suspect most readers of big books have experienced. Reading an enormous novel can start to feel like a second job, or an uncredited college class, or a stranger living in your house, especially once you’ve been at it for a while. I spent something like a decade picking at The Gold-Bug Variations before finally finishing it, and even though I have mixed feelings about the novel itself, the emotions it evokes are still vivid, if only because it was a part of my life for so long.

Lawrence of Arabia

And length can affect the content of the novel itself in unexpected ways. Edward Mendelson, in his famous essay on encyclopedic narratives, notes that many of these big, insane books—Gargantua and Pantagruel, Moby-Dick—deal with literal or figurative giants, as if the novel is conducting a narrative battle with its own bulk, like Don Quixote fighting the windmill. This also runs in the opposite direction: a subject like a white whale deserves a whale of a novel. Even in books that tackle more intimate themes, length can be a statement or strategy in itself. I’ve noted before that In Search of Lost Time is both a modern version of The Thousand and One Nights and a novelette that expands itself infinitely in all directions, like a Japanese paper flower dropped in water, and it needs to unfold over multiple volumes: we might be able to abridge Dumas or Hugo, but an abridged version of Proust would be a contradiction in terms. Its length isn’t just a consequence of a longer series of events or a more complicated story, but a philosophy of life, or of reading, that can only find its full expression in the span of pages that a long novel provides.

We find much the same thing in other works of art, particularly movies. William Goldman says that if you can’t tell a story in an hour and fifty minutes, you’d better be David Lean, and even then, you don’t know if you’re going to get Lawrence of Arabia or Ryan’s Daughter. Really long movies tend toward the grandiose, as if their ambitions were expanding simultaneously in space and time, but certain stories, regardless of scale, need that room to breathe: I wouldn’t want to lose a minute of Seven Samurai or Barry Lyndon or Yi Yi. And there’s something about a long movie that encourages a different kind of contemplation. As Roger Ebert notes in his review of the six-hour Little Dorrit:

Very long films can create a life of their own. We lose our moorings. We don’t know exactly where we stand within the narrative, and so we can’t guess what will happen next. People appear and reappear, grow older and die, and we accept the rhythm of the story rather than requiring it to be speeded up.

Hence a movie like Shoah, whose nine-hour runtime becomes a part of its message: its quiet, systematic accumulation of detail begins to feel like the only valid response to the monstrousness of the story it tells. Length, at its best, can represent a vision of the world, and it can feel as big as the world itself—as long as we give it the attention it deserves.

Tomorrow: Keeping it short.

My ten great books #2: In Search of Lost Time


In Search of Lost Time

(Note: For the rest of the month, I’m counting down the ten works of fiction that have had the greatest influence on my life as an author and reader, in order of their first publication. For earlier entries in the series, please see here.) 

The best clue I’ve found for understanding the work of Marcel Proust is Roger Shattuck’s observation, in his useful study Proust’s Way, that the author’s most immediate precursor is Scheherazade, the legendary author of The Thousand and One Nights. In Search of Lost Time has less in common with the novels we’re used to reading than the volumes of myths and fairy tales we devour in childhood, and I suspect that it would seem more accessible to the many readers who currently find it bewildering if, as Shattuck suggests, it had been called The Parisian Nights. Proust is a teller of tales, and like Homer, his work is infinitely expansible: an exchange that lasts for two lines in an oral epic like The Iliad could have been expanded—as it probably was for certain audiences—into an entire evening’s performance, and Homer uses his metaphors to introduce miniature narratives of human life that don’t otherwise fit into a poem of war. Proust operates in much the same way. One observation leads naturally to another, and an emotion or metaphor evoked in passing can unfold into three pages of reflections. In theory, any good novel could be expanded like this, like a hypertext that opens into increasingly intimate levels: In Search of Lost Time happens to be the only book in existence in which all of these flowerings have been preserved. Its plot could fit into a novella of two hundred unhurried pages, but we don’t read Proust for the plot, even if he knows more about suspense and surprise than you might initially expect. His digressions are the journey, and the result is the richest uninterrupted slice of a great writer’s mind that a work of fiction has afforded.

And the first thing you notice about Proust, once you’ve lived in his head for long enough, is that he has essential advice and information to share with us about everything under the sun. Proust is usually associated with the great twin themes of memory and time, and although these are crucial threads, they’re only part of a tapestry that gradually expands to cover all aspects of human life. At first, it seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape, not to mention a mine of insight into such diverse areas as art, class, childhood, travel, death, homosexuality, architecture, poetry, the theater, and how milk looks when it’s about to boil over, while also peopling his work with memorable characters and offering up a huge amount of incidental gossip and social reportage. When you look at it from another angle, though, it seems inevitable. Proust is the king of noticing, and he’s the author who first awakened me to the fact that a great novelist should be able to treat any conceivable topic with the same artistic and intellectual acuity. His only rival here is Shakespeare, but with a difference. Plays like Hamlet speak as much in their omissions and silences, leaving us to fill in the gaps. Proust, by contrast, says everything—it’s all there on the page for anyone who wants to unpack it—and you can’t emerge without being subtly changed by the experience. Like Montaigne, Proust gives us words to express thoughts and feelings we’ve always had, and if you read him deeply, you inevitably reach a point where you realize that this novel, which seemed to be about everything in the world, has been talking about you all along.

Written by nevalalee

September 24, 2013 at 9:00 am

The pleasures of underlining


The author's copy of Proust

There are some readers who would never dream of marking up a book’s pristine pages, but I’m an inveterate underliner. In some ways, I don’t think I’ve really read a book until I’ve had a chance to go through it with a pen. Back in high school and college, I tended to underline books in their entirety, and when I look back at my old copies of Dante or The Anatomy of Melancholy, it can be hard to find an unmarked sentence. This might seem to defeat the practical purpose of highlighting selected passages, but I wasn’t thinking in terms of later reference: it was my way of blazing a trail, of reminding myself how far I’d gone into Dante’s dark forest. Underlining a phrase leaves a distinct, permanent signpost for my future self long after the details of the book have faded. These days, my memory for what I’ve read is spotty at best, but when I open a book and see a passage I’ve marked, I know for sure that I’ve been there.

But I’m a little more selective about what I underline now than I was a decade ago. With nonfiction, I tend to focus on striking facts or insights, especially if I think they might be helpful later, either because I might put them in a story or because they offer useful perspectives or advice. (Many of the Quotes of the Day on this blog were originally found this way.) When I’m doing research for a novel, underlining serves a clear purpose: I’ll usually read through the book once, marking whatever catches my eye, then go back over it again to transfer the major points onto notecards. I’ve found that it saves time to indicate important passages with a thin pen or pencil line in the margin, much as readers of an earlier era scored the page with their thumbnails, which allows me to quickly flip through the book to find what I’ve marked. And a passage that seemed only mildly interesting at the time can later turn out to have enormous resonance. When I’m trying to figure out the plot of a novel, I always go through my old notecards to see if there’s anything I can salvage, and something I wrote down in passing will often have an important role to play years later.

The author's copy of Walden

With fiction, the process is a little harder to pin down. The real test is whether I think an underlined passage will give me pleasure when I come back to it in the future, and I’ll often hesitate for a second before committing myself. It might seem like I’m overthinking it, but I’ve found that looking back through a book I’ve selectively underlined is one of my great joys as a reader. When I revisit my marked copies of Proust or Thoreau, with my eye skipping from one passage to the next, I hit all the high points at once, and whenever I’m reading this way, I never want to do anything else. Just opening The Magic Mountain at random, for instance, I find this:

On the whole, however, it seemed to him that although honor had its advantages, so, too, did disgrace, and that indeed the advantages of the latter were almost boundless.

Even more interesting is when I come across a passage that I don’t remember, and which at first glance doesn’t seem to hold much of interest. If I look more closely, however, I’ll often find that it struck me for reasons that have since lost their urgency, leaving a fossil or snapshot of my emotional life at the time. The result is the closest thing I have to an intellectual autobiography. When I underline a book, it becomes a part of me.

As a result, most of the books I’ve bought in the last ten years are full of highlighted passages, as well as notes on the endpapers, where I’ll often jot ideas or observations if I don’t have a notebook handy. (Don’t tell anyone, but I’ve even been known to lightly underline library books, although only in pencil, and I always go back to erase my work once I’m done.) And it isn’t nearly the same in a Kindle, although it can be interesting to see what other readers have marked. Underlining a physical book brings the hand and the mind into a sort of temporary harmony, and I often feel, rightly or not, that I’m reading more deeply or attentively when I’m holding a pen. Just as I think it’s important to use pen and paper whenever possible while writing, I take pains to keep reading a tactile experience: marking it by hand turns a book from one of thousands of identical objects into something that belongs to me alone, and in the end, it comes to feel like a living being, or a friend.

Written by nevalalee

May 7, 2013 at 9:50 am

In search of quick fixes


Marcel Proust

If there’s one piece of advice that young writers receive more than any other, it’s that it can be dangerous to look for easy answers. Everyone dreams of an overnight success, and there are times when the need to get published and establish a name for yourself feels like a matter of life and death. Not surprisingly, many writers, especially younger ones, want to get the whole mess over with as soon as they possibly can. When I read posts by aspiring writers online, I’m often struck by the sense of urgency: they want to get published right now, and they’re hoping to discover a magic solution that will allow them to crack the problem of selling a book. The wise response, usually given by other writers who have been tackling the same challenge for years, is that it’s dangerous to seek a quick fix for something so amorphous as the writing life. If it takes ten thousand hours of practice or a million words to attain any degree of mastery, it isn’t a process you want to rush, and you need to be willing to settle in for an extended apprenticeship and long periods of doubt and frustration.

This is absolutely the right answer to give, and I’ve given it a few times myself. When I look back at my own life, however, I find that I’ve spent much of it in search of easy answers or overnight fame. I wrote my first novel at age thirteen, and when I was cranking it out on WordStar, I wasn’t thinking of it as an early stage in a long apprenticeship: I really wanted to write the best science-fiction novel of all time. Later, in high school, I got four hundred pages into an even more ambitious project, both because I wanted to get published as soon as possible and because I thought it might give me an edge in my college applications, which in retrospect seems like a rather misguided choice of extracurricular activities. And of all the projects I’ve attempted since then, finished or unfinished, published or unpublished, most of them were undertaken amid dreams of sudden glory, with what seemed like an urgent artistic deadline, usually in the form of an upcoming birthday. I knew intellectually that the writing life would be an extended process with as many defeats as triumphs. But each time I started a novel, I told myself that this one was going to be different.

Paul Graham

And I don’t necessarily think that this is the wrong approach to take. I’ve mentioned before that there’s a place for irrational optimism in the writing life: it’s such an uncertain, risky proposition that few writers would stick with it for long if they weren’t all convinced that they were the exception to the rule. As the venture capitalist Paul Graham has said: “One reason the young sometimes succeed where the old fail is that they don’t realize how incompetent they are.” And it’s important for young writers to overrate their own talents—or the odds of success for any particular project—because otherwise few debut novels would get written at all. Writing a novel is such a long, sometimes thankless process that you need to be convinced from the start that this is the project that will change your life. It rarely is, of course, and when that change finally happens, it never comes in quite the form you’ve been expecting. But as much as you may know this in your mind, you feel something else entirely in your gut. And that’s fine.

In the meantime, it’s that search for a quick fix that keeps you going, and when you look back, you often discover that you’ve learned a huge amount about craft almost by accident. Artistic maturity comes into being in the same way that Proust says we acquire wisdom—as a result of countless small mistakes, and by surviving all the “fatuous and unwholesome incarnations” that we pass through along the way:

We are not provided with wisdom, we must discover it for ourselves, after a journey through the wilderness which no one else can take for us, an effort which no one can spare us, for our wisdom is the point of view from which we come at last to regard the world.

In short, as much as I tell young writers to avoid the search for quick fixes, I know they aren’t going to listen—and they shouldn’t. Because artistic maturity is really just the result of a lifetime looking in vain for ways to avoid it.

Written by nevalalee

May 6, 2013 at 8:03 am

Posted in Publishing, Writing


Better late than never: On the Road

leave a comment »

I’m not sure how I managed to avoid On the Road for more than thirty years. Part of it, I suppose, was the sense that I was already too old for it. The music critic Dorian Lynskey includes it along with Tropic of Cancer and The Magus on a list of books you should read before you’re eighteen or not at all, and he’s probably right. As a result, my knowledge of Kerouac never went beyond 10,000 Maniacs and “That’s not writing, that’s typing.” Yet I knew I had to confront this book one day. Its central question, as its admirers love to remind us, is how to live, and when you’ve decided to write for a living, this isn’t just an abstract philosophical question, but a matter of urgent survival. On a practical level, I’m interested in any serious attempt to lay out the rules of the game. And when I picked up On the Road at last, I was genuinely curious to see what Kerouac had to teach me.

And what I discovered, unfortunately, is that I’m no longer convinced by the vision of life that On the Road represents. It begins promisingly, with Sal’s epic journey from New York to San Francisco, but founders on the figure of Dean Moriarty, presented to us initially as a reckless romantic, but who is really a monster of selfishness and, ultimately, a bore. The central figures are feckless car thieves, pickpockets, and shoplifters who leave a string of broken relationships—and abandoned children—in their headlong rush across the country. There’s a lot of talk about freedom and the embrace of the unknown, but never a moment in which anyone takes the ultimate risk of real human connection that demands any kind of personal sacrifice. The strongest emotion is Sal’s momentary infatuation with a beautiful prostitute at a Mexican brothel, but before long, we’re on the road again, leaving her to live a life that we suspect is far more interesting than those of the men we’ve been following.

And yet On the Road contains moments that shine with beauty, insight, and truth. There’s a scene in which Sal and Dean wind up in an all-night movie theater in Detroit, repeatedly watching Background to Danger with George Raft, Sydney Greenstreet, and Peter Lorre, until the movie takes up permanent residence in Sal’s brain:

We saw them waking, we heard them sleeping, we sensed them dreaming, we were permeated completely with the strange Great Myth of the West and the weird dark Myth of the East when morning came. All my actions since then have been dictated automatically to my subconscious by this horrible osmotic experience.

Kerouac is getting at something crucial here about how Hollywood and mass culture can shape our inner lives, and I wish he’d followed up on the hint, just as I wish we knew more about the insipid “mystery programs” that Marylou plays on the radio as they drive through the darkness of Texas.

What On the Road finally presents is a very limited version of life and its possibilities, and although Sal seems to acknowledge this by the end, I doubt that this is the message that the novel’s fans have taken away from it. It isn’t a model for the life of art, but a cautionary tale. Which isn’t to say that it isn’t worth reading, or even worth living for a time. Any book on how to live is necessarily constrained: Thoreau only lived at Walden Pond for two years, as a sort of contained experiment before moving on to a more conventional life, even as the traces of the sojourn still lingered. And what Kerouac gives us is a chronicle of the journey that every thinking person has to pass through on the way to something else, like the countless mistakes that Proust reminds us lie on the path to wisdom. In the end, Dean is still on the road, while Sal, like all writers, decides to settle for something more ordinary that will allow him to tell Dean’s story. And that’s where the true adventure begins.

Written by nevalalee

November 13, 2012 at 10:32 am
