Posts Tagged ‘The New Yorker’
I’ve never owned a dictionary. Well, that isn’t precisely true. Looking around my bookshelves now, I can see all kinds of specialized dictionaries without leaving my chair, from Hobson-Jobson: The Anglo-Indian Dictionary to Partridge’s Dictionary of Slang and Unconventional English to Brewer’s Dictionary of Phrase and Fable. About a year ago, moreover, I was lucky enough to acquire not just a dictionary, but the dictionary. As much as I love my Compact Oxford English Dictionary, however, it isn’t exactly made for everyday use: the volumes are bulky, the print is too small to read without a magnifying glass, and it’s easy to get lost in it for hours when you’re just trying to look up one word. And as far as a conventional desk dictionary is concerned, I haven’t used one in a long time. My vocabulary is more than adequate for the kind of fiction I’m writing, and whenever I have to check a definition just to be on the safe side, there are plenty of online resources that I can consult with ease. So although I have no shortage of other reference books, I just never saw the need for Webster’s.
But I was wrong. Or at least I’m strongly reconsidering my position after reading the latest in John McPhee’s wonderful series of essays on the writing life in The New Yorker. The most recent installment covers a lot of ground—it contains invaluable advice on how to write a rough draft, which McPhee says you should approach as if it were a letter to your mother, and includes a fascinating digression on the history of the magazine’s copy editors—but the real meat of the piece lies here:
With dictionaries, I spend a great deal more time looking up words I know than words I have never heard of—at least ninety-nine to one. The dictionary definitions of words you are trying to replace are far more likely to help you out than a scattershot wad from a thesaurus.
The emphasis is mine, but McPhee’s case speaks for itself. He explains, for instance, that he wrote the sentence “The reflection of the sun races through the trees and shoots forth light from the water” after seeing “to shoot forth light” in the dictionary definition of “sparkle.” And after struggling to find a way to describe canoeing, he looked up the definition of the word “sport” and found: “A diversion in the field.” Hence:
A canoe trip has become simply a rite of oneness with certain terrain, a diversion in the field, an act performed not because it is necessary but because there is value in the act itself.
As far as thesauruses go, McPhee calls them “useful things” in their proper place: “The value of a thesaurus is in the assistance it can give you in finding the best possible word for the mission that the word is supposed to fulfill.” In my own case, I tend to use a thesaurus most often in the rewrite, when I’m drilling down more deeply into the meaning of each sentence, and when issues of variety and rhythm start to take greater precedence. I rely mostly on the thesaurus function in Word and on occasional trips to the excellent free thesauruses available online, where the hyperlinks allow me to skip more easily from one possible synonym to another. And although I recently found myself tempted by a copy of Roget’s at my local thrift store, I expect that I’ll stick to my current routine. (Incidentally, I’ve found that I tend to read thesauruses most obsessively when I’m trying to figure out the title for a novel, which is an exhausting process that needs all the help it can get—I vividly remember going to Thesaurus.com repeatedly on my phone while trying to find a title for what eventually became City of Exiles.)
But McPhee has sold me on the dictionary. After briefly weighing the possibility of picking up McPhee’s own Webster’s Collegiate, I ended up buying a used copy of the American Heritage Dictionary, since I remember it fondly from my own childhood and because it’s the dictionary most warmly recommended by the Whole Earth Catalog, which has never steered me wrong. It’s coming on Tuesday, and after it arrives, I wouldn’t be surprised if it took up a permanent place on my desk, next to my reference copies of my own novels and A Choice of Shakespeare’s Verse by Ted Hughes. Whether or not it will change my style remains to be seen, but it’s still something I wish I’d done years earlier. Dictionaries, as all writers know, are books of magic, and we should consult them as diligently as we would any religious text, an act, like canoeing, performed not because it is necessary but because there is value in the act itself. As Jean Cocteau says: “The greatest masterpiece in literature is only a dictionary out of order.”
Like most people, I’ll occasionally come up with what I’m convinced is a great invention or business idea. Now that I have a newborn daughter in the house, my brainstorms tend to center around products for babies: for instance, a reliable baby glove that won’t slip off within seconds of being pulled on, leaving my daughter’s sharp nails free to claw at her little face. I’m not particularly tempted to follow up on these ideas, of course, partly because products for babies can be hard to test and market—as an acquaintance of mine recently pointed out, the best idea in the world isn’t worth much after it sends one kid to the emergency room—and partly because I lack the skills and inclination to develop a business idea into something more. I’ve spent all my time learning how to write, and I suspect that a real entrepreneur would respond to my ideas for a killer baby app with the same impatience with which I regard people who insist that they’re full of great ideas for novels, if only they had the time to write them down.
Writers and entrepreneurs have at least one thing in common: success in either field rarely comes down to one magic idea. Rather, in the latter case, it’s the habit of entrepreneurship itself, and the ability to develop ideas and bring them to completion, that typifies the best startups. Paul Graham, the programmer, essayist, and venture capitalist I’ve quoted here before, likes to say that he’s investing in the personalities of the founders, not in the product they’re currently selling. A recent Vanity Fair piece by Randall Stross on Graham’s Y Combinator, a sort of startup boot camp where teams of young entrepreneurs pitch ideas for funding, points out that a team’s initial concept will often change between the time their application is accepted and the day of their actual interview. “We liked you guys more than the idea,” Graham tells one group at the start of a meeting, and he cautions that a company’s goals will often change as the founders figure out what problem they’re trying to solve. As the article notes:
Graham is much more interested in the founders than in the proposed business idea. When he sees a strong team of founders with the qualities that he believes favor success, he will overlook a weak idea.
And if a writer often resembles a kind of serial entrepreneur, it’s because what sets him apart is less that his ideas are better than anyone else’s—good ideas, as we all know, are cheap—than that he’s relentlessly resourceful, and knows what to do with an idea when he sees it. In short, he’s like Ron Popeil, the pitchman behind the Ronco Food Dehydrator, Mr. Microphone, and the Pocket Fisherman. As Malcolm Gladwell observes in a famous New Yorker piece, Popeil isn’t just a salesman, but an inveterate tinkerer, the kind of man who might “lay awake at night thinking of a way to chop an onion so that the only tears you shed were tears of joy.” And because he has the skills not just to come up with an idea, but to package and market it, he’s done so again and again. Similarly, as a writer, I have plenty of room for improvement, but if there’s one thing I’ve learned, it’s that given a decent idea and sufficient time—a few weeks for a short story, nine months to a year for a novel—I can turn it into a finished manuscript. Whether anyone else will want to buy it is another matter. But as Stephen Sondheim says in another context, it will be a proper song.
Of course, the fact that I had to quit my job to figure out writing to my own satisfaction hints at another point of similarity. As Graham notes elsewhere:
Statistically, if you want to avoid failure, it would seem like the most important thing is to quit your day job. Most founders of failed startups don’t quit their day jobs, and most founders of successful ones do. If startup failure were a disease, the CDC would be issuing bulletins warning people to avoid day jobs.
Yet I wouldn’t say that quitting one’s day job is what makes a successful entrepreneur. More likely, it’s the other way around: true entrepreneurs, like writers, tend to be people who just aren’t happy doing anything else, so it’s only a matter of time before they decide to devote all of their energies to it. I probably could have learned how to write a decent novel while holding down another job on the side—and plenty of other authors have done so—but quitting my job, while not the cause, was certainly an effect of where my life ended up taking me. And although a writer’s life, like an entrepreneur’s, is hardly an easy one, it allows me to say, in Popeil’s enticing words: “But wait, there’s more…”
Learning about a writer’s outlining methods may not be as interesting as reading about his or her sex life, but it exercises a peculiar fascination of its own—at least for other writers. Everyone else probably feels a little like I did while reading Shawn McGrath’s recent appreciation of the beautiful source code behind Doom 3: I understood what he was getting at, but the article itself read like a dispatch from a parallel universe of lexical analyzers and rigid parameters. Still, the rules of good structure are surprisingly constant across disciplines. You don’t want more parts than you need; the parts you do have should be arranged in a logical form; and endless tinkering is usually required before the result has the necessary balance and beauty. And for the most part, the underlying work ought to remain invisible. The structure of a good piece of fiction is something like the structure of a comfortable chair. You don’t necessarily want to think about it while you’re in it, but if the structure has been properly conceived, your brain, or your rear end, will thank you.
In recent weeks, I’ve been lucky enough to read two enjoyable pieces of structure porn. The first is John McPhee’s New Yorker essay on the structure of narrative nonfiction; the second is Aaron Hamburger’s piece in the New York Times on outlining in reverse. McPhee’s article goes into his methods in great, sometimes laborious detail, and there’s something delightful in hearing him sing the praises of his outlining and text editing software. His tools may be computerized, but they only allow him to streamline what he’d always done with a typewriter and scissors:
After reading and rereading the typed notes and then developing the structure and then coding the notes accordingly in the margins and then photocopying the whole of it, I would go at the copied set with the scissors, cutting each sheet into slivers of varying size…One after another, in the course of writing, I would spill out the sets of slivers, arrange them ladderlike on a card table, and refer to them as I manipulated the Underwood.
Regular readers will know that this is the kind of thing I love. Accounts of how a book is written tend to dwell on personal gossip or poetic inspiration, and while such stories can be inspiring or encouraging, as a working writer, I’d much rather hear more about those slivers of paper.
And the reason I love them so much is that they get close to the heart of writing as a profession, which has surprising affinities with more technical or mechanical trades. Writing a novel, in particular, hinges partially on a few eureka moments, but it also presents daunting organizational and logistical challenges. A huge amount of material needs to be kept under control, and a writer’s brain just isn’t large or flexible enough to handle it all at once. Every author develops his or her own strategies for corralling ideas, and for most of us, it boils down to taking good notes, which I’ve compared elsewhere to messages that I’ve left, a la Memento, for my future self to rediscover. By putting our thoughts on paper—or, as McPhee does, in a computerized database—we make them easier to sort and retrieve. It looks like little more than bookkeeping, but it liberates us. McPhee says it better than I ever could: “If this sounds mechanical, the effect was absolutely the reverse…The procedure eliminated all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.”
This kind of organization can also take place closer to the end of the project, as Hamburger notes in his Times piece. Hamburger says that he dislikes using outlines to plan a writing project, and prefers to work more organically, but also observes that it can be useful to view the resulting material with a more objective, even mathematical eye. What he describes is similar to what I’ve called writing by numbers: you break the story down to individual scenes, count the pages or paragraphs, and see how each piece fits in with the shape of the story as a whole. Such an analysis often reveals hidden weaknesses or asymmetries, and the solution can often be as simple as the ten percent rule:
In [some] stories, I found that most of the scenes were roughly equal in length, and so cutting became as easy as an across-the-board budget cut. I dared myself to try to cut ten percent from each scene, and then assessed what was left. Happily, I didn’t always achieve my goal—because let’s face it, writing is not math and never should be. Yet what I learned about my story along the way proved invaluable.
I agree with this wholeheartedly, with one caveat: I believe that writing often is math, although not exclusively, and only as a necessary prop for emotion and intuition. Getting good ideas, as every writer knows, is the easy part. It’s the structure that makes them dance.
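For what it’s worth, here’s a minimal sketch, in Python, of what that scene-by-scene arithmetic might look like—the scene names, word counts, and the 1.25x flag threshold are all invented for illustration, not anything Hamburger prescribes:

```python
# A toy version of the "writing by numbers" audit: tally each scene's
# length, compare it to the mean, and compute an across-the-board
# ten percent cut target. All figures here are invented.

scenes = {
    "The Funeral": 2100,
    "The Letter": 1950,
    "The Long Drive": 3400,
    "The Confrontation": 2050,
    "The Aftermath": 1900,
}

mean = sum(scenes.values()) / len(scenes)

for name, words in scenes.items():
    ratio = words / mean
    flag = "  <- check for flab" if ratio > 1.25 else ""
    target = round(words * 0.9)  # the ten percent rule
    print(f"{name:20} {words:5} words ({ratio:.2f}x mean), cut to ~{target}{flag}")
```

None of this replaces judgment, of course—the numbers only tell you where to look.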
Last week, I read Daniel Mendelsohn’s touching account in The New Yorker of his youthful correspondence with Mary Renault, the author of The King Must Die and other novels set in ancient Greece. Mendelsohn’s tribute to her generosity is very moving, and it’s a story that I think every writer should read, if only to be reminded of how important even small acts of kindness to a fan can be, and the impact they can have on a young person’s life. Most readers will probably take the greatest interest in Mendelsohn’s discussion of how Renault’s novels, with their frank treatment of homosexuality, helped him come to terms with being gay, but I was even more struck by the fact that her books also inspired him to become a classicist. “The writers we absorb when we’re young bind us to them, sometimes lightly, sometimes with iron,” Mendelsohn writes. “In time, the bonds fall away, but if you look very closely you can sometimes make out the pale white groove of a faded scar, or the telltale chalky red of old rust.”
In a sense, the choice of an undergraduate major is one of the few reasonably pure decisions most of us ever make. The process of choosing a career, especially your first job, is usually constrained by many factors out of your control, but in theory, college presents a limitless—and dizzyingly accessible—range of possibilities. I still remember the heady thrill I felt while browsing through the course catalog as a freshman, and the realization that I really could become, say, an astrophysicist, if only I were willing to put in the necessary work. Obviously, going to a good college is a privilege that not everyone can afford, and our choices are probably more limited than they seem: I probably wouldn’t have made much of an astrophysicist, or a psychologist, or any of the other things that briefly seemed so enticing. Yet it’s one of the few times in our lives when we’re at least given the illusion of being able to influence our own fates, even if we’re often too young at the time to really know what we’re doing.
Like Mendelsohn’s, my decision to become a classicist was informed by the books I read growing up. First among them is D’Aulaires’ Book of Greek Myths, a volume I all but memorized in grade school, and which I still think is one of the ten best children’s books ever written. Like all great books for kids, it draws you in at first with its surface pleasures, especially its gorgeous illustrations, only to reveal surprising depths. It’s a wise, intelligent retelling of Greek mythology without a trace of condescension, and it taught me things that came in handy years later in my college classes: I’ll never forget my pride as the only student in my section who recognized an obscure reference to the story of Tithonus, who was transformed into a cicada when his lover, the goddess Eos, asked that he be granted eternal life, but forgot to ask for eternal youth as well. When one of my classmates asked how I knew this, I replied simply: “From D’Aulaires.” And I wasn’t alone: I know for a fact that many of my fellow concentrators could trace their love of classical literature to the same book.
And it was only the first in a long chain of books that led me further down the same path. I have a hunch that my urge to learn Latin and Greek was subconsciously influenced by the Indiana Jones trilogy, in which a knowledge of dead languages was clearly a prerequisite, as well as by my love of such authors as Robert Graves and Umberto Eco, for whom such proficiency was a given. Later, I was haunted by John Gardner’s admonition, in The Art of Fiction, that “the really serious-minded way” for a writer to build his vocabulary was to study classical and modern languages. As a result, when I got to college, I was primed to at least take a few courses in Classics, and would probably have ended up majoring in it anyway even if I hadn’t been given an extra quixotic push by the book Who Killed Homer? But the seeds had been planted long before. My life, like Mendelsohn’s, would have been completely different if you’d taken away only five or six books that I read almost by accident. And I wear their faded scars with pride.
Earlier this week, I read Keith Gessen’s fascinating account in The New Yorker of the voyage of the Nordic Odyssey, a bulk carrier with a load of iron ore that sailed from Russia to China through the melting Arctic ice. Gessen notes that one of the greatest challenges on a month-long voyage like this is boredom: deprived of email and alcohol, crew members tend to spend their time playing solitaire, watching downloaded television shows on their laptops, and engaging in epic ping-pong matches. Reading this, I began to daydream of the books I would take on such a voyage. I often like to ask myself what books I would keep if I were compelled, for reasons of space, weight, or minimalism, to restrict myself to a few compact volumes, and recently I’ve been thinking about this a lot, perhaps because, with a baby in the house, I’m not sure when I’ll take such a trip again. The books I’d bring would need to be dense, open to rereading, and small enough to fit in a suitcase or backpack—which means that I’d need to leave Proust and The Annotated Sherlock Holmes behind. At the moment, in my private reveries, this is my traveler’s library:
1. Zen in English Literature and Oriental Classics. I’ve long been a devotee of R.H. Blyth’s eccentric, prickly masterpiece, but I’ve gradually come to see it as one of my indispensable books, and perhaps the only one I’d bring with me if, for whatever reason, I had to spend a year or two reading nothing but what I could carry. At heart, it’s an opinionated, sometimes cranky philosophy of life that owes as much to Jesus, Wordsworth, and Shakespeare as to the Zen of the title, and it comes closer than any other book I’ve found to laying out the virtues I strive, with mixed success, to apply to my own life: simplicity, detachment, and objectivity. It’s a distilled anthology of some of the world’s best poetry and prose, both east and west; a spiritual handbook; a guide to literary and artistic expression; and a mine of practical wisdom. I’ve turned to it constantly in recent years, in both good and difficult times, and it’s been an unfailing source of inspiration and pleasure. For decades, it’s been out of print and difficult to find, but I see with some satisfaction that an inexpensive reprint edition should be available in February. Pick it up if you can—I don’t think you’ll regret it.
2. The Five Gospels. Even if you’re an agnostic like me, it’s hard to deny that the gospels contain some of the most compelling distilled wisdom in all of literature, though their message tends to be lost in interpretation and transmission. The genius of the Jesus Seminar has been its commitment to teasing out the core of the original teachings, using a sort of best consensus—based on a majority vote—of textual and historical criticism, and their findings are elegantly presented in this book, which prints the texts of Mark, Matthew, Luke, John, and Thomas with colored annotations to indicate various degrees of perceived authenticity. You can quarrel with their methodology and assumptions, and many have, but it’s still riveting to be presented with what certainly feels like the center of what remains the most challenging of all ethical paths, if we’re willing to read it as closely as it demands: “Turn the other cheek.” “Blessed are the poor.” “Walk the second mile.” “Love your enemy.” And underlying it all is the seminar’s pithy admonition, which we’d all be advised to take: “Beware of finding a Jesus entirely congenial to you.”
3. A Pattern Language. Any great work of philosophy should also be full of useful advice, and the beauty of Christopher Alexander’s classic book—which is the best work of nonfiction of the past fifty years—is that it begins with a vision of the world on the level of nations and cities and brings it down to open shelving and window seats, while managing to remain a seamless whole. Reading it, it’s hard not to fall under the spell of its language and rhythms, which are simultaneously logical, soothing, and impassioned, and quickly come to seem like the voice of a trusted guide and friend. It’s primarily about architecture, but it’s impossible not to apply its lessons to all other aspects of one’s life, from political engagement to writing to web design. Each entry leads to countless others, while also inviting sustained thought and meditation. If I could give only one book to President Obama to read, this would be the one, and it’s also the book, above all others, that seems to offer the best tools to construct a meaningful life of one’s own, whether at home, on the road, or in a cabin on a ship in the Arctic Sea.
I tried to imagine a fella smarter than myself. Then I tried to think, “What would he do?”
—David Mamet, Heist
Writers, by definition, are always trying to punch above their weight. When you sit down to write a novel for the first time, you’re almost comically inexperienced: however many books you’ve read or short stories you’ve written, you still don’t know the first thing about structuring—or even finishing—a long complicated narrative. Yet we all do it anyway. This is partly thanks to the irrational optimism that I’ve said elsewhere is a crucial element of any writer’s psychological makeup, in which we’re inclined to believe that we’re smarter and more prepared than we actually are. There’s nothing wrong with this; it’s the only way any of us will ever grow as writers, as we slowly evolve into the level of competence we’ve imagined for ourselves. Still, in any project, there always comes a time when a writer, however experienced, realizes that he’s taken on more than he can handle. The story is there, unwritten, and it’s beautiful in his head, but lost in the translation to the printed page. One day, he hopes, he’ll be good enough to realize it, but that doesn’t help him now. What he really needs is a way to temporarily become a better writer than he already is.
This may sound like witchcraft, but in reality, it’s something that writers do all the time. When we start out, we have no choice but to imitate the artists we admire, because when we set out to write that first page, we lack the experience of life and craft that only years of work can bring. Eventually, we move past imitation to find a voice and style of our own, but there are still times when we find ourselves compelled to channel the spirit of our betters. We do this when we start each day by reading a few pages from the work of a writer we like, or when we approach a tough moment in the plot by asking ourselves what Updike or Thomas Harris in his prime would do. Some of us go even further. In this week’s issue of The New Yorker, James Wood talks about a friend who became so obsessed by the work of the Norwegian writer Per Petterson that he copied out one of his novels word for word. This isn’t about stylistic plagiarism or slavish imitation, but a kind of sympathetic magic, a hope that we can conjure up the spirit of a more experienced writer just long enough to solve the problems in front of us.
And the act of imitation itself can lead to surprising places. There’s a great deleted scene from the notorious documentary The Aristocrats in which Kevin Pollak delivers the titular joke in the style of Albert Brooks. After milking it for two delicious minutes, he takes a sip of coffee and says:
That’s the trippy thing about doing Brooks, though—I’m faster and funnier than I am as myself. It’s very, very sad. It’s a possession. I hate to do it because, literally, I’m listening to myself and thinking, “Why am I never this funny?”
I’m not a huge Kevin Pollak fan, but I love this clip, because it gets at something important and mysterious about the way artistic imitation works. Pollak is a skilled mimic who does a good, if not great, impression of Albert Brooks on all the superficial levels—his vocal tics, his tone, the way he holds his face and body. Somewhere along the line, though, these surface impressions work a deeper transformation, and he finds himself temporarily thinking like Brooks. This is why typing out the work of a writer we admire can be so helpful: there’s no better way of opening a window, even just for a crucial moment or two, into someone else’s brain.
The best kind of imitation, as Pollak says, is a possession, in which we will ourselves, almost unconsciously, into becoming better artists than we really are. Imitation can become dangerous, however, when we focus on the superficial without also channeling more fundamental habits of mind. This morning, while watching the new teaser trailer for Star Trek Into Darkness, which clearly takes many of its cues from the recent films of Christopher Nolan, I was amused by the thought that while Nolan has done more than any contemporary director to push the envelope of visual and narrative complexity in mainstream movies, the big takeaway for other filmmakers—or at least those who assemble the trailers—has apparently been a “BWONG” sound effect. But big influences can arise from small beginnings. The qualities that most deserve imitation in the artists we admire have little to do with the obvious trademarks of their style, and if we imitate those aspects alone, we’re just being derivative. But sometimes it’s those little things that allow us to temporarily acquire the mindset of smarter artists than ourselves, until, finally, we’ve made it our own.
A few months ago, I wrote a piece for Salon on whether there was such a thing as a New Yorker feature curse. I was largely inspired by the example of John Carter, in which the magazine’s highly positive profile of director Andrew Stanton was followed shortly thereafter by a debacle that deserves its own book, like Final Cut or The Devil’s Candy, to unpack in all its negative glory. Judging from the response, a lot of readers misunderstood the piece, with one commenter sniffing that I should read Carl Sagan’s The Demon-Haunted World before spreading so much superstition. My point, to the extent I had one, was that the New Yorker curse, like its counterpart at Sports Illustrated, was likely a case of regression to the mean: magazines like this have only a limited amount of feature space to devote to the movies, which means they tend to pick artists who have just had an outstanding outlier of a success—which often means that a correction is on the way. And although my theory has been sorely tested by Seth MacFarlane’s Ted, which is now the highest-grossing R-rated comedy of all time, at first glance, the recent failure of Cloud Atlas, which follows a fascinating profile of the Wachowskis by Aleksandar Hemon, seems to indicate that the curse is alive and well.
Yet at the risk of sounding exactly as arbitrary as my critics have accused me of being, I can’t quite bring myself to lump it into the same category. This isn’t a movie like John Carter, which was undermined by a fundamentally flawed conception and a lot of tactical mistakes along the way. Cloud Atlas has its problems, but as directed by the Wachowskis and Tom Tykwer after the novel by David Mitchell, it’s a real movie, an ambitious, entertaining, often technically spellbinding film that probably never had a shot at finding a large popular audience. I’m not a huge fan of the Wachowskis, who over the past decade have often seemed more intelligent in their interviews than in their movies, but I give them and Tykwer full credit for pursuing this dazzling folly to its very end. Cloud Atlas is like The Tree of Life made in a jazzy, sentimental, fanboyish state of mind, and although it doesn’t succeed entirely, under the circumstances, it comes closer than I ever expected. It’s the kind of weird, personal, expensive project that gives fiascos a good name, and it’s one of the few movies released this year that I expect to watch again.
And with one exception, which I’ll mention in a moment, the movie’s flaws are inseparable from its fidelity to the underlying material. I liked Mitchell’s novel a lot, and as with the movie it inspired, it’s hard not to be impressed by the author’s talents and ambition. That said, not all of its nested novelettes are equally interesting, and its structure insists on a deeper network of resonance that isn’t always there. Some of its connections—the idea that Sonmi-451 would become a messianic figure for the world after the fall, for instance, or that she’d want to spend her last few moments in life catching up with the story of Timothy Cavendish—don’t quite hold water, and in general, its attempts to link the stories together symbolically, as with the comet-shaped birthmark that its primary characters share, are too facile to be worthy of Mitchell’s huge authorial intelligence. (You only need to compare Cloud Atlas to a book like Dictionary of the Khazars, which does keep the promises its structure implies, to see how the former novel falls short of the mark.) And the movie suffers from the same tendency to inform us that everything here is connected, when really, its stories are simply juxtaposed in the editing room.
All the same, the movie, like the book, is one that demands to be experienced. There are a few serious lapses, most unforgivably at the end, in which we’re given a new piece of information about the frame story—not present in the original novel—in the clumsiest way imaginable. For the most part, however, it’s fun to watch, and occasionally a blast. Somewhat to my surprise, my favorite sequences were the ones directed by Tykwer, an unreliable director who also offered up one of the best action scenes in recent years with the Guggenheim shootout in The International: he gives the Louisa Rey narrative a nice ’70s conspiracy feel, and the story of Timothy Cavendish, which I thought was unnecessary in the novel, turns out to be the most entertaining of all. (A lot of this is due to the presence of Jim Broadbent, who gives the best performance in the movie, and one of the few not hampered by elaborate but frequently distracting makeup.) The Wachowskis can’t do much with the journal of Adam Ewing, but the futuristic ordeal of Sonmi-451 is right in their wheelhouse. It’s a movie that takes great risks and succeeds an impressive amount of the time. And as far as I’m concerned, the curse is broken. At least for now.
In her fascinating New Yorker profile of the author Hilary Mantel, Larissa MacFarquhar writes: “What kind of person writes fiction about the past? It is helpful to be acquainted with violence, because the past is violent.” Of course, the present can be violent, too, along with many of our visions of the future, and one of the hardest things about becoming a writer, at least for me, is coming to terms with the depiction, meaning, and significance of fictional violence. I’m about as mild-mannered a personality as they come, but I’ve found myself working in a genre utterly predicated on the anticipation of violence and the occasional violent payoff. Both The Icon Thief and City of Exiles have high body counts, and Eternal Empire doesn’t seem likely to break the pattern. There’s a moment fairly early in my third novel in which an innocent person meets an unfortunate fate, and several readers, including my wife, have pointed to this scene as particularly shocking. But I can’t see any way around it. For the narrative stakes here to have any meaning, the reader needs to know that no one is safe. As a result, although I’m a fairly cerebral novelist at heart, I’ve ended up writing about violence more than I ever expected.
I’ve been thinking about this a lot ever since finally finishing Cormac McCarthy’s Blood Meridian, which had been on my short list of novels to read for a long time. Until now, the only McCarthy I’d read was The Road, another devastatingly bleak and violent novel, but Blood Meridian goes much further: it’s essentially an epic meditation on violence, with more atrocities per page than any other story I can name. It’s a challenging book, hugely readable from paragraph to paragraph but often wearying as a whole, which is an inextricable part of McCarthy’s conception. The violence here is deliberately depicted without the usual payoffs: the narrative doesn’t build in any conventional sense, but instead carries the reader along through sheer rhetorical and symbolic power. This isn’t an unstructured novel by any means, but the structure is paratactic rather than periodic—the plot doesn’t advance so much as proceed inexorably from one bloody set piece to the next. McCarthy’s strategy is to give us as little context as possible: the book is based on real but largely forgotten historical events, and its characters spend much of the story moving through vast, featureless deserts where even the constellations cease to take their familiar shapes.
I’m not especially interested in delving into the allegorical depths of McCarthy’s story, and I’m not even sure he has much to say about the role of violence in American history, any more than Kubrick does in The Shining. As a writer, I’m more inclined to consider the book on its most fundamental level, as the work of a novelist of formidable gifts confronting the narrative problem of violence. I’ve mentioned McCarthy’s language, which, like all great styles, is often vulnerable to parody. Its central strength, however, and the one that unites its many registers of tone—from brutal realism to Biblical sonority—is its specificity. McCarthy delights in arcane but evocative proper names for animals, weaponry, landscape, and it’s hard for the reader not to be caught up in that spell:
…all about in that circle attended companies of lesser auxiliaries routed forth into the inordinate day, small owls that crouched silently and stood from foot to foot and tarantulas and solpugas and vinegarroons and the vicious mygale spiders and beaded lizards with mouths black as a chowdog’s, deadly to man, and the little desert basilisks that jet blood from their eyes and the small sandvipers like seemly gods, silent and the same, in Jedda, in Babylon.
And this specificity is central to the novel’s treatment of violence. McCarthy shows us horrible things, but he’s no more or less specific in the details of mass scalpings than he is in describing the way a man’s shadow looks against a rockface, and it’s all part of the same rich narrative fabric. (This is why the idea of a film adaptation is so daunting: it would take a director of genius to find a visual equivalent for McCarthy’s prose, without resorting to the sorry compromise of voiceover.)
In the end, however, McCarthy, like Shakespeare, is most evocative when he falls silent. After more than three hundred pages of detailed bloodshed, the book ends, famously, on a note of ambiguity: the final encounter with Judge Holden is left to our imagination, and rightly so—it’s one of the most terrifying moments I’ve ever read in a book, the embodiment of the fear that our childhood nightmares will still be waiting for us, when we least expect it, years or decades after we thought we left them behind. And the moment when the author finally turns his eyes away wouldn’t be nearly as effective if the entire novel hadn’t methodically demonstrated that McCarthy can describe violence as well as any writer who has ever lived. The most frightening image in the world, as Stephen King once observed, is a closed door. McCarthy knows this, too, and it’s a measure of his shrewdness that as soon as that door opens, it closes immediately—and shoots the wooden barlatch home, locking us out of what follows. And although I have no way of knowing this, I suspect that McCarthy wrote all of Blood Meridian with that closed door in mind, systematically showing us everything to prepare us for the moment when he shows us nothing, leaving us with that one simple sentence: “Then he opened the door and looked in.”
If you didn’t have a sense of how people spoke, you didn’t know them well enough, and so you couldn’t—you shouldn’t—tell their story. The way people spoke, in short, clipped phrases or long, flowing rambles, revealed so much about them: their place of origin, their social class, their temperament, whether calm or angry, warm-hearted or cold-blooded, foulmouthed or polite; and, beneath their temperament, their true nature, intellectual or earthy, plainspoken or devious, and, yes, good or bad.
Back in June, when it was first revealed that Jonah Lehrer had reused some of his own work without attribution on the New Yorker blog, an editor for whom I’d written articles in the past sent me an email with the subject line: “Mike Daisey…Jonah Lehrer?” When he asked if I’d be interested in writing a piece about it, I said I’d give it a shot, although I also noted: “I don’t think I’d lump Lehrer in with Daisey just yet.” And in fact, I’ve found myself writing about Lehrer surprisingly often, in pieces for The Daily Beast, The Rumpus, and this blog. If I’ve returned to Lehrer more than once, it’s because I enjoyed a lot of his early work, was mystified by his recent problems, and took a personal interest in his case because we’re about the same age and preoccupied with similar issues of creativity and imagination. But with the revelation that he fabricated quotes in his book and lied about it, as uncovered by Michael C. Moynihan of Tablet, it seems that we may end up lumping Lehrer in with Mike Daisey after all. And this makes me very sad.
What strikes me now is the fact that most of Lehrer’s problems seem to have been the product of haste. He evidently repurposed material on his blog from previously published works because he wasn’t able to produce new content at the necessary rate. The same factor seems to have motivated his uncredited reuse of material in Imagine. And the Bob Dylan quotes he’s accused of fabricating in the same book are so uninteresting (“It’s a hard thing to describe. It’s just this sense that you got something to say”) that it’s difficult to attribute them to calculated fraud. Rather, I suspect that it was just carelessness: the original quotes were garbled in editing, compression, or revision, with Lehrer forgetting where Dylan’s quote left off and his own paraphrase began. A mistake entered one draft and persisted into the next until it wound up in the finished book. And if there’s one set of errors like this, there are likely to be others—Lehrer’s mistakes just happened to be caught by an obsessive Dylan fan and a very good journalist.
Such errors are embarrassing, but they aren’t hard to understand. I’ve learned from experience that if I quote something in an article, I’d better check it against the source at least twice, because all kinds of gremlins can get their claws into it in the meantime. What sets Lehrer’s example apart is that the error survived until the book was in print, which implies an exceptional amount of sloppiness, and when the mistake was revealed, Lehrer only made it worse by lying. As Daisey recently found out, it isn’t the initial mistake that kills you, but the coverup. If Lehrer had simply granted that he couldn’t source the quote and blamed it on an editing error, it would have been humiliating, but not catastrophic. Instead, he spun a comically elaborate series of lies about having access to unreleased documentary footage and being in contact with Bob Dylan’s management, fabrications that fell apart at once. And while I’ve done my best to interpret his previous lapses as generously as possible, I don’t know if I can do that anymore.
In my piece on The Rumpus, I said that Lehrer’s earlier mistakes were venial sins, not mortal ones. Now that he’s slid into the area of mortal sin—not so much for the initial mistake, but for the lies that followed—it’s unclear what comes next. At the time, I wrote:
Lehrer, who has written so often about human irrationality, can only benefit from this reminder of his own fallibility, and if he’s as smart as he seems, he’ll use it in his work, which until now has reflected wide reading and curiosity, but not experience.
Unfortunately, this is no longer true. I don’t think this is the end of Lehrer’s story: he’s undeniably talented, and if James Frey, of all people, can reinvent himself, Lehrer should be able to do so as well. And yet I’m afraid that there are certain elements of his previous career that will be closed off forever. I don’t think we can take his thoughts on the creative process seriously any longer, now that we’ve seen how his own process was so fatally flawed. There is a world elsewhere, of course. And Lehrer is still so young. But where he goes from here is hard to imagine.
Last week, The Rumpus published an essay I’d written about Jonah Lehrer, the prolific young writer on science and creativity who had been caught reusing portions of previously published articles on his blog at The New Yorker. I defended Lehrer from some of the more extreme charges—for one thing, I dislike the label “self-plagiarism,” which misrepresents what he actually did—and tried my best to understand the reasons behind this very public lapse of judgment. And while only Lehrer really knows what he was thinking, I think it’s fair to conclude, as I do in my essay, that his case is inseparable from the predicament of many contemporary writers, who are essentially required to become nonstop marketers of themselves. The acceleration of all media has produced a ravenous appetite for content, especially online, forcing authors to run a Red Queen’s race to keep up with demand. And when a writer is expected to blog, publish articles, give talks, and produce new books on a regular basis, it’s no surprise if the work starts to suffer.
The irony, of course, is that I’m just as guilty of this as anyone else. I think of myself primarily as a novelist, but over the past couple of years, I’ve found myself wearing a lot of different hats. I blog every day. I work as hard as possible to get interviews, panel discussions, and radio appearances to talk about my work. I’ve been known to use Twitter and Facebook. And I publish a lot of nonfiction, up to and including my essay at The Rumpus itself. I do it mostly because I like it—and I like getting paid for it when I can—but I also do it to get my name out there, along with, hopefully, the title of my book. I suspect that a lot of other writers would say the same thing, and that few guest reviews, essays, or opinion pieces are ever published without some ulterior motive on the part of the author, especially if that author happens to have a novel in stores. And while I think that most readers are aware of this, and adjust their perceptions accordingly, it’s also worth asking what this does to the writer’s own work.
The process of marketing puts any decent writer in a bind. To become a good novelist, you need to develop a skill set centered on solitude and introversion: you have to be physically and emotionally capable of sitting at a desk, alone, without distraction, for weeks or months at a time. The instant your novel comes out, however, you’re suddenly expected to develop the opposite set of skills, becoming extroverted, gregarious, and willing to invest huge amounts of energy into selling yourself in public. Very few writers, aside from the occasional outlier like Gore Vidal or Norman Mailer, have ever seemed comfortable in both roles, which create a real tension in a writer’s life. As I note in my article on Lehrer, the kind of routine required of most mainstream authors these days is antithetical to the kind of solitary, unrewarding activity needed for real creative work. Creativity requires uninterrupted time, silence, and the ability to concentrate on one problem to the exclusion of everything else. Marketing yourself at the same time is more like juggling, or, even better, like spinning plates, with different parts of your life receiving more or less attention until they need a nudge to keep them going.
When an author lets one of the plates fall, as Lehrer has done so publicly, it’s reasonable to ask whether the costs of this kind of career outweigh the rewards. I’ve often wondered about this myself. And the only answer I can give is that none of this is worth doing unless the different parts give you satisfaction for their own sake. There’s no guarantee that any of the work you do will pay off in a tangible way, so if you spend your time on something only for its perceived marketing benefits, the result will be cynical or worse. And my own attitudes about this have changed over time. This blog began, frankly, as an attempt to build an online audience in advance of The Icon Thief, but after blogging every day for almost two years, it’s become something much more—a huge part of my identity as a writer. The same is true, I hope, of my essays and short fiction. No one piece counts for much, but when I stand back and take them all together, I start to dimly glimpse the shape of my career. I wouldn’t have done half of this without the imperatives of the market. And for that, weirdly, I’m grateful.
The writing impulse seeks its own level and isn’t always given a chance to find it. You can’t make up your mind in Comp Lit class that you’re going to be a Russian novelist. Or even an American novelist. Or a poet. Young writers find out what kinds of writers they are by experiment. If they choose from the outset to practice exclusively a form of writing because it is praised in the classroom or otherwise carries appealing prestige, they are vastly increasing the risk inherent in taking up writing in the first place. It is so easy to misjudge yourself and get stuck in the wrong genre. You avoid that, early on, by writing in every genre. If you are telling yourself you’re a poet, write poems. Write a lot of poems. If fewer than one work out, throw them all away; you’re not a poet. Maybe you’re a novelist. You won’t know until you have written several novels…
I have always thought that Ben Jonson must have had young writers in mind when he said, “Though a man be more prone and able for one kind of writing than another, yet he must exercise all.” Gender aside, I take that to be a message to young writers.
Over the past few days, I’ve been devouring the book Thinking, Fast and Slow by Nobel laureate Daniel Kahneman, which I’d mentioned here before but only recently got around to reading. It is, as promised, rife with fascinating insights and stories—my wife says that I seem to have underlined every sentence—and I’m still only halfway through. In particular, Chapter 17, “Regression to the Mean,” is one that everyone should read, even if you do it standing up at Barnes & Noble. The chapter is only ten pages long, but it’s packed with more useful insights than a shelf of ordinary books, and I can all but guarantee that it will subtly change the way you think about a lot of things. The key passage, at least to my eyes, is one that begins with Kahneman sharing what he calls his favorite equation:
Success = talent + luck
Great success = a little more talent + a lot of luck
This is something that most of us know intuitively, but Kahneman takes it one step further. Basically, if we accept the premise that a single instance of exceptionally good performance is due largely to luck—or, more precisely, to positive factors outside the performer’s control—then our best guess about the next performance is that it won’t be quite as good, as the performer’s luck regresses to the mean. We can’t predict anything about luck except for the fact that, in general, it will be more or less average. As a result, someone who has excellent luck on one occasion, like an athlete who makes a great ski jump, will probably only have average luck the next time out—and the better the original performance, the more extreme the regression will be. And while we might be tempted to ascribe all kinds of causal factors to the change, it’s really nothing but simple mathematics.
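If you’d like to see the arithmetic for yourself, here’s a minimal simulation of Kahneman’s equation—each performance is modeled as a fixed talent plus random luck, with all the specific numbers invented for illustration:

```python
# Success = talent + luck, simulated: the best performer in round one
# is usually riding good luck, so the same performer's round-two score
# tends to fall back toward the mean, with no causal story required.
import random

random.seed(42)
talents = [random.gauss(100, 10) for _ in range(1000)]

def perform(talent):
    return talent + random.gauss(0, 15)  # luck looms large

round1 = [perform(t) for t in talents]
round2 = [perform(t) for t in talents]

best = max(range(len(talents)), key=lambda i: round1[i])
print(f"Best round-one score: {round1[best]:.1f}")
print(f"Same performer, round two: {round2[best]:.1f}")  # almost always lower
```

Run it a few times with different seeds and the pattern holds: the spectacular outlier regresses, not because anything about the performer changed, but because the luck was never going to last.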
This is obviously true of sports, given the important role that luck plays in most sporting events, but it’s also fascinating to think about its implications for the arts. In particular, regression to the mean is the most likely explanation for what I call “the New Yorker feature curse” in my recent article in Salon. When we interview movie stars or directors based on a recent great success, it’s likely that we’ve caught them just before they regress to the mean, which is why their next project—the one we’ve spent the entire article extolling—often seems like a relative disappointment. And this has nothing to do with the talent of the subjects involved. The movies are such a volatile business that even successful filmmakers can only be expected to succeed perhaps half the time, so it shouldn’t be surprising when a big success is followed by a movie that seems like a failure in comparison, and vice versa. For a particularly stark example, one need look no further than the recent career of Woody Allen, who, in Match Point, had a character say:
The man who said “I’d rather be lucky than good” saw deeply into life. People are afraid to face how great a part of life is dependent on luck. It’s scary to think so much is out of one’s control. There are moments in a match when the ball hits the top of the net, and for a split second, it can either go forward or fall back. With a little luck, it goes forward, and you win. Or maybe it doesn’t, and you lose.
And this applies to literature as well. If athletes have the Sports Illustrated cover jinx and directors have the New Yorker curse, novelists have second-novel syndrome: the big debut novel followed by a sophomore slump. We like to ascribe all kinds of causal explanations to this—pressure, time constraints, authorial self-indulgence—but most often, it’s just another case of regression to the mean. Luck, as I’ve learned firsthand, plays an enormous role in a book’s publication and reception, and it’s mathematically unsound to expect lightning to strike twice. This is true, most obviously, of a book’s commercial prospects, but also, oddly, of its artistic merits. Luck plays a larger role in a novel’s quality than many of us would like to admit: like ski jumpers and golfers, we benefit from moments of serendipity and inspiration that may never return. Until, of course, we try again.
One way you can describe the collapse of the idea of the future is the collapse of science fiction. Now it’s either about technology that doesn’t work or about technology that’s used in bad ways. The anthology of the top twenty-five sci-fi stories in 1970 was, like, “Me and my friend the robot went for a walk on the moon,” and in 2008 it was, like, “The galaxy is run by a fundamentalist Islamic confederacy and there are people who are hunting planets and killing them for fun.”
A few years ago, I woke up with the startling realization that of all my friends from college, I was by far the least educated. I don’t mean that in any kind of absolute sense, but simply as a matter of numbers: most of my college friends went on to get master’s or professional degrees, and many of them have gone much further. By contrast, I, who loved college and would happily have spent the rest of my life in Widener Library, took my bachelor’s degree and went looking for a job, with the idea that I’d go back to school at some point after seeing something of the larger world. The reality, of course, was very different. And while I don’t regret any of the choices I’ve made, I do sometimes wonder if I might have benefited from, or at least enjoyed, some sort of postgraduate education.
Of course, it’s also possible that even my bachelor’s degree was a bad investment, a sentiment that seems increasingly common these days. College seniors, we’re frequently reminded, are graduating into a lousy job market. As Louis Menand points out in this week’s New Yorker, it’s unclear whether the American college system is doing the job it’s intended to do, whether you think of it primarily as a winnowing system or as a means of student enrichment. And then we have the controversial Thiel Fellowship, which is designed to encourage gifted entrepreneurs to drop out of college altogether. One of the fellowship’s first recipients recently argued that “higher education is broken,” a position that might be easier to credit if he weren’t nineteen years old and hadn’t just received a $100,000 check to drop out of school. Which doesn’t necessarily make him wrong.
More interesting, perhaps, is the position of David Mamet, whose new book The Secret Knowledge includes a remarkable jeremiad against the whole idea of a liberal education. “Though much has been made of the necessity of a college education,” Mamet writes, “the extended study of the Liberal Arts actually trains one for nothing.” Mamet has said this before, most notably two years ago in a speech at Stanford University, where he compared the process of higher education to that of a laboratory rat pulling a lever to get a pellet. Of course, he’s been saying the same thing for a long time with respect to the uselessness of education for playwrights (not to mention ping-pong players). And as far as playwrights are concerned, I suspect he may be right, although he gets into trouble when he tries to expand the argument to everyone else.
So is college useful? In particular, is it useful for aspiring members of the creative class? Anecdotal information cuts both ways: for every Tom Stoppard, who didn’t go to college at all, there’s an Umberto Eco, who became a famous novelist after—and because of—a lifetime of academic achievement. Considered objectively, though, the answer seems to lie somewhere in the middle. In Origins of Genius, Dean Simonton writes:
Indeed, empirical research has often found that achieved eminence as a creator is a curvilinear, inverted-U function of the level of formal education. That is, formal education first increases the probability of attaining creative success, but after an optimum point, additional formal education may actually lower the odds. The location of this peak varies according to the specific type of creativity. In particular, for creators in the arts and humanities, the optimum is reached in the last two years of undergraduate instruction, whereas for scientific creators the optimum may be delayed until the first couple of years of graduate school. [Italics mine.]
Which implies that a few years of higher education is useful for artists, since it exposes them to interesting people and gives them a basic level of necessary knowledge, but that too much is unhelpful, or even damaging, if it encourages greater conformity. The bottom line, not surprisingly, is that if you want to be a writer, yes, you should probably go to college. But that doesn’t mean you need to stay there.
[George R.R. Martin, author of A Song of Ice and Fire] enjoys being surprised by his own work. He thinks of himself as a “gardener”—he has a rough idea of where he’s going but improvises along the way. He sometimes fleshes out only as much of his imaginary world as he needs to make a workable setting for the story. Tolkien was what Martin calls an “architect.” Tolkien created entire languages, mythologies, and histories for Middle-earth long before he wrote the novels set there. Martin told me that many of his fans assume that he is as meticulous a world-builder as Tolkien was. “They write to say, ‘I’m fascinated by the languages. I would like to do a study of High Valyrian’”—an ancient tongue. “‘Could you send me a glossary and a dictionary and the syntax?’ I have to write back and say, ‘I’ve invented seven words of High Valyrian.’”
—Laura Miller, in The New Yorker
Yesterday I finally got around to reading Rebecca Mead’s New Yorker piece on Middlemarch—by all odds the most intelligent novel ever written—and its influence on her own life. If you’re a subscriber, Mead’s article is well worth reading in full (especially for her discussion of an inspirational quotation inexplicably misattributed to George Eliot, a subject on which I have some strong opinions), but I was struck in particular by her thoughts on how her attitudes toward the book have changed over time. Mead writes:
I have gone back to Middlemarch every five years or so, my emotional response to it evolving at each revisiting. In my judgmental twenties, I thought that Ladislaw, with his brown curls and his callow artistic dabbling, was not entirely deserving of Dorothea; by forty, I could better measure the appeal of his youthful energies and Byronic hairdressing, at least to his middle-aged creator, who was fifty-three when the book was published.
This, of course, is the measure of a great work of art: its ability to reveal new perspectives as we approach it at different times in our lives. Most of us, I imagine, have a book or movie or album that serves as a similar sort of milestone, with our evolving feelings toward it charting how much we ourselves have changed. For Roger Ebert, it’s Fellini’s La Dolce Vita. In an article first published two years ago, he writes:
In 1962, Marcello Mastroianni represented everything I dreamed of attaining…Ten years later, he represented what I had become, at least to the degree that Chicago offered the opportunities of Rome. Ten years after that, in 1982, he was what I had escaped from, after I stopped drinking too much and burning the candle at both ends.
And now Ebert has left the movie behind entirely. Recently, he wrote movingly of the fact that he will no longer be able to discuss the film shot by shot at the Conference on World Affairs at Boulder, as he’s done on four separate occasions, and concludes:
Well, now I’ve outlasted Marcello. I’ve come out the other side. He is still standing on the beach, unable to understand the gestures of the sweet blond girl who was his waitress at the restaurant, that day he was going to start his novel. He shakes his head resignedly and turns to walk back into the trees and she looks after him wistfully. I am in the trees with Marcello.
As for the equivalent work in my own life, I’m tempted to say that it’s the Pet Shop Boys album Actually, which has slipped imperceptibly from the imagined soundtrack of my adulthood to a reminder of a period I’ve already left behind. Or perhaps it’s The Phantom Tollbooth, which has evolved, as I’ve grown older, from escapist fantasy to handbook for adult life to the book that I’m most looking forward to giving to my own children. I suspect, though, that it might actually be Citizen Kane, which I once saw as a challenge and call to art, and which currently seems—now that I’m five years older than Welles was—more like a warning, or a rebuke. Or perhaps all of the above. What about you?
Daniel Zalewski’s recent New Yorker piece on Guillermo del Toro, director of Pan’s Labyrinth and the Hellboy movies, is the most engaging profile I’ve read of any filmmaker in a long time. Much of this is due to the fact that del Toro himself is such an engaging character: enthusiastic and overweight, he’s part auteur and part fanboy, living in a house packed with ghouls and monsters, including many of the maquettes from his own movies. And the article itself is equally packed with insights into the creative process. On creature design:
Del Toro thinks that monsters should appear transformed when viewed from a fresh angle, lest the audience lose a sense of awe. Defining silhouettes is the first step in good monster design, he said. “Then you start playing with movement. The next element of design is color. And then finally—finally—comes detail. A lot of people go the other way, and just pile up a lot of detail.”
On Ray Harryhausen:
“He used to say, ‘Whenever you think of a creature, think of a lion—how a lion can be absolutely malignant or benign, majestic, depending on what it’s doing. If your creature cannot be in repose, then it’s a bad design.’”
And in an aside that might double as del Toro’s personal philosophy:
“In emotional genres, you cannot advocate good taste as an argument.”
Reading this article makes me freshly mourn the fact that del Toro won’t be directing The Hobbit. I like Peter Jackson well enough, but part of me feels that if del Toro had been allowed to apply his practical, physical approach to such a famous property—much as Christopher Nolan did with the effects in Inception—the history of popular filmmaking might have been different. As it stands, I can only hope that Universal gives the green light to del Toro’s adaptation of At the Mountains of Madness, a prospect that fills me with equal parts joy and eldritch terror. Judging from what I’ve heard so far, it sounds like del Toro is planning to make the monster movie to end all monster movies. Let’s all hope that he gets the chance.