Over the last few weeks, I’ve been rereading The Dogs of War by Frederick Forsyth, my favorite suspense novelist. I’ve mentioned before that Forsyth is basically as good as it gets, and that he’s the writer I turn to the most these days in terms of pure enjoyment: he operates within a very narrow range of material and tone, but on those terms, he always delivers. Reading The Dogs of War again was a fascinating experience, because although it takes place in the world of mercenaries and other guns for hire, it contains surprisingly little action—maybe thirty pages’ worth over the course of four hundred dense pages. The rest of the novel is taken up by an obsessively detailed account of how, precisely, a privately funded war might be financed and equipped, from obtaining weapons to hiring a ship to acquiring the necessary supply of shirts and underwear. And although the amount of information is sometimes overwhelming, it’s always a superlatively readable book, if only because Forsyth is a master of organization and clarity.
Of course, it also works because it’s fun to learn about these things. The Dogs of War is perhaps the ultimate example of the kind of fiction that Anthony Lane, speaking of Allan Folsom’s The Day After Tomorrow, has dismissed as “not so much a novel as a six-hundred-page fact sheet with occasional breaks for violence.” Yet the pleasure we take in absorbing a few facts while reading a diverting thriller is perfectly understandable. Recently, I saw a posting on a social news site from a commenter who said that he didn’t read much, but was looking for novels that would teach him some things while telling an interesting story. I pointed him toward Michael Crichton, who is one of those novelists, like Forsyth, whose work has inspired countless imitators, but who remains the best of his breed. This kind of fiction is easy to dismiss, but conveying factual information to a reader is like any other aspect of writing: when done right, it can be a source of considerable satisfaction. In my own novels, I’ve indulged in such tidbits as how to build a handheld laser, how to open a Soviet weapons cache, and what exactly happened at the Dyatlov Pass.
That said, like all good things, the desire to satisfy a reader’s craving for information can also be taken too far. I’ve spoken elsewhere about the fiction of Irving Wallace, who crams his books with travelogues, dubious factoids, and masses of undigested research—along with a few clinical sex scenes—until whatever narrative interest the story once held is lost. And my feelings about Dan Brown are a matter of record. Here, as in most things, the key is balance: information can be a delight, but only in the context of a story that the reader finds engaging for the usual reasons. Its effectiveness can also vary within the work of a single author. Forsyth is great, but the weight of information in some of his later novels can be a little deadening; conversely, I’m not a fan of Tom Clancy, and gave up on The Cardinal of the Kremlin after struggling through a few hundred pages, but I found Without Remorse to be a really fine revenge story, hardware and all. The misuse of factual information by popular novelists has given it a bad reputation, but really, like any writing tool, it just needs to be properly deployed.
And it’s especially fascinating to see how this obsession with information—in a somewhat ambivalent form—has migrated into literary fiction. It’s hard to read Thomas Pynchon, for instance, without getting a kick from his mastery of everything from Tarot cards to aeronautical engineering, and James Wood points out that we see much the same urge in Jonathan Franzen:
The contemporary novel has such a desire to be clever about so many elements of life that it sometimes resembles a man who takes too many classes that he has no time to read: auditing abolishes composure. Of course, there are readers who will enjoy the fact that Franzen fills us in on campus politics, Lithuanian gangsters, biotech patents, the chemistry of depression, and so on…
Yet Franzen, like Pynchon, uses voluminous research to underline his point about how unknowable the world really is: if an author with the capacity to write limericks about the vane servomotor feels despair at the violent, impersonal systems of which we’re all a part, the rest of us don’t stand a chance. Popular novelists, by contrast, use information for the opposite reason, to flatter us that perhaps we, too, would make good mercenaries, if only we knew how to forge an end user certificate for a shipment of gun parts in Spain. In both cases, the underlying research gives the narrative a credibility it wouldn’t otherwise have. And the ability to use it correctly, according to one’s intentions, is one that every writer could stand to develop.
I tried to imagine a fella smarter than myself. Then I tried to think, “What would he do?”
—David Mamet, Heist
Writers, by definition, are always trying to punch above their weight. When you sit down to write a novel for the first time, you’re almost comically inexperienced: however many books you’ve read or short stories you’ve written, you still don’t know the first thing about structuring—or even finishing—a long, complicated narrative. Yet we all do it anyway. This is partly thanks to the irrational optimism that I’ve said elsewhere is a crucial element of any writer’s psychological makeup, in which we’re inclined to believe that we’re smarter and more prepared than we actually are. There’s nothing wrong with this; it’s the only way any of us will ever grow as writers, as we slowly grow into the level of competence we’ve imagined for ourselves. Still, in any project, there always comes a time when a writer, however experienced, realizes that he’s taken on more than he can handle. The story is there, unwritten, and it’s beautiful in his head, but lost in the translation to the printed page. One day, he hopes, he’ll be good enough to realize it, but that doesn’t help him now. What he really needs is a way to temporarily become a better writer than he already is.
This may sound like witchcraft, but in reality, it’s something that writers do all the time. When we start out, we have no choice but to imitate the artists we admire, because when we set out to write that first page, we lack the experience of life and craft that only years of work can bring. Eventually, we move past imitation to find a voice and style of our own, but there are still times when we find ourselves compelled to channel the spirit of our betters. We do this when we start each day by reading a few pages from the work of a writer we like, or when we approach a tough moment in the plot by asking ourselves what Updike or Thomas Harris in his prime would do. Some of us go even further. In this week’s issue of The New Yorker, James Wood talks about a friend who became so obsessed by the work of the Norwegian writer Per Petterson that he copied out one of his novels word for word. This isn’t about stylistic plagiarism or slavish imitation, but a kind of sympathetic magic, a hope that we can conjure up the spirit of a more experienced writer just long enough to solve the problems in front of us.
And the act of imitation itself can lead to surprising places. There’s a great deleted scene from the notorious documentary The Aristocrats in which Kevin Pollak delivers the titular joke in the style of Albert Brooks. After milking it for two delicious minutes, he takes a sip of coffee and says:
That’s the trippy thing about doing Brooks, though—I’m faster and funnier than I am as myself. It’s very, very sad. It’s a possession. I hate to do it because, literally, I’m listening to myself and thinking, “Why am I never this funny?”
I’m not a huge Kevin Pollak fan, but I love this clip, because it gets at something important and mysterious about the way artistic imitation works. Pollak is a skilled mimic who does a good, if not great, impression of Albert Brooks on all the superficial levels—his vocal tics, his tone, the way he holds his face and body. Somewhere along the line, though, these surface impressions work a deeper transformation, and he finds himself temporarily thinking like Brooks. This is why typing out the work of a writer we admire can be so helpful: there’s no better way of opening a window, even just for a crucial moment or two, into someone else’s brain.
The best kind of imitation, as Pollak says, is a possession, in which we will ourselves, almost unconsciously, into becoming better artists than we really are. Imitation can become dangerous, however, when we focus on the superficial without also channeling more fundamental habits of mind. This morning, while watching the new teaser trailer for Star Trek Into Darkness, which clearly takes many of its cues from the recent films of Christopher Nolan, I was amused by the thought that while Nolan has done more than any contemporary director to push the envelope of visual and narrative complexity in mainstream movies, the big takeaway for other filmmakers—or at least those who assemble the trailers—has apparently been a “BWONG” sound effect. But big influences can arise from small beginnings. The qualities that most deserve imitation in the artists we admire have little to do with the obvious trademarks of their style, and if we imitate those aspects alone, we’re just being derivative. But sometimes it’s those little things that allow us to temporarily acquire the mindset of smarter artists than ourselves, until, finally, we’ve made it our own.
It’s taken me a long time to get around to le Carré. As I noted in my review of the recent movie adaptation of Tinker, Tailor, Soldier, Spy, my interest in his great subject—the psychology and culture of spycraft—has always been limited at best, so his books can seem forbiddingly hermetic to a reader like me. A writer like Frederick Forsyth, whom I admire enormously, does a nice job of balancing esoteric detail with narrative thrills, while le Carré, although he’s an ingenious plotter, deliberately holds back from the release of action for its own sake. The difference, perhaps, is that Forsyth was a journalist, while le Carré worked in intelligence himself, which accounts for much of the contrast in their work—one is a great explainer and popularizer, so that his books read like a men’s adventure novel and intelligence briefing rolled into one, while the other is all implication. As a result, while I’ve devoured most of Forsyth’s novels, I’ve tried and failed to get into le Carré more than once, and it’s only recently that I decided to remedy this situation once and for all.
Because there’s an important point to be made about le Carré’s reticence, which is that it ultimately feels more convincing, and lives more intriguingly in the imagination, than the paragraph-level thrills of other books. In interviews, le Carré has noted that many of the terms of spycraft that fill his novels were invented by himself, and weren’t actually used within MI6. This hardly matters, because a reader encountering this language for the first time—the lamplighters, the scalphunters, the janitors—has no doubt that this world is authentic. Forsyth, by contrast, stuffs his books with detail, nearly all of it compelling, but always with the sense that much of this information comes secondhand: we applaud the research, but don’t quite believe in the world. With le Carré, we feel as though we’re being ushered into a real place, sometimes tedious, often opaque, with major players glimpsed only in passing. And even if he’s inventing most of it, it’s still utterly persuasive.
This is the great strength of Tinker, Tailor, Soldier, Spy, which I finished reading this week. Le Carré is the strongest stylist in suspense fiction, and this book is a master class in the slow accumulation of detail and atmosphere. Sometimes we aren’t quite sure what is taking place, either because of the language of spycraft or the density of Britishisms—“a lonely queer in a trilby exercising his Sealyham”—but there’s never any break in the fictional dream. It’s a book that demands sustained engagement, that resolutely refuses to spell out its conclusions, and that always leaves us scrambling to catch up with the unassuming but formidable Smiley. In this respect, Tomas Alfredson’s movie is an inspired adaptation: it visualizes a few moments that the novel leaves offstage, but for the most part, it leaves us to swim for ourselves in le Carré’s ocean of names, dates, and faces. (I haven’t seen the classic Alec Guinness version, which I’m saving for when the details of the plot have faded.)
And yet the overall impact is somewhat unsatisfying. Tinker, Tailor is a brilliantly written and constructed novel, but it’s an intellectual experience, not a visceral one. By the end of the book, we’ve come to know Smiley and a handful of others, but the rest are left enigmatic by design, so that the book’s key moment—the revelation of the mole’s identity—feels almost like an afterthought, with no real sense of pain or of betrayal. (The film has many of the same issues, and as I’ve noted before, it gives the game away with some injudicious casting.) This isn’t a flaw, precisely: it’s totally consistent with the book’s tone, which distrusts outbursts of emotion and buries feeling as deep as possible. That air of reserve can be fascinating, but it also leads to what James Wood, for somewhat different reasons, calls le Carré’s “clever coffin”—a narrowness of tone that limits the range of feeling that the work can express, which is often true of even the best suspense fiction. Le Carré’s talent is so great that it inadvertently exposes the limitations of the entire genre, and it’s a problem that we’re all still trying to solve.
Like it or not, authors have to live with other authors. Some may prefer otherwise, and do their best to keep their distance, but most of us end up spending a fair amount of time—in person, in print, and online—interacting with our fellow writers. You can chalk it up to camaraderie, careerism, or the simple sense that there’s no one else with whom we can talk about the things that matter most to us, as well as the knowledge that, for better or worse, we’re going to be collaborating and competing with these people for a long time. As a result, most of us generally avoid criticizing one another’s work, at least in public. Which isn’t to say that writers aren’t neurotic, needy, petty people—most of us certainly are. But while we may secretly begrudge a friend’s success or agree that this year’s big book is a big bore, we generally keep these opinions to ourselves or share them only in private. As a result, only a handful of major novelists—Updike, Vidal, maybe a few others—have also been major critics. It isn’t for lack of intelligence; it’s more out of prudence or caution.
That’s why I don’t agree with Dwight Garner’s recent assertion that Twitter has somehow made writers less willing to criticize one another in public. Most writers have long since concluded, and rightly so, that it isn’t worth the headache. At best, we tend to reserve our critical arrows for those unlikely to be hurt by what we say, or even to read it at all, which is the real reason why the dead, the famous, and the canonized are such tempting targets. But when it comes to writers on our own level, there’s little to gain and much to lose by criticizing them in print. This isn’t omertà, or a gentlemen’s agreement, but a modus vivendi that avoids problems down the line. Even Norman Mailer, no stranger to conflict, came to the same conclusion. Fifty years ago, in his essay “Some Children of the Goddess,” he took potshots at contemporaries like Styron, Salinger, and Roth, and some never forgave him for it. From then on, he avoided criticizing his peers, lobbing his missiles instead at more resilient targets like Tom Wolfe. And if Mailer, of all people, decided that being a critic was more trouble than it was worth, I can’t blame other writers for concluding the same thing.
And yet it’s also a genuine loss. Dave Eggers isn’t wrong when he advises us not to criticize a novel until we’ve written one, or a movie until we’ve made one. There’s no question that we’d avoid a lot of the nonsense written about movies and books—like the idea, for instance, that a director is the sole author of a film, despite all evidence to the contrary—if more criticism were written by people with experience in the creative field in question. As someone who has done a bit of freelancing myself, I can say that while critics can be driven by ambitions and impulses of their own, these are qualitatively different from the process that underlies the creation of any extended, original work of art. Ideally, then, a literary critic would know something about how a novel is put together, with all the compromises, accidents, and bear traps involved—and there’s no one more qualified to do this than working novelists themselves. But for all the reasons I’ve listed above, most writers prefer to keep out of it, especially when it comes to the contemporaries about whom they know the most.
In short, the people best equipped to write intelligently about contemporary literature—the writers themselves—have more than enough reason to stand down, and it isn’t necessarily realistic or fair to expect otherwise. Consequently, our best literary critics have often been those with some experience of creative work who have since thrown in their lot on the critical side, which is how we end up with valuable voices like Edmund Wilson or James Wood, who have written novels of their own but found their true calling elsewhere. This isn’t a perfect solution, but it’s a pretty good one, and I’d much rather be reviewed by a critic who at least knew what writing a publishable novel was like. In the end, though, this will always be an issue for literary criticism, which differs from all other fields in that critics and their subjects use the same tools and draw on the same pool of talent. It makes objectivity, bravery, and expertise in a critic all the more precious. And if you want to know what a writer really thinks of his peers—well, just corner him at a party, and believe me, you’ll get an earful.
Last week, Dwight Garner of the New York Times—arguably one of the two or three most famous literary critics now at work, along with his colleague Michiko Kakutani and The New Yorker’s James Wood—wrote a long opinion piece titled “A Critic’s Case for Critics Who Are Actually Critical.” In it, he decries what he sees as the decline of serious criticism, as well as the hostility toward the role of critics themselves, who are seen, at least by authors, as negative, dismissive, and cruel. To illustrate this view, he quotes a decade-old interview with Dave Eggers, who says:
Do not dismiss a book until you have written one, and do not dismiss a movie until you have made one, and do not dismiss a person until you have met them. It is a fuckload of work to be open-minded and generous and understanding and forgiving and accepting, but Christ, that is what matters. What matters is saying yes.
(Incidentally, Eggers conducted this interview with my old college literary magazine, whose fiction board I joined a few months later. Garner doesn’t quote the interview’s last few lines, which, if I recall correctly, became something of a running joke around the Advocate building for years afterward: “And if anyone wants to hurt me for that, or dismiss me for saying that, for saying yes, I say Oh do it, do it you motherfuckers, finally, finally, finally.”)
Well, Garner finally, finally, finally goes after Eggers, a writer he says he admires, saying that he “deplores” the stance expressed above: “The sad truth about the book world,” Garner writes, “is that it doesn’t need more yes-saying novelists and certainly no more yes-saying critics. We are drowning in them.” What the world really needs, he argues, are uncompromising critics who are willing to honestly engage with works of art, both good and bad, and to be harsh when the situation requires it. He says that the best work of critics like Pauline Kael “is more valuable—and more stimulating—than all but the most first-rate novels.” He points out that any writer who consents for his or her novel to be published tacitly agrees to allow critics to review it however they like. And he bemoans the fact that social media has made it hard for critics to be as honest and hard as they should be. Twitter, he says, has degenerated into a mutual lovefest between authors, and doesn’t allow for anything like real criticism: “On it, negative words have the same effect as a bat flying into a bridal shower.”
The trouble with Garner’s argument, aside from its quixotic attempt to persuade authors to feel kindly toward critics, is that I don’t think it’s factually correct. Garner quotes Jonah Peretti’s observation that “Twitter is a simple service used by smart people,” which isn’t true at all—Twitter, for better or worse, is used by all kinds of people, and when we venture out of our own carefully cultivated circles, we’re treated to the sight of humanity in its purest form, including people who didn’t realize the Titanic was real. The same goes for the comments section of any news or opinion site, which is generally a swamp of negativity. The trouble with social media isn’t that it encourages people to be uncritically positive or negative: it’s that it encourages unconsidered discourse of all kinds. Twitter, by design, isn’t a place for reasoned commentary; at its best, it’s more like a vehicle for small talk. And we shouldn’t judge it by the same standards that we use for other forms of criticism, any more than we should judge guests at a cocktail party for not saying what they really feel about the people around them. That’s also why attempts at criticism on Twitter tend to look uglier than the author may have intended—it’s the nature of the form.
And when we’re dealing with the choice, admittedly not a great one, between uncritical positivity and negativity, I’d have to say that the former is the lesser of two evils. That’s what Eggers is saying in the interview quoted above: he isn’t proposing, as Garner would have it, “mass intellectual suicide,” but an extreme solution to what he rightly sees as an extreme problem, which is the ease with which we can fall back into dismissive snark, a diagnosis he offered long before “snark” had even attained its current meaning. It’s best, of course, to make nuanced, perceptive, complex arguments, but if we don’t have the time for it—and being a good critic takes time—then it’s marginally better, at least for our own souls, to be enthusiastic bores. I’ve argued before, and I still believe, that every worthwhile critic builds his or her work on a foundation of genuine enthusiasm for the art in question. Hard intellectual engagement comes later, as a sort of refinement of joy, and when it doesn’t, that’s the worst kind of intellectual suicide, which disguises itself as its opposite. Dwight Garner is a really good critic. But to get where Garner is now, you need to pass through Eggers first.
Last month, the critic Arthur Krystal published a piece in The New Yorker titled “Easy Writers: Guilty Pleasures Without Guilt.” I’ve held off on talking about this essay until now because even after two readings, I’m not quite sure what Krystal’s point is—he seems to be saying that we think of certain novels as guilty pleasures, but we really shouldn’t, unless perhaps we should—and because Lev Grossman has already done such a fine job of responding in Time. Yet the fact that Krystal felt capable of weighing in on such an ancient debate makes me inclined to share a few of my own disorganized thoughts. (Krystal, incidentally, commits a basic gaffe when he writes: “Preferring Ken Follett’s On Wings of Eagles to Henry James’s Wings of the Dove is not a negligible bias.” This neglects the fact that the Follett book is actually a work of nonfiction that has no place in his discussion of the novel, guilty pleasure or otherwise.)
There are three points I’d like to make. First is the obvious fact, which nonetheless bears repeating, that while our very best novels are properly defined as literary fiction, simply stating that one book, or even a group of books, is “literary” and another is “genre” gives no indication of their relative quality. A literary novel like The Magic Mountain—which, incidentally, cares a great deal about story and suspense—clearly stands head and shoulders above most other novels of any kind, even as paperback smut stands more or less clearly at the bottom. But in the middle is a vast gray area of novels of varying quality, including very great genre fiction and rather trashy literary fiction, and a lot of books that fall somewhere between the two extremes. “Literary” and “genre” aren’t statements of quality, but of intent. And if, by literary fiction, we tend to mean contemporary realism, then we’re talking about a genre with its own formulas and rules, as James Wood has accurately, if smugly, pointed out.
My second point is that these classifications are unfairly skewed, because whenever a genre novelist shows signs of exceptional quality, we immediately promote him into the literary sphere, creating a kind of reverse survivorship bias. My favorite example is Ian McEwan, a great suspense novelist who has been embraced by the literary camp because of the quality of his prose and ideas. Atonement aside, most of McEwan’s books are essentially thrillers—they often end with a home invasion or a man wielding a knife—that happen to be written with impeccable style and intelligence. The same is true of Borges, who writes fantasy and mystery fiction on a higher level than any author in history. To say that they aren’t really part of the genre because they’re so good is to impoverish the genre label, creating a self-fulfilling prophecy. If we automatically exclude all great writers from the category in which they belong, it’s no surprise that the category will start to look a little thin—but that’s only because we’ve defined it that way.
And my last point is that if literary fiction tends to receive certain kinds of recognition that genre fiction does not, this is less out of its inherent quality than a case of simple economics. If we agree that it’s a good thing, in general, to have a steady supply of both genre and literary novels, we need to find nonmonetary ways of encouraging the latter. Genre or mainstream fiction sells better, on the whole, than literary fiction, so a separate, noncommercial system of incentives needs to be set up for the literary side. These include prizes, fellowships, and reviews in prestigious publications. If these were portioned out equally to both sides, the attraction of the literary novel would disappear—which is why giving a National Book Foundation medal to Stephen King was perceived as such a threat. Literary novelists need to feel special, and to be treated as such, because otherwise, there wouldn’t be any at all. And if classifying all other books as guilty pleasures is what literary novels need to survive, well, that’s a price we should be willing to pay.
He is tall and wiry; he has a thin goatee and an earring; he wears a black leather jacket and black leather trousers. He looks older than most students; he looks like trouble.
This description of a character’s appearance comes early in J.M. Coetzee’s Disgrace, one of my favorite novels of recent years. For our purposes, it doesn’t necessarily matter who the character is. (For the record, he’s the thuggish older boyfriend of the student who is having an affair with the novel’s protagonist.) The description isn’t particularly detailed or specific—it sees the character only on the surface, and is really just a record of a first impression—but it more than serves its purpose. We see this character clearly enough to retain a consistent mental image of how he looks, and, more importantly, how he appears to our protagonist. Like just about every sentence in Coetzee’s novel, this is good, concise writing, economical and concrete. Given the character’s significant but ultimately secondary role in the story, that’s probably enough. Or is it?
James Wood would say no. In a pointedly skeptical review of Coetzee’s book—of which he says “It sometimes reads as if it were the winner of an exam whose challenge was to create the perfect specimen of a very good contemporary novel”—Wood uses this particular description as an example of the limits of Coetzee’s tight, compressed style. No real person is ever really adequately described in just a few sentences, Wood argues, and Coetzee’s refusal to look at this character more closely is a sign of authorial coldness, or even resistance to reality. (He says elsewhere that elements of Coetzee’s style “would not be out of place in a mass-market thriller,” which he clearly regards as a devastating insult.) Wood, famously, is a devotee of Saul Bellow, one of the great writers of character descriptions, and when he criticizes Coetzee for not going deep enough, one suspects that he’d rather see a description like this one in Humboldt’s Gift:
Rinaldo was extremely good-looking with a dark furry mustache as fine as mink, and he was elegantly dressed…His nose was particularly white and his large nostrils, correspondingly dark, reminded me of the oboe when they dilated. People so distinctly seen have power over me. But I don’t know which comes first, the attraction or the close observation.
But is there a right or wrong way to describe our characters? The difference between the styles of Coetzee and Bellow—between the concise signifier of appearance and the luxuriant jungle of personal description—strikes me as pretty fundamental, and every writer will tend to come down on one side or another. In my own case, as a writer, yes, of mass-market thrillers, I prefer to describe characters in the compressed Coetzee fashion, allowing the reader to fill in the blanks. This is partly because I think it’s closer to the way we actually tend to see the people around us, in a sort of nonverbal shorthand. When I read the riot of noticing in authors like Bellow or Updike, I’m impressed and delighted, but not quite convinced that this is really how their characters would see the world. And even if I grant the author the freedom to notice things more deeply, a detailed physical description often makes a character seem less real and distinct to me—I have trouble seeing them through the flurry of adjectives.
My own ideal, which isn’t for everyone, is a kind of fictional transparency, with as little as possible interposed between the reader and the story—and if that means I need to stint on specificity for the sake of momentum, that’s a sacrifice I’m willing to make. In The Icon Thief, I devote maybe a sentence to the looks of each main character: I provide a few tags—like Powell’s “thick glasses and alarmingly high forehead”—and trust the reader to supply the rest. And different characters require different approaches, even within the same novel. In The Silence of the Lambs, for instance, Thomas Harris describes Hannibal Lecter at length—his red eyes, his head sleek like a mink’s—but I don’t think there’s a single line of description for Clarice Starling. (“She knew she could look all right without primping” is the most we get.) It’s easy to see why: Lecter is seen from the outside, while we spend most of the novel inside Clarice’s head. And even if we aren’t told how to picture her, she’s still utterly real. Not bad for a mass-market thriller.
Although my life has since taken me in a rather different direction, for a long time, I was convinced that I wanted to be a film critic. My first paying job as a writer was cranking out movie reviews, at fifty dollars a pop, for a now-defunct college website, a gig that happily coincided with the best year for movies in my lifetime. Later, I spent the summer of 2001 writing capsule reviews for the San Francisco Bay Guardian, during a somewhat less distinguished era for film—my most memorable experience was interviewing Kevin Smith about Jay and Silent Bob Strike Back. After college, I tried to get work as a film critic in New York, only to quickly realize that reviewing movies for a print publication is one of the cushier jobs around, meaning that most critics don’t leave the position until they retire or die, and when they do, there’s usually someone in the office—often the television reporter—already waiting in the wings.
In the years since, the proliferation of pop cultural sites on the Internet has led to a mixed renaissance for critics of all kinds: there are more professional reviewers than ever before, but their influence has been correspondingly diluted. Critics have always been distrusted by artists, of course, but these days, they get it from both sides: for every working critic, there are a thousand commenters convinced that they can do a better job, and the rest of us are often swayed less by the opinions of individual writers than the consensus on Rotten Tomatoes, which is a shame. At its best, a critic’s body of work is a substantial accomplishment in its own right, and personalities as dissimilar as those of Pauline Kael, Roger Ebert, and David Thomson—speaking only of film, which is the area I know best—have created lasting legacies in print and online. And while the critical profession is still in a period of transition, the elements of great criticism haven’t changed since the days of James Agee, or even Samuel Johnson.
So what makes a good critic? Knowledge of the field, yes; enthusiasm for art, most definitely. (A critic without underlying affection for his chosen medium, or who sees it only as an excuse for snark, isn’t good for much of anything.) Above all else, good criticism requires a curious mixture of the objective and the subjective. A critic needs to be objective enough to evaluate a work of art on its own terms—to review the work that the creator wanted to make, not the one that the critic wishes had been made instead—while also acknowledging that all good reviews are essentially autobiographical. Ebert has noted that his own criticism is written in the first person, and the most enduring critics are those who write, not as an authority delivering opinions from on high, but as someone speaking to an intelligent friend. As a result, the collected works of critics like Ebert and Kael are the closest things we have these days to books that seem like living men or women, like Montaigne’s essays or The Anatomy of Melancholy. “Cut these words,” as Emerson said of Montaigne, “and they would bleed.”
Surveying the current crop of writers on the arts, my sense is that while we have many gifted critics, most of them fall short in one way or another. A critic like Anthony Lane, for all his intelligence, tends to treat the subject under consideration as an excuse for an arch bon mot (as with Star Trek: First Contact: “If you thought the Borg were bad, just wait till you meet the McEnroe.”) And while his wit can be devastating when aimed at the right target—The Da Vinci Code, for instance, or the occupants of the New York Times bestseller list—it often betrays both too much self-regard and a lack of respect for the work itself. On the literary side, James Wood has a similar problem: he’s a skilled parodist and mimic, but surely not every review obliges him to show off with one of his self-consciously clever pastiches. (If I were Chang-rae Lee, I’d still be mad about this.) The writers of the A.V. Club are more my style: in their pop cultural coverage, especially of television, they’ve struck a nice balance between enthusiasm, autobiography, and reader engagement. But I’m always looking for more. Which critics do you like?
By now, many of you have probably heard of the truly bizarre case of Q.R. Markham, the nom de plume of a Brooklyn novelist whose debut thriller, Assassin of Secrets, was recently exposed as an insane patchwork of plagiarized passages from other books. In his author photos, Markham himself looks something like a character out of a Nabokov novel, so it’s perhaps fitting that this scandal differs from other instances of plagiarism both in scope and in kind: dozens of thefts have been identified so far, from such famous novelists as Charles McCarry, Robert Ludlum, and James Bond author John Gardner, all but guaranteeing that the fraud would quickly be discovered. (One of the lifted passages was allegedly six pages long.) The sheer massiveness of the deception, which also extends to much of the author’s other published work, suggests that unlike most plagiarists—who tend to be motivated by laziness, carelessness, or cynicism—Markham was driven, instead, by a neurotic need to be caught.
Of course, as with James Frey and the Harvard student I still like to think of as Opal Mehta, after the exposure comes the inevitable justification, and Markham doesn’t disappoint. In a fascinating email exchange with author Jeremy Duns, who provided a glowing blurb for the novel in happier times, Markham claims that his actions were motivated by “a need to conceal my own voice with the armor of someone else’s words,” as well as, more prosaically, the pressure of rapidly turning around revisions for his publisher. The latter rationale can be dismissed at once, and novelist Jamie Freveletti has already skewered it quite nicely: every working novelist has to generate rewrites on short notice—I’m doing this for my own novel as we speak—so invoking time constraints as an excuse makes about as much sense as blaming the physical act of typing itself. More interesting, at least to me, is the implication that assembling this novel of shreds and patches ultimately became a kind of game. Markham writes:
I had certain things I wanted to see happen in the initial plot: a double cross, a drive through the South of France, a raid on a snowy satellite base. Eventually I found passages that adhered to these kinds of scenes that only meant changing the plot a little bit here and there. It felt very much like putting an elaborate puzzle together.
Now, on some level, this kind of puzzle construction is what every genre novelist does. The number of tropes at a writer’s disposal is large, but finite, and barring a really exceptional act of invention, which has happened only a handful of times in the history of the genre, much of what a suspense novelist does consists of finding fresh, unexpected combinations of existing elements and executing them in a surprising way. If anything, Markham’s example highlights one of the weaknesses of the suspense genre, which is that the underlying components—like the ones he lists above—have become rather tired and predictable. Doesn’t every spy novel contain a double cross, or a raid on some kind of secret base? In his neurotic fear of originality, Markham simply took it to the next logical step, so it’s tempting to read his case as a kind of demented experiment, a sweeping indictment of the artificiality of the spy thriller itself.
But this gives him too much credit. Assassin of Secrets is a kind of distorting mirror, a looking glass in which various players in the publishing world can see uncomfortable reflections of themselves. Markham’s editors and reviewers have clearly been wondering, as well they should, why they didn’t detect this deception much sooner, and what this says about their knowledge of the genre in which they make their living. And for other novelists, Markham stands as an emblem of what I might call a culture of empty virtuosity, in which a book that mechanically recombines exhausted tropes can be acclaimed as the work of an exciting new voice, when it merely contains, as James Wood once unfairly said of John le Carré, “a clever coffin of dead conventions.” I love suspense, and much of its pleasure lies, as Markham says, in the construction of elaborate puzzles. But it can also be more. And if nothing else, this Frankenstein monster of a novel should remind us that we owe it to ourselves to do better.
The tricky thing about defending plot is that you occasionally get it from both sides. On the literary end, you have critics like John Lucas of the Guardian, who is clearly suspicious of most plotted fiction, or James Wood of the New Yorker, who is famously fed up with the conventions of literary realism. Meanwhile, on the other end, you have those who want to get rid of story altogether, but for radically different reasons. And I suspect that the likes of Lucas and Wood might ease up on their invective if they realized that plot was, in fact, literature’s last stand against an even more insidious opponent, embodied, at least this week, by Andy Hendrickson, Chief Technology Officer of Disney Studios, who was quoted in Variety as saying: “People say ‘It’s all about the story.’ When you’re making tentpole films, bullshit.”
To state the obvious, I’d rather be defending story against the likes of Lucas and Wood, who at least claim to be aspiring to something more, than Hendrickson, who is pushing toward something much less. On a superficial level, though, he seems to have a point—at least when it comes to the movies that consistently generate large audiences. Citing a chart of the top 12 movies of all time, including Disney’s own Alice in Wonderland, Hendrickson notes that visual effects are what tend to drive box office—”and Johnny Depp didn’t hurt,” he concludes. Which is true enough. Most of these movies are triumphs of visuals over narrative, based on existing brands or properties, to the point where story seems almost incidental. Even a movie like The Dark Knight, which cares deeply about plot and narrative complexity, feels like little more than an aberration.
But this only tells half the story. For one thing, the list that Hendrickson provides isn’t adjusted for inflation, and the list of the real highest-grossing movies of all time yields a much different picture. There are some clunkers here, too (nobody, I trust, went to see The Ten Commandments because of the script), but for the most part, these are movies driven by story and spectacle: Gone With the Wind. E.T. Star Wars. The Sound of Music. Even Avatar, which had a few problems in the screenplay department, was an ambitious attempt to create a fully realized original story that would fuel the dreamlife of millions. And these are the most lucrative movies ever made. To be content with a disposable tentpole picture that barely makes back its production and marketing costs strikes me as a lack of ambition. And it should strike Disney shareholders the same way.
Moreover, even the movies that Hendrickson cites are more driven by story than he acknowledges. Alice in Wonderland was a book before it became a terrible movie, after all, and it’s safe to say that box office was driven as much by goodwill toward Lewis Carroll’s creations as toward Johnny Depp. The same is true for Spider-Man, Harry Potter, The Lord of the Rings, and most other major franchises, all of which were built on the work of solitary geniuses. In short, someone still needs to do the work of story. Aside from exceptions like Pixar or Inception, the primary creative work may not be done in Hollywood itself, but in novels, comics, and other media where true artists continue to gravitate, and where the movies will eventually turn. Hendrickson may hate to admit it, but he still depends on storytellers, even if they’ve fled his own department. Life, as a certain famous franchise reminds us, always finds a way. And story does as well.
By [narrative] grammar, I mean the rather lazy stock-in-trade of mainstream realist fiction: the cinematic sweep, followed by the selection of small, telling details (“It was a large room, filled almost entirely by rows of antique computers; there was an odd smell of aftershave and bacon”); the careful mixing of dynamic and habitual detail (“At one of the computers, a man was unhurriedly eating a spring roll; traffic noise pierced the thick, sealed windows; an ambulance yelped by”); the preference for the concrete over the abstract (“She was twenty-nine, but still went home every evening to her mom’s ground-floor apartment in Queens, which doubled by day as a yoga studio”); vivid brevity of character-sketching (“Bob wore a bright-yellow T-shirt that read ‘Got Beer?,’ and had a small mole on his upper lip”); plenty of homely “filler” (“She ordered a beer and a sandwich, sat down at the table, and opened her computer”); more or less orderly access to consciousness and memory (“He lay on the bed and thought with shame of everything that had happened that day”); lucid but allowably lyrical sentences (“From the window, he watched the streetlights flicker on, in amber hesitations”). And this does not even touch on the small change of fictional narrative: how strange it is, when you think about it, that thousands of novels are published every year, in which characters all have different names (whereas, in real life, doesn’t one always have at least three friends named John, and another three named Elizabeth?), or in which characters quizzically “raise an eyebrow,” and angrily “knit their brows,” or just express themselves in quotation marks and single adverbs (“‘You know that’s not fair,’ he said, whiningly”). At this level of convention, there is a shorter distance than one would imagine between, say, Harriet the Spy and Disgrace.
The joy is in the surprise. It can be as small as a felicitous coupling of noun and adjective. Or a whole new scene, or the sudden emergence of an unplanned character who simply grows out of a phrase. Literary criticism, which is bound to pursue meaning, can never really encompass the fact that some things are on the page because they gave the writer pleasure.
Recently, I’ve been thinking a lot about plot—what it is, how to construct it, and why it matters. I’ve spoken to other aspiring writers about this, and have been dealing with it constantly while assembling an outline for the sequel to Kamera (which, now that the proposal has been officially accepted, I can finally say will be called Midrash). Over the next few days, I’ll be looking at plot from various angles, both in fiction and in film. Today, however, I want to talk about something more fundamental: the joy of plot from the perspective of the writer, who gets to play the greatest game in the world.
First, though, I want to address a major misconception. There’s a common assumption, reinforced by many critics and writing instructors, that plot is somehow inferior to other aspects of fiction, notably character and theme. (I’m not going to talk about language here, if only because language should, ideally, arise organically from those other three aspects.) And it’s true that a novel driven solely by plot can feel thin or unsatisfying. But here’s the important point: in nine cases out of ten, a novel driven solely by character and theme will, in the end, prove unsatisfying as well, if it’s published at all. A good novel needs all three legs of the tripod. And a strong plot, more than anything else, is what draws the reader along to the final page.
So why do so many critics—James Wood, for instance—tend to dismiss plot? It’s rather mysterious, but my sense is that those who undervalue plot are often those with the least experience of writing a novel themselves. Personally, I don’t think that any major novelist can dismiss plot. Or would want to. Because the construction of plot is one of the great joys and compensations of the writer’s life. Part instinct, part luck, part planning and preparation, it’s the most challenging thing that an artist can do: a process of intellectual engagement, drawing on all sides of the brain and personality, that can span months or years. It’s a game, but also deadly serious. And when it works, it’s something that no writer would willingly relinquish. As McEwan says:
A writer whose morning is going well, whose sentences are forming well, is experiencing a calm and private joy. This joy itself then liberates a richness of thought that can prompt new surprises. Writers crave these moments, these sessions….Nothing else—cheerful launch party, packed readings, positive reviews—will come near it for satisfaction.
And why is plot so satisfying for the writer? My guess is that it’s the aspect of writing that comes closest to capturing the deepest pleasures of craft. The writer begins with a handful of isolated pieces—a character, a location, an incident—and gradually moves outward. He thinks, dreams, and does research, casting his net as wide as possible, hoping that a chance conversation or a stray sentence in another book will set him off in another promising direction. Once he has amassed enough material, he looks for patterns, connections, affinities. He orders the pieces one way, thinks it over, and reorders them again. This process continues, in various forms, long after the actual writing has begun. And any writer who has really experienced it, even once, would never give it up, much less deny it to others.
Here’s the big secret: writers value plot because it’s one of the few things that make their lives bearable. Writing is hard work. The simple act of putting words on the page can be torture. And, indeed, if a plot isn’t working—if it refuses to harmonize with the characters or become logically coherent—it can be torture as well. But when the pieces do finally fit, it can feel like magic. At best, there’s something mysterious about the result, as if the universe and the writer were conspiring in secret. Such moments may occur only two or three times in the course of a given novel, and not until after the hard work of research and preparation has been done, but once they fall into place, the writer would rather die than leave them unrealized. Plot, in short, serves the same purpose for writers as for readers: it reassures them that something good is around the corner. And it’s what carries them along to the end.
But none of this would matter if the writer’s joy weren’t also contagious. Reading a novel with a perfect plot—the first half of McEwan’s Atonement, for instance, before the story deliberately blows itself up—gives me, as a reader, an intense kind of pleasure, one that exists on two levels. The first is a shared pleasure at the skill of the author, who has created a vivid, interesting, elegant structure, a narrative house that can stand on its own. The second is rather simpler: it’s the primal, almost childlike satisfaction at seeing the promises of a story kept. Such satisfaction, as I see it, deserves to be ranked at the very height of the reasons we read, or write, fiction in the first place. Without it, and without plot, I don’t think we’d have novels at all.