Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘John Gardner’

My ten creative books #7: On Directing Film



Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.

When it comes to giving advice on something as inherently unteachable as writing, books on the subject tend to fall into one of three categories. The first treats the writing manual as an extension of the self-help genre, offering what amounts to an extended pep talk that is long on encouragement but short on specifics. A second, more useful approach is to consolidate material on a variety of potential strategies, either through the voices of multiple writers—as George Plimpton did so wonderfully in The Writer’s Chapbook, which assembles the best of the legendary interviews given to The Paris Review—or through the perspective of a writer and teacher, like John Gardner, generous enough to consider the full range of what the art of fiction can be. And the third, exemplified by David Mamet’s On Directing Film, is to lay out a single, highly prescriptive recipe for constructing stories. This last approach might seem unduly severe. Yet after a lifetime of reading what other writers have to say on the subject, Mamet’s little book is still the best I’ve ever found, not just for film, but for fiction and narrative nonfiction as well. On one level, it can serve as a starting point for your own thoughts about how the writing process should look: Mamet provides a strict, almost mathematical set of tools for building a plot from first principles, and even if you disagree with his methods, they clarify your thinking in a way that a more generalized treatment might not. But even if you just take it at face value, it’s still the closest thing I know to a foolproof formula for generating rock-solid first drafts. (If Mamet himself has a flaw as a director, it’s that he often stops there.) In fact, it’s so useful, so lucid, and so reliable that I sometimes feel reluctant to recommend it, as if I were giving away an industrial secret to my competitors.

Mamet’s principles are easy to grasp, but endlessly challenging to follow. You start by figuring out what every scene is about, mostly by asking one question: “What does the protagonist want?” You then divide each scene up into a sequence of beats, consisting of an immediate objective and a logical action that the protagonist takes to achieve it, ideally in a form that can be told in visual terms, without the need for expository dialogue. And you repeat the process until the protagonist succeeds or fails at his or her ultimate objective, at which point the story is over. This may sound straightforward, but as soon as you start forcing yourself to think this way consistently, you discover how tough it can be. Mamet’s book consists of a few simple examples, teased out in a series of discussions at a class he taught at Columbia, and it’s studded with insights that once heard are never forgotten: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” “Here is a tool—choose your shots, beats, scenes, objectives, and always refer to them by the names you chose.” “Keep it simple, stupid, and don’t violate those rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.” “The audience doesn’t want to read a sign; they want to watch a motion picture.” “A good writer gets better only by learning to cut, to remove the ornamental, the descriptive, the narrative, and especially the deeply felt and meaningful.” “Now, why did all those Olympic skaters fall down? The only answer I know is that they hadn’t practiced enough.” And my own personal favorite: “The nail doesn’t have to look like a house; it is not a house. It is a nail. If the house is going to stand, the nail must do the work of a nail. To do the work of the nail, it has to look like a nail.”

Written by nevalalee

August 7, 2018 at 9:00 am

My ten creative books #6: The Art of Fiction


Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.

I bought The Art of Fiction by John Gardner nearly a quarter of a century ago, at a used bookstore in Half Moon Bay, California, shortly before starting my freshman year of high school. (On that same afternoon, I picked up a copy of Critical Path by R. Buckminster Fuller, and my family also somehow acquired our first cat, which suggests that my life would be significantly different if that one day were magically erased.) Since then, I’ve read it in pieces a dozen or more times—it’s one of the few books that I’ve brought wherever I’ve moved—and I still know much of it by heart. Writing guides tend to be either loftily aspirational or fixated on the nuts and bolts of craft, and Gardner’s brilliance is that he tackles both sides in a way that enriches the whole. He has plenty to say on sentence structure, vocabulary, rhythm, and point of view, and his illustrations of process are still the most vivid that I’ve ever seen:

The good writer treats each unit individually, developing them one by one. When he’s working on the description of Uncle Fyodor’s store, he does not think about the hold-up men who in a moment will enter it, though he keeps them in the back of his mind. He describes the store, patiently, making it come alive, infusing every smell with Uncle Fyodor’s emotion and personality (his fear of hold-up men, perhaps); he works on the store as if this were simply an exercise, writing as if he had all eternity to finish it, and when the description is perfect—and not too long or short in relation to its function in the story as a whole—he moves on to his story’s next unit.

Yet Gardner is equally concerned with warning young writers away from “faults of soul,” including frigidity, sentimentality, and mannerism, and with reminding them that their work must have interest and truth. Every element of writing, he notes, should be judged by its ability to sustain the fictional dream: the illusion, to the reader, that the events and characters described are really taking place. And everything I’ve written since then has been undertaken with his high standards in mind.

By now, I’ve internalized all of his advice, even if I don’t always follow it, and as a result, when I read his book again now, it’s less as a guide than as a novel in itself, with an archetypal writer—who shouldn’t be confused with Gardner—who emerges as a character in his own right. For instance:

He begins to brood over what he’s written, reading it over and over, patiently, endlessly, letting his mind wander, sometimes to Picasso or the Great Pyramid, sometimes to the possible philosophical implications of Menelaos’ limp (a detail he introduced by impulse, because it seemed right). Reading in this strange way lines he has known by heart for weeks, he discovers odd tics his unconscious has sent up to him, perhaps curious accidental repetitions of imagery…Just as dreams have meaning, whether or not we can penetrate the meaning, the writer assumes that the accidents in his writing may have significance.

And his offhand observations about other writers have stuck in my head as well. Writing of a possible plot hole in Hamlet, for instance, Gardner offers a view of Shakespeare that I’ve never forgotten:

The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.

Of the countless books that I’ve read on writing, this is still the best, as well as the finest manual of the life of which Gardner writes elsewhere: “Novel-writing is not so much a profession as a yoga, or ‘way,’ an alternative to ordinary life-in-the-world…For those who are authentically called to the profession, spiritual profits are enough.”

Written by nevalalee

August 6, 2018 at 9:00 am

The rock that flies


William H. Gass: There is a fundamental divergence about what literature is. I don’t want to subordinate beauty to truth and goodness. John and others have values which they think more important. Beauty, after all, is not very vital for most people. I think it is very important, in the cleanliness of the mind, to know why a particular thing is good. A lot of people judge, to use a crude example, the dinner good because of the amount of calories it has. Well, that is important if you don’t want to gain weight, but what has that got to do with the quality of the food? Moral judgments on art constantly confuse the quality of the food. I would also claim that my view is more catholic. It will allow in as good writers more than this other view will; John lets hardly anybody in the door.

John Gardner: I love Bill’s writing, and I honestly think that Bill is the only writer in America that I would let in the door. For twenty-four years I have been screaming at him, sometimes literally screaming at him, saying, “Bill, you are wasting the greatest genius ever given to America by fiddling around when you could be doing big, important things.” What he can do with language is magnificent, but then he turns it against itself. Our definitions of beauty are different. I think language exists to make a beautiful and powerful apparition. He thinks you can make pretty colored walls with it. That’s unfair. But what I think is beautiful, he would think is not yet sufficiently ornate. The difference is that my 707 will fly and his is too encrusted with gold to get off the ground.

Gass: There is always that danger. But what I really want is to have it sit there solid as a rock and have everybody think it is flying.

Conversations with John Gardner

Written by nevalalee

December 9, 2017 at 7:30 am

The fanfic disposition


Yesterday, I mentioned Roxane Gay’s insightful opinion piece on the proposed HBO series Confederate, which was headlined “I Don’t Want to Watch Slavery Fan Fiction.” I’m still sorting out my own feelings toward this show, an alternate history set in the present day in which the South won the Civil War, but I found myself agreeing with just about everything that Gay writes, particularly when she confesses to her own ambivalence:

As a writer, I never wish to put constraints upon creativity nor do I think anything is off limits to someone simply because of who they are. [Creators] Mr. Benioff and Mr. Weiss are indeed white and they have as much a right to create this reimagining of slavery as anyone. That’s what I’m supposed to say, but it is not at all how I feel.

And I was especially struck by Gay’s comparison of the show’s premise to fanfic. Her essay, which appeared in the New York Times, only uses the phrase “fan fiction” once, linking to a tweet from the critic Pilot Bacon, and while its use in reference to Confederate isn’t literally true—at least not if we define fanfic as a derivative work based on characters or ideas by another author—its connotations are clear. Fairly or not, it encapsulates the notion that David Benioff and D.B. Weiss are appropriating existing images and themes to further their own artistic interests.

Even if we table, for now, the question of whether the criticism is justified, it’s worth looking at the history of the word “fanfic” as a pejorative term. I’ve used it that way here myself, particularly in reference to works of art that amount to authorial wish fulfillment toward the characters, like the epilogue to Ann Patchett’s Bel Canto. (Looking back at my old posts, I see that I even once used it to describe a scene in one of my own novels.) Watching The Hobbit: The Battle of the Five Armies recently with my wife, I commented that certain scenes, like the big fight at Dol Guldur, felt like fanfic, except that Peter Jackson was somehow able to get Cate Blanchett, Ian McKellen, Hugo Weaving, and Christopher Lee to reprise all their old roles. And you often see such comparisons made by critics. Gavia Baker-Whitelaw devoted an entire article on The Daily Dot to the ways in which J.K. Rowling’s Harry Potter and the Cursed Child resembled a work of “badfic,” while Ian Crouch of The New Yorker tried to parse the difference between fanfic and such works as Jean Rhys’s Wide Sargasso Sea:

Fan fiction is surely not a new phenomenon, nor is it an uninteresting one, but it is different in kind and quality from a work like Rhys’s, or, to take a recent example, Cynthia Ozick’s remarkable new novel, Foreign Bodies, which reimagines the particulars of The Ambassadors, by Henry James. Not only do these books interpret texts in the public domain…but they do so with an admirable combination of respect and originality.

As a teenager, I wrote a lot of X-Files fanfic, mostly because I knew that it would give me a readily available audience for the kind of science fiction that I liked, and although I look back on that period in my life with enormous affection—I think about it almost every day—I’m also aware of the limitations that it imposed on my development as a writer. The trouble with fanfic is that it allows you to produce massive amounts of material while systematically avoiding the single hardest element of fiction: the creation of imaginary human beings capable of sustaining our interest and sympathy. It begins in an enviable position, with a cast of characters to which the reader is already emotionally attached. As a result, the writer can easily be left in a state of arrested development, with superb technical skills when it comes to writing about the inner life of existing characters, but little sense of how to do it from scratch. This even holds true when the writer is going back to characters that he or she originally created or realized onscreen. When J.K. Rowling revisits her most famous series or Peter Jackson gives us a fight scene with Elrond and the Ringwraiths, there’s an inescapable sense that all of the heavy lifting took place at an earlier stage. These artists are trading on the affection that we hold toward narrative decisions made years ago, instead of drawing us into the story in the moment. And even when the name on the title page or the director’s credit is the same, readers and viewers can sense when creators are indulging themselves, rather than following the logic of the underlying material.

This all means that fanfic, at its worst, is a code word for a kind of sentimentality, as John Gardner describes it in The Art of Fiction:

If the storyteller appeals to stock response (our love of God or country, our pity for the downtrodden, the presumed warm feelings all decent people have for children and small animals)…then the effect is sentimentality, and no reader who’s experienced the power of real fiction will be pleased by it.

Replace “children and small animals” with Harry Potter and Gandalf, and you have a concise description of how fanfic works, encouraging readers to plow through tens of thousands of words because of the hard work of imaginative empathy that someone else did long ago. When Gay and Bacon compare Confederate to fan fiction, I think that this is what they mean. It isn’t drawing on existing characters, but on a collection of ideas, images, and historical events that carry an overwhelming emotional charge before Benioff and Weiss have written a line. You could argue that countless works of art have done the same thing—the canonical work of Civil War fanfic has got to be Gone With the Wind—but if slavery seems somehow different now, it’s largely because of the timing, as Gay notes: “We do not make art in a vacuum isolated from sociopolitical context. We live in a starkly divided country with a president who is shamefully ill equipped to bridge that divide.” Benioff and Weiss spent years developing their premise, and when they began, they couldn’t have anticipated the environment in which their announcement would be received. I don’t want the project to be canceled, which would have a chilling effect throughout the industry, but they should act as if they’re going to be held to a higher standard. Because they will be.

Cruise and control


Over the last week, I’ve been listening to a long interview that the writer and director Christopher McQuarrie gave to The Empire Film Podcast after the release of Mission: Impossible—Rogue Nation. It’s over two and a half hours long and loaded with insight, but it also has a somewhat different tone when you come to it after the recent debacle of The Mummy. McQuarrie, predictably, has nothing but good words for Tom Cruise, whom he describes as the ultimate producer, with a hand in every aspect of the creative process. Now compare this to the postmortem in Variety:

In the case of The Mummy, one person—Cruise—had an excessive amount of control, according to several people interviewed. The reboot of The Mummy was supposed to be the start of a mega-franchise for Universal Pictures. But instead, it’s become a textbook case of a movie star run amok…Several sources close to the production say that Cruise exerted nearly complete creative oversight on The Mummy, essentially wearing all the hats and dictating even the smallest decisions on the set…Universal, according to sources familiar with the matter, contractually guaranteed Cruise control of most aspects of the project, from script approval to post-production decisions.

To put it another way, between Rogue Nation and The Mummy, absolutely nothing changed. On the one hand, Cruise’s perfectionist tendencies resulted in an excellent piece of work; on the other, they led to a movie that most critics agree is nearly unwatchable. This might seem like a paradox, but I’d prefer to see it as proof that this level of obsessiveness is required to make any movie whatsoever, regardless of the outcome. It may come from a producer or director rather than from the star, but in its absence, complicated projects just don’t get made at all. And the quality of the finished product is the result of factors that are out of even Tom Cruise’s control.

If you work in any creative field, you probably know this already, but the extent to which you’re willing to accept it is often determined by where your role falls in production. At one extreme, you have someone like the editor Walter Murch, who hangs a shiny brass “B” in his office. As Charles Koppelman writes in Behind the Seen:

Ask Walter about it, and he’ll tell you about aiming for a “B.” Work hard to get the best grade you can—in this world, a B is all that is humanly attainable. One can be happy with that. Getting an A? That depends on good timing and the whims of the gods—it’s beyond your control. If you start to think that the gods are smiling, they will take their revenge. Keep your blade sharp. Make as good a film as you know how. It’s an Eastern-oriented philosophy, as expressed by the American writer and philosopher, Ralph Waldo Emerson: “We aim above the mark to hit the mark.”

At the other extreme, you have the star, who has been groomed to attribute everything good in a movie to his or her irreplaceable presence. And it’s no accident that you find these two attitudes at opposite ends of the production process. The light that strikes the star’s face is captured on film that works its way down the chain to the editors, who have little choice but to be pragmatic: they can only work with the footage that they’ve been given, and while they have lots of good tricks for manipulating it, they’re ultimately the ones who deal with what remains after all the fond hopes that went into a film have collided with reality. They know exactly what they do and don’t have. And they’re aware that superhuman technical control doesn’t represent the high end of craft, but the bare minimum required to do useful work.    

The screenwriter lies somewhere in the middle. In theory, he’s the one who gets paid to dream, and he isn’t constrained by any outside factors when he’s putting ideas down on the page. This isn’t quite how it works in practice, since there are plenty of externalities to consider, and a screenwriter is often asked to solve problems at every stage in production. And we should be a little skeptical of what they have to say. Our understanding of cinematic craft is skewed by the fact that writers have traditionally been its most eloquent and entertaining expositors, which provides just one perspective on the making of the movie. One reason is that screenwriters need to be good with words, not just for the script, but for the pitch meeting, which is another sort of performance—and it encourages them to deliver a hard sell for the act of writing itself. Another is that screenwriters have often been critically denigrated in favor of directors, which obliges them to be exceptionally funny, insightful, and forceful when they’re defending the importance of what they do for a living. Finally, there’s a kind of cynicism about the idea of control, which makes it easier to talk about it afterward. No screenplay is ever shot or released as written, which means that screenwriters exist to have their visions betrayed. If you believe that movies are made up largely of the contingent factors that emerge during production, that’s how it should be. But it also leaves screenwriters in a strange place when it comes to questions of control. Terry Rossio says of formatting the script so that the page breaks come at the right spot: “If you find yourself with this sort of obsessive behavior—like coming up with inventive ways to cheat the page count!—then, I think, you’ve got the right kind of attitude to make it in Hollywood.” He’s clearly right. But it’s also the kind of meticulousness that will be seen by only a handful of insiders, before your ideas pass through the hands of a dozen other professionals on the way to taking an unrecognizable form onscreen.

This may be the real reason why the screenwriters who serve as public advocates for craft—William Goldman, Robert Towne, Tony Gilroy, McQuarrie, and a few others—are also the ones with reputations as fixers, coming in at the very end to work on “troubled” shoots, which, as I’ve argued before, describes nearly every studio movie ever. These writers may well be legitimately better than most of their peers at solving problems, or at least they’re perceived that way, which is why they get those assignments. (As McQuarrie recently said to John August, when asked about the use of writers’ rooms on franchises like Transformers: “I believe you can create all of the Transformers stuff you want. You can build out the whole universe…When the rubber hits the road, that’s all going to change. They’re going to call you. They’re going to call me.” And he’s probably correct.) They’re survivors, and they’ve inevitably got good war stories to share. But we’re also more likely to listen to writers whose contributions come at the end of the process, where their obsessiveness can have a visible impact. It allows them to take credit for what worked while implicitly washing their hands of what didn’t, and there’s an element of chance involved here, too: every screenwriter wants to be the last one hired on a movie, but where you end up on that queue has a lot to do with luck and timing. I still find McQuarrie impossible to resist, and I learn more about storytelling from listening to him for ten minutes than by doing anything else. I’ve been talking about his interview so much that my wife joked that it’s my new religion. Well, maybe it is. But given how little anyone can control, it’s closer to what John Gardner says about writing novels: it’s a yoga, a way of life in the world, rather than an end in itself. As McQuarrie himself says to Empire: “Never do anything to effect a result. Do something because you want to do it, or because you have to do it.” And he would know.

A most pitiful ambition


In Magic and Showmanship, which is one of my favorite books on storytelling of any kind, the magician and polymath Henning Nelms sets forth a principle that ought to be remembered by all artists:

An illusion is, by definition, untrue. In every field, we detect untruth by inconsistency. We recognize statements as false when they contradict themselves. An actor who does something which is not in keeping with his role falls out of character, and the spell of the play is broken. If a conjurer’s words and actions fail to match the powers he claims, he pricks the bubble of illusion; he may still entertain his audience with a trick, but he loses the magic of drama. Consistency is the key to conviction.

Nelms adds that consistency is also the key to entertainment, and that it achieves its greatest impact when all of its resources are directed toward the same goal. He continues:

Consistency implies a standard. We cannot merely be consistent; we must be consistent with something. In creating an illusion, our standard is the theme. Once you realize this, you will find that the theme provides a guide to every detail of your presentation. This is a tremendous asset. It answers many questions almost before you can ask them.

And Nelms concludes with a powerful rule: “Plan a routine as if every element of the theme—personalities, phenomena, purpose, and proof—were literally true.”

To some extent, this is simply a restatement of what John Gardner calls “the vivid and continuous fictional dream.” Any lapse or inconsistency will draw viewers or readers out of the performance, and it can be hard to get them back again. As Nelms puts it:

Although the “as if” rule is an inspiring guide, it is also a strict taskmaster. Consistency is essential to any suspension of disbelief. No conviction is so deep that it cannot be destroyed by a discrepancy in the presentation. On the contrary, the more profoundly the spectators are enthralled by a performance, the more likely they are to be jerked back to reality by anything which is not in harmony with the illusion.

Even more usefully, Nelms frames this rule as a courtesy to the magician himself, since it provides a source of information at times when we might otherwise be lost: “It not only helps us to make decisions, but suggests ideas.” He also helpfully observes that it can be more productive, on a creative level, to focus on eliminating discrepancies, rather than on heightening the elements that are already effective:

My whole procedure as a showman is based on a technique of hunting for faults and ruthlessly eliminating them…The good parts of a play or routine take care of themselves. If I see a way to improve them, I do so. But I never worry about them. Instead, I concentrate on spotting and correcting the flaws. These are the places that offer the greatest opportunities for improvement. Hence, they are also the places where time and effort devoted to improvement will produce the greatest results.

On a practical level, Nelms suggests that you write down an outline of the illusion as if it were literally true, and then see where you have to depart from this ideal for technical reasons—which is where you should concentrate your attention to minimize any obvious discrepancies. This all seems like common sense, and if writers and performers sometimes forget this, it’s because they get attached to inconsistencies that provide some other benefit in the short term. Nelms writes:

Many dramas have been ruined by actors who tried to enliven serious scenes by being funny. The spectators laughed at the comedy, but they were bored by the play. The same law holds true for conjuring: No matter how effective an inconsistent part may be, the damage that it does to the routine as a whole more than offsets whatever advantages it may have in itself.

He continues: “Directors and performers alike are so flattered by hearing an audience laugh or exclaim over some line or action that they blind themselves to the harm it does to the play or the illusion.” This tendency is as old as drama itself, as we see in Hamlet’s advice to the players, and it can have a troubling effect on the audience:

A discrepancy may escape conscious notice and still weaken conviction. The suspension of disbelief is a subconscious process. No one says to himself, “If I am to enjoy this performance to the full, I must accept it as true and close my mind to the fact that I know it to be false.” Spectators can be led to adopt this attitude, but they must do so without thinking—and without realizing that they have done anything of the kind.

Which brings us, unfortunately, to Donald Trump. If you’re a progressive who is convinced that the president is trying to put one over on the public, you also have to confront the fact that he isn’t especially good at it. Not only are the discrepancies glaring, but they occur with a clockwork regularity that would be funny if it weren’t so horrifying. After the Washington Post reported that Trump had disclosed classified information—remember that?—to the Russian foreign minister and ambassador, his national security adviser said: “I was in the room. It did not happen.” The next day, Trump tweeted that he “wanted to share” the facts with Russia, as he had “the absolute right to do.” After James Comey was fired, the White House issued a statement saying that Trump had acted on the advice of the Justice Department, which based its recommendation on Comey’s handling of the investigation into Hillary Clinton’s emails. Two days later, Trump contradicted both points in an interview with Lester Holt: “I was going to fire Comey. My decision…In fact, when I decided to just do it, I said to myself, I said: ‘You know, this Russia thing with Trump and Russia is a made-up story.’” And when his staff repeatedly asserted that the refugee order wasn’t a travel ban, only to have Trump insist that it was, it felt like a cutaway gag on Arrested Development. You’ll sometimes see arguments that Trump is a chess master, creating distractions like a magician using the technique of misdirection, which strikes me as a weird form of liberal consolation. (It reminds me of what Cooder the carny says of being grifted by Homer Simpson: “Well, there’s no shame in bein’ beaten by the best.” When his son tries to point out that Homer didn’t seem very smart, Cooder interrupts angrily: “We were beaten by the best.”) But the real answer is close at hand. Let’s look at Hamlet’s speech again:

And let those that play your clowns speak no more than is set down for them, for there be of them that will themselves laugh, to set on some quantity of barren spectators to laugh too, though in the meantime some necessary question of the play be then to be considered. That’s villainous and shows a most pitiful ambition in the fool that uses it.

This may be the best thing ever written about the Trump administration. Trump has been trained for years to go for the easy laugh or the quick reaction from the crowd, and he’ll continue to do so, even as “necessary questions” need to be considered. He’s done pretty well with it so far. And he has a receptive audience that seems willing to tell itself exactly what Nelms thought was impossible: “If I am to enjoy this performance to the full, I must accept it as true and close my mind to the fact that I know it to be false.”

Written by nevalalee

June 9, 2017 at 9:02 am

My ten great books #5: Couples


In his discussion of the aesthetic flaw of frigidity in The Art of Fiction, John Gardner says: “When a skillful writer writes a shallow, cynical, merely amusing book about extramarital affairs, he has wandered—with far more harmful effect—into the same unsavory bog.” There’s little doubt in my mind that he’s thinking of John Updike, of whom a very different author, Lawrence Block, states in Writing the Novel: “It’s probably safe to assume that John Updike wrote Couples out of comparable cupidity, but it’s hardly vintage Updike, and the author’s own detachment from it is evident throughout.” Given that this novel was based so closely on the writer’s personal life that it scandalized his circle of friends in Ipswich, it might seem hard to describe it as shallow, cynical, and detached—which doesn’t mean that it can’t be all of these things as well. Couples made Updike rich and famous, and it was clearly conceived as a mainstream novel, but this was less a question of trying to write a bestseller than of shaping it for the cultural position that he hoped it would attain. Updike had already been promised the cover of Time magazine before it came out, and, as he later recalled: “Then they read the book and discovered, I think, that, the higher up it went in the Time hierarchy, the less they liked it.” As Jonathan Franzen did with The Corrections, Updike seems to have known that his next effort was positioned to break through in a huge way, and he engineered it accordingly, casting his obsessions with sex and mortality into a form that would resonate with a wider audience. The back cover of my paperback copy calls it “an intellectual Peyton Place,” and I think that the quote must have pleased him.

I’ve always been fascinated by the moment in the late sixties and early seventies that made it possible for the conventions of modernist realism—particularly its attitudes toward sex—to be appropriated by bestselling writers. The early novels of Stephen King are a key text here, but so, in its way, is Couples, which shows the line of influence running in the other direction. In his determination to write a big book, Updike drew on the structural symmetries of popular fiction, and the result was his most richly organized novel of any kind. Like Mad Men, which takes place in the same era, it draws you in with its superficial pleasures and then invites you to go deeper, although many readers or viewers seem happy to stop at the surface. Gardner fretted about this possibility at length in On Moral Fiction:

[Updike is] a master of symbolic complexity, but one can’t tell his women apart in a book like Couples; his characters’ sexual preoccupations, mostly perverse, are too generously indulged; and the disparity between the surface and sub-surface of his novels is treacherous: to the naive reader (and most readers of popular bestsellers are likely to be naive), a novel like A Month of Sundays seems like a merry, bourgeois-pornographic book…while to the subtler reader, the novel may be wearily if not ambivalently satirical, a sophisticated attack on false religion…Since the irony—the presumably satiric purpose—is nowhere available on the surface…one cannot help feeling misgivings about Updike’s intent.

It’s certainly possible to read Couples, as I often do, purely for entertainment, or as a kind of gossipy cultural reportage. (No other novel tells us more about what it must have really been like to be a member of the upper middle class at the time of the Kennedy assassination.) Yet we’re also implicated by that choice. I own a copy of the first hardcover edition, which I bought, in a symbolic act that might have struck even Updike as a little too on the nose, on the morning of my wedding day. As it turns out, my life resembles it in a lot of the small ways but none of the big ones. But maybe that’s because Updike got there first.
