Posts Tagged ‘The A.V. Club’
Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on February 16, 2016.
Every few years or so, I go back and revisit Renata Adler’s famous attack in the New York Review of Books on the reputation of the film critic Pauline Kael. As a lifelong Kael fan, I don’t agree with Adler—who describes Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless”—but I respect the essay’s fire and eloquence, and it’s still a great read. What is sometimes forgotten is that Adler opens with an assault, not on Kael alone, but on the entire enterprise of professional criticism itself. Here’s what she says:
The job of the regular daily, weekly, or even monthly critic resembles the work of the serious intermittent critic, who writes only when he is asked to or genuinely moved to, in limited ways and for only a limited period of time…Normally, no art can support for long the play of a major intelligence, working flat out, on a quotidian basis. No serious critic can devote himself, frequently, exclusively, and indefinitely, to reviewing works most of which inevitably cannot bear, would even be misrepresented by, review in depth…
The simple truth—this is okay, this is not okay, this is vile, this resembles that, this is good indeed, this is unspeakable—is not a day’s work for a thinking adult. Some critics go shrill. Others go stale. A lot go simultaneously shrill and stale.
Adler concludes: “By far the most common tendency, however, is to stay put and simply to inflate, to pretend that each day’s text is after all a crisis—the most, first, best, worst, finest, meanest, deepest, etc.—to take on, since we are dealing in superlatives, one of the first, most unmistakable marks of the hack.” And I think that she has a point, even if I have to challenge a few of her assumptions. (The statement that most works of art “inevitably cannot bear, would even be misrepresented by, review in depth,” is particularly strange, with its implicit division of all artistic productions into the sheep and the goats. It also implies that it’s the obligation of the artist to provide a worthy subject for the major critic, when in fact it’s the other way around: as a critic, you prove yourself in large part through your ability to mine insight from the unlikeliest of sources.) Writing reviews on a daily or weekly basis, especially when you have a limited amount of time to absorb the work itself, lends itself inevitably to shortcuts, and you often find yourself falling back on the same stock phrases and judgments. And Adler’s warning about “dealing in superlatives” seems altogether prescient. As Keith Phipps and Tasha Robinson of The A.V. Club pointed out a few years back, the need to stand out in an ocean of competing coverage means that every topic under consideration becomes either an epic fail or an epic win: a sensible middle ground doesn’t generate page views.
But the situation, at least from Adler’s point of view, is even more dire than when she wrote this essay in the early eighties. When Adler’s takedown of Kael first appeared, the most threatening form of critical dilution lay in weekly movie reviews: today, we’re living in a media environment in which every episode of every television show gets thousands of words of critical analysis from multiple pop culture sites. (Adler writes: “Television, in this respect, is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which is only a measure of how much the way we think and talk about the medium has changed over the intervening three decades.) The conditions that Adler identifies as necessary for the creation of a major critic like Edmund Wilson or Harold Rosenberg—time, the ability to choose one’s subjects, and the freedom to quit when necessary—have all but disappeared for most writers hoping to make a mark, or even just a living. To borrow a trendy phrase, we’ve reached a point of peak content, with a torrent of verbiage being churned out at an unsustainable pace without the advertising dollars to support it, in a situation that can be maintained only by the seemingly endless supply of aspiring writers willing to be chewed up by the machine. And if Adler thought that even a monthly reviewing schedule was deadly for serious criticism, I’d be curious to hear how she feels about the online apprenticeship that all young writers seem expected to undergo these days.
Still, I’d like to think that Adler got it wrong, just as I believe that she was ultimately mistaken about Kael, whose legacy, for all its flaws, still endures. (It’s revealing to note that Adler had a long, distinguished career as a writer and critic herself, and yet she almost certainly remains best known among casual readers for her Kael review.) Not every lengthy writeup of the latest episode of Riverdale is going to stand the test of time, but as a crucible for forming a critic’s judgment, this daily grind feels like a necessary component, even if it isn’t the only one. A critic needs time and leisure to think about major works of art, which is a situation that the current media landscape doesn’t seem prepared to offer. But the ability to form quick judgments about works of widely varying quality and to express them fluently on deadline is an indispensable part of any critic’s toolbox. When taken as an end in itself, it can be deadening, as Adler notes, but it can also be the foundation for something more, even if it has to be undertaken outside of—or despite—the critic’s day job. The critic’s responsibility, now more than ever, isn’t to detach entirely from the relentless pace of pop culture, but to find ways of channeling it into something deeper than the instantaneous think piece or hot take. As a daily blogger who also undertakes projects that can last for months or years, I’m constantly mindful of the relationship between my work on demand and my larger ambitions. And I sure hope that the two halves can work together. Because, like it or not, every critic is walking that path already.
Last week, The A.V. Club ran an entire article devoted to television shows in which the lead is also the best character, which only points to how boring many protagonists tend to be. I’ve learned to chalk this up to two factors, one internal, the other external. The internal problem stems from the reasonable principle that the narrative and the hero’s objectives should be inseparable: the conflict should emerge from something that the protagonist urgently needs to accomplish, and when the goal has been met—or spectacularly thwarted—the story is over. It’s great advice, but in practice, it often results in leads who are boringly single-minded: when every action needs to advance the plot, there isn’t much room for the digressions and quirks that bring characters to life. The supporting cast has room to go off on tangents, but the characters at the center have to constantly triangulate between action, motivation, and relatability, which can drain them of all surprise. A protagonist is under so much narrative pressure that when the story relaxes, he bursts, like a sea creature brought up from its crevasse to the surface. Elsewhere, I’ve compared a main character to a diagram of a pattern of forces, like one of the fish in D’Arcy Wentworth Thompson’s On Growth and Form, in which the animal’s physical shape is determined by the outside stresses to which it has been subjected. And on top of this, there’s an external factor, which is the universal desire of editors, producers, and studio executives to make the protagonist “likable,” which, whether or not you agree with it, tends to smooth out the rough edges that make a character vivid and memorable.
In the classic textbook Disney Animation: The Illusion of Life, we find a useful perspective on this problem. The legendary animators Frank Thomas and Ollie Johnston provide a list of guidelines for evaluating story material before the animation begins, including the following:
Tell your story through the broad cartoon characters rather than the “straight” ones. There is no way to animate strong-enough attitudes, feelings, or expressions on realistic characters to get the communication you should have. The more real, the less latitude for clear communication. This is more easily done with the cartoon characters who can carry the story with more interest and spirit anyway. Snow White was told through the animals, the dwarfs, and the witch—not through the prince or the queen or the huntsman. They had vital roles, but their scenes were essentially situation. The girl herself was a real problem, but she was helped by always working to a sympathetic animal or a broad character. This is the old vaudeville trick of playing the pretty girl against the buffoon; it helps both characters.
Even more than Snow White, the great example here is Sleeping Beauty, which has always fascinated me as an attempt by Disney to recapture past glories by a mechanical application of its old principles raised to dazzling technical heights. Not only do Aurora and Prince Philip fail to drive the story, but they’re all but abandoned by it—Aurora speaks fewer lines than any other Disney main character, and neither of them talks for the last thirty minutes. Not only does the film acknowledge the dullness of its protagonists, but it practically turns it into an artistic statement in itself.
And it arises from a tension between the nature of animation, which is naturally drawn to caricature, and the notion that sympathetic protagonists need to be basically realistic. With regard to the first point, Thomas and Johnston advise:
Ask yourself, “Can the story point be done in caricature?” Be sure the scenes call for action, or acting that can be caricatured if you are to make a clear statement. Just to imitate nature, illustrate reality, or duplicate live action not only wastes the medium but puts an enormous burden on the animator. It should be believable, but not realistic.
The italics are mine. This is a good rule, but it collides headlong with the principle that the “real” characters should be rendered with greater naturalism:
Of course, there is always a big problem in making the “real” or “straight” characters in our pictures have enough personality to carry their part of the story…The point of this is misinterpreted by many to mean that characters who have to be represented as real should be left out of feature films, that the stories should be told with broad characters who can be handled more easily. This would be a mistake, for spectators need to have someone or something they can believe in, or the picture falls apart.
And while you could make a strong case that viewers relate just as much to the sidekicks, it’s probably also true that a realistic central character serves an important functional role, which allows the audience to take the story seriously. This doesn’t just apply to animation, either, but to all forms of storytelling—including most fiction, film, and television—that work best with broad strokes. In many cases, you can sense the reluctance of animators to tackle characters who don’t lend themselves to such bold gestures:
Early in the story development, these questions will be asked: “Does this character have to be straight?” “What is the role we need here?” If it is a prince or a hero or a sympathetic person who needs acceptance from the audience to make the story work, then the character must be drawn realistically.
Figuring out the protagonists is a thankless job: they have to serve a function within the overall story, but they’re also liable to be taken out and judged on their own merits, in the absence of the narrative pressures that created them in the first place. The best stories, it seems, are the ones in which that pattern of forces results in something fascinating in its own right, or which transform a stock character into something more. (It’s revealing that Thomas and Johnston refer to the queen and the witch in Snow White as separate figures, when they’re really a single person who evolves over the course of the story into her true form.) And their concluding advice is worth bearing in mind by everyone: “Generally speaking, if there is a human character in a story, it is wise to draw the person with as much caricature as the role will permit.”
I first saw Brian De Palma’s Raising Cain when I was fourteen years old. In a weird way, it amounted to a peak moment of my early adolescence: I was on a school trip to our nation’s capital, sharing a hotel room with my friends from middle school, and we were just tickled to get away with watching an R-rated movie on cable. The fact that we ended up with Raising Cain doesn’t quite compare with the kids on The Simpsons cheering at the chance to see Barton Fink, but it isn’t too far off. I think that we liked it, and while I won’t claim that we understood it, that doesn’t mean much of anything—it’s hard for me to imagine anybody, of any age, entirely understanding this movie, which includes both me and De Palma himself. A few years later, I caught it again on television, and while I can’t say I’ve thought about it much since, I never forgot it. Gradually, I began to catch up on my De Palma, going mostly by whatever movies made Pauline Kael the most ecstatic at the time, which in itself was an education in the gap between a great critic’s pet enthusiasms and what exists on the screen. (In her review of The Fury, Kael wrote: “No Hitchcock thriller was ever so intense, went so far, or had so many ‘classic’ sequences.” I love Kael, but there are at least three things wrong with that sentence.) And ultimately De Palma came to mean a lot to me, as he does to just about anyone who responds to the movies in a certain way.
When I heard about the recut version of Raising Cain—in an interview with John Lithgow on The A.V. Club, no less, in which he was promoting his somewhat different role on The Crown—I was intrigued. And its backstory is particularly interesting. Shortly before the movie was first released, De Palma moved a crucial sequence from the beginning to the middle, eliminating an extended flashback and allowing the film to play more or less chronologically. He came to regret the change, but it was too late to do anything about it. Years later, a freelance director and editor named Peet Gelderblom read about the original cut and decided to restore it, performing a judicious edit on a digital copy. He put it online, where, unbelievably, it was seen by De Palma himself, who not only loved it but asked that it be included as a special feature on the new Blu-ray release. If nothing else, it’s a reminder of the true possibilities of fan edits, which have served mostly for competing visions of the ideal version of Star Wars. With modern software, a fan can do for a movie what Walter Murch did for Touch of Evil, restoring it to the director’s original version based on a script or a verbal description. In the case of Raising Cain, this mostly just involved rearranging the pieces in the theatrical cut, but other fans have tackled such challenges as restoring all the deleted scenes in Twin Peaks: Fire Walk With Me, and there are countless other candidates.
Yet Raising Cain might be the most instructive case study of all, because simply restoring the original opening to its intended place results in a radical transformation. It isn’t for everyone, and it’s necessary to grant De Palma his usual passes for clunky dialogue and characterization, but if you’re ready to meet it halfway, you’re rewarded with a thriller that twists back on itself like a Möbius strip. De Palma plunders his earlier movies so blatantly that it isn’t clear if he’s somehow paying loving homage to himself—bypassing Hitchcock entirely—or recycling good ideas that he feels like using again. The recut opens with a long mislead that recalls Dressed to Kill, which means that Lithgow barely even appears for the first twenty minutes. You can almost see why De Palma chickened out for the theatrical version: Lithgow’s performance as the meek Carter and his psychotic imaginary brother Cain feels too juicy to withhold. But the logic of the script was destroyed. For a film that tests an audience’s suspension of disbelief in so many other ways, it’s unclear why De Palma thought that a flashback would be too much for the viewer to handle. The theatrical release preserves all the great shock effects that are the movie’s primary reason for existing, but they don’t build to anything, and you’re left with a film that plays like a series of sketches. With the original order restored, it becomes what it was meant to be all along: a great shaggy dog story with a killer punchline.
Raising Cain is gleefully about nothing but itself, and I wouldn’t force anybody to watch it who wasn’t already interested. But the recut also serves as an excellent introduction to its director, just as the older version did for me: when I first encountered it, I doubt I’d seen anything by De Palma, except maybe The Untouchables, and Mission: Impossible was still a year away. It’s safe to say that if you like Raising Cain, you’ll like De Palma in general, and if you can’t get past its archness, campiness, and indifference to basic plausibility—well, I can hardly blame you. Watching it again, I was reminded of Blue Velvet, a far greater movie that presents the viewer with a similar test. It has the same mixture of naïveté and incredible technical virtuosity, with scenes that barely seem to have been written alternating with ones that push against the boundaries of the medium itself. You’re never quite sure if the director is in on the gag, and maybe it doesn’t matter. There isn’t much beauty in Raising Cain, and De Palma is a hackier and more mechanical director than Lynch, but both are so strongly visual that the nonsensory aspects of their films, like the obligatory scenes with the cops, seem to wither before our eyes. (It’s an approach that requires a kind of raw, intuitive trust from the cast, and as much as I enjoy what Lithgow does here, he may be too clever and resourceful an actor to really disappear into the role.) Both are rooted, crucially, in Hitchcock, who was equally obsessive, but was careful to never work from his own script. Hitchcock kept his secret self hidden, while De Palma puts it in plain sight. And if it turns out to be nothing at all, that’s probably part of the joke.
Note: Major spoilers follow for the most recent episode of Westworld.
Shortly before the final scene of “Trompe L’Oeil,” it occurred to me that Westworld, after a strong start, was beginning to coast a little. Like any ensemble drama on a premium cable channel, it’s a machine with a lot of moving parts, so it can be hard to pin down any specific source of trouble. But it appears to be a combination of factors. The plot thread centering on Dolores, which I’ve previously identified as the engine that drives the entire series, has entered something of a holding pattern—presumably because the show is saving its best material for closer to the finale. (I was skeptical of the multiple timelines theory at first, but I’m reluctantly coming around to it.) The introduction of Delos, the corporation that owns the park, as an active participant in the story is a decision that probably looked good on paper, but it doesn’t quite work. So far, the series has given us what amounts to a closed ecosystem, with a cast of characters that consists entirely of the hosts, the employees, and a handful of guests. At this stage, bringing in a broadly villainous executive from corporate headquarters comes precariously close to a gimmick: it would have been more interesting to have the conflict arise from someone we’d already gotten to know in a more nuanced way. Finally, it’s possible that the events of the last week have made me more sensitive to the tendency of the series to fall back on images of violence against women to drive the story forward. I don’t know how those scenes would have played earlier, but they sure don’t play for me now.
And then we get the twist that a lot of viewers, including me, had suspected might be coming: Bernard is a robot. Taken on its own, the revelation is smartly handled, and there are a lot of clever touches. In a scene at the beginning between Bernard and Hector, the episode establishes that the robots simply can’t process details that conflict with their programming, and this pays off nicely at the end, when Bernard doesn’t see the door that leads into Dr. Ford’s secret lab. A minute later, when Theresa hands him the schematics that show his own face, Bernard says: “It doesn’t look like anything to me.” (This raises an enticing possibility for future reveals, in which scenes from previous episodes that were staged from Bernard’s point of view are shown to have elements that we didn’t see at the time, because Bernard couldn’t. I don’t know if the show will take that approach, but it should—it’s nothing less than an improvement on the structural mislead in The Sixth Sense, and it would be a shame not to use it.) Yet the climactic moment, in which Dr. Ford calmly orders Bernard to murder Theresa, doesn’t land as well as it could have. It should have felt like a shocking betrayal, but the groundwork wasn’t quite there: Bernard and Theresa’s affair was treated very casually, and by the time we get to their defining encounter, whatever affection they had for each other is long gone. From the point of view of the overall plot, this arguably makes sense. But it also drains some of the horror from a payoff that the show must have known was coming. If we imagine Elsie as the victim instead, we can glimpse what the scene might have been.
Yet I’m not entirely sure this wasn’t intentional. Westworld is a cerebral, even clinical show, and it doesn’t seem to take pleasure in action or visceral climaxes for their own sake. Part of this probably reflects the temperament of its creators, but it also feels like an attempt by the show to position itself in a challenging time for this kind of storytelling. It’s a serialized drama that delivers new installments each week, but these days, such shows are just as likely to drop all ten episodes at once. This was obviously never an option for a show on HBO, but the weekly format creates real problems for a show that seems determined to set up twists that are more considered and logical than the usual shock deaths. To its credit, the show has played fair with viewers, and the clues to Bernard’s true nature were laid in with care. (If I noticed them, it was only because I was looking: I asked myself, working from first principles, what kind of surprise a show like this would be likely to spring, and the revelation that one of the staff members was actually a host seemed like a strong contender.) When a full week of online discussion and speculation falls between each episode, it becomes harder to deliver such surprises. Even if the multiple timeline theory doesn’t turn out to be correct, its very existence indicates the amount of energy, ingenuity, and obsessive analysis that the audience is willing to devote to it. As a result, the show’s emotional detachment comes off as a preemptive defense mechanism. It downplays the big twists, as if to tell us that it isn’t the surprises that count, but their implications.
In the case of Bernard, I’m willing to take that leap, if only because the character is in the hands of Jeffrey Wright, who is more qualified than any other actor alive to work through the repercussions. It’s a casting choice that speaks a lot, in itself, to the show’s intelligence. (In an interview with The A.V. Club, Wright has revealed that he didn’t know that Bernard was a robot when he shot the pilot, and that his own theory was that Dr. Ford was a creation of Bernard’s, which would have been even more interesting.) The revelation effectively reveals Bernard to have been the show’s secret protagonist all along, which is where he belongs, and it occurs at just about the right point in the season for it to resonate: we’ve still got three episodes to go, which gives the show room, refreshingly, to deal with the consequences, rather than rushing past them to the finale. Whether it can do the same with whatever else it has up its sleeve, including the possibility of multiple timelines, remains to be seen. But even though I’ve been slightly underwhelmed by the last two episodes, I’m still excited to see how it plays its hand. Even as Westworld unfolds from one week to the next, it clearly sees the season as a single continuous story, and the qualities that I’ve found unsatisfying in the moment—the lulls, the lack of connection between the various plot threads, the sense that it’s holding back for the climax—are those that I hope will pay off the most in the end. Like its robots, the series is built with a bicameral mind, with the logic of the whole whispering its instructions to the present. And more than any show since Mad Men, it seems to have its eye on the long game.
My short story “Ernesto,” which originally appeared in the March 2012 issue of Analog Science Fiction and Fact, has just been reprinted by Lightspeed. To celebrate its reappearance, I’ll be publishing revised versions of a few posts in which I described the origins of this story, which you can read for free here, along with a nice interview.
In an excellent interview from a few years ago with The A.V. Club, the director Steven Soderbergh spoke about the disproportionately large impact that small changes can have on a film: “Two frames can be the difference between something that works and something that doesn’t. It’s fascinating.” The playwright and screenwriter Jez Butterworth once made a similar point, noting that the gap between “nearly” and “really” in a photograph—or a script—can come down to a single frame. The same principle holds just as true, if not more so, for fiction. A cut, a new sentence, or a tiny clarification can turn a decent but unpublishable story into one that sells. These changes are often so invisible that the author himself would have trouble finding them after the fact, but their overall effect can’t be denied. And I’ve learned this lesson more than once in my life, perhaps most vividly with “Ernesto,” a story that I thought was finished, but which turned out to have a few more surprises in store.
When I was done with “Ernesto,” I sent it to Stanley Schmidt at Analog, who had just purchased my novelette “The Last Resort.” Stan’s response, which I still have somewhere in my files, was that the story didn’t quite grab him enough to find room for it in a rather crowded schedule, but that he’d hold onto it, just in case, while I sent it around to other publications. It wasn’t a rejection, exactly, but it was hardly an acceptance. (Having just gone through three decades of John W. Campbell’s correspondence, I now know that this kind of response is fairly common when a magazine is overstocked.) I dutifully sent it around to most of the usual suspects at the time: Asimov’s, Fantasy & Science Fiction, and the online magazines Clarkesworld and Intergalactic Medicine Show. Some had a few kind words for the story, but they all ultimately passed. At that point, I concluded that “Ernesto” just wasn’t publishable. This was hardly the end of the world—it had only taken two weeks to write—but it was an unfortunate outcome for a story that I thought was still pretty clever.
A few months later, I saw a call for submissions for an independent paperback anthology, the kind that pays its contributors in author’s copies, and its theme—science fiction stories about monks—seemed to fit “Ernesto” fairly well. The one catch was that the maximum length for submissions was 6,000 words, while “Ernesto” weighed in at over 7,500. Cutting twenty percent of a story that was already highly compressed, at least to my eyes, was no joke, but I figured that I’d give it a try. Over the course of a couple of days, then, I cut it to the bone, removing scenes and extra material wherever I could. Since almost a year had passed since I’d first written it, it was easy to see what was and wasn’t necessary. More significantly, I added an epigraph, from Ernest Hemingway’s interview with The Paris Review, that made it clear from the start that the main character was Hemingway, which wasn’t the case with the earlier draft. And the result read a lot more smoothly than the version I’d sent out before.
It might have ended there, with “Ernesto” appearing without fanfare in an unpaid anthology, but as luck would have it, Analog had just accepted a revised version of my novelette “The Boneless One,” which had also been rejected by a bunch of magazines in its earlier form. Encouraged by this, I thought I’d try the same thing with “Ernesto.” So I sent it to Analog again, and it was accepted, almost twelve months after my first submission. Now it’s being reprinted more than four years later by Lightspeed, a magazine that didn’t even exist when I first wrote it. The moral, I guess, is that if a story has been turned down by five of the top magazines in your field, it probably isn’t good enough to be published—but that doesn’t mean it can’t get better. In this case, my rule of spending two weeks on a short story ended up being not quite correct: I wrote the story in two weeks, shopped it around for a year, and then spent two more days on it. And those last two days, like Soderbergh’s two frames, were what made all the difference.
Earlier this week, The A.V. Club, which is still the pop culture website at which I spend the vast majority of my online life, announced a new food section called “Supper Club.” It’s helmed by the James Beard Award-winning food critic and journalist Kevin Pang, a talented writer and documentarian whose work I’ve admired for years. On Wednesday, alongside the site’s usual television and movie coverage, seemingly half the homepage was devoted to features like “America’s ten tastiest fast foods,” followed a day later by “All of Dairy Queen’s Blizzards, ranked.” And the reaction from the community was—not good. Pang’s introductory post quickly drew over a thousand comments, with the most upvoted response reading:
I’ll save you about six months of pissed-away cash. Please reallocate the money that will be wasted on this venture to add more shows to the TV Club review section.
Most of the other food features received the same treatment, with commenters ignoring the content of the articles themselves and complaining about the new section on principle. Internet commenters, it must be said, are notoriously resistant to change, and the most vocal segment of the community represents a tiny fraction of the overall readership of The A.V. Club. But I think it’s fair to say that the site’s editors can’t be entirely happy with how the launch has gone.
Yet the readers aren’t altogether wrong, either, and in retrospect, you could make a good case that the rollout should have been handled differently. The A.V. Club has gone through a rough couple of years, with many of its most recognizable writers leaving to start the movie site The Dissolve—which recently folded—even as its signature television coverage has been scaled back. Those detailed reviews of individual episodes might be popular with commenters, but they evidently don’t generate enough page views to justify the same degree of investment, and the site is looking at ways to stabilize its revenue at a challenging time for the entire industry. The community is obviously worried about this, and Supper Club happened to appear at a moment when the commenters were likely to be skeptical about any new move, as if it were all a zero-sum game, which it isn’t. But the launch itself didn’t help matters. It makes sense to start an enterprise like this with a lot of articles on its first day, but taking over half the site with minimal advance warning lost it a lot of goodwill. Pang could also have been introduced more gradually: he’s a celebrity in foodie circles, but to most A.V. Club readers, he’s just a name. (It was also probably a miscalculation to have Pang write the introductory post himself, which placed him in the awkward position of having to drum up interest in his own work for an audience that didn’t know who he was.) And while I’ve enjoyed some of the content so far, and I understand the desire to keep the features lightweight and accessible, I don’t think the site has done itself any favors by leading with articles like “Do we eat soup or do we drink soup?”
This might seem like a lot of analysis for a kerfuffle that will be forgotten within a few weeks, no matter how Supper Club does in the meantime. But The A.V. Club has been a landmark site for pop culture coverage for the last decade, and its efforts to reinvent itself should concern anyone who cares about whether such venues can survive. I found myself thinking about this shortly after reading the excellent New Yorker profile of Pete Wells, the restaurant critic of the New York Times. Its author, Ian Parker, notes that modern food writing has become a subset of cultural criticism:
“A lot of reviews now tend to be food features,” [former Times restaurant critic Mimi Sheraton] said. She recalled a reference to Martin Amis in a Wells review of a Spanish restaurant in Brooklyn; she said she would have mentioned Amis only “if he came in and sat down and ordered chopped liver.”
Craig Claiborne, in a review from 1966, observed, “The lobster tart was palatable but bland and the skewered lamb on the dry side. The mussels marinière were creditable.” Thanks, in part, to the informal and diverting columns of Gael Greene, at New York, and Ruth Reichl, the Times’ critic during the nineties, restaurant reviewing in American papers has since become as much a vehicle for cultural criticism and literary entertainment—or, as Sheraton put it, “gossip”—as a guide to eating out.
If this is true, and I think it is, it means that food criticism, for better or worse, falls squarely within the mandate of The A.V. Club, whether its commenters like it or not.
But that doesn’t mean that we shouldn’t hold The A.V. Club to unreasonably high standards. In fact, we should be harder on it than we would on most sites, for reasons that Parker neatly outlines in his profile of Wells:
As Wells has come to see it, a disastrous restaurant is newsworthy only if it has a pedigree or commercial might. The mom-and-pop catastrophe can be overlooked. “I shouldn’t be having to explain to people what the place is,” he said. This reasoning seems civil, though, as Wells acknowledged, it means that his pans focus disproportionately on restaurants that have corporate siblings. Indeed, hype is often his direct or indirect subject. Of the fifteen no-star evaluations in his first four years, only two went to restaurants that weren’t part of a group of restaurants.
Parker continues: “There are restaurants that exist to have four Times stars. With fewer, they become a kind of paradox.” And when it comes to pop culture, The A.V. Club is the equivalent of a four-star restaurant. It was writing deeply felt, outrageously long essays on film and television before the longread was even a thing—in part, I suspect, because of its historical connection to The Onion: because it was often mistaken for a parody site, it always felt the need to prove its fundamental seriousness, which it did, over and over again. If Supper Club had launched with one of the ambitious, richly reported pieces that Pang has written elsewhere, the response might have been very different. Listicles might make more economic sense, and they can be fun if done right, but The A.V. Club has defined itself as a place where obsessively detailed and personal pop culture writing has a home. That’s what Supper Club should be. And until it is, we shouldn’t be surprised if readers have trouble swallowing it.
“It’s the rare writer who cannot have sentences lifted from his work,” Norman Mailer once wrote. What he meant is that if a reviewer is eager to find something to mock, dismiss, or pick apart, any interesting book will provide plenty of ammunition. On a simple level of craft, it’s hard for most authors to sustain a high pitch of technical proficiency in every line, and if you want to make a novelist seem boring or ordinary, you can just focus on the sentences that fall between the high points. In his famously savage takedown of Thomas Harris’s Hannibal, Martin Amis quotes another reviewer who raved: “There is not a single ugly or dead sentence.” Amis then acidly observes:
Hannibal is a genre novel, and all genre novels contain dead sentences—unless you feel the throb of life in such periods as “Tommaso put the lid back on the cooler” or “Eric Pickford answered” or “Pazzi worked like a man possessed” or “Margot laughed in spite of herself” or “Bob Sneed broke the silence.”
Amis knows that this is a cheap shot, and he glories in it. But it isn’t so different from what critics do when they list the awful sentences from a current bestseller or nominate lines for the Bad Sex in Fiction Award. I laugh at this along with anyone else, but I also wince a little, because there are few authors alive who aren’t vulnerable to that sort of treatment. As G.K. Chesterton pointed out: “You could compile the worst book in the world entirely out of selected passages from the best writers in the world.”
This is even more true of authors who take considerable stylistic or thematic risks, which usually result in individual sentences that seem crazy or, worse, silly. The fear of seeming ridiculous is what prevents a lot of writers from taking chances, and it isn’t always unjustified. An ambitious novel opens itself up to savaging from all sides, precisely because it provides so much material that can be turned against the author when taken out of context. And it doesn’t need to be malicious, either: even objective or actively sympathetic critics can be seduced by the ease with which a writer can be excerpted to make a case. I’ve become increasingly daunted by the prospect of distilling the work of Robert A. Heinlein, for example, because his career was so long, varied, and often intentionally provocative that you can find sentences to support any argument about him that you want to make. (It doesn’t help that his politics evolved drastically over time, and they probably would have undergone several more transformations if he had lived for longer.) This isn’t to say that his opinions aren’t a fair target for criticism, but any reasonable understanding of who Heinlein was and what he believed—which I’m still trying to sort out for myself—can’t be conveyed by a handful of cherry-picked quotations. Literary biography is useful primarily to the extent that it can lay out a writer’s life in an orderly fashion, providing a frame that tells us something about the work that we wouldn’t know by encountering it out of order. But even that involves a process of selection, as does everything else about a biography. The biographer’s project isn’t essentially different from that of a working critic or reviewer: it just takes place on a larger scale.
And it’s worth noting that prolific critics themselves are particularly susceptible to this kind of treatment. When Renata Adler described Pauline Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless,” any devotee of Kael’s work had to disagree—but it was also impossible to deny that there was plenty of evidence for the prosecution. If you’re determined to hate Roger Ebert, you just have to search for the reviews in which his opinions, written on deadline, weren’t sufficiently in line with the conclusions reached by posterity, as when he unforgivably gave only three stars to The Godfather Part II. And there isn’t a single page in the work of David Thomson, who is probably the most interesting movie critic who ever lived, that couldn’t be mined for outrageous, idiotic, or infuriating statements. I still remember a review on The A.V. Club of How to Watch a Movie that quoted lines like this:
Tell me a story, we beg as children, while wanting so many other things. Story will put off sleep (or extinction) and the child’s organism hardly trusts the habit of waking yet.
You came into this book under deceptive promises (mine) and false hopes (yours). You believed we might make decisive progress in the matter of how to watch a movie. So be it, but this was a ruse to make you look at life.
The reviewer quoted these sentences as examples of the book’s deficiencies, and they were duly excoriated in the comments. But anyone who has really read Thomson knows that such statements are part of the package, and removing them would also deny most of what makes him so fun, perverse, and valuable.
So what’s a responsible reviewer to do? We could start, maybe, by quoting longer or complete sections, rather than sentences in isolation, and by providing more context when we offer up just a line or two. We can also respect an author’s feelings, explicit or otherwise, about what sections are actually important. In the passage I mentioned at the beginning of this post, which is about John Updike, Mailer goes on to quote a few sentences from Rabbit, Run, and he adds:
The first quotation is taken from the first five sentences of the book, the second is on the next-to-last page, and the third is nothing less than the last three sentences of the novel. The beginning and end of a novel are usually worked over. They are the index to taste in the writer.
That’s a pretty good rule, and it ensures that the critic is discussing something reasonably close to what the writer intended to say. Best of all, we can approach the problem of excerpting with a kind of joy in the hunt: the search for the slice of a work that will stand as a synecdoche of the whole. In the book U & I, which is also about Updike, Nicholson Baker writes about the “standardized ID phrase” and “the aphoristic consensus” and “the jingle we will have to fight past at some point in the future” to see a writer clearly again, just as fans of Joyce have to do their best to forget about “the ineluctable modality of the visible” and “yes I said yes I will Yes.” For a living author, that repository of familiar quotations is constantly in flux, and reviewers might approach their work with a greater sense of responsibility if they realized that they were playing a part in creating it—one tiny excerpt at a time.