Posts Tagged ‘Renata Adler’
Quote of the Day
[Criticism] used to be one way a young writer made it in New York. He would attack, in a small obscure publication, someone very strong, highly regarded, whom a few people may already have hated. Then the young writer might gain a small following. When he looked for a job, an assignment, and an editor asked, “What have you published?” he could reply, “Well, this piece.” The editor might say, “Oh, yeah, that was met with a lot of consternation.” And a portfolio began. This isn’t the way it goes now. More like a race to join the herd of received ideas and agreement.
—Renata Adler, to The Believer
The driver and the signalman
In his landmark book Design With Nature, the landscape architect Ian L. McHarg shares an anecdote from the work of an English biologist named George Scott Williamson. McHarg, who describes Williamson as “a remarkable man,” mentions him in passing in a discussion of the social aspects of health: “He believed that physical, mental, and social health were unified attributes and that there were aspects of the physical and social environment that were their corollaries.” Before diving more deeply into the subject, however, McHarg offers up an apparently unrelated story that was evidently too interesting to resist:
One of the most endearing stories of this man concerns a discovery made when he was undertaking a study of the signalmen who maintain lonely vigils while operating the switches on British railroads. The question to be studied was whether these lonely custodians were subject to boredom, which would diminish their dependability. It transpired that lonely or not, underpaid or not, these men had a strong sense of responsibility and were entirely dependable. But this was not the major perception. Williamson learned that every single signalman, from London to Glasgow, could identify infallibly the drivers of the great express trains which flashed past their vision at one hundred miles per hour. The drivers were able to express their unique personalities through the unlikely and intractable medium of some thousand tons of moving train, passing in a fraction of a second. The signalmen were perceptive to this momentary expression of the individual, and Williamson perceived the power of the personality.
I hadn’t heard of Williamson before reading this wonderful passage, and all that I know about him is that he was the founder of the Peckham Experiment, an attempt to provide inexpensive health and recreation services to a neighborhood in Southeast London. The story of the signalmen seems to make its first appearance in his book Science, Synthesis, and Sanity: An Inquiry Into the Nature of Living, which he cowrote with his wife and collaborator Innes Hope Pearse. They relate:
Or again, sitting in a railway signal box on a dark night, in the far distance from several miles away came the rumble of the express train from London. “Hallo,” said my friend the signalman. “Forsyth’s driving her—wonder what’s happened to Courtney?” Next morning, on inquiry of the stationmaster at the junction, I found it was true. Courtney had been taken ill suddenly and Forsyth had deputized for him—all unknown, of course, to the signalman who in any case had met neither Forsyth nor Courtney. He knew them only as names on paper and by their “action-pattern” impressed on a dynamic medium—a unique action-pattern transmitted through the rumble of an unseen train. Or, in a listening post with nothing visible in the sky, said the listener: “That’s ‘Lizzie,’ and Crompton’s flying her.” “Lizzie” an airplane, and her pilot imprinting his action-pattern on her course.
And while Williamson and Pearse are mostly interested in the idea of an individual’s “action-pattern” being visible in an unlikely medium, it’s hard not to come away more struck, like McHarg, by the image of the lone signalman, the passing machine, and the transient moment of connection between them.
As I read over this, it occurred to me that it perfectly encapsulated our relationship with a certain kind of pop culture. We’re the signalmen, and the movie or television show is the train. As we sit in our living rooms, lonely and relatively isolated, something passes across our field of vision—an episode of Game of Thrones, say, which often feels like a locomotive to the face. This is the first time that we’ve seen it, but it represents the end result of a process that has unfolded for months or years, as the episode was written, shot, edited, scored, and mixed, with the contributions of hundreds of men and women we wouldn’t be able to name. As we experience it, however, we see the glimmer of another human being’s personality, as expressed through the narrative machine. It isn’t just a matter of the visible choices made on the screen, but of something less definable, a “style” or “voice” or “attitude,” behind which, we think, we can make out the amorphous factors of influence and intent. We identify an artist’s obsessions, hangups, and favorite tricks, and we believe that we can recognize the mark of a distinctive style even when it goes uncredited. Sometimes we have a hunch about what happened on the set that day, or the confluence of studio politics that led to a particular decision, even if we have no way of knowing it firsthand. (This was one of the tics of Pauline Kael’s movie reviews that irritated Renata Adler: “There was also, in relation to filmmaking itself, an increasingly strident knowingness: whatever else you may think about her work, each column seemed more hectoringly to claim, she certainly does know about movies. And often, when the point appeared most knowing, it was factually false.”) We may never know the truth, but it’s enough if a theory seems plausible. And the primary difference between us and the railway signalman is that we can share our observations with everyone in sight.
I’m not saying that these inferences are necessarily incorrect, any more than the signalmen were wrong when they recognized the personal styles of particular drivers. If Williamson’s account is accurate, they were often right. But it’s worth emphasizing that the idea that you can recognize a driver from the passage of a train is no less strange than the notion that we can know something about, say, Christopher Nolan’s personality from Dunkirk. Both are “unlikely and intractable” mediums that serve as force multipliers for individual ability, and in the case of a television show or movie, there are countless unseen variables that complicate our efforts to attribute anything to anyone, much less pick apart the motivations behind specific details. The auteur theory in film represents an attempt to read movies like novels, but as Thomas Schatz pointed out decades ago in his book The Genius of the System, trying to read Casablanca as the handiwork of Michael Curtiz, rather than that of all of its collaborators taken together, is inherently problematic. And this is easy to forget. (I was reminded of this by the recent controversy over David Benioff and D.B. Weiss’s pitch for their Civil War alternate history series Confederate. I agree with the case against it that the critic Roxane Gay presents in her opinion piece for the New York Times, but the fact that we’re closely scrutinizing a few paragraphs for clues about the merits of a show that doesn’t even exist only hints at how fraught the conversation will be after it actually premieres.) There’s a place for informed critical discussion about any work of art, but we’re often drawing conclusions based on the momentary passage of a huge machine before our eyes, and we don’t know much about how it got there or what might be happening inside. Most of us aren’t even signalmen, who are a part of the system itself. We’re trainspotters.
The critical path
Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on February 16, 2016.
Every few years or so, I go back and revisit Renata Adler’s famous attack in the New York Review of Books on the reputation of the film critic Pauline Kael. As a lifelong Kael fan, I don’t agree with Adler—who describes Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless”—but I respect the essay’s fire and eloquence, and it’s still a great read. What is sometimes forgotten is that Adler opens with an assault, not on Kael alone, but on the entire enterprise of professional criticism itself. Here’s what she says:
The job of the regular daily, weekly, or even monthly critic resembles the work of the serious intermittent critic, who writes only when he is asked to or genuinely moved to, in limited ways and for only a limited period of time…Normally, no art can support for long the play of a major intelligence, working flat out, on a quotidian basis. No serious critic can devote himself, frequently, exclusively, and indefinitely, to reviewing works most of which inevitably cannot bear, would even be misrepresented by, review in depth…
The simple truth—this is okay, this is not okay, this is vile, this resembles that, this is good indeed, this is unspeakable—is not a day’s work for a thinking adult. Some critics go shrill. Others go stale. A lot go simultaneously shrill and stale.
Adler concludes: “By far the most common tendency, however, is to stay put and simply to inflate, to pretend that each day’s text is after all a crisis—the most, first, best, worst, finest, meanest, deepest, etc.—to take on, since we are dealing in superlatives, one of the first, most unmistakable marks of the hack.” And I think that she has a point, even if I have to challenge a few of her assumptions. (The statement that most works of art “inevitably cannot bear, would even be misrepresented by, review in depth,” is particularly strange, with its implicit division of all artistic productions into the sheep and the goats. It also implies that it’s the obligation of the artist to provide a worthy subject for the major critic, when in fact it’s the other way around: as a critic, you prove yourself in large part through your ability to mine insight from the unlikeliest of sources.) Writing reviews on a daily or weekly basis, especially when you have a limited amount of time to absorb the work itself, lends itself inevitably to shortcuts, and you often find yourself falling back on the same stock phrases and judgments. And Adler’s warning about “dealing in superlatives” seems altogether prescient. As Keith Phipps and Tasha Robinson of The A.V. Club pointed out a few years back, the need to stand out in an ocean of competing coverage means that every topic under consideration becomes either an epic fail or an epic win: a sensible middle ground doesn’t generate page views.
But the situation, at least from Adler’s point of view, is even more dire than when she wrote this essay in the early eighties. When Adler’s takedown of Kael first appeared, the most threatening form of critical dilution lay in weekly movie reviews: today, we’re living in a media environment in which every episode of every television show gets thousands of words of critical analysis from multiple pop culture sites. (Adler writes: “Television, in this respect, is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which is only a measure of how much the way we think and talk about the medium has changed over the intervening three decades.) The conditions that Adler identifies as necessary for the creation of a major critic like Edmund Wilson or Harold Rosenberg—time, the ability to choose one’s subjects, and the freedom to quit when necessary—have all but disappeared for most writers hoping to make a mark, or even just a living. To borrow a trendy phrase, we’ve reached a point of peak content, with a torrent of verbiage being churned out at an unsustainable pace without the advertising dollars to support it, in a situation that can be maintained only by the seemingly endless supply of aspiring writers willing to be chewed up by the machine. And if Adler thought that even a monthly reviewing schedule was deadly for serious criticism, I’d be curious to hear how she feels about the online apprenticeship that all young writers seem expected to undergo these days.
Still, I’d like to think that Adler got it wrong, just as I believe that she was ultimately mistaken about Kael, whose legacy, for all its flaws, still endures. (It’s revealing to note that Adler had a long, distinguished career as a writer and critic herself, and yet she almost certainly remains best known among casual readers for her Kael review.) Not every lengthy writeup of the latest episode of Riverdale is going to stand the test of time, but as a crucible for forming a critic’s judgment, this daily grind feels like a necessary component, even if it isn’t the only one. A critic needs time and leisure to think about major works of art, which is a situation that the current media landscape doesn’t seem prepared to offer. But the ability to form quick judgments about works of widely varying quality and to express them fluently on deadline is an indispensable part of any critic’s toolbox. When taken as an end in itself, it can be deadening, as Adler notes, but it can also be the foundation for something more, even if it has to be undertaken outside of—or despite—the critic’s day job. The critic’s responsibility, now more than ever, isn’t to detach entirely from the relentless pace of pop culture, but to find ways of channeling it into something deeper than the instantaneous think piece or hot take. As a daily blogger who also undertakes projects that can last for months or years, I’m constantly mindful of the relationship between my work on demand and my larger ambitions. And I sure hope that the two halves can work together. Because, like it or not, every critic is walking that path already.
The excerpt opinion
“It’s the rare writer who cannot have sentences lifted from his work,” Norman Mailer once wrote. What he meant is that if a reviewer is eager to find something to mock, dismiss, or pick apart, any interesting book will provide plenty of ammunition. On a simple level of craft, it’s hard for most authors to sustain a high pitch of technical proficiency in every line, and if you want to make a novelist seem boring or ordinary, you can just focus on the sentences that fall between the high points. In his famously savage takedown of Thomas Harris’s Hannibal, Martin Amis quotes another reviewer who raved: “There is not a single ugly or dead sentence.” Amis then acidly observes:
Hannibal is a genre novel, and all genre novels contain dead sentences—unless you feel the throb of life in such periods as “Tommaso put the lid back on the cooler” or “Eric Pickford answered” or “Pazzi worked like a man possessed” or “Margot laughed in spite of herself” or “Bob Sneed broke the silence.”
Amis knows that this is a cheap shot, and he glories in it. But it isn’t so different from what critics do when they list the awful sentences from a current bestseller or nominate lines for the Bad Sex in Fiction Award. I laugh at this along with anyone else, but I also wince a little, because there are few authors alive who aren’t vulnerable to that sort of treatment. As G.K. Chesterton pointed out: “You could compile the worst book in the world entirely out of selected passages from the best writers in the world.”
This is even more true of authors who take considerable stylistic or thematic risks, which usually result in individual sentences that seem crazy or, worse, silly. The fear of seeming ridiculous is what prevents a lot of writers from taking chances, and it isn’t always unjustified. An ambitious novel opens itself up to savaging from all sides, precisely because it provides so much material that can be turned against the author when taken out of context. And it doesn’t need to be malicious, either: even objective or actively sympathetic critics can be seduced by the ease with which a writer can be excerpted to make a case. I’ve become increasingly daunted by the prospect of distilling the work of Robert A. Heinlein, for example, because his career was so long, varied, and often intentionally provocative that you can find sentences to support any argument about him that you want to make. (It doesn’t help that his politics evolved drastically over time, and they probably would have undergone several more transformations if he had lived for longer.) This isn’t to say that his opinions aren’t a fair target for criticism, but any reasonable understanding of who Heinlein was and what he believed—which I’m still trying to sort out for myself—can’t be conveyed by a handful of cherry-picked quotations. Literary biography is useful primarily to the extent that it can lay out a writer’s life in an orderly fashion, providing a frame that tells us something about the work that we wouldn’t know by encountering it out of order. But even that involves a process of selection, as does everything else about a biography. The biographer’s project isn’t essentially different from that of a working critic or reviewer: it just takes place on a larger scale.
And it’s worth noting that prolific critics themselves are particularly susceptible to this kind of treatment. When Renata Adler described Pauline Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless,” any devotee of Kael’s work had to disagree—but it was also impossible to deny that there was plenty of evidence for the prosecution. If you’re determined to hate Roger Ebert, you just have to search for the reviews in which his opinions, written on deadline, weren’t sufficiently in line with the conclusions reached by posterity, as when he unforgivably gave only three stars to The Godfather Part II. And there isn’t a single page in the work of David Thomson, who is probably the most interesting movie critic who ever lived, that couldn’t be mined for outrageous, idiotic, or infuriating statements. I still remember a review on The A.V. Club of How to Watch a Movie that quoted lines like this:
Tell me a story, we beg as children, while wanting so many other things. Story will put off sleep (or extinction) and the child’s organism hardly trusts the habit of waking yet.
And this:
You came into this book under deceptive promises (mine) and false hopes (yours). You believed we might make decisive progress in the matter of how to watch a movie. So be it, but this was a ruse to make you look at life.
The reviewer quoted these sentences as examples of the book’s deficiencies, and they were duly excoriated in the comments. But anyone who has really read Thomson knows that such statements are part of the package, and removing them would also deny most of what makes him so fun, perverse, and valuable.
So what’s a responsible reviewer to do? We could start, maybe, by quoting longer or complete sections, rather than sentences in isolation, and by providing more context when we offer up just a line or two. We can also respect an author’s feelings, explicit or otherwise, about what sections are actually important. In the passage I mentioned at the beginning of this post, which is about John Updike, Mailer goes on to quote a few sentences from Rabbit, Run, and he adds:
The first quotation is taken from the first five sentences of the book, the second is on the next-to-last page, and the third is nothing less than the last three sentences of the novel. The beginning and end of a novel are usually worked over. They are the index to taste in the writer.
That’s a pretty good rule, and it ensures that the critic is discussing something reasonably close to what the writer intended to say. Best of all, we can approach the problem of excerpting with a kind of joy in the hunt: the search for the slice of a work that will stand as a synecdoche of the whole. In the book U & I, which is also about Updike, Nicholson Baker writes about the “standardized ID phrase” and “the aphoristic consensus” and “the jingle we will have to fight past at some point in the future” to see a writer clearly again, just as fans of Joyce have to do their best to forget about “the ineluctable modality of the visible” and “yes I said yes I will Yes.” For a living author, that repository of familiar quotations is constantly in flux, and reviewers might approach their work with a greater sense of responsibility if they realized that they were playing a part in creating it—one tiny excerpt at a time.
Sofia’s world
When you’ve gotten into the habit of seeing television as a source of hot takes and think pieces, it can be hard to turn that mindset off. Consider the case of Sofia the First. Of all the shows that my daughter watches these days, it’s by far her favorite, and its easy availability through Netflix and Disney Junior means that we absorb three or four episodes on an average morning. (Like most parents, I do what I can to keep screen time under control, but it isn’t easy: we’re at the point where I can only talk her into brushing her teeth and putting on her pajamas with the promise of a Taylor Swift video.) Most of her shows tend to blur into background noise, largely because I’ve already been up since before sunrise, but I’ve ended up watching Sofia more closely. And I like it. It’s a show that benefits from having the full resources of the massive Disney studio mustered on its behalf: as the gateway into the princess franchise for an entire generation of toddlers, it’s a crucial piece of that machine, and you can tell that a lot of time, money, and effort have gone into making it as appealing a product as possible. The animation is great, the songs are cute, and the writing is reasonably sharp, even as it remains pitched squarely toward the kindergarten crowd. When I sit down to watch it, I have a good time.
But the strange thing is that I also find myself thinking about it at odd moments throughout the day. The premise, if you aren’t familiar with it, is spelled out with admirable efficiency in the show’s theme song: Sofia was a village girl whose mother married the king of Enchancia, making her a princess overnight and giving her a new royal brother and sister. She has a magic amulet that lets her talk to animals, and which occasionally summons a princess from another movie to give her advice, although their input isn’t always particularly useful. (When Aurora from Sleeping Beauty turns up, you have to wonder what she has to teach anyone about anything, and her only tip is for Sofia to listen to her animal friends.) Her world is populated by the usual sorcerers and magical creatures, including, delightfully, Tim Gunn, more or less playing himself. And if this all sounds routine, it’s executed at a consistently high level, with a light touch and just enough wit to make it all very charming. The writers are clearly having fun with the material. They aren’t afraid to let Sofia herself come off as prissy or smug, and Amber, her stepsister, has become a fan favorite for obvious reasons: she’s vain, spoiled, and self-centered, but she’s also the closest thing we have to an audience surrogate, and she’s often the only one who sees the underlying ridiculousness of the situations in which she finds herself.
Yet the fact that I’ve devoted this much thought to Sofia at all indicates how my feelings about television have changed. I don’t think it’s possible for me to watch a show casually anymore: everything has to fit into a larger picture, as if I’m pitching some imaginary article to Salon. My wife and I have debated class issues, or their absence, in the kingdom of Enchancia; unpacked the character arc of Cedric the Sorcerer; made fun of the general incompetence of King Roland; compared the series to the plot of The Royal We; and joked about writing a crossover with Game of Thrones. (Honestly, James shades into Joffrey so imperceptibly that it isn’t even funny.) But we’re also being sucked into the show on its own terms, even if we can’t simply enjoy it in the way my daughter does—we have to justify it to ourselves. We’re used to seeking out shows to talk about, rather than having them sneak up on us: sometimes it seems as if we watch most shows these days so that we won’t be left out of the conversation online, rather than the other way around. And if we talk about Sofia at length, it’s because we’ve been trained to talk about every show this way. Thanks to my daughter, we basically binge watch it every morning. And even after she’s gone to bed, there are times when I’m folding laundry or doing other chores around the living room and I have to almost physically restrain myself from putting on an episode.
Of course, there’s a reason I’m writing about Sofia the First here and not Strawberry Shortcake: I’ve learned to value quality wherever I find it, and the show is an excellent example of how a branding strategy can yield something like real storytelling, however slickly packaged and presented. But it also reminds me of something that I’ve lost. A few weeks ago, I wrote a blog post in which I referred to television as a reviewable appliance, generating a steady stream of content to fill the voracious demands of online critics and readers. After reading—and occasionally writing—so much of it, I find it harder to relate to shows purely as entertainment. (It may also have something to do with the fact that I’ve watched nothing but appointment television for the last decade or so: it’s been a long time since I’ve tuned into something simply because it was on.) Sofia might seem like the quintessential example of what Renata Adler called a work of art that “inevitably cannot bear, would even be misrepresented by, review in depth,” and although I doubt that this is what she meant, I do think that it deserves to be watched through a child’s eyes. And so do a lot of other shows. I might not gain much by seeing Sofia as my daughter would, but it might be healthier if I watched, say, Mad Men that way. As Sofia herself says in her theme song, there’s so much to learn and see. And I’ve got to figure out how to do it right.
The reviewable appliance
Last week, I quoted the critic Renata Adler, who wrote back in the early eighties: “Television…is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which only indicates how much has changed over the last thirty years, which have seen television not only validated as a reviewable medium, but transformed into maybe the single most widely reviewed art form in existence. Part of this is due to an increase in the quality of the shows themselves: by now, it’s a cliché to say that we’re living in a golden age of television, but that doesn’t make it any less true, to the point where there are almost too many great shows for any one viewer to absorb. As John Landgraf of FX said last year, in a quote that was widely shared in media circles, mostly because it expresses how many of us feel: “There is simply too much television.” There are something like four hundred original scripted series airing these days—which is remarkable in itself, given how often critics have tolled the death knell for scripted content in the face of reality programming—and many are good to excellent. If we’ve learned to respect television as a medium that rewards close scrutiny, it’s largely because there are more worthwhile shows than ever before, and many deserve to be unpacked at length.
There’s also a sense in which shows have consciously risen to that challenge, taking advantage of the fact that there are so many venues for reviews and discussion. I never felt that I’d truly watched an episode of Mad Men until I’d watched Matthew Weiner’s weekly commentary and read the writeup on The A.V. Club, and I suspect that Weiner felt enabled to go for that level of density because the tools for talking about it were there. (To take another example: Mad Style, the fantastic blog maintained by Tom and Lorenzo, came into being because of the incredible work of costume designer Janie Bryant, but Bryant herself seemed to make certain choices because she knew that they would be noticed and dissected.) The Simpsons is often called the first VCR show—it allowed itself to go for rapid freeze-frame jokes and sign gags because viewers could pause to catch every detail—but these days, we’re more likely to rely on recaps and screen grabs to process shows that are too rich to be fully grasped on a single viewing. I’m occasionally embarrassed when I click on a review and read about a piece of obvious symbolism that I missed the first time around, but you could also argue that I’ve outsourced that part of my brain to the hive mind, knowing that I can take advantage of countless other pairs of eyes.
But the fact that television inspires millions of words of coverage every day can’t be entirely separated from Adler’s description of it as an appliance. For reasons that don’t have anything to do with television itself, the cycle of pop culture coverage—like that of every form of news—has continued to accelerate, with readers expecting nonstop content on demand: I’ll refresh a site a dozen times a day to see what has been posted in the meantime. Under those circumstances, reviewers and their editors naturally need a regular stream of material to be discussed, and television fits the bill beautifully. There’s a lot of it, it generates fresh grist for the mill on a daily basis, and it has an existing audience that can be enticed into reading about their favorite shows online. (This just takes a model that had long been used for sports and applies it to entertainment: the idea that every episode of Pretty Little Liars deserves a full writeup isn’t that much more ridiculous than devoting a few hundred words to every baseball game.) One utility piggybacks on the other, and it results in a symbiotic relationship: the shows start to focus on generating social media chatter, which, if not exactly a replacement for ratings, at least becomes an argument for keeping marginal shows like Community alive. And before long, the show itself is on Hulu or Yahoo.
None of this is inherently good or bad, although I’m often irked by the pressure to provide instant hot takes about the latest twist on a hit series, with think pieces covering other think pieces until the snake has eaten its own tail. (The most recent example was the “death” of Glenn on The Walking Dead, a show I don’t even watch, but which I found impossible to escape for three weeks last November.) There’s also an uncomfortable sense in which a television show can become an adjunct to its own media coverage: I found reading about Game of Thrones far more entertaining over the last season than watching the show itself. It’s all too easy to use the glut of detailed reviews as a substitute for the act of viewing: I haven’t watched Halt and Catch Fire, for instance, but I feel as if I have an opinion about it, based solely on the information I’ve picked up by osmosis from the review sites I visit. I sometimes worry that critics and fans have become so adept at live-tweeting episodes that they barely look at the screen, and the concept of hate-watching, of which I’ve been guilty myself, wouldn’t exist if we didn’t have plenty of ways to publicly express our contempt. It’s a slippery slope from there to losing the ability to enjoy good storytelling for its own sake. And we need to be aware of this. Because we’re lucky to be living in an era of so much great television—and we ought to treat it as something more than a source of hot and cold running reviews.
The critical path
A few weeks ago, I had occasion to mention Renata Adler’s famous attack in the New York Review of Books on the reputation of the film critic Pauline Kael. As a lifelong Kael fan, I don’t agree with Adler—who describes Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless”—but I respect the essay’s fire and eloquence, and it’s still a great read. What I’d forgotten is that Adler opens with an assault, not on Kael alone, but on the entire enterprise of professional criticism itself. Here’s what she says:
The job of the regular daily, weekly, or even monthly critic resembles the work of the serious intermittent critic, who writes only when he is asked to or genuinely moved to, in limited ways and for only a limited period of time…Normally, no art can support for long the play of a major intelligence, working flat out, on a quotidian basis. No serious critic can devote himself, frequently, exclusively, and indefinitely, to reviewing works most of which inevitably cannot bear, would even be misrepresented by, review in depth…
The simple truth—this is okay, this is not okay, this is vile, this resembles that, this is good indeed, this is unspeakable—is not a day’s work for a thinking adult. Some critics go shrill. Others go stale. A lot go simultaneously shrill and stale.
Adler concludes: “By far the most common tendency, however, is to stay put and simply to inflate, to pretend that each day’s text is after all a crisis—the most, first, best, worst, finest, meanest, deepest, etc.—to take on, since we are dealing in superlatives, one of the first, most unmistakable marks of the hack.” And I think that she has a point, even if I have to challenge a few of her assumptions. (The statement that most works of art “inevitably cannot bear, would even be misrepresented by, review in depth,” is particularly strange, with its implicit division of all artistic productions into the sheep and the goats. It also implies that it’s the obligation of the artist to provide a worthy subject for the major critic, when in fact it’s the other way around: as a critic, you prove yourself in large part through your ability to mine insight from the unlikeliest of sources.) Writing reviews on a daily or weekly basis, especially when you have a limited amount of time to absorb the work itself, lends itself inevitably to shortcuts, and you often find yourself falling back on the same stock phrases and judgments. And Adler’s warning about “dealing in superlatives” seems altogether prescient. As Keith Phipps and Tasha Robinson of The A.V. Club pointed out a few years back, the need to stand out in an ocean of competing coverage means that every topic under consideration becomes either an epic fail or an epic win: a sensible middle ground doesn’t generate page views.
But the situation, at least from Adler’s point of view, is even more dire than when she wrote this essay in the early eighties. When Adler’s takedown of Kael first appeared, the most threatening form of critical dilution lay in weekly movie reviews: today, we’re living in a media environment in which every episode of every television show gets thousands of words of critical analysis from multiple pop culture sites. (Adler writes: “Television, in this respect, is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which is only a measure of how much the way we think and talk about the medium has changed over the intervening three decades.) The conditions that Adler identifies as necessary for the creation of a major critic like Edmund Wilson or Harold Rosenberg—time, the ability to choose one’s subjects, and the freedom to quit when necessary—have all but disappeared for most writers hoping to make a mark, or even just a living. To borrow a trendy phrase, we’ve reached a point of peak content, with a torrent of verbiage being churned out at an unsustainable pace without the advertising dollars to support it, in a situation that can be maintained only by the seemingly endless supply of aspiring writers willing to be chewed up by the machine. And if Adler thought that even a monthly reviewing schedule was deadly for serious criticism, I’d be curious to hear how she feels about the online apprenticeship that all young writers seem expected to undergo these days.
Still, I’d like to think that Adler got it wrong, just as I believe that she was ultimately mistaken about Kael, whose legacy, for all its flaws, still endures. (It’s revealing to note that Adler had a long, distinguished career as a writer and critic herself, and yet she almost certainly remains best known among casual readers for her Kael review.) Not every lengthy writeup of the latest episode of The Vampire Diaries is going to stand the test of time, but as a crucible for forming a critic’s judgment, this daily grind feels like a necessary component, even if it isn’t the only one. A critic needs time and leisure to think about major works of art, which is a situation that the current media landscape doesn’t seem prepared to offer. But the ability to form quick judgments about works of widely varying quality and to express them fluently on deadline is an indispensable part of any critic’s toolbox. When taken as an end in itself, it can be deadening, as Adler notes, but it can also be the foundation for something more, even if it has to be undertaken outside of—or despite—the critic’s day job. The critic’s responsibility, now more than ever, isn’t to detach entirely from the relentless pace of pop culture, but to find ways of channeling it into something deeper than the instant think piece or hot take. As a blogger who frequently undertakes projects that can last for months or years, I’m constantly mindful of the relationship between my work on demand and my larger ambitions. And I sure hope that the two halves can work together. Because like it or not, every critic is walking that path already.
You are here
Remember when you were watching Star Wars: The Force Awakens and Adam Driver took off his mask, and you thought you were looking at some kind of advanced alien? You don’t? That’s strange, because it says you did, right here in Anthony Lane’s review in The New Yorker:
So well is Driver cast against type here that evil may turn out to be his type, and so extraordinary are his features, long and quiveringly gaunt, that even when he removes his headpiece you still believe that you’re gazing at some form of advanced alien.
I’m picking on Lane a little here, because the use of the second person is so common in movie reviews and other types of criticism—including this blog—that we hardly notice it, any more than we notice the “we” in this very sentence. Film criticism, like any form of writing, evolves its own language, and using that insinuating “you,” as if your impressions had melded seamlessly with the critic’s, is one of its favorite conventions. (For instance, in Manohla Dargis’s New York Times review of the same film, she says: “It also has appealingly imperfect men and women whose blunders and victories, decency and goofiness remind you that a pop mythology like Star Wars needs more than old gods to sustain it.”) But who is this “you,” exactly? And why has it started to irk me so much?
The second person has been used by critics for a long time, but in its current form, it almost certainly goes back to Pauline Kael, who employed it in the service of images or insights that could have occurred to no other brain on the planet, as when she wrote of Madeline Kahn in Young Frankenstein: “When you look at her, you see a water bed at just the right temperature.” This tic of Kael’s has been noted and derided for almost four decades, going back to Renata Adler’s memorable takedown in the early eighties, in which she called it “the intrusive ‘you’” and noted shrewdly: “But ‘you’ is most often Ms. Kael’s ‘I,’ or a member or prospective member of her ‘we.’” Adam Gopnik later said: “It wasn’t her making all those judgments. It was the Pop Audience there beside her.” And “the second-person address” clearly bugged Louis Menand, too, although his dislike of it was somewhat undermined by the fact that he internalized it so completely:
James Agee, in his brief service as movie critic of The Nation, reviewed many nondescript and now long-forgotten pictures; but as soon as you finish reading one of his pieces, you want to read it again, just to see how he did it…You know what you think about Bonnie and Clyde by now, though, and so [Kael’s] insights have lost their freshness. On the other hand, she is a large part of the reason you think as you do.
Kael’s style was so influential—I hear echoes of it in almost everything I write—that it’s no surprise that her intrusive “you” has been unconsciously absorbed by the generations of film critics that followed. If it bothers you as it does me, you can quietly replace it throughout with “I” without losing much in the way of meaning. But that’s part of the problem. The “you” of film criticism conceals a neurotic distrust of the first person that prevents critics from honoring their opinions as their own. Kael said that she used “you” because she didn’t like “one,” which is fair enough, but there’s also nothing wrong with “I,” which she wasn’t shy about using elsewhere. To a large extent, Kael was forging her own language, and I’m willing to forgive that “you,” along with so much else, because of the oceanic force of the sensibilities to which it was attached. But separating the second person from Kael’s unique voice and turning it into a crutch to be indiscriminately employed by critics everywhere yields a more troubling result. It becomes a tactic that distances the writer slightly from his or her own judgments, creating an impression of objectivity and paradoxical intimacy that has no business in a serious review. Frame these observations in “I,” and the critic would feel more of an obligation to own them and make sense of them; stick them in a convenient “you,” and they’re just one more insight to be tossed off, as if the critic happened to observe it unfolding in your brain and can record it here without comment.
Obviously, there’s nothing wrong with wanting to avoid the first person in certain kinds of writing. It rarely has a place in serious reportage, for instance, despite the efforts of countless aspiring gonzo journalists who try to do what Norman Mailer, Hunter S. Thompson, and only a handful of others have ever done well. (It can even plague otherwise gifted writers: I was looking forward to Ben Lerner’s recent New Yorker piece about art conservation, but I couldn’t get past his insistent use of the first person.) But that “I” absolutely belongs in criticism, which is fundamentally a record of a specific viewer, listener, or reader’s impressions of his or her encounter with a piece of art. All great critics, whether they use that “you” or not, are aware of this, and it can be painful to read a review by an inexperienced writer that labors hard to seem “objective.” But if our best critics so often fall into the “you” trap, it’s a sign that even they aren’t entirely comfortable with giving us all of themselves, and I’ve started to see it as a tiny betrayal—meaningful or not—of what ought to be the critic’s intensely personal engagement with the work. And if it’s only a tic or a trick, then we sacrifice nothing by losing it. Replace that “you” with “I” throughout, making whatever other adjustments seem necessary, and the result is heightened and clarified, with a much better sense of who was really sitting there in the dark, feeling emotions that no other human being would ever feel in quite the same way.