Posts Tagged ‘The A.V. Club’
As I’ve said here perhaps more often than necessary, television is a very strange medium, and the fact that it occupies such a familiar place in our lives can blind us to how weird it really is. It creates characters and stories that can feel as vivid as our own friends or memories, and it’s like real life in another way: sooner or later, it ends, and nobody—including the creators—ever really knows how. Even the best narrative plans have a way of going sideways, and much of the fascination of a great television show comes from how it deals with the unexpected, whether in the form of a cast change, a creative departure, or an unplanned extension or cancellation. Television can be as unpredictable and uncontrollable as life itself, except that we know, or think we know, who really pulls the strings. While it’s true that many viewers probably don’t care much about where television comes from, in recent years, there’s been a greater degree of engagement than ever before between the audience and the men and women behind the curtain. And that engagement inevitably changes the way we experience the shows themselves.
I’ve been thinking about this a lot ever since watching “The Crash,” the latest episode of Mad Men, and reading Todd VanDerWerff’s thoughtful—if somewhat bewildered—review on The A.V. Club. (Its opening sentence: “What the ever-loving merciful fuck?”) VanDerWerff is one of my favorite writers, and I’ve been reading his articles and criticism with pleasure for years, but I was particularly struck by one observation:
A lot of the core conflicts on this show are the sorts of core conflicts one might find in a TV writers’ room, and to a degree, for the people who follow this show obsessively, its true protagonist is Matt Weiner. The question for many of us obsessive fans isn’t what Don Draper will get up to next but what Matt Weiner will get up to next.
I think VanDerWerff goes a little too far when he says that the episode seems like Weiner’s “dare to the weekly review culture,” but otherwise, his analysis is right on the mark. Weiner is the secret hero of his own show, which more than any other series in history is about the process of writing itself: Don Draper writes ads, but he’s also the author of his own life, and it’s fascinating to see how the show continues to exercise the same chilly emotional control even as Don’s story spins apart.
Every week, after watching the latest episode of Mad Men, my wife and I will play the short featurette that accompanies it on iTunes, in which Weiner and members of the cast share their thoughts on the latest installment. These videos presumably began as an easy promotional extra, but they’ve evolved, at least to me, into a weirdly exegetical part of the show itself: as soon as the closing credits roll, I just want to know what the hell Weiner was thinking. Weiner seems aware of this, too, and there’s a teasing quality to many of his comments, which are lucid and reasonable, but which also seem to explain a lot more than they actually do. They’re a little like T.S. Eliot’s notes to The Waste Land, which are less a way of clarifying the poem than an integral part of the text. Sophisticated readers and viewers know that you should never take a writer’s statements about his own work at face value, and although Weiner comes across as a smart, ordinary, entirely earnest guy when he explains himself to the camera, there’s something Nabokovian in the way he elucidates a few select points while leaving the rest of it shrouded in mystery.
And it’s made me reflect on the ways in which television is an ongoing dialogue, imaginary or not, between a creator and his audience. This isn’t true of every show, of course, and it’s never more apparent than when it’s no longer there. It’s fair to say that Community’s new showrunners are highly conscious of how the series is perceived, and they’ve been good—almost to a fault—about honoring the show’s history and giving fans what they think they want. Yet that old sense of interchange or possibility is missing: you never catch the show in a moment, as you often did in the old days, in which you could almost hear Dan Harmon thinking out his next move. The result feels a lot like the second season of Twin Peaks, after the departure of David Lynch and Mark Frost: it was still weird, but in a calculated way, as if strangeness were simply a part of the premise, rather than something that the show’s creators found themselves doing while trying to tell a story in the only way they could. Mad Men is both the best and the strangest show on television, and it’s dazzling in the way Weiner lays out the pieces and dares us to put them together. He even gives us a few helpful hints. But I’m not sure if I entirely trust him.
Somewhere in my parents’ home, there’s a book with both its front and back covers missing. When it first fell into my hands, it was brand new, and I would have been about eight years old, which I remember because I can still see exactly where it stood on the bookcase in our old house. The strange thing is that it wasn’t on a shelf I could reach: either someone took it down for me or I made a point of retrieving it myself, and it’s been so long that I’m not sure which was the case. All I know is that for the next ten years, it was rarely out of my sight, and throughout the most formative decade of my life, it was probably the book I read the most. Even now, I know much of it by heart, and I’ll occasionally find its phrases and rhythms appearing in my work, like fragments of my own memories. It was Roger Ebert’s Movie Home Companion, and when I look back now, I realize that it wasn’t just the book that first introduced me to the movies—which would be legacy enough—but one that made me think for the first time about journalism, criticism, and countless other aspects of the world and culture around me.
I’ve written at greater length about Ebert’s role in my life here and here, and I won’t repeat myself. I never had a chance to tell him in person how much he meant to me, although I’d like to think that he saw what I wrote here, and he certainly heard much the same thing from countless other writers and movie lovers. Still, the fact that I never met Ebert, despite having lived the last few years of my life in Chicago, will always remain a profound regret, although I’m very grateful that I got to see him in person at the celebration of his favorite film music at the Chicago Symphony Orchestra. For a while, Ebert and Gene Siskel were my two favorite guys on television, and I can still hum the opening theme for At the Movies, which was always a high point of my week. I’ll never forget where I was when I learned that Gene Siskel had died, and I’m sure I’ll remember where I was when I heard that Ebert was gone. (To give you a sense of how big a part of my life Ebert was, my wife called me with the news from work, and a college friend emailed later that day to say she was thinking of me.)
If there’s a silver lining to Ebert’s death, it’s that it gives us a sense of how deeply he influenced a whole generation of writers and critics. Will Leitch’s bittersweet remembrance in Deadspin, which recounts how he benefited from Ebert’s example and generosity, then foolishly threw it all away, is essential reading. But the words that linger with me the most are those of Scott Tobias of The A.V. Club, which reflect my own feelings to an almost frightening extent:
Cinema is a river with many tributaries, and I’m sure I’m not alone among movie-crazy teenagers in the ’80s in using Roger Ebert’s Movie Home Companion as the boat downstream. You go through all the four-star reviews. You see Taxi Driver, and then of course you have to see Raging Bull, and then every other Martin Scorsese picture that sits on the video shelf. (And then you get into the movies that influenced Scorsese, which is a lifetime in itself.) You argue with him, you glean insights in the things you watch, you learn an entire new way of thinking, talking, and writing about the movies. And you never stop watching. You never stop debating. You have a companion for life, even now that his is over.
“The old man was around for a long time,” Ebert says of John Wayne in The Shootist, and although Ebert was only in his thirties when he wrote those words, the same could be said about his own career. Ebert was the one who first taught me that, at his best, a critic is sort of an island of stability, staying at the same desk for forty years to regard a changing world through a very particular lens, until his body of work says as much about the decades through which he lived as about the movies themselves. Ebert once seemed more stable—and certainly more substantial—than most, and at his prime, it was hard to believe that he would ever be gone. Toward the end, of course, this changed. Yet it’s in the last act of his life that his influence will be the most profound: he proved that criticism, a trade that has often been denigrated and dismissed, can give us the tools to face the fact of our own mortality with honor. At the end of his life, Ebert seemed reduced to little more than his words and, remarkably, his thumb, as if his most famous trademark had really been a mysterious preparation for a time when it would be all that remained. And in the end, his words were enough.
I have a friend who hates Reservoir Dogs. He’s willing to grant that some of Quentin Tarantino’s other movies have merit, but refuses to rewatch this particular film, mostly on account of its violence—which, he says, he found increasingly hard to take after he had children. I can understand what he means. In the case of my own daughter, I’m still working out what kinds of media she’ll be watching at what age, and while I definitely plan to introduce Beatrix to the joys of Pulp Fiction and the two movies about her namesake at the right time, I might give Reservoir Dogs a pass. I liked it plenty when I first saw it, but I haven’t been tempted to revisit it in a long time, and these days, I think of it mostly as an inventive and resourceful debut that paved the way for the astonishing career to come. (The recent Vanity Fair oral history of the making of Pulp Fiction just serves as a reminder of how deeply influential Tarantino has been, even as his influences and innovations are absorbed into invisibility by the culture as a whole.)
And although I understand my friend’s point about the violence in Reservoir Dogs, what lingers with me, weirdly, is Tarantino’s restraint. Take the movie’s most notorious sequence. When I think of it today, what I remember is not so much the violence as two amazingly assured shots. The first is the moment when the camera turns aside as Mr. Blond prepares to hack off the cop’s ear, tracking away to focus on a nondescript corner of the room as we listen to the screams coming from just offscreen. It’s a startlingly subjective camera move, as striking in its way as the moment in Taxi Driver when Scorsese pans away from Bickle’s telephone rejection from Betsy, and reflects Tarantino’s understanding that such things are more effective when left to the imagination. Even better is the shot immediately afterward, when Mr. Blond leaves the warehouse, crosses a peaceful street in silence, retrieves a gas can from his car, and returns, all in a single unbroken take that ends back in the room where “Stuck in the Middle With You” is still playing. Mike D’Angelo of The A.V. Club has sung this shot’s praises, and it’s one that still knocks me out, more than fifteen years after I first saw it.
Given this kind of filmic grace, which Tarantino had in spades before he even turned thirty, it’s instructive to turn to Django Unchained, which I finally caught over the weekend. (I liked it a lot, by the way, although it strikes me as one of his less essential movies, somewhere above Death Proof and below Jackie Brown.) Django has also aroused controversy over its violence, and while I wouldn’t want to argue that it isn’t a violent movie, here, too, I’m more struck by its restraint than anything else. This is partly because it’s the first movie in which Tarantino hasn’t done deliberate violence to the medium of storytelling itself: the plot proceeds in a linear fashion, without any of the structural games we find in his previous work, and the boundary between good and evil is much more clearly delineated than usual. Even if we hadn’t been clued in by the fact that audiences, for the most part, seem to be embracing the movie, there isn’t a lot of doubt about how this particular revenge story will conclude. And although Tarantino doesn’t shy away from the blood squibs in his climactic shootouts, he’s even more careful here in his use of violence than usual.
Django Unchained takes place in a violent time, with plenty of human misery inherent to the story, but it doesn’t linger over scenes of cruelty and torture. Tarantino gives us these moments in flashes, just long enough to lock them in the mind’s eye, and doesn’t deal with sexual violence at all, except by implication. Which doesn’t mean he shies away from the implications of the material. The film’s most memorable scene is the long monologue by Samuel L. Jackson—who gives what I think is the supporting performance of the year—in which he coolly explains how a living death in the mines, to which slaves are routinely condemned, is far more cruel than any torture Django’s captors could invent. Tarantino knows the difference between the violence of history and that of escapism, and it’s fascinating to see a film in which they exist so casually side by side. Sometimes his canniness goes a little too far: when Django engages in one killing that might make him seem unsympathetic, he instructs the bystanders to tell the victim goodbye, and when he fires, the body is jerked offscreen by what can only be a stagehand with a length of piano wire, leaving it conveniently out of sight for the rest of the scene. It’s a cheap gag, but done with the artistry that separates Tarantino, not just from his imitators, but from his precursors. And like it or not, that’s the mark of a master.
“I don’t want to be the man who learns—I want to be the man who knows.” This is author William Goldman in Adventures in the Screen Trade, quoting an unnamed movie star whom I’ve always pictured as Steve McQueen, although it probably wasn’t. Goldman is making a slightly cynical point about how a screenwriter needs to give every good moment in the script to the star, and especially can’t show the hero asking questions or carrying the burden of exposition. On a deeper level, however, this quote gets close to the heart of what we, in the audience, want from our heroes. Everyone has a different sense of the qualities of the ideal movie hero, but at the top of my own list is competence. When I’m looking for escapism, I like movies and books about men and women who are good at their jobs, who are smart and resourceful, and who embody the kind of confidence, or at least conviction, that I’d like to see in myself. As Emerson said of Napoleon, heroes are like the rest of us, except quicker, more decisive, and always sure about what to do next. Which only means that a hero is someone who sees at a glance what it took the screenwriter weeks to figure out.
I’ve been thinking about this recently while reflecting, once again, on the appeal of The Silence of the Lambs, which was inexplicably left out of the A.V. Club’s recent rundown of the fifty best movies of the ’90s. (Honestly, I’m not the kind of person who usually complains when a list like this omits one of his favorite films, but really, this is beyond comprehension.) Hannibal Lecter is one of our great villains—he’s at the top of the AFI list—but he’s also, weirdly, one of the most compelling heroes of the past several decades, and a lot of this is due to the reasons that I mention above. He isn’t just brilliant, but hugely resourceful. His escape from the security facility in Tennessee consists of one audacious move after another, and even if we can’t buy every detail, it’s hard not to be swept up by the result. And his ingenuity is really just a distillation and acceleration of the craft of Thomas Harris. That’s the beauty of fiction: a plan that took Harris months, if not years, to work out on paper occurs to Lecter in real time, over the course of twenty dense pages. And that kind of unnatural clarity of action is what fictional heroism is all about.
Of course, Lecter has since degenerated as a character, and although I’ve talked about this far too many times before, it hints at an important truth. In his book Characters and Viewpoint, Orson Scott Card draws a useful distinction between cleverness and intelligence:
[I]n our society with its egalitarian ideals, any obvious display of intelligence or erudition suggests elitism, snobbery, arrogance…Yet we love a character who is clever enough to think of solutions to knotty problems. Does this seem contradictory? It is contradictory…The audience loves a character who solves problems and knows exactly the right facts when he needs them—but they don’t like a character who flaunts his superior knowledge or acts as if he knows how clever he is.
As an example, Card cites the case of Indiana Jones, who is intellectually brilliant by definition, but slightly bumbling whenever we see him in the classroom—and endlessly inventive and resourceful when pressed into action. And Lecter is a cautionary counterexample. We don’t like Lecter because he can quote Renaissance poetry and appreciate fine wine, but because he outsmarts his enemies and deals ingeniously with problems presented by the story. The trouble with Hannibal and its sequel is that in the end, we’re left with nothing but Lecter the cultured epicure, to the point where his taste for the finer things in life becomes actively annoying, while his acts of violence grow increasingly baroque and grotesque. This, more than anything else, is where Harris faltered.
Which just means that a hero is only as good as the plot in which he finds himself. If you’ve constructed a surprising story in which the protagonist reacts in engaging ways, you’ve already solved most of the problems of writing a convincing hero, including the issue of making him seem too competent. You can always build flaws into your protagonist—Smiley’s miserable domestic life, Lawrence’s inner torment, Indy’s tendency to get in over his head—but really, if your plot is a match for the hero you’ve constructed, those qualities will take care of themselves. This is why James Bond, even in the best of the early films, is both a seductive icon and a narrative void: the plots are just too arbitrary and absurd to present him with any real challenge. It also explains why Casino Royale is, by a large measure, the best of all the Bond films, not because it goes out of its way to present us with a flawed Bond, but because the story around him, for once, is worthy of the character’s inner resources. Bond is still the man who knows, but in this case, the filmmakers knew just a little bit more. And that’s exactly how it should be.
It’s generally agreed that the two greatest dramas on television today are Mad Men and Breaking Bad, two consistently fascinating shows that air on the same network and appeal to similar demographics, but which in other respects couldn’t be more different. Mad Men, as I’ve said before, is almost fractal in its simultaneous commitment to fine detail and shapely storytelling, and it comes off as a seamless piece of narrative that could go on serenely forever. Breaking Bad, by contrast, is a lumpier, shaggier, messier show that often seems on the verge of coming apart entirely. It has narrative problems that I don’t think it ever truly solved—notably involving the character of Skyler White—and it didn’t really come into its own until halfway through the third season. It can feel contrived, and its seams often show. But at its best, it reaches greater heights than any other recent show, Mad Men included. And much of its appeal comes from the fact that creator Vince Gilligan and his writing staff clearly don’t always know what will happen next, but are willing to follow the characters into strange, dark places.
I’ve been a big fan of Gilligan ever since I first saw “Pusher,” my favorite episode of The X-Files, and one of the great pleasures of Breaking Bad is the chance it affords to watch Gilligan and his writers think in real time. Breaking Bad is all but unique among important television shows in that its underlying conception changed radically after its first season, as the writers began to honestly examine the story’s implications. The series began as a finely crafted but somewhat facile black comedy about an essentially decent family man forced into a life of crime to pay his medical bills. As the show went on, however, it became increasingly clear that this premise, which made for a great elevator pitch, was unsustainable over the course of many seasons—at least not without a radical shift in tone. The result is a show that has become increasingly bleak in ways I don’t think even Gilligan anticipated, but to his credit, he has remained fully committed to the show’s new direction, based on a simple concept of dazzling audacity. As Gilligan said to the New York Times Magazine: “Wouldn’t it be interesting to have a show that takes the protagonist and transforms him into the antagonist?”
Which is exactly what Breaking Bad has done. The fact that it has succeeded so completely is a testament to the strength of its cast, especially Bryan Cranston and Aaron Paul, but also to the power of committing fully to the logic of the narrative, even if you don’t know precisely where it will lead. This applies to individual story arcs and episodes as well as to the shape of the series as a whole. In a wonderful series of interviews with Todd VanDerWerff of The A.V. Club, Gilligan admits that his writing staff will generally begin each season with only a vague idea of where it ends, and often plot only three or four episodes ahead. This is very close to how I write my own novels, with detailed outlines taking me a third of the way through the story at a time, and it’s a thrilling way to write fiction, since it allows you to control the narrative to a certain extent while still being unsure of where the characters will ultimately go. The difference, of course, is that Gilligan and his team are doing it in public, with each season airing before they move on to the next, and it’s especially fun to see the show revisit elements from earlier seasons—like the sinister figure of Tio Salamanca—in ways that nobody could have anticipated.
And it’s also careful to keep its options open. Gilligan notes that even the writing staff doesn’t know much about the mysterious background of Gus Fring, the icy antagonist played so brilliantly by Giancarlo Esposito. This is partly because Gilligan feels, and rightly so, that certain characters “are sometimes more interesting the less you know about them,” but also because they don’t want to commit themselves without reason. Similarly, they’ve never said anything about Walt’s mother, or even shown us her picture, in order to keep certain possibilities alive. Whether or not these elements will ever pay off is an open question, but Gilligan and his writers have proven themselves experts at playing the long game, even if they aren’t entirely sure what the next move may be. It’s that constant play between constraint and possibility—between honoring the rules that the show has established while also leaving a few things in reserve—that makes the series so riveting from episode to episode. And it’s a measure of the show’s mastery that even as Walter White’s options continue to contract, the show’s own options seem limitless.
As regular readers of this blog know, I love commentary tracks, both for their insights into the filmmaking process and for what they reveal, often unintentionally, about the people doing the talking. Nowhere is this more the case than in Dan Harmon’s commentary for the Community episode “Intermediate Documentary Filmmaking,” a terrific installment from one of the best seasons of television ever produced. In a departure from the show’s usual style, the episode is shot in the mockumentary format that has been all but obligatory for smart, ambitious sitcoms ever since the premiere of The Office, and while the episode itself offers plenty to talk about, Harmon spends most of the commentary delivering a humorous, somewhat self-deprecating, but ultimately totally earnest attack on the use of these documentary conventions by shows like Parks & Recreation and Modern Family, which, he implies, is the comedic equivalent of playing tennis without a net.
Harmon argues, and not without reason, that the mockumentary format basically allows comedy writers to take the easy way out: it’s much easier to make a fast-paced sitcom when you can just cut away to a character explaining what he’s thinking, underline jokes using montage, voiceover, and flashbacks, or zoom in on a facial expression or an important detail. Community, he continues, is trying to do something much harder, which is to deliver similarly dense, scripted comedy without the mockumentary crutch. He also complains, quite sincerely, that viewers and critics don’t sufficiently appreciate this. Quite the opposite, in fact: the documentary format is faster, cheaper, and wins more awards, so it’s no wonder that sitcom creators prefer it over more conventional single-camera comedy. And a subsequent interview with The A.V. Club makes it clear that Harmon means what he says:
I just wanted to [work in the documentary format] to see what it was like. You know, to take those weights off our ankles. I feel like 30 Rock and Community never get an award for doing a format that’s twice as hard. Because it really is twice as hard…Now we have to go back to playing the violin while [other shows] play stickball.
I’ve been thinking about this commentary track a lot, and not just for the obvious reasons, ever since the news broke that Harmon had been fired as the showrunner of Community. Harmon is clearly a genius, but nearly everything I’ve ever heard him say indicates that he’s an incredibly neurotic guy, competitive, perfectionistic, and insecure even in the midst of success. According to industry gossip, he wasn’t easy to work with, and his management issues may have been partially responsible for the departures of creative talent that the show had suffered in recent seasons. From the point of view of Sony, which made the decision to replace him, it probably seemed like a no-brainer: a sitcom is supposed to be an efficient machine for producing content, which doesn’t exactly describe Community, for reasons that any interview—or commentary track—with Dan Harmon will make abundantly clear.
And yet here’s the thing: Harmon isn’t wrong. What he did with Community was incredibly hard, in ways that aren’t always obvious. It’s easy to point to big conceptual stories like “Remedial Chaos Theory” or the recent “Digital Estate Planning” as evidence of Harmon’s talent, but as he points out, even the show’s more modest episodes are ambitious, complex mini-movies that consistently take big risks. They don’t always pay off, and the third season was noticeably uneven, but to even try to make this kind of television in the face of so much opposition requires the kind of combative, uncompromising personality that ultimately got Harmon fired. As he wrote on his blog: “I’m not saying you can’t make a good version of Community without me, but I am definitely saying that you can’t make my version of it unless I have the option of saying ‘has to be like this or I quit’ roughly eight times a day.” And while it’s too soon to tell whether Community without Harmon will be better or worse, it definitely won’t be the same.
I can’t quite remember when I gave up on Glee. For the first two seasons, I watched the show regularly, both because I enjoyed it and because it was the kind of creative, ambitious mess that can be more interesting to think about than a conventionally tidy series. Glee often fell flat on its face, but it did so in unexpected ways that made me reflect on the nature of storytelling, the challenges of episodic television, and the power of ensembles. After a while, though, it just became too exhausting. The show was still good for a handful of transcendent moments, but I found it increasingly hard to sit through the rest, especially as it became clear that the writers had no idea what to do with their most important characters. Finally, I just stopped. Until this week, I hadn’t watched an episode all year, not since “Asian F,” which aired all the way back in October.
And yet I occasionally found myself missing it. Sometimes I’d watch a clip online, or think back to the promise of Glee’s first season, or just remember the characters, some of whom I still cared about, at least in their earlier incarnations. (I also had a surprisingly good time watching the concert movie on a plane.) Still, I wasn’t really tempted to check in again. As I recently put it to a friend of mine, there’s so much good television available these days, both on the air and on DVD, that I have no excuse for watching a show that doesn’t stand at the very top of its game. Mad Men, for instance, is basically awesome all the time, and Community isn’t far behind. And when I still haven’t seen most of the Sorkin years of The West Wing or all but a few episodes of The Sopranos, it’s hard to justify investing time in a show that pays off only intermittently.
Of course, if I’d followed this rule my entire life, I never would have watched The X-Files, my favorite show of all time, which seemed perversely intent on punishing viewers who expected anything like consistency. And sometimes it can be thrilling to see a show you love suddenly return to form. Todd VanDerWerff of the A.V. Club has always been one of Glee’s most interesting critics—he’s the one responsible for the theory of the three Glees—and he has a compelling take on this. To his mind, Glee could have been an observant, sad, but ultimately triumphant series about growing up in a small town while dealing with the failure of your own dreams, which is what it felt like in the pilot. Instead, it was taken over by ridiculous high concepts, big production numbers, and theme episodes, but would occasionally still send dispatches from an alternate universe where that other show still existed.
All of which is to say that I watched the show again this week, if only to see the kids win Nationals at last, and I enjoyed it. Still, it’s startling to realize how little I regret missing the past fifteen episodes: there were plot points or characters I didn’t recognize, but for the most part, this is the same show I remembered—and perhaps more fondly than if I’d been around for some of the low points in between. And as much as I liked this episode, I can also safely say that after this season, I’m done with Glee. Every television show ultimately boils down to a handful of moments in the viewer’s memory, an idealized version constructed out of its best pieces, and the Glee of my imagination—the one that was wistful, funny, and occasionally spectacular—is now complete. It was good to tune in one last time, but now that I’ve shared in that moment, it’s finally time to graduate.
The more I think about it, the more I suspect that making great television over the course of multiple seasons might be the most challenging of all sustained creative acts. On a practical level, it’s arguably harder than directing a movie or writing a novel, not just because of the scale and speed required, but because of the uncertainty inherent in network scheduling, in which a show’s creator doesn’t know whether he’ll have one episode, half a season, or six seasons and a movie. Few series have suffered from more uncertainty than Dan Harmon’s Community, which, despite a vocal fan following, has always seemed on the verge of cancellation. Its return is therefore all the more cause for celebration, not simply because the show survived, but because it thrived under awful circumstances: no other contemporary series, not even Mad Men, has faced the vagaries of modern television as well as Community, which has pushed the boundaries of the sitcom in every episode while somehow adding up to a satisfying whole. The result is a master class in both comedy and storytelling.
When I think of Community, the first word that comes to mind is balance. This may seem surprising, given some of the truly unhinged episodes that the show has produced over the past few years, but what really stands out with this series is its ability to coordinate a wide range of impulses and ambitions—any one of which, left unchecked, would lead to disaster—within one remarkably cohesive vision. It’s a fantastically structured and plotted show that also leaves room for its characters to evolve through improvisation. It’s breathtakingly smart and honestly emotional. It’s a whirlwind history of recent pop culture (the second season is the first thing I’d throw into a time capsule to give future generations a sense of what this decade was like) and also fundamentally grounded in the lives of its seven major characters. And like Glee, it began with a cast meant to evoke sitcom stereotypes and then gradually reveal greater depths, but unlike Glee, it succeeded.
The comparison with Glee, which I’m not the first to make—Todd VanDerWerff of the A.V. Club has set it out admirably—is perhaps the most instructive. From its first episodes on, Glee was manifestly a show of vast ambition but limited ability to realize its goals. Community, by contrast, has aimed even higher and nailed every challenge it set for itself. And its ambitions have only grown over time. This was a smart, funny show right out of the gate, but it wasn’t until late in the first season that it locked on to its true potential. Part of this was its discovery of the range of things it could do, from tightly written bottle episodes to fake clip shows to epic parodies of action and science fiction movies, but it also involved refining the characters to take advantage of the strengths of its cast, particularly the astonishing triumvirate of Donald Glover, Danny Pudi, and Gillian Jacobs. (Jacobs, in particular, has been a revelation in the second half of the show’s run, as Britta evolved from a bland voice of reason to a glorious train wreck of a human being.)
Above all else, Community reminds us how to be clever. I’ve written at length about the perils of cleverness, and there are certainly critics who see the show as nothing more than a cleverness machine, churning out movie references and pastiches for its tiny audience. Yet the show’s real cleverness doesn’t lie in its inside jokes and nerd-culture homages—otherwise, it would be little more than a cuddlier version of Family Guy—but in its ability to integrate them into a world that feels emotional and real. Greendale is one of those fictional places in which we want to believe, populated by characters who feel like our friends, and whose lives and problems remain consistent even as they’re fighting zombies or split into alternate timelines. That’s more than clever; it’s astounding. My favorite episode consists of nothing but the characters talking around a table for twenty minutes, but it works because they’re doing exactly what the show does every week: telling stories. And it does it as well as any show I’ve ever seen.
Although my life has since taken me in a rather different direction, for a long time, I was convinced that I wanted to be a film critic. My first paying job as a writer was cranking out movie reviews, at fifty dollars a pop, for a now-defunct college website, a gig that happily coincided with the best year for movies in my lifetime. Later, I spent the summer of 2001 writing capsule reviews for the San Francisco Bay Guardian, during a somewhat less distinguished era for film—my most memorable experience was interviewing Kevin Smith about Jay and Silent Bob Strike Back. After college, I tried to get work as a film critic in New York, only to quickly realize that reviewing movies for a print publication is one of the cushier jobs around, meaning that most critics don’t leave the position until they retire or die, and when they do, there’s usually someone in the office—often the television reporter—already waiting in the wings.
In the years since, the proliferation of pop cultural sites on the Internet has led to a mixed renaissance for critics of all kinds: there are more professional reviewers than ever before, but their influence has been correspondingly diluted. Critics have always been distrusted by artists, of course, but these days, they get it from both sides: for every working critic, there are a thousand commenters convinced that they can do a better job, and the rest of us are often swayed less by the opinions of individual writers than by the consensus on Rotten Tomatoes, which is a shame. At its best, a critic’s body of work is a substantial accomplishment in its own right, and personalities as dissimilar as those of Pauline Kael, Roger Ebert, and David Thomson—speaking only of film, which is the area I know best—have created lasting legacies in print and online. And while the critical profession is still in a period of transition, the elements of great criticism haven’t changed since the days of James Agee, or even Samuel Johnson.
So what makes a good critic? Knowledge of the field, yes; enthusiasm for art, most definitely. (A critic without underlying affection for his chosen medium, or who sees it only as an excuse for snark, isn’t good for much of anything.) Above all else, it requires a curious mixture of the objective and the subjective. A critic needs to be objective enough to evaluate a work of art on its own terms—to review the work that the creator wanted to make, not the one that the critic wishes had been made instead—while also acknowledging that all good reviews are essentially autobiographical. Ebert has noted that his own criticism is written in the first person, and the most enduring critics are those who write, not as an authority delivering opinions from up on high, but as someone speaking to an intelligent friend. As a result, the collected works of critics like Ebert and Kael are the closest things we have these days to books that seem like living men or women, like Montaigne’s essays or The Anatomy of Melancholy. “Cut these words,” as Emerson said of Montaigne, “and they would bleed.”
Surveying the current crop of writers on the arts, my sense is that while we have many gifted critics, most of them fall short in one way or another. A critic like Anthony Lane, for all his intelligence, tends to treat the subject under consideration as an excuse for an arch bon mot (as with Star Trek: First Contact: “If you thought the Borg were bad, just wait till you meet the McEnroe.”). And while his wit can be devastating when aimed at the right target—The Da Vinci Code, for instance, or the occupants of the New York Times bestseller list—it often betrays both too much self-regard and a lack of respect for the work itself. On the literary side, James Wood has a similar problem: he’s a skilled parodist and mimic, but surely not every review obliges him to show off with one of his self-consciously clever pastiches. (If I were Chang-rae Lee, I’d still be mad about this.) The writers of the A.V. Club are more my style: in their pop cultural coverage, especially of television, they’ve struck a nice balance between enthusiasm, autobiography, and reader engagement. But I’m always looking for more. Which critics do you like?
The recent release of Brian Kellow’s biography A Life in the Dark and the Library of America anthology The Age of Movies has led to a resurgence of interest in the career of Pauline Kael. Yet Kael never really went away, at least not for those of us who spend most of our waking hours—and you know who you are—reading about pop culture online. Maud Newton of the New York Times once credited, or blamed, David Foster Wallace for creating the ironic, slangy tone of modern blogs, but three decades earlier, Kael had forever shaped the way we talk about the movies, and, by extension, everything else we care about. Peel back the prose of any top critic on Rotten Tomatoes and you’ll find Kael peeking out from underneath, as writers mimic her snap judgments and rapid turns of phrase while often missing the depths that these surface flourishes concealed.
And while Kael is deservedly remembered for championing the cinema of the sixties and seventies, her lasting legacy is likely to be that of a stylist. I don’t think she’s the best or most insightful film critic of all time; for that honor, I’d nominate David Thomson, although I know he’s the man many movie lovers love to hate. As far as my own personal love of the movies is concerned, I owe the most to Roger Ebert. But Kael’s voice was the most distinctive of all the great film critics, and it’s been jangling in my head for decades. Phrases from her reviews nestle themselves into the corners of your brain, forever changing the way you think of the films under discussion, like her take on Altman’s visual flourishes in The Long Goodbye: “They’re like ribbons tying up the whole history of movies.” Even today, I can recite her enraptured description of the ending of The Fury, which Bret Easton Ellis cheerfully ripped off for his blurb for House of Leaves, almost by heart:
This finale—a parody of Antonioni’s apocalyptic vision at the close of Zabriskie Point—is the greatest finish for any villain ever. One can imagine Welles, Peckinpah, Scorsese, and Spielberg still stunned, bowing to the ground, choking with laughter.
But the trouble with Kael as a role model is that her breathless style, in the absence of a larger philosophy of film, can sometimes cover up the lack of deeper understanding, and, at its worst, turn into something alarmingly like trolling. Kael’s reviews as a whole can be nuanced, but her individual sentences (“The greatest finish for any villain ever”) rarely occupy any middle ground. Imitating Kael on the sentence level only feeds our current tendency, as a culture of online commenters, to believe that everything deserves either five stars or none. This all-or-nothing approach has been discussed before, notably in an excellent Crosstalk at The A.V. Club, but it’s worth noting that Kael is its unlikely godmother. And if that’s the case, then her influence is vaster than even her greatest admirers acknowledge: her style touches everything we write about the arts, both online and in traditional media, down to this very blog post.
Which makes it all the more important to remember that Kael’s style was the expression of a genuine love of movies. Kael could be cruel to movies she disliked, as in her famously savage (and not entirely inaccurate) dismissal of Raging Bull: “What am I doing here watching these two dumb fucks?” But underpinning it all was a fanatical belief in what movies could do, and a determination that they live up to the standards set by other works of art, which is a quality that many of her imitators lack. Kael was a lot of things, but she wasn’t ironic, and her style was less about showing off than a way of getting her readers to feel the same intense emotions that she did—and, of course, to watch the movies themselves. I’ve sought out countless films just so I could read Kael’s reviews of them, and I know I’m not alone in this. And while I’m not sure if Kael would approve, I suspect that she’d at least be glad I was watching the movies she loved so much.
With Halloween right around the corner, my thoughts have been turning to horror, and not just at the prospect of providing candy for the 250 trick-or-treaters I’ve been reliably told to expect. The success of the third installment of the Paranormal Activity franchise, which scored both the highest October debut and the all-time best opening weekend for a horror movie, provides ample proof that the horror genre is alive and well. And while I have no intention of seeing Paranormal Activity 3, or anything else from the makers of the loathsome Catfish, I can’t help but admire the ingenuity behind a franchise that has grossed $450 million worldwide on a combined $8 million budget. Audiences love horror, it seems, which remains the only genre truly independent of budget or star power, so I thought it might be fun to spend the next few days reflecting on this most potent, and misunderstood, segment of popular culture.
The first point, which can’t be stressed enough, is that horror in film and horror in literature are two very different things, although they’re often misleadingly conflated. Cinematic horror is a communal experience: nothing compares to seeing a great horror movie, whether it’s Psycho or The Descent, in a packed auditorium with an enthusiastic crowd. At its best, this carnival atmosphere adds enormously to the fun, as the A.V. Club’s Mike D’Angelo notes in his recent consideration of Scream 2, and is only diminished when a movie is experienced on video. (For what it’s worth, I suspect that The Shining, which was widely dismissed on its initial release, has grown in critical reputation partly because it’s one of the few great horror movies that can be profitably watched at home, although its power is incalculably increased on the big screen.)
Horror fiction, by contrast, is experienced in solitude. This is true of all fiction, of course, but here the solitude is as much a part of the reading experience as communality is at the movies. For the full effect, horror novels or stories are best experienced alone, at night, in an empty house, and the best horror fiction amplifies the reader’s loneliness, so that every creaking floorboard or unexplained sound participates in the overall mood. (It’s no accident that many of the best horror stories are built around a spooky house.) And while every good novel is grounded on the reader’s identification with the characters, horror takes the identification to another level, until it becomes not just mental, but physiological. The sweating palms, the accelerating heart, the white knuckles—these are all signs that the identification is complete. And it can only achieve its optimal intensity when the reader is completely alone.
Clearly, an art form centered on a communal experience will evolve in utterly different ways than one that depends on solitude. And indeed, successful works in either medium have developed distinctive strategies to achieve the common goal of complete identification with the characters, at least for the duration of a scene. It’s unfortunate, then, how often aspiring writers in horror fiction take their cues from the movies, without realizing that the two forms have little in common, and how badly the movies have distorted the works of serious horror novelists like Stephen King. Writing good horror fiction is a skill that only a handful of authors have truly mastered, which is partially due to the misleading influence of cinematic horror. Tomorrow, I’ll be talking more about this distinction, and about the differences between horror, terror, and the most powerful sensation of all, dread.
Moneyball is one of the best movies I’ve seen this year, and the second great film in four months starring Brad Pitt. (A few more like this, and I’ll even forgive him for Benjamin Button.) It’s the first film in a while in which Pitt’s star power has been on full, dazzling display, and it’s especially welcome in a sports movie that is designed to frustrate, or at least challenge, our expectations. This is an absorbing, often exhilarating film, but not for the usual reasons: despite Billy Beane’s shrewdness and vision, and the lasting impact he’s had on baseball, he’s never won a championship, and probably never will, now that his insights have spread far and wide. Moneyball, contrary to the subtitle of its source material, isn’t about winning an unfair game, but about surviving it—which makes it much more poignant than Michael Lewis’s book, which was unable to witness the aftermath of its own revolution.
And one of the film’s great virtues is that it treats survival on one’s own terms as something noble. Watching it, I was reminded of Roger Angell’s praise of Bull Durham, which the A.V. Club quoted a few months ago:
It assumes you’re going to stay with the game, even in its dreariest, dusty middle innings, when the handful of folks in the stands are slumped down on their spines waiting for something to happen, even a base on balls.
At its best, Moneyball—which loves a base on balls—is an unsentimental look at those dusty middle innings, and what it really takes to stay in the game. The A’s may never win another title against a big-market team, but they played competitively long after being dismissed. And one of the film’s unspoken messages is that Beane was happier scheming and cobbling together a team in Oakland than he would have been as part of the Red Sox machine, even if it cost him a World Series. As Bennett Miller, director of Moneyball, recently said to the New York Times: “He would have died in Boston. It wouldn’t have been his show. He likes to be the guerrilla in the mountains in combat fatigues.”
One of the reasons why the book and movie of Moneyball have such wide appeal—even to those, like me, who have close to no interest in sports—is that it’s impossible not to apply its lessons to one’s own life. In my own case, it reminds me, inevitably, of being a writer. Deciding to become a novelist is something like entering professional sports: you start with dreams of a multimillion-dollar contract, but in the end, you feel lucky just to get picked in the draft. And while you may get occasional bursts of attention and praise, for the most part, it’s about playing in every game, practicing in solitude, and making small, crucial choices that nobody will notice. If writing a great novel can be compared to a baseball feat, it isn’t DiMaggio’s hitting streak, but Ted Williams’s .406 year, in which every swing counted, day after unglamorous day.
And the first, necessary duty is simply to survive. A writer doesn’t have the benefit of sabermetrics, but he or she inevitably develops a comparable suite of tricks, both practical and artistic, to keep playing. These tricks often boil down to boring formulas or rules of thumb: structure stories in three acts, get into scenes late and out of them early, cut every draft by at least 10%. And the process of internalizing these tricks—and I’m stretching the metaphor here, but whatever—is something like increasing one’s on-base percentage: it’s nothing fancy, but over time, it adds up to runs, which allow players and teams to endure. In the end, no matter what the other rewards might be, a writer, like a baseball player, is incredibly lucky to be in the show. But if you want to keep playing a grown man’s game, as Moneyball understands, luck by itself isn’t enough.
“Ten thousand hours,” writes Malcolm Gladwell in Outliers, “is the magic number of greatness.” That is, ten thousand hours of hard practice, at minimum, is a necessary prerequisite for success in any field, whether it’s chess, the violin, or even, dare I say it, writing. There’s also the variously attributed but widely accepted rule that a writer needs to crank out a million words, over roughly ten years, before achieving a basic level of technical competence. Both of these numbers are, obviously, sort of bogus—many people will require more time, a few much less. But they’re also useful. Ultimately, the underlying message in both cases is the same: mastery in any field takes years of commitment. And if you need some kind of number to guide you on your way, like Dumbo’s magic feather, that’s fine.
Because the only real path to mastery is staying in the game. Terry Rossio, on his very useful Wordplay site, makes a similar point, noting that when he was just starting out as a writer, he realized that anyone who spent ten years at a job—”grocery clerk, college professor, machinist, airline pilot”—had no choice but to become an expert at it. He concludes:
This insight freed me from the fear of picking a so-called “impossible” job. I could pick any field I wanted, free of intimidation, because it was guaranteed I would become an expert…if I was willing to stick to it for ten years. So I picked the job I really wanted deep in my heart: writing for movies.
The concept of a necessary amount of time to achieve expertise is what inspired the old master/apprentice relationship, in which, for instance, a focus puller would spend ten years observing what a cinematographer did, and at the end, be ready to shoot a movie himself. Writing doesn’t offer such neat arrangements, but it still requires the same investment of time, along with an occasional push in the right direction.
In fact, the best argument for writing full-time is that it allows you to accelerate this process. In the nearly four years I spent at my first job in New York, I wrote perhaps 30,000 words of fiction, only a fraction of which was published. In the five years since quitting that job, I’ve written about 600,000 words, not to mention another 100,000 words for this blog—a number that gives even me pause. While not all these words were great, they’re getting better, and close to half are going to end up in print. The number of hours is harder to quantify, but it’s probably something like 7,500, which, combined with the untold hours I spent writing bad fiction earlier in my life, has brought me close to Gladwell’s number. And if I hadn’t spent the past five years doing little else, I wouldn’t even be a third of the way there.
Of course, time by itself isn’t enough. The road to mastery is paved with well-intentioned grinders who work diligently on the same story or comic for years without showing any sign of improving. (The cartoonist Missy Pena memorably described this type to Todd VanDerWerff of the A.V. Club at this year’s Comic-Con. VanDerWerff writes: “Plenty of people who get—and deserve—bad reviews come back year after year after year, never quite getting what it is they could do better, treating the whole thing as a kind of weird theater.”) But even if time isn’t a sufficient condition, it’s at least a necessary one. Every great writer has served an apprenticeship, even if he or she doesn’t like to admit it, and if you haven’t rushed into print, you can always deny it when the time comes. As Hemingway said, when a suitcase filled with his old unpublished stories was lost: “It’s none of their business that you have to learn how to write. Let them think you were born that way.”
I didn’t want to see Captain America. The trailer wasn’t great, Joe Johnston wasn’t exactly my idea of a dream director, and most of all, I was getting a little tired of superheroes. The fact that we’ve seen four major comic book adaptations this summer alone wasn’t the only reason. Ten years ago, a movie like Spider-Man felt like a cultural event, a movie that I’d been waiting decades to see. Today, they’ve become the norm, to the point where a movie that isn’t driven by digital effects and an existing comic book property seems strangely exotic. At worst, such movies come off as the cynical cash grabs that, frankly, most of them are, a trend epitomized by Green Lantern, a would-be marketing bonanza so calculated that an A.V. Club headline summed it up as “Superhero movies are popular right now. Here’s another one.”
Which is why it gives me no small pleasure to report that Captain America is a pretty good movie, and in ways that seem utterly reproducible. This isn’t a film like The Dark Knight, which seems like an increasingly isolated case of a genius director being given all the resources he needed to make a singular masterpiece. Captain America is more the work of talented journeymen, guys who like what they do and are reasonably skilled at it, and who care enough to give the audience a good time—presumably with the kind of movie that they’d enjoy seeing themselves. Joe Johnston is no Chris Nolan, but in his own way, he does an even more credible Spielberg imitation than the J.J. Abrams of Super 8, and to more of a purpose. If this is undeniably a cash grab—as its closing minutes make excruciatingly clear—it’s also full-blooded and lovingly rendered.
As a result, it’s probably the comic book movie I enjoyed most this year. While it doesn’t have the icy elegance of X-Men: First Class, it has a better script (credited to Christopher Markus and Stephen McFeely), and it’s far superior to the muddled, halfhearted, and overpraised Thor. Part of this is due to the fact that it’s the only recent superhero movie to manage a credible supervillain: in retrospect, Hugo Weaving’s Red Skull doesn’t do much more than strut around, but he’s still mostly glorious. And it’s also one of the rare modern comic book movies that remembers that the audience might still like to see some occasional action. As Thor failed to understand, special effects alone aren’t enough: I’ve had my mind blown too many times before. Yet it’s still fun to see an expertly staged action scene that arises organically from the story, and Captain America has a good handful of those, at a time when I’ve almost forgotten what it was like to see one.
What Captain America does, then, isn’t rocket science: it’s what you’d expect from any big studio movie, done with a modicum of care, aiming to appeal to the largest possible audience. So why aren’t there more movies like this? Perhaps because it’s harder to do than it looks: for one thing, it requires a decent script, which, more than anything else, is the limiting factor in a movie’s quality, and can’t be fixed by throwing money at it. The more movies I see, the more I respect mainstream entertainment that tries to be more than disposable, an effort that can seem quixotic in an industry where Pirates of the Caribbean: On Stranger Tides earns a billion dollars worldwide. Like it or not, movies are going to look increasingly like this, which is why it’s a good idea to welcome quality wherever we find it. Because it isn’t enough for a superhero to be super anymore; he also needs to be special.
The other night, my wife asked, with genuine curiosity: “Why do you like Glee?” Which, honestly, is a really good question. I don’t watch a lot of television; I’m not, as far as I can tell, anything close to Glee’s target demographic; I know that Glee is fundamentally flawed, and often disappointing; and yet I find it fun to watch and, more surprisingly, interesting to think about. But why?
My only answer, aside from the fact that I like musicals, is that I enjoy Glee because of its flaws, because it can be frustrating and horrifically uneven, because it regularly neglects its own characters, and because an average episode can get nearly every moment wrong—and yet still remain a compelling show. For a writer who cares about pop culture, it’s the most interesting case study around. (As opposed to, say, Mad Men, which is the best TV drama I’ve ever seen, but much less instructive in its sheer perfection.)
Here, then, are some of the lessons, positive and negative, that I’ve tried to draw from Glee:
1. Do follow through on big moments. Howard Hawks defined a good movie as having three good scenes and no bad scenes. The average episode of Glee has maybe three good scenes and eight bad scenes, but the good stuff is usually executed with enough conviction and skill to carry the audience past the rest. The lesson? Every story has a few big moments. No matter what else you do as a writer, make sure those moments work.
2. Do invest the audience in your characters as early as possible. Glee’s pilot, which now seems so long ago, did an impressive job of generating interest in a massive cast of characters. Since then, nearly everything the pilot established has been thrown out the window, but the viewer’s initial engagement with Will, Rachel, and the rest still gives the show a lot of goodwill, which it hasn’t entirely squandered. (Please note, though, that a cast of appealing actors goes a long way toward maintaining the audience’s sympathy. In a novel, once your characters have lost the reader’s interest, it’s very hard to win it back.)
3. Do push against yourself and your story. A.V. Club critic Todd VanDerWerff has done a heroic job of arguing the “three authors” theory of Glee: that the show’s creators—Ryan Murphy, Brad Falchuk, and Ian Brennan—each have distinct, and conflicting, visions of what the show should be, and that this inherent tension is what makes the show so fascinating. Similarly, much of the interest of an ambitious novel comes from the writer’s struggle against the restrictions and contradictions of his or her own story. (Of course, if you don’t give yourself at least some constraints, such as those of genre, you aren’t likely to benefit from this.)
1. Don’t neglect structure. Remember the importance of constraints? The trouble with Glee is that it doesn’t seem to have any. Early on, the show established a tone and style in which almost anything could happen, which is fine—but even the most anarchic comedy benefits from following a consistent set of rules. In Glee’s case, a little more narrative coherence, and a lot more character consistency, would go a long way towards making it a great show, rather than a fascinating train wreck.
2. Don’t take your eye off the long game. Glee rather notoriously went through four years’ worth of plotlines in its first season, and as a result, the second season has seemed increasingly aimless. Obviously, it’s hard for most TV shows, which hover precariously between cancellation and renewal, to plan much further ahead than the next order of episodes, but a novelist has no such excuse. A writer has to maintain the reader’s interest over hundreds of pages, so as tempting as it is to put all your best ideas up front, it’s important to keep a few things in reserve, especially for the ending.
3. Don’t always give the audience what it wants. As Dan Harmon of Community has put it:

In terms of not giving people what they want, I think it’s a mandate: Don’t give people what they want, give them what they need. What they want is for Sam and Diane to get together. [Whispers.] Don’t give it to them. Trust me. [Normal voice.] You know?
Glee, because it was so successful so early on, and with such a devoted fan base, has repeatedly succumbed to the temptation to give viewers exactly what they want, whether it’s more jukebox episodes, bigger musical numbers, or a romance between two of its leads. (And fans don’t like it if the show takes one of these things away.) This approach might work in the short term, but in the long run, it leaves the show—as is becoming increasingly clear—with nowhere else to go. Remember: once your characters, or your readers, get what they want, the story is essentially over.
Of course, none of these issues have hurt Glee’s success, and judging from the last few episodes, the show is making an effort to dial back the worst of its excesses. And I do hope it continues to improve. As much as I enjoy it now, a show can’t work as a case study forever. Because a show like Glee is always interesting…until, alas, it isn’t.