The chosen ones
In his recent New Yorker profile of Mark Zuckerberg, Evan Osnos quotes one of the Facebook founder’s close friends: “I think Mark has always seen himself as a man of history, someone who is destined to be great, and I mean that in the broadest sense of the term.” Zuckerberg has “a teleological frame of feeling almost chosen,” and in his case, it happened to be correct. Yet this tells us almost nothing about Zuckerberg himself, because I can safely say that most other undergraduates at Harvard feel the same way. A writer for The Simpsons once claimed that the show had so many presidential jokes—like the one about Grover Cleveland spanking Grandpa “on two non-consecutive occasions”—because most of the writers secretly once thought that they would be president themselves, and he had a point. It’s very hard to do anything interesting in life without the certainty that you’re somehow one of the chosen ones, even if your estimation of yourself turns out to be wildly off the mark. (When I was in my twenties, my favorite point of comparison was Napoleon, while Zuckerberg seems to be more fond of Augustus: “You have all these good and bad and complex figures. I think Augustus is one of the most fascinating. Basically, through a really harsh approach, he established two hundred years of world peace.”) This kind of conviction is necessary for success, although hardly sufficient. The first human beings to walk on Mars may have already been born. Deep down, they know it, and this knowledge will determine their decisions for the rest of their lives. Of course, thousands of others “know” it, too. And just a few of them will turn out to be right.
One of my persistent themes on this blog is how we tend to confuse talent with luck, or, more generally, to underestimate the role that chance plays in success or failure. I never tire of quoting the economist Daniel Kahneman, who in Thinking, Fast and Slow shares what he calls his favorite equation:
Success = Talent + Luck
Great Success = A little more talent + A lot of luck
The truth of this statement seems incontestable. Yet we’re all reluctant to acknowledge its power in our own lives, and this tendency only increases as the roles played by luck and privilege assume a greater importance. This week has been bracketed by news stories about two men who embody this attitude at its most extreme. On the one hand, you have Brett Kavanaugh, a Yale legacy student who seems unable to recognize that his drinking and his professional success weren’t mutually exclusive, but closer to the opposite. He occupied a cultural and social stratum that gave him the chance to screw up repeatedly without lasting consequences, and we’re about to learn how far that privilege truly extends. On the other hand, you have yesterday’s New York Times exposé of Donald Trump, who took hundreds of millions of dollars from his father’s real estate empire—often in the form of bailouts for his own failed investments—while constantly describing himself as a self-made billionaire. This is hardly surprising, but it’s still striking to see the extent to which Fred Trump played along with his son’s story. He understood the value of that myth.
This gets at an important point about privilege, no matter which form it takes. We have a way of visualizing these matters in spatial terms—“upper class,” “lower class,” “class pyramid,” “rising,” “falling,” or “stratum” in the sense that I used it above. But true privilege isn’t spatial, but temporal. It unfolds over time, by giving its beneficiaries more opportunities to fail and recover, while those living at the edge might not be able to come back from the slightest misstep. We like to say that a privileged person is someone who was born on third base and thinks he hit a triple, but it’s more like being granted unlimited turns at bat. Kavanaugh provides a vivid reminder, in case we needed one, that a man who fits a certain profile has the freedom to make all kinds of mistakes, the smallest of which would be fatal for someone who didn’t look like he did. And this doesn’t just apply to drunken misbehavior, criminal or otherwise, but even to the legitimate failures that are necessary for the vast majority of us to achieve real success. When you come from the right background, it’s easier to survive for long enough to benefit from the effects of luck, which influences the way that we talk about failure itself. Silicon Valley speaks of “failing faster,” which only makes sense when the price of failure is humiliation or the loss of investment capital, not falling permanently out of the middle class. And as I’ve noted before, Pixar’s creative philosophy, which Andrew Stanton described as a process in which “the films still suck for three out of the four years it takes to make them,” is only practicable for filmmakers who look and sound like their counterparts at the top, which grants them the necessary creative freedom to fail repeatedly—a luxury that women are rarely afforded.
This may all come across as unbelievably depressing, but there’s a silver lining, and it took me years to figure it out. The odds of succeeding in any creative field—which includes nearly everything in which the standard career path isn’t clearly marked—are minuscule. Few who try will ever make it, even if they have “a teleological frame of feeling almost chosen.” This isn’t due to a lack of drive or talent, but of time and second chances. When you combine the absence of any straightforward instructions with the crucial role played by luck, you get a process in which repeated failure over a long period is almost inevitable. Those who drop out don’t suffer from weak nerves, but from the fact that they’ve used up all of their extra lives. Privilege allows you to stay in the game for long enough for the odds to turn in your favor, and if you’ve got it, you may as well use it. (An Ivy League education doesn’t guarantee success, but it drastically increases your ability to stick around in the middle class in the meantime.) In its absence, you can find strategies for minimizing risk in small ways while increasing it at the highest levels, which is just another word for becoming a bohemian. And the big takeaway here is that since the probability of success is already so low, you may as well do exactly what you want. It can be tempting to tailor your work to the market, reasoning that it will increase your chances ever so slightly, but in reality, the difference is infinitesimal. An objective observer would conclude that you’re not going to make it either way, and even if you do, it will take about the same amount of time to succeed by selling out as it would by staying true to yourself. You should still do everything that you can to make the odds more favorable, but if you’re probably going to fail anyway, you might as well do it on your own terms. And that’s the only choice that matters.
The Prime of Miss Elizabeth Hoover
Yesterday, as I was working on my post for this blog, I found myself thinking about the first time that I ever heard of Lyme disease, which, naturally, was on The Simpsons. In the episode “Lisa’s Substitute,” which first aired on April 25, 1991, Lisa’s teacher, Miss Hoover, tells the class: “Children, I won’t be staying long. I just came from the doctor, and I have Lyme disease.” As Principal Skinner cheerfully explains: “Lyme disease is spread by small parasites called ‘ticks.’ When a diseased tick attaches itself to you, it begins sucking your blood. Malignant spirochetes infect your bloodstream, eventually spreading to your spinal fluid and on into the brain.” At the end of the second act, however, Miss Hoover unexpectedly returns, and I’ve never forgotten her explanation for her sudden recovery:
Miss Hoover: You see, class, my Lyme disease turned out to be psychosomatic.
Ralph: Does that mean you’re crazy?
Janie: It means she was faking it.
Miss Hoover: No, actually, it was a little of both. Sometimes, when a disease is in all the magazines and on all the news shows, it’s only natural that you think you have it.
And while it might seem excessive to criticize a television episode that first aired over a quarter of a century ago, it’s hard to read these lines after Porochista Khakpour’s memoir Sick without wishing that this particular joke didn’t exist.
In its chronic form, Lyme disease remains controversial, but like chronic fatigue syndrome and fibromyalgia, it’s an important element in the long, complicated history of women having trouble finding doctors who will take their pain seriously. As Lidija Haas writes in The New Yorker:
There’s a class of illnesses—multi-symptomatic, chronic, hard to diagnose—that remain associated with suffering women and disbelieving experts. Lyme disease, symptoms of which can afflict patients years after the initial tick bite, appears to be one…[The musician Kathleen Hanna] describes an experience common to many sufferers from chronic illness—that of being dismissed as an unreliable witness to what is happening inside her. Since no single medical condition, a doctor once told her, could plausibly affect so many different systems—neurological, respiratory, gastrointestinal—she must be having a panic attack…As in so many other areas of American life, women of color often endure the most extreme versions of this problem.
It goes without saying that when “Lisa’s Substitute” was written, there weren’t any women on the writing staff of The Simpsons, although even if there had been, it might not have made a difference. In her recent memoir Just the Funny Parts, Nell Scovell, who worked as a freelance television writer in the early nineties, memorably describes the feeling of walking into the “all-male” Simpsons writing room, which was “welcoming, but also intimidating.” It’s hard to imagine these writers, so many of them undeniably brilliant, thinking twice about making a joke like this—and it’s frankly hard to see them rejecting it now, when it might only lead to attacks from people who, in Matt Groening’s words, “love to pretend they’re offended.”
I’m not saying that there are any subjects that should be excluded from comedic consideration, or that The Simpsons can’t joke about Lyme disease. But as I look back at the classic years of my favorite television show of all time, I’m starting to see a pattern that troubles me, and it goes far beyond Apu. I’m tempted to call it “punching down,” but it’s worse. It’s a tendency to pick what seem at the time like safe targets, and to focus with uncanny precision on comic gray areas that allow for certain forms of transgression. I know that I quoted this statement just a couple of months ago, but I can’t resist repeating what producer Bill Oakley says of Greg Daniels’s pitch about an episode on racism in Springfield:
Do you remember this? Something about Homer and Dr. Hibbert? Well, you pitched it several times and I think we were just…It was some exploration of the concept of race in Springfield, and we just said, you know, we don’t think this is the forum. The Simpsons can’t be the right forum to deal with racism.
He was probably right. But when you look at the few racially charged jokes that the show actually made, the characters involved weren’t black, but quite specifically “brown,” or members of groups that occupy a liminal space in our cultural understanding of race: Apu, Akira, Bumblebee Man. (I know that Akira was technically whiter than anybody else, but you get my drift.) By contrast, the show was very cautious when it came to its black characters. Apart from Dr. Hibbert, who was derived from Bill Cosby, the show’s only recurring black faces were Carl and Officer Lou, the latter of whom is so unmemorable that I had to look him up to make sure that he wasn’t Officer Eddie. And both Carl and Lou were given effectively the same voice by Hank Azaria, the defining feature of which was that it was as nondescript as humanly possible.
I’m not necessarily criticizing the show’s treatment of race, but the unconscious conservatism that carefully avoided potentially controversial areas while lavishing attention on targets that seemed unobjectionable. It’s hard to imagine a version of the show that would have dared to employ such stereotypes, even ironically, on Carl, Lou, or even Judge Snyder, who was so racially undefined that he was occasionally colored as white. (The show’s most transgressive black figures, Drederick Tatum and Lucius Sweet, were so transparently modeled on real people that they barely even qualified as characters. As Homer once said: “You know Lucius Sweet? He’s one of the biggest names in boxing! He’s exactly as rich and as famous as Don King, and he looks just like him, too!” And I’m not even going to talk about “Bleeding Gums” Murphy.) That joke about Miss Hoover is starting to feel much the same way, and if it took two decades for my own sensibilities to catch up with that fact, it’s for the same reasons that we’re finally taking a harder look at Apu. And if I speak as a fan, it isn’t to qualify these thoughts, but to get at the heart of why I feel obliged to write about them at all. We’re all shaped by popular culture, and I can honestly say of The Simpsons, as Jack Kerouac writes in On the Road: “All my actions since then have been dictated automatically to my subconscious by this horrible osmotic experience.” The show’s later seasons are reflexively dismissed as lazy, derivative, and reliant on easy jokes, but we still venerate its golden years. Yet if The Simpsons has gradually degraded under the watch of many of its original writers and producers, this implies that we’re only seeing the logical culmination—or eruption—of something that was there all along, afflicting its viewers years after the original bite. We all believed that The Simpsons, in its prime, was making us smarter. But what if it was just psychosomatic?
A season of disenchantment
A few days ago, Matt Groening announced that his new animated series, Disenchantment, will premiere in August on Netflix. Under other circumstances, I might have been pleased by the prospect of another show from the creator of The Simpsons and Futurama—not to mention producers Bill Oakley and Josh Weinstein—and I expect that I’ll probably watch it. At the moment, however, it’s hard for me to think about Groening at all without recalling his recent reaction to the long overdue conversation around the character of Apu. When Bill Keveney of USA Today asked earlier this month if he had any thoughts on the subject, Groening replied: “Not really. I’m proud of what we do on the show. And I think it’s a time in our culture where people love to pretend they’re offended.” It was a profoundly disappointing statement, particularly after Hank Azaria himself had expressed his willingness to step aside from the role, and it was all the more disillusioning coming from a man whose work has been a part of my life for as long as I can remember. As I noted in my earlier post, the show’s unfeeling response to this issue is painful because it contradicts everything that The Simpsons was once supposed to represent. It was the smartest show on television; it was simply right about everything; it offered its fans an entire metaphorical language. And as the passage of time reveals that it suffered from its own set of blinders, it doesn’t just cast doubt on the series and its creators, but on the viewers, like me, who used it for so long as an intellectual benchmark.
And it’s still an inescapable part of my personal lexicon. Last year, for instance, when Elon Musk defended his decision to serve on Trump’s economic advisory council, I thought immediately of what Homer says to Marge in “Whacking Day”: “Maybe if I’m part of that mob, I can help steer it in wise directions.” Yet it turns out that I might have been too quick to give Musk—who, revealingly, was the subject of an adulatory episode of The Simpsons—the benefit of the doubt. A few months later, in response to reports of discrimination at Tesla, he wrote an email to employees that included this remarkable paragraph:
If someone is a jerk to you, but sincerely apologizes, it is important to be thick-skinned and accept that apology. If you are part of a lesser represented group, you don’t get a free pass on being a jerk yourself. We have had a few cases at Tesla where someone in a less represented group was actually given a job or promoted over more qualified highly represented candidates and then decided to sue Tesla for millions of dollars because they felt they weren’t promoted enough. That is obviously not cool.
The last two lines, which were a clear reference to the case of A.J. Vandermeyden, tell us more about Musk’s idea of a “sincere apology” than he probably intended. And when Musk responded this week to criticism of Tesla’s safety and labor practices by accusing the nonprofit Center for Investigative Reporting of bias and proposing a site where users could provide a “credibility score” for individual journalists, he sounded a lot like the president whose circle of advisers he only reluctantly left.
Musk, who benefited from years of uncritical coverage from people who will forgive anything as long as you talk about space travel, seems genuinely wounded by any form of criticism or scrutiny, and he lashes out just as Trump does—by questioning the motives of ordinary reporters or sources, whom he accuses of being in the pocket of unions or oil companies. Yet he’s also right to be worried. We’re living in a time when public figures and institutions are going to be judged by their responses to questions that they would rather avoid, which isn’t likely to change. And the media itself is hardly exempt. For the last two weeks, I’ve been waiting for The New Yorker to respond to stories about the actions of two of its most prominent contributors, Junot Díaz and the late David Foster Wallace. I’m not even sure what I want the magazine to do, exactly, except make an honest effort to grapple with the situation, and maybe even offer a valuable perspective, which is why I read it in the first place. (In all honesty, it fills much the same role in my life these days as The Simpsons did in my teens. As Norman Mailer wrote back in the sixties: “Hundreds of thousands, perhaps millions of people in the most established parts of the middle class kill their quickest impulses before they dare to act in such a way as to look ridiculous to the private eye of their taste whose style has been keyed by the eye of The New Yorker.”) As the days passed without any comment, I assumed that it was figuring out how to tackle an admittedly uncomfortable topic, and I didn’t expect it to rush. Now that we’ve reached the end of the month without any public engagement at all, however, I can only conclude that it’s deliberately ignoring the matter in hopes that it will go away. I hope that I’m wrong. But so far, it’s a discouraging omission from a magazine whose stories on Harvey Weinstein and Eric Schneiderman implicitly put it at the head of an entire movement.
The New Yorker has evidently discovered that it’s harder to take such stands when they affect people whom we know or care about—which only means that it can get in line. Our historical moment has forced some of our smartest individuals and organizations to learn how to take criticism as well as to give it, and it’s often those whose observations about others have been the sharpest who turn out to be singularly incapable, as Clarice Starling once put it, when it comes to pointing that high-powered perception at themselves. (In this list, which is constantly being updated, I include Groening, Musk, The New Yorker, and about half the cast of Arrested Development.) But I can sympathize with their predicament, because I feel it nearly every day. My opinion of Musk has always been rather mixed, but nothing can dislodge the affection and gratitude that I feel toward the first eight seasons of The Simpsons, and I expect to approvingly link to an article in The New Yorker this time next week. But if our disenchantment forces us to question the icons whose influence is fundamental to our conception of ourselves, then perhaps it will have been worth the pain. Separating our affection for the product from those who produced it is a problem that we all have to confront, and it isn’t going to get any easier. As I was thinking about this post yesterday, the news broke that Morgan Freeman had been accused by multiple women of inappropriate behavior. In response, he issued a statement that read in part: “I apologize to anyone who felt uncomfortable or disrespected.” It reminded me a little of another man who once grudgingly said of some remarks that were caught on tape: “I apologize if anyone was offended.” But it sounds a lot better when you imagine it in Morgan Freeman’s voice.
Tales from The Far Side
Note: I’m taking a few days off for the holidays, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on September 27, 2016.
Last year, when I finally saw The Revenant—it wasn’t a movie that my wife particularly wanted to see, so I had to wait for one of the rare weekends when she was out of town—it struck me as an exquisitely crafted film that was very hard to take seriously. Alejandro G. Iñárritu, despite his obvious visual gifts, may be the most pretentious and least self-aware director at work today—which is one reason why Birdman fell so flat in my eyes—and I would have liked The Revenant a lot more if it had allowed itself to smile a little at how absurd its story was. (Even the films of someone like Werner Herzog include flashes of dark humor, and I suspect that Herzog, who doesn’t lack for pretension, also actively seeks out such moments, even if he maintains his poker face throughout.) About five minutes after the movie began, I realized that I was fundamentally out of sync with it. It happened during the scene in which the fur trappers find themselves under attack by an Arikara war party, which announces itself, in classic fashion, with an unexpected arrow through a supporting character’s throat. A few seconds later, the camera pans up to show more arrows, now on fire, arcing through the trees overhead. It’s an eerie sight, and it’s given the usual glow by Emmanuel Lubezki’s luminous cinematography. But I’ll confess that when I first saw it, I said to myself: “Hey! They’re lighting their arrows! Can they do that?”
It’s a caption from a Far Side cartoon, of course, and it started me thinking about the ways in which the work of Gary Larson has imperceptibly shaped my inner life. I’ve spoken here before about how quotations from The Simpsons provide a complete metaphorical language for fans, like the one that Captain Picard learns in “Darmok.” You could do much the same thing with Larson’s captions, and there are a lot of fluent speakers out there. Peanuts is still the comic strip that has meant the most to me, and I count myself lucky that I grew up at a time when I could read most of Calvin and Hobbes in its original run. Yet both of these strips, like Bloom County, lived most vividly for me in the form of collections, and in the case of Peanuts, its best years were long behind it. The Far Side, by contrast, obsessed me on a daily basis, more than any other comic strip of its era. When I was eight years old, I spent a few months diligently cutting out all the panels from my local paper and pasting them into a scrapbook, which is an impulse that I haven’t felt since. Two decades later, I got a copy of The Complete Far Side for Christmas, which might still be my favorite present ever. Every three years or so, I get bitten by the bug again, and I spend an evening or two with one of those huge volumes on my lap, going through the strip systematically from beginning to end. Its early years are a little rough, but they’re still wonderful, and it went out at its peak. And when I’m reading it in the right mood, there’s nothing else in the world that I’d rather be doing.
A gag panel might seem like the lowest form of comic, but The Far Side also had a weirdly novelistic quality that I’ve always admired as a writer. Larson’s style seemed easy to imitate—I think that every high school newspaper had a strip that verged on outright plagiarism—but his real gift was harder to pin down. It was the ability to take what seemed like an ongoing story, pause it, and offer it up to readers at a moment of defining absurdity. (Larson himself observes in The Prehistory of The Far Side: “Cartoons are, after all, little stories themselves, frozen at an interesting point in time.”) His ideas stick in the brain because we can’t help but wonder what happened before or afterward. Part of this is because he cleverly employed all the usual tropes of the gag cartoon, which are fun precisely because of the imaginative fertility of the clichés they depict: the cowboys singing around a campfire, the explorers in pith helmets hacking their way through the jungle, the castaway on the desert island. But the snapshots in time that Larson captures are simultaneously so insane and so logical that the reader has no choice but to make up a story. The panel is never the inciting incident or the climax, but a ticklish moment somewhere in the middle. It can be the gigantic mailman knocking over buildings while a dog exhorts a crowd of his fellows: “Listen! The authorities are helpless! If the city’s to be saved, I’m afraid it’s up to us! This is our hour!” Or the duck hunter with a shotgun confronted by a row of apparitions in a hall of mirrors: “Ah, yes, Mr. Frischberg, I thought you’d come…but which of us is the real duck, Mr. Frischberg, and not just an illusion?”
In fact, you could easily go through a Far Side collection and use it as a series of writing prompts, like some demented version of The Mysteries of Harris Burdick. I’ve occasionally thought about writing a story revolving around the sudden appearance of Professor DeArmond, “the epitome of evil among butterfly collectors,” or expanding on the incomparable caption: “Dwayne paused. As usual, the forest was full of happy little animals—but this time something seemed awry.” It’s hard to pick just one favorite, but the panel I’ve thought about the most is probably the one with the elephant in the trench coat, speaking in a low voice out of the darkness of the stairwell:
Remember me, Mr. Schneider? Kenya. 1947. If you’re going to shoot at an elephant, Mr. Schneider, you better be prepared to finish the job.
Years later, I spent an ungodly amount of time working on a novel, still unpublished, about an elephant hunt, and while I wouldn’t go so far as to say that it was inspired by this cartoon, I’m also not prepared to say that it wasn’t. I should also note Larson’s mastery of perfect proper names, which are harder to come up with than you might think: “Mr. Frischberg” and “Mr. Schneider” were both so nice that he said them twice. And it’s that inimitable mixture of the ridiculous and the specific that makes Larson such a model for storytellers. He made it to the far side thirty years ago, and we’re just catching up to him now.
The conveyor belt
For all the endless discussion of various aspects of Twin Peaks, one quality that sometimes feels neglected is the incongruous fact that it had one of the most attractive casts in television history. In that respect—and maybe in that one alone—it was like just about every other series that ever existed. From prestige dramas to reality shows to local newscasts, the story of television has inescapably been that of beautiful men and women on camera. A show like The Hills, which was one of my guilty pleasures, seemed to be consciously trying to see how long it could coast on surface beauty alone, and nearly every series, ambitious or otherwise, has used the attractiveness of its actors as a commercial or artistic strategy. (In one of the commentary tracks on The Simpsons, a producer describes how a network executive might ask indirectly about the looks of the cast of a sitcom: “So how are we doing aesthetically?”) If this seemed even more pronounced on Twin Peaks, it was partially because, like Mad Men, it took its conventionally glamorous actors into dark, unpredictable places, and also because David Lynch had an eye for a certain kind of beauty, both male and female, that was more distinctive than that of the usual soap opera star. He’s continued this trend in the third season, which has been populated so far by such striking presences as Chrysta Bell, Ben Rosenfield, and Madeline Zima, and last night’s episode features an extended, very funny scene between a delighted Gordon Cole and a character played by Bérénice Marlohe, who, with her red lipstick and “très chic” spike heels, might be the platonic ideal of his type.
Lynch isn’t the first director to display a preference for actors, particularly women, with a very specific look—although he’s thankfully never taken it as far as his precursor Alfred Hitchcock did. And the notion that a film or television series can consist of little more than following around two beautiful people with a camera has a long and honorable history. My two favorite movies of my lifetime, Blue Velvet and Chungking Express, both understand this implicitly. It’s fair to say that the second half of the latter film would be far less watchable if it didn’t involve Tony Leung and Faye Wong, two of the most attractive people in the world, and Wong Kar-Wai, like so many filmmakers before him, uses it as a psychological hook to take us into strange, funny, romantic places. Blue Velvet is a much darker work, but it employs a similar lure, with the actors made up to look like illustrations of themselves. In a Time cover story on Lynch from the early nineties, Richard Corliss writes of Kyle MacLachlan’s face: “It is a startling visage, as pure of line as an art deco vase, with soft, all-American features and a comic-book hero’s jutting chin—you could park a Packard on it.” It echoes what Pauline Kael says of Isabella Rossellini in Blue Velvet: “She even has the kind of nostrils that cover artists can represent accurately with two dots.” MacLachlan’s chin and Rossellini’s nose would have caught our attention in any case, but it’s also a matter of lighting and makeup, and Lynch shoots them to emphasize their roots in the pulp tradition, or, more accurately, in the subconscious store of images that we take from those sources. And the casting gets him halfway there.
This leaves us in a peculiar position when it comes to the third season of Twin Peaks, which, both by nature and by design, is about aging. Mark Frost said in an interview: “It’s an exercise in engaging with one of the most powerful themes in all of art, which is the ruthless passage of time…We’re all trapped in time and we’re all going to die. We’re all traveling along this conveyor belt that is relentlessly moving us toward this very certain outcome.” One of the first, unforgettable images from the show’s promotional materials was Kyle MacLachlan’s face, a quarter of a century older, emerging from the darkness into light, and our feelings toward these characters when they were younger inevitably shape the way we regard them now. I felt this strongly in two contrasting scenes from last night’s episode. It offers us our first extended look at Sarah Palmer, played by Grace Zabriskie, who delivers a freakout in a grocery store that reminds us of how much we’ve missed and needed her—it’s one of the most electrifying moments of the season. And we also finally see Audrey Horne again, in a brutally frustrating sequence that feels to me like the first time that the show’s alienating style comes off as a miscalculation, rather than as a considered choice. Audrey isn’t just in a bad place, which we might have expected, but a sad, unpleasant one, with a sham marriage and a monster of a son, and she doesn’t even know the worst of it yet. It would be a hard scene to watch with anyone, but it’s particularly painful when we set it against our first glimpse of Audrey in the original series, when we might have said, along with the Norwegian businessman at the Great Northern Hotel: “Excuse me, is there something wrong, young pretty girl?”
Yet the two scenes aren’t all that dissimilar. Both Sarah and Audrey are deeply damaged characters who could fairly say: “Things can happen. Something happened to me.” And I can only explain away the difference by confessing that I was a little in love in my early teens with Audrey. Using those feelings against us—much as the show resists giving us Dale Cooper again, even as it extravagantly develops everything around him—must have been what Lynch and Frost had in mind. And it isn’t the first time that this series has toyed with our emotions about beauty and death. The original dream girl of Twin Peaks, after all, was Laura Palmer herself, as captured in two of its most indelible images: Laura’s prom photo, and her body wrapped in plastic. (Sheryl Lee, like January Jones in Mad Men, was originally cast for her look, and only later did anyone try to find out whether or not she could act.) The contrast between Laura’s lovely features and her horrifying fate, in death and in the afterlife, was practically the motor on which the show ran. Her face still opens every episode of the revival, dimly visible in the title sequence, but it also ended each installment of the original run, gazing out from behind the prison bars of the closing credits to the strains of “Laura Palmer’s Theme.” In the new season, the episodes generally conclude with whatever dream pop band Lynch feels like showcasing, usually with a few cool women, and I wouldn’t want to give that up. But I also wonder whether we’re missing something when we take away Laura at the end. This season began with Cooper being asked to find her, but she often seems like the last thing on anyone’s mind. Twin Peaks never allowed us to forget her before, because it left us staring at her photograph each week, which was the only time that one of its beautiful faces seemed to be looking back at us.
Live from Twin Peaks
What does Twin Peaks look like without Agent Cooper? It was a problem that David Lynch and his writing team were forced to solve for Fire Walk With Me, when Kyle MacLachlan declined to come back for much more than a token appearance, and now, in the show’s third season, Lynch and Mark Frost seem determined to tackle the question yet again, even though they’ve given their leading man more screen time than anyone could ever want. MacLachlan’s name is the first thing that we see in the closing credits, in large type, to the point where it’s starting to feel like a weekly punchline—it’s the only way that we’d ever know that the episode was over. He’s undoubtedly the star of the show. Yet even as we’re treated to an abundance of Dark Cooper and Dougie Jones, we have yet to see the one character that I, and a lot of other fans, have been awaiting the most impatiently. Dale Cooper, it’s fair to say, is one of the most peculiar protagonists in television history. As the archetypal outsider coming into an isolated town to investigate a murder, he seems at first like a natural surrogate for the audience, but, if anything, he’s quirkier and stranger than many of the locals he encounters. When we first meet Cooper, he comes across as an almost unplayable combination of personal fastidiousness, superhuman deductive skills, and childlike wonder. But if you’re anything like me, you wanted to be like him. I ordered my coffee black for years. And if he stood for the rest of us, it was as a representative of the notion, which crumbles in the face of logic but remains emotionally inescapable, that the town of Twin Peaks would somehow be a wonderful place to live, despite all evidence to the contrary.
In the third season, this version of Cooper, whom I’ve been waiting for a quarter of a century to see again, is nowhere in sight. And the buildup to his return, which I still trust will happen sooner or later, has been so teasingly long that it can hardly be anything but a conscious artistic choice. With every moment of recognition—the taste of coffee, the statue of the gunfighter in the plaza—we hope that the old Cooper will suddenly reappear, but the light in his eyes always fades. On some level, Lynch and Frost are clearly having fun with how long they can get away with this, but by removing the keystone of the original series, they’re also leaving us with some fascinating insights into what kind of show this has been from the very beginning. Let’s tick off its qualities one by one. Over the course of any given episode, it cuts between what seems like about a dozen loosely related plotlines. Most of the scenes last between two and four minutes, with about the same number of characters, and the components are too far removed from one another to provide anything in the way of narrative momentum. They aren’t built around any obligation to advance the plot, but around striking images or odd visual or verbal gags. The payoff, as in the case of Dr. Jacoby’s golden shovels, often doesn’t come for hours, and when it does, it amounts to the end of a shaggy dog story. (The closest thing we’ve had so far to a complete sequence is the sad case of Sam, Tracey, and the glass cube, which didn’t even make it past the premiere.) If there’s a pattern, it isn’t visible, but the result is still strangely absorbing, as long as you don’t approach it as a conventional drama but as something more like Twenty-Two Short Films About Twin Peaks.
You know what this sounds like to me? It sounds like a sketch comedy show. I’ve always seen Twin Peaks as a key element in a series of dramas that stretches from The X-Files through Mad Men, but you could make an equally strong case for it as part of a tradition that runs from SCTV to Portlandia, which went so far as to cast MacLachlan as its mayor. They’re set in a particular location with a consistent cast of characters, but they’re essentially sketch comedies, and when one scene is over, they simply cut to the next. In some ways, the use of a fixed setting is a partial solution to the problem of transitions, which shows from Monty Python onward have struggled to address, but it also creates a beguiling sense of encounters taking place beyond the edges of the frame. (Matt Groening has pointed to SCTV as an inspiration for The Simpsons, with its use of a small town in which the characters were always running into one another. Groening, let’s not forget, was born in Portland, just two hours away from Springfield, which raises the intriguing question of why such shows are so drawn to the atmosphere of the Pacific Northwest.) Without Cooper, the show’s affinities to sketch comedy are far more obvious—and this isn’t the first time this has happened. After Laura’s murderer was revealed in the second season, the show seemed to lose direction, and many of the subplots, like James’s interminable storyline with Evelyn, became proverbial for their pointlessness. But in retrospect, that arid middle stretch starts to look a lot like an unsuccessful sketch comedy series. And it’s worth remembering that Lynch and Frost originally hoped to keep the identity of the killer a secret forever, knowing that it was all that was holding together the rest.
In the absence of a connective thread, it takes a genius to make this kind of thing work, and the lack of a controlling hand is a big part of what made the second season so markedly unsuccessful. Fortunately, the third season has a genius readily available. The sketch format has always been David Lynch’s comfort zone, a fact that has been obscured by contingent factors in his long career. Lynch, who was trained as a painter and conceptual artist, thinks naturally in small narrative units, like the video installations that we glimpse for a second as we wander between rooms in a museum. Eraserhead is basically a bunch of sketches linked by its titular character, and he returned to that structure in Inland Empire, which, thanks to the cheapness of digital video, was the first movie in decades that he was able to make entirely on his own terms. In between, the inclination was present but constrained, sometimes for the better. In its original cut of three hours, Blue Velvet would have played much the same way, but in paring it down to its contractually mandated runtime, Lynch and editor Duwayne Dunham ended up focusing entirely on its backbone as a thriller. (It’s an exact parallel to Annie Hall, which began as a three-hour series of sketches called Anhedonia that assumed its current form after Woody Allen and Ralph Rosenblum threw out everything that wasn’t a romantic comedy.) Most interesting of all is Mulholland Drive, which was originally shot as a television pilot, with fragmented scenes that were clearly supposed to lead to storylines of their own. When Lynch recut it into a movie, they became aspects of Betty’s dream, which may have been closer to what he wanted in the first place. And in the third season of Twin Peaks, it is happening again.
The Berenstain Barrier
If you’ve spent any time online in the last few years, there’s a decent chance that you’ve come across some version of what I like to call the Berenstain Bears enigma. It’s based on the fact that a sizable number of readers who recall this book series from childhood remember the name of its titular family as “Berenstein,” when in reality, as a glance at any of the covers will reveal, it’s “Berenstain.” As far as mass instances of misremembering are concerned, this isn’t particularly surprising, and certainly less bewildering than the Mandela effect, or the similar confusion surrounding a nonexistent movie named Shazam. But enough people have been perplexed by it to inspire speculation that these false memories may be the result of an errant time traveler, à la Asimov’s The End of Eternity, or an event in which some of us crossed over from an alternate universe in which the “Berenstein” spelling was correct. (If the theory had emerged a few decades earlier, Robert Anton Wilson might have devoted a page or two to it in Cosmic Trigger.) Even if we explain it as an understandable, if widespread, mistake, it stands as a reminder of how an assumption absorbed in childhood remains far more powerful than a falsehood learned later on. If we discover that we’ve been mispronouncing, say, “Steve Buscemi” for all this time, we aren’t likely to take it as evidence that we’ve ended up in another dimension, but the further back you go, the more ingrained such impressions become. It’s hard to unlearn something that we’ve believed since we were children—which indicates how difficult it can be to discard the more insidious beliefs that some of us are taught from the cradle.
But if the Berenstain Bears enigma has proven to be unusually persistent, I suspect that it’s because many of us really are remembering different versions of this franchise, even if we believe that we aren’t. (You could almost take it as a version of Hilary Putnam’s Twin Earth thought experiment, which asks if the word “water” means the same thing to us and to the inhabitants of an otherwise identical planet covered with a similar but different liquid.) As I’ve recently discovered while reading the books aloud to my daughter, the characters originally created by Stan and Jan Berenstain have gone through at least six distinct incarnations, and your understanding of what this series “is” largely depends on when you initially encountered it. The earliest books, like The Bike Lesson or The Bears’ Vacation, were funny rhymed stories in the Beginner Book style in which Papa Bear injures himself in various ways while trying to teach Small Bear a lesson. They were followed by moody, impressionistic works like Bears in the Night and The Spooky Old Tree, in which the younger bears venture out alone into the dark and return safely home after a succession of evocative set pieces. Then came big educational books like The Bears’ Almanac and The Bears’ Nature Guide, my own favorites growing up, which dispensed scientific facts in an inviting, oversized format. There was a brief detour through stories like The Berenstain Bears and the Missing Dinosaur Bone, which returned to the Beginner Book format but lacked the casually violent gags of the earlier installments. Next came perhaps the most famous period, with dozens of books like Trouble With Money and Too Much TV, all written, for the first time, in prose, and ending with a tidy, if secular, moral. Finally, and jarringly, there was an abrupt swerve into Christianity, with titles like God Loves You and The Berenstain Bears Go to Sunday School.
To some extent, you can chalk this up to the noise—and sometimes the degeneration—that afflicts any series that lasts for half a century. Incremental changes can lead to radical shifts in style and tone, and they only become obvious over time. (Peanuts is the classic example, but you can even see it in the likes of Dennis the Menace and The Family Circus, both of which were startlingly funny and beautifully drawn in their early years.) Fashions in publishing can drive an author’s choices, which accounts for the ups and downs of many a long career. And the bears only found Jesus after Mike Berenstain took over the franchise following the deaths of his parents. Yet many critics don’t bother making these distinctions, and the ones who hate the Berenstain Bears books seem to associate them entirely with the Trouble With Money period. In 2005, for instance, Paul Farhi of the Washington Post wrote:
The larger questions about the popularity of the Berenstain Bears are more troubling: Is this what we really want from children’s books in the first place, a world filled with scares and neuroses and problems to be toughed out and solved? And if it is, aren’t the Berenstain Bears simply teaching to the test, providing a lesson to be spit back, rather than one lived and understood and embraced? Where is the warmth, the spirit of discovery and imagination in Bear Country? Stan Berenstain taught a million lessons to children, but subtlety and plain old joy weren’t among them.
Similarly, after Jan Berenstain died, Hanna Rosin of Slate said: “As any right-thinking mother will agree, good riddance. Among my set of mothers the series is known mostly as the one that makes us dread the bedtime routine the most.”
Which only tells me that neither Farhi nor Rosin ever saw The Spooky Old Tree, which is a minor masterpiece—quirky, atmospheric, gorgeously rendered, and utterly without any lesson. It’s a book that I look forward to reading with my daughter. And while it may seem strange to dwell so much on these bears, it gets at a larger point about the pitfalls in judging any body of work by looking at a random sampling. I think that Peanuts is one of the great artistic achievements of the twentieth century, but it would be hard to convince anyone who was only familiar with its last two decades. You can see the same thing happening with The Simpsons, a series with six perfect seasons that threaten to be overwhelmed by the mediocre decades that are crowding the rest out of syndication. And the transformations of the Berenstain Bears are nothing compared to those of Robert A. Heinlein, whose career somehow encompassed Beyond This Horizon, Have Spacesuit—Will Travel, Starship Troopers, Stranger in a Strange Land, and I Will Fear No Evil. Yet there are also risks in drawing conclusions from the entirety of an artist’s output. In his biography of Anthony Burgess, Roger Lewis notes that he has read through all of Burgess’s work, and he asks parenthetically: “And how many have done that—except me?” He’s got a point. Trying to internalize everything, especially over a short period of time, can provide as false a picture as any subset of the whole, and it can result in a pattern that not even the author or the most devoted fan would recognize. Whether or not we’re from different universes, my idea of Bear Country isn’t the same as yours. That’s true of any artist’s work, and it hints at the problem at the root of all criticism: What do we talk about when we talk about the Berenstain Bears?
In the cards
It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.
Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)
Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?
The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”
Keeping us in suspense
At last night’s presidential debate, when moderator Chris Wallace asked if he would accept the outcome of the election, Donald Trump replied: “I’ll keep you in suspense, okay?” It was an extraordinary moment that immediately dominated the headlines, and not just because it was an unprecedented repudiation of a crucial cornerstone of the democratic process. Trump’s statement—it seems inaccurate to call it a “gaffe,” since it clearly reflects his actual views—was perhaps the most damaging remark anyone could have made in that setting, and it reveals a curious degree of indifference, or incompetence, in a candidate who has long taken pride in his understanding of the media. It was a short, unforgettable sound bite that could instantly be brought to members of both parties for comment. And it wasn’t an arcane matter of policy or an irrelevant personal issue, but an instantly graspable attack on assumptions shared by every democratically elected official in America, and presumably by the vast majority of voters. Even if Trump had won the rest of the debate, which he didn’t, those six words would have erased whatever gains he might have made. Not only was it politically and philosophically indefensible, but it was a ludicrous tactical mistake, an unforced error in response to a question that he and his advisors knew was going to be asked. As Julia Azari put it during the live chat on FiveThirtyEight: “The American presidency is not the latest Tana French novel—leaders can’t keep the people in suspense.”
But the phrase that he used tells us a lot about Trump. I’m speaking as someone who has devoted my fair share of thought to suspense itself: I’ve written a trilogy of thrillers and blogged here about the topic at length. When I think about the subject, I often start with what John Updike wrote in a review of Nabokov’s Glory, which is that it “never really awakens to its condition as a novel, its obligation to generate suspense.” What Updike meant is that stories are supposed to make us wonder about what’s going to happen next, and it’s that state of pleasurable anticipation that keeps us reading. It can be an end in itself, but it can also be a literary tool for sustaining the reader’s interest while the writer tackles other goals. As Kurt Vonnegut once said of plot, it isn’t necessarily an accurate representation of life, but a way to keep readers turning pages. Over time, the techniques of suspense have developed to the point where you can simulate it using purely mechanical tricks. If you watch enough reality television, you start to notice how the grammar of the editing repeats itself, whether you’re talking about Top Chef or Project Runway or Jim Henson’s Creature Shop. The delay before the judges deliver their decision, the closeups of the faces of the contestants, the way in which an editor pads out the moment by inserting cutaways between every word that Padma Lakshmi says—these are all practical tools that can give a routine stretch of footage the weight of the verdict in the O.J. Simpson trial. You can rely on them when you can’t rely on the events of the show itself.
And the best trick of all is to have a host who keeps things moving whenever the contestants or guests start to drag. That’s where someone like Trump comes in. He’s an embarrassment, but he’s far from untalented, at least within the narrow range of competence in which he used to operate. When I spent a season watching The Celebrity Apprentice—my friend’s older sister was on it—I was struck by how little Trump had to do: he was only onscreen for a few minutes in each episode. But he was good at his job, and he was also the obedient instrument of his producers. He has approached the campaign with the same mindset, but with few of the resources that are at an actual reality show’s disposal. Trump’s strategy has been built around the idea that he doesn’t need to spend money on advertising or a ground game, as long as the media provides him with free coverage. It’s an interesting experiment, but there’s a limit to how effective it can be. In practice, Trump is less like the producer or the host than a contestant, which reduces him to acting like a reality star who wants to maximize his screen time: say alarming things, pick fights, act unpredictably, and generate the footage that the show needs, while never realizing that the incentives of the contestants and producers are fundamentally misaligned. (He should have just watched the first season of UnREAL.) When he says that he’ll keep us in suspense about accepting the results of the election, he’s just following the reality show playbook, which is to milk such climactic moments for all they’re worth.
Yet this approach has backfired, and television provides us with some important clues as to why. I once believed that the best analogy to Trump’s campaign was the rake gag made famous by The Simpsons. As producer Al Jean described it: “Sam Simon had a theory that if you repeat a joke too many times, it stops being funny, but if you keep on repeating it, it might get really funny.” Trump performed a rake gag in public for months. First we were offended when he made fun of John McCain’s military service; then he said so many offensive things that we became numb to it; and then it passed a tipping point, and we got really offended. I still think that’s true. But there’s an even better analogy from television, which is the practice of keeping the audience awake by killing off major characters without warning. As I’ve said here before, it’s a narrative trick that used to seem daring, but now it’s a form of laziness: it’s easier to deliver shocking death scenes than to tell interesting stories about the characters who are still alive. In Trump’s case, the victims are ideas, or key constituents of the electorate: minorities, immigrants, women. When Trump turned on Paul Ryan, it was the equivalent of one of those moments, like the Red Wedding on Game of Thrones, when you’re supposed to gasp and realize that nobody is safe. His attack on a basic principle of democracy might seem like more of the same, but there’s a difference. The strategy might work for a few seasons, but there comes a point at which the show cuts itself too deeply, and there aren’t any characters left that we care about. This is where Trump is now. And by telling us that he’s going to keep us in suspense, he may have just made the ending a lot less suspenseful.
Loving the alienator
For what does it profit a man to gain the whole world and forfeit his soul?
—Mark 8:36
Whether or not you’re a believer, you eventually end up with your own idea of who Jesus might have been. I like to think of him as the ultimate pragmatist. If you accept his central premise—that the kingdom of heaven, whatever it is, is something that is happening right now—then his ethical system, as impossible as it might seem for most of us to follow, becomes easier to understand. It’s about eliminating distractions, focusing on what really counts, and removing sources of temptation before they have a chance to divert us from the true goal. Poverty, as Michael Grant puts it in Jesus: An Historian’s Review of the Gospels, is a practical solution to a concrete problem: “Excessive wealth might be a positive disadvantage, since its too lavish enjoyment could distract its possessors from the overriding vital matter at hand.” And as Grant observes elsewhere:
Certainly, “blessed are the meek”…but that is because “they shall inherit the earth.” Since nothing less than this is at stake, a contentious spirit is wholly out of place, for it will only distract attention and energy from the preeminent task. It is not even worth hating your enemies…In the urgent circumstances, Jesus believed, it was a sheer waste of time. Love them instead, just as much as you love everyone else; pray for those who persecute you, turn the other cheek. For why not avoid hostilities and embroilments which, beside the infinitely larger issue, are ultimately irrelevant and distracting?
“Love your enemies,” in other words, is nothing but sensible advice. Which doesn’t make it any easier to do for real, rather than merely paying it lip service, when it strikes us as inconvenient.
Take the case of Donald Trump. It’s fair to say that I feel less love toward Trump than I do toward any other American public figure of my lifetime. At my best, I just want to go back to the days when I could safely ignore him; at my worst, I want him to suffer some kind of humiliating, career-ending comeuppance, although I’m well aware that real life rarely affords such satisfactions. (If anything, it’s more likely to give us the opposite.) I’m also uncomfortably conscious that this is exactly the kind of reaction that he wants to evoke from me. It’s a victory. No matter what happens in this election, Trump has added perceptibly to the world’s stockpile of hate, resentment, and alienation. Hating him and what he stands for is easy; what isn’t so easy is trying to respond in ways that don’t merely feed into the cycle of hatred. The answer—and I wish it were different—is right there in front of us. We’re told to love our enemies. Jesus, the pragmatic philosopher, knew that there wasn’t time for anything else. But when I think about doing the same with Trump, I feel a bit like Meg Murry in A Wrinkle in Time, when she realizes that love is the only weapon that will work against IT, the hideous brain that rules the planet of Camazotz:
If she could give love to IT perhaps it would shrivel up and die, for she was sure that IT could not withstand love. But she, in all her weakness and foolishness and baseness and nothingness, was incapable of loving IT. Perhaps it was not too much to ask of her, but she could not do it.
The italics, as always, are mine. It isn’t too much to ask. But it’s one thing to acknowledge this, and quite another to grant that we’re obliged to do it for someone like Donald Trump.
So here’s my best shot. Trump grew up wanting nothing more than to please his own demanding father. Early in his career, he was just one real estate developer among many. He ended up concluding that the only values worth pursuing were the acquisition of money and power, abstracted from any possible benefit except as a way of keeping score. What’s worse, he received plenty of validation that his assumptions were correct. He’s never had any reason to grow or change. Instead, as we all do, he’s become more like himself as he’s aged, while categorizing the human beings around him as sources of income, enemies, or potential enablers. Behind his bluster, he’s deeply insecure, as we all are. He refuses to take responsibility for his actions, he can’t admit a mistake, and he blames everyone but himself when things go wrong. (When he says that the first debate was “rigged” because someone tampered with his mike and the moderator was against him, I’m reminded of what David Mamet says in On Directing Film: “Two reasons are equal to no reasons—it’s like saying: ‘I was late because the bus drivers are on strike and my aunt fell downstairs.’”) He seems unhappy. It’s hard to imagine him taking pleasure in reading a book, preparing a meal, or really anything aside from trolling the electorate and putting his name on buildings and planes. He appears to have no affection for anyone or anything, except perhaps his own children. And he’s the creation of forces that even he can’t control. He’s succeeded beyond his wildest expectations, but only by becoming the full-time monster that was only there in flashes before. Trump uses the system, but it also uses him. He has transformed himself into exactly what he hopes people want him to be, and he’s condemned to do it forever. And when the end comes—”As it must to all men,” the newsreel narrator reminds us in Citizen Kane—he’ll have to ask himself whether it was worth it.
I know that this comes perilously close to what the onlookers say after seeing Marge Simpson’s nude portrait of Mr. Burns: “He’s bad, but he’ll die. So I like it.” But it’s the best I can do. I can’t love Trump, but I can sort of forgive him, and pity him, for becoming what he was told to be, and for abandoning what makes us human and valuable—empathy, compassion, humility—in favor of an identity assembled from who we are at our worst. In a way, I’m even grateful to him, for much the same reason that George Saunders expressed in The New Yorker: “Although, to me, Trump seems the very opposite of a guardian angel, I thank him for this: I’ve never before imagined America as fragile, as an experiment that could, within my very lifetime, fail. But I imagine it that way now.” If Trump didn’t exist, it would have been necessary to invent him. He’s a better cautionary tale than any I could have imagined, because he won the trappings of success at a spiritual cost that isn’t tragic so much as deeply sad. He’s like Charles Foster Kane, without any of the qualities that make Kane so misleadingly attractive. When I think of the abyss of his ego, which draws like a battery on the love of his supporters and flails helplessly in every other situation, it feels like the logical extension of a career spent in the pursuit of wealth and celebrity divorced from any other consideration beyond himself. Like all mortals, Trump had exactly one chance to live a meaningful life, with greater resources than most of us ever get, and this is what he did with it. The closest I can come to loving him is the acknowledgment that I might have done the same, if I had been born with his circumstances and incentives. He’s not so different, I fear, from what I might have been in his shoes. And if I love Trump, in some weird way, it’s because I’m thankful I’m not him.
Tales from The Far Side
Last week, I finally saw The Revenant. I know that I’m pretty late to the party here, but I don’t have a chance to watch a lot of movies for grownups in the theater these days, and it wasn’t a film that my wife particularly wanted to see, so I had to wait for one of the rare weekends when she was out of town. At this point, a full review probably isn’t of much interest to anyone, so I’ll confine myself to observing that it’s an exquisitely crafted movie that I found very hard to take seriously. Alejandro G. Iñárritu, despite his obvious visual gifts, may be the most pretentious and least self-aware director at work today—which is one reason why Birdman fell so flat for me—and I would have liked The Revenant a lot more if it had allowed itself to smile a little at how absurd it all was. (Even the films of someone like Werner Herzog include flashes of dark humor, and I suspect that Herzog actively seeks out these moments, even if he maintains a straight face.) And it took me about five minutes to realize that the movie and I were fundamentally out of sync. It happened during the scene in which the fur trappers find themselves under attack by an Arikara war party, which announces itself, in classic fashion, with a sudden arrow through a character’s throat. A few seconds later, the camera pans up to show more arrows, now on fire, arcing through the trees overhead. It’s an eerie sight, and it’s given the usual glow by Emmanuel Lubezki’s luminous cinematography. But I’ll confess that when I first saw it, I said to myself: “Hey! They’re lighting their arrows! Can they do that?”
It’s a caption from a Far Side cartoon, of course, and it started me thinking about the ways in which the work of Gary Larson has imperceptibly shaped my inner life. I’ve spoken here before about how quotations from The Simpsons provide a kind of complete metaphorical language for fans, like the one that Captain Picard learns in “Darmok.” You could do much the same thing with Larson’s captions, and there are probably more fluent speakers alive than you might think. Peanuts is still the comic strip that has meant the most to me, and I count myself lucky that I grew up at a time when I could read most of Calvin and Hobbes in its original run. Yet both of these strips, like Bloom County, lived most vividly for me in the form of collections, and in the case of Peanuts, its best years were long behind it. The Far Side, by contrast, obsessed me on a daily basis, more than any other comic strip of its era. When I was eight years old, I spent a few months diligently cutting out all the panels from my local paper and pasting them into a scrapbook, which is an impulse that I hadn’t felt before and haven’t felt since. Two decades later, I got a copy of The Complete Far Side for Christmas, which might still be my favorite present ever. Every three years or so, I get bitten by the bug again, and I spend an evening or two with one of those huge volumes on my lap, going through the strip systematically from beginning to end. Its early years are rough and a little uncertain, but they’re still wonderful, and it went out when it was close to its peak. And when I’m reading it in the right mood, there’s nothing else in the world that I’d rather be doing.
A gag panel might seem like the lowest form of comic, but The Far Side also had a weirdly novelistic quality that I’ve always admired as a writer. Larson’s style seemed easy to imitate—I think that every high school newspaper had a strip that was either an homage or outright plagiarism—but his real gift was harder to pin down. It was the ability to take what feels like an ongoing story, pause it, and offer it up to readers at a moment of defining absurdity. (Larson himself says in The Prehistory of The Far Side: “Cartoons are, after all, little stories themselves, frozen at an interesting point in time.”) His ideas stuck in the brain because we couldn’t help but wonder what happened before or afterward. Part of this is because he cleverly employed all the usual tropes of the gag cartoon, which are fun precisely because of the imaginative fertility of the clichés they depict: the cowboys singing around a campfire, the explorers in pith helmets hacking their way through the jungle, the castaway on the desert island. But the snapshots in time that Larson captures are both so insane and so logical that the reader has no choice but to make up a story. The panel is never the inciting incident or the climax, but a ticklish moment somewhere in the middle. It can be the gigantic mailman knocking over buildings while a dog exhorts a crowd of his fellows: “Listen! The authorities are helpless! If the city’s to be saved, I’m afraid it’s up to us! This is our hour!” Or the duck hunter with a shotgun confronted by a row of apparitions in a hall of mirrors: “Ah, yes, Mr. Frischberg, I thought you’d come…but which of us is the real duck, Mr. Frischberg, and not just an illusion?”
As a result, you could easily go through a Far Side collection and use it as a series of writing prompts, like a demented version of The Mysteries of Harris Burdick. I’ve occasionally thought about writing a story revolving around the sudden appearance of Professor DeArmond, “the epitome of evil among butterfly collectors,” or expanding on the incomparable caption: “Dwayne paused. As usual, the forest was full of happy little animals—but this time something seemed awry.” It’s hard to pick just one favorite, but the panel I’ve thought about the most is probably the one with the elephant in the trench coat, speaking in a low voice out of the darkness of the stairwell:
Remember me, Mr. Schneider? Kenya. 1947. If you’re going to shoot at an elephant, Mr. Schneider, you better be prepared to finish the job.
Years later, I spent an ungodly amount of time working on a novel, still unpublished, about an elephant hunt, and while I wouldn’t go so far as to say that it was inspired by this cartoon, I’m also not prepared to say that it wasn’t. I should also note Larson’s mastery of perfect proper names, which are harder to come up with than you might think: “Mr. Frischberg” and “Mr. Schneider” were so nice that he said them twice. It’s that inimitable mixture of the ridiculous and the specific that makes Larson such a model for storytellers. He made it to the far side thirty years ago, and we’re just catching up to him now.
The two kinds of commentaries
There are two sorts of commentary tracks. The first kind is recorded shortly after a movie or television season is finished, or even while it’s still being edited or mixed, and before it comes out in theaters. Because their memories of the production are still vivid, the participants tend to be a little giddy, even punch-drunk, and their feelings about the movie are raw: “The wound is still open,” as Jonathan Franzen put it to Slate. They don’t have any distance, and they remember everything, which means that they can easily get sidetracked into irrelevant detail. They don’t yet know what is and isn’t important. Most of all, they don’t know how the film did with viewers or critics, so their commentary becomes a kind of time capsule, sometimes laden with irony. The second kind of commentary is recorded long after the fact, whether for a special edition, for the release of an older movie in a new format, or for a television series that is catching up with its early episodes. These tend to be less predictable in quality: while commentaries on recent work all start to sound more or less the same, the ones that reach deeper into the past are either disappointingly superficial or hugely insightful, without much room in between. Memories inevitably fade with time, but this can also allow the artist to be more honest about the result, and the knowledge of how the work was ultimately received adds another layer of interest. (For instance, one of my favorite commentaries from The Simpsons is for “The Principal and the Pauper,” with writer Ken Keeler and others ranting against the fans who declared it—preemptively, it seems safe to say—the worst episode ever.)
Perhaps most interesting of all are the audio commentaries that begin as the first kind, but end up as the second. You can hear it on the bonus features for The Lord of the Rings, in which, if memory serves, Peter Jackson and his cowriters start by talking about a movie that they finished years ago, continue by discussing a movie that they haven’t finished editing yet, and end by recording their comments for The Return of the King after it won the Oscar for Best Picture. (This leads to moments like the one for The Two Towers in which Jackson lays out his reasoning for pushing the confrontation with Saruman to the next movie—which wound up being cut for the theatrical release.) You also see it, on a more modest level, on the author’s commentaries I’ve just finished writing for my three novels. I began the commentary on The Icon Thief way back on April 30, 2012, or less than two months after the book itself came out. At the time, City of Exiles was still half a year away from being released, and I was just beginning the first draft of the novel that I still thought would be called The Scythian. I had a bit of distance from The Icon Thief, since I’d written a whole book and started another in the meantime, but I was still close enough that I remembered pretty much everything from the writing process. In my earliest posts, you can sense me trying to strike the right balance between providing specific anecdotes about the novel itself and offering more general thoughts on storytelling, while using the book mostly as a source of examples. And I eventually reached a compromise that I hoped would allow those who had actually read the book to learn something about how it was put together, while still being useful to those who hadn’t.
As a result, the commentaries began to stray further from the books themselves, usually returning to the novel under discussion only in the final paragraph. I did this partly to keep the posts accessible to nonreaders, but also because my own relationship with the material had changed. Yesterday, when I posted the last entry in my commentary on Eternal Empire, almost four years had passed since I finished the first draft of that novel. Four years is a long time, and it’s even longer in writing terms. If every new project puts a wall between you and the previous one, a series of barricades stands between these novels and me: I’ve since worked on a couple of book-length manuscripts that never got off the ground, a bunch of short stories, a lot of occasional writing, and my ongoing nonfiction project. With each new endeavor, the memory of the earlier ones grows dimmer, and when I go back to look at Eternal Empire now, not only do I barely remember writing it, but I’m often surprised by my own plot. This estrangement from a work that consumed a year of my life is a little sad, but it’s also unavoidable: you can’t keep all this information in your head and still stay sane. Amnesia is a coping strategy. We’re all programmed to forget many of our experiences—as well as our past selves—to free up capacity for the present. A novel is different, because it exists in a form outside the brain. Any book is a piece of its writer, and it can be as disorienting to revisit it as it is to read an old diary. As François Mauriac put it: “It is as painful as reading old letters…We touch it like a thing: a handful of ashes, of dust.” I’m not quite at that point with Eternal Empire, but I’ll sometimes read a whole series of chapters and think to myself, where did that come from?
Under the circumstances, I should count myself lucky that I’m still reasonably happy with how these novels turned out, since I have no choice but to be objective about it. There are things that I’d love to change, of course: sections that run too long, others that seem underdeveloped, conceits that seem too precious or farfetched or convenient. At times, I can see myself taking the easy way out, going with a shortcut or ignoring a possible implication because I lacked the time or energy to do it justice. (I don’t necessarily regret this: half of any writing project involves conserving your resources for when it really matters.) But I’m also surprised by good ideas or connections that seem to have come from outside of me, as if, to use Isaac Asimov’s phrase, I were writing over my own head. Occasionally, I’ll have trouble following my own logic, and the result is less a commentary than a forensic reconstruction of what I must have been thinking at the time. But if I find it hard to remember my reasoning today, it’s easier now than it will be next year, or after another decade. As I suspected at the time, the commentary exists more for me than for anybody else. It’s where I wrote down my feelings about a series of novels that once dominated my life, and which now seem like a distant memory. While I didn’t devote nearly as many hours to these commentaries as I did to the books themselves, they were written over a comparable stretch of time. And now that I’ve gotten to the point of writing a commentary on my commentary—well, it’s pretty clear that it’s time to stop.
Head of the class
The Democratic National Convention was filled with striking moments, but the one that lingered in my mind the most was the speech given by Meryl Streep, which was memorable less for what she said than for what she represents. Streep is undoubtedly the most acclaimed actress of our time, maybe of all time. At the peak of her career, she could come across as artificial and mannered—Pauline Kael once quoted a friend who called her “an android”—but she almost glows these days with grace and good humor. Even if this is just another performance, it’s a virtuoso one, and she maintains it with seeming effortlessness as she continues to rack up awards and nominations. Streep, in short, doesn’t need to be jealous of anybody. But as an article in the New York Times points out, there’s at least one exception:
Meryl Streep, the most accomplished, awarded and chameleonic actress of her generation, once confessed something approaching envy for Hillary Clinton: For women of her age, Ms. Streep said, Mrs. Clinton was the yardstick by which they inevitably measured their lives—sometimes flatteringly, sometimes not.
The idea that Meryl Streep, of all people, might bite her hand a little when she thinks of Clinton made me reflect on how each generation settles on one person who serves as a benchmark for the rest. And it’s often either the first to win the presidency or the first who might have a good shot at attaining it. It’s no accident that one of the earliest biographies of Bill Clinton was titled First in His Class.
I’m at a point in my life when people my age have just become eligible for the Oval Office, and there isn’t an obvious frontrunner. (As a friend of mine recently said at an informal college reunion: “I guess nobody we know is going to be president. By now, we’d know it.”) But it’s still something I think about. One of my favorite examples of the role that a president—or a candidate—can play in the inner life of an ambitious novelist is Norman Mailer’s obsession with John F. Kennedy. Judging from how frequently he returned to the subject, it was second only to his fascination with Marilyn Monroe, which in itself was probably an outgrowth of his interest in the Kennedys, and he revisited it in works from “Superman Comes to the Supermarket” to Harlot’s Ghost. In An American Dream, he puts Kennedy right there in the opening sentence, which is like inviting a guest into the holy of holies:
I met Jack Kennedy in November, 1946. We were both war heroes, and both of us had just been elected to Congress. We went out one night on a double date and it turned out to be a fair evening for me…Of course Jack has gone on a bit since those days, and I have traveled up and I have voyaged down and I’ve gone up and down…The real difference between the President and myself may be that I ended up with too large an appreciation of the moon, for I looked down the abyss on the first night I killed: four men, four very separate Germans, dead under a full moon—whereas Jack, for all I know, never saw the abyss.
Mailer is speaking not as himself but as a fictional character, yet it’s hard not to interpret these lines as a conjuring of an alternate life in which he was friends with the man whom he had missed, by just a few years, at Harvard.
Kennedy and Mailer did meet briefly, and it resulted in a moment that speaks volumes about the uncanny prominence that a presidential candidate our own age can take in our thoughts. In “Superman Comes to the Supermarket,” Mailer writes:
What struck me most about the interview was a passing remark whose importance was invisible on the scale of politics, but was altogether meaningful to my particular competence. As we sat down for the first time, Kennedy smiled nicely and said that he had read my books. One muttered one’s pleasure. “Yes,” he said, “I’ve read…” and then there was a short pause which did not last long enough to be embarrassing in which it was yet obvious no title came instantly to his mind, an omission one was not ready to mind altogether since a man in such a position must be obliged to carry a hundred thousand facts and names in his head, but the hesitation lasted no longer than three seconds or four, and then he said, “I’ve read The Deer Park and…the others,” which startled me for it was the first time in a hundred similar situations, talking to someone whose knowledge of my work was casual, that the sentence did not come out, “I’ve read The Naked and the Dead…and the others.” If one is to take the worst and assume that Kennedy was briefed for this interview (which is most doubtful), it still speaks well for the striking instincts of his advisers.
I like this story best for what Mailer called its significance “to my particular competence.” A favorable remark, even in passing, from the man who had ascended to a level that no writer could ever hope to achieve was one that Mailer would savor forever. And then it was over.
Most of us never get that close, but it doesn’t matter: even from a distance, a president or a candidate makes everyone’s imagination follow a similar track, like a magnet acting on iron filings. That particular mixture of envy and admiration is especially visible among products of the Ivy League. A writer for The Simpsons once noted in an audio commentary that if the writing staff loved to write presidential jokes—like the one in which Grandpa Simpson claims to have been spanked by Grover Cleveland on two nonconsecutive occasions—it’s because the ones who went to Harvard can’t quite get over the idea that they could have been president themselves. You can feel the same sense of agonizing proximity in Mailer, who attended Harvard at a time when his Jewishness made him an outsider among heirs to power, and who later channeled that need into an absurdly unsuccessful candidacy for mayor of New York. As he later wrote:
Norman was lazy, and politics would make him work hard for sixteen hours a day for the rest of his life. He was so guilty a man that he thought he would be elected as a fit and proper punishment for his sins. Still, he also wanted to win. He would never write again if he were Mayor (the job would doubtless strain his talent to extinction) but he would have his hand on the rump of History, and Norman was not without such lust.
He concludes: “He came in fourth in a field of five, and politics was behind him.” But the memory of Kennedy lived on. Mailer wrote these lines in Of a Fire on the Moon, which chronicled the Apollo mission that Kennedy had set in motion. Kennedy would alter the future, and Mailer would write about it, just as Streep might play Clinton someday in a movie. But we’re all writing or acting these roles in our minds as we measure ourselves against the head of the class, even if we’re not sure who it is yet.
Beyond Kang and Kodos
In a recent blog post on FiveThirtyEight about the state of election polling, Nate Silver mused about what would keep him up at night if he were Hillary Clinton. He concluded: “I’d be worried that Americans come to view the race as one between two equally terrible choices, instead of Trump being uniquely unacceptable.” As the Republican National Convention lurches to a start today in Cleveland, there are signs that a lot of voters have arrived at that exact conclusion. And if you’re a certain kind of television fan, it’s hard not to think of The Simpsons’ “Treehouse of Horror” installment “Citizen Kang,” which aired twenty years ago this fall, shortly before the presidential election of 1996. It’s the segment in which alien invaders Kang and Kodos assume the forms of candidates Bill Clinton and Bob Dole, leading to seven of the most quotable minutes in the show’s history. Two lines, in particular, continue to resonate with self-proclaimed political cynics. One comes after Homer has exposed Kang and Kodos in their true forms, leading to this exchange:
Kodos: It’s a two-party system! You have to vote for one of us!
Man: Well, I believe I’ll vote for a third-party candidate.
Kang: Go ahead—throw your vote away!
And the other comes at the very end, after the victorious President Kang has enslaved the nation, prompting Homer to say to Marge: “Don’t blame me. I voted for Kodos.”
For many viewers, the episode encapsulates the suspicion—which we encounter across the political spectrum—that the two major parties, deep down, are basically the same. But they aren’t. Not really. And to understand why “Citizen Kang” isn’t as trenchant or insightful as it seems, we can turn to the writers and producers who worked on the episode itself. On the commentary track for the show’s eighth season, which was recorded in 2006, series creator Matt Groening and producers Josh Weinstein, David X. Cohen, and Dan Greaney have the following discussion:
Weinstein: Now, I would say, even though it’s specific candidates, the message is timeless…
Cohen: Yeah. One thing I think I’ve noticed about comedy shows that take on elections is the point is always the same—the point is it does not matter which of the awful candidates you vote for…
Greaney: Which is a complete falsity. I mean, the idiot criminal that we have in office is…a lot worse.
Cohen: I’m not saying it’s a good point. I’m just saying it always seems to be the point.
Groening: Because it feels like it’s a comment.
Cohen: Right. You’re able to feel like you’re making a commentary without actually taking sides and alienating people.
Greaney: Yeah, but—when you have somebody who is clearly an aggressor, then…evenhandedness is actually favoring the aggressor.
Cohen: That’s true.
And although I know it’s never going to happen, I wish that the insights conveyed in those last few lines were as familiar as “Citizen Kang” itself. The difference between the episode’s implicit message and the feelings expressed in the commentary track can be chalked up to the fact that the former was written during the Clinton administration, while the latter was recorded ten years later, at the height of disillusionment with George W. Bush. (In other commentaries, the writers mock their own ruthless skewering of Clinton at the time, joking, with a touch of wistfulness, that he was obviously the worst president the country would ever have.) If anything, though, it rings even more true today. And I think that Groening and Cohen—who went on to create Futurama—get at the heart of the matter. Saying that the Democratic and Republican nominees are equally compromised isn’t a political insight, but a simulation of one: it’s a comedic or narrative strategy disguised as an opinion. It’s the most insidious kind of empty statement, which allows the speaker to seem superficially insightful, even subversive, while closing off the kind of thinking that really matters. As Cohen points out, this kind of false equivalence is perfect for writers who want to create the appearance of making a point without really saying anything. It doesn’t even qualify as real cynicism: it sidesteps actual thought as much as blind allegiance to any one party. And like most forms of laziness, it’s a luxury afforded only to those who are lucky enough not to be intensely vulnerable to the real consequences that presidential elections produce.
If it sounds like I’m being unduly hard on The Simpsons, I’m not: it wouldn’t be so powerful an example if it weren’t the best television show of all time. Its eighth season was a masterpiece, but there were limits to the messages it could send, simply because it was better off, in the long run, if it pitched its satire squarely down the middle—and also because it was television. This bears repeating, especially now. We’re in the middle of an election in which the lines between politics and entertainment have been blurred as never before, and not just because one of the candidates is a former and future reality star. Trump’s simulated version of tough talk and big ideas has been accepted as true by a sizable percentage of the electorate, because it only needs to hold together long enough to last until the next commercial break. His strategy isn’t that of the big lie, but of a series of improvisations strung end to end, which he hopes will get him through to November. (It’s why he takes so naturally to Twitter.) But those who dismiss Trump and his supporters should begin by demanding more of themselves. The writers behind “Citizen Kang” only had to come up with a message that could sustain a third of a Halloween episode. At the time, it might have seemed plausible, but it only took one more election to expose it forever. Or it should have. But it’s always easier to excuse oneself from the difficult realization that the choice between candidates has huge practical consequences. Trump and Clinton aren’t the same, not for most of us, and certainly not for Muslims, immigrants, gays and lesbians, and other groups that have evolved what Charles Blow has called “a sort of functional pragmatism” to survive. You can still tell yourself, if you like, that this election is a choice between Kang and Kodos. But it isn’t. Even if The Simpsons did it first.
The magic xylophone
Earlier this morning, I was browsing online when I noticed that Mark Kirkland, a longtime director for The Simpsons, was answering questions on Reddit. I was immediately excited, both because Kirkland directed such legendary installments as “Last Exit to Springfield”—often considered the best episode that the series ever did—and “Homer’s Barbershop Quartet,” and because he’s a reliably funny and smart presence on the show’s commentary tracks. When I clicked on the page, however, I found that the top-ranked question was something that I probably should have expected:
In episode 2F09 when Itchy plays Scratchy’s skeleton like a xylophone, he strikes the same rib twice in succession, yet he produces two clearly different tones. I mean, what are we to believe, that this is some sort of a magic xylophone or something? Boy, I really hope somebody got fired for that blunder.
For those who are lucky enough not to get the joke, this is a reference to the episode “The Itchy & Scratchy & Poochie Show,” in which Homer takes similar questions from a roomful of nerds. Homer responds: “I’ll field that one. Let me ask you a question. Why would a grown man whose shirt says ‘Genius at Work’ spend all his time watching a children’s cartoon show?” To which the nerd replies: “I withdraw my question.”
Even within the vast universe of repurposed Simpsons quotes, which I’ve elsewhere compared to a complete metaphorical language, this is about as canonical as it gets: it’s a reference that must get rehashed online somewhere every few minutes, usually in discussions about some absurdly nitpicky aspect of a movie or television show. If Kirkland had simply responded with the expected quote from Homer, the commenters would have expressed their approval and moved on. But Kirkland didn’t seem to recognize the reference, and he replied with a straight face, leading to a minor explosion of indignation in the comments that followed. Redditors simply couldn’t believe that Kirkland didn’t know the joke, and many held it against him personally. As one wrote: “I honestly don’t think he got the reference. If I’m right, I think it explains a lot about the quality of The Simpsons these days.” Another replied: “Sadly true.” To their credit, a few other commenters responded with the obvious rejoinders. Kirkland has spent the last three decades working on new episodes of the series; it isn’t fair to expect him to immediately recognize a line that aired almost twenty years ago from a script that he didn’t even direct; and fans who have watched every episode from the show’s golden years a dozen times and quoted them repeatedly to one another are operating in a different frame of reference than the creative staff. (Anecdotal evidence certainly bears this out: the fans have consistently trounced the writers in trivia contests. As onetime show runner David Mirkin once said in their defense: “We’re too busy creating the new stuff.”)
And yet the whole exchange still rankled me, to the point where I feel obliged to write about it here. There’s one big point that ought to be italicized for emphasis: the commenters who quoted an episode word for word, and then became upset when one of the show’s most valuable contributors failed to give them the automatic reply they wanted, are unconsciously embodying the very thing that the original joke was mocking. “The Itchy & Scratchy & Poochie Show” remains one of the show’s most fascinating episodes, and its relevance has only increased as the years have gone by. Its writer, David X. Cohen, conceived it as a commentary on a show that he honestly believed was nearing the end of its run, and even if he was off by a few decades, its jokes about fans and their relationship to a favorite series are still funny and accurate. What he couldn’t have anticipated was how the compounding effect of time would make the satire almost too mild. This is the episode, after all, that includes both the lines quoted above and this equally famous—and prescient—exchange:
Comic Book Guy: Last night’s Itchy & Scratchy was, without a doubt, the worst episode ever. Rest assured that I was on the Internet within minutes registering my disgust throughout the world.
Bart: Hey, I know it wasn’t great, but what right do you have to complain?
Comic Book Guy: As a loyal viewer, I feel they owe me.
Bart: What? They’ve given you thousands of hours of entertainment for free. What could they possibly owe you? I mean, if anything, you owe them.
Comic Book Guy (after a pause): Worst episode ever.
Cohen may have intended this as a humorous exaggeration, but it was really a glimpse of the show’s future, and Reddit’s exchange with Kirkland is just a particularly stark example. This isn’t the place to go yet again into the reasons for the show’s decline in quality over the last fifteen years, except to state that the problem almost certainly isn’t that the writers and directors have failed to memorize the old episodes and constantly quote them to one another. If anything, the show has suffered from being too much of an echo chamber, leading to a reliance on throwaway lines and easy gags over coherent stories—which argues that the series should be turned less inward on itself, not more. But it reminds us of one of the show’s underlying problems: as vocal as its fans are, they don’t seem to know what they want from it. (As the leader of an audience focus group says in the very same episode: “So you want a realistic down-to-earth show that’s completely off the wall and swarming with magic robots?”) At this late date, it seems safe to say that The Simpsons is what it is, and that any given fan’s relationship with the show is something that he’ll have to work out for himself. But a decent first step to any kind of understanding would be to watch “The Itchy & Scratchy & Poochie Show” once more, and, instead of mindlessly parroting its lines, to take a good look at it and ask which character reminds us the most of ourselves. Because the magic xylophone tolls for thee.
The reviewable appliance
Last week, I quoted the critic Renata Adler, who wrote back in the early eighties: “Television…is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which only indicates how much has changed over the last thirty years, a period that has seen television not only validated as a reviewable medium, but transformed into maybe the single most widely reviewed art form in existence. Part of this is due to an increase in the quality of the shows themselves: by now, it’s a cliché to say that we’re living in a golden age of television, but that doesn’t make it any less true, to the point where there are almost too many great shows for any one viewer to absorb. As John Landgraf of FX said last year, in a quote that was widely shared in media circles, mostly because it expresses how many of us feel: “There is simply too much television.” There are something like four hundred original scripted series airing these days—which is remarkable in itself, given how often critics have tolled the death knell for scripted content in the face of reality programming—and many are good to excellent. If we’ve learned to respect television as a medium that rewards close scrutiny, it’s largely because there are more worthwhile shows than ever before, and many deserve to be unpacked at length.
There’s also a sense in which shows have consciously risen to that challenge, taking advantage of the fact that there are so many venues for reviews and discussion. I never felt that I’d truly watched an episode of Mad Men until I’d watched Matthew Weiner’s weekly commentary and read the writeup on The A.V. Club, and I suspect that Weiner felt enabled to go for that level of density because the tools for talking about it were there. (To take another example: Mad Style, the fantastic blog maintained by Tom and Lorenzo, came into being because of the incredible work of costume designer Janie Bryant, but Bryant herself seemed to make certain choices because she knew that they would be noticed and dissected.) The Simpsons is often called the first VCR show—it allowed itself to go for rapid freeze-frame jokes and sign gags because viewers could pause to catch every detail—but these days, we’re more likely to rely on recaps and screen grabs to process shows that are too rich to be fully grasped on a single viewing. I’m occasionally embarrassed when I click on a review and read about a piece of obvious symbolism that I missed the first time around, but you could also argue that I’ve outsourced that part of my brain to the hive mind, knowing that I can take advantage of countless other pairs of eyes.
But the fact that television inspires millions of words of coverage every day can’t be entirely separated from Adler’s description of it as an appliance. For reasons that don’t have anything to do with television itself, the cycle of pop culture coverage—like that of every form of news—has continued to accelerate, with readers expecting nonstop content on demand: I’ll refresh a site a dozen times a day to see what has been posted in the meantime. Under those circumstances, reviewers and their editors naturally need a regular stream of material to be discussed, and television fits the bill beautifully. There’s a lot of it, it generates fresh grist for the mill on a daily basis, and it has an existing audience that can be enticed into reading about their favorite shows online. (This just takes a model that had long been used for sports and applies it to entertainment: the idea that every episode of Pretty Little Liars deserves a full writeup isn’t that much more ridiculous than devoting a few hundred words to every baseball game.) One utility piggybacks on the other, and it results in a symbiotic relationship: the shows start to focus on generating social media chatter, which, if not exactly a replacement for ratings, at least becomes an argument for keeping marginal shows like Community alive. And before long, the show itself is on Hulu or Yahoo.
None of this is inherently good or bad, although I’m often irked by the pressure to provide instant hot takes about the latest twist on a hit series, with think pieces covering other think pieces until the snake has eaten its own tail. (The most recent example was the “death” of Glenn on The Walking Dead, a show I don’t even watch, but which I found impossible to escape for three weeks last November.) There’s also an uncomfortable sense in which a television show can become an adjunct to its own media coverage: I found reading about Game of Thrones far more entertaining over the last season than watching the show itself. It’s all too easy to use the glut of detailed reviews as a substitute for the act of viewing: I haven’t watched Halt and Catch Fire, for instance, but I feel as if I have an opinion about it, based solely on the information I’ve picked up by osmosis from the review sites I visit. I sometimes worry that critics and fans have become so adept at live-tweeting episodes that they barely look at the screen, and the concept of hate-watching, of which I’ve been guilty myself, wouldn’t exist if we didn’t have plenty of ways to publicly express our contempt. It’s a slippery slope from there to losing the ability to enjoy good storytelling for its own sake. And we need to be aware of this. Because we’re lucky to be living in an era of so much great television—and we ought to treat it as something more than a source of hot and cold running reviews.
“Pretty clever!”
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What pop-culture concepts have you found to be ripe for everyday use?”
A few years ago, my wife and I went on a trip to Peru and Bolivia. It was meant as one last stab at adventure travel before kids—which we knew would soon figure in our future—made that kind of vacation impossible, and we’d planned an ambitious itinerary: Machu Picchu, Lake Titicaca, the salt flats of Uyuni. As soon as we landed in Cuzco, though, I was felled by a bout of altitude sickness that made me wonder if we’d have to cancel the whole thing. A few pills and a day of acclimation made me feel stable enough to proceed, but I never got entirely used to it, and before long, I found myself hiking up a hillside in the Lares Valley, my heart jackhammering in my chest like an animal that was trying to escape. To save face, I’d periodically pause on the trail to look around, as if to take in the view, when I was just trying to get my pulse under control. But what got me through it, weirdly, was the thought of Frodo and Samwise slogging their way toward Mount Doom: if a couple of hobbits could do it, then I could climb this hill, too. It was an image to which I clung for the rest of the way, and the fantasy that I was heading toward Mordor sustained me to the top. And later, when I confessed this to my wife, she only smiled and said: “Yeah. I was thinking of the Von Trapp family.”
We relate to pop culture in all kinds of complicated ways, but one of the most curious aspects of that dynamic is how it can motivate us to become slightly better versions of ourselves, even if such exemplars aren’t entirely realistic. (The real Von Trapps didn’t hike over the mountains into Switzerland: they took a train.) Yesterday, Patrick Stewart showed up on Reddit to answer some questions, and if there was one common thread to the comments, it was that Jean-Luc Picard had been a role model, and almost a surrogate father, for many viewers as they were growing up. Just as fairy tales, with their fantasies of power and destiny, allow children to come to terms with their physical vulnerability as they risk a greater engagement with the world, the heroes we admire in books and movies give us an ideal toward which to aspire. As long as we’re aware that complete success is probably unattainable—”We aim above the mark to hit the mark,” as Emerson said—I don’t see anything wrong with this. And such comparisons often cross our minds at the least dignified moments. Whenever I’m struggling to open a package, an image flits through my mind of Daniel Craig as James Bond, and I think to myself: “Bond wouldn’t have trouble opening a bag of pretzels, would he?” It doesn’t make what I’m doing any less annoying, but it usually inspires me to do something marginally more decisive, like getting a pair of scissors.
Pop culture can also provide ways of seeing our own lives from a new perspective, often by putting words to concepts and emotions that we couldn’t articulate before. At its highest level, it can take the form of the recognition that many readers feel when encountering an author like Proust or Montaigne: for long stretches, it feels eerily like we’re reading about ourselves. And a show like Seinfeld has added countless terms to our shared vocabulary of ideas, even if it was by accident, with the writers as surprised as anyone else by what struck a nerve. As writer Peter Mehlman says:
Every line was written just to be funny and to further the plot. But, actually, there was one time that I did think that a certain phrase would become popular. And I was completely wrong. In the “Yada Yada” episode, I really thought it was going to be the “antidentite” line that was going to be the big phrase, and it was not. That line went: “If this wasn’t my son’s wedding day, I’d knock your teeth out, you antidentite bastard.” The man who said it was a dentist. And no one remembers that phrase; it’s the “yada yada yada” line that everyone remembers.
Sometimes a free-floating line will just snag onto an existing feeling and crystallize it, and along with Seinfeld, The Simpsons has been responsible for more such epiphanies than any other series. Elsewhere, I’ve compared the repository of Simpsons quotes that we all seem to carry in our heads to the metaphorical language that Picard encountered in “Darmok,” and there’s no question that it influences the way many of us think about ourselves.
Take “Hurricane Neddy,” which first aired during the show’s eighth season. It probably wouldn’t even make it onto a list of my fifty favorite episodes, but there’s one particular line from it that has been rattling around in my brain ever since. After a hurricane destroys Flanders’s house, the neighborhood joins forces to rebuild it, only to do a spectacularly crappy job. It all leads to the following exchange:
Ned: “The floor feels a little gritty here.”
Moe: “Yeah, we ran out of floorboards there, so we painted the dirt. Pretty clever!”
Those last two words, which Moe delivers with a nudge to Ned’s ribs and an air of self-satisfaction, are ones that I’ve never forgotten. At least once a week, I’ll say to myself, in Moe’s voice: “Pretty clever!” The reason, as far as I can pinpoint it, is that I’m working in a field that calls for me to be “clever” on a regular basis, whether it’s solving a narrative problem, coming up with a twist for a short story, or just figuring out a capper for a blog post. Not all of these ideas are equally clever, and many of them fall into the category of what Frederik Pohl calls “monkey tricks.” But Moe’s delivery, which is so delighted with itself, reminds me both of how ridiculous so much of it is and of the necessity of believing otherwise. Sometimes I’m just painting the dirt, but I couldn’t go on if I wasn’t sort of pleased by it. And if I had to sum up my feelings for my life’s work, for better or worse, it would sound a lot like “Pretty clever!”