Posts Tagged ‘The Simpsons’
If you’ve spent any time online in the last few years, there’s a decent chance that you’ve come across some version of what I like to call the Berenstain Bears enigma. It’s based on the fact that a sizable number of readers who recall this book series from childhood remember the name of its titular family as “Berenstein,” when in reality, as a glance at any of the covers will reveal, it’s “Berenstain.” As far as mass instances of misremembering are concerned, this isn’t particularly surprising, and certainly less bewildering than the Mandela effect, or the similar confusion surrounding a nonexistent movie named Shazam. But enough people have been perplexed by it to inspire speculation that these false memories may be the result of an errant time traveler, à la Asimov’s The End of Eternity, or an event in which some of us crossed over from an alternate universe in which the “Berenstein” spelling was correct. (If the theory had emerged a few decades earlier, Robert Anton Wilson might have devoted a page or two to it in Cosmic Trigger.) Even if we explain it as an understandable, if widespread, mistake, it stands as a reminder of how an assumption absorbed in childhood remains far more powerful than a falsehood learned later on. If we discover that we’ve been mispronouncing, say, “Steve Buscemi” for all this time, we aren’t likely to take it as evidence that we’ve ended up in another dimension, but the further back you go, the more ingrained such impressions become. It’s hard to unlearn something that we’ve believed since we were children—which indicates how difficult it can be to discard the more insidious beliefs that some of us are taught from the cradle.
But if the Berenstain Bears enigma has proven to be unusually persistent, I suspect that it’s because many of us really are remembering different versions of this franchise, even if we believe that we aren’t. (You could almost take it as a version of Hilary Putnam’s Twin Earth thought experiment, which asks if the word “water” means the same thing to us and to the inhabitants of an otherwise identical planet covered with a similar but different liquid.) As I’ve recently discovered while reading the books aloud to my daughter, the characters originally created by Stan and Jan Berenstain have gone through at least six distinct incarnations, and your understanding of what this series “is” largely depends on when you initially encountered it. The earliest books, like The Bike Lesson or The Bears’ Vacation, were funny rhymed stories in the Beginner Book style in which Papa Bear injures himself in various ways while trying to teach Small Bear a lesson. They were followed by moody, impressionistic works like Bears in the Night and The Spooky Old Tree, in which the younger bears venture out alone into the dark and return safely home after a succession of evocative set pieces. Then came big educational books like The Bears’ Almanac and The Bears’ Nature Guide, my own favorites growing up, which dispensed scientific facts in an inviting, oversized format. There was a brief detour through stories like The Berenstain Bears and the Missing Dinosaur Bone, which returned to the Beginner Book format but lacked the casually violent gags of the earlier installments. Next came perhaps the most famous period, with dozens of books like Trouble With Money and Too Much TV, all written, for the first time, in prose, and ending with a tidy, if secular, moral. Finally, and jarringly, there was an abrupt swerve into Christianity, with titles like God Loves You and The Berenstain Bears Go to Sunday School.
To some extent, you can chalk this up to the noise—and sometimes the degeneration—that afflicts any series that lasts for half a century. Incremental changes can lead to radical shifts in style and tone, and they only become obvious over time. (Peanuts is the classic example, but you can even see it in the likes of Dennis the Menace and The Family Circus, both of which were startlingly funny and beautifully drawn in their early years.) Fashions in publishing can drive an author’s choices, which accounts for the ups and downs of many a long career. And the bears only found Jesus after Mike Berenstain took over the franchise after the deaths of his parents. Yet many critics don’t bother making these distinctions, and the ones who hate the Berenstain Bears books seem to associate them entirely with the Trouble With Money period. In 2005, for instance, Paul Farhi of the Washington Post wrote:
The larger questions about the popularity of the Berenstain Bears are more troubling: Is this what we really want from children’s books in the first place, a world filled with scares and neuroses and problems to be toughed out and solved? And if it is, aren’t the Berenstain Bears simply teaching to the test, providing a lesson to be spit back, rather than one lived and understood and embraced? Where is the warmth, the spirit of discovery and imagination in Bear Country? Stan Berenstain taught a million lessons to children, but subtlety and plain old joy weren’t among them.
Similarly, after Jan Berenstain died, Hanna Rosin of Slate said: “As any right-thinking mother will agree, good riddance. Among my set of mothers the series is known mostly as the one that makes us dread the bedtime routine the most.”
Which only tells me that neither Farhi nor Rosin ever saw The Spooky Old Tree, which is a minor masterpiece—quirky, atmospheric, gorgeously rendered, and utterly without any lesson. It’s a book that I look forward to reading with my daughter. And while it may seem strange to dwell so much on these bears, it gets at a larger point about the pitfalls in judging any body of work by looking at a random sampling. I think that Peanuts is one of the great artistic achievements of the twentieth century, but it would be hard to convince anyone who was only familiar with its last two decades. You can see the same thing happening with The Simpsons, a series with six perfect seasons that threaten to be overwhelmed by the mediocre decades that are crowding the rest out of syndication. And the transformations of the Berenstain Bears are nothing compared to those of Robert A. Heinlein, whose career somehow encompassed Beyond This Horizon, Have Spacesuit—Will Travel, Starship Troopers, Stranger in a Strange Land, and I Will Fear No Evil. Yet there are also risks in drawing conclusions from the entirety of an artist’s output. In his biography of Anthony Burgess, Roger Lewis notes that he has read through all of Burgess’s work, and he asks parenthetically: “And how many have done that—except me?” He’s got a point. Trying to internalize everything, especially over a short period of time, can provide as false a picture as any subset of the whole, and it can result in a pattern that not even the author or the most devoted fan would recognize. Whether or not we’re from different universes, my idea of Bear Country isn’t the same as yours. That’s true of any artist’s work, and it hints at the problem at the root of all criticism: What do we talk about when we talk about the Berenstain Bears?
When I was younger, there was a period in which I seriously considered becoming a clown. To understand why, you need to know two things. The first is that a clown lives out of a trunk. I was probably about six years old when I saw the Ringling Bros. and Barnum & Bailey Circus for the first time—it was the year that featured the notorious “living unicorn”—and I don’t remember much about the show itself. What I recall most vividly is the souvenir program, which I brought home and read to pieces. It was packed with information about the performers and their lives, but the tidbit that made the greatest impression on me was the fact that they’re always on the road: they travel by train, and if you want to be a clown, you need to fit all your possessions into that trunk. As a kid, I was always fantasizing about running off to live with nothing, relying on luck and my wits, and this seemed like the ultimate example. It fascinated me for the same reason that I’ve always been intrigued by buskers, except that a clown doesn’t work alone: he’s part of a community of circus folk with their own language and traditions who have managed to survive while moving from one gig to the next. I’ve written here before of how ephemeral the career of a dancer can seem, but clowning takes it to another level. It lacks even the superficial glamor of dance, leaving you with nothing but the life of which Homer Simpson once lamented: “When I started this clown thing, I thought it would be nothing but glory. You know, the glory of being a clown?”
The other important thing about clowns is that they have a college. (Or at least they did when I was growing up, although the original Ringling Bros. and Barnum & Bailey Clown College closed its doors nearly twenty years ago.) I first read about clown college in that souvenir program, and I don’t think I’ve ever gotten over the discovery that it existed. Even now, I can recite the description of the curriculum almost from memory. Tuition was free, but students had to pay for their own room, board, and grease paint. Subjects included costume and makeup design, tumbling, acrobatics, pantomime, juggling, stilt walking, and the history of comedy. The one catch is that if the circus offered you a contract at the end of the term, you were obliged to accept it for a year. Its graduates, I later learned, included Penn Jillette, Bill Irwin, and David Strathairn. But what struck me the most was that this was a place where instructors and students could come together to discuss something as peculiar as the theory and practice of clowning. When I look back at it, it seems possible that it was my first exposure to the idea of college of any kind, and the basic appeal of it never changed, even when I traded my fantasies of Venice, Florida for Cambridge, Massachusetts. You could argue that learning how to become a clown is more practical than majoring in creative writing or film studies. And in each case, it allows an unlikely community to form around people who are otherwise persistently odd.
It’s that sense of collective effort in the pursuit of strangeness, I think, that makes the circus so enticing. When we talk about running off to join the circus, what we’re actually saying is that we want to leave our responsibilities behind and join a troupe of likeminded individuals: free artists of themselves who require nothing but a vacant lot in order to put on a show. If I was saddened by the recent news that Ringling Bros. is closing after well over a century of operation, it’s because I feel a sense of loss at the end of the dream that it represented. There were aspects of the circus that deserved to be retired, and I wasn’t sorry when they finally put an end to their animal acts. But I never dreamed about being a lion tamer. I identified with the clowns, the trapeze artists, the acrobats, the contortionists, and all the others who symbolized the romance of devoting a life to a form of art that is inherently transient. To some extent, this is true of every artist, but what sets the circus apart is that its performers do it together, on the road, and for years on end. Directors like Fellini and Max Ophüls have been instinctively drawn to circus imagery, because it captures something fundamental about what they do for a living: they’re ringmasters with the ability to harness chaos for just long enough to make a movie. Yet this isn’t quite the same thing. A film, in theory, is something permanent, but a circus is over as soon as the show ends, and to make it last, you have to keep up the act forever.
And I’ve gradually come to realize that I did become a clown, at least in all the ways that count. (As Werner Herzog observes in Werner Herzog Eats His Shoe: “As you see [filmmaking] makes me into a clown. And that happens to everyone—just look at Orson Welles or look at even people like Truffaut. They have become clowns.”) I spent four years at college studying two dead languages that I haven’t used since graduation, which is either a cosmic joke in itself or an acknowledgment that the knowledge you acquire is less important than the fact that you’ve pursued it in the company of others. Later, I left my job to become a writer, an activity that I’ve since begun to understand is as ephemeral, in some respects, as that of a clown or ballet dancer: few of its fruits last for any longer than the time it takes to write them down, and you’re left with nothing but the process. Along the way, I’ve successively joined and departed from various communities of people who share the same goals. We’ve never traveled on a real train together, but we’re all bound for a common destination, and we’ve developed the same set of strategies to get there. The promise of the circus is that you can get paid for being a clown, if you’re willing to sacrifice every practical consideration and assume every indignity along the way, and that you’re not alone. In the end, the joke might be on you. But the joke is ultimately on all of us. And maybe the clowns are the only ones sane enough to understand this.
It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.
Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)
Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:
As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.
It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?
The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”
At last night’s presidential debate, when moderator Chris Wallace asked if he would accept the outcome of the election, Donald Trump replied: “I’ll keep you in suspense, okay?” It was an extraordinary moment that immediately dominated the headlines, and not just because it was an unprecedented repudiation of a crucial cornerstone of the democratic process. Trump’s statement—it seems inaccurate to call it a “gaffe,” since it clearly reflects his actual views—was perhaps the most damaging remark anyone could have made in that setting, and it reveals a curious degree of indifference, or incompetence, in a candidate who has long taken pride in his understanding of the media. It was a short, unforgettable sound bite that could instantly be brought to members of both parties for comment. And it wasn’t an arcane matter of policy or an irrelevant personal issue, but an instantly graspable attack on assumptions shared by every democratically elected official in America, and presumably by the vast majority of voters. Even if Trump had won the rest of the debate, which he didn’t, those six words would have erased whatever gains he might have made. Not only was it politically and philosophically indefensible, but it was a ludicrous tactical mistake, an unforced error in response to a question that he and his advisors knew was going to be asked. As Julia Azari put it during the live chat on FiveThirtyEight: “The American presidency is not the latest Tana French novel—leaders can’t keep the people in suspense.”
But the phrase that he used tells us a lot about Trump. I’m speaking as someone who has devoted my fair share of thought to suspense itself: I’ve written a trilogy of thrillers and blogged here about the topic at length. When I think about the subject, I often start with what John Updike wrote in a review of Nabokov’s Glory, which is that it “never really awakens to its condition as a novel, its obligation to generate suspense.” What Updike meant is that stories are supposed to make us wonder about what’s going to happen next, and it’s that state of pleasurable anticipation that keeps us reading. It can be an end in itself, but it can also be a literary tool for sustaining the reader’s interest while the writer tackles other goals. As Kurt Vonnegut once said of plot, it isn’t necessarily an accurate representation of life, but a way to keep readers turning pages. Over time, the techniques of suspense have developed to the point where you can simulate it using purely mechanical tricks. If you watch enough reality television, you start to notice how the grammar of the editing repeats itself, whether you’re talking about Top Chef or Project Runway or Jim Henson’s Creature Shop. The delay before the judges deliver their decision, the closeups of the faces of the contestants, the way in which an editor pads out the moment by inserting cutaways between every word that Padma Lakshmi says—these are all practical tools that can give a routine stretch of footage the weight of the verdict in the O.J. Simpson trial. You can rely on them when you can’t rely on the events of the show itself.
And the best trick of all is to have a host who keeps things moving whenever the contestants or guests start to drag. That’s where someone like Trump comes in. He’s an embarrassment, but he’s far from untalented, at least within the narrow range of competence in which he used to operate. When I spent a season watching The Celebrity Apprentice—my friend’s older sister was on it—I was struck by how little Trump had to do: he was only onscreen for a few minutes in each episode. But he was good at his job, and he was also the obedient instrument of his producers. He has approached the campaign with the same mindset, but with few of the resources that are at an actual reality show’s disposal. Trump’s strategy has been built around the idea that he doesn’t need to spend money on advertising or a ground game, as long as the media provides him with free coverage. It’s an interesting experiment, but there’s a limit to how effective it can be. In practice, Trump is less like the producer or the host than a contestant, which reduces him to acting like a reality star who wants to maximize his screen time: say alarming things, pick fights, act unpredictably, and generate the footage that the show needs, while never realizing that the incentives of the contestants and producers are fundamentally misaligned. (He should have just watched the first season of UnREAL.) When he says that he’ll keep us in suspense about accepting the results of the election, he’s just following the reality show playbook, which is to milk such climactic moments for all they’re worth.
Yet this approach has backfired, and television provides us with some important clues as to why. I once believed that the best analogy to Trump’s campaign was the rake gag made famous by The Simpsons. As producer Al Jean described it: “Sam Simon had a theory that if you repeat a joke too many times, it stops being funny, but if you keep on repeating it, it might get really funny.” Trump performed a rake gag in public for months. First we were offended when he made fun of John McCain’s military service; then he said so many offensive things that we became numb to it; and then it passed a tipping point, and we got really offended. I still think that’s true. But there’s an even better analogy from television, which is the practice of keeping the audience awake by killing off major characters without warning. As I’ve said here before, it’s a narrative trick that used to seem daring, but now it’s a form of laziness: it’s easier to deliver shocking death scenes than to tell interesting stories about the characters who are still alive. In Trump’s case, the victims are ideas, or key constituents of the electorate: minorities, immigrants, women. When Trump turned on Paul Ryan, it was the equivalent of one of those moments, like the Red Wedding on Game of Thrones, when you’re supposed to gasp and realize that nobody is safe. His attack on a basic principle of democracy might seem like more of the same, but there’s a difference. The strategy might work for a few seasons, but there comes a point at which the show cuts itself too deeply, and there aren’t any characters left that we care about. This is where Trump is now. And by telling us that he’s going to keep us in suspense, he may have just made the ending a lot less suspenseful.
For what does it profit a man to gain the whole world and forfeit his soul?
Whether or not you’re a believer, you eventually end up with your own idea of who Jesus might have been. I like to think of him as the ultimate pragmatist. If you accept his central premise—that the kingdom of heaven, whatever it is, is something that is happening right now—then his ethical system, as impossible as it might seem for most of us to follow, becomes easier to understand. It’s about eliminating distractions, focusing on what really counts, and removing sources of temptation before they have a chance to divert us from the true goal. Poverty, as Michael Grant puts it in Jesus: An Historian’s Review of the Gospels, is a practical solution to a concrete problem: “Excessive wealth might be a positive disadvantage, since its too lavish enjoyment could distract its possessors from the overriding vital matter at hand.” And as Grant observes elsewhere:
Certainly, “blessed are the meek”…but that is because “they shall inherit the earth.” Since nothing less than this is at stake, a contentious spirit is wholly out of place, for it will only distract attention and energy from the preeminent task. It is not even worth hating your enemies…In the urgent circumstances, Jesus believed, it was a sheer waste of time. Love them instead, just as much as you love everyone else; pray for those who persecute you, turn the other cheek. For why not avoid hostilities and embroilments which, beside the infinitely larger issue, are ultimately irrelevant and distracting?
“Love your enemies,” in other words, is nothing but sensible advice. Which doesn’t make it any easier to do it for real, rather than merely paying it lip service, when it strikes us as inconvenient.
Take the case of Donald Trump. It’s fair to say that I feel less love toward Trump than I do toward any other American public figure of my lifetime. At my best, I just want to go back to the days when I could safely ignore him; at my worst, I want him to suffer some kind of humiliating, career-ending comeuppance, although I’m well aware that real life rarely affords such satisfactions. (If anything, it’s more likely to give us the opposite.) I’m also uncomfortably conscious that this is exactly the kind of reaction that he wants to evoke from me. It’s a victory. No matter what happens in this election, Trump has added perceptibly to the world’s stockpile of hate, resentment, and alienation. Hating him and what he stands for is easy; what isn’t so easy is trying to respond in ways that don’t merely feed into the cycle of hatred. The answer—and I wish it were different—is right there in front of us. We’re told to love our enemies. Jesus, the pragmatic philosopher, knew that there wasn’t time for anything else. But when I think about doing the same with Trump, I feel a bit like Meg Murry in A Wrinkle in Time, when she realizes that love is the only weapon that will work against IT, the hideous brain that rules the planet of Camazotz:
If she could give love to IT perhaps it would shrivel up and die, for she was sure that IT could not withstand love. But she, in all her weakness and foolishness and baseness and nothingness, was incapable of loving IT. Perhaps it was not too much to ask of her, but she could not do it.
The italics, as always, are mine. It isn’t too much to ask. But it’s one thing to acknowledge this, and quite another to grant that we’re obliged to do it for someone like Donald Trump.
So here’s my best shot. Trump grew up wanting nothing more than to please his own demanding father. Early in his career, he was just one real estate developer among many. He ended up concluding that the only values worth pursuing were the acquisition of money and power, abstracted from any possible benefit except as a way of keeping score. What’s worse, he received plenty of validation that his assumptions were correct. He’s never had any reason to grow or change. Instead, as we all do, he’s become more like himself as he’s aged, while categorizing the human beings around him as sources of income, enemies, or potential enablers. Behind his bluster, he’s deeply insecure, as we all are. He refuses to take responsibility for his actions, he can’t admit a mistake, and he blames everyone but himself when things go wrong. (When he says that the first debate was “rigged” because someone tampered with his mike and the moderator was against him, I’m reminded of what David Mamet says in On Directing Film: “Two reasons are equal to no reasons—it’s like saying: ‘I was late because the bus drivers are on strike and my aunt fell downstairs.’”) He seems unhappy. It’s hard to imagine him taking pleasure in reading a book, preparing a meal, or really anything aside from trolling the electorate and putting his name on buildings and planes. He appears to have no affection for anyone or anything, except perhaps his own children. And he’s the creation of forces that even he can’t control. He’s succeeded beyond his wildest expectations, but only by becoming the full-time monster that was only there in flashes before. Trump uses the system, but it also uses him. He has transformed himself into exactly what he hopes people want him to be, and he’s condemned to do it forever. And when the end comes—”As it must to all men,” the newsreel narrator reminds us in Citizen Kane—he’ll have to ask himself whether it was worth it.
I know that this comes perilously close to what the onlookers say after seeing Marge Simpson’s nude portrait of Mr. Burns: “He’s bad, but he’ll die. So I like it.” But it’s the best I can do. I can’t love Trump, but I can sort of forgive him, and pity him, for becoming what he was told to be, and for abandoning what makes us human and valuable—empathy, compassion, humility—in favor of an identity assembled from who we are at our worst. In a way, I’m even grateful to him, for much the same reason that George Saunders expressed in The New Yorker: “Although, to me, Trump seems the very opposite of a guardian angel, I thank him for this: I’ve never before imagined America as fragile, as an experiment that could, within my very lifetime, fail. But I imagine it that way now.” If Trump didn’t exist, it would have been necessary to invent him. He’s a better cautionary tale than any I could have imagined, because he won the trappings of success at a spiritual cost that isn’t tragic so much as deeply sad. He’s like Charles Foster Kane, without any of the qualities that make Kane so misleadingly attractive. When I think of the abyss of his ego, which draws like a battery on the love of his supporters and flails helplessly in every other situation, it feels like the logical extension of a career spent in the pursuit of wealth and celebrity divorced from any other consideration beyond himself. Like all mortals, Trump had exactly one chance to live a meaningful life, with greater resources than most of us ever get, and this is what he did with it. The closest I can come to loving him is the acknowledgment that I might have done the same, if I had been born with his circumstances and incentives. He’s not so different from me, as I fear I might have been in his shoes. And if I love Trump, in some weird way, it’s because I’m thankful I’m not him.
Last week, I finally saw The Revenant. I know that I’m pretty late to the party here, but I don’t have a chance to watch a lot of movies for grownups in the theater these days, and it wasn’t a film that my wife particularly wanted to see, so I had to wait for one of the rare weekends when she was out of town. At this point, a full review probably isn’t of much interest to anyone, so I’ll confine myself to observing that it’s an exquisitely crafted movie that I found very hard to take seriously. Alejandro G. Iñárritu, despite his obvious visual gifts, may be the most pretentious and least self-aware director at work today—which is one reason why Birdman fell so flat for me—and I would have liked The Revenant a lot more if it had allowed itself to smile a little at how absurd it all was. (Even the films of someone like Werner Herzog include flashes of dark humor, and I suspect that Herzog actively seeks out these moments, even if he maintains a straight face.) And it took me about five minutes to realize that the movie and I were fundamentally out of sync. It happened during the scene in which the fur trappers find themselves under attack by an Arikara war party, which announces itself, in classic fashion, with a sudden arrow through a character’s throat. A few seconds later, the camera pans up to show more arrows, now on fire, arcing through the trees overhead. It’s an eerie sight, and it’s given the usual glow by Emmanuel Lubezki’s luminous cinematography. But I’ll confess that when I first saw it, I said to myself: “Hey! They’re lighting their arrows! Can they do that?”
It’s a caption from a Far Side cartoon, of course, and it started me thinking about the ways in which the work of Gary Larson has imperceptibly shaped my inner life. I’ve spoken here before about how quotations from The Simpsons provide a kind of complete metaphorical language for fans, like the one that Captain Picard learns in “Darmok.” You could do much the same thing with Larson’s captions, and there are probably more fluent speakers alive than you might think. Peanuts is still the comic strip that has meant the most to me, and I count myself lucky that I grew up at a time when I could read most of Calvin and Hobbes in its original run. Yet both of these strips, like Bloom County, lived most vividly for me in the form of collections, and in the case of Peanuts, its best years were long behind it. The Far Side, by contrast, obsessed me on a daily basis, more than any other comic strip of its era. When I was eight years old, I spent a few months diligently cutting out all the panels from my local paper and pasting them into a scrapbook, which is an impulse that I hadn’t felt before and haven’t felt since. Two decades later, I got a copy of The Complete Far Side for Christmas, which might still be my favorite present ever. Every three years or so, I get bitten by the bug again, and I spend an evening or two with one of those huge volumes on my lap, going through the strip systematically from beginning to end. Its early years are rough and a little uncertain, but they’re still wonderful, and it went out when it was close to its peak. And when I’m reading it in the right mood, there’s nothing else in the world that I’d rather be doing.
A gag panel might seem like the lowest form of comic, but The Far Side also had a weirdly novelistic quality that I’ve always admired as a writer. Larson’s style seemed easy to imitate—I think that every high school newspaper had a strip that was either an homage or outright plagiarism—but his real gift was harder to pin down. It was the ability to take what feels like an ongoing story, pause it, and offer it up to readers at a moment of defining absurdity. (Larson himself says in The Prehistory of The Far Side: “Cartoons are, after all, little stories themselves, frozen at an interesting point in time.”) His ideas stuck in the brain because we couldn’t help but wonder what happened before or afterward. Part of this is because he cleverly employed all the usual tropes of the gag cartoon, which are fun precisely because of the imaginative fertility of the clichés they depict: the cowboys singing around a campfire, the explorers in pith helmets hacking their way through the jungle, the castaway on the desert island. But the snapshots in time that Larson captures are both so insane and so logical that the reader has no choice but to make up a story. The panel is never the inciting incident or the climax, but a ticklish moment somewhere in the middle. It can be the gigantic mailman knocking over buildings while a dog exhorts a crowd of his fellows: “Listen! The authorities are helpless! If the city’s to be saved, I’m afraid it’s up to us! This is our hour!” Or the duck hunter with a shotgun confronted by a row of apparitions in a hall of mirrors: “Ah, yes, Mr. Frischberg, I thought you’d come…but which of us is the real duck, Mr. Frischberg, and not just an illusion?”
As a result, you could easily go through a Far Side collection and use it as a series of writing prompts, like a demented version of The Mysteries of Harris Burdick. I’ve occasionally thought about writing a story revolving around the sudden appearance of Professor DeArmond, “the epitome of evil among butterfly collectors,” or expanding on the incomparable caption: “Dwayne paused. As usual, the forest was full of happy little animals—but this time something seemed awry.” It’s hard to pick just one favorite, but the panel I’ve thought about the most is probably the one with the elephant in the trench coat, speaking in a low voice out of the darkness of the stairwell:
Remember me, Mr. Schneider? Kenya. 1947. If you’re going to shoot at an elephant, Mr. Schneider, you better be prepared to finish the job.
Years later, I spent an ungodly amount of time working on a novel, still unpublished, about an elephant hunt, and while I wouldn’t go so far as to say that it was inspired by this cartoon, I’m also not prepared to say that it wasn’t. I should also note Larson’s mastery of perfect proper names, which are harder to come up with than you might think: “Mr. Frischberg” and “Mr. Schneider” were so nice that he said them twice. It’s that inimitable mixture of the ridiculous and the specific that makes Larson such a model for storytellers. He made it to the far side thirty years ago, and we’re just catching up to him now.
There are two sorts of commentary tracks. The first kind is recorded shortly after a movie or television season is finished, or even while it’s still being edited or mixed, and before it comes out in theaters. Because their memories of the production are still vivid, the participants tend to be a little giddy, even punch drunk, and their feelings about the movie are raw: “The wound is still open,” as Jonathan Franzen put it to Slate. They don’t have any distance, and they remember everything, which means that they can easily get sidetracked into irrelevant detail. They don’t yet know what is and isn’t important. Most of all, they don’t know how the film did with viewers or critics, so their commentary becomes a kind of time capsule, sometimes laden with irony. The second kind of commentary is recorded long after the fact, either for a special edition, for the release of an older movie in a new format, or for a television series that is catching up with its early episodes. These tend to be less predictable in quality: while commentaries on recent work all start to sound more or less the same, the ones that reach deeper into the past are either disappointingly superficial or hugely insightful, without much room in between. Memories inevitably fade with time, but this can also allow the artist to be more honest about the result, and the knowledge of how the work was ultimately received adds another layer of interest. (For instance, one of my favorite commentaries from The Simpsons is for “The Principal and the Pauper,” with writer Ken Keeler and others ranting against the fans who declared it—preemptively, it seems safe to say—the worst episode ever.)
Perhaps most interesting of all are the audio commentaries that begin as the first kind, but end up as the second. You can hear it on the bonus features for The Lord of the Rings, in which, if memory serves, Peter Jackson and his cowriters start by talking about a movie that they finished years ago, continue by discussing a movie that they haven’t finished editing yet, and end by recording their comments for The Return of the King after it won the Oscar for Best Picture. (This leads to moments like the one for The Two Towers in which Jackson lays out his reasoning for pushing the confrontation with Saruman to the next movie—which wound up being cut for the theatrical release.) You also see it, on a more modest level, on the author’s commentaries I’ve just finished writing for my three novels. I began the commentary on The Icon Thief way back on April 30, 2012, or less than two months after the book itself came out. At the time, City of Exiles was still half a year away from being released, and I was just beginning the first draft of the novel that I still thought would be called The Scythian. I had a bit of distance from The Icon Thief, since I’d written a whole book and started another in the meantime, but I was still close enough that I remembered pretty much everything from the writing process. In my earliest posts, you can sense me trying to strike the right balance between providing specific anecdotes about the novel itself and offering more general thoughts on storytelling, while using the book mostly as a source of examples. And I eventually reached a compromise that I hoped would allow those who had actually read the book to learn something about how it was put together, while still being useful to those who hadn’t.
As a result, the commentaries began to stray further from the books themselves, usually returning to the novel under discussion only in the final paragraph. I did this partly to keep the posts accessible to nonreaders, but also because my own relationship with the material had changed. Yesterday, when I posted the last entry in my commentary on Eternal Empire, almost four years had passed since I finished the first draft of that novel. Four years is a long time, and it’s even longer in writing terms. If every new project puts a wall between you and the previous one, a series of barricades stands between these novels and me: I’ve since worked on a couple of book-length manuscripts that never got off the ground, a bunch of short stories, a lot of occasional writing, and my ongoing nonfiction project. With each new endeavor, the memory of the earlier ones grows dimmer, and when I go back to look at Eternal Empire now, not only do I barely remember writing it, but I’m often surprised by my own plot. This estrangement from a work that consumed a year of my life is a little sad, but it’s also unavoidable: you can’t keep all this information in your head and still stay sane. Amnesia is a coping strategy. We’re all programmed to forget many of our experiences—as well as our past selves—to free up capacity for the present. A novel is different, because it exists in a form outside the brain. Any book is a piece of its writer, and it can be as disorienting to revisit it as it is to read an old diary. As François Mauriac put it: “It is as painful as reading old letters…We touch it like a thing: a handful of ashes, of dust.” I’m not quite at that point with Eternal Empire, but I’ll sometimes read a whole series of chapters and think to myself, where did that come from?
Under the circumstances, I should count myself lucky that I’m still reasonably happy with how these novels turned out, since I have no choice but to be objective about it. There are things that I’d love to change, of course: sections that run too long, others that seem underdeveloped, conceits that seem too precious or farfetched or convenient. At times, I can see myself taking the easy way out, going with a shortcut or ignoring a possible implication because I lacked the time or energy to do it justice. (I don’t necessarily regret this: half of any writing project involves conserving your resources for when it really matters.) But I’m also surprised by good ideas or connections that seem to have come from outside of me, as if, to use Isaac Asimov’s phrase, I were writing over my own head. Occasionally, I’ll have trouble following my own logic, and the result is less a commentary than a forensic reconstruction of what I must have been thinking at the time. But if I find it hard to remember my reasoning today, it’s easier now than it will be next year, or after another decade. As I suspected at the time, the commentary exists more for me than for anybody else. It’s where I wrote down my feelings about a series of novels that once dominated my life, and which now seem like a distant memory. While I didn’t devote nearly as many hours to these commentaries as I did to the books themselves, they were written over a comparable stretch of time. And now that I’ve gotten to the point of writing a commentary on my commentary—well, it’s pretty clear that it’s time to stop.