Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What famous person’s life would you want to assume?”
“Celebrity,” John Updike once wrote, “is a mask that eats into the face.” And Updike would have known, having been one of the most famous—and the most envied—literary novelists of his generation, with a career that seemed to consist of nothing but the serene annual production of poems, stories, essays, and hardcovers that, with their dust jackets removed, turned out to have been bound and designed as a uniform edition. From the very beginning, Updike was already thinking about how his complete works would look on library shelves. That remarkable equanimity made an impression on the writer Nicholson Baker, who wrote in his book U & I:
I compared my awkward self-promotion too with a documentary about Updike that I saw in 1983, I believe, on public TV, in which, in one scene, as the camera follows his climb up a ladder at his mother’s house to put up or take down some storm windows, in the midst of this tricky physical act, he tosses down to us some startlingly lucid little felicity, something about “These small yearly duties which blah blah blah,” and I was stunned to recognize that in Updike we were dealing with a man so naturally verbal that he could write his fucking memoirs on a ladder!
Plenty of writers, young or old, might have wanted to switch places with Updike, although the first rule of inhabiting someone else’s life is that you don’t want to be a writer. (The Updike we see in Adam Begley’s recent biography comes across as more unruffled than most, but all those extramarital affairs in Ipswich must have been exhausting.) Writing might seem like an attractive kind of celebrity: you can inspire fierce devotion in a small community of fans while remaining safely anonymous in a restaurant or airport. You don’t even need to go as far as Thomas Pynchon: how many of us could really pick Michael Chabon or Don DeLillo or Cormac McCarthy out of a crowd? Yet that kind of seclusion carries a psychological toll as well, and I suspect that the daily life of any author, no matter how rich or acclaimed, looks much the same as any other. If you want to know what it’s like to be old, Malcolm Cowley wrote: “Put cotton in your ears and pebbles in your shoes. Pull on rubber gloves. Smear Vaseline over your glasses, and there you have it: instant old age.” And if you want to know what it’s like to be a novelist, you can fill a room with books and papers, go inside, close the door, and stay there for as long as possible while doing absolutely nothing that an outside observer would find interesting. Ninety percent of a writer’s working life looks more or less like that.
What kind of celebrity, then, do you really want to be? If celebrity is a mask, as Updike says, it might be best to make it explicit. Being a member of Daft Punk, say, would allow you to bask in the adulation of a stadium show, then remove your helmet and take the bus back to your hotel without any risk of being recognized. The mask doesn’t need to be literal, either: I have a feeling that Lady Gaga could dress down in a hoodie and ponytail and order a latte at any Starbucks in the country without being mobbed. The trouble, of course, with taking on the identity of a total unknown—Banksy, for instance—is that you’re buying the equivalent of a pig in a poke: you just don’t know what you’re getting. Ideally, you’d switch places with a celebrity whose life has been exhaustively chronicled, either by himself or others, so that there aren’t any unpleasant surprises. It’s probably best to also go with someone slightly advanced in years: as Solon says in Herodotus, you don’t really know how happy someone’s life is until it’s over, and the next best thing would be a person whose legacy seems more or less fixed. (There are dangers there, too, as Bill Cosby knows.) And maybe you want someone with a rich trove of memories of a life spent courting risk and uncertainty, but who has since mellowed into something slightly more stable, with the aura of those past accomplishments still intact.
You also want someone with the kind of career that attracts devoted collaborators, which is the only kind of artistic wealth that really counts. But you don’t want too much fame or power, both of which can become traps in themselves. In many respects, then, what you’d want is something close to the life of half and half that Lin Yutang described so beautifully: “A man living in half-fame and semi-obscurity.” Take it too far, though, and you start to inch away from whatever we call celebrity these days. (Only in today’s world can an otherwise thoughtful profile of Brie Larson talk about her “relative anonymity.”) And there are times when a touch of recognition in public can be a welcome boost to your ego, as it is for Sally Field in Soapdish, as long as you’re accosted by people with the same basic mindset, rather than those who just recognize you from Instagram. You want, in short, to be someone who can do pretty much what he likes, but less because of material resources than because of a personality that makes the impossible happen. You want to be someone who can tell an interviewer: “Throughout my life I have been able to do what I truly love, which is more valuable than any cash you could throw at me…So long as I have a roof over my head, something to read and something to eat, all is fine…What makes me so rich is that I am welcomed almost everywhere.” You want to be Werner Herzog.
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What makes a great trailer?”
A few years ago, in a post about The Cabin in the Woods, which is one of a small handful of recent films I still think about on a regular basis, I wrote:
If there’s one thing we’ve learned about American movie audiences over the past decade or so, it’s that they don’t like being surprised. They may say that they do, and they certainly respond positively to twist endings, properly delivered, within the conventions of the genre they were hoping to see. What they don’t like is going to a movie expecting one thing and being given something else. And while this is sometimes a justifiable response to misleading ads and trailers, it can also be a form of resentment at having one’s expectations upended.
I went on to quote a thoughtful analysis from Box Office Mojo, which put its finger on why the movie scored so badly with audiences:
By delivering something much different, the movie delighted a small group of audience members while generally frustrating those whose expectations were subverted. Moviegoers like to know what they are in for when they go to see a movie, and when it turns out to be something different the movie tends to get punished in exit polling.
And the funny thing is that you can’t really blame the audience for this. If you think of a movie primarily as a commercial product that you’ve paid ten dollars or more to see—which doesn’t even cover the ancillary costs of finding a babysitter and driving to and from the theater—you’re likely to be frustrated if it turns out to be something different from what you were expecting. This is especially the case if you only see a few movies a year, and doubly so if you avoid the reviews and base your decisions solely on trailers, social media, or the presence of a reliable star. In practice, this means that certain surprises are acceptable, while others aren’t. It’s fine if the genre you’re watching all but requires there to be a twist, even if it strains all logic or openly cheats. (A lot of people apparently liked Now You See Me.) But if the twist takes you out of the genre that you thought you were paying to see, viewers tend to get angry. Genre, in many ways, is the most useful metric for deciding where to put your money: if you pay to see an action movie or a romantic comedy or a slasher film, you have a pretty good sense of the story beats you’re going to experience. A movie that poses as one genre and turns out to be another feels like flagrant false advertising, and it leaves many viewers feeling ripped off.
As a result, it’s probably no longer possible for a mainstream movie to radically change in tone halfway through, at least not in a way that hasn’t been spoiled by trailers. Few viewers, I suspect, went into From Dusk Till Dawn without knowing that a bunch of vampires were coming, and a film like Psycho couldn’t be made today at all. (Any attempt to preserve the movie’s secrets in the ads would be seen, after the fact, as a tragic miscalculation in marketing, as many industry insiders thought it was for The Cabin in the Woods.) There’s an interesting exception to this rule, though, and it applies to trailers themselves. Unless it’s for something like The Force Awakens, a trailer, by definition, isn’t something you’ve paid to see: you don’t have any particular investment in what it’s showing you, and it’s only going to claim your attention for a couple of minutes. As a result, trailers can indulge in all kinds of formal experiments that movies can’t, and probably shouldn’t, attempt at feature length. For the most part, trailers aren’t edited according to the same rules as movies, and they’re often cut together by a separate team of editors who are looking at the footage using a very different set of criteria. And as it turns out, one of the most reliable conventions of movie trailers is the old switcheroo: you start off in one genre, then shift abruptly to another, often accompanied by a needle scratch or ominous music cue.
In other words, trailers frequently try to appeal to audiences using exactly the kind of surprise that the movies themselves can no longer provide. Sometimes a trailer starts off realistically, only to introduce monsters or aliens, as Cloverfield and District 9 did so memorably, and trailers never tire of the gimmick of giving us what looks like a romantic comedy before switching into thriller mode. The ultimate example, to my mind, remains Vanilla Sky, which is still one of my favorite trailers. When I saw it for the first time, the genre switcheroo wasn’t as overused as it later became, and the result knocked me sideways. By now, most of its tricks have become clichés in themselves, down to its use of “Solsbury Hill,” so maybe you’ll have to take my word for it when I say that it was unbelievably effective. (In some ways, I wish the movie, which I also love, had followed the trailer’s template more closely, instead of tipping its hand early on about the weirdness to come.) And I suspect that such trailers, with their ability to cross genre boundaries, represent a kind of longing by directors for the sorts of films that they’d really like to make. The logic of the marketplace has made it impossible for such surprises to survive in the finished product, but a trailer can serve as a sort of miniature version of what it might have been under different circumstances. This isn’t always true: in most cases, the studio just cuts together a trailer for the movie that they wish the director had made, rather than the one that he actually delivered. But every now and then, a great trailer can feel like a glimpse of a movie’s inner, secret life, even if it turns out that it was all a dream.
When I was in my early twenties, I was astonished to learn that “One,” “Coconut,” the soundtrack to The Point, and “He Needs Me”—as sung by Shelley Duvall in Popeye and, much later, in Punch-Drunk Love—were all written by the same man, who also sang “Everybody’s Talkin'” from Midnight Cowboy. (This doesn’t even cover “Without You” or “Jump Into the Fire,” which I discovered only later, and it also ignores some of the weirder detours in Harry Nilsson’s huge discography.) At the time, I was reminded of Homer Simpson’s response when Lisa told him that bacon, ham, and pork chops all came from the same animal: “Yeah, right, Lisa. A wonderful, magical animal.” Which is exactly what Nilsson was. But it’s also the kind of diversity that arises from decades of productive, idiosyncratic work. Nilsson was a facile songwriter with a lot of tricks up his sleeve, as he notes in an interview in the book Songwriters on Songwriting:
Most [songs] I find you can write in less time than it takes to sing them. The concept, if there is a concept, or the hook, is all you’re concerned with. Because you know you can go back and fill in the pieces. If you get a front line and a punch line, it’s a question of just filling in the missing bits.
And given Nilsson’s diverse, prolific output, it shouldn’t come as a surprise that I encountered him in so many different guises before realizing that they were all aspects of a single creative personality.
Of course, not every career generates this kind of enticing randomness. Nilsson occupied a curious position for much of his life, stuck somewhere halfway between superstardom and seclusion, and it freed him to make a long series of peculiar choices. When other artists end up in the same position, it’s often less by choice than by necessity. When you look at the résumé of a veteran supporting actor or working writer, you usually find that it resists easy categorization, since each credit resulted from a confluence of circumstances that may never be repeated. A glance at the filmography of any character actor inspires moment after moment of recognition, as you realize, for instance, that the same guy who played Mr. Noodle on Sesame Street was also the dad in Rachel Getting Married and TARS in Interstellar. A few artists have the luxury of shaping careers that seem all of a piece, but others aren’t all that interested in it, or find that their body of work is determined more by external factors. Most actors aren’t in a position to turn down a paycheck, and learning how and why they took one role and not another is part of what makes Will Harris’s “Random Roles” interviews for The A.V. Club so fascinating. When you’re at the constant mercy of trends and casting agents, you can end up with a career that looks like it should belong to three different people. And as someone like Matthew McConaughey can tell you, that goes for stars as well.
It’s particularly true of actresses. I’ve spoken here before of the starlet’s dilemma, in which young actresses are required to balance the need to extend their shelf life as ingenues for a few more seasons against the very different set of choices required to sustain a career over decades. In many cases, the decisions that make sense now, like undergoing cosmetic surgery, can come back to haunt them later, but the pressure to extend their prime earning years is immense, and it’s no surprise that few manage to navigate the pitfalls that Hollywood presents. I was reminded of this while leafing—never mind why—through the latest issue of Allure, which features Jessica Alba on its cover. Alba has recently begun a second act as the head of her own consumer goods company, and she seems far happier and more satisfied in that role than she ever was as an actress: she admits that she tried to be what everyone else wanted her to be, and she accepted roles and made choices without a larger plan in mind. The result, sadly, was a career without shape or character, determined by an industry that could never decide whether Alba was best suited for comedy, romance, or action. I don’t think any of her movies will still be watched twenty years from now, and I expect that we’ll be surprised one day to remember that the founder of the Honest Company was also a movie star, in the way it amuses us to reflect that Martha Stewart used to be a model.
So how do you end up with a career more like Nilsson’s and less like Alba’s, given the countless uncontrollable factors that govern a life in the arts? You can begin, perhaps, by remembering that an artist, like any human being, will play many roles, and not all of them are going to be consistent. When you look back at what you’ve done, it can be hard to find any particular shape, aside from what was determined by the needs of the moment, and it may even be difficult to recognize the person who thought that a particular project was a good idea—if you had any choice in the matter at all. (When I look at my own career, I find that it divides neatly in two, with one half in science fiction and the other in suspense, with no overlap between them whatsoever, a situation that was created almost entirely by the demands of the market.) But if you need to wear multiple hats, or even multiple personalities, you can at least strive to make all of them interesting. A foolish consistency, as Emerson puts it, is the hobgoblin of little minds, and it’s an equally elusive goal in the arts: the only way to be consistent is to be dependably mediocre. The life you get by staying true to yourself in the face of external pressure will be more interesting than the one that results from a perfect plan. It can even be easier to have two careers than one. And if you try too hard to make everything fit into a single frame, you might find that one is the loneliest number.
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What piece of art has actually stopped you in your tracks?”
“All art constantly aspires toward the condition of music,” Walter Pater famously said, but these days, it seems more accurate to say that all art aspires toward the condition of advertising. There’s always been a dialogue between the two, of course, and it runs in both directions, with commercials and print ads picking up on advances in the fine arts, even as artists begin to utilize techniques initially developed on Madison Avenue. Advertising is a particularly ruthless medium—you have only a few seconds to grab the viewer’s attention—and the combination of quick turnover, rapid feedback, and intense financial pressure allows innovations to be adapted and refined with blinding speed, at least within a certain narrow range. (There’s a real sense in which the hard lessons that Jim Henson, say, learned while shooting commercials for Wilkins Coffee are what made Sesame Street so successful.) The difference today is that the push for virality—the need to attract eyeballs in brutal competition with countless potential diversions—has superseded all other considerations, including the ability to grow and maintain an audience. When thousands of “content providers” are fighting for our time on equal terms, there’s no particular reason to remain loyal to any one of them. Everything is an ad now, and it’s selling nothing but itself.
This isn’t a new idea, and I’ve written about it here at length before. What really interests me, though, is how even the most successful examples of storytelling are judged by how effectively they point to some undefined future product. The Marvel movies are essentially commercials or trailers for the idea of a superhero film: every installment builds to a big, meaningless battle that serves as a preview for the confrontation in an upcoming sequel, and we know that nothing can ever truly upset the status quo when the studio’s slate of tentpole releases has already been announced well into the next decade. They aren’t bad films, but they’re just ever so slightly better than they have to be, and I don’t have much of an interest in seeing any more. (Man of Steel has plenty of problems, but at least it represents an actual point of view and an attempt to work through its considerable confusions, and I’d sooner watch it again than The Avengers.) Marvel is fortunate enough to possess one of the few brands capable of maintaining an audience, and it’s petrified at the thought of losing it with anything so upsetting as a genuine surprise. And you can’t blame anyone involved. As Christopher McQuarrie aptly puts it, everyone in Hollywood is “terribly lost and desperately in need of help,” and the last thing Marvel or Disney wants is to turn one of the last reliable franchises into anything less than a predictable stream of cash flows. The pop culture pundits who criticize it—many of whom may not have jobs this time next year—should be so lucky.
But it’s unclear where this leaves the rest of us, especially with the question of how to catch the viewer’s eye while inspiring an engagement that lasts. The human brain is wired in such a way that the images or ideas that seize its attention most easily aren’t likely to retain it over the long term: the quicker the impression, the sooner it evaporates, perhaps because it naturally appeals to our most superficial impulses. Which only means that it’s worth taking a close look at works of art that both capture our interest and reward it. It’s like going to an art gallery. You wander from room to room, glancing at most of the exhibits for just a few seconds, but every now and then, you see something that won’t let go. Usually, it only manages to intrigue you for the minute it takes to read the explanatory text beside it, but occasionally, the impression it makes is a lasting one. Speaking from personal experience, I can think of two revelatory moments in which a glimpse of a picture out of the corner of my eye led to a lifelong obsession. One was Cindy Sherman’s Untitled Film Stills; the other was the silhouette work of Kara Walker. They could hardly be more different, but both succeed because they evoke something to which we instinctively respond—movie archetypes and clichés in Sherman’s case, classic children’s illustrations in Walker’s—and then force us to question why they appealed to us in the first place.
And they manage to have it both ways to an extent that most artists would have reason to envy. Sherman’s film stills both parody and exploit the attitudes that they meticulously reconstruct: they wouldn’t be nearly as effective if they didn’t also serve as pin-ups for readers of Art in America. Similarly, Walker’s cutouts fill us with a kind of uneasy nostalgia for the picture books we read growing up, even as they investigate the darkest subjects imaginable. (They also raise fascinating questions about intentionality. Sherman, like David Lynch, can come across as a naif in interviews, while Walker is closer to Michael Haneke, an artist who is nothing if not completely aware of how each effect was achieved.) That strange combination of surface appeal and paradoxical depth may be the most promising angle of attack that artists currently have. You could say much the same about Vijith Assar’s recent piece for McSweeney’s about ambiguous grammar, which starts out as the kind of viral article that we all love to pass around—the animated graphics, the prepackaged nuggets of insight—only to end on a sweet sucker punch. The future of art may lie in forms that seize on the tools of virality while making us think twice about why we’re tempted to click the share button. And it requires artists of unbelievable virtuosity, who are able to exactly replicate the conditions of viral success while infusing them with a white-hot irony. It isn’t easy, but nothing worth doing ever is. This is the game we’re all playing, like it or not, and the artists who are most likely to survive are the ones who can catch the eye while also burrowing into the brain.
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What non-comic creative type do you want to see make a comic?”
Earlier this year, I discovered Radio: An Illustrated Guide, the nifty little manual written by cartoonist Jessica Abel and Ira Glass of This American Life. At the time, the book’s premise struck me as a subtle joke in its own right, and I wrote:
The idea of a visual guide to radio is faintly amusing in itself, particularly when you consider the differences between the two art forms: comics are about as nonlinear a medium as you can get between two covers, with the reader’s eye prone to skip freely across the page.
The more I think about it, though, the more it seems to me that these two art forms share surprising affinities. They’re both venerable mediums with histories that stretch back for close to a century, and they’ve both positioned themselves in relation to a third, invisible other, namely film and television. On a practical level, whether its proponents like it or not, both radio and comics have come to be defined by the ways in which they depart from what a movie or television show can do. In the absence of any visual cues, radio has to relentlessly manage the listener’s attention—”Anecdote then reflection, over and over,” as Glass puts it—and much of the grammar of the comic book emerged from attempts to replicate, transcend, and improve upon the way images are juxtaposed in the editing room.
And smart practitioners in both fields have always found ways of learning from their imposing big brothers, while remaining true to the possibilities that their chosen formats offer in themselves. As Daniel Clowes once said:
To me, the most useful experience in working in “the film industry” has been watching and learning the editing process. You can write whatever you want and try to film whatever you want, but the whole thing really happens in that editing room. How do you edit comics? If you do them in a certain way, the standard way, it’s basically impossible. That’s what led me to this approach of breaking my stories into segments that all have a beginning and end on one, two, three pages. This makes it much easier to shift things around, to rearrange parts of the story sequence.
Meanwhile, the success of a podcast like Serial represents both an attempt to draw upon the lessons of modern prestige television and a return to the roots of this kind of storytelling. Radio has done serialized narratives better than any other art form, and Serial, for all its flaws, was an ambitious attempt to reframe those traditions in a shape that spoke to contemporary listeners.
What’s a little surprising is that we haven’t witnessed a similar mainstream renaissance in nonfiction comics, particularly from writers and directors who have made their mark in traditional documentaries. Nonfiction has long been central to the comic format, of course, ranging from memoirs like Maus or Persepolis to more didactic works like Logicomix or The Cartoon History of the Universe. More recently, webcomics like The Oatmeal or Randall Munroe’s What If? have explained complicated issues in remarkable ways. What I’d really love to see, though, are original works of documentary storytelling in comic book form, the graphic novel equivalent of This American Life. You could say that the reenactments we see in works like Man on Wire or The Jinx, and even the animated segments in the films of Brett Morgen, are attempts to push against the resources to which documentaries have traditionally been restricted, particularly when it comes to stories set in the past—talking heads, archive footage, and the obligatory Ken Burns effect. At times, such reconstructions can feel like cheating, as if the director were bristling at having to work with the available material. Telling such stories in the form of comics instead would be an elegant way of circumventing those limitations while remaining true to the medium’s logic.
And certain documentaries would work even better as comics, particularly if they require the audience to process large amounts of complicated detail. Serial, with its endless, somewhat confusing discussions of timelines and cell phone towers, might have worked better as a comic book, which would have allowed readers to review the chain of events more easily. And a director like Errol Morris, who has made brilliant use of diagrams and illustrations in his published work, would be a natural fit. There’s no denying that some documentaries would lose something in the translation: the haunted face of Robert Durst in The Jinx has a power that can’t be replicated in a comic panel. But comics, at their best, are an astonishing way of conveying and managing information, and for certain stories, I can’t imagine anything more effective. We’re living in a time in which we seem to be confronting complex systems every day, and as a result, artists of all kinds have begun to address what Zadie Smith has called the problem of “how the world works,” with stories that are as much about data, interpretation, and information overload as about individual human beings. For the latter, narrative formats that can offer us a real face or voice may still hold an edge. But for many of the subjects that documentarians in film, television, or radio will continue to tackle, the comics may be the best solution they’ll ever have.
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What’s your absolute favorite piece of media so far this year?”
Earlier this week, while exploring the question of why we say someone is “in” a movie but “on” a television series, I got to thinking about the significance of the television set itself as a physical object. It’s hard to imagine a more ubiquitous appliance: a hotel room that contains nothing else but a bed and a toilet seems bare without that blank screen in the corner, and we encounter them in every waiting room, bar, and airport terminal. Television is a utility, like heat or gas, and when we talk about channels or airwaves, we’re making a subconscious analogy to running water. Most households have one, to the point where its absence is worth mentioning, and choosing not to own a television amounts to a political or lifestyle statement. Or at least it once did. Back when I was in college, acquiring a television set was a big deal: it freed us from the tyranny of the common room, where I had to stake a claim to watch everything from The X-Files to that one time R.E.M. appeared on Sesame Street. Nowadays, fewer college kids seem to make owning a set a priority, and if they do, it’s more likely to be used for gaming. We have plenty of other screens that can do the same work as well or better, and in a decade or two, television sets may seem like dusty relics, kept out of nostalgia or inertia, like the radios or electric organs in the parlor of your grandmother’s house.
Yet that box still carries a psychological significance. It serves as a reminder, or even an advertisement, of the fact that television exists. We still switch it on out of habit, just because it’s there, and even those of us who don’t use it as a source of background noise are likely to flip through the channels as soon as we drop our bags in the aforementioned hotel room. The same qualities that make it seem vaguely anachronistic—the way it’s tethered to a bulky, immovable object, or how the flow of information goes only one way—are a big part of its lingering appeal. It doesn’t demand anything of us, except that we keep it in our line of sight, and it remains an ideal source of distraction and consolation for loners, agoraphobes, and new parents. Even as we migrate to other sources of content, television stands at the center of that solar system: maybe a quarter of the time I spend online is devoted to scrolling through news, criticism, episode recaps, or think pieces about the shows I like, which is more than I spend reading about politics, current events, or just about anything else. Even when that screen in the corner remains dark, it throws out its tendrils into whatever browser window happens to be open. It’s the Cthulhu of pop culture, invading the dreams of its followers even as it slumbers in the deep.
And you can see the impact on this blog. Over the last six months, the only film released this year to which I’ve devoted a complete post, somewhat hilariously, is Blackhat, which was seen by fewer moviegoers over its entire run than turn out on a good afternoon for Jurassic World. I haven’t written about any new books at all—the most recent novel I’ve finished reading, The Goldfinch, was published two years ago. Like most people in their middle thirties, my knowledge of current music is actively embarrassing. Yet over the same period, I’ve written extensively about television shows like Parks and Recreation, House of Cards, Glee, The Jinx, Unbreakable Kimmy Schmidt, The Vampire Diaries, Mad Men, Community, Game of Thrones, True Detective, and Hannibal. I don’t even think of myself as a television fan, at least not in the way I love the movies, but my shift in that direction has been as decisive as it was inevitable. A lot of this is due to the fact that I just don’t get out as much as I once did, except to bring my daughter to the playground or library. But if I’ve embraced television instead of becoming a better reader or catching up on music, it tells us something about how that medium insinuates itself so readily into the pockets of time that remain.
Television, after all, is infinitely expandable or compressible, as long as you extend its definition to other forms of streaming content. It can take up weeks of your life or a minute or two at a time. If you want to be told a novelistic story, it’s happy to oblige, but it’s equally capable of delivering a quick laugh or a snackable dose of diversion. And at a time when my life sometimes seems packed to bursting with the demands of work and parenthood, it’s glad to take up whatever bandwidth remains. I can give it as much, or as little, energy as I like. My wife listens to podcasts for much the same reason, and radio has certainly mastered the trick of rewarding a wide range of attentiveness: even the best radio programs encourage their listeners to do as little thinking for themselves as possible. And if I’ve stuck with television instead, it’s because it was there already, just waiting for me to turn on the faucet. It reminds me of Stephen Covey’s parable of the jar of rocks, although with the opposite moral: even when it seems full, you can pour in a little more water until all the nooks and crannies are filled. Television has had decades of practice at filling us up to the brim, and lucky for me, it’s been a great six months. (For the record, the best things I’ve seen so far this year are The Jinx and the Mad Men finale.) But if television is the water in the jar, books, movies, and music are the rocks. This isn’t a value judgment, just an observation. And as Covey likes to say, if you don’t put the big rocks in first, you’ll never get them in at all.
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What individual instances of product placement in movies and television have you found most effective?”
One of the small but consistently troublesome issues that every writer faces is what to do about brand names. We’re surrounded by brands wherever we look, and we casually think and talk about them all the time. In fiction, though, the mention of a specific brand often causes a slight blip in the narrative: we find ourselves asking if the character in question would really be using that product, or why the author introduced it at all, and if it isn’t handled well, it can take us out of the story. Which isn’t to say that such references don’t have their uses. John Gardner puts it well in The Art of Fiction:
The writer, if it suits him, should also know and occasionally use brand names, since they help to characterize. The people who drive Toyotas are not the same people who drive BMWs, and people who brush with Crest are different from those who use Pepsodent or, on the other hand, one of the health-food brands made of eggplant. (In super-realist fiction, brand names are more important than the characters they describe.)
And sometimes the clever deployment of brands can be another weapon in the writer’s arsenal, although it usually only works when the author already possesses a formidable descriptive vocabulary. Nicholson Baker is a master of this, and it doesn’t get any better than Updike in Rabbit Is Rich:
In the bathroom Harry sees that Ronnie uses shaving cream, Gillette Foamy, out of a pressure can, the kind that’s eating up the ozone so our children will fry. And that new kind of razor with the narrow single-edge blade that snaps in and out with a click on the television commercials. Harry can’t see the point, it’s just more waste, he still uses a rusty old two-edge safety razor he bought for $1.99 about seven years ago, and lathers himself with an old imitation badger-bristle on whatever bar of soap is handy…
For the rest of us, though, I’d say that brand names are one of those places where fiction has to retreat slightly from reality in order to preserve the illusion. Just as dialogue in fiction tends to be more direct and concise than it would be in real life, characters should probably refer to specific brands a little less often than they really would. (This is particularly true when it comes to rapidly changing technology, which can date a story immediately.)
In movies and television, a prominently featured brand sets off a different train of thought: we stop paying attention to the story and wonder if we’re looking at deliberate product placement—if there’s even any question at all. Even a show as densely packed as The Vampire Diaries regularly takes a minute to serve up a commercial for the likes of AT&T MiFi, and shows like Community have turned paid brand integration into entire self-mocking subplots, while still accepting the sponsor’s money, which feels like a textbook example of having it both ways. Tony Pace of Subway explains their strategy in simple terms: “We are kind of looking to be an invited guest with a speaking role.” Which is exactly what happened on Community—and since it was reasonably funny, and it allowed the show to skate along for another couple of episodes, I didn’t really care. When it’s handled poorly, though, this ironic, winking form of product placement can be even more grating than the conventional kind. It flatters us into thinking that we’re all in on the joke, although it isn’t hard to imagine cases where corporate sponsorship, embedded so deeply into a show’s fabric, wouldn’t be so cute and innocuous. Even under the best of circumstances, it’s a fake version of irreverence, done on a company’s terms. And if there’s a joke here, it’s probably on us.
Paid or not, product placement works, at least on me, although often in peculiar forms. I drank Heineken for years because of Blue Velvet, and looking around my house, I see all kinds of products or items that I bought to recapture a moment from pop culture, whether it’s the Pantone mug that reminds me of a Magnetic Fields song or the Spyderco knife that carries the Hannibal seal of approval. (I’ve complained elsewhere about the use of snobbish brand names in Thomas Harris, but it’s a beautiful little object, even if I don’t expect to use it exactly as Lecter does.) If it’s kept within bounds, it’s a mostly harmless way of establishing a connection between us and something we love, but it always ends up feeling a little empty. Which may be why brand names sit so uncomfortably in fiction. Brands or corporations use many of the same strategies as art to generate an emotional response, except the former is constantly on message, unambiguous, and designed to further a specific end. It’s no accident that there are so many affinities between advertising and propaganda. A good work of art, by contrast, is ambiguous, open to multiple interpretations, and asks nothing of us aside from an investment of time—which is the opposite of what a brand wants. Fiction and brands are always going to live together, either because they’ve been paid to do so or because it’s an accurate reflection of our world. But we’re more than just consumers. And art, at its best, should remind us of this.