Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


Peak television and the future of stardom


Kevin Costner in The Postman

Earlier this week, I devoured the long, excellent article by Josef Adalian and Maria Elena Fernandez of Vulture on the business of peak television. It’s full of useful insights and even better gossip—and it names plenty of names—but there’s one passage that really caught my eye, in a section about the huge salaries that movie stars are being paid to make the switch to the small screen:

A top agent defends the sums his clients are commanding, explaining that, in the overall scheme of things, the extra money isn’t all that significant. “Look at it this way,” he says. “If you’re Amazon and you’re going to launch a David E. Kelley show, that’s gonna cost $4 million an episode [to produce], right? That’s $40 million. You can have Bradley Whitford starring in it, [who is] gonna cost you $150,000 an episode. That’s $1.5 million of your $40 million. Or you could spend another $3.5 million [to get Costner] on what will end up being a $60 million investment by the time you market and promote it. You can either spend $60 [million] and have the Bradley Whitford show, or $63.5 [million] and have the Kevin Costner show. It makes a lot of sense when you look at it that way.”

With all due apologies to Bradley Whitford, I found this thought experiment fascinating, and not just for the reasons that the agent presumably shared it. It implies, for one thing, that television—which is often said to be overtaking Hollywood in terms of quality—is becoming more like feature filmmaking in another respect: it’s the last refuge of the traditional star. We frequently hear that movie stardom is dead and that audiences are drawn more to franchises than to recognizable faces, so the fact that cable and streaming networks seem intensely interested in signing film stars, in a post-True Detective world, implies that their model is different. Some of it may be due to the fact, as William Goldman once said, that no studio executive ever got fired for hiring a movie star: as the new platforms fight to establish themselves, it makes sense that they’d fall back on the idea of star power, which is one of the few things that corporate storytelling has ever been able to quantify or understand. It may also be because the marketing strategy for television inherently differs from that for film: an online series is unusually dependent on media coverage to stand out from the pack, and signing a star always generates headlines. Or at least it once did. (The Vulture article notes that Woody Allen’s new series for Amazon “may end up marking peak Peak TV,” and it seems a lot like a deal that was made for the sake of the coverage it would produce.)

Kevin Costner in JFK

But the most plausible explanation lies in simple economics. As the article explains, Netflix and the other streaming companies operate according to a “cost-plus” model: “Rather than holding out the promise of syndication gold, the company instead pays its studio and showrunner talent a guaranteed up-front profit—typically twenty or thirty percent above what it takes to make a show. In exchange, it owns all or most of the rights to distribute the show, domestically and internationally.” This limits the initial risk to the studio, but also the potential upside: nobody involved in producing the show itself will see any money on the back end. In addition, it means that even the lead actors of the series are paid a flat dollar amount, which makes them a more attractive investment than they might be for a movie. Most of the major stars in Hollywood earn gross points, which means that they get a cut of the box office receipts before the film turns a profit—a “first dollar” deal that makes the mathematics of breaking even much more complicated. The thought experiment about Bradley Whitford and Kevin Costner only makes sense if you can get Costner at a fixed salary per episode. In other words, movie stars are being actively courted by television because its model is a throwback to an earlier era, when actors were held under contract by a studio without any profit participation, and before stars and their agents negotiated better deals that ended up undermining the economic basis of the star system entirely.

And it’s revealing that Costner, of all actors, appears in this example. His name came up mostly because multiple sources told Vulture that he was offered $500,000 per episode to star in a streaming series: “He passed,” the article says, “but industry insiders predict he’ll eventually say ‘yes’ to the right offer.” But he also resonates because he stands for a kind of movie stardom that was already on the wane when he first became famous. It has something to do with the quintessentially American roles that he liked to play—even JFK is starting to seem like the last great national epic—and an aura that somehow kept him in leading parts two decades after his career as a major star was essentially over. That’s weirdly impressive in itself, and it testifies to how intriguing a figure he remains, even if audiences aren’t likely to pay to see him in a movie. Whenever I think of Costner, I remember what the studio executive Mike Medavoy once claimed to have told him right at the beginning of his career:

“You know,” I said to him over lunch, “I have this sense that I’m sitting here with someone who is going to become a great big star. You’re going to want to direct your own movies, produce your own movies, and you’re going to end up leaving your wife and going through the whole Hollywood movie-star cycle.”

Costner did, in fact, end up leaving his first wife. And if he also leaves film for television, even temporarily, it may reveal that “the whole Hollywood movie-star cycle” has a surprising final act that few of us could have anticipated.

Written by nevalalee

May 27, 2016 at 9:03 am

“Asthana glanced over at the television…”


"A woman was standing just over his shoulder..."

Note: This post is the eighteenth installment in my author’s commentary for Eternal Empire, covering Chapter 19. You can read the previous installments here.

A quarter of a century ago, I read a story about the actor Art Carney, possibly apocryphal, that I’ve never forgotten. Here’s the version told by the stage and television actress Patricia Wilson:

During a live performance of the original Honeymooners, before millions of viewers, Jackie [Gleason] was late making an entrance into a scene. He left Art Carney onstage alone, in the familiar seedy apartment set of Alice and Ralph Kramden. Unflappable, Carney improvised action for Ed Norton. He looked around, scratched himself, then went to the Kramden refrigerator and peered in. He pulled out an orange, shuffled to the table, and sat down and peeled it. Meanwhile frantic stage managers raced to find Jackie. Art Carney sat onstage peeling and eating an orange, and the audience convulsed with laughter.

According to some accounts, Carney stretched the bit of business out for a full two minutes before Gleason finally appeared. And while it certainly speaks to Carney’s ingenuity and resourcefulness, we should also take a moment to tip our hats to that humble orange, as well as the prop master who thought to stick it in the fridge—unseen and unremarked—in the first place.

Theatrical props, as all actors and directors know, can be a source of unexpected ideas, just as the physical limitations or possibilities of the set itself can provide a canvas on which the action is conceived in real time. I’ve spoken elsewhere of the ability of vaudeville comedians to improvise routines on the spot using whatever was available on a standing set, and there’s a sense in which the richness of the physical environment in which a scene takes place is a battery from which the performances can draw energy. When a director makes sure that each actor’s pockets are full of the litter that a character might actually carry, it isn’t just a mark of obsessiveness or self-indulgence, or even a nod toward authenticity, but a matter of storing up potential tools. A prop by itself can’t make a scene work, but it can provide the seed around which a memorable moment or notion can grow, like a crystal. In more situations than you might expect, creativity lies less in the ability to invent from scratch than to make effective use of whatever happens to lie at hand. Invention is a precious resource, and most artists have a finite amount of it; it’s better, whenever possible, to utilize what the world provides. And much of the time, when you’re faced with a hard problem to solve, you’ll find that the answer is right there in the background.

"Asthana glanced over at the television..."

This is as true of writing fiction as of any of the performing arts. In the past, I’ve suggested that this is the true purpose of research or location work: it isn’t about accuracy, but about providing raw material for dreams, and any writer faced with the difficult task of inventing a scene would be wise to exploit what already exists. It’s infinitely easier to write a chase scene, for example, if you’re tailoring it to the geography of a particular street. As usual, it comes back to the problem of making choices: the more tangible or physical the constraints, the more likely they’ll generate something interesting when they collide with the fundamentally abstract process of plotting. Even if the scene I’m writing takes place somewhere wholly imaginary, I’ll treat it as if it were being shot on location: I’ll pick a real building or locale that has the qualities I need for the story, pore over blueprints and maps, and depart from the real plan only when I don’t have any alternative. In most cases, the cost of that departure, in terms of the confusion it creates, is far greater than the time and energy required to make the story fit within an existing structure. For much the same reason, I try to utilize the props and furniture you’d naturally find there. And that’s all the more true when a scene occurs in a verifiable place.

Sometimes, this kind of attention to detail can result in surprising resonances. There’s a small example that I like in Chapter 19 of Eternal Empire. Rogozin, my accused intelligence agent, is being held without charges at a detention center in Paddington Green. This is a real location, and its physical setup becomes very important: Rogozin is going to be killed, in an apparent suicide, under conditions of heavy security. To prepare these scenes, I collected reference photographs, studied published descriptions, and shaped the action as much as possible to unfold logically under the constraints the location imposed. And one fact caught my eye, purely as a matter of atmosphere: the cells at Paddington Green are equipped with televisions, usually set to play something innocuous, like a nature video. This had obvious potential as a counterpoint to the action, so I went to work looking for a real video that might play there. And after a bit of searching, I hit on a segment from the BBC series Life in the Undergrowth, narrated by David Attenborough, about the curious life cycle of the gall wasp. The phenomenon it described, as an invading wasp burrows into the gall created by another, happened to coincide well—perhaps too well—with the story itself. As far as I’m concerned, it’s what makes Rogozin’s death scene work. And while I could have made up my own video to suit the situation, it seemed better, and easier, to poke around the stage first to see what I could find…

Written by nevalalee

May 7, 2015 at 9:11 am

The unbreakable television formula


Ellie Kemper in Unbreakable Kimmy Schmidt

Watching the sixth season premiere of Community last night on Yahoo—which is a statement that would have once seemed like a joke in itself—I was struck by the range of television comedy we have at our disposal these days. We’ve said goodbye to Parks and Recreation, we’re following Community into what is presumably its final stretch, and we’re about to greet Unbreakable Kimmy Schmidt as it starts what looks to be a powerhouse run on Netflix. These shows are superficially in the same genre: they’re single-camera sitcoms that freely grant themselves elaborate sight gags and excursions into surrealism, with a cutaway style that owes as much to The Simpsons as to Arrested Development. Yet they’re palpably different in tone. Parks and Rec was the ultimate refinement of the mockumentary style, with talking heads and reality show techniques used to flesh out a narrative of underlying sweetness; Community, as always, alternates between obsessively detailed fantasy and a comic strip version of emotions to which we can all relate; and Kimmy Schmidt takes place in what I can only call Tina Fey territory, with a barrage of throwaway jokes and non sequiturs designed to be referenced and quoted forever.

And the diversity of approach we see in these three comedies makes the dramatic genre seem impoverished. Most television dramas are still basically linear; they’re told using the same familiar grammar of establishing shots, medium shots, and closeups; and they’re paced in similar ways. If you were to break down an episode by shot length and type, or chart the transitions between scenes, an installment of Game of Thrones would look a lot on paper like one of Mad Men. There’s room for individual quirks of style, of course: the handheld cinematography favored by procedurals has a different feel from the clinical, detached camera movements of House of Cards. And every now and then, we get a scene—like the epic tracking shot during the raid in True Detective—that awakens us to the medium’s potential. But the fact that such moments are striking enough to inspire think pieces the next day only points to how rare they are. Dramas are just less inclined to take big risks of structure and tone, and when they do, they’re likely to be hybrids. Shows like Fargo or Breaking Bad are able to push the envelope precisely because they have a touch of black comedy in their blood, as if that were the secret ingredient that allowed for greater formal daring.

Jon Hamm on Mad Men

It isn’t hard to pin down the reason for this. A cutaway scene or extended homage naturally takes us out of the story for a second, and comedy, which is inherently more anarchic, has trained us to roll with it. We’re better at accepting artifice in comic settings, since we aren’t taking the story quite as seriously: whatever plot exists is tacitly understood to be a medium for the delivery of jokes. Which isn’t to say that we can’t care deeply about these characters; if anything, our feelings for them are strengthened because their stories take place in a stylized world that allows free play for the emotions. Yet this is also something that comedy had to teach us. It can be fun to watch a sitcom push the limits of plausibility to the breaking point, but if a drama deliberately undermines its own illusion of reality, we can feel cheated. Dramas that constantly draw attention to their own artifice, as Twin Peaks did, are more likely to become cult favorites than popular successes, since most of us just want to sit back and watch a story that presents itself using the narrative language we know. (Which, to be fair, is true of comedies as well: the three sitcoms I’ve mentioned above, taken together, have a fraction of the audience of something like The Big Bang Theory.)

In part, it’s a problem of definition. When a drama pushes against its constraints, we feel more comfortable referring to it as something else: Orange is the New Black, which tests its structure as adventurously as any series on the air today, has suffered at awards season from its resistance to easy categorization. But what’s really funny is that comedy escaped from its old formulas by appropriating the tools that dramas had been using for years. The three-camera sitcom—which has been responsible for countless masterpieces of its own—made radical shifts of tone and location hard to achieve, and once comedies liberated themselves from the obligation to unfold as if for a live audience, they could indulge in extended riffs and flights of imagination that were impossible before. It’s the kind of freedom that dramas, in theory, have always had, even if they utilize it only rarely. This isn’t to say that a uniformity of approach is a bad thing: the standard narrative grammar evolved for a reason, and if it gives us compelling characters with a maximum of transparency, that’s all for the better. Telling good stories is hard enough as it is, and formal experimentation for its own sake can be a trap in itself. Yet we’re still living in a world with countless ways of being funny, and only one way, within a narrow range of variations, of being serious. And that’s no laughing matter.

The crowded circle of television


The cast of Mad Men

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What’s your favorite TV show of the year so far?”

There are times when watching television can start to feel like a second job—a pleasurable one, to be sure, but one that demands a lot of work nevertheless. Over the last year, I’ve followed more shows than ever, including Mad Men, Game of Thrones, Orange is the New Black, Hannibal, Community, Parks and Recreation, House of Cards, The Vampire Diaries, and True Detective. For the most part, they’ve all had strong runs, and I’d have trouble picking a favorite. (If pressed, I’d probably go with Mad Men, if only for old times’ sake, with Hannibal as a very close second.) They’re all strikingly different in emphasis, tone, and setting, but they also have a lot in common. With one exception, which I’ll get to in a moment, these are dense shows with large casts and intricate storylines. Many seem devoted to pushing the limits of how much complexity can be accommodated within the constraints of the television format, which may be why the majority run for just ten to thirteen episodes: it’s hard to imagine that level of energy sustained over twenty or more installments.

And while I’m thrilled by the level of ambition visible here, it comes at a price. There’s a sort of arms race taking place between media of all kinds, as they compete to stand out in an increasingly crowded space with so much competing for our attention. Books, even literary novels, are expected to be page-turners; movies offer up massive spectacle to the point where miraculous visual effects are taken for granted; and television has taken to packing every minute of narrative time to the bursting point. (This isn’t true of all shows, of course—a lot of television series are still designed to play comfortably in the background of a hotel room—but it’s generally the case with prestige shows that end up on critics’ lists and honored at award ceremonies.) This trend toward complexity arises from a confluence of factors I’ve tried to unpack here before: just as The Simpsons was the first freeze-frame sitcom, modern television takes advantage of our streaming and binge-watching habits to deliver storytelling that rewards, and even demands, close attention.

Matthew McConaughey on True Detective

For the most part, this is a positive development. Yet there’s also a case to be made that television, which is so good at managing extended narratives and enormous casts of characters, is also uniquely suited for the opposite: silence, emptiness, and contemplation. In a film, time is a precious commodity, and when you’re introducing characters while also setting in motion the machinery of a complicated story, there often isn’t time to pause. Television, in theory, should be able to stretch out a little, interspersing relentless forward momentum with moments of quiet, which are often necessary for viewers to consolidate and process what they’ve seen. Twin Peaks was as crowded and plotty as any show on the air today, but it also found time for stretches of weird, inexplicable inaction, and it’s those scenes that I remember best. Even in the series finale, with so many threads to address and only forty minutes to cover them all, it devotes endless minutes to Cooper’s hallucinatory—and almost entirely static—ordeal in the Black Lodge, and even to a gag involving a decrepit bank manager rising from his desk and crossing the floor of his branch very, very slowly.

So while there’s a lot of fun to be had with shows that constantly accelerate the narrative pace, it can also be a limitation, especially when it’s handled less than fluently. (For every show, like Orange is the New Black, that manages to cut expertly between subplots, there’s another, like Game of Thrones, that can’t quite seem to handle its enormous scope, and even The Vampire Diaries is showing signs of strain.) Both Hannibal and Mad Men know when to linger on an image or revelation—roughly half of Hannibal is devoted to contemplating its other half—and True Detective, in particular, seemed to consist almost entirely of such pauses. We remember such high points as the final chase with the killer or the raid in “Who Goes There,” but what made the show special were the scenes in which nothing much seemed to be happening. It was aided in this by its limited cast and its tight focus on its two leads, so it’s possible that what shows really need to slow things down are a couple of movie stars to hold the eye. But it’s a step in the right direction. If time is a flat circle, as Rust says, so is television, and it’s good to see it coming back around.

The dreamlife of television


Aaron Paul on Breaking Bad

I’ve been dreaming a lot about Breaking Bad. On Wednesday, my wife and I returned from a trip to Barcelona, where we’d spent a beautiful week: my baby daughter was perfectly happy to be toted around various restaurants, cultural sites, and the Sagrada Familia, and it came as a welcome break from my own work. Unfortunately, it also meant that we were going to miss the Breaking Bad finale, which aired the Sunday before we came home. For a while, I seriously considered bringing my laptop and downloading it while we were out of the country, both because I was enormously anxious to see how the show turned out and because I dreaded the spoilers I’d have to avoid for the three days before we returned. In the end, I gritted my teeth and decided to wait until we got home. This meant avoiding most of my favorite news and pop cultural sites—I was afraid to even glance past the top few headlines on the New York Times—and staying off Twitter entirely, which I suppose wasn’t such a great loss. And even as we toured the Picasso Museum and walked for miles along the marina with a baby in tow, my thoughts were rarely very far from Walter White.

This must have done quite a number on my psyche, because I started dreaming about the show with alarming frequency. My dreams included two separate, highly elaborated versions of the finale, one of which was a straightforward bloodbath with a quiet epilogue, the other a weird metafictional conclusion in which the events of the series were played out on a movie screen with the cast and crew watching them unfold—which led me to exclaim, while still dreaming: “Of course that’s how they would end it!” Now that I’ve finally seen the real finale, the details of these dreams are fading, and only a few scraps of imagery remain. Yet the memories are still emotionally charged, and they undoubtedly affected how I approached the last episode itself, which I was afraid would never live up to the versions I’d dreamed for myself. I suspect that a lot of fans, even those who didn’t actually hallucinate alternate endings, probably felt the same way. (For the record, I liked the finale a lot, even if it ranks a notch below the best episodes of the show, which was always best at creating chaos, not resolving it. And I think about its closing moments almost every day.)

Jon Hamm on Mad Men

And it made me reflect on the ways in which television, especially in its modern, highly serialized form, is so conducive to dreaming. Dreams are a way of assembling and processing fragments of the day’s experience, or recollections from the distant past, and a great television series is nothing less than a vast storehouse of memories from another life. When a show is as intensely serialized as Breaking Bad was, it can be hard to remember individual episodes, aside from the occasional formal standout like “Fly”: I can’t always recall what scenes took place when, or in what order, and an especially charged sequence of installments—like the last half of this final season—tends to blend together into a blur of vivid impressions. What I remember are facial expressions, images, bits of dialogue: “Stay out of my territory.” “Run.” “Tread lightly.” And the result is a mine of moments that end up naturally incorporated into my own subconscious. A good movie or novel exists as a piece, and I rarely find myself dreaming alternate lives for, say, Rick and Ilsa or Charles Foster Kane. With Walter White, it’s easy to imagine different paths that the action could have taken, and those byways play themselves out in the deepest parts of my brain.

Which may explain why television is so naturally drawn to dream sequences and fantasies, which are only one step removed from the supposedly factual events of the shows themselves. Don Draper’s dreams have become a huge part of Mad Men, almost to the point of parody, and this has always been an art form that attracts surreal temperaments, from David Lynch to Bryan Fuller, even if they tend to be destroyed by it. As I’ve often said before, it’s the strangest medium I know, and at its best, it’s the outcome of many unresolved tensions. Television can feel maddeningly real, a hidden part of your own life, which is why it can be so hard to say goodbye to a great show. It’s also impossible to get a lasting grip on it or to hold it all in your mind at once, especially if it runs for more than a few seasons, which hints at an even deeper meaning. I’ve always been struck by how poorly we integrate the different chapters in our own past: there are entire decades of my life that I don’t think about for months on end. When they return, it’s usually in the hours just before waking. And by teaching us to process narratives that can last for years, it’s possible that television subtly trains us to better understand the shapes of our own lives, even if it’s only in dreams.

Written by nevalalee

October 7, 2013 at 8:27 am


Critical television studies


The cast of Community

Television is such a pervasive medium that it’s easy to forget how deeply strange it is. Most works of art are designed to be consumed all at once, or at least in a fixed period of time—it’s physically possible, if not entirely advisable, to read War and Peace in one sitting. Television, by contrast, is defined by the fact of its indefinite duration. House of Cards aside, it seems likely that most of us will continue to watch shows week by week, year after year, until they become a part of our lives. This kind of extended narrative can be delightful, but it’s also subject to risk. A beloved show can change for reasons beyond anyone’s control. Sooner or later, we find out who killed Laura Palmer. An actor’s contract expires, so Mulder is abducted by aliens, and even if he comes back, by that point, we’ve lost interest. For every show like Breaking Bad that has its dark evolution mapped out for seasons to come, there’s a series like Glee, which disappoints, or Parks and Recreation, which gradually reveals a richness and warmth that you’d never guess from the first season alone. And sometimes a show breaks your heart.

It’s clear at this point that the firing of Dan Harmon from Community was the most dramatic creative upheaval for any show in recent memory. This isn’t the first time that a show’s guiding force has departed under less than amicable terms—just ask Frank Darabont—but it’s unusual in a series so intimately linked to one man’s particular vision. Before I discovered Community, I’d never heard of Dan Harmon, but now I care deeply about what this guy feels and thinks. (Luckily, he’s never been shy about sharing this with the rest of us.) And although it’s obvious from the opening minutes of last night’s season premiere that the show’s new creative team takes its legacy seriously, there’s no escaping the sense that they’re a cover band doing a great job with somebody else’s music. Showrunners David Guarascio and Moses Port do their best to convince us out of the gate that they know how much this show means to us, and that’s part of the problem. Community was never a show about reassuring us that things won’t change, but about unsettling us with its endless transformations, even as it delighted us with its new tricks.

The Community episode "Remedial Chaos Theory"

Don’t get me wrong: I laughed a lot at last night’s episode, and I was overjoyed to see these characters again. By faulting the new staff for repeating the same beats I loved before, when I might have been outraged by any major alterations, I’m setting it up so they just can’t win. But the show seems familiar now in a way that would have seemed unthinkable for most of its first three seasons. Part of the pleasure of watching the series came from the fact that you never knew what the hell might happen next, and it wasn’t clear if Harmon knew either. Not all of his experiments worked: there were even some clunkers, like “Messianic Myths and Ancient Peoples,” in the glorious second season, which is one of my favorite runs of any modern sitcom. But as strange as this might have once seemed, it feels like we finally know what Community is about. It’s a show that takes big formal risks, finds the emotional core in a flurry of pop culture references, and has no idea how to use Chevy Chase. And although I’m grateful that this version of the show has survived, I don’t think I’m going to tune in every week wondering where in the world it will take me.

And the strange thing is that Community might have gone down this path with or without Harmon. When a show needs only two seasons to establish that anything is possible, even the most outlandish developments can seem like variations on a theme. Even at the end of the third season, there was the sense that the series was repeating itself. I loved “Digital Estate Planning,” for instance, but it felt like the latest attempt to do one of the formally ambitious episodes that crop up at regular intervals each season, rather than an idea that forced itself onto television because the writers couldn’t help themselves. In my review of The Master, I noted that Paul Thomas Anderson has perfected his brand of hermetic filmmaking to the point where it would be more surprising if he made a movie that wasn’t ambiguous, frustrating, and deeply weird. Community has ended up in much the same place, so maybe it’s best that Harmon got out when he did. It’s doubtful that the series will ever be able to fake us out with a “Critical Film Studies” again, because it’s already schooled us, like all great shows, in how it needs to be watched. And although its characters haven’t graduated from Greendale yet, its viewers, to their everlasting benefit, already have.

Written by nevalalee

February 8, 2013 at 9:50 am

Wouldn’t it be easier to write for television?


Last week, I had dinner with a college friend I hadn’t seen in years, who is thinking about giving up a PhD in psychology to write for television in Los Angeles. We spent a long time commiserating about the challenges of the medium, at least from a writer’s point of view, hitting many of the points that I’ve discussed here before. With the prospects of a fledgling television show so uncertain, I said, especially when the show might be canceled after four episodes, or fourteen, or forty, it’s all but impossible for the creator to tell effective stories over time. Running a television show is one of the hardest jobs in the world, with countless obstacles along the way, even for critical darlings. Knowing all this, I asked my friend, why did he want to do this in the first place?

My friend’s response was an enlightening one. The trouble with writing novels or short stories, he said, is the fact that the author is expected to spend a great deal of time on description, style, and other tedious elements that a television writer can cheerfully ignore. Teleplays, like feature scripts, are nothing but structure and dialogue (or maybe just structure, as William Goldman says), and there’s something liberating in how they strip storytelling down to its core. The writer takes care of the bones of the narrative, which is where his primary interest presumably lies, then outsources the work of casting, staging, and art direction to qualified professionals who are happy to do the work. And while I didn’t agree with everything my friend said, I could certainly see his point.

Yet that’s only half of the story. It’s true that a screenwriter gets to outsource much of the conventional apparatus of fiction to other departments, but only at the price of creative control. You may have an idea about how a character should look, or what kind of home he should have, or how a moment of dialogue, a scene, or an overall story should unfold, but as a writer, you don’t have much control over the matter. Scripts are easier to write than novels for a reason: they’re only one piece of a larger enterprise, which is reflected in the writer’s relative powerlessness. The closest equivalent to a novelist in television isn’t the writer, but the executive producer. Gene Roddenberry, in The Making of Star Trek, neatly sums up the similarity between the two roles:

Producing in television is like storytelling. The choice of the actor, picking the right costumes, getting the right flavor, the right pace—these are as much a part of storytelling as writing out that same description of a character in a novel.

And the crucial point about producing a television series, like directing a feature film, is that it’s insanely hard. As Thomas Lennon and Robert Ben Garant point out in their surprisingly useful Writing Movies for Fun and Profit, as far as directing is concerned, “If you’re doing it right, it’s not that fun.” As a feature director or television producer, you’re responsible for a thousand small but critical decisions that need to be made very quickly, and while you’re working on the story, you’re also casting parts, scouting for locations, dealing with the studio and the heads of various departments, and surviving on only a few hours of sleep a night, for a year or more of your life. In short, the amount of effort required to keep control of the project is greater, not less, than what is required to write a novel—except with more money on the line, in public, and with greater risk that control will eventually be taken away from you.

So is it easier to write for television? Yes, if that’s all you want to do. But if you want control of your work, if you want your stories to be experienced in a form close to what you originally envisioned, it isn’t easier. It’s much harder. Which is why, to my mind, John Irving still puts it best: “When I feel like being a director, I write a novel.”

Lessons from great (and not-so-great) television


It can be hard for a writer to admit being influenced by television. In On Becoming a Novelist, John Gardner struck a disdainful note that hasn’t changed much since:

Much of the dialogue one encounters in student fiction, as well as plot, gesture, even setting, comes not from life but from life filtered through TV. Many student writers seem unable to tell their own most important stories—the death of a father, the first disillusionment in love—except in the molds and formulas of TV. One can spot the difference at once because TV is of necessity—given its commercial pressures—false to life.

In the nearly thirty years since Gardner wrote these words, the television landscape has changed dramatically, but it’s worth pointing out that much of what he says here is still true. The basic elements of fiction—emotion, character, theme, even plot—need to come from close observation of life, or even the most skillful novel will eventually ring false. That said, the structure of fiction, and the author’s understanding of the possibilities of the form, doesn’t need to come from life alone, and probably shouldn’t. To develop a sense of what fiction can do, a writer needs to pay close attention to all types of art, even the nonliterary kind. And over the past few decades, television has expanded the possibilities of narrative in ways that no writer can afford to ignore.

If you think I’m exaggerating, consider a show like The Wire, which tells complex stories involving a vast range of characters, locations, and social issues in ways that aren’t possible in any other medium. The Simpsons, at least in its classic seasons, acquired a richness and velocity that continued to build for years, until it had populated a world that rivaled the real one for density and immediacy. (Like the rest of the Internet, I respond to most situations with a Simpsons quote.) And Mad Men continues to furnish a fictional world of astonishing detail and charm. World-building, it seems, is where television shines: in creating a long-form narrative that begins with a core group of characters and explores them for years, until they can come to seem as real as one’s own family and friends.

Which is why Glee can seem like such a disappointment. Perhaps because the musical is already the archest of genres, the show has always regarded its own medium with an air of detachment, as if the conventions of the after-school special or the high school sitcom were merely a sandbox in which the producers could play. On some level, this is fine: The Simpsons, among many other great shows, has fruitfully treated television as a place for narrative experimentation. But by turning its back on character continuity and refusing to follow any plot for more than a few episodes, Glee is abandoning many of the pleasures that narrative television can provide. Watching the show run out of ideas for its lead characters in less than two seasons simply serves as a reminder of how challenging this kind of storytelling can be.

Mad Men, by contrast, not only gives us characters who take on lives of their own, but consistently lives up to those characters in its acting, writing, and direction. (This is in stark contrast to Glee, where I sense that a lot of the real action is taking place in fanfic.) And its example has changed the way I write. My first novel tells a complicated story with a fairly controlled cast of characters, but Mad Men—in particular, the spellbinding convergence of plots in “Shut the Door, Have a Seat”—reminded me of the possibilities of expansive casts, which allow characters to pair off and develop in unexpected ways. (The evolution of Christina Hendricks’s Joan from eye candy to second lead is only the most obvious example.) As a result, I’ve tried to cast a wider net with my second novel, using more characters and settings in the hopes that something unusual will arise. Television, strangely, has made me more ambitious. I’d like to think that even John Gardner would approve.

Written by nevalalee

March 17, 2011 at 8:41 am

The faults in our stars


In his wonderful conversational autobiography Cavett, Dick Cavett is asked about his relationship with Johnny Carson, for whom he served as a writer on The Tonight Show. Cavett replies:

I did work for Carson. We didn’t go fishing together on weekends, and I never slept over at his house, the two of us lying awake in our jammies eating the fudge we had made together, talking of our dreams and hopes and fears. But I found him to be cordial and businesslike, and to have himself well in hand as far as the show was concerned…He is not a man who seems to seek close buddies, and, if he were, the staff of his own television show would not be the ideal place to seek them.

It’s a memorable passage, especially the last line, which seems particularly relevant at a time when our talk show hosts are eager to seem accessible to everybody, and to depict their writing staffs as one big happy family. When asked to comment on the widespread notion that Carson was “cold,” Cavett responds:

I know very little about Johnny’s personal relationships. I have heard that he has been manipulated and screwed more than once by trusted associates, to the point where he is defensively wary to what some find an excessive degree. I see this as a perfectly reasonable response. It is, I suppose, the sort of thing that happens to a person in show business that makes his former friends say, with heavy disapprobation, “Boy, has he changed.”

Cavett could easily let the subject rest there, but something in the question seems to stick in his mind, and he continues:

While I’m at it, I’ll do a short cadenza on the subject of changing. If you are going to survive in show business, the chances are you are going to change or be changed. Whatever your reasons for going into the business, it is safe to admit they form a mixture of talent, ambition, and neurosis. If you are going to succeed and remain successful, you are going to do it at the expense of a number of people who are clamoring to climb the same rope you are climbing. When you suddenly acquire money, hangers-on, well-wishers, and ill-wishers; when you need to make baffling decisions quickly, to do too much in too little time, to try to lead a personal and a professional life when you can’t seem to find the time for either; when you have to kick some people’s fannies and kiss others’ to get to the point where you won’t need to do either any more; when you have to sort out conflicting advice, distinguish between the treacherous and the faithful or the competent and the merely aggressive, suffer fools when time is short and incompetents when you are in a pinch; and when you add to this the one thing that you don’t get in other professions—the need to be constantly fresh and presentable and at your best just at the times when you are bone-weary, snappish, and depressed; when all these things apply, it is possible that you are going to be altered, changed, and sometimes for the worse.

This is one of the best things I’ve ever read about show business, and if anything, it feels even more insightful today, when we collectively have so much invested in the idea that stars have inner lives that are more or less like our own.

It’s often been said that the reason that television actors have trouble crossing over to the movies is that we expect different things from our stars in either medium. One requires a personality that is larger than life, which allows it to survive being projected onto an enormous screen in a darkened theater; the other is a person whom we’d feel comfortable inviting on a regular basis into our living rooms. If that’s true of scripted television that airs once a week, it’s even more true of the talk shows that we’re expected to watch every night. And now that the online content created by such series has become so central to their success, we’re rapidly approaching this trend’s logical culmination: a talk show host has to be someone whose face we’d be comfortable seeing anywhere, at any time. This doesn’t just apply to television, either. As social media is increasingly called upon to supplement the marketing budgets of big movies, actors are obliged to make themselves accessible—on Twitter, on Instagram, as good sports on Saturday Night Live and in viral videos—to an extent that a star of the old studio system of the forties would have found utterly baffling. Deadline’s writeup of Alien: Covenant is typical:

RelishMix…assessed that Alien: Covenant has a strong social media universe…spread across Twitter, Facebook, YouTube views and Instagram followers…The company also adds that Covenant was challenged by a generally inactive cast, with Empire’s Jussie Smollett being the most popular activated star. Danny McBride across Twitter, Instagram and Facebook counts over 250,000. Michael Fassbender is not socially active.

I love the implication that stars these days need to be “activated,” like cell phones, to be fully functional, as well as the tone of disapproval at the fact that Michael Fassbender isn’t socially active. It’s hard to imagine how that would even look: Fassbender’s appeal as an actor emerges largely from his slight sense of reserve, even in showy parts. But in today’s climate, you could also argue that this has hampered his rise as a star.

And Cavett’s cadenza on change gets at an inherent tension in the way we see our stars, which may not be entirely sustainable. In The Way of the Gun, written and directed by Christopher McQuarrie, who knows more than anyone about survival in Hollywood, the character played by James Caan says: “The only thing you can assume about a broken-down old man is that he’s a survivor.” Similarly, the only thing you can assume about a movie star, talk show host, or any other figure in show business whose face you recognize is that he or she possesses superhuman levels of ambition. Luck obviously plays a large role in success, as does talent, but both require a preternatural drive, which is the matrix in which giftedness and good fortune have a chance to do their work. Ambition may not be sufficient, but it’s certainly necessary. Yet we persist in believing that stars are accessible and ordinary, when, by definition, they can hardly be other than extraordinary. It’s a myth that emerges from the structural assumptions of social media, a potent advertising tool that demands a kind of perceptual leveling to be effective. I was recently delighted to learn that the notorious feature “Stars—They’re Just Like Us!” originated when the editors at Us Magazine had to figure out how to use the cheap paparazzi shots that they could afford to buy on their tiny budget, like a picture of Drew Barrymore picking up a penny. Social media works in much the same way. It creates an illusion of intimacy that is as false as the airbrushed images of the movie stars of Hollywood’s golden age, and it deprives us of some of the distance required for dreams. Whether or not they want to admit it, stars, unlike the rich, truly are different. And I’ll let Cavett have the last word:

Unless you are one of these serene, saintly individuals about whom it can be truly said, “He or she hasn’t changed one bit from the day I knew them in the old house at Elm Street.” This is true mostly of those who have found others to do their dirty work for them. All I’m saying is that your demands and needs change, and if you don’t change with them you don’t survive.

Written by nevalalee

May 24, 2017 at 9:53 am

Quote of the Day


The word “show” suggests that you’re revealing something. It doesn’t suggest finding. And because I do what I do every day, I have to make sure that the showing of things is in itself the seeking for things.

Es Devlin, on the television series Abstract

Written by nevalalee

May 24, 2017 at 7:30 am


The darkness of future past


Note: Spoilers follow for the first two episodes of the third season of Twin Peaks.

“Is it future, or is it past?” Mike, the one-armed man, asks Cooper in the Black Lodge. During the premiere of the belated third season of Twin Peaks, there are times when it seems to be both at once. We often seem to be in familiar territory, and the twinge of recognition that it provokes has a way of alerting us to aspects of the original that we may have overlooked. When two new characters, played appealingly—and altogether too briefly—by Ben Rosenfield and Madeline Zima, engage in an oddly uninflected conversation, it’s a reminder of the appealingly flat tone that David Lynch likes to elicit from his actors, who sometimes seem to be reading their lines phonetically, like the kids in a Peanuts cartoon. It isn’t bad or amateurish acting, but an indication that even the performers aren’t entirely sure what they’re doing there. In recent years, accomplished imitators from Fargo to Legion have drawn on Lynch’s style, but they’re fully conscious of it, and we’re aware of the technical trickery of such players as Ewan McGregor or Dan Stevens. In Lynch’s best works, there’s never a sense that anyone involved is standing above or apart from the material. (The major exceptions are Dennis Hopper and Dean Stockwell in Blue Velvet, who disrupt the proceedings with their own brand of strangeness, and, eerily, Robert Blake in Lost Highway.) The show’s original cast included a few artful performers, notably Ray Wise and the late Miguel Ferrer, but most of the actors were endearingly unaffected. They were innocents. And innocence is a quality that we haven’t seen on television in a long time.

Yet it doesn’t take long to realize that some things have also changed. There’s the heightened level of sex and gore, which reflects the same kind of liberation from the standards of network television that made parts of Fire Walk With Me so difficult to watch. (I’d be tempted to observe that its violence against women is airing at a moment in which such scenes are likely to be intensely scrutinized, if it weren’t for the fact that Lynch has been making people uncomfortable in that regard for over thirty years.) The show is also premiering in an era in which every aspect of it will inevitably be picked apart in real time on social media, which strikes me as a diminished way of experiencing it. Its initial run obviously prompted plenty of theorizing around the nation’s water coolers, but if there’s anything that Twin Peaks has taught us, it’s that the clues are not what they seem. Lynch is a director who starts with a handful of intuitive images that are potent in themselves—an empty glass cube, a severed head, a talking tree. You could call them dreamlike, or the fruits of the unconscious, or the products, to use a slightly dated term, of the right hemisphere of the brain. Later on, the left hemisphere, which is widely but misleadingly associated with Lynch’s collaborator Mark Frost, circles back and tries to impose meaning on those symbols, but these readings are never entirely convincing. Decades ago, when the show tried to turn Cooper’s dream of the Black Lodge into a rebus for the killer’s identity, you could sense that it was straining. There isn’t always a deeper answer to be found, aside from the power of those pictures, which should be deep enough in itself.

As a result, I expect to avoid reading most reviews or analysis, at least until the season is over. Elements that seem inexplicable now may or may not pay off, but the series deserves the benefit of the doubt. This isn’t to say that what we’ve seen so far has been perfect: Twin Peaks, whatever else it may have been, was never a flawless show. Kyle MacLachlan has been as important to my inner life as any actor, but I’m not sure whether he has the range to convincingly portray Dark Cooper. He’s peerless when it comes to serving as the director’s surrogate, or a guileless ego wandering through the wilderness of the id, but he isn’t Dennis Hopper, and much of this material might have been better left to implication. Similarly, the new sequences in the Black Lodge are striking—and I’ve been waiting for them for what feels like my entire life—but they’re also allowed to run for too long. Those original scenes were so memorable that it’s easy to forget that they accounted for maybe twenty minutes, stretched across two seasons, and that imagination filled in the rest. (A screenshot of Cooper seated with the Man from Another Place was the desktop image on my computer for most of college.) If anything, the show seems almost too eager to give us more of Cooper in those iconic surroundings, and half as much would have gone a long way. In the finale of the second season, when Cooper stepped through those red curtains at last, it felt like the culmination of everything that the series had promised. Now it feels like a set where we have to linger for a while longer before the real story can begin. It’s exactly what the Man from Another Place once called it: the waiting room.

Lynch and Frost seem to be reveling in the breathing space and creative freedom that eighteen full hours on Showtime can afford, and they’ve certainly earned that right. But as I’ve noted elsewhere, Twin Peaks may have benefited from the constraints that a broadcast network imposed, just as Wild at Heart strikes me as one of the few films to have been notably improved by being edited for television. When Lynch made Blue Velvet, he and editor Duwayne Dunham, who is also editing the new season, were forced to cut the original version to the bone to meet their contractually mandated runtime, and the result was the best American movie I’ve ever seen. Lynch’s most memorable work has been forced to work within similar limitations, and I’m curious to see how it turns out when most of those barriers are removed. (I still haven’t seen any of the hours of additional footage that were recently released from Fire Walk With Me, but I wish now that I’d taken the trouble to seek them out. The prospect of viewing those lost scenes is less exciting, now that we’re being given the equivalent of a sequel that will be allowed to run for as long as it likes.) In the end, though, these are minor quibbles. When I look back at the first two seasons of Twin Peaks, I’m startled to realize how little of it I remember: it comes to about three hours of unforgettable images, mostly from the episodes directed by Lynch. If the first two episodes of the new run are any indication, it’s likely to at least double that number, which makes it a good deal by any standard. Twin Peaks played a pivotal role in my own past. And I still can’t entirely believe that it’s going to be part of my future, too.

Written by nevalalee

May 23, 2017 at 10:32 am

The voice of love


Industrial Symphony No. 1

Note: I can’t wait to write about the return of Twin Peaks, which already feels like the television event of my lifetime, but I won’t be able to get to it until tomorrow. In the meantime, I’m reposting my piece on the show’s indelible score, which originally appeared, in a slightly different form, on August 10, 2016.

At some point, everyone owns a copy of The Album. The title or the artist differs from one person to another, but its impact on the listener is the same: it simply alerts you to the fact that it can be worth devoting every last corner of your inner life to music, rather than treating it as a source of background noise or diversion. It’s the first album that leaves a mark on your soul. Usually, it makes an appearance as you’re entering your teens, which means that there’s as much random chance involved as in any of the other cultural influences that dig in their claws at that age. You don’t have a lot of control over what it will be. Maybe it begins with a song on the radio, or a piece of art that catches your eye at a record store, or a stab of familiarity that comes from a passing moment of exposure. (In your early teens, you’re likely to love something just because you recognize it.) Whatever it is, unlike every other album you’ve ever heard, it doesn’t let you go. It gets into your dreams. You draw pictures of the cover and pick out a few notes from it on every piano you pass. And it shapes you in ways that you can’t fully articulate. The particular album that fills that role is different for everyone, or so it seems, although logic suggests that it’s probably the same for a lot of teenagers at any given time. In fact, I think that you can draw a clear line between those whom the Album immersed deeply in the culture of their era and those who wound up estranged from it. I’d be a different person—and maybe a happier one—if mine had been something like Nevermind. But it wasn’t. It was the soundtrack from Twin Peaks, followed by Julee Cruise’s Floating Into the Night.

If I had been born a few years earlier, this might not have been an issue, but I happened to get seriously into Twin Peaks, or at least its score, shortly after the series itself had ceased to be a cultural phenomenon. The finale had aired two full years beforehand, and it had been followed soon thereafter, with what seems today like startling speed, by Twin Peaks: Fire Walk With Me. After that, it mostly disappeared. There wasn’t even a chance for me to belatedly get into the show itself. I’d watched some of it back when it initially ran, including the pilot and the horrifying episode in which the identity of Laura’s killer is finally revealed. The European cut of the premiere was later released on video, but aside from that, I had to get by with a few grainy episodes that my parents had recorded on VHS. It wasn’t until many years later that the first box set became available, allowing me to fully experience a show that I ultimately ended up loving, even if it was far more uneven—and often routine—than its reputation had led me to believe. But that didn’t really matter. Twin Peaks was just a television show, admittedly an exceptional one, but the score by Angelo Badalamenti was something else: a vision of a world that was complete in itself. I’d have trouble conveying exactly what it represents, except that it takes place in the liminal area where a gorgeous nightmare shades imperceptibly into the everyday. In Blue Velvet, which I still think is David Lynch’s greatest achievement, Jeffrey expresses it as simply as possible: “It’s a strange world.” But you can hear it more clearly in “Laura Palmer’s Theme,” which Badalamenti composed in response to Lynch’s instructions:

Start it off foreboding, like you’re in a dark wood, and then segue into something beautiful to reflect the trouble of a beautiful teenage girl. Then, once you’ve got that, go back and do something that’s sad and go back into that sad, foreboding darkness.

And it wasn’t until years later that they realized that the song had the visual structure of a pair of mountain peaks, arranged side by side. It’s a strange world indeed.

Soundtrack from Twin Peaks

If all forms of art, as the critic Walter Pater famously observed, aspire to the condition of music, then it isn't an exaggeration to say that Twin Peaks aspired to the sublimity of its own soundtrack. Badalamenti's score did everything that the series itself often struggled to accomplish, and there were times when I felt that the music was the primary work, with the show as a kind of visual adjunct. I still feel that way, on some level, about Fire Walk With Me: the movie played an important role in my life, but I don't have a lot of interest in rewatching it, while I know every note of its soundtrack by heart. And even if I grant that a score is never really complete in itself, the music of Twin Peaks pointed toward an even more intriguing artifact. It included three tracks—"The Nightingale," "Into the Night," and "Falling"—sung by Julee Cruise, with music by Badalamenti and lyrics by Lynch, who had earlier written her haunting song "Mysteries of Love" for Blue Velvet. I loved them all, and I can still remember the moment when a close reading of the liner notes clued me into the fact that there was an entire album by Cruise, Floating Into the Night, that I could actually own. (In fact, there were two. As it happened, my brainstorm occurred only a few months after the release of The Voice of Love, a less coherent sophomore album that I wouldn't have missed for the world.) Listening to it for the first time, I felt like the narrator of Borges's "Tlön, Uqbar, Orbis Tertius," who once saw a fragment of an undiscovered country, and now found himself confronted with all of it at once. The next few years of my life were hugely eventful, as they are for every teenager. I read, did, and thought about a lot of things, some of which are paying off only now. But whatever else I was doing, I was probably listening to Floating Into the Night.

Last year, when I heard that the Twin Peaks soundtrack was coming out in a deluxe vinyl release, it filled me with mixed feelings. (Of course, I bought a copy, and so should you.) The plain fact is that toward the end of my teens, I put Badalamenti and Cruise away, and I haven’t listened to them much since. Which isn’t to say that I didn’t give them a lifetime’s worth of listening in the meantime. I became obsessed with Industrial Symphony No. 1: The Dream of the Brokenhearted, the curious performance piece, directed by Lynch, in which Cruise floats on wires high above the stage at the Brooklyn Academy of Music, not far from the neighborhood where I ended up spending most of my twenties. Much later, I saw Cruise perform, somewhat awkwardly, in person. I tracked down her collaborations and guest appearances—including the excellent “If I Survive” with Hybrid—and even bought her third album, The Art of Being a Girl, which I liked a lot. Somehow I never got around to buying the next one, though, and long before I graduated from college, Cruise and Badalamenti had all but disappeared from my personal rotation. And I regret this. I still feel that Floating Into the Night is a perfect album, although it wasn’t until years later, when I heard Cruise’s real, hilariously brassy voice in her interviews, that I realized the extent to which I’d fallen in love with an ironic simulation. There are moments when I believe, with complete seriousness, that I’d be a better person today if I’d kept listening to this music: half of my life has been spent trying to live up to the values of my early adolescence, and I might have had an easier job of integrating all of my past selves if they shared a common soundtrack. Whenever I play it now, it feels like a part of me that has been locked away, ageless and untouched, in the Black Lodge. But life has a way of coming full circle. As Laura says to Cooper: “I’ll see you again in twenty-five years. Meanwhile…” And it feels sometimes as if she were talking to me.

Hollywood in Limbo

with 2 comments

In his essay on the fourth canto of Dante’s Inferno, which describes the circle of Limbo populated by the souls of virtuous pagans, Jorge Luis Borges discusses the notion of the uncanny, which has proven elusively hard to define:

Toward the beginning of the nineteenth century, or the end of the eighteenth, certain adjectives of Saxon or Scottish origin (eerie, uncanny, weird) came into circulation in the English language, serving to define those places or things that vaguely inspire horror…In German, they are perfectly translated by the word unheimlich; in Spanish, the best word may be siniestro.

I was reminded of this passage while reading, of all things, Benjamin Wallace’s recent article in Vanity Fair on the decline of National Lampoon. It’s a great piece, and it captures the sense of uncanniness that I’ve always associated with a certain part of Hollywood. Writing of the former Lampoon head Dan Laikin, Wallace says:

Poor choice of partners proved a recurring problem. Unable to get traction with the Hollywood establishment, Laikin appeared ready to work with just about anyone. “There were those of us who’d been in the business a long time,” [development executive Randi] Siegel says, “who told him not to do business with certain people. Dan had a tendency to trust people that were probably not the best people to trust. I think he wanted to see the good in it and change things.” He didn’t necessarily have much choice. If you’re not playing in Hollywood’s big leagues, you’re playing in its minors, which teem with marginal characters…“Everyone Danny hung out with was sketchy,” says someone who did business with Laikin. Laikin, for his part, blames the milieu: “I’m telling you, I don’t surround myself with these people. I don’t search them out. They’re all over this town.”

Years ago, I attended a talk by David Mamet in which he said something that I’ve never forgotten. Everybody gets a break in Hollywood after twenty-five years, but some get it at the beginning and others at the end, and the important thing is to be the one who stays after everyone else has gone home. Wallace’s article perfectly encapsulates that quality, which I’ve always found fascinating, perhaps because I’ve never had to live with it. It results in a stratum of players in the movie and television industry who haven’t quite broken through, but also haven’t reached the point where they drop out entirely. They end up, in short, in a kind of limbo, which Borges vividly describes in the same essay:

There is something of the oppressive wax museum about this still enclosure: Caesar, armed and idle; Lavinia, eternally seated next to her father…A much later passage of the Purgatorio adds that the shades of the poets, who are barred from writing, since they are in the Inferno, seek to distract their eternity with literary discussions.

You could say that the inhabitants of Hollywood's own circle of Limbo, who are barred from actually making movies, seek to distract their eternity by talking about the movies that they wish they could make. It's easy to mock them, but there's also something weirdly ennobling about their sheer persistence. They're survivors in a profession where few of us would have lasted, if we even had the courage to go out there in the first place, and at a time when such people seem more likely to end up at something like the Fyre Festival, it's nice to see that they still exist in Hollywood.

So what is it about the movie industry that draws and retains such personalities? One of its most emblematic figures is Robert Towne, who, despite his Oscar for Chinatown and his reputation as the dean of American screenwriters, has spent his entire career looking like a man on the verge of his big break. If Hollywood is Limbo, Towne is its Caesar, “armed and idle,” and he’s been there for five decades. Not surprisingly, he has a lot of insight into the nature of that hell. In his interview with John Brady in The Craft of the Screenwriter, Towne says:

You are often involved with a producer who is more interested in making money on the making of the movie than he is on the releasing of the movie. There is a lot of money to be made on the production of a movie, not just in salary, but all sorts of ways that are just not altogether honest. So he’s going to make his money on the making, which is really reprehensible.

“Movies are so difficult that you should really make movies that you feel you absolutely have to make,” Towne continues—and the fact that this happens so rarely implies that the studio ecosystem is set up for something totally different. Towne adds:

It’s easier for a director and an actor to be mediocre and get away with it than it is for a writer. Even a writer who happens to be mediocre has to work pretty hard to get through a script, whereas a cameraman will say to the director, “Where do you think you want to put the camera? You want it here? All right, I’m going to put it here.” In other words, a director can be carried along by the production if he’s mediocre, to some extent; and that’s true of an actor, too.

Towne tosses off these observations without dwelling on them, knowing that there’s plenty more where they came from, but if you put them together, you end up with a pretty good explanation of why Hollywood is the way it is. It’s built to profit from the making of movies, rather than from the movies themselves, which is only logical: if it depended on success at the box office, everybody would be out of a job. The industry also has structures in place that allow people to skate by for years without any particular skills, if they manage to stick to the margins. (In any field where past success is no guarantee of future performance, it’s the tall poppies that get their heads chopped off.) Under such conditions, survival isn’t a matter of talent, but of something much less definable. A brand like National Lampoon, which has been leveled by time but retains some of its old allure, draws such people like a bright light draws fish in the abyss, and it provides a place where they can be studied. The fact that Kato Kaelin makes an appearance in these circles shouldn’t be surprising—he’s the patron saint of those who hang on for decades for no particular reason. And it’s hard not to relate to the hope that sustains them:

“What everyone always does at the company is feel like something big is about to happen, and I want to be here for it,” [creative director] Marty Dundics says. “We’re one hit movie away from, or one big thing away from, being back on top. It’s always this underdog you’re rooting for. And you don’t want to miss it. That big thing that’s about to happen. That was always the mood.”

Extend that mood across a quarter of a century, and you have Hollywood, which also struggles against the realization that Borges perceives in Limbo: “The certainty that tomorrow will be like today, which was like yesterday, which was like every day.”

The critical path

leave a comment »

Renata Adler

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on February 16, 2016.

Every few years or so, I go back and revisit Renata Adler’s famous attack in the New York Review of Books on the reputation of the film critic Pauline Kael. As a lifelong Kael fan, I don’t agree with Adler—who describes Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless”—but I respect the essay’s fire and eloquence, and it’s still a great read. What is sometimes forgotten is that Adler opens with an assault, not on Kael alone, but on the entire enterprise of professional criticism itself. Here’s what she says:

The job of the regular daily, weekly, or even monthly critic resembles the work of the serious intermittent critic, who writes only when he is asked to or genuinely moved to, in limited ways and for only a limited period of time…Normally, no art can support for long the play of a major intelligence, working flat out, on a quotidian basis. No serious critic can devote himself, frequently, exclusively, and indefinitely, to reviewing works most of which inevitably cannot bear, would even be misrepresented by, review in depth…

The simple truth—this is okay, this is not okay, this is vile, this resembles that, this is good indeed, this is unspeakable—is not a day’s work for a thinking adult. Some critics go shrill. Others go stale. A lot go simultaneously shrill and stale.

Adler concludes: “By far the most common tendency, however, is to stay put and simply to inflate, to pretend that each day’s text is after all a crisis—the most, first, best, worst, finest, meanest, deepest, etc.—to take on, since we are dealing in superlatives, one of the first, most unmistakable marks of the hack.” And I think that she has a point, even if I have to challenge a few of her assumptions. (The statement that most works of art “inevitably cannot bear, would even be misrepresented by, review in depth,” is particularly strange, with its implicit division of all artistic productions into the sheep and the goats. It also implies that it’s the obligation of the artist to provide a worthy subject for the major critic, when in fact it’s the other way around: as a critic, you prove yourself in large part through your ability to mine insight from the unlikeliest of sources.) Writing reviews on a daily or weekly basis, especially when you have a limited amount of time to absorb the work itself, lends itself inevitably to shortcuts, and you often find yourself falling back on the same stock phrases and judgments. And Adler’s warning about “dealing in superlatives” seems altogether prescient. As Keith Phipps and Tasha Robinson of The A.V. Club pointed out a few years back, the need to stand out in an ocean of competing coverage means that every topic under consideration becomes either an epic fail or an epic win: a sensible middle ground doesn’t generate page views.

Pauline Kael

But the situation, at least from Adler’s point of view, is even more dire than when she wrote this essay in the early eighties. When Adler’s takedown of Kael first appeared, the most threatening form of critical dilution lay in weekly movie reviews: today, we’re living in a media environment in which every episode of every television show gets thousands of words of critical analysis from multiple pop culture sites. (Adler writes: “Television, in this respect, is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which is only a measure of how much the way we think and talk about the medium has changed over the intervening three decades.) The conditions that Adler identifies as necessary for the creation of a major critic like Edmund Wilson or Harold Rosenberg—time, the ability to choose one’s subjects, and the freedom to quit when necessary—have all but disappeared for most writers hoping to make a mark, or even just a living. To borrow a trendy phrase, we’ve reached a point of peak content, with a torrent of verbiage being churned out at an unsustainable pace without the advertising dollars to support it, in a situation that can be maintained only by the seemingly endless supply of aspiring writers willing to be chewed up by the machine. And if Adler thought that even a monthly reviewing schedule was deadly for serious criticism, I’d be curious to hear how she feels about the online apprenticeship that all young writers seem expected to undergo these days.

Still, I'd like to think that Adler got it wrong, just as I believe that she was ultimately mistaken about Kael, whose legacy, for all its flaws, still endures. (It's revealing to note that Adler had a long, distinguished career as a writer and critic herself, and yet she almost certainly remains best known among casual readers for her Kael review.) Not every lengthy writeup of the latest episode of Riverdale is going to stand the test of time, but as a crucible for forming a critic's judgment, this daily grind feels like a necessary component, even if it isn't the only one. A critic needs time and leisure to think about major works of art, which is a situation that the current media landscape doesn't seem prepared to offer. But the ability to form quick judgments about works of widely varying quality and to express them fluently on deadline is an indispensable part of any critic's toolbox. When taken as an end in itself, it can be deadening, as Adler notes, but it can also be the foundation for something more, even if it has to be undertaken outside of—or despite—the critic's day job. The critic's responsibility, now more than ever, isn't to detach entirely from the relentless pace of pop culture, but to find ways of channeling it into something deeper than the instantaneous think piece or hot take. As a daily blogger who also undertakes projects that can last for months or years, I'm constantly mindful of the relationship between my work on demand and my larger ambitions. And I sure hope that the two halves can work together. Because, like it or not, every critic is walking that path already.

Written by nevalalee

April 18, 2017 at 9:00 am

The illusion of life

leave a comment »

Last week, The A.V. Club ran an entire article devoted to television shows in which the lead is also the best character, which only points to how boring many protagonists tend to be. I’ve learned to chalk this up to two factors, one internal, the other external. The internal problem stems from the reasonable principle that the narrative and the hero’s objectives should be inseparable: the conflict should emerge from something that the protagonist urgently needs to accomplish, and when the goal has been met—or spectacularly thwarted—the story is over. It’s great advice, but in practice, it often results in leads who are boringly singleminded: when every action needs to advance the plot, there isn’t much room for the digressions and quirks that bring characters to life. The supporting cast has room to go off on tangents, but the characters at the center have to constantly triangulate between action, motivation, and relatability, which can drain them of all surprise. A protagonist is under so much narrative pressure that when the story relaxes, he bursts, like a sea creature brought up from its crevasse to the surface. Elsewhere, I’ve compared a main character to a diagram of a pattern of forces, like one of the fish in D’Arcy Wentworth Thompson’s On Growth and Form, in which the animal’s physical shape is determined by the outside stresses to which it has been subjected. And on top of this, there’s an external factor, which is the universal desire of editors, producers, and studio executives to make the protagonist “likable,” which, whether or not you agree with it, tends to smooth out the rough edges that make a character vivid and memorable.

In the classic textbook Disney Animation: The Illusion of Life, we find a useful perspective on this problem. The legendary animators Frank Thomas and Ollie Johnston provide a list of guidelines for evaluating story material before the animation begins, including the following:

Tell your story through the broad cartoon characters rather than the “straight” ones. There is no way to animate strong-enough attitudes, feelings, or expressions on realistic characters to get the communication you should have. The more real, the less latitude for clear communication. This is more easily done with the cartoon characters who can carry the story with more interest and spirit anyway. Snow White was told through the animals, the dwarfs, and the witch—not through the prince or the queen or the huntsman. They had vital roles, but their scenes were essentially situation. The girl herself was a real problem, but she was helped by always working to a sympathetic animal or a broad character. This is the old vaudeville trick of playing the pretty girl against the buffoon; it helps both characters.

Even more than Snow White, the great example here is Sleeping Beauty, which has always fascinated me as an attempt by Disney to recapture past glories by a mechanical application of its old principles raised to dazzling technical heights. Not only do Aurora and Prince Philip fail to drive the story, but they're all but abandoned by it—Aurora speaks fewer lines than any other Disney main character, and neither of them talks for the last thirty minutes. Not only does the film acknowledge the dullness of its protagonists, but it practically turns it into an artistic statement in itself.

And it arises from a tension between the nature of animation, which is naturally drawn to caricature, and the notion that sympathetic protagonists need to be basically realistic. With regard to the first point, Thomas and Johnston advise:

Ask yourself, “Can the story point be done in caricature?” Be sure the scenes call for action, or acting that can be caricatured if you are to make a clear statement. Just to imitate nature, illustrate reality, or duplicate live action not only wastes the medium but puts an enormous burden on the animator. It should be believable, but not realistic.

The italics are mine. This is a good rule, but it collides headlong with the principle that the “real” characters should be rendered with greater naturalism:

Of course, there is always a big problem in making the “real” or “straight” characters in our pictures have enough personality to carry their part of the story…The point of this is misinterpreted by many to mean that characters who have to be represented as real should be left out of feature films, that the stories should be told with broad characters who can be handled more easily. This would be a mistake, for spectators need to have someone or something they can believe in, or the picture falls apart.

And while you could make a strong case that viewers relate just as much to the sidekicks, it’s probably also true that a realistic central character serves an important functional role, which allows the audience to take the story seriously. This doesn’t just apply to animation, either, but to all forms of storytelling—including most fiction, film, and television—that work best with broad strokes. In many cases, you can sense the reluctance of animators to tackle characters who don’t lend themselves to such bold gestures:

Early in the story development, these questions will be asked: “Does this character have to be straight?” “What is the role we need here?” If it is a prince or a hero or a sympathetic person who needs acceptance from the audience to make the story work, then the character must be drawn realistically.

Figuring out the protagonists is a thankless job: they have to serve a function within the overall story, but they’re also liable to be taken out and judged on their own merits, in the absence of the narrative pressures that created them in the first place. The best stories, it seems, are the ones in which that pattern of forces results in something fascinating in its own right, or which transform a stock character into something more. (It’s revealing that Thomas and Johnston refer to the queen and the witch in Snow White as separate figures, when they’re really a single person who evolves over the course of the story into her true form.) And their concluding advice is worth bearing in mind by everyone: “Generally speaking, if there is a human character in a story, it is wise to draw the person with as much caricature as the role will permit.”

How to rest

with 4 comments

As a practical matter, there appears to be a limit to how long a novelist can work on any given day while still remaining productive. Anecdotally, the maximum effective period seems to fall somewhere in the range of four to six hours, which leaves some writers with a lot of time to kill. In a recent essay for The New Yorker, Gary Shteyngart writes:

I believe that a novelist should write for no more than four hours a day, after which returns truly diminish; this, of course, leaves many hours for idle play and contemplation. Usually, such a schedule results in alcoholism, but sometimes a hobby comes along, especially in middle age.

In Shteyngart’s case, the hobby took the form of a fascination with fine watches, to the point where he was spending thousands of dollars on his obsession every year. This isn’t a confession designed to elicit much sympathy from others—especially when he observes that spending $4,137.25 on a watch means throwing away “roughly 4.3 writing days”—but I’d like to believe that he chose a deliberately provocative symbol of wasted time. Most novelists have day jobs, with all their writing squeezed into the few spare moments that remain, so to say that writers have hours of idleness at their disposal, complete with that casual “of course,” implies an unthinking acceptance of a privilege that only a handful of authors ever attain. Shteyngart, I think, is smarter than this, and he may simply be using the luxury watch as an emblem of how precious each minute can be for writers for whom time itself hasn’t become devalued.

But let’s assume that you’re lucky enough to write for a living, and that your familial or social obligations are restricted enough to leave you with over half the day to spend as you see fit. What can you do with all those leisure hours? Alcoholism, as Shteyngart notes, is an attractive possibility, but perhaps you want to invest your time in an activity that enhances your professional life. Georg von Békésy, the Hungarian biophysicist, thought along similar lines, as his biographer Floyd Ratliff relates:

His first idea about how to excel as a scientist was simply to work hard and long hours, but he realized that his colleagues were working just as hard and just as long. So he decided instead to follow the old rule: sleep eight hours, work eight hours, and rest eight hours. But Békésy put a “Hungarian twist” on this, too. There are many ways to rest, and he reasoned that perhaps he could work in some way that would improve his judgment, and thus improve his work. The study of art, in which he already had a strong interest, seemed to offer this possibility…By turning his attention daily from science to art, Békésy refreshed his mind and sharpened his faculties.

This determination to turn even one's free time into a form of self-improvement seems almost inhuman. (His "old rule" reminds me of the similar advice that Ursula K. Le Guin offers in The Left Hand of Darkness: "When action grows unprofitable, gather information; when information grows unprofitable, sleep.") But I think that Békésy was also onto something when he sought out a hobby that provided a contrast to what he was doing for a living. A change, as the saying goes, is as good as a rest.

In fact, you could say that there are two types of hobbies, although they aren’t mutually exclusive. There are hobbies that are orthogonal to the rest of our lives, activating parts of the mind or personality that otherwise go unused, or providing a soothing mechanical respite from the nervous act of brainwork—think of Churchill and his bricklaying. Alternatively, they can channel our professional urges into a contained, orderly form that provides a kind of release. Ayn Rand, of all people, wrote perceptively about stamp collecting:

Stamp collecting is a hobby for busy, purposeful, ambitious people…because, in pattern, it has the essential elements of a career, but transposed to a clearly delimited, intensely private world…In stamp collecting, one experiences the rare pleasure of independent action without irrelevant burdens or impositions.

In my case, this blog amounts to a sort of hobby, and I keep at it for both reasons. It’s a form of writing, so it provides me with an outlet for those energies, but it also allows me to think about subjects that aren’t directly connected to my work. The process is oddly refreshing—I often feel more awake and alert after I’ve spent an hour writing a post, as if I’ve been practicing my scales on the piano—and it saves an hour from being wasted in unaccountable ways. This may be why many people are drawn to hobbies that leave you with a visible result in the end, whether it’s a blog post, a stamp collection, or a brick wall.

But there’s also something to be said for doing nothing. If you’ve devoted four hours—or whatever amount seems reasonable—to work that you love, you’ve earned the right to spend your remaining time however you like. As Sir Walter Scott wrote in a letter to a friend:

And long ere dinner time, I have
Full eight close pages wrote;
What, duty, hast thou now to crave?
Well done, Sir Walter Scott!

At the end of the day, I often feel like watching television, and the show I pick serves as an index to how tired I am. If I’m relatively energized, I can sit through a prestige drama; if I’m more drained, I’ll suggest a show along the lines of Riverdale; and if I can barely see straight, I’ll put on a special feature from my Lord of the Rings box set, which is my equivalent of comfort food. And you can see this impulse in far more illustrious careers. Ludwig Wittgenstein, who thought harder than anyone else of his century, liked to relax by watching cowboy movies. The degree to which he felt obliged to unplug is a measure of how much he drove himself, and in the absence of other vices, this was as good a way of decompressing as any. It prompted Nicholson Baker to write: “[Wittgenstein] would go every afternoon to watch gunfights and arrows through the chest for hours at a time. Can you take seriously a person’s theory of language when you know that he was delighted by the woodenness and tedium of cowboy movies?” To which I can only respond: “Absolutely.”

Written by nevalalee

April 5, 2017 at 9:36 am

The cliché factory

with one comment

A few days ago, Bob Mankoff, the cartoon editor of The New Yorker, devoted his weekly email newsletter to the subject of “The Great Clichés.” A cliché, as Mankoff defines it, is a restricted comic situation “that would be incomprehensible if the other versions had not first appeared,” and he provides a list of examples that should ring bells for all readers of the magazine, from the ubiquitous “desert island” to “The-End-Is-Nigh Guy.” Here are a few of my favorites:

Atlas holding up the world; big fish eating little fish; burglars in masks; cave paintings; chalk outline at crime scene; crawling through desert; galley slaves; guru on mountain; mobsters and victim with cement shoes; man in stocks; police lineup; two guys in horse costume.

Inevitably, Mankoff’s list includes a few questionable choices, while also omitting what seem like obvious contenders. (Why “metal detector,” but not “Adam and Eve?”) But it’s still something that writers of all kinds will want to clip and save. Mankoff doesn’t make the point explicitly, but most gag artists probably keep a similar list of clichés as a starting point for ideas, as we read in Mort Gerberg’s excellent book Cartooning:

List familiar situations—clichés. You might break them down into categories, like domestic (couple at breakfast, couple watching television); business (boss berating employee, secretary taking dictation); historic (Paul Revere’s ride, Washington crossing the Delaware); even famous cartoon clichés (the desert island, the Indian snake charmer)…Then change something a little bit.

As it happened, when I saw Mankoff’s newsletter, I had already been thinking about a far more harmful kind of comedy cliché. Last week, Kal Penn went on Twitter to post some of the scripts from his years auditioning as a struggling actor, and they amount to an alternative list of clichés kept by bad comedy writers, consciously or otherwise: “Gandhi lookalike,” “snake charmer,” “foreign student.” One character has a “slight Hindi accent,” another is a “Pakistani computer geek who dresses like Beck and is in a perpetual state of perspiration,” while a third delivers dialogue that is “peppered with Indian cultural references…[His] idiomatic conversation is hit and miss.” A typical one-liner: “We are propagating like flies on elephant dung.” One script describes a South Asian character’s “spastic techno pop moves,” with Penn adding that “the big joke was an accent and too much cologne.” (It recalls the Morrissey song “Bengali in Platforms,” which included the notorious line: “Life is hard enough when you belong here.” You could amend it to read: “Being a comedy writer is hard enough when you belong here.”) Penn closes by praising shows with writers “who didn’t have to use external things to mask subpar writing,” which cuts to the real issue here. The real person in “a perpetual state of perspiration” isn’t the character, but the scriptwriter. Reading the teleplay for an awful sitcom is a deadening experience in itself, but it’s even more depressing to realize that in most cases, the writer is falling back on a stereotype to cover up the desperate unfunniness of the writing. When Penn once asked if he could play a role without an accent, in order to “make it funny on the merits,” he was told that he couldn’t, probably because everybody else knew that the merits were nonexistent.

So why is one list harmless and the other one toxic? In part, it’s because we’ve caught them at different stages of evolution. The list of comedy conventions that we find acceptable is constantly being culled and refined, and certain art forms are slightly in advance of the others. Because of its cultural position, The New Yorker is particularly subject to outside pressures, as it learned a decade ago with its Obama terrorist cover—which demonstrated that there are jokes and images that aren’t acceptable even if the magazine’s attitude is clear. Turn back the clock, and Mankoff’s list would include conventions that probably wouldn’t fly today. Gerberg’s list, like Penn’s, includes “snake charmer,” which Mankoff omits, and he leaves out “Cowboys and Indians,” a cartoon perennial that seems to be disappearing. And it can be hard to reconstruct this history, because the offenders tend to be consigned to the memory hole. When you read a lot of old magazine fiction, as I do, you inevitably find racist stereotypes that would be utterly unthinkable today, but most of the stories in which they appear have long since been forgotten. (One exception, unfortunately, is the Sherlock Holmes short story “The Adventure of the Three Gables,” which opens with a horrifying racial caricature that most Holmes fans must wish didn’t exist.) If we don’t see such figures as often today, it isn’t necessarily because we’ve become more enlightened, but because we’ve collectively agreed to remove certain figures from the catalog of stock comedy characters, while papering over their use in the past. A list of clichés is a snapshot of a culture’s inner life, and we don’t always like what it says. The demeaning parts still offered to Penn and actors of similar backgrounds have survived for longer than they should have, but sitcoms that trade in such stereotypes will be unwatchable in a decade or two, if they haven’t already been consigned to oblivion.

Of course, most comedy writers aren’t thinking in terms of decades, but about getting through the next five minutes. And these stereotypes endure precisely because they’re seen as useful, in a shallow, short-term kind of way. There’s a reason why such caricatures are more visible in comedy than in drama: comedy is simply harder to write, but we always want more of it, so it’s inevitable that writers on a deadline will fall back on lazy conventions. The really insidious thing about these clichés is that they sort of work, at least to the extent of being approved by a producer without raising any red flags. Any laughter that they inspire is the equivalent of empty calories, but they persist because they fill a cynical need. As Penn points out, most writers wouldn’t bother with them at all if they could come up with something better. Stereotypes, like all clichés, are a kind of fallback option, a cheap trick that you deploy if you need a laugh and can’t think of another way to get one. Clichés can be a precious commodity, and all writers resort to them occasionally. They’re particularly valuable for gag cartoonists, who can’t rely on a good idea from last week to fill the blank space on the page—they’ve got to produce, and sometimes that means yet another variation on an old theme. But there’s a big difference between “Two guys in a horse costume” and “Gandhi lookalike.” Being able to make that distinction isn’t a matter of political correctness, but of craft. The real solution is to teach people to be better writers, so that they won’t even be tempted to resort to such tired solutions. This might seem like a daunting task, but in fact, it happens all the time. A cliché factory operates on the principle of supply and demand. And it shuts down as soon as people no longer find it funny.

Written by nevalalee

March 20, 2017 at 11:18 am

A series of technical events

with 6 comments

In his book Four Arguments for the Elimination of Television, which was first published in the late seventies, the author Jerry Mander, a former advertising executive, lists a few of the “technical tricks” that television can use to stimulate the viewer’s interest:

Editors make it possible for a scene in one room to be followed instantly by a scene in another room, or at another time, or another place. Words appear over the images. Music rises and falls in the background. Two images or three can appear simultaneously. One image can be superposed on another on the screen. Motion can be slowed down or sped up.

These days, we take most of these effects for granted, as part of the basic grammar of the medium, but to Mander, they’re something more sinister. Technique, he argues, is replacing content, and at its heart, it’s something of a confidence game:

Through these technical events, television images alter the usual, natural imagery possibilities, taking on the quality of a naturally highlighted event. They make it seem that what you are looking at is unique, unusual, and extraordinary…But nothing unusual is going on. All that’s happening is that the viewer is watching television, which is the same thing that happened an hour ago, or yesterday. A trick has been played. The viewer is fixated by a conspiracy of dimmed-out environments combined with an artificial, impossible, fictitious unusualness.

In order to demonstrate “the extent to which television is dependent upon technical tricks to maintain your interest,” Mander invites the reader to conduct what he calls a technical events test:

Put on your television set and simply count the number of times there is a cut, a zoom, a superimposition, a voiceover, the appearance of words on the screen—a technical event of some kind…Each technical event—each alteration of what would be natural imagery—is intended to keep your attention from waning as it might otherwise…Every time you are about to relax your attention, another technical event keeps you attached…

You will probably find that in the average commercial television program, there are eight or ten technical events for every sixty-second period…You may also find that there is rarely a period of twenty seconds without any sort of technical event at all. That may give you an idea of the extent to which producers worry about whether the content itself can carry your interest.

He goes on to list the alleged consequences of exposure to such techniques, from shortened attention span in adults to heightened hyperactivity in children, and concludes: “Advertisers are the high artists of the medium. They have gone further in the technologies of fixation than anyone else.”

Mander's argument was prophetic in many ways, but in one respect, he was clearly wrong. In the four decades since his book first appeared, it has become obvious that the "high artists" of distraction and fixation aren't advertisers, but viewers themselves, and the true canvas isn't television, but the Internet. Instead of passively viewing a series of juxtaposed images, we assemble our online experience for ourselves, and each time we open a new link, we're effectively acting as our own editors. Every click is a cut. (The anecdotal figure that the reader spends less than fifteen seconds on the average web page is very close to the interval between technical events on television, which isn't an accident.) We do a better job of distracting ourselves than any third party ever could, as long as we're given sufficient raw material and an intuitive interface—which explains much of the evolution of online content. When you look back at web pages from the early nineties, it's easy to laugh at how noisy and busy they tended to be, with music, animated graphics, and loud colors. This wasn't just a matter of bad taste, but of a mistaken analogy to television. Web designers thought that they had to grab our attention using the same technical tricks employed by other media, but that wasn't the case. The hypnotic browsing state that we've all experienced isn't produced by any one page, but by the succession of similar pages as the user moves between them at his or her private rhythm. Ideally, from the point of view of a media company, that movement will take place within the same family of pages, but it also leads to a convergence of style and tone between sites. Most web pages these days look more or less the same because that sameness creates a kind of continuity of experience. Instead of the loud, colorful pages of old, they're static and full of white space. Mander calls this "the quality of even tone" of television, and the Internet does it one better. It's uniform and easily aggregated, and you can cut it together however you like, like yard goods.

In fact, it isn't content that gives us the most pleasure, but the act of clicking, with the sense of control it provides. This implies that bland, interchangeable content is actually preferable to more arresting material. The easier it is to move between basically similar units, the closer the experience is to that of an ideally curated television show—which is why different sources have a way of blurring together into the same voice. When I'm trying to tell my wife about a story I read online, I often have trouble remembering if I read it on Vox, Vulture, or Vice, which isn't a knock against those sites, but a reflection of the unconscious pressure to create a seamless browsing experience. From there, it's only a short step to outright content mills and fake news. In the past, I've called this AutoContent, after the interchangeable bullet points used to populate slideshow presentations, but it's only effective if you can cut quickly from one slide to another. If you had to stare at it for longer than fifteen seconds, you wouldn't be able to stand it. (This may be why we've come to associate quality with length, which is more resistant to being reduced to the filler between technical events. The "long read," as I've argued elsewhere, can be a marketing category in itself, but it does need to try a little harder.) The idea that browsing online is a form of addictive behavior isn't a new one, of course, and it's often explained in terms of the "random rewards" that the brain receives when we check email or social media. But the notion of online content as a convenient source of technical events is worth remembering. When we spend any period of time online, we're essentially watching a television show while simultaneously acting as its editor and director, and often as its writer and actors. In the end, to slightly misquote Mander, all that's happening is that the reader is seated in front of a computer or looking at a phone, "which is the same thing that happened an hour ago, or yesterday." The Internet is better at this than television ever was. And in a generation or two, it may result in television being eliminated after all.

Written by nevalalee

March 14, 2017 at 9:18 am

Farewell to Mystic Falls

with one comment

Note: Spoilers follow for the series finale of The Vampire Diaries.

On Friday, I said goodbye to The Vampire Diaries, a series that I once thought was one of the best genre shows on television, only to stop watching it for its last two seasons. Despite its flaws, it occupies a special place in my memory, in part because its strengths were inseparable from the reasons that I finally abandoned it. Like Glee, The Vampire Diaries responded to its obvious debt to an earlier franchise—High School Musical for the former, Twilight for the latter—both by subverting its predecessor and by burning through ideas as relentlessly as it could. It’s as if both shows decided to refute any accusations of unoriginality by proving that they could be more ingenious than their inspirations, and amazingly, it sort of worked, at least for a while. There’s a limit to how long any series can repeatedly break down and reassemble itself, however, and both started to lose steam after about three years. In the case of The Vampire Diaries, its problems crystallized around its ostensible lead, Elena Gilbert, as portrayed by the game and talented Nina Dobrev, who left the show two seasons ago before returning for an encore in the finale. Elena spent most of her first sendoff asleep, and she isn’t given much more to do here. There’s a lot about the episode that I liked, and it provides satisfying moments of closure for many of its characters, but Elena isn’t among them. In the end, when she awakens from the magical coma in which she has been slumbering, it’s so anticlimactic that it reminds me of what Pauline Kael wrote of Han’s revival in Return of the Jedi: “It’s as if Han Solo had locked himself in the garage, tapped on the door, and been let out.”

And what happened to Elena provides a striking case study of why the story’s hero is often fated to become the least interesting person in sight. The main character of a serialized drama is under such pressure to advance the plot that he or she becomes reduced to the diagram of a pattern of forces, like one of the fish in D’Arcy Wentworth Thompson’s On Growth and Form, in which the animal’s physical shape is determined by the outside stresses to which it has been subjected. Instead of making her own decisions, Elena was obliged to become whatever the series needed her to be. Every protagonist serves as a kind of motor for the story, which is frequently a thankless role, but it was particularly problematic on a show that defined itself by its willingness to burn through a year of potential storylines each month. Every episode felt like a season finale, and characters were freely killed, resurrected, and brainwashed to keep the wheels turning. It was hardest on Elena, who, at her best, was a compelling, resourceful heroine. After six seasons of personality changes, possessions, memory wipes, and the inexplicable choices that she made just because the story demanded it, she became an empty shell. If you were designing a show in a laboratory to see what would happen if its protagonist was forced to live through plot twists at an accelerated rate, like the stress tests that engineers use to put a component through a lifetime’s worth of wear in a short period of time, you couldn’t do much better than The Vampire Diaries. And while it might have been theoretically interesting to see what happened to the series after that one piece was removed, I didn’t think it was worth sitting through another two seasons of increasingly frustrating television.

After the finale was shot, series creators Kevin Williamson and Julie Plec made the rounds of interviews to discuss the ending, and they shared one particular detail that fascinates me. If you haven’t watched The Vampire Diaries, all you need to know is that its early seasons revolved around a love triangle between Elena and the vampire brothers Stefan and Damon, a nod to Twilight that quickly became one of the show’s least interesting aspects. Elena seemed fated to end up with Stefan, but she spent the back half of the series with Damon, and it ended with the two of them reunited. In a conversation with Deadline, Williamson revealed that this wasn’t always the plan:

Well, I always thought it would be Stefan and Elena. They were sort of the anchor of the show, but because we lost Elena in season six, we couldn’t go back. You know Nina could only come back for one episode—maybe if she had came back for the whole season, we could even have warped back towards that, but you can’t just do it in forty-two minutes.

Dobrev’s departure, in other words, froze that part of the story in place, even as the show around it continued its usual frantic developments, and when she returned, there wasn’t time to do anything but keep Elena and Damon where they had left off. There’s a limit to how much ground you can cover in the course of a single episode, so it seemed easier for the producers to stick with what they had and figure out a way to make it seem inevitable.

The fact that it works at all is a tribute to the skill of the writers and cast, as well as to the fact that the whole love triangle was basically arbitrary in the first place. As James Joyce said in a very different context, it was a bridge across which the characters could walk, and once they were safely on the other side, it could be blown to smithereens. The real challenge was how to make the finale seem like a definitive ending, after the show had killed off and resurrected so many characters that not even death itself felt like a conclusion. It resorted to much the same solution that Lost did when faced with a similar problem: it shut off all possibility of future narrative by reuniting its characters in heaven. This is partially a form of wish fulfillment, as we've seen with so many other television series, but it also puts a full stop on the story by leaving us in an afterlife, where, by definition, nothing can ever change. It's hilariously unlike the various versions of the world to come that the series has presented over the years, from which characters can always be yanked back to life when necessary, but it's also oddly moving and effective. Watching it, I began to appreciate how the show's biggest narrative liability—a cast that just can't be killed—also became its greatest asset. The defining image of The Vampire Diaries was that of a character who has his neck snapped, and then just shakes it off. Williamson and Plec must have realized, consciously or otherwise, that it was a reset button that would allow them to go through more ideas than would be possible on a show on which a broken neck was permanent. Every denizen of Mystic Falls got a great death scene, often multiple times per season, and the show exploited that freedom until it exhausted itself. It only really worked for three years out of eight, but it was a great run while it lasted. And now, after life's fitful fever, the characters can sleep well, as they sail off into the mystic.

From Sputnik to WikiLeaks

with 2 comments

In Toy Story 2, there’s a moment in which Woody discovers that his old television series, Woody’s Roundup, was abruptly yanked off the air toward the end of the fifties. He asks: “That was a great show. Why cancel it?” The Prospector replies bitterly: “Two words: Sput-nik. Once the astronauts went up, children only wanted to play with space toys.” And while I wouldn’t dream of questioning the credibility of a man known as Stinky Pete, I feel obliged to point out that his version of events isn’t entirely accurate. The space craze among kids really began more than half a decade earlier, with the premiere of Tom Corbett, Space Cadet, and the impact of Sputnik on science fiction was far from a positive one. Here’s what John W. Campbell wrote about it in the first issue of Astounding to be printed after the satellite’s launch:

Well, we lost that race; Russian technology achieved an important milestone in human history—one that the United States tried for, talked about a lot, and didn’t make…One of the things Americans have long been proud of—and with sound reason—is our ability to convert theoretical science into practical, working engineering…This time we’re faced with the uncomfortable realization that the Russians have beaten us in our own special field; they solved a problem of engineering technology faster and better than we did.

And while much of the resulting “Sputnik crisis” was founded on legitimate concerns—Sputnik was as much a triumph of ballistic rocketry as it was of satellite technology—it also arose from the notion that the United States had been beaten at its own game. As Arthur C. Clarke is alleged to have said, America had become “a second-rate power.”

Campbell knew right away that he had reason to worry. Lester del Rey writes in The World of Science Fiction:

Sputnik simply convinced John Campbell that he’d better watch his covers and begin cutting back on space scenes. (He never did, but the art director of the magazine and others were involved in that decision.) We agreed in our first conversation after the satellite went up that people were going to react by deciding science had caught up with science fiction, and with a measure of initial fear. They did. Rather than helping science fiction, Sputnik made it seem outmoded.

And that’s more or less exactly what happened. There was a brief spike in sales, followed by a precipitous fall as mainstream readers abandoned the genre. I haven’t been able to find specific numbers for this period, but one source, the Australian fan Wynne Whitford, states that the circulation of Astounding fell by half after Sputnik—which seems high, but probably reflects a real decline. In a letter written decades later, Campbell said of Sputnik: “Far from encouraging the sales of science fiction magazines—half the magazines being published lost circulation so drastically they went out of business!” An unscientific glance at a list of titles appears to support this. In 1958, the magazines Imagination, Imaginative Tales, Infinity Science Fiction, Phantom, Saturn, Science Fiction Adventures, Science Fiction Quarterly, Star Science Fiction, and Vanguard Science Fiction all ceased publication, followed by three more over the next twelve months. The year before, just four magazines had folded. There was a bubble, and after Sputnik, it burst.

At first, this might seem like a sort of psychological self-care, of the same kind that motivated me to scale back my news consumption after the election. Americans were simply depressed, and they didn’t need any reminders of the situation they were in. But it also seems to have affected the public’s appetite for science fiction in particular, rather than science as a whole. In fact, the demand for nonfiction science writing actually increased. As Isaac Asimov writes in his memoir In Joy Still Felt:

The United States went into a dreadful crisis of confidence over the fact that the Soviet Union had gotten there first and berated itself for not being interested enough in science. And I berated myself for spending too much time on science fiction when I had the talent to be a great science writer…Sputnik also served to increase the importance of any known public speaker who could talk on science and, particularly, on space, and that meant me.

What made science fiction painful to read, I think, was its implicit assumption of American superiority, which had been disproven so spectacularly. Campbell later compared it to the reaction after the bomb fell, claiming that it was the moment when people realized that science fiction wasn’t a form of escapism, but a warning:

The reactions to Sputnik have been more rapid, and, therefore, more readily perceptible and correlatable. There was, again, a sudden rise in interest in science fiction…and there is, now, an even more marked dropping of the science-fiction interest. A number of the magazines have been very heavily hit…I think the people of the United States thought we were kidding.

And while Campbell seemed to believe that readers had simply misinterpreted science fiction’s intentions, the conventions of the genre itself clearly bore part of the blame.

In his first editorials after Sputnik, Campbell drew a contrast between the American approach to engineering, which proceeded logically and with vast technological resources, and the quick and dirty Soviet program, which was based on rules of thumb, trial and error, and the ability to bull its way through on one particular point of attack. It reminds me a little of the election. Like the space race, last year’s presidential campaign could be seen as a kind of proxy war between the American and Russian administrations, and regardless of what you believe about the Trump camp’s involvement, which I suspect was probably a tacit one, there’s no question as to which side Putin favored. On one hand, you had a large, well-funded political machine, and on the other, one that often seemed comically inept. Yet it was the quick and dirty approach that triumphed. “The essence of ingenuity is the ability to get precision results without precision equipment,” Campbell wrote, and that’s pretty much what occurred. A few applications of brute force in the right place made all the difference, and they were aided, to some extent, by a similar complacency. The Americans saw the Soviets as bunglers, and they never seriously considered the possibility that they might be beaten by a bunch of amateurs. As Campbell put it: “We earned what we got—fully, and of our own efforts. The ridicule we’ve collected is our just reward for our consistent efforts.” Sometimes I feel the same way. Right now, we’re entering a period in which the prospect of becoming a second-rate power is far more real than it was when Clarke made his comment. It took a few months for the implications of Sputnik to really sink in. And if history is any indication, we haven’t even gotten to the crisis yet.
