Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Peak television and the future of stardom

Kevin Costner in The Postman

Earlier this week, I devoured the long, excellent article by Josef Adalian and Maria Elena Fernandez of Vulture on the business of peak television. It’s full of useful insights and even better gossip—and it names plenty of names—but there’s one passage that really caught my eye, in a section about the huge salaries that movie stars are being paid to make the switch to the small screen:

A top agent defends the sums his clients are commanding, explaining that, in the overall scheme of things, the extra money isn’t all that significant. “Look at it this way,” he says. “If you’re Amazon and you’re going to launch a David E. Kelley show, that’s gonna cost $4 million an episode [to produce], right? That’s $40 million. You can have Bradley Whitford starring in it, [who is] gonna cost you $150,000 an episode. That’s $1.5 million of your $40 million. Or you could spend another $3.5 million [to get Costner] on what will end up being a $60 million investment by the time you market and promote it. You can either spend $60 [million] and have the Bradley Whitford show, or $63.5 [million] and have the Kevin Costner show. It makes a lot of sense when you look at it that way.”

With all due apologies to Bradley Whitford, I found this thought experiment fascinating, and not just for the reasons that the agent presumably shared it. It implies, for one thing, that television—which is often said to be overtaking Hollywood in terms of quality—is becoming more like feature filmmaking in another respect: it’s the last refuge of the traditional star. We frequently hear that movie stardom is dead and that audiences are drawn more to franchises than to recognizable faces, so the fact that cable and streaming networks seem intensely interested in signing film stars, in a post-True Detective world, implies that their model is different. Some of it may be due to the fact, as William Goldman once said, that no studio executive ever got fired for hiring a movie star: as the new platforms fight to establish themselves, it makes sense that they’d fall back on the idea of star power, which is one of the few things that corporate storytelling has ever been able to quantify or understand. It may also be because the marketing strategy for television inherently differs from that for film: an online series is unusually dependent on media coverage to stand out from the pack, and signing a star always generates headlines. Or at least it once did. (The Vulture article notes that Woody Allen’s new series for Amazon “may end up marking peak Peak TV,” and it seems a lot like a deal that was made for the sake of the coverage it would produce.)

Kevin Costner in JFK

But the most plausible explanation lies in simple economics. As the article explains, Netflix and the other streaming companies operate according to a “cost-plus” model: “Rather than holding out the promise of syndication gold, the company instead pays its studio and showrunner talent a guaranteed up-front profit—typically twenty or thirty percent above what it takes to make a show. In exchange, it owns all or most of the rights to distribute the show, domestically and internationally.” This limits the initial risk to the studio, but also the potential upside: nobody involved in producing the show itself will see any money on the back end. In addition, it means that even the lead actors of the series are paid a flat dollar amount, which makes them a more attractive investment than they might be for a movie. Most of the major stars in Hollywood earn gross points, which means that they get a cut of the box office receipts before the film turns a profit—a “first dollar” deal that makes the mathematics of breaking even much more complicated. The thought experiment about Bradley Whitford and Kevin Costner only makes sense if you can get Costner at a fixed salary per episode. In other words, movie stars are being actively courted by television because its model is a throwback to an earlier era, when actors were held under contract by a studio without any profit participation, and before stars and their agents negotiated better deals that ended up undermining the economic basis of the star system entirely.

And it’s revealing that Costner, of all actors, appears in this example. His name came up mostly because multiple sources told Vulture that he was offered $500,000 per episode to star in a streaming series: “He passed,” the article says, “but industry insiders predict he’ll eventually say ‘yes’ to the right offer.” But he also resonates because he stands for a kind of movie stardom that was already on the wane when he first became famous. It has something to do with the quintessentially American roles that he liked to play—even JFK is starting to seem like the last great national epic—and an aura that somehow kept him in leading parts two decades after his career as a major star was essentially over. That’s weirdly impressive in itself, and it testifies to how intriguing a figure he remains, even if audiences aren’t likely to pay to see him in a movie. Whenever I think of Costner, I remember what the studio executive Mike Medavoy once claimed to have told him right at the beginning of his career:

“You know,” I said to him over lunch, “I have this sense that I’m sitting here with someone who is going to become a great big star. You’re going to want to direct your own movies, produce your own movies, and you’re going to end up leaving your wife and going through the whole Hollywood movie-star cycle.”

Costner did, in fact, end up leaving his first wife. And if he also leaves film for television, even temporarily, it may reveal that “the whole Hollywood movie-star cycle” has a surprising final act that few of us could have anticipated.

Written by nevalalee

May 27, 2016 at 9:03 am

“Asthana glanced over at the television…”

"A woman was standing just over his shoulder..."

Note: This post is the eighteenth installment in my author’s commentary for Eternal Empire, covering Chapter 19. You can read the previous installments here.

A quarter of a century ago, I read a story about the actor Art Carney, possibly apocryphal, that I’ve never forgotten. Here’s the version told by the stage and television actress Patricia Wilson:

During a live performance of the original Honeymooners, before millions of viewers, Jackie [Gleason] was late making an entrance into a scene. He left Art Carney onstage alone, in the familiar seedy apartment set of Alice and Ralph Kramden. Unflappable, Carney improvised action for Ed Norton. He looked around, scratched himself, then went to the Kramden refrigerator and peered in. He pulled out an orange, shuffled to the table, and sat down and peeled it. Meanwhile frantic stage managers raced to find Jackie. Art Carney sat onstage peeling and eating an orange, and the audience convulsed with laughter.

According to some accounts, Carney stretched the bit of business out for a full two minutes before Gleason finally appeared. And while it certainly speaks to Carney’s ingenuity and resourcefulness, we should also take a moment to tip our hats to that humble orange, as well as the prop master who thought to stick it in the fridge—unseen and unremarked—in the first place.

Theatrical props, as all actors and directors know, can be a source of unexpected ideas, just as the physical limitations or possibilities of the set itself can provide a canvas on which the action is conceived in real time. I’ve spoken elsewhere of the ability of vaudeville comedians to improvise routines on the spot using whatever was available on a standing set, and there’s a sense in which the richness of the physical environment in which a scene takes place is a battery from which the performances can draw energy. When a director makes sure that each actor’s pockets are full of the litter that a character might actually carry, it isn’t just a mark of obsessiveness or self-indulgence, or even a nod toward authenticity, but a matter of storing up potential tools. A prop by itself can’t make a scene work, but it can provide the seed around which a memorable moment or notion can grow, like a crystal. In more situations than you might expect, creativity lies less in the ability to invent from scratch than to make effective use of whatever happens to lie at hand. Invention is a precious resource, and most artists have a finite amount of it; it’s better, whenever possible, to utilize what the world provides. And much of the time, when you’re faced with a hard problem to solve, you’ll find that the answer is right there in the background.

"Asthana glanced over at the television..."

This is as true of writing fiction as of any of the performing arts. In the past, I’ve suggested that this is the true purpose of research or location work: it isn’t about accuracy, but about providing raw material for dreams, and any writer faced with the difficult task of inventing a scene would be wise to exploit what already exists. It’s infinitely easier to write a chase scene, for example, if you’re tailoring it to the geography of a particular street. As usual, it comes back to the problem of making choices: the more tangible or physical the constraints, the more likely they’ll generate something interesting when they collide with the fundamentally abstract process of plotting. Even if the scene I’m writing takes place somewhere wholly imaginary, I’ll treat it as if it were being shot on location: I’ll pick a real building or locale that has the qualities I need for the story, pore over blueprints and maps, and depart from the real plan only when I don’t have any alternative. In most cases, the cost of that departure, in terms of the confusion it creates, is far greater than the time and energy required to make the story fit within an existing structure. For much the same reason, I try to utilize the props and furniture you’d naturally find there. And that’s all the more true when a scene occurs in a verifiable place.

Sometimes, this kind of attention to detail can result in surprising resonances. There’s a small example that I like in Chapter 19 of Eternal Empire. Rogozin, my accused intelligence agent, is being held without charges at a detention center in Paddington Green. This is a real location, and its physical setup becomes very important: Rogozin is going to be killed, in an apparent suicide, under conditions of heavy security. To prepare these scenes, I collected reference photographs, studied published descriptions, and shaped the action as much as possible to unfold logically under the constraints the location imposed. And one fact caught my eye, purely as a matter of atmosphere: the cells at Paddington Green are equipped with televisions, usually set to play something innocuous, like a nature video. This had obvious potential as a counterpoint to the action, so I went to work looking for a real video that might play there. And after a bit of searching, I hit on a segment from the BBC series Life in the Undergrowth, narrated by David Attenborough, about the curious life cycle of the gall wasp. The phenomenon it described, as an invading wasp burrows into the gall created by another, happened to coincide well—perhaps too well—with the story itself. As far as I’m concerned, it’s what makes Rogozin’s death scene work. And while I could have made up my own video to suit the situation, it seemed better, and easier, to poke around the stage first to see what I could find…

Written by nevalalee

May 7, 2015 at 9:11 am

The unbreakable television formula

Ellie Kemper in Unbreakable Kimmy Schmidt

Watching the sixth season premiere of Community last night on Yahoo—which is a statement that would have once seemed like a joke in itself—I was struck by the range of television comedy we have at our disposal these days. We’ve said goodbye to Parks and Recreation, we’re following Community into what is presumably its final stretch, and we’re about to greet Unbreakable Kimmy Schmidt as it starts what looks to be a powerhouse run on Netflix. These shows are superficially in the same genre: they’re single-camera sitcoms that freely grant themselves elaborate sight gags and excursions into surrealism, with a cutaway style that owes as much to The Simpsons as to Arrested Development. Yet they’re palpably different in tone. Parks and Rec was the ultimate refinement of the mockumentary style, with talking heads and reality show techniques used to flesh out a narrative of underlying sweetness; Community, as always, alternates between obsessively detailed fantasy and a comic strip version of emotions to which we can all relate; and Kimmy Schmidt takes place in what I can only call Tina Fey territory, with a barrage of throwaway jokes and non sequiturs designed to be referenced and quoted forever.

And the diversity of approach we see in these three comedies makes the dramatic genre seem impoverished. Most television dramas are still basically linear; they’re told using the same familiar grammar of establishing shots, medium shots, and closeups; and they’re paced in similar ways. If you were to break down an episode by shot length and type, or chart the transitions between scenes, an installment of Game of Thrones would look a lot on paper like one of Mad Men. There’s room for individual quirks of style, of course: the handheld cinematography favored by procedurals has a different feel from the clinical, detached camera movements of House of Cards. And every now and then, we get a scene—like the epic tracking shot during the raid in True Detective—that awakens us to the medium’s potential. But the fact that such moments are striking enough to inspire think pieces the next day only points to how rare they are. Dramas are just less inclined to take big risks of structure and tone, and when they do, they’re likely to be hybrids. Shows like Fargo or Breaking Bad are able to push the envelope precisely because they have a touch of black comedy in their blood, as if that were the secret ingredient that allowed for greater formal daring.

Jon Hamm on Mad Men

It isn’t hard to pin down the reason for this. A cutaway scene or extended homage naturally takes us out of the story for a second, and comedy, which is inherently more anarchic, has trained us to roll with it. We’re better at accepting artifice in comic settings, since we aren’t taking the story quite as seriously: whatever plot exists is tacitly understood to be a medium for the delivery of jokes. Which isn’t to say that we can’t care deeply about these characters; if anything, our feelings for them are strengthened because they take place in a stylized world that allows free play for the emotions. Yet this is also something that comedy had to teach us. It can be fun to watch a sitcom push the limits of plausibility to the breaking point, but if a drama deliberately undermines its own illusion of reality, we can feel cheated. Dramas that constantly draw attention to their own artifice, as Twin Peaks did, are more likely to become cult favorites than popular successes, since most of us just want to sit back and watch a story that presents itself using the narrative language we know. (Which, to be fair, is true of comedies as well: the three sitcoms I’ve mentioned above, taken together, have a fraction of the audience of something like The Big Bang Theory.)

In part, it’s a problem of definition. When a drama pushes against its constraints, we feel more comfortable referring to it as something else: Orange is the New Black, which tests its structure as adventurously as any series on the air today, has suffered at awards season from its resistance to easy categorization. But what’s really funny is that comedy escaped from its old formulas by appropriating the tools that dramas had been using for years. The three-camera sitcom—which has been responsible for countless masterpieces of its own—made radical shifts of tone and location hard to achieve, and once comedies liberated themselves from the obligation to unfold as if for a live audience, they could indulge in extended riffs and flights of imagination that were impossible before. It’s the kind of freedom that dramas, in theory, have always had, even if they utilize it only rarely. This isn’t to say that a uniformity of approach is a bad thing: the standard narrative grammar evolved for a reason, and if it gives us compelling characters with a maximum of transparency, that’s all for the better. Telling good stories is hard enough as it is, and formal experimentation for its own sake can be a trap in itself. Yet we’re still living in a world with countless ways of being funny, and only one way, within a narrow range of variations, of being serious. And that’s no laughing matter.

The crowded circle of television

The cast of Mad Men

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What’s your favorite TV show of the year so far?”

There are times when watching television can start to feel like a second job—a pleasurable one, to be sure, but one that demands a lot of work nevertheless. Over the last year, I’ve followed more shows than ever, including Mad Men, Game of Thrones, Orange is the New Black, Hannibal, Community, Parks and Recreation, House of Cards, The Vampire Diaries, and True Detective. For the most part, they’ve all had strong runs, and I’d have trouble picking a favorite. (If pressed, I’d probably go with Mad Men, if only for old times’ sake, with Hannibal as a very close second.) They’re all strikingly different in emphasis, tone, and setting, but they also have a lot in common. With one exception, which I’ll get to in a moment, these are dense shows with large casts and intricate storylines. Many seem devoted to pushing the limits of how much complexity can be accommodated within the constraints of the television format, which may be why the majority run for just ten to thirteen episodes: it’s hard to imagine that level of energy sustained over twenty or more installments.

And while I’m thrilled by the level of ambition visible here, it comes at a price. There’s a sort of arms race taking place between media of all kinds, as they compete to stand out in an increasingly crowded space with so much competing for our attention. Books, even literary novels, are expected to be page-turners; movies offer up massive spectacle to the point where miraculous visual effects are taken for granted; and television has taken to packing every minute of narrative time to the bursting point. (This isn’t true of all shows, of course—a lot of television series are still designed to play comfortably in the background of a hotel room—but it’s generally the case with prestige shows that end up on critics’ lists and honored at award ceremonies.) This trend toward complexity arises from a confluence of factors I’ve tried to unpack here before: just as The Simpsons was the first freeze-frame sitcom, modern television takes advantage of our streaming and binge-watching habits to deliver storytelling that rewards, and even demands, close attention.

Matthew McConaughey on True Detective

For the most part, this is a positive development. Yet there’s also a case to be made that television, which is so good at managing extended narratives and enormous casts of characters, is also uniquely suited for the opposite: silence, emptiness, and contemplation. In a film, time is a precious commodity, and when you’re introducing characters while also setting in motion the machinery of a complicated story, there often isn’t time to pause. Television, in theory, should be able to stretch out a little, interspersing relentless forward momentum with moments of quiet, which are often necessary for viewers to consolidate and process what they’ve seen. Twin Peaks was as crowded and plotty as any show on the air today, but it also found time for stretches of weird, inexplicable inaction, and it’s those scenes that I remember best. Even in the series finale, with so many threads to address and only forty minutes to cover them all, it devotes endless minutes to Cooper’s hallucinatory—and almost entirely static—ordeal in the Black Lodge, and even to a gag involving a decrepit bank manager rising from his desk and crossing the floor of his branch very, very slowly.

So while there’s a lot of fun to be had with shows that constantly accelerate the narrative pace, it can also be a limitation, especially when it’s handled less than fluently. (For every show, like Orange is the New Black, that manages to cut expertly between subplots, there’s another, like Game of Thrones, that can’t quite seem to handle its enormous scope, and even The Vampire Diaries is showing signs of strain.) Both Hannibal and Mad Men know when to linger on an image or revelation—roughly half of Hannibal is devoted to contemplating its other half—and True Detective, in particular, seemed to consist almost entirely of such pauses. We remember such high points as the final chase with the killer or the raid in “Who Goes There,” but what made the show special were the scenes in which nothing much seemed to be happening. It was aided in this by its limited cast and its tight focus on its two leads, so it’s possible that what shows really need to slow things down are a couple of movie stars to hold the eye. But it’s a step in the right direction. If time is a flat circle, as Rust says, so is television, and it’s good to see it coming back around.

The dreamlife of television

Aaron Paul on Breaking Bad

I’ve been dreaming a lot about Breaking Bad. On Wednesday, my wife and I returned from a trip to Barcelona, where we’d spent a beautiful week: my baby daughter was perfectly happy to be toted around various restaurants, cultural sites, and the Sagrada Familia, and it came as a welcome break from my own work. Unfortunately, it also meant that we were going to miss the Breaking Bad finale, which aired the Sunday before we came home. For a while, I seriously considered bringing my laptop and downloading it while we were out of the country, both because I was enormously anxious to see how the show turned out and because I dreaded the spoilers I’d have to avoid for the three days before we returned. In the end, I gritted my teeth and decided to wait until we got home. This meant avoiding most of my favorite news and pop cultural sites—I was afraid to even glance past the top few headlines on the New York Times—and staying off Twitter entirely, which I suppose wasn’t such a great loss. And even as we toured the Picasso Museum and walked for miles along the marina with a baby in tow, my thoughts were rarely very far from Walter White.

This must have done quite a number on my psyche, because I started dreaming about the show with alarming frequency. My dreams included two separate, highly elaborated versions of the finale, one of which was a straightforward bloodbath with a quiet epilogue, the other a weird metafictional conclusion in which the events of the series were played out on a movie screen with the cast and crew watching them unfold—which led me to exclaim, while still dreaming: “Of course that’s how they would end it!” Now that I’ve finally seen the real finale, the details of these dreams are fading, and only a few scraps of imagery remain. Yet the memories are still emotionally charged, and they undoubtedly affected how I approached the last episode itself, which I was afraid would never live up to the versions I’d dreamed for myself. I suspect that a lot of fans, even those who didn’t actually hallucinate alternate endings, probably felt the same way. (For the record, I liked the finale a lot, even if it ranks a notch below the best episodes of the show, which was always best at creating chaos, not resolving it. And I think about its closing moments almost every day.)

Jon Hamm on Mad Men

And it made me reflect on the ways in which television, especially in its modern, highly serialized form, is so conducive to dreaming. Dreams are a way of assembling and processing fragments of the day’s experience, or recollections from the distant past, and a great television series is nothing less than a vast storehouse of memories from another life. When a show is as intensely serialized as Breaking Bad was, it can be hard to remember individual episodes, aside from the occasional formal standout like “Fly”: I can’t always recall what scenes took place when, or in what order, and an especially charged sequence of installments—like the last half of this final season—tends to blend together into a blur of vivid impressions. What I remember are facial expressions, images, bits of dialogue: “Stay out of my territory.” “Run.” “Tread lightly.” And the result is a mine of moments that end up naturally incorporated into my own subconscious. A good movie or novel exists as a piece, and I rarely find myself dreaming alternate lives for, say, Rick and Ilsa or Charles Foster Kane. With Walter White, it’s easy to imagine different paths that the action could have taken, and those byways play themselves out in the deepest parts of my brain.

Which may explain why television is so naturally drawn to dream sequences and fantasies, which are only one step removed from the supposedly factual events of the shows themselves. Don Draper’s dreams have become a huge part of Mad Men, almost to the point of parody, and this has always been an art form that attracts surreal temperaments, from David Lynch to Bryan Fuller, even if they tend to be destroyed by it. As I’ve often said before, it’s the strangest medium I know, and at its best, it’s the outcome of many unresolved tensions. Television can feel maddeningly real, a hidden part of your own life, which is why it can be so hard to say goodbye to a great show. It’s also impossible to get a lasting grip on it or to hold it all in your mind at once, especially if it runs for more than a few seasons, which hints at an even deeper meaning. I’ve always been struck by how poorly we integrate the different chapters in our own past: there are entire decades of my life that I don’t think about for months on end. When they return, it’s usually in the hours just before waking. And by teaching us to process narratives that can last for years, it’s possible that television subtly trains us to better understand the shapes of our own lives, even if it’s only in dreams.

Written by nevalalee

October 7, 2013 at 8:27 am

Posted in Television

Critical television studies

The cast of Community

Television is such a pervasive medium that it’s easy to forget how deeply strange it is. Most works of art are designed to be consumed all at once, or at least in a fixed period of time—it’s physically possible, if not entirely advisable, to read War and Peace in one sitting. Television, by contrast, is defined by the fact of its indefinite duration. House of Cards aside, it seems likely that most of us will continue to watch shows week by week, year after year, until they become a part of our lives. This kind of extended narrative can be delightful, but it’s also subject to risk. A beloved show can change for reasons beyond anyone’s control. Sooner or later, we find out who killed Laura Palmer. An actor’s contract expires, so Mulder is abducted by aliens, and even if he comes back, by that point, we’ve lost interest. For every show like Breaking Bad that has its dark evolution mapped out for seasons to come, there’s a series like Glee, which disappoints, or Parks and Recreation, which gradually reveals a richness and warmth that you’d never guess from the first season alone. And sometimes a show breaks your heart.

It’s clear at this point that the firing of Dan Harmon from Community was the most dramatic creative upheaval for any show in recent memory. This isn’t the first time that a show’s guiding force has departed under less than amicable terms—just ask Frank Darabont—but it’s unusual in a series so intimately linked to one man’s particular vision. Before I discovered Community, I’d never heard of Dan Harmon, but now I care deeply about what this guy feels and thinks. (Luckily, he’s never been shy about sharing this with the rest of us.) And although it’s obvious from the opening minutes of last night’s season premiere that the show’s new creative team takes its legacy seriously, there’s no escaping the sense that they’re a cover band doing a great job with somebody else’s music. Showrunners David Guarascio and Moses Port do their best to convince us out of the gate that they know how much this show means to us, and that’s part of the problem. Community was never a show about reassuring us that things won’t change, but about unsettling us with its endless transformations, even as it delighted us with its new tricks.

The Community episode "Remedial Chaos Theory"

Don’t get me wrong: I laughed a lot at last night’s episode, and I was overjoyed to see these characters again. By faulting the new staff for repeating the same beats I loved before, when I might have been outraged by any major alterations, I’m setting it up so they just can’t win. But the show seems familiar now in a way that would have seemed unthinkable for most of its first three seasons. Part of the pleasure of watching the series came from the fact that you never knew what the hell might happen next, and it wasn’t clear if Harmon knew either. Not all of his experiments worked: there were even some clunkers, like “Messianic Myths and Ancient Peoples,” in the glorious second season, which is one of my favorite runs of any modern sitcom. But as strange as this might have once seemed, it feels like we finally know what Community is about. It’s a show that takes big formal risks, finds the emotional core in a flurry of pop culture references, and has no idea how to use Chevy Chase. And although I’m grateful that this version of the show has survived, I don’t think I’m going to tune in every week wondering where in the world it will take me.

And the strange thing is that Community might have gone down this path with or without Harmon. When a show needs only two seasons to establish that anything is possible, even the most outlandish developments can seem like variations on a theme. Even at the end of the third season, there was the sense that the series was repeating itself. I loved “Digital Estate Planning,” for instance, but it felt like the latest attempt to do one of the formally ambitious episodes that crop up at regular intervals each season, rather than an idea that forced itself onto television because the writers couldn’t help themselves. In my review of The Master, I noted that Paul Thomas Anderson has perfected his brand of hermetic filmmaking to the point where it would be more surprising if he made a movie that wasn’t ambiguous, frustrating, and deeply weird. Community has ended up in much the same place, so maybe it’s best that Harmon got out when he did. It’s doubtful that the series will ever be able to fake us out with a “Critical Film Studies” again, because it’s already schooled us, like all great shows, in how it needs to be watched. And although its characters haven’t graduated from Greendale yet, its viewers, to their everlasting benefit, already have.

Written by nevalalee

February 8, 2013 at 9:50 am

Wouldn’t it be easier to write for television?

Last week, I had dinner with a college friend I hadn’t seen in years, who is thinking about giving up a PhD in psychology to write for television in Los Angeles. We spent a long time commiserating about the challenges of the medium, at least from a writer’s point of view, hitting many of the points that I’ve discussed here before. With the prospects of a fledgling television show so uncertain, I said, especially when the show might be canceled after four episodes, or fourteen, or forty, it’s all but impossible for the creator to tell effective stories over time. Running a television show is one of the hardest jobs in the world, with countless obstacles along the way, even for critical darlings. Knowing all this, I asked my friend, why did he want to do this in the first place?

My friend’s response was an enlightening one. The trouble with writing novels or short stories, he said, is the fact that the author is expected to spend a great deal of time on description, style, and other tedious elements that a television writer can cheerfully ignore. Teleplays, like feature scripts, are nothing but structure and dialogue (or maybe just structure, as William Goldman says), and there’s something liberating in how they strip storytelling down to its core. The writer takes care of the bones of the narrative, which is where his primary interest presumably lies, then outsources the work of casting, staging, and art direction to qualified professionals who are happy to do the work. And while I didn’t agree with everything my friend said, I could certainly see his point.

Yet that’s only half of the story. It’s true that a screenwriter gets to outsource much of the conventional apparatus of fiction to other departments, but only at the price of creative control. You may have an idea about how a character should look, or what kind of home he should have, or how a moment of dialogue, a scene, or an overall story should unfold, but as a writer, you don’t have much control over the matter. Scripts are easier to write than novels for a reason: they’re only one piece of a larger enterprise, which is reflected in the writer’s relative powerlessness. The closest equivalent to a novelist in television isn’t the writer, but the executive producer. Gene Roddenberry, in The Making of Star Trek, neatly sums up the similarity between the two roles:

Producing in television is like storytelling. The choice of the actor, picking the right costumes, getting the right flavor, the right pace—these are as much a part of storytelling as writing out that same description of a character in a novel.

And the crucial point about producing a television series, like directing a feature film, is that it’s insanely hard. As Thomas Lennon and Robert Ben Garant point out in their surprisingly useful Writing Movies for Fun and Profit, as far as directing is concerned, “If you’re doing it right, it’s not that fun.” As a feature director or television producer, you’re responsible for a thousand small but critical decisions that need to be made very quickly, and while you’re working on the story, you’re also casting parts, scouting for locations, dealing with the studio and the heads of various departments, and surviving on only a few hours of sleep a night, for a year or more of your life. In short, the amount of effort required to keep control of the project is greater, not less, than what is required to write a novel—except with more money on the line, in public, and with greater risk that control will eventually be taken away from you.

So is it easier to write for television? Yes, if that’s all you want to do. But if you want control of your work, if you want your stories to be experienced in a form close to what you originally envisioned, it isn’t easier. It’s much harder. Which is why, to my mind, John Irving still puts it best: “When I feel like being a director, I write a novel.”

Lessons from great (and not-so-great) television

It can be hard for a writer to admit being influenced by television. In On Becoming a Novelist, John Gardner struck a disdainful note that hasn’t changed much since:

Much of the dialogue one encounters in student fiction, as well as plot, gesture, even setting, comes not from life but from life filtered through TV. Many student writers seem unable to tell their own most important stories—the death of a father, the first disillusionment in love—except in the molds and formulas of TV. One can spot the difference at once because TV is of necessity—given its commercial pressures—false to life.

In the nearly thirty years since Gardner wrote these words, the television landscape has changed dramatically, but it’s worth pointing out that much of what he says here is still true. The basic elements of fiction—emotion, character, theme, even plot—need to come from close observation of life, or even the most skillful novel will eventually ring false. That said, the structure of fiction, and the author’s understanding of the possibilities of the form, doesn’t need to come from life alone, and probably shouldn’t. To develop a sense of what fiction can do, a writer needs to pay close attention to all types of art, even the nonliterary kind. And over the past few decades, television has expanded the possibilities of narrative in ways that no writer can afford to ignore.

If you think I’m exaggerating, consider a show like The Wire, which tells complex stories involving a vast range of characters, locations, and social issues in ways that aren’t possible in any other medium. The Simpsons, at least in its classic seasons, acquired a richness and velocity that continued to build for years, until it had populated a world that rivaled the real one for density and immediacy. (Like the rest of the Internet, I respond to most situations with a Simpsons quote.) And Mad Men continues to furnish a fictional world of astonishing detail and charm. World-building, it seems, is where television shines: in creating a long-form narrative that begins with a core group of characters and explores them for years, until they can come to seem as real as one’s own family and friends.

Which is why Glee can seem like such a disappointment. Perhaps because the musical is already the archest of genres, the show has always regarded its own medium with an air of detachment, as if the conventions of the after-school special or the high school sitcom were merely a sandbox in which the producers could play. On some level, this is fine: The Simpsons, among many other great shows, has fruitfully treated television as a place for narrative experimentation. But by turning its back on character continuity and refusing to follow any plot for more than a few episodes, Glee is abandoning many of the pleasures that narrative television can provide. Watching the show run out of ideas for its lead characters in less than two seasons simply serves as a reminder of how challenging this kind of storytelling can be.

Mad Men, by contrast, not only gives us characters who take on lives of their own, but consistently lives up to those characters in its acting, writing, and direction. (This is in stark contrast to Glee, where I sense that a lot of the real action is taking place in fanfic.) And its example has changed the way I write. My first novel tells a complicated story with a fairly controlled cast of characters, but Mad Men—in particular, the spellbinding convergence of plots in “Shut the Door, Have a Seat”—reminded me of the possibilities of expansive casts, which allow characters to pair off and develop in unexpected ways. (The evolution of Christina Hendricks’s Joan from eye candy to second lead is only the most obvious example.) As a result, I’ve tried to cast a wider net with my second novel, using more characters and settings in the hopes that something unusual will arise. Television, strangely, has made me more ambitious. I’d like to think that even John Gardner would approve.

Written by nevalalee

March 17, 2011 at 8:41 am

The cliché factory

A few days ago, Bob Mankoff, the cartoon editor of The New Yorker, devoted his weekly email newsletter to the subject of “The Great Clichés.” A cliché, as Mankoff defines it, is a restricted comic situation “that would be incomprehensible if the other versions had not first appeared,” and he provides a list of examples that should ring bells for all readers of the magazine, from the ubiquitous “desert island” to “The-End-Is-Nigh Guy.” Here are a few of my favorites:

Atlas holding up the world; big fish eating little fish; burglars in masks; cave paintings; chalk outline at crime scene; crawling through desert; galley slaves; guru on mountain; mobsters and victim with cement shoes; man in stocks; police lineup; two guys in horse costume.

Inevitably, Mankoff’s list includes a few questionable choices, while also omitting what seem like obvious contenders. (Why “metal detector,” but not “Adam and Eve?”) But it’s still something that writers of all kinds will want to clip and save. Mankoff doesn’t make the point explicitly, but most gag artists probably keep a similar list of clichés as a starting point for ideas, as we read in Mort Gerberg’s excellent book Cartooning:

List familiar situations—clichés. You might break them down into categories, like domestic (couple at breakfast, couple watching television); business (boss berating employee, secretary taking dictation); historic (Paul Revere’s ride, Washington crossing the Delaware); even famous cartoon clichés (the desert island, the Indian snake charmer)…Then change something a little bit.

As it happened, when I saw Mankoff’s newsletter, I had already been thinking about a far more harmful kind of comedy cliché. Last week, Kal Penn went on Twitter to post some of the scripts from his years auditioning as a struggling actor, and they amount to an alternative list of clichés kept by bad comedy writers, consciously or otherwise: “Gandhi lookalike,” “snake charmer,” “foreign student.” One character has a “slight Hindi accent,” another is a “Pakistani computer geek who dresses like Beck and is in a perpetual state of perspiration,” while a third delivers dialogue that is “peppered with Indian cultural references…[His] idiomatic conversation is hit and miss.” A typical one-liner: “We are propagating like flies on elephant dung.” One script describes a South Asian character’s “spastic techno pop moves,” with Penn adding that “the big joke was an accent and too much cologne.” (It recalls the Morrissey song “Bengali in Platforms,” which included the notorious line: “Life is hard enough when you belong here.” You could amend it to read: “Being a comedy writer is hard enough when you belong here.”) Penn closes by praising shows with writers “who didn’t have to use external things to mask subpar writing,” which cuts to the real issue here. The real person in “a perpetual state of perspiration” isn’t the character, but the scriptwriter. Reading the teleplay for an awful sitcom is a deadening experience in itself, but it’s even more depressing to realize that in most cases, the writer is falling back on a stereotype to cover up the desperate unfunniness of the writing. When Penn once asked if he could play a role without an accent, in order to “make it funny on the merits,” he was told that he couldn’t, probably because everybody else knew that the merits were nonexistent.

So why is one list harmless and the other one toxic? In part, it’s because we’ve caught them at different stages of evolution. The list of comedy conventions that we find acceptable is constantly being culled and refined, and certain art forms are slightly in advance of the others. Because of its cultural position, The New Yorker is particularly subject to outside pressures, as it learned a decade ago with its Obama terrorist cover—which demonstrated that there are jokes and images that aren’t acceptable even if the magazine’s attitude is clear. Turn back the clock, and Mankoff’s list would include conventions that probably wouldn’t fly today. Gerberg’s list, like Penn’s, includes “snake charmer,” which Mankoff omits, and he leaves out “Cowboys and Indians,” a cartoon perennial that seems to be disappearing. And it can be hard to reconstruct this history, because the offenders tend to be consigned to the memory hole. When you read a lot of old magazine fiction, as I do, you inevitably find racist stereotypes that would be utterly unthinkable today, but most of the stories in which they appear have long since been forgotten. (One exception, unfortunately, is the Sherlock Holmes short story “The Adventure of the Three Gables,” which opens with a horrifying racial caricature that most Holmes fans must wish didn’t exist.) If we don’t see such figures as often today, it isn’t necessarily because we’ve become more enlightened, but because we’ve collectively agreed to remove certain figures from the catalog of stock comedy characters, while papering over their use in the past. A list of clichés is a snapshot of a culture’s inner life, and we don’t always like what it says. The demeaning parts still offered to Penn and actors of similar backgrounds have survived for longer than they should have, but sitcoms that trade in such stereotypes will be unwatchable in a decade or two, if they haven’t already been consigned to oblivion.

Of course, most comedy writers aren’t thinking in terms of decades, but about getting through the next five minutes. And these stereotypes endure precisely because they’re seen as useful, in a shallow, short-term kind of way. There’s a reason why such caricatures are more visible in comedy than in drama: comedy is simply harder to write, but we always want more of it, so it’s inevitable that writers on a deadline will fall back on lazy conventions. The really insidious thing about these clichés is that they sort of work, at least to the extent of being approved by a producer without raising any red flags. Any laughter that they inspire is the equivalent of empty calories, but they persist because they fill a cynical need. As Penn points out, most writers wouldn’t bother with them at all if they could come up with something better. Stereotypes, like all clichés, are a kind of fallback option, a cheap trick that you deploy if you need a laugh and can’t think of another way to get one. Clichés can be a precious commodity, and all writers resort to them occasionally. They’re particularly valuable for gag cartoonists, who can’t rely on a good idea from last week to fill the blank space on the page—they’ve got to produce, and sometimes that means yet another variation on an old theme. But there’s a big difference between “Two guys in a horse costume” and “Gandhi lookalike.” Being able to make that distinction isn’t a matter of political correctness, but of craft. The real solution is to teach people to be better writers, so that they won’t even be tempted to resort to such tired solutions. This might seem like a daunting task, but in fact, it happens all the time. A cliché factory operates on the principle of supply and demand. And it shuts down as soon as people no longer find it funny.

Written by nevalalee

March 20, 2017 at 11:18 am

A series of technical events

In his book Four Arguments for the Elimination of Television, which was first published in the late seventies, the author Jerry Mander, a former advertising executive, lists a few of the “technical tricks” that television can use to stimulate the viewer’s interest:

Editors make it possible for a scene in one room to be followed instantly by a scene in another room, or at another time, or another place. Words appear over the images. Music rises and falls in the background. Two images or three can appear simultaneously. One image can be superposed on another on the screen. Motion can be slowed down or sped up.

These days, we take most of these effects for granted, as part of the basic grammar of the medium, but to Mander, they’re something more sinister. Technique, he argues, is replacing content, and at its heart, it’s something of a confidence game:

Through these technical events, television images alter the usual, natural imagery possibilities, taking on the quality of a naturally highlighted event. They make it seem that what you are looking at is unique, unusual, and extraordinary…But nothing unusual is going on. All that’s happening is that the viewer is watching television, which is the same thing that happened an hour ago, or yesterday. A trick has been played. The viewer is fixated by a conspiracy of dimmed-out environments combined with an artificial, impossible, fictitious unusualness.

In order to demonstrate “the extent to which television is dependent upon technical tricks to maintain your interest,” Mander invites the reader to conduct what he calls a technical events test:

Put on your television set and simply count the number of times there is a cut, a zoom, a superimposition, a voiceover, the appearance of words on the screen—a technical event of some kind…Each technical event—each alteration of what would be natural imagery—is intended to keep your attention from waning as it might otherwise…Every time you are about to relax your attention, another technical event keeps you attached…

You will probably find that in the average commercial television program, there are eight or ten technical events for every sixty-second period…You may also find that there is rarely a period of twenty seconds without any sort of technical event at all. That may give you an idea of the extent to which producers worry about whether the content itself can carry your interest.

He goes on to list the alleged consequences of exposure to such techniques, from shortened attention span in adults to heightened hyperactivity in children, and concludes: “Advertisers are the high artists of the medium. They have gone further in the technologies of fixation than anyone else.”

Mander’s argument was prophetic in many ways, but in one respect, he was clearly wrong. In the four decades since his book first appeared, it has become obvious that the “high artists” of distraction and fixation aren’t advertisers, but viewers themselves, and its true canvas isn’t television, but the Internet. Instead of passively viewing a series of juxtaposed images, we assemble our online experience for ourselves, and each time we open a new link, we’re effectively acting as our own editors. Every click is a cut. (The anecdotal figure that the reader spends less than fifteen seconds on the average web page is very close to the frequency of technical events on television, which isn’t an accident.) We do a better job of distracting ourselves than any third party ever could, as long as we’re given sufficient raw material and an intuitive interface—which explains much of the evolution of online content. When you look back at web pages from the early nineties, it’s easy to laugh at how noisy and busy they tended to be, with music, animated graphics, and loud colors. This wasn’t just a matter of bad taste, but of a mistaken analogy to television. Web designers thought that they had to grab our attention using the same technical tricks employed by other media, but that wasn’t the case. The hypnotic browsing state that we’ve all experienced isn’t produced by any one page, but by the succession of similar pages as the user moves between them at his or her private rhythm. Ideally, from the point of view of a media company, that movement will take place within the same family of pages, but it also leads to a convergence of style and tone between sites. Most web pages these days look more or less the same because it creates a kind of continuity of experience. Instead of the loud, colorful pages of old, they’re static and full of white space. Mander calls this “the quality of even tone” of television, and the Internet does it one better. It’s uniform and easily aggregated, and you can cut it together however you like, like yard goods.

In fact, it isn’t content that gives us the most pleasure, but the act of clicking, with the sense of control it provides. This implies that bland, interchangeable content is actually preferable to more arresting material. The easier it is to move between basically similar units, the closer the experience is to that of an ideally curated television show—which is why different sources have a way of blurring together into the same voice. When I’m trying to tell my wife about a story I read online, I often have trouble remembering if I read it on Vox, Vulture, or Vice, which isn’t a knock against those sites, but a reflection of the unconscious pressure to create a seamless browsing experience. From there, it’s only a short step to outright content mills and fake news. In the past, I’ve called this AutoContent, after the interchangeable bullet points used to populate slideshow presentations, but it’s only effective if you can cut quickly from one slide to another. If you had to stare at it for longer than fifteen seconds, you wouldn’t be able to stand it. (This may be why we’ve come to associate quality with length, which is more resistant to being reduced to the filler between technical events. The “long read,” as I’ve argued elsewhere, can be a marketing category in itself, but it does need to try a little harder.) The idea that browsing online is a form of addictive behavior isn’t a new one, of course, and it’s often explained in terms of the “random rewards” that the brain receives when we check email or social media. But the notion of online content as a convenient source of technical events is worth remembering. When we spend any period of time online, we’re essentially watching a television show while simultaneously acting as its editor and director, and often as its writer and actors. In the end, to slightly misquote Mander, all that’s happening is that the reader is seated in front of a computer or looking at a phone, “which is the same thing that happened an hour ago, or yesterday.” The Internet is better at this than television ever was. And in a generation or two, it may result in television being eliminated after all.

Written by nevalalee

March 14, 2017 at 9:18 am

Farewell to Mystic Falls

Note: Spoilers follow for the series finale of The Vampire Diaries.

On Friday, I said goodbye to The Vampire Diaries, a series that I once thought was one of the best genre shows on television, only to stop watching it for its last two seasons. Despite its flaws, it occupies a special place in my memory, in part because its strengths were inseparable from the reasons that I finally abandoned it. Like Glee, The Vampire Diaries responded to its obvious debt to an earlier franchise—High School Musical for the former, Twilight for the latter—both by subverting its predecessor and by burning through ideas as relentlessly as it could. It’s as if both shows decided to refute any accusations of unoriginality by proving that they could be more ingenious than their inspirations, and amazingly, it sort of worked, at least for a while. There’s a limit to how long any series can repeatedly break down and reassemble itself, however, and both started to lose steam after about three years. In the case of The Vampire Diaries, its problems crystallized around its ostensible lead, Elena Gilbert, as portrayed by the game and talented Nina Dobrev, who left the show two seasons ago before returning for an encore in the finale. Elena spent most of her first sendoff asleep, and she isn’t given much more to do here. There’s a lot about the episode that I liked, and it provides satisfying moments of closure for many of its characters, but Elena isn’t among them. In the end, when she awakens from the magical coma in which she has been slumbering, it’s so anticlimactic that it reminds me of what Pauline Kael wrote of Han’s revival in Return of the Jedi: “It’s as if Han Solo had locked himself in the garage, tapped on the door, and been let out.”

And what happened to Elena provides a striking case study of why the story’s hero is often fated to become the least interesting person in sight. The main character of a serialized drama is under such pressure to advance the plot that he or she becomes reduced to the diagram of a pattern of forces, like one of the fish in D’Arcy Wentworth Thompson’s On Growth and Form, in which the animal’s physical shape is determined by the outside stresses to which it has been subjected. Instead of making her own decisions, Elena was obliged to become whatever the series needed her to be. Every protagonist serves as a kind of motor for the story, which is frequently a thankless role, but it was particularly problematic on a show that defined itself by its willingness to burn through a year of potential storylines each month. Every episode felt like a season finale, and characters were freely killed, resurrected, and brainwashed to keep the wheels turning. It was hardest on Elena, who, at her best, was a compelling, resourceful heroine. After six seasons of personality changes, possessions, memory wipes, and the inexplicable choices that she made just because the story demanded it, she became an empty shell. If you were designing a show in a laboratory to see what would happen if its protagonist was forced to live through plot twists at an accelerated rate, like the stress tests that engineers use to put a component through a lifetime’s worth of wear in a short period of time, you couldn’t do much better than The Vampire Diaries. And while it might have been theoretically interesting to see what happened to the series after that one piece was removed, I didn’t think it was worth sitting through another two seasons of increasingly frustrating television.

After the finale was shot, series creators Kevin Williamson and Julie Plec made the rounds of interviews to discuss the ending, and they shared one particular detail that fascinates me. If you haven’t watched The Vampire Diaries, all you need to know is that its early seasons revolved around a love triangle between Elena and the vampire brothers Stefan and Damon, a nod to Twilight that quickly became one of the show’s least interesting aspects. Elena seemed fated to end up with Stefan, but she spent the back half of the series with Damon, and it ended with the two of them reunited. In a conversation with Deadline, Williamson revealed that this wasn’t always the plan:

Well, I always thought it would be Stefan and Elena. They were sort of the anchor of the show, but because we lost Elena in season six, we couldn’t go back. You know Nina could only come back for one episode—maybe if she had came back for the whole season, we could even have warped back towards that, but you can’t just do it in forty-two minutes.

Dobrev’s departure, in other words, froze that part of the story in place, even as the show around it continued its usual frantic developments, and when she returned, there wasn’t time to do anything but keep Elena and Damon where they had left off. There’s a limit to how much ground you can cover in the course of a single episode, so it seemed easier for the producers to stick with what they had and figure out a way to make it seem inevitable.

The fact that it works at all is a tribute to the skill of the writers and cast, as well as to the fact that the whole love triangle was basically arbitrary in the first place. As James Joyce said in a very different context, it was a bridge across which the characters could walk, and once they were safely on the other side, it could be blown to smithereens. The real challenge was how to make the finale seem like a definitive ending, after the show had killed off and resurrected so many characters that not even death itself felt like a conclusion. It resorted to much the same solution that Lost did when faced with a similar problem: it shut off all possibility of future narrative by reuniting its characters in heaven. This is partially a form of wish fulfillment, as we’ve seen with so many other television series, but it also puts a full stop on the story by leaving us in an afterlife, where, by definition, nothing can ever change. It’s hilariously unlike the various versions of the world to come that the series has presented over the years, from which characters can always be yanked back to life when necessary, but it’s also oddly moving and effective. Watching it, I began to appreciate how the show’s biggest narrative liability—a cast that just can’t be killed—also became its greatest asset. The defining image of The Vampire Diaries was that of a character who has his neck snapped, and then just shakes it off. Williamson and Plec must have realized, consciously or otherwise, that it was a reset button that would allow them to go through more ideas than would be possible on a show on which a broken neck was permanent. Every denizen of Mystic Falls got a great death scene, often multiple times per season, and the show exploited that freedom until it exhausted itself. It only really worked for three years out of eight, but it was a great run while it lasted. And now, after life’s fitful fever, the characters can sleep well, as they sail off into the mystic.

From Sputnik to WikiLeaks

with 2 comments

In Toy Story 2, there’s a moment in which Woody discovers that his old television series, Woody’s Roundup, was abruptly yanked off the air toward the end of the fifties. He asks: “That was a great show. Why cancel it?” The Prospector replies bitterly: “Two words: Sput-nik. Once the astronauts went up, children only wanted to play with space toys.” And while I wouldn’t dream of questioning the credibility of a man known as Stinky Pete, I feel obliged to point out that his version of events isn’t entirely accurate. The space craze among kids really began more than half a decade earlier, with the premiere of Tom Corbett, Space Cadet, and the impact of Sputnik on science fiction was far from a positive one. Here’s what John W. Campbell wrote about it in the first issue of Astounding to be printed after the satellite’s launch:

Well, we lost that race; Russian technology achieved an important milestone in human history—one that the United States tried for, talked about a lot, and didn’t make…One of the things Americans have long been proud of—and with sound reason—is our ability to convert theoretical science into practical, working engineering…This time we’re faced with the uncomfortable realization that the Russians have beaten us in our own special field; they solved a problem of engineering technology faster and better than we did.

And while much of the resulting “Sputnik crisis” was founded on legitimate concerns—Sputnik was as much a triumph of ballistic rocketry as it was of satellite technology—it also arose from the notion that the United States had been beaten at its own game. As Arthur C. Clarke is alleged to have said, America had become “a second-rate power.”

Campbell knew right away that he had reason to worry. Lester del Rey writes in The World of Science Fiction:

Sputnik simply convinced John Campbell that he’d better watch his covers and begin cutting back on space scenes. (He never did, but the art director of the magazine and others were involved in that decision.) We agreed in our first conversation after the satellite went up that people were going to react by deciding science had caught up with science fiction, and with a measure of initial fear. They did. Rather than helping science fiction, Sputnik made it seem outmoded.

And that’s more or less exactly what happened. There was a brief spike in sales, followed by a precipitous fall as mainstream readers abandoned the genre. I haven’t been able to find specific numbers for this period, but one source, the Australian fan Wynne Whitford, states that the circulation of Astounding fell by half after Sputnik—which seems high, but probably reflects a real decline. In a letter written decades later, Campbell said of Sputnik: “Far from encouraging the sales of science fiction magazines—half the magazines being published lost circulation so drastically they went out of business!” An unscientific glance at a list of titles appears to support this. In 1958, the magazines Imagination, Imaginative Tales, Infinity Science Fiction, Phantom, Saturn, Science Fiction Adventures, Science Fiction Quarterly, Star Science Fiction, and Vanguard Science Fiction all ceased publication, followed by three more over the next twelve months. The year before, just four magazines had folded. There was a bubble, and after Sputnik, it burst.

At first, this might seem like a sort of psychological self-care, of the same kind that motivated me to scale back my news consumption after the election. Americans were simply depressed, and they didn’t need any reminders of the situation they were in. But it also seems to have affected the public’s appetite for science fiction in particular, rather than science as a whole. In fact, the demand for nonfiction science writing actually increased. As Isaac Asimov writes in his memoir In Joy Still Felt:

The United States went into a dreadful crisis of confidence over the fact that the Soviet Union had gotten there first and berated itself for not being interested enough in science. And I berated myself for spending too much time on science fiction when I had the talent to be a great science writer…Sputnik also served to increase the importance of any known public speaker who could talk on science and, particularly, on space, and that meant me.

What made science fiction painful to read, I think, was its implicit assumption of American superiority, which had been disproven so spectacularly. Campbell later compared it to the reaction after the bomb fell, claiming that it was the moment when people realized that science fiction wasn’t a form of escapism, but a warning:

The reactions to Sputnik have been more rapid, and, therefore, more readily perceptible and correlatable. There was, again, a sudden rise in interest in science fiction…and there is, now, an even more marked dropping of the science-fiction interest. A number of the magazines have been very heavily hit…I think the people of the United States thought we were kidding.

And while Campbell seemed to believe that readers had simply misinterpreted science fiction’s intentions, the conventions of the genre itself clearly bore part of the blame.

In his first editorials after Sputnik, Campbell drew a contrast between the American approach to engineering, which proceeded logically and with vast technological resources, and the quick and dirty Soviet program, which was based on rules of thumb, trial and error, and the ability to bull its way through on one particular point of attack. It reminds me a little of the election. Like the space race, last year’s presidential campaign could be seen as a kind of proxy war between the American and Russian administrations, and regardless of what you believe about the Trump camp’s involvement, which I suspect was probably a tacit one, there’s no question as to which side Putin favored. On one hand, you had a large, well-funded political machine, and on the other, one that often seemed comically inept. Yet it was the quick and dirty approach that triumphed. “The essence of ingenuity is the ability to get precision results without precision equipment,” Campbell wrote, and that’s pretty much what occurred. A few applications of brute force in the right place made all the difference, and they were aided, to some extent, by a similar complacency. The Americans saw the Soviets as bunglers, and they never seriously considered the possibility that they might be beaten by a bunch of amateurs. As Campbell put it: “We earned what we got—fully, and of our own efforts. The ridicule we’ve collected is our just reward for our consistent efforts.” Sometimes I feel the same way. Right now, we’re entering a period in which the prospect of becoming a second-rate power is far more real than it was when Clarke made his comment. It took a few months for the implications of Sputnik to really sink in. And if history is any indication, we haven’t even gotten to the crisis yet.

Who we are in the moment

with 57 comments

Jordan Horowitz and Barry Jenkins

By now, you’re probably sick of hearing about what happened at the Oscars. I’m getting a little tired of it, too, even though it was possibly the strangest and most riveting two minutes I’ve ever seen on live television. It left me feeling sorry for everyone involved, but there are at least three bright spots. The first is that it’s going to make a great case study for somebody like Malcolm Gladwell, who is always looking for a showy anecdote to serve as a grabber opening for a book or article. So many different things had to go wrong for it to happen—on the levels of design, human error, and simple dumb luck—that you can use it to illustrate just about any point you like. A second silver lining is that it highlights the basically arbitrary nature of all such awards. As time passes, the list of Best Picture winners starts to look inevitable, as if Cimarron and Gandhi and Chariots of Fire had all been canonized by a comprehensible historical process. If anything, the cycle of inevitability is accelerating, so that within seconds of any win, the narratives are already locking into place. As soon as La La Land was announced as the winner, a story was emerging about how Hollywood always goes for the safe, predictable choice. The first thing that Dave Itzkoff, a very smart reporter, posted on the New York Times live chat was: “Of course.” Within a couple of minutes, however, that plot line had been yanked away and replaced with one for Moonlight. And the fact that the two versions were all but superimposed onscreen should warn us against reading too much into outcomes that could have gone any number of ways.

But what I want to keep in mind above all else is the example of La La Land producer Jordan Horowitz, who, at a moment of unbelievable pressure, simply said: “I’m going to be really proud to hand this to my friends from Moonlight.” It was the best thing that anybody could have uttered under those circumstances, and it tells us a lot about Horowitz himself. If you were going to design a psychological experiment to test a subject’s reaction under the most extreme conditions imaginable, it’s hard to think of a better one—although it might strike a grant committee as possibly too expensive. It takes what is undoubtedly one of the high points of someone’s life and twists it instantly into what, if perhaps not the worst moment, at least amounts to a savage correction. Everything that the participants onstage did or said, down to the facial expressions of those standing in the background, has been subjected to a level of scrutiny worthy of the Zapruder film. At the end of an event in which very little occurs that hasn’t been scripted or premeditated, a lot of people were called upon to figure out how to act in real time in front of an audience of hundreds of millions. It’s proverbial that nobody tells the truth in Hollywood, an industry that inspires insider accounts with titles like Hello, He Lied and Which Lie Did I Tell? A mixup like the one at the Oscars might have been expressly conceived as a stress test to bring out everyone’s true colors. Yet Horowitz said what he did. And I suspect that it will do more for his career than even an outright win would have accomplished.

Kellyanne Conway

It also reminds me of other instances over the last year in which we’ve learned exactly what someone thinks. When we get in trouble for a remark picked up on a hot mike, we often say that it doesn’t reflect who we really are—which is just another way of stating that it doesn’t live up to the versions of ourselves that we create for public consumption. It’s far crueler, but also more convincing, to argue that it’s exactly in those unguarded, unscripted moments that our true selves emerge. (Freud, whose intuition on such matters was uncanny, was onto something when he focused on verbal mistakes and slips of the tongue.) The justifications that we use are equally revealing. Maybe we dismiss it as “locker room talk,” even if it didn’t take place anywhere near a locker room. Kellyanne Conway excused her reference to the nonexistent Bowling Green Massacre by saying “I misspoke one word,” even though she misspoke it on three separate occasions. It doesn’t even need to be something said on the spur of the moment. At his confirmation hearing for the position of ambassador to Israel, David M. Friedman apologized for an opinion piece he had written before the election: “These were hurtful words, and I deeply regret them. They’re not reflective of my nature or my character.” Friedman also said that “the inflammatory rhetoric that accompanied the presidential campaign is entirely over,” as if it were an impersonal force that briefly took possession of its users and then departed. We ask to be judged on our most composed selves, not the ones that we reveal at our worst.

To some extent, that’s a reasonable request. I’ve said things in public and in private that I’ve regretted, and I wouldn’t want to be judged solely on my worst moments as a writer or parent. At a time when a life can be ruined by a single tweet, it’s often best to err on the side of forgiveness, especially when there’s any chance of misinterpretation. But there’s also a place for common sense. You don’t refer to an event as a “massacre” unless you really think of it that way or want to encourage others to do so. And we judge our public figures by what they say when they think that nobody is listening, or when they let their guard down. It might seem like an impossibly high standard, but it’s also the one that’s effectively applied in practice. You can respond by becoming inhumanly disciplined, like Obama, who in a decade of public life has said maybe five things he has reason to regret. Or you can react like Trump, who says five regrettable things every day and trusts that their sheer volume will reduce them to a kind of background noise—which has awakened us, as Trump has in so many other ways, to a political option that we didn’t even know existed. Both strategies are exhausting, and most of us don’t have the energy to pursue either path. Instead, we’re left with the practical solution of cultivating the inner voice that, as I wrote last week, allows us to act instinctively. Kant writes: “Live your life as though your every act were to become a universal law.” Which is another way of saying that we should strive to be the best version of ourselves at all times. It’s probably impossible. But it’s easier than wearing a mask.

Written by nevalalee

February 28, 2017 at 9:00 am

Swallowing the turkey

with 2 comments

Benjamin Disraeli

Lord Rowton…says that he once asked Disraeli what was the most remarkable, the most self-sustained and powerful sentence he knew. Dizzy paused for a moment, and then said, “Sufficient unto the day is the evil thereof.”

—Augustus J.C. Hare, The Story of My Life

Disraeli was a politician and a novelist, which is an unusual combination, and he knew his business. Politics and writing have less to do with each other than a lot of authors might like to believe, and the fact that you can create a compelling world on paper doesn’t mean that you can do the same thing in real life. (One of the hidden themes of Astounding is that the skills that many science fiction writers acquired in organizing ideas on the page turned out to be notably inadequate when it came to getting anything done during World War II.) Yet both disciplines can be equally daunting and infuriating to novices, in large part because they both involve enormously complicated projects—often requiring years of effort—that need to be approached one day at a time. A single day’s work is rarely very satisfying in itself, and you have to cling to the belief that countless invisible actions and compromises will somehow result in something real. It doesn’t always happen, and even if it does, you may never get credit or praise. The ability to deal with the everyday tedium of politics or writing is what separates professionals from amateurs. And in both cases, the greatest accomplishments are usually achieved by freaks who can combine an overarching vision with a finicky obsession with minute particulars. As Eugène-Melchior de Vogüé, who was both a diplomat and literary critic, said of Tolstoy, it requires “a queer combination of the brain of an English chemist with the soul of an Indian Buddhist.”

And if you go into either field without the necessary degree of patience, the results can be unfortunate. If you’re a writer who can’t subordinate yourself to the routine of writing on a daily basis, the most probable outcome is that you’ll never finish your novel. In politics, you end up with something very much like what we’ve all observed over the last few weeks. Regardless of what you might think about the presidential refugee order, its rollout was clearly botched, thanks mostly to a president and staff that want to skip over all the boring parts of governing and get right to the good stuff. And it’s tempting to draw a contrast between the incumbent, who achieved his greatest success on reality television, and his predecessor, a detail-oriented introvert who once thought about becoming a novelist. (I’m also struck, yet again, by the analogy to L. Ron Hubbard. He spent most of his career fantasizing about a life of adventure, but when he finally got into the Navy, he made a series of stupid mistakes—including attacking two nonexistent submarines off the coast of Oregon—that ultimately caused him to be stripped of his command. The pattern repeated itself so many times that it hints at a fundamental aspect of his personality. He was too impatient to deal with the tedious reality of life during wartime, which failed to live up to the version he had dreamed of himself. And while I don’t want to push this too far, it’s hard not to notice the difference between Hubbard, who cranked out his fiction without much regard for quality, and Heinlein, a far more disciplined writer who was able to consciously tame his own natural impatience into a productive role at the Philadelphia Navy Yard.)

R.H. Blyth

Which brings us back to the sentence that impressed Disraeli. It’s easy to interpret it as an admonition not to think about the future, which isn’t quite right. We can start by observing that it comes at the end of what The Five Gospels notes is possibly “the longest connected discourse that can be directly attributed to Jesus.” It’s the one that asks us to consider the birds of the air and the lilies of the field, which, for a lot of us, prompts an immediate flashback to The Life of Brian. (“Consider the lilies?” “Uh, well, the birds, then.” “What birds?” “Any birds.” “Why?” “Well, have they got jobs?”) But whether or not you agree with the argument, it’s worth noticing that the advice to focus on the evils of each day comes only after an extended attempt at defining a larger set of values—what matters, what doesn’t, and what, if anything, you can change by worrying. You’re only in a position to figure out how best to spend your time after you’ve considered the big questions. As the physician William Osler put it:

[My ideal is] to do the day’s work well and not to bother about tomorrow. You may say that is not a satisfactory ideal. It is; and there is not one which the student can carry with him into practice with greater effect. To it more than anything else I owe whatever success I have had—to this power of settling down to the day’s work and trying to do it well to the best of my ability, and letting the future take care of itself.

This has important implications for both writers and politicians, as well as for progressives who wonder how they’ll be able to get through the next twenty-four hours, much less the next four years. When you’re working on any important project, even the most ambitious agenda comes down to what you’re going to do right now. In On Directing Film, David Mamet expresses it rather differently:

Now, you don’t eat a whole turkey, right? You take off the drumstick and you take a bite of the drumstick. Okay. Eventually you get the whole turkey done. It’ll probably get dry before you do, unless you have an incredibly good refrigerator and a very small turkey, but that is outside the scope of this lecture.

A lot of frustration in art, politics, and life in general comes from attempting to swallow the turkey in one bite. Jesus, I think, was aware of the susceptibility of his followers to grandiose but meaningless gestures, which is why he offered up the advice, so easy to remember and so hard to follow, to simultaneously focus on the given day while keeping the kingdom of heaven in mind. Nearly every piece of practical wisdom in any field is about maintaining that double awareness. Fortunately, it goes in both directions: small acts of discipline aid us in grasping the whole, and awareness of the whole tells us what to do in the moment. As R.H. Blyth says of Zen: “That is all religion is: eat when you are hungry, sleep when you are tired.” And don’t try to eat the entire turkey at once.

From Xenu to Xanadu

leave a comment »

L. Ron Hubbard

I do know that I could form a political platform, for instance, which would encompass the support of the unemployed, the industrialist and the clerk and day laborer all at one and the same time. And enthusiastic support it would be.

L. Ron Hubbard, in a letter to his wife Polly, October 1938

Yesterday, my article “Xenu’s Paradox: The Fiction of L. Ron Hubbard and the Making of Scientology” was published on Longreads. I’d been working on this piece, off and on, for the better part of a year, almost from the moment I knew that I was going to be writing the book Astounding. As part of my research, I had to read just about everything Hubbard ever wrote in the genres of science fiction and fantasy, and I ended up working my way through well over a million words of his prose. The essay that emerged from this process was inspired by a simple question. Hubbard clearly didn’t much care for science fiction, and he wrote it primarily for the money. Yet when the time came to invent a founding myth for Scientology, he turned to the conventions of space opera, which had previously played a minimal role in his work. Both his critics and his followers have looked hard at his published stories to find hints of the ideas to come, and there are a few that seem to point toward later developments. (One that frequently gets mentioned is “One Was Stubborn,” in which a fake religious messiah convinces people to believe in the nonexistence of matter so that he can rule the universe. There’s circumstantial evidence, however, that the premise came mostly from John W. Campbell, and that Hubbard wrote it up on the train ride home from New York to Puget Sound.) Still, it’s a tiny fraction of the whole. And such stories by other writers as “The Double Minds” by Campbell, “Lost Legacy” by Robert A. Heinlein, and The World of Null-A by A.E. van Vogt make for more compelling precursors to dianetics than anything Hubbard ever wrote.

The solution to the mystery, as I discuss at length in the article, is that Hubbard tailored his teachings to the small circle of followers he had available after his blowup with Campbell, many of whom were science fiction fans who owed their first exposure to his ideas to magazines like Astounding. And this was only the most dramatic and decisive instance of a pattern that is visible throughout his life. Hubbard is often called a fabulist who compulsively embellished his own accomplishments and turned himself into something more than he really was. But it would be even more accurate to say that Hubbard transformed himself into whatever he thought the people around him wanted him to be. When he was hanging out with members of the Explorers Club, he became a barnstormer, world traveler, and intrepid explorer of the Caribbean and Alaska. Around his fellow authors, he presented himself as the most productive pulp writer of all time, inflating his already impressive word count to a ridiculous extent. During the war, he spun stories about his exploits in battle, claiming to have been repeatedly sunk and wounded, and even a former naval officer as intelligent and experienced as Heinlein evidently took him at his word. Hubbard simply became whatever seemed necessary at the time—as long as he was the most impressive man in the room. It wasn’t until he found himself surrounded by science fiction fans, whom he had mostly avoided until then, that he assumed the form that he would take for the rest of his career. He had never been interested in past lives, but many of his followers were, and the memories that they were “recovering” in their auditing sessions were often colored by the imagery of the stories they had read. And Hubbard responded by coming up with the grandest, most unbelievable space opera saga of them all.

Donald Trump

This leaves us with a few important takeaways. The first is that Hubbard, in the early days, was basically harmless. He had invented a colorful background for himself, but he wasn’t alone: Lester del Rey, among others, seems to have engaged in the same kind of self-mythologizing. His first marriage wasn’t a happy one, and he was always something of a blowhard, determined to outshine everyone he met. Yet he also genuinely impressed John and Doña Campbell, Heinlein, Asimov, and many other perceptive men and women. It wasn’t until after the unexpected success of dianetics that he grew convinced of his own infallibility, casting off such inconvenient collaborators as Campbell and Joseph Winter as obstacles to his power. Even after he went off to Wichita with his remaining disciples, he might have become little more than a harmless crank. As he began to feel persecuted by the government and professional organizations, however, his mood curdled into something poisonous, and it happened at a time in which he had undisputed authority over the people around him. It wasn’t a huge kingdom, but because of its isolation—particularly when he was at sea—he was able to exercise a terrifying amount of control over his closest followers. Hubbard didn’t even enjoy it. He had wealth, fame, and the adulation of a handful of true believers, but he grew increasingly paranoid and miserable. At the time of his death, his wrath was restricted to his critics and to anyone within arm’s reach, but he created a culture of oppression that his successor cheerfully extended against current and former members in faraway places, until no one inside or outside the Church of Scientology was safe.

I wrote the first draft of this essay in May of last year, but it’s hard to read it now without thinking of Donald Trump. Like Hubbard, Trump spent much of his life as an annoying but harmless windbag: a relentless self-promoter who constantly inflated his own achievements. As with Hubbard, everything that he did had to be the biggest and best, and until recently, he was too conscious of the value of his own brand to risk alienating too many people at once. After a lifetime of random grabs for attention, however, he latched onto a cause—the birther movement—that was more powerful than anything he had encountered before, and, like Hubbard, he began to focus on the small number of passionate followers he had attracted. His presidential campaign seems to have been conceived as yet another form of brand extension, culminating in the establishment of a Trump Television network. He shaped his message in response to the crowds who came to his rallies, and before long, he was caught in the same kind of cycle: a man who had once believed in nothing but himself gradually came to believe his own words. (Hubbard and Trump have both been described as con men, but the former spent countless hours auditing himself, and Trump no longer seems conscious of his own lies.) Both fell upward into positions of power that exceeded their wildest expectations, and it’s frightening to consider what might come next, when we consider how Hubbard was transformed. During his lifetime, Hubbard had a small handful of active followers; the Church of Scientology has perhaps 30,000, although, like Trump, they’re prone to exaggerate such numbers; Trump has millions. It’s especially telling that both Hubbard and Trump loved Citizen Kane. I love it, too. But both men ended up in their own personal Xanadu. And as I’ve noted before, the only problem with that movie is that our affection for Orson Welles distracts us from the fact that Kane ultimately went crazy.

Don’t stay out of Riverdale

leave a comment »

Riverdale

In the opening seconds of the series premiere of Riverdale, a young man speaks quietly in voiceover, his words playing over idyllic shots of American life:

Our story is about a town, a small town, and the people who live in the town. From a distance, it presents itself like so many other small towns all over the world. Safe. Decent. Innocent. Get closer, though, and you start seeing the shadows underneath. The name of our town is Riverdale.

Much later, we realize that the speaker is Jughead of Archie Comics fame, played by former Disney child star Cole Sprouse, which might seem peculiar enough in itself. But what I noticed first about this monologue is that it basically summarizes the prologue of Blue Velvet, which begins with images of roses and picket fences and then dives into the grass, revealing the insects ravening like feral animals in the darkness. It’s one of the greatest declarations of intent in all of cinema, and initially, there’s something a little disappointing in the way that Riverdale feels obliged to blandly state what Lynch put into a series of unforgettable images. Yet I have the feeling that series creator Roberto Aguirre-Sacasa, who says that Blue Velvet is one of his favorite movies, knows exactly what he’s doing. And the result promises to be more interesting than even he can anticipate.

Riverdale has been described as The O.C. meets Twin Peaks, which is how it first came to my attention. But it’s also a series on the CW, with all the good, the bad, and the lack of ugly that this implies. This is the network that produced The Vampire Diaries, the first three seasons of which unexpectedly generated some of my favorite television from the last few years, and it takes its genre shows very seriously. There’s a fascinating pattern at work within systems that produce such narratives on a regular basis, whether in pulp magazines or comic books or exploitation pictures: as long as you hit all the obligatory notes and come in under budget, you’re granted a surprising amount of freedom. The CW, like its predecessors, has become an unlikely haven for auteurs, and it’s the sort of place where a showrunner like Aguirre-Sacasa—who has an intriguing background in playwriting, comics, and television—can explore a sandbox like this for years. Yet it also requires certain heavy, obvious beats, like structural supports, to prop up the rest of the edifice. A lot of the first episode of Riverdale, like most pilots, is devoted to setting up its premise and characters for even the most distracted viewers, and it can be almost insultingly on the nose. It’s why it feels obliged to spell out its theme of dark shadows beneath its sunlit surfaces, which isn’t exactly hard to grasp. As Roger Ebert wrote decades ago in his notoriously indignant review of Blue Velvet: “What are we being told? That beneath the surface of Small Town, U.S.A., passions run dark and dangerous? Don’t stop the presses.”

Blue Velvet

As a result, if you want to watch Riverdale at all, you need to get used to being treated occasionally as if you were twelve years old. But Aguirre-Sacasa seems determined to have it both ways. Like Glee before it, it feels as if it’s being pulled in three different directions even before it begins, but in this case, it comes off less as an unwanted side effect than as a strategy. It’s worth noting that not only did Aguirre-Sacasa write for Glee itself, but he’s also the guy who stepped in to rewrite Spider-Man: Turn Off the Dark, which means that he knows something about wrangling intractable material for a mass audience under enormous scrutiny. (He’s also the chief creative officer of Archie Comics, which feels like a dream job in the best sort of way: one of his projects at the Yale School of Drama was a play about Archie encountering the murderers Leopold and Loeb, and he later received a cease and desist order from his future employer over Archie’s Weird Fantasy, which depicted its lead character as coming out of the closet.) Riverdale often plays like the work of a prodigiously talented writer trying to put his ideas into a form that could plausibly air on Thursdays after Supernatural. Like most shows at this stage, it’s also openly trying to decide what it’s supposed to be about. And I want to believe, on the basis of almost zero evidence, that Aguirre-Sacasa is deliberately attempting something almost unworkable, in hopes that he’ll be able to stick with it long enough—on a network that seems fairly indulgent of shows on the margins—to make it something special.

Most great television results from this sort of evolutionary process, and I’ve noted before—most explicitly in my Salon piece on The X-Files—that the best genre shows emerge when a jumble of inconsistent elements is given the chance to find its ideal form, usually because it lucks into a position where it can play under the radar for years. The pressures of weekly airings, fan response, critical reviews, and ratings, along with the unpredictable inputs of the cast and writing staff, lead to far more rewarding results than even the most visionary showrunner could produce in isolation. Writers of serialized narratives like comic books know this intuitively, and consciously or not, Aguirre-Sacasa seems to be trying something similar on television. It’s not an approach that would make sense for a series like Westworld, which was produced for so much money and with such high expectations that its creators had no choice but to start with a plan. But it might just work on the CW. I’m hopeful that Aguirre-Sacasa and his collaborators will use the mystery at the heart of the series much as Twin Peaks did, as a kind of clothesline on which they can hang a lot of wild experiments, only a certain percentage of which can be expected to work. Twin Peaks itself provides a measure of this method’s limitations: it mutated into something extraordinary, but it didn’t survive the departure of its original creative team. Riverdale feels like an attempt to recreate those conditions, and if it utilizes the Archie characters as its available raw material, well, why not? If Lynch had been able to get the rights, he might have used them, too.

Rob and Betty and Don and Laura

with 3 comments

Laura Petrie and Betty Draper

Mary Tyler Moore was the loveliest woman ever to appear on television, but you can only fully appreciate her charms if you also believe that Dick Van Dyke was maybe the most attractive man. I spent much of my youth obsessed with Rob and Laura Petrie on The Dick Van Dyke Show, which I think is the best three-camera sitcom of all time, and the one that secretly had the greatest impact on my inner life. Along with so much else, it was the first show that seemed to mine comedic and narrative material out of the act of its own creation. Rob was a comedy writer, and thanks to his scenes at the office with Sally and Buddy, I thought for a while I might want to do the same thing. I know now that this wouldn’t be a great job for someone like me, but the image of it is still enticing. What made it so appealing, I’ve come to realize, is that when Rob came home, the show never ended—he was married to a woman who was just as smart, funny, and talented as he was. (Looking at Moore, who was only twenty-four when the series premiered, I’m reminded a little of Debbie Reynolds in Singin’ in the Rain, who effortlessly kept up with her older costars under conditions of enormous pressure.) It was my first and best picture of a life that seemed complete both at work and at home. And the fact that both Moore and Van Dyke seem to have been drinking heavily during the show’s production only points at how difficult it must have been to sustain that dream on camera.

What strikes me the most now about The Dick Van Dyke Show is the uncanny way in which it anticipates the early seasons of Mad Men. In both shows, a husband leaves his idyllic home in Westchester each morning to commute to a creative job in Manhattan, where he brainstorms ideas with his wisecracking colleagues. (Don and Betty lived in Ossining, but the house that was used for exterior shots was in New Rochelle, with Rob and Laura presumably just up the road.) His wife is a much younger knockout—Laura was a former dancer, Betty a model—who seems as if she ought to be doing something else besides watching a precocious kindergartener. The storylines are about evenly divided between the home and the office, and between the two, they give us a fuller portrait of the protagonist than most shows ever do. The influence, I can only assume, was unconscious. We know that Matthew Weiner watched the earlier series, as he revealed in a GQ interview when asked about life in the writers’ room:

We all came up in this system…When I watch The Dick Van Dyke Show, I’m like, Wow, this is the same job. There’s the twelve-year-old kid on the staff. There’s the guy who delivers lunch. I guarantee you I can walk into [another writer’s office] and, except for where the snack room is, it’s gonna be similar on some level.

And I don’t think it’s farfetched to guess that The Dick Van Dyke Show was Weiner’s introduction, as it was for so many of us, to the idea of writing for television in the first place.

Rob Petrie and Don Draper

The more I think about it, the more these two shows feel like mirror images of each other, just as “Don and Betty Draper” and “Rob and Laura Petrie” share the same rhythm. I’m not the first one to draw this connection, but instead of highlighting the obvious contrast between the sunniness of the former and the darkness of the latter, I’d prefer to focus on what they have in common. Both are hugely romantic visions of what it means to be a man who can afford a nice house in Westchester based solely on his ability to pitch better ideas than anybody else. Mad Men succeeds in large part because it manages to have it both ways. The series implicitly rebukes Don’s personal behavior, but it never questions his intelligence or talent. It doesn’t really sour us on advertising, any more than it does on drinking or smoking, and I don’t have any doubt that there are people who will build entire careers around its example. Both shows are the work of auteurs—Carl Reiner and Matt Weiner, whose names actually rhyme—who can’t help but let their joy in their own technical facility seep into the narrative. Rob and Don are veiled portraits of their creators. One is a lot better and the other a whole lot worse, but both amount to alternate lives, enacted for an audience, that reflect the restless activity behind the scenes.

And the real difference between Mad Men and The Dick Van Dyke Show doesn’t have anything to do with the decades in which they first aired, or even with the light and dark halves of the Camelot era that they both evoke. It comes down to the contrast between Laura and Betty—who, on some weird level, seem to represent opposing sides of the public image of Jacqueline Kennedy, and not just because the hairstyles are so similar. Betty was never a match for Don at home, and the only way in which she could win the game, which she did so emphatically, was to leave him altogether. Laura was Rob’s equal, intellectually and comedically, and she fit so well into the craziness at The Alan Brady Show that it wasn’t hard to envision her working there. In some ways, she was limited by her role as a housewife, and she would find her fullest realization in her second life as Mary Richards. But the enormous gap between Rob and Don boils down to the fact that one was married to a full partner and teammate, while the other had to make do with a glacial symbol of his success. When I think of them, I remember two songs. One is “Song of India,” which plays as Betty descends the hotel steps in “For Those Who Think Young,” as Don gazes at her so longingly that he seems to be seeing the ghost of his own marriage. The other is “Mountain Greenery,” which Rob and Laura sing at a party at their house, in a scene that struck me as contrived even at the time. Were there ever parties like this? It doesn’t really matter. Because I can’t imagine Don and Betty doing anything like it.

Written by nevalalee

January 26, 2017 at 9:05 am

Listening to “Retention,” Part 3

leave a comment »

Retention

Note: I’m discussing the origins of “Retention,” the episode that I wrote for the audio science fiction anthology series The Outer Reach. It’s available for streaming here on the Howl podcast network, and you can get a free month of access by using the promotional code REACH.

One of the unsung benefits of writing for film, television, or radio is that it requires the writer to conform to a fixed format on the printed page. The stylistic conventions of the screenplay originally evolved for the sake of everyone but the screenwriter: it’s full of small courtesies for the director, actors, sound editor, production designer, and line producer, and in theory, it’s supposed to result in one minute of running time per page—although, in practice, the differences between filmmakers and genres make even this rule of thumb almost meaningless. But it also offers certain advantages for writers, too, even if it’s mostly by accident. It can be helpful for authors to force themselves to work within the typeface, margins, and arbitrary formatting rules that the script imposes: it leaves them with minimal freedom except in the choice of the words themselves. Because all the dialogue is indented, you can see the balance between talk and action at a glance, and you eventually develop an intuition about how a good script should look when you flip rapidly through the pages. (The average studio executive, I suspect, rarely does much more than this.) Its typographical constraints amount to a kind of poetic form, and you find yourself thinking in terms of the logic of that space. As the screenwriter Terry Rossio put it:

In retrospect, my dedication—or my obsession—toward getting the script to look exactly the way it should, no matter how long it took—that’s an example of the sort of focus one needs to make it in this industry…If you find yourself with this sort of obsessive behavior—like coming up with inventive ways to cheat the page count!—then, I think, you’ve got the right kind of attitude to make it in Hollywood.

When it came time to write “Retention,” I was looking forward to working within a new template: the radio play. I studied other radio scripts and did my best to make the final result look right. This was more for my own sake than for anybody else’s, and I’m pretty sure that my producer would have been happy to get a readable script in any form. But I had a feeling that it would be helpful to adapt my habitual style to the standard format, and it was. In many ways, this story was a more straightforward piece of writing than most: it’s just two actors talking with minimal sound effects. Yet the stark look of the radio script, which consists of nothing but numbered lines of dialogue alternating between characters, had a way of clarifying the backbone of the narrative. Once I had an outline, I began by typing the dialogue as quickly as I could, almost in real time, without even indicating the identities of the speakers. Then I copied and pasted the transcript—which is how I came to think of it—into the radio play template. For the second draft, I found myself making small changes, as I always do, so that the result would look good on the page, rewriting lines to make for an even right margin and tightening speeches so that they wouldn’t fall across a page break. My goal was to come up with a document that would be readable and compelling in itself. And what distinguished it from my other projects was that I knew that it would ultimately be translated into performance, which was how its intended audience would experience it.

A page from the radio script of "Retention"

I delivered a draft of the script to Nick White, my producer, on January 8, 2016, which should give you a sense of how long it takes for something like this to come to fruition. Nick made a few edits, and I did one more pass on the whole thing, but we essentially had a finished version by the end of the month. After that, there was a long stretch of waiting, as we ran the script past the Howl network and began the process of casting. It went out to a number of potential actors, and it wasn’t until September that Aparna Nancherla and Echo Kellum came on board. (I also finally got paid for the script, which was noteworthy in itself—not many similar projects can afford to pay their writers. The amount was fairly modest, but it was more than reasonable for what amounted to a week of work.) In November, I got a rough cut of the episode, and I was able to make a few small suggestions. Finally, on December 21, it premiered online. All told, it took about a year to transform my initial idea into fifteen minutes of audio, so I was able to listen to the result with a decent amount of detachment. I’m relieved to say that I’m pleased with how it turned out. Casting Aparna Nancherla as Lisa, in particular, was an inspired touch. And although I hadn’t anticipated the decision to process her voice to make it more obvious from the beginning that she was a chatbot, on balance, I think that it was a valid choice. It’s probably the most predictable of the story’s twists, and by tipping it in advance, it serves as a kind of mislead for listeners, who might catch onto it quickly and conclude, incorrectly, that it was the only surprise in store.

What I found most interesting about the whole process was how it felt to deliver what amounted to a blueprint of a story for others to execute. Playwrights and screenwriters do it all the time, but for me, it was a novel experience: I may not be entirely happy with every story I’ve published, but they’re all mine, and I bear full responsibility for the outcome. “Retention” gave me a taste, in a modest way, of how it feels to hand an idea over to someone else, and of the peculiar relationship between a script and the dramatic work itself. Many aspiring screenwriters like to think that their vision on the page is complete, but it isn’t, and it has to pass through many intermediaries—the actors, the producer, the editor, the technical team—before it comes out on the other side. On balance, I prefer writing my own stuff, but I came away from “Retention” with valuable lessons that I expect to put into practice, whether or not I write for audio again. (I’m hopeful that there will be a second season of The Outer Reach, and I’d love to be a part of it, but its future is still up in the air.) I’ve spent most of my career worrying about issues of clarity, and in the case of a script, this isn’t an abstract goal, but a strategic element that can determine how faithfully the story is translated into its final form. Any fuzzy thinking early on will only be magnified in subsequent stages, so there’s a huge incentive for the writer to make the pieces as transparent and logical as possible. This is especially true when you’re providing a sketch for someone else to finish, but it also applies when you’re writing for ordinary readers, who are doing nothing else, after all, but turning the story into a movie in their heads.

Written by nevalalee

January 25, 2017 at 10:30 am

Rogue One and the logic of the story reel

leave a comment »

Gareth Edwards and Felicity Jones on the set of Rogue One

Last week, I came across a conversation on Yahoo Movies UK with John Gilroy and Colin Goudie, two of the editors who worked on Rogue One. I’ve never read an interview with a movie editor that wasn’t loaded with insights into storytelling, and this one is no exception. Here’s my favorite tidbit, in which Goudie describes cutting together a story reel early in the production process:

There was no screenplay, there was just a story breakdown at that point, scene by scene. [Director Gareth Edwards] got me to rip hundreds of movies and basically make Rogue One using other films so that they could work out how much dialogue they actually needed in the film.

It’s very simple to have a line [in the script] that reads “Krennic’s shuttle descends to the planet.” Now that takes maybe two to three seconds in other films, but if you look at any other Star Wars film you realize that takes forty-five seconds or a minute of screen time. So by making the whole film that way—I used a lot of the Star Wars films—but also hundreds of other films, too, it gave us a good idea of the timing.

This is a striking observation in itself. If Rogue One does an excellent job of recreating the feel of its source material, and I think it does, it’s because it honors its rhythms—which differ in subtle respects from those of other films—to an extent that the recent Star Trek movies mostly don’t. Goudie continues:

For example, the sequence of them breaking into the vault, I was ripping the big door closing in WarGames to work out how long does a vault door take to close.

So that’s what I did, and that was three months work to do that, and that had captions at the bottom which explained the action that was going to be taking place, and two thirds of the screen was filled with the concept art that had already been done and one quarter, the bottom corner, was the little movie clip to give you how long that scene would actually take.

Then I used dialogue from other movies to give you a sense of how long it would take in other films for someone to be interrogated. So for instance, when Jyn gets interrogated at the beginning of the film by the Rebel council, I used the scene where Ripley gets interrogated in Aliens.

Rogue One

This might seem like little more than interesting trivia, but there’s actually a lot to unpack. You could argue that the ability to construct an entire Star Wars movie out of analogous scenes from other films only points to how derivative the series has always been: it’s hard to imagine doing this for, say, Manchester By the Sea, or even Inception. But that’s also a big part of the franchise’s appeal. Umberto Eco famously said that Casablanca was made up of the memories of other movies, and he suggested that a cult movie—which we can revisit in our imagination from different angles, rather than recalling it as a seamless whole—is necessarily “unhinged”:

Only an unhinged movie survives as a disconnected series of images, of peaks, of visual icebergs. It should display not one central idea but many. It should not reveal a coherent philosophy of composition. It must live on, and because of, its glorious ricketiness.

After reminding us of the uncertain circumstances under which Casablanca was written and filmed, Eco then suggests: “When you don’t know how to deal with a story, you put stereotyped situations in it because you know that they, at least, have already worked elsewhere…My guess is that…[director Michael Curtiz] was simply quoting, unconsciously, similar situations in other movies and trying to provide a reasonably complete repetition of them.”

What interests me the most is Eco’s conclusion: “What Casablanca does unconsciously, other movies will do with extreme intertextual awareness, assuming also that the addressee is equally aware of their purposes.” He cites Raiders of the Lost Ark and E.T. as two examples, and he easily could have named Star Wars as well, which is explicitly made up of such references. (In fact, George Lucas was putting together story reels before there was even a word for it: “Every time there was a war movie on television, like The Bridges at Toko-Ri, I would watch it—and if there was a dogfight sequence, I would videotape it. Then we would transfer that to 16mm film, and I’d just edit it according to my story of Star Wars. It was really my way of getting a sense of the movement of the spaceships.”) What Eco doesn’t mention—perhaps because he was writing a generation ago—is how such films can pass through intertextuality and end up on the other side. They create memories for viewers who aren’t familiar with the originals, and they end up being quoted in turn by filmmakers who only know Star Wars. They become texts in themselves. In assembling a story reel from hundreds of other movies, Edwards and Goudie were only doing in a literal fashion what most storytellers do in their heads. They figure out how a story should “look” at its highest level, in a rough sketch of the whole, and fill in the details later. The difference here is that Rogue One had the budget and resources to pay someone to do it for real, in a form that could be timed down to the second and reviewed by others, on the assumption that it would save money and effort down the line. Did it work? I’ll be talking about this more tomorrow.

Written by nevalalee

January 12, 2017 at 9:13 am

The tentpole test

leave a comment »

Rogue One: A Star Wars Story

How do you release blockbusters like clockwork and still make each one seem special? It’s an issue that the movie industry is anxious to solve, and there’s a lot riding on the outcome. When I saw The Phantom Menace nearly two decades ago, there was an electric sense of excitement in the theater: we were pinching ourselves over the fact that we were about to see the opening crawl for a new Star Wars movie on the big screen. That air of expectancy diminished for the two prequels that followed, and not only because they weren’t very good. There’s a big difference, after all, between the accumulated anticipation of sixteen years and one in which the installments are only a few years apart. The decade that elapsed between Revenge of the Sith and The Force Awakens was enough to ramp it up again, as if fan excitement were a battery that recovers some of its charge after it’s allowed to rest for a while. In the past, when we’ve watched a new chapter in a beloved franchise, our experience hasn’t just been shaped by the movie itself, but by the sudden release of energy that has been bottled up for so long. That kind of prolonged wait can prevent us from honestly evaluating the result—I wasn’t the only one who initially thought that The Phantom Menace had lived up to my expectations—but that isn’t necessarily a mistake. A tentpole picture is named for the support that it offers to the rest of the studio, but it also plays a central role in the lives of fans, which have been going on long before the film starts and will continue after it ends. As Robert Frost once wrote about a different tent, it’s “loosely bound / By countless silken ties of love and thought / to every thing on earth the compass round.”

When you have too many tentpoles coming out in rapid succession, however, the outcome—if I can switch metaphors yet again—is a kind of wave interference that can lead to a weakening of the overall system. On Christmas Eve, I went to see Rogue One, which was preceded by what felt like a dozen trailers. One was for Spider-Man: Homecoming, which left me with a perplexing feeling of indifference. I’m not the only one to observe that the constant onslaught of Marvel movies makes each installment feel less interesting, but in the case of Spider-Man, we actually have a baseline for comparison. Two baselines, really. I can’t defend every moment of the three Sam Raimi films, but there’s no question that each of those movies felt like an event. There was even enough residual excitement lingering after the franchise was rebooted to make me see The Amazing Spider-Man in the theater, and even its sequel felt, for better or worse, like a major movie. (I wonder sometimes if audiences can sense the pressure when a studio has a lot riding on a particular film: even a mediocre movie can seem significant if a company has tethered all its hopes to it.) Spider-Man: Homecoming, by contrast, feels like just one more component in the Marvel machine, and not even a particularly significant one. It has the effect of diminishing a superhero who ought to be at the heart of any universe in which he appears, relegating one of the two or three most successful comic book characters of all time to a supporting role in a larger universe. And because we still remember how central he was to no fewer than two previous franchises, it feels like a demotion, as if Spider-Man were an employee who left the company, came back, and now reports to Iron Man.

Spider-Man in Captain America: Civil War

It isn’t that I’m all that emotionally invested in the future of Spider-Man, but it’s a useful case study for what it tells us about the pitfalls of these films, which can take something that once felt like a milestone and reduce it to a midseason episode of an ongoing television series. What’s funny, of course, is that the attitude we’re now being asked to take toward these movies is actually closer to the way in which they were originally conceived. The word “episode” is right there in the title of every numbered Star Wars movie, which George Lucas saw as an homage to classic serials, with one installment following another on a weekly basis. Superhero films, obviously, are based on comic books, which are cranked out by the month. The fact that audiences once had to wait for years between movies may turn out to have been a historical artifact caused by technological limitations and corporate inertia. Maybe the logical way to view these films is, in fact, in semiannual installments, as younger viewers are no doubt growing up to expect. In years to come, the extended gaps between these movies in prior decades will seem like a structural quirk, rather than an inherent feature of how we relate to them. This transition may not be as meaningful as, say, the shift from silent films to the talkies, but it implies a similar change in the way we relate to the film onscreen. Blockbusters used to be released with years of anticipation baked into the response from moviegoers, which is no longer something that can be taken for granted. It’s a loss, in its way, to fan culture, which had to learn how to sustain itself during the dry periods between films, but it also implies that the movies themselves face a new set of challenges.

To be fair, Disney, which controls both the Marvel and Star Wars franchises, has clearly thought a lot about this problem, and it has hit on approaches that seem to work pretty well. With the Marvel Universe, this means pitching most of the films at a level at which they’re just good enough, but no more, while investing real energy every few years into a movie that is first among equals. This leads to a lot of fairly mediocre installments, but also to the occasional Captain America: Civil War, which I think is the best Marvel movie yet—it pulls off the impossible task of updating us on a dozen important characters while also creating real emotional stakes in the process, which is even more difficult than it looks. Rogue One, which I also liked a lot, takes a slightly different tack. For most of the first half, I was skeptical of how heavily it was leaning on its predecessors, but by the end, I was on board, and for exactly the same reason. This is a movie that depends on our knowledge of the prior films for its full impact, but it does so with intelligence and ingenuity, and there’s a real satisfaction in how neatly it aligns with and enhances the original Star Wars, while also having the consideration to close itself off at the end. (A lot of the credit for this may be due to Tony Gilroy, the screenwriter and unbilled co-director, who pulled off a similar feat when he structured much of The Bourne Ultimatum to take place during gaps in The Bourne Supremacy.) Relying on nostalgia is a clever way to compensate for the reduced buildup between movies, as if Rogue One were drawing on a reserve of goodwill that Star Wars built up and that hasn’t dissipated, like a flywheel that serves as an uninterruptible power supply. Star Wars isn’t just a tentpole, but a source of energy. And it might just be powerful enough to keep the whole machine running forever.
