Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for February 2016

Alas, “Babylon”

with 2 comments

David Duchovny on The X-Files

Note: Spoilers follow for The X-Files episode “Babylon.”

By now, I’ve more or less resigned myself to the realization that the tenth season of The X-Files will consist of five forgettable episodes and one minor masterpiece. Since the latter is the first true Darin Morgan casefile in close to twenty years, the whole thing still shakes out as a pretty good deal, even if the ratio of good, bad, and mediocre is a little worse than I’d expected. But an installment like this week’s “Babylon” is particularly infuriating because its premise and early moments are so promising, but get systematically squandered by a writer—in this case Chris Carter himself—who seems to have no idea what to do with the opportunities that the revival presented. The first image we see is that of a Muslim man in his twenties on a prayer rug, framed at floor level, and it instantly got my hopes up: this is territory that the original run of the series rarely, if ever, explored, and it’s a rich trove of potential ideas. Even when the young man promptly blows himself up with a friend in a suicide bombing in Texas, I allowed myself to think that the show had something else up its sleeve. It does, but not in a good way: the rest of the episode is a mess, with a mishmash of tones, goofy music cues, dialogue that alternates between frenetic and painfully obvious, an extended hallucination scene, and a weird supporting turn from the gifted Lauren Ambrose, all of which plays even worse than it should because of the pall cast by the opening scene. (Although seeing Mulder in a cowboy hat allowed me to recognize how David Duchovny turned into Fred Ward so gradually that I didn’t even notice.)

In short, it’s not much worth discussing, except for the general observation that if you’re going to use an act of domestic terrorism as a plot device, you’d better be prepared to justify it with some great television. (Even Quantico did a better job of moving rapidly in its own ridiculous direction after an opening terror attack. And the fact that I’m getting nostalgic for Quantico, of all shows, only highlights how disappointing much of this season has been.) But it raises the related issue, which seems worth exploring, of the degree to which The X-Files benefited from the accident of its impeccable historical timing. The series ran for most of the nineties, a decade that wasn’t devoid of partisan politics, but one whose controversies tended to focus more on a little blue dress than on Islamic extremism. It had its share of dislocating moments—including the Oklahoma City bombing, which was uncomfortably evoked, with characteristic clumsiness, in The X-Files: Fight the Future—but none that recentered the entire culture in the way that September 11 did. For the most part, The X-Files was free to operate on a separate playing field without much reference to current events, a situation that might not have been the case if its premiere date had been shifted even five years forward or backward. It came after the Cold War and before the war on terror, leaving it with the narrative equivalent of a blank canvas to fill with a cast of imaginary monsters.

David Duchovny and Gillian Anderson on The X-Files

Not surprisingly, Chris Carter has stated elsewhere that the show benefited from occurring before the fall of the World Trade Center, which inaugurated a period, however temporary it turned out to be, in which people wanted to believe in their government. Carter implies that this is antithetical to what The X-Files represented, and while that seems plausible at first glance, it doesn’t really hold water. In many ways, the conspiracy thread was one of the weakest elements of the original series: it quickly became too convoluted for words, and it was often used as a kind of reset button, with shadowy government agents moving in to erase any evidence of that week’s revelations. Aside from one occasion, the tag at the end of the opening credits wasn’t “Trust No One,” but “The Truth is Out There.” Paranoia was a useful narrative device, but it wasn’t central to the show’s appeal, and I’d like to think that the series would have evolved into a different but equally satisfying shape if the politics of the time had demanded it—although the damp squib of the reboot, which was explicitly designed to bring Mulder and Scully into the modern world, doesn’t exactly help to make that case. (The clear parallel here is 24, which was transformed by uncontrollable events into something very unlike what it was once intended to be. One of my favorite pieces of show business trivia is that its producers briefly considered optioning The Da Vinci Code as the plot for the show’s second season, which hints at what that series might have been in some other universe.)

In the end, an episode like “Babylon” makes me almost grateful that the show concluded when it did, given its inability to do anything worthwhile with what might have been a decent premise. And it’s an ineptitude that emerges, not from the fog of cranking out a weekly television series, but after Carter had close to fifteen years to think about the kind of story he could tell, which makes it even harder to forgive. The episode’s central gimmick—which involves communicating with a clinically dead suicide bomber to prevent a future attack—is pretty good, or it might have been, if the script didn’t insist on constantly tap-dancing away from it. (A plot revolving around getting into an unconscious killer’s head didn’t even need to be about terrorism at all: a rehash of The Cell would have been preferable to what we actually got.) It’s hard not to conclude that the best thing that ever happened to The X-Files was a run of nine seasons that uniquely positioned it to ignore contemporary politics and pick its source material from anywhere convenient, with time and forgetfulness allowing it to exploit the nightmares of the past in a typically cavalier fashion. But just as recent political developments have rendered House of Cards all but obsolete, I have a feeling that The X-Files, which always depended on such a fragile suspension of disbelief, couldn’t have endured conditions that forced it to honestly confront its own era—which suggests that this reboot may have been doomed from the beginning. Because the incursion of the real world into fantasy is one invasion that this show wouldn’t be able to survive.

Written by nevalalee

February 17, 2016 at 9:49 am

Quote of the Day

leave a comment »

Written by nevalalee

February 17, 2016 at 7:30 am

The critical path

with 5 comments

Renata Adler

A few weeks ago, I had occasion to mention Renata Adler’s famous attack in the New York Review of Books on the reputation of the film critic Pauline Kael. As a lifelong Kael fan, I don’t agree with Adler—who describes Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless”—but I respect the essay’s fire and eloquence, and it’s still a great read. What I’d forgotten is that Adler opens with an assault, not on Kael alone, but on the entire enterprise of professional criticism itself. Here’s what she says:

The job of the regular daily, weekly, or even monthly critic resembles the work of the serious intermittent critic, who writes only when he is asked to or genuinely moved to, in limited ways and for only a limited period of time…Normally, no art can support for long the play of a major intelligence, working flat out, on a quotidian basis. No serious critic can devote himself, frequently, exclusively, and indefinitely, to reviewing works most of which inevitably cannot bear, would even be misrepresented by, review in depth…

The simple truth—this is okay, this is not okay, this is vile, this resembles that, this is good indeed, this is unspeakable—is not a day’s work for a thinking adult. Some critics go shrill. Others go stale. A lot go simultaneously shrill and stale.

Adler concludes: “By far the most common tendency, however, is to stay put and simply to inflate, to pretend that each day’s text is after all a crisis—the most, first, best, worst, finest, meanest, deepest, etc.—to take on, since we are dealing in superlatives, one of the first, most unmistakable marks of the hack.” And I think that she has a point, even if I have to challenge a few of her assumptions. (The statement that most works of art “inevitably cannot bear, would even be misrepresented by, review in depth,” is particularly strange, with its implicit division of all artistic productions into the sheep and the goats. It also implies that it’s the obligation of the artist to provide a worthy subject for the major critic, when in fact it’s the other way around: as a critic, you prove yourself in large part through your ability to mine insight from the unlikeliest of sources.) Writing reviews on a daily or weekly basis, especially when you have a limited amount of time to absorb the work itself, lends itself inevitably to shortcuts, and you often find yourself falling back on the same stock phrases and judgments. And Adler’s warning about “dealing in superlatives” seems altogether prescient. As Keith Phipps and Tasha Robinson of The A.V. Club pointed out a few years back, the need to stand out in an ocean of competing coverage means that every topic under consideration becomes either an epic fail or an epic win: a sensible middle ground doesn’t generate page views.

Pauline Kael

But the situation, at least from Adler’s point of view, is even more dire than when she wrote this essay in the early eighties. When Adler’s takedown of Kael first appeared, the most threatening form of critical dilution lay in weekly movie reviews: today, we’re living in a media environment in which every episode of every television show gets thousands of words of critical analysis from multiple pop culture sites. (Adler writes: “Television, in this respect, is clearly not an art but an appliance, through which reviewable material is sometimes played.” Which is only a measure of how much the way we think and talk about the medium has changed over the intervening three decades.) The conditions that Adler identifies as necessary for the creation of a major critic like Edmund Wilson or Harold Rosenberg—time, the ability to choose one’s subjects, and the freedom to quit when necessary—have all but disappeared for most writers hoping to make a mark, or even just a living. To borrow a trendy phrase, we’ve reached a point of peak content, with a torrent of verbiage being churned out at an unsustainable pace without the advertising dollars to support it, in a situation that can be maintained only by the seemingly endless supply of aspiring writers willing to be chewed up by the machine. And if Adler thought that even a monthly reviewing schedule was deadly for serious criticism, I’d be curious to hear how she feels about the online apprenticeship that all young writers seem expected to undergo these days.

Still, I’d like to think that Adler got it wrong, just as I believe that she was ultimately mistaken about Kael, whose legacy, for all its flaws, still endures. (It’s revealing to note that Adler had a long, distinguished career as a writer and critic herself, and yet she almost certainly remains best known among casual readers for her Kael review.) Not every lengthy writeup of the latest episode of The Vampire Diaries is going to stand the test of time, but as a crucible for forming a critic’s judgment, this daily grind feels like a necessary component, even if it isn’t the only one. A critic needs time and leisure to think about major works of art, which is a situation that the current media landscape doesn’t seem prepared to offer. But the ability to form quick judgments about works of widely varying quality and to express them fluently on deadline is an indispensable part of any critic’s toolbox. When taken as an end in itself, it can be deadening, as Adler notes, but it can also be the foundation for something more, even if it has to be undertaken outside of—or despite—the critic’s day job. The critic’s responsibility, now more than ever, isn’t to detach entirely from the relentless pace of pop culture, but to find ways of channeling it into something deeper than the instant think piece or hot take. As a blogger who frequently undertakes projects that can last for months or years, I’m constantly mindful of the relationship between my work on demand and my larger ambitions. And I sure hope that the two halves can work together. Because like it or not, every critic is walking that path already.

Written by nevalalee

February 16, 2016 at 8:55 am

Quote of the Day

leave a comment »

Craig Lucas

Doubt is an important component to sanity. People always talk about self-esteem, self-worth. The only people who have complete self-esteem are lunatics. Doubt is necessary for the process. That’s why we proof a document: I doubt I spelled every word correctly, I doubt solecisms, I doubt I made myself entirely clear, or did I make myself too clear? An awful lot of playwriting is about cutting away what’s unnecessary. It’s a constant process of asking questions, which is what doubt is. I think you want the time and the skill to weigh all the considerations.

Craig Lucas, to Tin House

Written by nevalalee

February 16, 2016 at 7:30 am

Twenty-five years later: The Silence of the Lambs

leave a comment »

Jodie Foster in The Silence of the Lambs

At this point, it might seem like there’s nothing new to say—at least by me—about The Silence of the Lambs. I’ve discussed both the book and the movie here at length, and I’ve devoted countless posts to unpacking Hannibal Lecter’s most recent televised incarnation. Yet like all lasting works of art, and I’d argue that both the novel and the film qualify, The Silence of the Lambs continues to reveal new aspects when seen from different angles, especially now that exactly a quarter of a century has gone by since the movie’s release. Watching it again today, for instance, it’s hard not to be struck by how young Clarice Starling really is: Jodie Foster was just twenty-eight when the film was shot, and when I look at Starling from the perspective of my middle thirties, she comes off as simultaneously more vulnerable and more extraordinary. (I have an uneasy feeling that it’s close to the way Jack Crawford, not to mention Lecter, might have seen her at the time.) And it only highlights her affinities to Buffalo Bill’s chosen prey. This isn’t exactly a revelation: that sense of a dark sisterhood is a pivotal plot point in the original story. But it’s one thing to grasp this intellectually and quite another to go back and see how cannily the movie casts actresses as Bill’s victims who subtly suggest Foster’s own facial features, just a little wider. And it’s more clear than ever how Foster’s early fame, her passage into movies like Taxi Driver, her strange historical linkage to a stalker and failed assassin, and her closely guarded personal life gave her the tools and aura to evoke Starling’s odd mixture of toughness and fragility.

What’s also obvious now, unfortunately, is the extent to which Starling was—and remains—an anomaly in the genre. Starling, as embodied by Foster, has inspired countless female leads in thrillers in the decades since. (When I found myself obliged to create a similar character for my own novels, my thoughts began and ended with her.) Yet aside from Dana Scully, the results have been less than memorable. Starling has always been eclipsed by the shadow of the monster in the cell beside her, but in many ways, she was a harder character to crack, and the fact that she works so well in her written and cinematic incarnations is the result of an invisible, all but miraculous balancing act. None of the later efforts in the same direction have done as well. Christopher McQuarrie, while discussing the characters played by Emily Blunt in Edge of Tomorrow and Rebecca Ferguson in Mission: Impossible—Rogue Nation, gets close to the heart of the challenge:

They’re not men. They’re women that are not trying to be men…To me, more than anything, Rebecca is mature, elegant, confident, and at peace. Her only vulnerability in the movie is she’s just as fucked as everybody else…Usually when you want to create vulnerability for a woman, it’s about giving her a neurosis—a fear or some emotional arc that, ultimately, gets the better of her, whether it’s a need for revenge or need for redemption. You know, “Her father was killed by a twister, so she has to defeat twisters no matter what,” and I wouldn’t have any of that either. It simply was: you’re here on your own terms and you’re in a shitty situation created by people in power above you. How do you escape this situation and maintain your dignity?

Which isn’t to say that Starling didn’t suffer from her share of father issues. But those last two sentences capture her appeal as well as any I’ve ever read.

Anthony Hopkins and Jodie Foster in The Silence of the Lambs

Time also offers some surprising perspectives on Lecter himself, or at least the version of him we see here. The Silence of the Lambs, like Rocky, is one of those classic movies that has been diminished in certain respects by our knowledge of the sequels that followed it. Conventional wisdom holds that Anthony Hopkins’s take on Lecter became broader and more self-indulgent with every installment, and it’s fashionable to say that the best version of the character was really Brian Cox in Manhunter, or, more plausibly, Mads Mikkelsen on Hannibal. It’s a seductively contrarian argument, but it’s also inherently ridiculous. As great as the novel is, we probably wouldn’t be talking about Lecter or Thomas Harris or The Silence of the Lambs at all if it weren’t for Hopkins’s performance. And in many ways, it’s his facile, even superficial interpretation of the character that made the result so potent. Hopkins was discovered and mentored by Laurence Olivier, whom he understudied in August Strindberg’s Dance of Death, and it helps to view his approach to Lecter through the lens of the quote from Olivier that I cited here the other week: “I’m afraid I do work mostly from the outside in. I usually collect a lot of details, a lot of characteristics, and find a creature swimming about somewhere in the middle of them.” Hopkins’s creature is the finest example I know of a classically trained stage lion slumming it in a juicy genre part, and even if it wasn’t a particularly difficult performance once Hopkins figured out the voice, still—he figured out that voice.

And the more we acknowledge, or even embrace, the degree to which Lecter is a fantasy that barely survives twelve minutes onscreen, the more this approach seems like a perfectly valid solution to this dance of death. If Lecter seemed increasingly hammy and unconvincing in the movie versions of Hannibal and Red Dragon, that isn’t a failure on Hopkins’s part: making him the main attraction only brought out the artificiality and implausibility that had been there all along, and Hopkins just did what any smart actor would have done under the circumstances—take the money and try to salvage his own sense of fun. (As it happens, Ted Tally’s script for Red Dragon is surprisingly good, a thoughtful, inventive approach to tough material that was let down by the execution. If I had to choose, I’d say he did a better job on the page than Bryan Fuller ultimately did with the same story.) With the passage of time, it’s increasingly clear that Lecter falls apart even as you look at him, and that he’s a monster like the shark in Jaws or the dinosaurs that would follow two years later in Jurassic Park: they’re only convincing when glimpsed in flashes or in darkness, and half of the director’s art lies in knowing when to cut away. Put him front and center, as the sequels did, and the magic vanishes. Asking why Hopkins is so much more effective in The Silence of the Lambs than in the films that followed is like asking why the computer effects in Jurassic Park look better than their equivalents today: it isn’t about technology or technique, but about how the film deploys them to solve particular problems. Twelve minutes over twenty-five years is about as much scrutiny as Hopkins’s wonderful Lecter could sustain. And the rest, as they say, should have been silence.

Quote of the Day

leave a comment »

T.S. Eliot

Composing on the typewriter, I find that I am sloughing off all my long sentences which I used to dote upon. Short, staccato, like modern French prose. The typewriter makes for lucidity, but I am not sure that it encourages subtlety.

T.S. Eliot, in a letter to Conrad Aiken

Written by nevalalee

February 15, 2016 at 7:30 am

The painting in the mirror

leave a comment »

Sketch by Leonardo Da Vinci

We know very well that errors are better recognized in the works of others than in our own; and that often, while reproving little faults in others, you may ignore great ones in yourself…

I say that when you paint you should have a flat mirror and often look at your work as reflected in it, when you will see it reversed, and it will appear to you like some other painter’s work, so you will be better able to judge of its faults than in any other way.

Again, it is well that you should often leave off work and take a little relaxation, because, when you come back to it you are a better judge; for sitting too close at work may greatly deceive you.

Again, it is good to retire to a distance because the work looks smaller and your eye takes in more of it at a glance and sees more easily the discords or disproportion in the limbs and colors of the objects.

Leonardo da Vinci, Notebooks

Written by nevalalee

February 14, 2016 at 7:30 am

Posted in Quote of the Day


The ballerina’s odds

leave a comment »

American Ballet Theatre

Each year, on the road or back at home in New York, Baryshnikov and his artistic staff hold regularly scheduled auditions for young dancers from all over America…In some ways, they differ little from any other sort of auditions: many come calling, and extremely few are chosen. Just getting a chance to audition is something of an ordeal…If someone is touted as truly exceptional, and the person doing the touting has a proven record of talent development, he or she might get a private audition with Baryshnikov, but this is rare. For the most part they come in clutches on audition days, trying to pretend they aren’t nervous, and hoping against hope that the years of hard work are finally going to pay off. They are numbered and noted in coldly clinical detail. After a basic technical competence is ascertained, decisions are sometimes made in seconds and based on pure whim, and the atmosphere is reminiscent of an animal auction.

In an odd and quieter way, though, auditions are an infinitely sad business because the demands of ballet are so much stricter and more rigorous than those of any other of the performing arts, while the return—even with success at the audition level—seems to outsiders so meager. Yet if the ballet world has its share of rebels, malcontents, and rational critics who all resent the “system” as it is presented to them, it is also remarkable for the broad and sublime rejection of common sense. Common sense would tell parents never to send their children to ballet school. Common sense would tell teenagers that there is a wider and happier world beyond the grueling strictures of daily barre and class. Common sense would tell the graduating student that there are infinitely superior ways of making money than joining a professional ballet company. Common sense would tell a young dancer that very few—laughably few—of his or her colleagues will ever make it to the top or even near the top. Common sense would tell a maturing dancer that there is much psychological grief stored up for middle age and, quite often, great pain in old age.

Yet there remains the dance and dancers, and that fact alone testifies to the endurance of faith and sacrifice in an unbelieving age.

John Fraser, Private View: Inside Baryshnikov’s American Ballet Theatre

Written by nevalalee

February 13, 2016 at 7:30 am

“The rest of the wedding was a blur…”

leave a comment »

"Wolfe walked gingerly down the aisle..."

Note: This post is the forty-third installment in my author’s commentary for Eternal Empire, covering Chapter 42. You can read the previous installments here.

I’ve frequently written here about the theory that you can classify any given writer as either a gardener or an architect. George R.R. Martin, who obviously places himself in the former category, returns to that premise repeatedly in discussing his work, and it’s been picked up by other writers of speculative fiction: when I attended the World Science Fiction Convention in Chicago a few years back, it came up at nearly every panel I saw. And like most such categorizations, it’s most illuminating when we look at the places where it falls short. A good gardener, for instance, doesn’t just put words down on paper and hope for the best: to keep the process from spiraling out of control, he or she soon develops a set of tactics for managing the resulting pages. If there’s a hidden architecture here, it’s the vernacular kind, which emerges out of the constraints of the landscape, the materials, and the needs of the people who live there. It doesn’t arise from a blueprint, but it depends nonetheless on experience and good tricks. And the self-described gardeners of literature—the published ones, anyway—tend to be exceptionally capable at controlling structure at the level of the sentence or paragraph. If they weren’t, the story wouldn’t get written at all. (Or if you’d prefer to keep the gardening metaphor alive, it’s also a little like the parable of the sower: ideas that fall on rocky soil or among thorns are unlikely to grow, but they yield a hundredfold when sown on good ground. And even if this isn’t architecture, it’s at least a kind of horticulture.)

In a similar way, one of the most counterintuitive aspects of the architectural approach is that all of its careful planning and analysis really exists to enable a handful of moments in which the plan goes away. Making an outline is less about laying down the proper path for the story—which is likely to change in the rewrite anyway—than about keeping on task and maintaining a necessary discipline over many weeks and months. This routine exists both to generate pages and to make sure you’re physically there when an organic, unplanned insight occurs: it’s a kind of precipitate from the solution that the writer has prepared beforehand in the lab. I’ve spoken elsewhere about the importance of rendering time, in which you need to stick yourself behind a desk for a certain number of hours or days before good ideas can emerge. Outlining and writing a logical first draft happens to be a pretty great use of the time between inspirations, and it can be hard to tell whether an idea emerged from the preparatory stage or if the latter was just an excuse to keep working until the former appeared. But it still works, and the fact that useful insights tend to appear only after a stretch of systematic, sometimes tedious effort is the best argument I know for writing like an architect. The idea that you need to prepare obsessively to allow for the unexpected isn’t exactly new: in fact, it’s so familiar that it has inspired some of creativity’s great clichés, from Louis Pasteur’s “Chance favors the prepared mind” to Branch Rickey’s “Luck is the residue of design.” But like a lot of clichés, they’re true.

"The rest of the wedding was a blur..."

For the most part, The Icon Thief and its successors were meticulously planned novels: they all called for a lot of research, and their outlines, in some cases, approached the lengths of the finished chapters themselves. But that planning was meaningful mostly to the extent that it enabled about ten minutes of real insight, spread unevenly over the course of three years of work. I don’t know of any better example than that of Maya Asthana. When I started writing City of Exiles, the second book in the series, I was working toward what I thought would be a neat twist: Alan Powell, the hero of the first installment, would turn out to be the mole in his own agency. I wasn’t exactly sure how this would work, but I trusted that I’d be able to figure it out, and I wrote about half the book with that revelation in mind. When it came time to outline the second half, however, I froze up: I just couldn’t see how to do it. Yet I’d already baked the idea of a mole into the story, and I couldn’t bear the thought of discarding those pages. Out of desperation, I cast around for another character who could assume that role. And to my surprise, I found that the only plausible candidate was Asthana, the smart, slightly conceited, but warmhearted agent I’d introduced into the story solely as a sounding board for Rachel Wolfe, my protagonist. But once I recognized Asthana’s potential, I realized that her origins as a purely functional supporting character were a real asset: the reader would be unlikely to see the twist coming—and I think the surprise works—because I hadn’t seen it, either.

And one of the unanticipated dividends of that decision was the wealth of small, almost arbitrary character details that I’d unwittingly bequeathed to myself. Like Wolfe, who was originally a minor character whom I made into a Mormon just to make her a little more distinctive, Asthana had acquired traits and bits of business nearly at random, and now I had a chance to put them to good use. In City of Exiles, for example, I’d established the fact that she was planning her wedding, mostly because it was a thread I could write without much effort—I’d gotten married just a couple of years earlier—and because it seemed consistent with her personality. Once Asthana became the villain of the series, though, and after it became clear that I wasn’t going to be able to resolve her story in the second book, it seemed obvious that her wedding day was going to be a major set piece in Eternal Empire. Again, I could have simply ignored the clue that had been planted, but it felt right, like using every part of the buffalo, and I had a hunch that it would be a good scene. And it was. In fact, the sequence that reaches its climax here, in Chapter 42, as Wolfe realizes that Asthana is the mole while standing up as a bridesmaid during the wedding ceremony, is maybe my favorite thing in the whole novel. (A big part of the challenge was figuring out how Wolfe could stumble across the truth at the wedding itself. The solution, which involves a surprise poetry reading and a clue from John Donne, manages to be tidy and contrived at the same time.) It’s a scene that never would have occurred to me if the pieces hadn’t fallen into place almost by accident. And while I’d never call myself a gardener, it was nice to see one idea finally bear fruit…

Written by nevalalee

February 12, 2016 at 8:46 am

Quote of the Day

leave a comment »

Written by nevalalee

February 12, 2016 at 7:30 am

Going for the kill

with 2 comments

David Duchovny and Gillian Anderson on The X-Files

Note: Spoilers follow for the X-Files episode “Home Again.”

One of the unexpected but undeniable pleasures of the tenth season of The X-Files is the chance it provides to reflect on how television itself has changed over the last twenty years. The original series was so influential in terms of storytelling and tone that it’s easy to forget how compelling its visuals were, too: it managed to tell brooding, cinematic stories on a tiny budget, with the setting and supporting cast changing entirely from one episode to the next, and it mined a tremendous amount of atmosphere from those Vancouver locations. When it pushed itself, it could come up with installments like “Triangle”—one of the first television episodes ever to air in widescreen—or “The Post-Modern Prometheus,” neither of which looked like anything you’d ever seen before, but it could be equally impressive in its moody procedural mode. Yet after a couple of decades, even the most innovative shows start to look a little dated. The original’s blocking and camera style can seem static compared to many contemporary dramas, and one of the most intriguing qualities of the ongoing reboot has been its commitment to maintaining the feel of the initial run of the series while upgrading its technical aspects when necessary. (Sometimes the best choice is to do nothing at all: the decision to keep the classic title sequence bought it tremendous amounts of goodwill, at least with me, and the slightly chintzy digital transformation effects in “Mulder and Scully Meet the Were-Monster” come off as just right.)

This week’s episode, Glen Morgan’s “Home Again,” is interesting mostly as an illustration of the revival’s strengths and limitations. It’s basically a supernatural slasher movie, with a ghostly killer called the Band-Aid Nose Man stalking and tearing apart a string of unsympathetic victims who have exploited the homeless in Philadelphia. And the casefile element here is even more perfunctory than usual. All we get in the way of an explanation is some handwaving about the Tibetan tulpa, which the show undermines at once, and the killer turns out to be hilariously ineffective: he slaughters a bunch of people without doing anything to change the underlying situation. But there’s also a clear implication that the case isn’t meant to be taken seriously, except as a counterpoint to the real story about the death of Scully’s mother. Even there, though, the parallels are strained, and if the implicit point is that the case could have been about anything, literally anything would have been more interesting than this. (There’s another point to be made, which I don’t feel like exploring at length here, about how the show constantly falls back on using Scully’s family—when it isn’t using her body—to put her through the wringer. Scully has lost her father, her sister, and now her mother, and it feels even lazier here than usual, as if the writers thought she’d had too much fun last week, which meant that she had to suffer.)

Gillian Anderson and David Duchovny on The X-Files

What we have, then, is a series of scenes—four, to be exact—in which an unstoppable killer goes after his quarry. There’s nothing wrong with this, and if the resulting sequences were genuinely scary, the episode wouldn’t need to work so hard to justify its existence. Yet none of it is particularly memorable or frightening. As I watched it, I was struck by the extent to which the bar has been raised for this kind of televised suspense, particularly in shows like Breaking Bad and Fargo, which expertly blend the comedic and the terrifying. Fargo isn’t even billed as a suspense show, but it has given us scenes and whole episodes over the last two seasons that built the pressure so expertly that they were almost painful to watch: I’ve rarely had a show keep me in a state of dread for so long. And this doesn’t require graphic violence, or even any violence at all. Despite its title, Fargo takes its most important stylistic cue from another Coen brothers movie entirely, and particularly from the sequence in No Country For Old Men in which Llewelyn Moss awaits Anton Chigurh in his motel room. It’s the most brilliantly sustained sequence of tension in recent memory, and it’s built from little more than our knowledge of the two characters, the physical layout of the space, and a shadow under the door. Fargo has given us a version of this scene in every season, and it does it so well that it makes it all the less forgivable when an episode like “Home Again” falls short.

And the funny thing, of course, is that both Fargo and Breaking Bad lie in a direct line of descent from The X-Files. Breaking Bad, obviously, is the handiwork of Vince Gilligan, who learned much of what he knows in his stint on the earlier show, and who revealed himself in “Pusher” to be a master of constructing a tight suspense sequence from a handful of well-chosen elements. And Fargo constantly winks at The X-Files, most notably in the spaceship that darted in and out of sight during the second season, but also in its range and juxtaposition of tones and its sense of stoicism in the face of an incomprehensible universe. If an episode like “Home Again” starts to look a little lame, it’s only because the show’s descendants have done such a good job of expanding upon the basic set of tools that the original series provided. (It also points to a flaw in the show’s decision to allow all the writers to direct their own episodes. It’s a nice gesture, but it also makes me wonder how an episode like this would have played in the hands of a director like, say, Michelle MacLaren, who is an expert at extending tension to the breaking point.) Not every Monster of the Week needs to be a masterpiece, but when we’re talking about six episodes after so many years, there’s greater pressure on each installment to give us something special—aside from killing off another member of the Scully family. Because if the show were just a little smarter about dispatching its other victims, it might have decided to let Margaret Scully live.

Written by nevalalee

February 11, 2016 at 9:30 am

Quote of the Day

leave a comment »

Written by nevalalee

February 11, 2016 at 7:30 am

Posted in Quote of the Day


Making the mark

with one comment

Illustration by Jules Feiffer

I only wanted to do what I could right away. I didn’t want to have to do things that were hard. Hard was too hard. Hard was full of defeat. Hard was full of rejection. Hard was full of self-reproach and self-hate. There was enough self-hate operating within me under ordinary circumstances not to provoke even more by repeated failures at something I felt was beyond my ken, but which might not have been, had I been able to apply a little more effort. So, I think that at least unconsciously, becoming the sort of cartoonist I became, instead of the more traditional cartoonist, was because I felt I couldn’t compete as a more traditional cartoonist. I couldn’t do the slick thick and thin line. I couldn’t draw super-characters with ease and facility. I couldn’t do the work I thought I wanted to do…

Not being able to really be as good as I wanted to at my first love, which would have been a daily strip, I had to invent another form for myself, within cartooning. That no one else could do, that I was the only one doing, so that I couldn’t have any competitors. So nobody could be any better at it than I was. If I invented it, who was my competition? I mean, all competition had to be measured against me. I was making the mark. I’m not saying this was by any means a conscious choice. I think by a process of elimination, I just slipped into it…

Striking off on my own has never been intimidating. Being like everyone else has been intimidating, because I’m lousy at it. Being part of a group is intimidating, because I just don’t get the hang of it.

Jules Feiffer, to The Comics Journal

Written by nevalalee

February 10, 2016 at 10:09 am

Quote of the Day

leave a comment »

Terrence McNally

Theater should resemble more a newsroom, with deadlines, than a slow, leisurely workshop development process…In theater, you should strike while the iron is hot. It’s the moment. It’s like with food. You taste it, and you don’t wait a week to say if you like the sauerkraut.

Terrence McNally, to the Boston Globe

Written by nevalalee

February 10, 2016 at 7:30 am

The time factor

with 7 comments

Concept art for Toy Story 3

Earlier this week, my daughter saw Toy Story for the first time. Not surprisingly, she loved it—she’s asked to watch it three more times in two days—and we’ve already moved on to Toy Story 2. Seeing the two movies back to back, I was struck most of all by the contrast between them. The first installment, as lovely as it is, comes off as a sketch of things to come: the supporting cast of toys gets maybe ten minutes total of screen time, and the script still has vestiges of the villainous version of Woody who appeared in the earlier drafts. It’s a relatively limited film, compared to the sequels. Yet if you were to watch it today without any knowledge of the glories that followed, you’d come away with a sense that Pixar had done everything imaginable with the idea of toys who come to life. The original Toy Story feels like an exhaustive list of scenes and situations that emerge organically from its premise, as smartly developed by Joss Whedon and his fellow screenwriters, and in classic Pixar fashion, it exploits that core gimmick for all it’s worth. Like Finding Nemo, it amounts to an anthology of all the jokes and set pieces that its setting implies: you can practically hear the writers pitching out ideas. And taken on its own, it seems like it does everything it possibly can with that fantastic concept.

Except, of course, it doesn’t, as two incredible sequels and a series of shorts would demonstrate. Toy Story 2 may be the best example I know of a movie that takes what made its predecessor special and elevates it to a level of storytelling that you never imagined could exist. And it does this, crucially, by introducing a new element: time. If Toy Story is about toys and children, Toy Story 2 and its successor are about what happens when those kids become adults. It’s a complication that was inherent to its premise from the beginning, but the first movie wasn’t equipped to explore it—we had to get to know and care about these characters before we could worry about what would happen after Andy grew up. It’s a part of the story that had to be told, if its assumptions were to be treated honestly, and it shows that the original movie, which seemed so complete in itself, only gave us a fraction of the full picture. Toy Story 3 is an astonishing achievement on its own terms, but there’s a sense in which it only extends and trades on the previous film’s moment of insight, which turned it into a franchise of almost painful emotional resonance. If comedy is tragedy plus time, the Toy Story series knows that when you add time to comedy, you end up with something startlingly close to tragedy again.

Robert De Niro in The Godfather Part II

And thinking about the passage of time is an indispensable trick for creators of series fiction, or for those looking to expand a story’s premise beyond the obvious. Writers of all kinds tend to think in terms of unity of time and place, which means that time itself isn’t a factor in most stories: the action is confined within a safe, manageable scope. Adding more time to the story in either direction has a way of exploding the story’s assumptions, or of exposing fissures that lead to promising conflicts. If The Godfather Part II is more powerful and complex than its predecessor, it’s largely because of its double timeline, which naturally introduces elements of irony and regret that weren’t present in the first movie: the outside world seems to break into the hermetically sealed existence of the Corleones just as the movie itself breaks out of its linear chronology. And the abrupt time jump, which television series from Fargo to Parks and Recreation have cleverly employed, is such a useful way of advancing a story and upending the status quo that it’s become a cliché in itself. Even if you don’t plan on writing more than one story or incorporating the passage of time explicitly into the plot, asking yourself how the characters would change after five or ten years allows you to see whether the story depends on a static, unchanging timeframe. And those insights can only be good for the work.

This also applies to series in which time itself has become a factor for reasons outside anyone’s control. The Force Awakens gains much of its emotional impact from our recognition, even if it’s unconscious, that Mark Hamill is older now than Alec Guinness was in the original, and the fact that decades have gone by both within the story’s universe and in our own world only increases its power. The Star Trek series became nothing less than a meditation on the aging of its own cast. And this goes a long way toward explaining why Toy Story 3 was able to close the narrative circle so beautifully: eleven years had passed since the last movie, and both Andy and his voice actor had grown to adulthood, as had so many of the original film’s fans. (It’s also worth noting that the time element seems to have all but disappeared from the current incarnation of the Toy Story franchise: Bonnie, who owns the toys now, is in no danger of growing up soon, and even if she does, it would feel as if the films were repeating themselves. I’m still optimistic about Toy Story 4, but it seems unlikely to have the same resonance as its predecessors—the time factor has already been fully exploited. Of course, I’d also be glad to be proven wrong.) For a meaningful story, time isn’t a liability, but an asset. And it can lead to discoveries that you didn’t know were possible, but only if you’re willing to play with it.

Quote of the Day

leave a comment »

Written by nevalalee

February 9, 2016 at 7:30 am

Posted in Quote of the Day


The case against convenience

with 2 comments

Early patent sketches for Apple handheld device

Last week, I finally bought a MacBook Pro. It’s a slightly older model, since I wanted the optical drive and the ports that Apple is busy prying away from its current generation of devices, and though it isn’t as powerful under the hood as most of its younger cousins, it’s by any measure the nicest laptop I’ve ever owned. (For the last few years, I’ve been muddling through with a refurbished MacBook that literally disintegrated beneath my fingers as I used it: the screws came out of the case, the plastic buckled and warped, and I ended up keeping it together with packing tape and prayer. If this new computer self-destructs, I assume that it won’t be in such a dramatic fashion.) And while it might seem strange that I sprang for a relatively expensive art object from Apple shortly after my conversion to an Android phone, my favorite thing about this new arrangement is that I don’t need to worry about syncing a damned thing. For years, keeping my laptop and my phone synced up was a minor but real annoyance, particularly on a computer that seemed to audibly gasp for air whenever I connected it with my iPhone. Now that I don’t have that option, it feels weirdly liberating. My smartphone is off in its own little world, interacting happily with my personal data through Google Photos and other apps, while my laptop has access to the same information without any need to connect to my phone, physically or otherwise. Each has its own separate umbilicus linking it with the cloud—and never the twain shall meet.

And there’s something oddly comforting about relegating these devices to two separate spheres, as defined by their incompatible operating systems. I’ve spoken here before about Metcalfe’s Law, which is a way of thinking about the links between nodes in a telecommunications network: in theory, the more connections, the greater the total value. And while this may well be true of systems, like social media, in which each user occupies a single node, it’s a little different when you apply it to all the devices you own, since the complexity of overseeing those gadgets and their connections—which are entities in themselves—can quickly become overwhelming. Let’s say you have a laptop, a tablet, and a smartphone. If each connects separately with the cloud, you’ve only got three connections to worry about, and you can allocate separate headspace to each one. But if they’re connected with each other as well as the cloud, the number of potential connections increases to six. This may not sound like much, although even a few extra connections can grow burdensome if you’re dealing with them every day. But it’s even worse than that: the connections don’t run in parallel, but form a web, so that any modification you make to one invisibly affects all the others. If you’re anything like me, you’ve experienced the frustration of trying to customize the way you interact with one device, only to find that you’ve inadvertently changed the settings on another. The result is a mare’s nest of incompatible preferences that generate unpredictable interference patterns.
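To spell out the arithmetic, here’s a minimal sketch in Python of how the number of links grows once devices sync with one another as well as with the cloud. The function and its names are my own invention for illustration, not anything drawn from a real syncing API:

```python
# Illustrative sketch: counting the links a user has to manage.
# Each device keeps one link to the cloud; if devices also sync
# with each other, every pair of devices adds one more link.

def total_links(devices: int, peer_to_peer: bool) -> int:
    cloud_links = devices                    # one umbilicus per device
    pairwise = devices * (devices - 1) // 2  # number of device pairs
    return cloud_links + (pairwise if peer_to_peer else 0)

for n in (3, 4, 5):
    print(n, total_links(n, peer_to_peer=False), total_links(n, peer_to_peer=True))
# 3 devices: 3 links cloud-only, 6 with device-to-device syncing,
# which is the jump from three connections to six described above.
```

The cloud-only count grows linearly with the number of devices, while the fully meshed count grows quadratically, which is the formal version of that web of invisible interactions.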

Apple Continuity

Segregating all the parts of your digital life from one another takes away much of that confusion: you don’t have to think about any of it if your computer and your phone don’t speak a common language. (They can each talk to the cloud, but not to each other, which provides all the connectivity you need while keeping the nodes at arm’s length.) But Apple and other tech companies seem determined to combine all of our devices into one terrifying hydra of information. One of the big selling points of the last few Mac OS X updates has been a feature ominously known as Continuity: you can start writing an email or editing a document on one device and pick it up on another, or use your laptop or tablet to make calls through your phone. This sounds like a nice feature in theory, but on closer scrutiny, it falls apart. The whole point of owning multiple devices is that each one is best suited for a certain kind of activity: I don’t want to edit a text document on my phone or make a call on my laptop if I can possibly avoid it. It might be nice to have the option of resuming on one device where you left off somewhere else, but in practice, most of us structure our routines so that we don’t have to worry about that: we can always save something and come back to it, and if we can’t, it implies that we’re enslaved to our work in a way that makes a mockery of any discussion of convenience. And retaining that option, in the rare cases when it’s really useful, involves tethering ourselves to a whole other system of logins, notifications, and switching stations that clutter up the ordinary tasks that don’t require that kind of connectivity.

Is the result “convenient”? Maybe for a user assembling such a system from scratch, like Adam naming the animals. But if you’re at all intelligent or thoughtful about how you work, you’ve naturally built up existing routines that work for you alone, using the tools that you have available. No solution designed for everybody is going to be perfect for any particular person, and in practice, the “continuity” that it promises is really a series of discontinuous interruptions, as you struggle to reconcile your work habits with the prepackaged solution that Apple provides. That search for idiosyncratic, practical, and provisional solutions for managing information and switching between different activities is central to all forms of work, creative and otherwise, and an imperfect solution that belongs to you—even if it involves rearranging your plans, heaven forbid, to suit whatever device happens to be accessible at the time—is likely to be more useful than whatever Apple has in mind. And treating the different parts of your digital life as essentially separate seems like a good first step. When we keep each device in its own little silo, we have a decent shot at figuring out an arrangement that suits each one individually, rather than wrestling with the octopus of connectivity. In the long run, any version of convenience that has been imposed from the outside isn’t convenient at all. And that’s the inconvenient truth.

Written by nevalalee

February 8, 2016 at 9:59 am

Posted in Writing


Quote of the Day

leave a comment »

Cornelius Eady

The draft is what you know about writing a poem running up against what you don’t know about the subject. If you’re lucky, you get to surprise yourself.

Cornelius Eady, to Drunken Boat

Written by nevalalee

February 8, 2016 at 7:30 am

Picasso until proven otherwise

leave a comment »

August Wilson

I always say that any painter that stands before a canvas is Picasso until proven otherwise. He stands before a blank canvas and he takes his tools. Paint, form, line, mass, color, relationship—those are the tools, and his mastery of those tools is what will enable him to put that painting on canvas. Everybody does the same thing. His turn out like that because he’s mastered the tools. What happens with writers is that they don’t want to learn the craft—that is, your tools. So if you wanna write plays, you can’t write plays without knowing the craft of playwriting. Once you have your tools, then you still gotta create out of that thing, that impulse. Out of necessity, as Bearden says: “Art is born out of necessity.” Most writers ignore the very thing that would get them results, and that’s craft. And how do you learn craft? In the trenches.

August Wilson, to The Believer

Written by nevalalee

February 7, 2016 at 7:30 am

A chef’s technique

leave a comment »

Jacques Pépin

To be a good chef, you have to be a good technician first. It does take time, and it’s a very thorough and very disciplined part of the learning process to be a good technician. If you happen to have talent, however, then the technique allows the talent to show, it takes it somewhere. If you are not a good technician, even if you are talented you cannot show it off, because you don’t have the knowledge in your fingers to do it…

Does [technique] make you an artist? Not really; you are a good technician. But if you happen to have talent and knowledge of the trade in your hand, then you can take it somewhere. Often, you reject all of the technique…But the fact of rejecting implies that you have acquired technique: otherwise there is nothing to reject. Whether you are a photographer or a surgeon or someone who works with his hands, a cabinetmaker and all that, you have to know your trade inside and out, and control it.

Jacques Pépin, quoted in Wisdom by Andrew Zuckerman

Written by nevalalee

February 6, 2016 at 7:30 am