Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Joss Whedon’

Famous monsters of filmland


For his new book The Big Picture: The Fight for the Future of the Movies, the journalist Ben Fritz reviewed every email from the hack of Sony Pictures, all of which are still available online. Whatever you might think about the ethics of using such material, it’s a gold mine of information about how Hollywood has done business over the last decade, and Fritz has come up with some fascinating nuggets. One of the most memorable finds is an exchange between studio head Amy Pascal and the producer Scott Rudin, who was trying to convince her to take a chance on Danny Boyle’s adaptation of Steve Jobs. Pascal had expressed doubts about the project, particularly over the casting of Michael Fassbender in the lead, and after arguing that it was less risky than The Social Network, Rudin delivered a remarkable pep talk:

You ought to be doing this movie—period—and you and I both know that the cold feet you are feeling is costing you this movie that you want. Once you have cold feet, you’re done. You’re making this decision in the anticipation of how you will be looked at in failure. That’s how you fail. So you’re feeling wobbly in the job right now. Here’s the fact: nothing conventional you could do is going to change that, and there is no life-changing hit that is going to fall into your lap that is not a nervous decision, because the big obvious movies are going to go elsewhere and you don’t have the IP right now to create them from standard material. You have this. Face it…Force yourself to muster some confidence about it and do the exact thing right now for which your career will be known in movie history: be the person who makes the tough decisions and sticks with them and makes the unlikely things succeed. Fall on your sword—when you’ve lost that, it’s finished. You’re the person who does these movies. That’s—for better or worse—who you are and who you will remain. To lose that is to lose yourself.

Steve Jobs turned out to be a financial disappointment, and its failure—despite the prestige of its subject, director, and cast—feels emblematic of the move away from films driven by stars to those that depend on “intellectual property” of the kind that Sony lacked. In particular, the movie industry seems to have shifted to a model perfected by Marvel Studios, which builds a cinematic universe that can drum up excitement for future installments and generate huge grosses overseas. Yet this isn’t exactly new. In the groundbreaking book The Genius of the System, which was published three decades ago, Thomas Schatz notes that Universal did much the same in the thirties, when it pioneered the genre of cinematic horror under founder Carl Laemmle and his son:

The horror picture scarcely emerged full-blown from the Universal machinery, however. In fact, the studio had been cultivating the genre for years, precisely because it played to Universal’s strengths and maximized its resources…Over the years Carl Laemmle built a strong international distribution system, particularly in Europe…[European filmmakers] brought a fascination for the cinema’s distinctly unrealistic qualities, its capacity to depict a surreal landscape of darkness, nightmare logic, and death. This style sold well in Europe.

After noting that the aesthetic of horror lent itself to movies built out of little more than shadows and fog, which were the visual effects of their era, Schatz continues: “This rather odd form of narrative economy was vitally important to a studio with limited financial resources and no top stars to carry its pictures. And in casting, too, the studio turned a limitation into an asset, since the horror film did not require romantic leads or name stars.”

The turning point was Tod Browning’s Dracula, a movie “based on a presold property” that could serve as an entry point for other films along the same lines. It didn’t require a star, but “an offbeat character actor,” and Universal’s expectations for it eerily foreshadow the way in which studio executives still talk today. Schatz writes:

Laemmle was sure it would [succeed]—so sure, in fact, that he closed the Frankenstein deal several weeks before Dracula’s February 1931 release. The Lugosi picture promptly took off at the box office, and Laemmle was more convinced than ever that the horror film was an ideal formula for Universal, given its resources and the prevailing market conditions. He was convinced, too, that he had made the right decision with Frankenstein, which had little presold appeal but now had the success of Dracula to generate audience anticipation.

Frankenstein, in short, was sort of like the Ant-Man of the thirties, a niche property that leveraged the success of its predecessors into something like real excitement. It worked, and Universal’s approach to its monsters anticipates what Marvel would later do on a vaster scale, with “ambitious crossover events” like House of Frankenstein and House of Dracula that combined the studio’s big franchises with lesser names that seemed unable to carry a film on their own. (If Universal’s more recent attempt to do the same with The Mummy fell flat, it was partially because it was unable to distinguish between the horror genre, the star picture, and the comic book movie, resulting in a film that turned out to be none of the above. The real equivalent today would be Blumhouse Productions, which has done a much better job of building its brand—and which distributes its movies through Universal.)

And the inability of such movies to provide narrative closure isn’t a new development, either. After seeing James Whale’s Frankenstein, Carl Laemmle, Jr. reacted in much the same way that executives presumably do now:

Junior Laemmle was equally pleased with Whale’s work, but after seeing the rough cut he was certain that the end of the picture needed to be changed. His concerns were twofold. The finale, in which both Frankenstein and his monster are killed, seemed vaguely dissatisfying; Laemmle suspected that audiences might want a glimmer of hope or redemption. He also had a more pragmatic concern about killing off the characters—and thus any possibility of sequels. Laemmle now regretted letting Professor Van Helsing drive that stake through Count Dracula’s heart, since it consigned the original character to the grave…Laemmle was not about to make the same mistake by letting that angry mob do away with the mad doctor and his monster.

Whale disagreed, but he was persuaded to change the ending after a preview screening, leaving open the possibility that the monster might have survived. Over eight decades later, Joss Whedon offered a similar explanation in an interview with Mental Floss: “It’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant…My feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.” For now, we’re living in a world made by the Universal monsters—and with only a handful of viable properties, half of which are owned by Disney. Without such properties, it might seem impossible, as Rudin said, “to create them from standard material.” But we’re also still waiting to be blindsided by the next great franchise. As another famous monster once put it: “A lot of times, people don’t know what they want until you show it to them.” And when it came to the movies, at least, Steve Jobs was right.

This is how they lose us


“If you’re a boy writer, it’s a simple rule,” Junot Díaz once said. “You’ve gotta get used to the fact that you suck at writing women and that the worst woman writer can write a better man than the best male writer can write a good woman. And it’s just the minimum.” Díaz was speaking at an event at the Word Up Community Bookshop in New York on June 7, 2012, and his remarks, which he delivered in response to a question from the audience, have been widely quoted, passed around, and admired. He continued:

Because the thing about the sort of heteronormative masculine privilege, whether it’s in Santo Domingo, or the United States, is you grow up your entire life being told that women aren’t human beings, and that women have no independent subjectivity. And because you grow up with this, it’s this huge surprise when you go to college and realize that, “Oh, women aren’t people who does my shit and fucks me.”

And it’s hard to read this now without thinking of the recent essay by the writer Alisa Valdes, who says—along with so much else—of her painful relationship with Díaz: “Once, Díaz even asked me to clean his disgusting kitchen before I left back to Boston, telling me his severe depression made it hard for him to pick up after himself…When I asked him about this, he laughed and called out from his futon on the floor in his bedroom: ‘Sweetie, you can take the man out of the D.R., but you can’t take the Dominican out of the man.’”

But in light of the allegations against Díaz, it’s important to revisit his words from six years ago, because they speak to a difficult point that is only going to get harder. I wish I could quote the entire thing—which starts here around the 36:15 mark—but I’ll content myself with one more excerpt:

Every time I’m teaching boys to write, I read their women to them, and I’m like, “Yo, you think this is good writing?” These motherfuckers attack each other over cliché lines but they won’t attack each other over these toxic representations of women that they have inherited. Their sexist shorthand—they think that is observation. And I think the first step is to admit that you, because of your privilege, have a very distorted sense of women’s subjectivity. And without an enormous amount of assistance, you’re not even going to get a D. I think with male writers the most that you can hope for is a D with an occasional C thrown in. Where the average women writer, when she writes men, she gets a B right off the bat, because they spent their whole life being taught that men have a subjectivity. In fact, part of the whole feminism revolution was saying, “Me too, motherfuckers.”

When I read this, the first thing that hits me, apart from the intensity, is how beautifully Díaz manages to say all the right things. It reminds me now of what the librarian Allie Jane Bruce said of an interview with Daniel Handler and Sherman Alexie, who are currently being scrutinized themselves. These are men who “speak the language of liberalism, progressivism, and feminism perfectly and are capitalizing on it. Using it to promote themselves and their books.”

I haven’t read much of Díaz, so I’m not qualified to discuss his work in detail, but I can testify to what he meant to many different groups of writers and readers, including science fiction fans. And his case speaks to the next stage of the reckoning that confronts us, which will involve talking about the behavior of men who we thought were different, and who amount to native speakers of the language of third-wave feminism. I often think of a quote that is widely attributed to Joss Whedon, allegedly in response to an interviewer who asked why he wrote strong female characters: “Because you’re still asking me that question.” In fact, this wasn’t an interview, but an acceptance speech for the Equality Now award, in which he asked himself that question six times and came up with appropriately cutting responses. I don’t doubt that he was asked about it a lot—but it’s also noteworthy that his most quotable response came in reply to a straw man that he had set up expressly to knock down. And these days, his remarks have a more sinister ring. Whedon opened with the words: “I’m surrounded tonight by people of extraordinary courage.” According to his former wife Kai Cole, however, Whedon once felt that he was surrounded by something rather different:

He wrote me: “When I was running Buffy, I was surrounded by beautiful, needy, aggressive young women. It felt like I had a disease, like something from a Greek myth. Suddenly I am a powerful producer and the world is laid out at my feet and I can’t touch it.” But he did touch it.

And the hardest realization of all might be that these two sides of Whedon weren’t mutually exclusive. They existed at the same time.

In fact, we’re reaching the point where a man’s overly insistent assertion of his feminism might even come off as a warning sign. As Lindsay Zoladz wrote a few months ago on The Ringer: “There’s also something vaguely unsettling right now about male producers who make a point of their good relationships working with creative women…Quietly existing as a male ally is one thing; building a public brand off Not Being That Creep is another.” And there’s nothing easy about the conversations that still have yet to come. (I can’t help comparing Díaz’s situation to that of Eric T. Schneiderman, another prominent public advocate of women who resigned just hours after the publication of an article in The New Yorker about his abusive behavior in his private life. The New Yorker also has a long history with Díaz, including a recent personal essay that was widely seen as an attempt to get ahead of the allegations. But the magazine hasn’t said anything about him yet. And this isn’t a matter of indifference, but a reflection of how hard it can be to acknowledge the behavior of those we know and admire.) But perhaps the first step is to listen to our doubts, even if they seem unlikely to be heeded. As Virginia Vitzthum writes in Elle:

Díaz is an outspoken leftist, decrying economic and other inequalities from his position as fiction editor of the Boston Review. He calls sexism, along with racism and genocide, one of his major concerns as an activist and a writer…He refers to his writing as a “feminist-aligned project” achieved by “mapping male subjectivities.” I do not doubt that he is sensitive to the ways women are marginalized; it seems appropriate to ask him about the sexism in [This is How You Lose Her].

When she raises her concerns about his “constant dismissal of women as sets of culo-and-titties,” Díaz gets “all professorial” on her, but Vitzthum is having none of it. She writes in her conclusion: “About my failure to engage productively with your maps of male subjectivity? It’s not me, it’s you.” She’s right. And she was right when she wrote it six years ago.

Written by nevalalee

May 8, 2018 at 8:41 am

The crowded circle


Earlier this week, Thrillist posted a massive oral history devoted entirely to the climactic battle scene in The Avengers. It’s well over twelve thousand words, or fifty percent longer than Ronan Farrow’s Pulitzer Prize-winning investigation of Harvey Weinstein, and you can occasionally feel it straining to justify its length. In its introduction, it doesn’t shy away from the hard sell:

Scholars swore that comic-book moviemaking peaked with Christopher Nolan’s lauded vision for The Dark Knight, yet here was an alternative, propulsive, prismatic, and thoughtful…The Battle of New York wasn’t just a third-act magic trick; it was a terraforming of the blockbuster business Hollywood believed it understood.

To put it mildly, this slightly overstates the case. Yet the article is still worth reading, both for its emphasis on the contributions of such figures as storyboard artist Jane Wu and for the presence of director Joss Whedon, who casually throws shade in all directions, including at himself. For instance, at one point, Ryan Meinerding, Marvel’s head of visual development, recalls of the design of the alien guns: “We tried to find something that, if Black Widow got ahold of one of their weapons, she could use it in an interesting way. Which is how we ended up with that sort of long Civil War weapons.” Whedon’s perspective is somewhat different: “I look back, and I’m like, So my idea for making the weapons look different was to give them muskets? Did I really do that? Was that the sexiest choice? Muskets? Okay. But you know, hit or miss.”

These days, I can’t listen to Whedon’s studiously candid, self-deprecating voice in quite the way that I once did, but he’s been consistently interesting—if not always convincing—on points of craft, and his insights here are as memorable as usual. My favorite moment comes when he discusses the structure of the sequence itself, which grew from an idea for what he hoped would be an iconic image:

We’re going to want to see the group together. We’re going to want to do a shot of everyone back to back. Now we are a team. This is “The Avengers.” We’d get them in a circle and all facing up. Ryan Meinerding painted the team back to back, and that’s basically what I shot. They’re so kinetic and gorgeous, and he has a way of taking comic books and really bringing them to life, even beyond Alex Ross in a way that I’ve never seen…But then it was like, okay, why are they in a circle? That’s where they’re standing, but why? Let’s assume that there are aliens all over the walls, they’re surrounding them, they’re going to shoot at them, but they haven’t started yet. Why haven’t they started yet? And I was like Oh, let’s give the aliens a war cry… Then one of the aliens takes off his mask because we need to see their faces and hear that cry. The Avengers are surrounded by guys going, “We are going to fuck you up.” But not by guys who are shooting yet.

He concludes: “So there is a very specific reason that sort of evolved more and more right before we shot it. And then it’s like, okay, we got them here, and then once they’re there, you’re like, okay, how do we get them to the next thing?”

On some level, this is the kind of thing I should love. As I’ve discussed here before, the big beats of a story can emerge from figuring out what comes before and after a single moment, and I always enjoy watching a writer work through such problems in the most pragmatic way possible. In this case, though, I’m not sure about the result. The third act of The Avengers has always suffered a little, at least for me, from its geographic constraints. A handful of heroes have to credibly fend off an attack from an alien army, which naturally limits how big or dispersed the threat can be, and it seems strange that an invasion of the entire planet could be contained within a few blocks, even if they happen to include the photogenic Park Avenue Viaduct. The entire conception is undermined by the need to keep most of the characters in one place. You could imagine other possible climaxes—a chase, an assault on the enemy stronghold, a battle raging simultaneously at different locations around the world—that would have involved all the major players while still preserving a sense of plausibility and scale. But then you wouldn’t have gotten that circle shot. (Elsewhere in the article, Whedon offers a weirdly condescending aside about Zak Penn’s original draft of the script: “I read it one time, and I’ve never seen it since. I was like, ‘Nope. There’s nothing here.’ There was no character connection. There was a line in the stage directions that said, apropos of nothing, ‘And then they all walk towards the camera in slow motion because you have to have that.’ Yeah, well, no: You have to earn that.” Which sounds more to me like Whedon defensively dismissing the kind of joke that he might have made himself. And you could make much the same criticism of the circle shot that he had in mind.)

And the whole anecdote sums up my mixed feelings toward the Marvel Universe in general and The Avengers in particular. On its initial release, I wrote that “a lot of the film, probably too much, is spent slotting all the components into place.” That certainly seems to have been true of the climax, which also set a dangerous precedent in which otherwise good movies, like The Winter Soldier, felt obliged to end in a blur of computer effects. And it’s even clearer now that Whedon’s tastes and personality were only occasionally allowed to shine through, often in the face of active opposition from the studio. (Of one of the few moments from the entire movie that I still recall fondly, Whedon remembers: “There were objections to Hulk tossing Loki. I mean, strong objections. But they were not from Kevin [Feige] and Jeremy [Latcham], so I didn’t have to worry.”) Marvel has since moved on to movies like Captain America: Civil War, Thor: Ragnarok, and Black Panther, many of which are authentically idiosyncratic, fun, and powerful in a way that the studio’s defining effort pulled off only intermittently. But it’s revealing that the last two films were mostly allowed to stand on their own, which is starting to seem like a luxury. Marvel is always trying to get to that circle shot, and now the numbers have been multiplied by five. It reflects what I’ve described as the poster problem, which turns graphic design—or storytelling—into an exercise in crowd control. I’m looking forward to Avengers: Infinity War, but my expectations have been tempered in ways for which The Avengers itself, and specifically its climactic battle, was largely responsible. As Whedon concedes: “Sometimes you have to do the shorthand version, and again, that’s sort of against how I like to view people, but it’s necessary when you already have twenty major characters.”

The watchful protectors



In the foreword to his new book Better Living Through Criticism, the critic A.O. Scott imagines a conversation with a hypothetical interlocutor who asks: “Would it be accurate to say that you wrote this whole book to settle a score with Samuel L. Jackson?” “Not exactly,” Scott replies. The story, in case you’ve forgotten, is that after reading Scott’s negative review of The Avengers, Jackson tweeted that it was time to find the New York Times critic a job “he can actually do.” As Scott recounts:

Scores of his followers heeded his call, not by demanding that my editors fire me but, in the best Twitter tradition, by retweeting Jackson’s outburst and adding their own vivid suggestions about what I was qualified to do with myself. The more coherent tweets expressed familiar, you might even say canonical, anticritical sentiments: that I had no capacity for joy; that I wanted to ruin everyone else’s fun; that I was a hater, a square, and a snob; even—and this was kind of a new one—that the nerdy kid in middle school who everybody picked on because he didn’t like comic books had grown up to be me.

Before long, it all blew over, although not before briefly turning Scott into “both a hissable villain and a make-believe martyr for a noble and much-maligned cause.” And while he says that he didn’t write his book solely as a rebuttal to Jackson, he implies that the kerfuffle raised a valuable question: what, exactly, is the function of a critic these days?

It’s an issue that seems worth revisiting after this weekend, when a movie openly inspired by the success of The Avengers rode a tide of fan excitement to a record opening, despite a significantly less positive response from critics. (Deadline quotes an unnamed studio executive: “I don’t think anyone read the reviews!”) By some measures, it’s the biggest opening in history for a movie that received such a negative critical reaction, and if anything, the disconnect between critical and popular reaction is even more striking this time around. But it doesn’t seem to have resulted in the kind of war of words that blindsided Scott four years ago. Part of this might be due to the fact that fans seem much more mixed on the movie itself, or that the critical consensus was uniform enough that no single naysayer stood out. You could even argue—as somebody inevitably does whenever a critically panned movie becomes a big financial success—that the critical reaction is irrelevant for this kind of blockbuster. To some extent, you’d be right: the only tentpole series that seems vulnerable to reviews is the Bond franchise, which skews older, and for the most part, the moviegoers who lined up to see Dawn of Justice were taking something other than the opinions of professional critics into account. This isn’t a superpower on the movie’s part: it simply reflects a different set of concerns. And you might reasonably ask whether this kind of movie has rendered the role of a professional critic obsolete.


But I would argue that such critics are more important than ever, and for reasons that have a lot to do with the “soulless corporate spectacle” that Scott decried in The Avengers. I’ve noted here before that the individual installments in such franchises aren’t designed to stand on their own: when you’ve got ten more sequels on the release schedule, it’s hard to tell a self-contained, satisfying story, and even harder to change the status quo. (As Joss Whedon said in an interview with Mental Floss: “You’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant.”) You could be cynical and say that no particular film can be allowed to interfere with the larger synergies at stake, or, if you’re in a slightly more generous mood, you could note that this approach is perfectly consistent with the way in which superhero stories have always been told. For the most part, no one issue of Batman is meant to stand as a definitive statement: it’s a narrative that unfolds month by month, year by year, and the character of Batman himself is far more important than any specific adventure. Sustaining that situation for decades on end involves a lot of artistic compromises, as we see in the endless reboots, resets, spinoffs, and alternate universes that the comic book companies use to keep their continuities under control. Like a soap opera, a superhero comic has to create the illusion of forward momentum while remaining more or less in the same place. It’s no surprise that comic book movies would employ the same strategy, which also implies that we need to start judging them by the right set of standards.

But you could say much the same thing about a professional critic. What A.O. Scott says about any one movie may not have an impact on what the overall population of moviegoers—even the ones who read the New York Times—will pay to see, and a long string of reviews quickly blurs together. But a critic who writes thoughtfully about the movies from week to week is gradually building up a narrative, or at least a voice, that isn’t too far removed from what we find in the comics. Critics are usually more concerned with meeting that day’s deadline than with adding another brick to their life’s work, but when I think of Roger Ebert or Pauline Kael, it’s sort of how I think of Batman: it’s an image or an attitude created by its ongoing interactions with the minds of its readers. (Reading Roger Ebert’s memoirs is like revisiting a superhero’s origin story: it’s interesting, but it only incidentally touches the reasons that Ebert continues to mean so much to me.) The career of a working critic these days naturally unfolds in parallel with the franchise movies that will dominate studio filmmaking for the foreseeable future, and if the Justice League series will be defined by our engagement with it for years to come, a critic whose impact is meted out over the same stretch of time is better equipped to talk about it than almost anyone else—as long as he or she approaches it as a dialogue that never ends. If franchises are fated to last forever, we need critics who can stick around long enough to see larger patterns, to keep the conversation going, and to offer some perspective to balance out the hype. These are the critics we deserve. And they’re the ones we need right now.

The second time around



Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What’s something you wish could be remade or redone but is maybe too iconic or otherwise singular for anyone to actually take on the risk?”

When you look at a chronological list of any artist’s works, the first item can be both less and more than meets the eye. A first novel or movie—to take just two art forms—is always biographically interesting, but it’s also subject to particular pressures that can limit how well it expresses the creator’s personality. It’s the product of comparative youth, so it often suffers from rawness and inexperience, and it enters the world under unfavorable circumstances. For an unproven quantity from an unknown name, the tension between personal expression and the realities of the marketplace can seem especially stark. An aspiring novelist may write a book he hopes he can sell; a filmmaker usually starts with a small project that has a chance at being financed; and both may be drawn to genres that have traditionally been open to new talent. Hence the many directors who got their start in horror, exploitation, and even borderline porn. Francis Ford Coppola’s apprenticeship is a case in point. Before Dementia 13, which he made under the auspices of Roger Corman, he’d directed skin flicks like Tonight for Sure and The Bellboy and the Playgirls, and it took years of kicking around before he landed on The Godfather, which I’m sure he, and the rest of us, would prefer to see as his real debut.

Any early work, then, needs to be taken with a grain of salt. (This doesn’t even account for the fact that what looks like a debut may turn out that way almost by accident. The Icon Thief wasn’t the first novel I attempted or even finished, but it was the first one published, and it set a pattern for my career that I didn’t entirely anticipate.) But there’s also a real sense that an artist’s freshman efforts may be the most characteristic works he or she will ever produce. When you’re writing a novel or making a movie for the first time, you aren’t necessarily thinking in terms of a filmography that will stretch over fifty years: it seems like enough of a miracle to get this one story out into the world. As a result, if you’re at all rational, you’ll invest that effort into something that matters to you. This could be your only shot, so you may as well spend it on an idea that counts. Later, as you grow older, you often move past those early interests and obsessions, but they’ll always carry an emotional charge that isn’t there in the works you tackled in your maturity, or after you had all the resources you needed. And when you look back, you may find yourself haunted by the divide between your ambitions and the means—internal and otherwise—available to you at the time.


That’s why I’m always a little surprised that more artists don’t go back to revisit their own early work with an eye to doing a better job. Sometimes, of course, the last thing you want is to return to an old project: doing it even once can be enough to drain you of all enthusiasm. But it happens. In fiction, the revised versions of novels like The Magus, The Sot-Weed Factor, and The Stand represent a writer’s attempt to get it right the second time. You could see the television version of Buffy the Vampire Slayer as Joss Whedon’s remake of his own original screenplay in the form that it deserved. In film, directors as different as Ozu, DeMille, Hitchcock, and Haneke have gone back to redo their earlier work with bigger stars, larger budgets, or simply a more sophisticated sense of what the story could be. (My own favorite example is probably Evil Dead 2, which is less a sequel than a remake in a style closer to Sam Raimi’s intentions.) And of course, the director’s cut, which has turned into a gimmick to sell movies on video or to restore deleted scenes that should have remained unseen, began as a way for filmmakers to make another pass on the same material. Close Encounters, Blade Runner, Apocalypse Now, and Ashes of Time have all been revised, and even if you prefer the older versions, it’s always fascinating to see a director rethink the choices he initially made.

That said, this impulse has its dark side: George Lucas has every right to tinker with the Star Wars movies, but not to withdraw the originals from circulation. But it’s something that deserves to happen more often. Hollywood loves remakes, but they’d be infinitely more interesting if they represented the original director’s renewed engagement with his own material. I’d love to have seen Kubrick—rather than Adrian Lyne—revisit Lolita in a more permissive decade, for instance, and to take a modern example almost at random, I’d much rather see Brian De Palma go back to one of his earlier flawed movies, like The Fury or even Dressed to Kill, than try to recapture the same magic with diminishing returns. And the prospect of David Fincher doing an Alien movie now would be considerably more enticing than what he actually managed to do with it twenty years ago. (On a somewhat different level, I’ve always thought that The X-Files, which strained repeatedly to find new stories in its later years, should have gone back to remake some of its more forgettable episodes from the first season with better visual effects and a fresh approach.) Most artists, obviously, prefer to strike out in new directions, and such projects would carry the implication that they were only repeating themselves. But if the movies are going to repeat old ideas anyway, they might as well let their creators take another shot.

The poster problem



Three years ago, while reviewing The Avengers soon after its opening weekend, I made the following remarks, which seem to have held up fairly well:

This is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many…so that a lot of the film, probably too much, is spent slotting all the components into place.

If the early reactions to Age of Ultron are any indication, I could copy and paste this text and make it the centerpiece of a review of any Avengers movie, past or future. This isn’t to say that the latest installment—which I haven’t seen—might not be fine in its way. But even the franchise’s fans, of whom I’m not really one, seem to admit that much of it consists of Whedon dealing with all those moving parts, and the extent of your enjoyment depends largely on how well you feel he pulls it off.

Whedon himself has indicated that he has less control over the process than he’d like. In a recent interview with Mental Floss, he says:

But it’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant. And now I find myself with a huge crew of people and, although I’m not as bloodthirsty as some people like to pretend, I think it’s disingenuous to say we’re going to fight this great battle, but there’s not going to be any loss. So my feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.

Which, when you think about it, is a startling statement to hear from one of Hollywood’s most powerful directors. But it accurately describes the situation. Any Avengers movie will always feel less like a story in itself than like a kind of anomalous weather pattern formed at the meeting point of several huge fronts: the plot, such as it is, emerges in the transition zone, and it’s dwarfed by the masses of air behind it. Marvel has made a specialty of exceeding audience expectations just ever so slightly, and given the gigantic marketing pressures involved, it’s a marvel that it works as well as it does.


It’s fair to ask whether any movie with that poster—with no fewer than eight names above the title, most belonging to current or potential franchise bearers—could ever be more than an exercise in crowd control. But there’s a telling counterexample, and it looks, as I’ve said elsewhere, increasingly impressive with time: Christopher Nolan’s Inception. As the years pass, Inception remains a model movie in many respects, particularly when it comes to the problem of managing narrative complexity. Nolan picks his battles in fascinating ways: he’s telling a nested story with five or more levels of reality, and like Thomas Pynchon, he selectively simplifies the material wherever he can. For instance, once the logic of the plot has been explained, it unfolds more or less as we expect, without the twist or third-act betrayal that we’ve been trained to anticipate in most heist movies. The characters, with the exception of Cobb, are defined largely by their surfaces, with a specified role and a few identifying traits. Yet they don’t come off as thin or underdeveloped, and although the poster for Inception is even more packed than that for Age of Ultron, with nine names above the title, we don’t feel that the movie is scrambling to find room for everyone.

And a glance at the cast lists of these movies goes a long way toward explaining why. The Avengers has about fifty speaking parts; Age of Ultron has sixty; and Inception, incredibly, has only fifteen or so. Inception is, in fact, a remarkably underpopulated movie: aside from its leading actors, only a handful of other faces ever appear. Yet we don’t particularly notice this while watching. In all likelihood, there’s a threshold number of characters necessary for a movie to seem fully peopled—and to provide for enough interesting pairings—and any further increase doesn’t change our perception of the whole. If that’s the case, then it’s another shrewd simplification by Nolan, who gives us exactly the number of characters we need and no more. The Avengers movies operate on a different scale, of course: a movie full of superheroes needs some ordinary people for contrast, and there’s a greater need for extras when the stage is as big as the universe. (On paper, anyway. In practice, the stakes in a movie like this are always going to remain something of an abstraction, since we have eight more installments waiting in the wings.) But if Whedon had been more ruthless at paring down his cast at the margins, we might have ended up with a series of films that seemed, paradoxically, larger: each hero could have expanded to fill the space he or she deserved, rather than occupying one corner of a masterpiece of Photoshop.

Written by nevalalee

April 29, 2015 at 8:44 am

The killing joke



Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What television trope aggravates you the most?”

Clichés exist for a reason. As I pointed out in my post on the cinematic baguette, whenever a trope becomes exhausted to the point of parody, it’s because it was once incredibly useful. Most of the conventions that wind up advertising a story’s unreality, like the fact that the first bill the hero pulls from his wallet is always the exact amount he needs to pay for his cab, or that people in movies rarely say “Hello” or “Goodbye” on the phone, are about saving time or conveying information to the audience. Two of my wife’s least favorite tropes fall in this category: Hollywood Gift Wrap, in which a present is wrapped so that the character can open it simply by lifting off the lid, and They Wasted a Perfectly Good Sandwich, in which one character meets another for lunch, orders, then leaves as soon as the food arrives. In both cases, there’s a pragmatic rationale—it’s a pain to rewrap a present between takes, and it’s equally hard to maintain continuity with food—but it also makes good narrative sense. The point isn’t to watch the character unwrapping the present, but to see what’s inside the box, and even if we’re annoyed by the transparent artifice of that lid with its separate ribbon, we’d probably be even more irritated if the show spent thirty seconds showing the character fumbling with the gift paper.

Television has its own set of tropes that the nature of the medium enforces, although whenever you notice a convention for the first time, you’ll also find a show that can’t wait to break it. For decades, sitcoms and procedural dramas tended to hit the reset button at the end of every episode: no matter what happened, you’d find the characters in the same familiar situations and relationships the following week. This was largely a consequence of syndication, which routinely aired episodes out of order, and the rise in serialized storytelling fueled by alternative viewing options has allowed shows of every genre to show characters evolving over time. Similarly, the concept of the character shield originates in the logistics of actors’ contracts: when the lead actors are slated to appear at least through the rest of the season, there’s little suspense over whether Mulder or Scully will survive their latest brush with the paranormal. More recently, however, shows have begun to play with the audience’s expectations on how invulnerable major characters can be. Joss Whedon is notorious for killing off fan favorites, and Game of Thrones has raised the bar for showing us the unexpected deaths of lead characters—and not once but twice.


On the surface, this seems like a positive development, since it discourages audience complacency and forces the viewer to fully commit to the drama of each episode. With occasional exceptions, the show’s lead character is still relatively safe, barring the occasional contract dispute, but when it comes to the supporting cast, we’ve been taught that no one is immune. Yet I’ve begun to feel that this idea has become a cliché in itself, and at its worst, the storytelling it inspires can be equally lazy. One unexpected character death can be shocking; when a show piles them up over and over again, as The Vampire Diaries does, it isn’t long before we start to see diminishing returns. (It doesn’t help that nobody on The Vampire Diaries seems to stay dead forever.) Even on shows that parcel out their casualties more scrupulously, there’s a sense that this trope is becoming exhausted. When an important character was suddenly dispatched at the beginning of the second season of House of Cards, it was shocking in the moment—although I found myself more distracted by the inexplicability of it all—but the show seemed eager to dance away from confronting the consequences. These days, it’s just business as usual.

And the worst thing about the casual killing of characters is that it encourages a sort of all-or-nothing approach to writing stories. Ninety percent of the time, a show goes through the motions, but every few episodes, somebody is shoved in front of a bus—when it might be more interesting, and more difficult, to create tension and suspense while those characters were still alive. Major deaths should be honestly earned, not just a way to keep the audience awake. Of course, it’s easier to shock than to engage, and the sudden death of a character has become television’s equivalent of a jump scare, an effect that can be pulled off the shelf without thinking. I hate to keep coming back to Breaking Bad as a reference point, just because it’s what everyone else does, but I can’t help it. Few viewers had any doubt that Walt, and probably Jesse, would make it to the final episode, so the writers became agonizingly inventive at finding ways of testing them and their loved ones in the meantime, to the point where death itself seemed like a blessing. At this point, I’m no longer surprised or impressed when a character dies, but I’m actively grateful when a show puts us through the wringer in other ways. There’s an enormous spectrum of experience between life and death. And it’s far better to keep these characters alive, if you can make me care about what happens to them next.

Written by nevalalee

March 14, 2014 at 9:49 am

Man and supermen



I’m starting to come to terms with an uncomfortable realization: I don’t much like The Avengers. Watching it again recently on Netflix, I was impressed by how fluidly it constructs an engaging movie out of so many prefabricated parts, but I couldn’t help noticing how arbitrary much of it seems. The second act, in particular, feels like it’s killing time, and nothing seems all that essential: the movie clips along nicely, but the action scenes follow one another without building, and the stakes never feel especially high, even as the fate of the world hangs in the balance. And I don’t think this is Joss Whedon’s fault. He comes up with an entertaining package, but he’s stuck between the need to play with all the toys he’s been given and the obligation to deliver them intact to their next three movies. Each hero has his or her own franchise where the real story development takes place, so The Avengers begins to play like a sideshow, rather than the main event it could have been. This is a story about these characters, not the story, and for all its color and energy, it’s a movie devoted to preserving the status quo. (Even its most memorable moment seems to have been retconned out of existence by the upcoming Agents of S.H.I.E.L.D.)

And while it may seem pointless to worry about this now, I think it’s worth asking what kind of comic book movies we really want, now that it seems that they’re going to dominate every summer for the foreseeable future. I’ve been pondering this even more since finally seeing Man of Steel, which I liked a lot. It has huge problems, above all the fact that its vision of Superman never quite comes into focus: by isolating him from his supporting cast for much of the movie, it blurs his identity to the point where major turning points, like his decision to embrace his role as a hero, flit by almost unnoticed. Yet once it ditches its awkward flashback structure, the movie starts to work, and its last hour has a real sense of awe, scale, and danger. And I’m looking forward to the inevitable sequel, even if it remains unclear whether Henry Cavill—much less Zack Snyder or Christopher Nolan—can give the scenes set at the Daily Planet the necessary zest. At their best, the Superman films evoke a line of classic newspaper comedies that extends back to His Girl Friday and even Citizen Kane, and it’s in his ability to both wear the suit and occupy the skin of Clark Kent that Christopher Reeve is most sorely missed.


If nothing else, Man of Steel at least has a point of view about its material, however clouded it might be, which is exactly what most of the Marvel Universe movies are lacking. At this point, when dazzling special effects can be taken for granted, what we need more than anything is a perspective toward these heroes that doesn’t feel as if it were dictated solely by a marketing department. Marvel itself doesn’t have much of an incentive to change its way of doing business: it’s earned a ton of money with this approach, and these movies have made a lot of people happy. But I’d still rather watch Chris Nolan’s Batman films, or even an insanity like Watchmen or Ang Lee’s Hulk, than yet another impersonal raid on the Marvel toy chest. Whedon himself is more than capable of imposing an idiosyncratic take on his projects, and even though it only intermittently comes through in The Avengers itself, I’m hopeful that its success will allow him to express himself more clearly in the future—which is one reason why I’m looking forward to Agents of S.H.I.E.L.D., which seems more geared toward his strengths.

And although I love Nolan’s take on the material, it doesn’t need to be dark, or even particularly ambitious. For an illustration, we need look no further than Captain America, which increasingly seems to me like the best of the Marvel movies. Joe Johnston’s Spielberg imitation is the most credible we’ve seen in a long time—even better, in many ways, than Spielberg himself has managed recently with similar material—and you can sense his joy at being given a chance to make his own Raiders knockoff. Watching it again last night, even on the small screen, I was utterly charmed by almost every frame. It’s a goof, but charged with huge affection toward its sources, and I suspect that it will hold up better over time than anyone could have anticipated. Unfortunately, it already feels like an anomaly. Much of its appeal is due to the period setting, which we’ve already lost for the sequel, and it looks like we’ve seen the last of Hugo Weaving’s Red Skull, who may well turn out to be the most memorable villain the Marvel movies will ever see. Marvel’s future is unlikely to be anything other than hugely profitable for all concerned, but it’s grown increasingly less interesting.

Written by nevalalee

July 9, 2013 at 8:54 am

Joss Whedon on the importance of structure



Structure means knowing where you’re going; making sure you don’t meander about. Some great films have been made by meandering people, like Terrence Malick and Robert Altman, but it’s not as well done today and I don’t recommend it. I’m a structure nut. I actually make charts. Where are the jokes? The thrills? The romance? Who knows what, and when? You need these things to happen at the right times, and that’s what you build your structure around: the way you want your audience to feel. Charts, graphs, coloured pens, anything that means you don’t go in blind is useful.

Joss Whedon, to Hotdog (courtesy of Aerogramme Writers’ Studio)

Written by nevalalee

March 16, 2013 at 9:50 am

Lessons from Great TV: An Introduction


As the triumphant conclusion to the fifth season of Mad Men recently made clear, we’re living in an age of great television, at least for those willing to seek it out. It’s also a time in which the role of the television writer has achieved greater prominence in popular culture than ever before. This is partly because of the shows themselves, which are increasingly eager to engage in layered, serialized storytelling; because writers have a much wider range of platforms to discuss their work, whether in the media, at conferences, or on commentary tracks; and because of the emergence of highly articulate fandoms that have made cult heroes out of showrunners like Joss Whedon and Dan Harmon. In my own case, television has inevitably played a large role in my life—everything I’ve ever gotten paid for writing owes something to The X-Files—but it’s only more recently that I’ve begun to think about the specific lessons that television has for writers in any medium.

Over the next two weeks, then, I’m going to be talking about ten episodes of television, in chronological order, that have shaped the way I think about writing. This isn’t meant to be a list of the greatest TV series of all time—unless my plans change, I won’t have a chance to discuss such recent high points as The Wire or Breaking Bad. Rather, these are episodes that illustrate what television has taught me about such important matters as telling complex stories over time; dealing with constraints; managing a large cast of characters; and, crucially, finding a way to end it all. The shows I’ve chosen reflect the haphazard nature of my television education, which was first informed by Nick at Nite and resumed in real time in the early nineties, right around the time Twin Peaks and The Simpsons premiered only four months apart. In short, it’s inseparable from the rhythms of my own life. For the next couple of weeks, I’m going to do my best to explain what the effects have been.

On Monday: Why I wanted Rob Petrie’s job.

Written by nevalalee

June 29, 2012 at 10:00 am

Whither Whedon?


Over the weekend, along with everyone else in the Northern Hemisphere, my wife and I saw The Avengers. I’m not going to bother with a formal review, since there are plenty to go around, and in any case, if you haven’t already seen it, your mind is probably made up either way. I’ll just say that while I enjoyed it, this is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many, as well as a rather dull villain—far better if they had gone with the Red Skull of Captain America—so that a lot of the film, probably too much, is spent slotting all the components into place.

Still, once everything clicks, it moves along efficiently, if not always coherently, and it’s a bright, shiny toy for the eyes, certainly compared to the dreary Thor. It doesn’t force us to rethink what this genre is capable of doing, as The Dark Knight did, but it’s a movie that delivers exactly what audiences want, and perhaps a touch more, which is more than enough to secure the highest opening weekend in history. And this, more than anything else, puts its director in a peculiar position. Joss Whedon has made a career out of seeming to work against considerable obstacles, and never quite succeeding, except in the eyes of his devoted fans. Buffy switched networks; Firefly was canceled before its time; Dollhouse struggled on for two seasons in the face of considerable interference. All of his projects carry a wistful sense of what might have been, and throughout it all, Whedon has been his own best character, unfailingly insightful in interviews, gracious, funny and brave, the underdog whose side he has always so eloquently taken.

So what happens when the underdog becomes responsible for a record-shattering blockbuster? The Avengers isn’t all that interesting as a movie—far less so than The Cabin in the Woods—but it’s fascinating as a portent of things to come. Whedon has delivered the kind of big popular success that can usually be cashed in for the equivalent of one free movie with unlimited studio resources, as if all the holes in his frequent shopper’s card had finally been punched. For most of his career, at least since Buffy, Whedon has had everything—charm, talent, an incredibly avid fanbase—except the one thing that a creative type needs to survive in Hollywood: power. Now, abruptly, he has oodles of it, obtained in the only way possible, by making an ungodly amount of money for a major studio. Which means that he’s suddenly in a position, real or imaginary, to make every fanboy’s dreams come true.

The question is what he intends to do with it. Unlike Christopher Nolan, he isn’t a director who seems to gain personal satisfaction from deepening and heightening someone else’s material, so The Avengers 2 doesn’t seem like the best use of his talents. Personally, I hope he pulls a Gary Ross, takes the money, and runs. He could probably make another Firefly movie, although that doesn’t seem likely at this point. He could make Goners. He could pick up an ailing franchise with fewer moving parts and do wonderful things with it—I hear that Green Lantern is available. Or, perhaps, he’ll surprise us. The Avengers isn’t a bad film, but it gives us only occasional glimpses of the full Whedon, peeking out from between those glossy toys, and those hints make you hunger for a big movie that he could control from beginning to end. For most of his career, fans have been wondering what he’d do with the full resources and freedom he’d long been denied—even as he seemed to thrive on the struggle. And if he’s as smart and brave as he’s always seemed, he won’t wait long to show us.

Written by nevalalee

May 7, 2012 at 9:50 am

Writers of all work


When I was younger, I wanted to be a man of letters. I wasn’t sure what this meant, or even if such a thing still existed, but based on my vague sense of what the position entailed, it sounded like an ideal job. You’d be a novelist first, sure, but you’d also write short stories, nonfiction, criticism, and more, following your own inclinations, after the example of many of my early heroes, like Norman Mailer. It never entered my head to wonder why a writer might produce a body of work like this—I assumed he did it just because it seemed cool. But the more time passes, the more I realize that the figure of “the man of letters” is really a byproduct of years spent looking for ways to make a living while writing. And it’s been like this for a long time. Speaking of the essayists of the eighteenth century, whom he calls “writers of all work,” the critic George Saintsbury says:

The establishment of the calling of man of letters as an irregular profession, and a regular means of livelihood, almost necessarily brought with it the devotion of the man of letters himself to any and every form of literature for which there was a public demand…It became, therefore, almost necessary on the one hand, and comparatively easy on the other, for the [writer]…to be everything by turns and nothing long.

Strike out the phrase "comparatively easy," and you have a pretty good description of the contemporary freelance writer, which is essentially what Samuel Johnson, Oliver Goldsmith, and other denizens of Grub Street were. They worked as essayists, dramatists, poets, and producers of what Saintsbury calls "hackwork or something more"—translations, histories, popular science—as demand and opportunity required. They were, in short, freelancers. And if their work has endured, it's because of their exceptional talent, productivity, and versatility, all of which were born not from some abstract ideal of the man of letters, but from the practical constraints of being a working writer, which every freelancer can understand. They just happened to be better at it than most.

Looking at my own life these days, it’s clear that I’ve had to be “everything by turns and nothing long” to an extent that still takes me by surprise. In the past couple of months alone, I’ve seen the publication of my first novel, worked on the copy edit of the second, and pushed ahead furiously on a rough draft of the third. I’ve written a couple of articles, including my debut essay in The Daily Beast, as well as a long Q&A, a guest post on another blog, and thousands of words here. I have a science fiction novelette coming out in Analog in July and I’m preparing a proposal this week for another nonfiction project. In short, as usual, I’m working on a lot of things at once that don’t, at first glance, have much to do with one another, and sometimes the payoff can be hard to see. But this is what being a working writer is all about.

And this sort of multitasking has creative benefits as well. Drew Goddard, talking to the New York Times the other day about Joss Whedon’s wide range of activities, puts it nicely: “Everything became a vacation from other things.” When you get burnt out on one project, it’s nice to have something else to turn to instead, and your various pieces of work can inform one another in surprising ways. I’ve learned a lot about structuring nonfiction from my work as a novelist—a good essay is often surprisingly similar to a well-constructed chapter—and my fiction, in turn, has benefited from the skills I’ve acquired as an essayist and, yes, a blogger. Everything feeds into everything else, if not right away, then somewhere down the line. It keeps me sane. And after forty years of scrounging around, I’ll have a body of work of which I can hopefully be proud. Because in the end, a man of letters is just a freelancer who survived.

Written by nevalalee

April 18, 2012 at 10:48 am

We need to talk about Cabin

If there’s one thing we’ve learned about American movie audiences over the past decade or so, it’s that they don’t like being surprised. They may say that they do, and they certainly respond positively to twist endings, properly delivered, within the conventions of the genre they were hoping to see. What they don’t like is going to a movie expecting one thing and being given something else. And while this is sometimes a justifiable response to misleading ads and trailers, it can also be a form of resentment at having one’s expectations upended. Audiences, it seems, would rather see a bad movie that meets their expectations than a great one that subverts them. And whenever there’s a sharp discrepancy between critical acclaim and audience reaction, as measured by CinemaScore, it’s often for a challenging film—think Drive or The American—that has been cut together in its commercials to look like safe, brainless genre fare, or one like Vanilla Sky or Solaris that, whatever its flaws, is trying valiantly to break out of the box. (Or The Box.)

I found myself mulling over this yesterday after seeing The Cabin in the Woods, an uneven but often terrific movie, in both senses of the word, that seems designed to frustrate the kinds of audience members that CinemaScore so diligently tracks. All the danger signs were there: this is ostensibly a horror movie, after all, a genre that tends to get positive responses from audience members only if it gives them precisely what they want. It’s also comedy-horror, a notoriously tricky genre. And most of all, writers Joss Whedon and Drew Goddard take a seemingly conventional story—five familiar slasher-movie types menaced in, well, a cabin in the woods—and deconstruct it so savagely that no one, not even the filmmakers or the audience, can escape. Despite all this, The Cabin in the Woods escaped with a C rating on CinemaScore, which is more than I would have expected, but still implies that a lot of people aren’t happy—anything less than a B+ or so is seen as a sign of trouble ahead. As a commenter on the A.V. Club says of the early reaction: “There was quite a lot of love and stunnedness, sure, but there was also a healthy amount of ‘waste of money’ and ‘dumbest movie ever.'”

And in a sense, The Cabin in the Woods is a stupid movie, if you define stupidity as an obstinate refusal to meet your expectations. Clearly, it’s more than capable of delivering the kind of horror that the audience wants: it cheerfully provides plenty of jump scares, shadowy basements, and bucketfuls of gore. The fact that it then turns into something much different can strike a lot of people as simple incompetence. The logic goes something like this: if they could give us a straightforward horror film, but didn’t, they must have no clue as to what we really want. The idea that a movie may know what we want and refuse to provide it, in the classic Joss Whedon style, doesn’t entirely compute—and rightly so, since most of the movies we see have trouble just delivering on their most basic promises. The Cabin in the Woods has it both ways as much as a movie possibly can—it never stops being scary, funny, and entertaining even as it changes the rules of its own game—but it still seems to have left a lot of people feeling cheated. Box Office Mojo sums up the situation nicely:

By delivering something much different, the movie delighted a small group of audience members while generally frustrating those whose expectations were subverted. Moviegoers like to know what they are in for when they go to see a movie, and when it turns out to be something different the movie tends to get punished in exit polling.

So what’s a director, or a movie studio, to do? The easiest response, obviously, is either to give away every twist in the trailer, as the director Robert Zemeckis has famously advocated, or to only make movies that deliver blandly on an audience’s expectations while flattering them otherwise. In the latter case, this results in movies and marketing campaigns like those for Super 8 and Cloverfield (also written, interestingly, by Drew Goddard), which are essentially elaborate simulations of movies with a twist or secret premise, when in fact the film itself is utterly conventional. The Cabin in the Woods, by contrast, has a real secret, not a winking simulacrum of one: the trailer hints at it, but the movie goes much further than most moviegoers would expect. Not surprisingly, it’s getting punished for it. Because unlike movies that appeal squarely to the art house or the solid mainstream, Cabin occupies that risky space where the expectations of a mass audience collide with something rich and strange. And that’s the scariest place for any movie to be.

Written by nevalalee

April 17, 2012 at 10:00 am

What I’ve learned from Glee

The other night, my wife asked, with genuine curiosity: “Why do you like Glee?” Which, honestly, is a really good question. I don’t watch a lot of television; I’m not, as far as I can tell, anything close to Glee‘s target demographic; I know that Glee is fundamentally flawed, and often disappointing; and yet I find it fun to watch and, more surprisingly, interesting to think about. But why?

My only answer, aside from the fact that I like musicals, is that I enjoy Glee because of its flaws, because it can be frustrating and horrifically uneven, because it regularly neglects its own characters, and because an average episode can get nearly every moment wrong—and yet still remain a compelling show. For a writer who cares about pop culture, it's the most interesting case study around. (As opposed to, say, Mad Men, which is the best TV drama I've ever seen, but much less instructive in its sheer perfection.)

Here, then, are some of the lessons, positive and negative, that I’ve tried to draw from Glee:

Positive:

1. Do follow through on big moments. Howard Hawks defined a good movie as having three good scenes and no bad scenes. The average episode of Glee has maybe three good scenes and eight bad scenes, but the good stuff is usually executed with enough conviction and skill to carry the audience past the rest. The lesson? Every story has a few big moments. No matter what else you do as a writer, make sure those moments work.

2. Do invest the audience in your characters as early as possible. Glee‘s pilot, which now seems so long ago, did an impressive job of generating interest in a massive cast of characters. Since then, nearly everything the pilot established has been thrown out the window, but the viewer’s initial engagement with Will, Rachel, and the rest still gives the show a lot of goodwill, which it hasn’t entirely squandered. (Please note, though, that a cast of appealing actors goes a long way toward maintaining the audience’s sympathy. In a novel, once your characters have lost the reader’s interest, it’s very hard to win it back.)

3. Do push against yourself and your story. A.V. Club critic Todd VanDerWerff has done a heroic job of arguing the “three authors” theory of Glee: that the show’s creators—Ryan Murphy, Brad Falchuk, and Ian Brennan—each have distinct, and conflicting, visions of what the show should be, and that this inherent tension is what makes the show so fascinating. Similarly, much of the interest of an ambitious novel comes from the writer’s struggle against the restrictions and contradictions of his or her own story. (Of course, if you don’t give yourself at least some constraints, such as those of genre, you aren’t likely to benefit from this.)

Negative:

1. Don’t neglect structure. Remember the importance of constraints? The trouble with Glee is that it doesn’t seem to have any. Early on, the show established a tone and style in which almost anything could happen, which is fine—but even the most anarchic comedy benefits from following a consistent set of rules. In Glee‘s case, a little more narrative coherence, and a lot more character consistency, would go a long way towards making it a great show, rather than a fascinating train wreck.

2. Don’t take your eye off the long game. Glee rather notoriously went through four years’ worth of plotlines in its first season, and as a result, the second season has seemed increasingly aimless. Obviously, it’s hard for most TV shows, which hover precariously between cancellation and renewal, to plan much further ahead than the next order of episodes, but a novelist has no such excuse. A writer has to maintain the reader’s interest over hundreds of pages, so as tempting as it is to put all your best ideas up front, it’s important to keep a few things in reserve, especially for the ending.

3. Don’t give the audience what it wants. Joss Whedon, as usual, put it best:

In terms of not giving people what they want, I think it’s a mandate: Don’t give people what they want, give them what they need. What they want is for Sam and Diane to get together. [Whispers.] Don’t give it to them. Trust me. [Normal voice.] You know?

Glee, because it was so successful so early on, and with such a devoted fan base, has repeatedly succumbed to the temptation to give viewers exactly what they want, whether it’s more jukebox episodes, bigger musical numbers, or a romance between two of its leads. (And fans don’t like it if the show takes one of these things away.) This approach might work in the short term, but in the long run, it leaves the show—as is becoming increasingly clear—with nowhere else to go. Remember: once your characters, or your readers, get what they want, the story is essentially over.

Of course, none of these issues have hurt Glee‘s success, and judging from the last few episodes, the show is making an effort to dial back the worst of its excesses. And I do hope it continues to improve. As much as I enjoy it now, a show can’t work as a case study forever. Because a show like Glee is always interesting…until, alas, it isn’t.

Written by nevalalee

December 3, 2010 at 12:11 pm
