Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Joss Whedon’

Famous monsters of filmland


For his new book The Big Picture: The Fight for the Future of the Movies, the journalist Ben Fritz reviewed every email from the hack of Sony Pictures, an archive that is still available online. Whatever you might think about the ethics of using such material, it’s a gold mine of information about how Hollywood has done business over the last decade, and Fritz has come up with some fascinating nuggets. One of the most memorable finds is an exchange between studio head Amy Pascal and the producer Scott Rudin, who was trying to convince her to take a chance on Danny Boyle’s adaptation of Steve Jobs. Pascal had expressed doubts about the project, particularly over the casting of Michael Fassbender in the lead, and after arguing that it was less risky than The Social Network, Rudin delivered a remarkable pep talk:

You ought to be doing this movie—period—and you and I both know that the cold feet you are feeling is costing you this movie that you want. Once you have cold feet, you’re done. You’re making this decision in the anticipation of how you will be looked at in failure. That’s how you fail. So you’re feeling wobbly in the job right now. Here’s the fact: nothing conventional you could do is going to change that, and there is no life-changing hit that is going to fall into your lap that is not a nervous decision, because the big obvious movies are going to go elsewhere and you don’t have the IP right now to create them from standard material. You have this. Face it…Force yourself to muster some confidence about it and do the exact thing right now for which your career will be known in movie history: be the person who makes the tough decisions and sticks with them and makes the unlikely things succeed. Fall on your sword—when you’ve lost that, it’s finished. You’re the person who does these movies. That’s—for better or worse—who you are and who you will remain. To lose that is to lose yourself.

Steve Jobs turned out to be a financial disappointment, and its failure—despite the prestige of its subject, director, and cast—feels emblematic of the move away from films driven by stars to those that depend on “intellectual property” of the kind that Sony lacked. In particular, the movie industry seems to have shifted to a model perfected by Marvel Studios, which builds a cinematic universe that can drum up excitement for future installments and generate huge grosses overseas. Yet this isn’t exactly new. In the groundbreaking book The Genius of the System, which was published three decades ago, Thomas Schatz notes that Universal did much the same in the thirties, when it pioneered the genre of cinematic horror under founder Carl Laemmle and his son:

The horror picture scarcely emerged full-blown from the Universal machinery, however. In fact, the studio had been cultivating the genre for years, precisely because it played to Universal’s strengths and maximized its resources…Over the years Carl Laemmle built a strong international distribution system, particularly in Europe…[European filmmakers] brought a fascination for the cinema’s distinctly unrealistic qualities, its capacity to depict a surreal landscape of darkness, nightmare logic, and death. This style sold well in Europe.

After noting that the aesthetic of horror lent itself to movies built out of little more than shadows and fog, which were the visual effects of the era, Schatz continues: “This rather odd form of narrative economy was vitally important to a studio with limited financial resources and no top stars to carry its pictures. And in casting, too, the studio turned a limitation into an asset, since the horror film did not require romantic leads or name stars.”

The turning point was Tod Browning’s Dracula, a movie “based on a presold property” that could serve as an entry point for other films along the same lines. It didn’t require a star, but “an offbeat character actor,” and Universal’s expectations for it eerily foreshadow the way in which studio executives still talk today. Schatz writes:

Laemmle was sure it would [succeed]—so sure, in fact, that he closed the Frankenstein deal several weeks before Dracula’s February 1931 release. The Lugosi picture promptly took off at the box office, and Laemmle was more convinced than ever that the horror film was an ideal formula for Universal, given its resources and the prevailing market conditions. He was convinced, too, that he had made the right decision with Frankenstein, which had little presold appeal but now had the success of Dracula to generate audience anticipation.

Frankenstein, in short, was sort of like the Ant-Man of the thirties, a niche property that leveraged the success of its predecessors into something like real excitement. It worked, and Universal’s approach to its monsters anticipates what Marvel would later do on a vaster scale, with “ambitious crossover events” like House of Frankenstein and House of Dracula that combined the studio’s big franchises with lesser names that seemed unable to carry a film on their own. (If Universal’s more recent attempt to do the same with The Mummy fell flat, it was partially because it was unable to distinguish between the horror genre, the star picture, and the comic book movie, resulting in a film that turned out to be none of the above. The real equivalent today would be Blumhouse Productions, which has done a much better job of building its brand—and which distributes its movies through Universal.)

And the inability of such movies to provide narrative closure isn’t a new development, either. After seeing James Whale’s Frankenstein, Carl Laemmle, Jr. reacted in much the same way that executives presumably do now:

Junior Laemmle was equally pleased with Whale’s work, but after seeing the rough cut he was certain that the end of the picture needed to be changed. His concerns were twofold. The finale, in which both Frankenstein and his monster are killed, seemed vaguely dissatisfying; Laemmle suspected that audiences might want a glimmer of hope or redemption. He also had a more pragmatic concern about killing off the characters—and thus any possibility of sequels. Laemmle now regretted letting Professor Van Helsing drive that stake through Count Dracula’s heart, since it consigned the original character to the grave…Laemmle was not about to make the same mistake by letting that angry mob do away with the mad doctor and his monster.

Whale disagreed, but he was persuaded to change the ending after a preview screening, leaving open the possibility that the monster might have survived. Over eight decades later, Joss Whedon offered a similar explanation in an interview with Mental Floss: “It’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant…My feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.” For now, we’re living in a world made by the Universal monsters—and with only a handful of viable properties, half of which are owned by Disney. Without them, it might seem impossible, as Rudin said, “to create them from standard material.” But we’re also still waiting to be blindsided by the next great franchise. As another famous monster once put it: “A lot of times, people don’t know what they want until you show it to them.” And when it came to the movies, at least, Steve Jobs was right.

This is how they lose us


“If you’re a boy writer, it’s a simple rule,” Junot Díaz once said. “You’ve gotta get used to the fact that you suck at writing women and that the worst woman writer can write a better man than the best male writer can write a good woman. And it’s just the minimum.” Díaz was speaking at an event at the Word Up Community Bookshop in New York on June 7, 2012, and his remarks, which he delivered in response to a question from the audience, have been widely quoted, passed around, and admired. He continued:

Because the thing about the sort of heteronormative masculine privilege, whether it’s in Santo Domingo, or the United States, is you grow up your entire life being told that women aren’t human beings, and that women have no independent subjectivity. And because you grow up with this, it’s this huge surprise when you go to college and realize that, “Oh, women aren’t people who does my shit and fucks me.”

And it’s hard to read this now without thinking of the recent essay by the writer Alisa Valdes, who says—along with so much else—of her painful relationship with Díaz: “Once, Díaz even asked me to clean his disgusting kitchen before I left back to Boston, telling me his severe depression made it hard for him to pick up after himself…When I asked him about this, he laughed and called out from his futon on the floor in his bedroom: ‘Sweetie, you can take the man out of the D.R., but you can’t take the Dominican out of the man.’”

But in light of the allegations against Díaz, it’s important to revisit his words from six years ago, because they speak to a difficult point that is only going to get harder. I wish I could quote the entire thing—which starts around the 36:15 mark of the video—but I’ll content myself with one more excerpt:

Every time I’m teaching boys to write, I read their women to them, and I’m like, “Yo, you think this is good writing?” These motherfuckers attack each other over cliché lines but they won’t attack each other over these toxic representations of women that they have inherited. Their sexist shorthand—they think that is observation. And I think the first step is to admit that you, because of your privilege, have a very distorted sense of women’s subjectivity. And without an enormous amount of assistance, you’re not even going to get a D. I think with male writers the most that you can hope for is a D with an occasional C thrown in. Where the average women writer, when she writes men, she gets a B right off the bat, because they spent their whole life being taught that men have a subjectivity. In fact, part of the whole feminism revolution was saying, “Me too, motherfuckers.”

When I read this, the first thing that hits me, apart from the intensity, is how beautifully Díaz manages to say all the right things. It reminds me now of what the librarian Allie Jane Bruce said of an interview with Daniel Handler and Sherman Alexie, who are currently being scrutinized themselves. These are men who “speak the language of liberalism, progressivism, and feminism perfectly and are capitalizing on it. Using it to promote themselves and their books.”

I haven’t read much of Díaz, so I’m not qualified to discuss his work in detail, but I can testify to what he meant to many different groups of writers and readers, including science fiction fans. And his case speaks to the next stage of the reckoning that confronts us, which will involve talking about the behavior of men who we thought were different, and who amount to native speakers of the language of third-wave feminism. I often think of a quote that is widely attributed to Joss Whedon, allegedly in response to an interviewer who asked why he wrote strong female characters: “Because you’re still asking me that question.” In fact, this wasn’t an interview, but an acceptance speech for the Equality Now award, in which he asked himself that question six times and came up with appropriately cutting responses. I don’t doubt that he was asked about it a lot—but it’s also noteworthy that his most quotable response came in reply to a straw man that he had set up expressly to knock down. And these days, his remarks have a more sinister ring. Whedon opened with the words: “I’m surrounded tonight by people of extraordinary courage.” According to his former wife Kai Cole, however, Whedon once felt that he was surrounded by something rather different:

He wrote me: “When I was running Buffy, I was surrounded by beautiful, needy, aggressive young women. It felt like I had a disease, like something from a Greek myth. Suddenly I am a powerful producer and the world is laid out at my feet and I can’t touch it.” But he did touch it.

And the hardest realization of all might be that these two sides of Whedon weren’t mutually exclusive. They existed at the same time.

In fact, we’re reaching the point where a man’s overly insistent assertion of his feminism might even come off as a warning sign. As Lindsay Zoladz wrote a few months ago on The Ringer: “There’s also something vaguely unsettling right now about male producers who make a point of their good relationships working with creative women…Quietly existing as a male ally is one thing; building a public brand off Not Being That Creep is another.” And there’s nothing easy about the conversations that still have yet to come. (I can’t help comparing Díaz’s situation to that of Eric T. Schneiderman, another prominent public advocate of women who resigned just hours after the publication of an article in The New Yorker about his abusive behavior in his private life. The New Yorker also has a long history with Díaz, including a recent personal essay that was widely seen as an attempt to get ahead of the allegations. But the magazine hasn’t said anything about him yet. And this isn’t a matter of indifference, but a reflection of how hard it can be to acknowledge the behavior of those we know and admire.) But perhaps the first step is to listen to our doubts, even if they seem unlikely to be heeded. As Virginia Vitzthum writes in Elle:

Díaz is an outspoken leftist, decrying economic and other inequalities from his position as fiction editor of the Boston Review. He calls sexism, along with racism and genocide, one of his major concerns as an activist and a writer…He refers to his writing as a “feminist-aligned project” achieved by “mapping male subjectivities.” I do not doubt that he is sensitive to the ways women are marginalized; it seems appropriate to ask him about the sexism in [This is How You Lose Her].

When she raises her concerns about his “constant dismissal of women as sets of culo-and-titties,” Díaz gets “all professorial” on her, but Vitzthum is having none of it. She writes in her conclusion: “About my failure to engage productively with your maps of male subjectivity? It’s not me, it’s you.” She’s right. And she was right when she wrote it six years ago.

Written by nevalalee

May 8, 2018 at 8:41 am

The crowded circle


Earlier this week, Thrillist posted a massive oral history devoted entirely to the climactic battle scene in The Avengers. It’s well over twelve thousand words, or fifty percent longer than Ronan Farrow’s Pulitzer Prize-winning investigation of Harvey Weinstein, and you can occasionally feel it straining to justify its length. In its introduction, it doesn’t shy away from the hard sell:

Scholars swore that comic-book moviemaking peaked with Christopher Nolan’s lauded vision for The Dark Knight, yet here was an alternative, propulsive, prismatic, and thoughtful…The Battle of New York wasn’t just a third-act magic trick; it was a terraforming of the blockbuster business Hollywood believed it understood.

To put it mildly, this slightly overstates the case. Yet the article is still worth reading, both for its emphasis on the contributions of artists like the storyboard artist Jane Wu and for the presence of director Joss Whedon, who casually throws shade in all directions, including at himself. At one point, for instance, Ryan Meinerding, Marvel’s head of visual development, recalls of the design of the alien guns: “We tried to find something that, if Black Widow got ahold of one of their weapons, she could use it in an interesting way. Which is how we ended up with that sort of long Civil War weapons.” Whedon’s perspective is somewhat different: “I look back, and I’m like, So my idea for making the weapons look different was to give them muskets? Did I really do that? Was that the sexiest choice? Muskets? Okay. But you know, hit or miss.”

These days, I can’t listen to Whedon’s studiously candid, self-deprecating voice in quite the way that I once did, but he’s been consistently interesting—if not always convincing—on points of craft, and his insights here are as memorable as usual. My favorite moment comes when he discusses the structure of the sequence itself, which grew from an idea for what he hoped would be an iconic image:

We’re going to want to see the group together. We’re going to want to do a shot of everyone back to back. Now we are a team. This is “The Avengers.” We’d get them in a circle and all facing up. Ryan Meinerding painted the team back to back, and that’s basically what I shot. They’re so kinetic and gorgeous, and he has a way of taking comic books and really bringing them to life, even beyond Alex Ross in a way that I’ve never seen…But then it was like, okay, why are they in a circle? That’s where they’re standing, but why? Let’s assume that there are aliens all over the walls, they’re surrounding them, they’re going to shoot at them, but they haven’t started yet. Why haven’t they started yet? And I was like Oh, let’s give the aliens a war cry… Then one of the aliens takes off his mask because we need to see their faces and hear that cry. The Avengers are surrounded by guys going, “We are going to fuck you up.” But not by guys who are shooting yet.

He concludes: “So there is a very specific reason that sort of evolved more and more right before we shot it. And then it’s like, okay, we got them here, and then once they’re there, you’re like, okay, how do we get them to the next thing?”

On some level, this is the kind of thing I should love. As I’ve discussed here before, the big beats of a story can emerge from figuring out what comes before and after a single moment, and I always enjoy watching a writer work through such problems in the most pragmatic way possible. In this case, though, I’m not sure about the result. The third act of The Avengers has always suffered a little, at least for me, from its geographic constraints. A handful of heroes have to credibly fend off an attack from an alien army, which naturally limits how big or dispersed the threat can be, and it seems strange that an invasion of the entire planet could be contained within a few blocks, even if they happen to include the photogenic Park Avenue Viaduct. The entire conception is undermined by the need to keep most of the characters in one place. You could imagine other possible climaxes—a chase, an assault on the enemy stronghold, a battle raging simultaneously at different locations around the world—that would have involved all the major players while still preserving a sense of plausibility and scale. But then you wouldn’t have gotten that circle shot. (Elsewhere in the article, Whedon offers a weirdly condescending aside about Zak Penn’s original draft of the script: “I read it one time, and I’ve never seen it since. I was like, ‘Nope. There’s nothing here.’ There was no character connection. There was a line in the stage directions that said, apropos of nothing, ‘And then they all walk towards the camera in slow motion because you have to have that.’ Yeah, well, no: You have to earn that.” Which sounds more to me like Whedon defensively dismissing the kind of joke that he might have made himself. And you could make much the same criticism of the circle shot that he had in mind.)

And the whole anecdote sums up my mixed feelings toward the Marvel Universe in general and The Avengers in particular. On its initial release, I wrote that “a lot of the film, probably too much, is spent slotting all the components into place.” That certainly seems to have been true of the climax, which also set a dangerous precedent in which otherwise good movies, like The Winter Soldier, felt obliged to end in a blur of computer effects. And it’s even clearer now that Whedon’s tastes and personality were only occasionally allowed to shine through, often in the face of active opposition from the studio. (Of one of the few moments from the entire movie that I still recall fondly, Whedon remembers: “There were objections to Hulk tossing Loki. I mean, strong objections. But they were not from Kevin [Feige] and Jeremy [Latcham], so I didn’t have to worry.”) Marvel has since moved on to movies like Captain America: Civil War, Thor: Ragnarok, and Black Panther, which are often authentically idiosyncratic, fun, and powerful in a way that the studio’s defining effort pulled off only intermittently. But it’s revealing that the last two films were mostly allowed to stand on their own, which is starting to seem like a luxury. Marvel is always trying to get to that circle shot, and now the numbers have been multiplied by five. It reflects what I’ve described as the poster problem, which turns graphic design—or storytelling—into an exercise in crowd control. I’m looking forward to Avengers: Infinity War, but my expectations have been tempered in ways for which The Avengers itself, and specifically its climactic battle, was largely responsible. As Whedon concedes: “Sometimes you have to do the shorthand version, and again, that’s sort of against how I like to view people, but it’s necessary when you already have twenty major characters.”

The watchful protectors



In the foreword to his new book Better Living Through Criticism, the critic A.O. Scott imagines a conversation with a hypothetical interlocutor who asks: “Would it be accurate to say that you wrote this whole book to settle a score with Samuel L. Jackson?” “Not exactly,” Scott replies. The story, in case you’ve forgotten, is that after reading Scott’s negative review of The Avengers, Jackson tweeted that it was time to find the New York Times critic a job “he can actually do.” As Scott recounts:

Scores of his followers heeded his call, not by demanding that my editors fire me but, in the best Twitter tradition, by retweeting Jackson’s outburst and adding their own vivid suggestions about what I was qualified to do with myself. The more coherent tweets expressed familiar, you might even say canonical, anticritical sentiments: that I had no capacity for joy; that I wanted to ruin everyone else’s fun; that I was a hater, a square, and a snob; even—and this was kind of a new one—that the nerdy kid in middle school who everybody picked on because he didn’t like comic books had grown up to be me.

Before long, it all blew over, although not before briefly turning Scott into “both a hissable villain and a make-believe martyr for a noble and much-maligned cause.” And while he says that he didn’t write his book solely as a rebuttal to Jackson, he implies that the kerfuffle raised a valuable question: what, exactly, is the function of a critic these days?

It’s an issue that seems worth revisiting after this weekend, when a movie openly inspired by the success of The Avengers rode a tide of fan excitement to a record opening, despite a significantly less positive response from critics. (Deadline quotes an unnamed studio executive: “I don’t think anyone read the reviews!”) By some measures, it’s the biggest opening in history for a movie that received such a negative critical reaction, and if anything, the disconnect between critical and popular reaction is even more striking this time around. But it doesn’t seem to have resulted in the kind of war of words that blindsided Scott four years ago. Part of this might be due to the fact that fans seem much more mixed on the movie itself, or that the critical consensus was uniform enough that no single naysayer stood out. You could even argue—as somebody inevitably does whenever a critically panned movie becomes a big financial success—that the critical reaction is irrelevant for this kind of blockbuster. To some extent, you’d be right: the only tentpole series that seems vulnerable to reviews is the Bond franchise, which skews older, and for the most part, the moviegoers who lined up to see Dawn of Justice were taking something other than the opinions of professional critics into account. This isn’t a superpower on the movie’s part: it simply reflects a different set of concerns. And you might reasonably ask whether this kind of movie has rendered the role of a professional critic obsolete.


But I would argue that such critics are more important than ever, and for reasons that have a lot to do with the “soulless corporate spectacle” that Scott decried in The Avengers. I’ve noted here before that the individual installments in such franchises aren’t designed to stand on their own: when you’ve got ten more sequels on the release schedule, it’s hard to tell a self-contained, satisfying story, and even harder to change the status quo. (As Joss Whedon said in an interview with Mental Floss: “You’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant.”) You could be cynical and say that no particular film can be allowed to interfere with the larger synergies at stake, or, if you’re in a slightly more generous mood, you could note that this approach is perfectly consistent with the way in which superhero stories have always been told. For the most part, no one issue of Batman is meant to stand as a definitive statement: it’s a narrative that unfolds month by month, year by year, and the character of Batman himself is far more important than any specific adventure. Sustaining that situation for decades on end involves a lot of artistic compromises, as we see in the endless reboots, resets, spinoffs, and alternate universes that the comic book companies use to keep their continuities under control. Like a soap opera, a superhero comic has to create the illusion of forward momentum while remaining more or less in the same place. It’s no surprise that comic book movies would employ the same strategy, which also implies that we need to start judging them by the right set of standards.

But you could say much the same thing about a professional critic. What A.O. Scott says about any one movie may not have an impact on what the overall population of moviegoers—even the ones who read the New York Times—will pay to see, and a long string of reviews quickly blurs together. But a critic who writes thoughtfully about the movies from week to week is gradually building up a narrative, or at least a voice, that isn’t too far removed from what we find in the comics. Critics are usually more concerned with meeting that day’s deadline than with adding another brick to their life’s work, but when I think of Roger Ebert or Pauline Kael, it’s sort of how I think of Batman: it’s an image or an attitude created by its ongoing interactions with the minds of its readers. (Reading Roger Ebert’s memoirs is like revisiting a superhero’s origin story: it’s interesting, but it only incidentally touches the reasons that Ebert continues to mean so much to me.) The career of a working critic these days naturally unfolds in parallel with the franchise movies that will dominate studio filmmaking for the foreseeable future, and if the Justice League series will be defined by our engagement with it for years to come, a critic whose impact is meted out over the same stretch of time is better equipped to talk about it than almost anyone else—as long as he or she approaches it as a dialogue that never ends. If franchises are fated to last forever, we need critics who can stick around long enough to see larger patterns, to keep the conversation going, and to offer some perspective to balance out the hype. These are the critics we deserve. And they’re the ones we need right now.

The second time around



Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What’s something you wish could be remade or redone but is maybe too iconic or otherwise singular for anyone to actually take on the risk?”

When you look at a chronological list of any artist’s works, the first item can be both less and more than meets the eye. A first novel or movie—to take just two art forms—is always biographically interesting, but it’s also subject to particular pressures that can limit how well it expresses the creator’s personality. It’s the product of comparative youth, so it often suffers from rawness and inexperience, and it enters the world under unfavorable circumstances. For an unproven quantity from an unknown name, the tension between personal expression and the realities of the marketplace can seem especially stark. An aspiring novelist may write a book he hopes he can sell; a filmmaker usually starts with a small project that has a chance at being financed; and both may be drawn to genres that have traditionally been open to new talent. Hence the many directors who got their start in horror, exploitation, and even borderline porn. Francis Ford Coppola’s apprenticeship is a case in point. Before Dementia 13, which he made under the auspices of Roger Corman, he’d directed skin flicks like Tonight for Sure and The Bellboy and the Playgirls, and it took years of kicking around before he landed on The Godfather, which I’m sure he, and the rest of us, would prefer to see as his real debut.

Any early work, then, needs to be taken with a grain of salt. (This doesn’t even account for the fact that what looks like a debut may turn out that way almost by accident. The Icon Thief wasn’t the first novel I attempted or even finished, but it was the first one published, and it set a pattern for my career that I didn’t entirely anticipate.) But there’s also a real sense that an artist’s freshman efforts may be the most characteristic works he or she will ever produce. When you’re writing a novel or making a movie for the first time, you aren’t necessarily thinking in terms of a filmography that will stretch over fifty years: it seems like enough of a miracle to get this one story out into the world. As a result, if you’re at all rational, you’ll invest that effort into something that matters to you. This could be your only shot, so you may as well spend it on an idea that counts. Later, as you grow older, you often move past those early interests and obsessions, but they’ll always carry an emotional charge that isn’t there in the works you tackled in your maturity, or after you had all the resources you needed. And when you look back, you may find yourself haunted by the divide between your ambitions and the means—internal and otherwise—available to you at the time.


That’s why I’m always a little surprised that more artists don’t go back to revisit their own early work with an eye to doing a better job. Sometimes, of course, the last thing you want is to return to an old project: doing it even once can be enough to drain you of all enthusiasm. But it happens. In fiction, the revised versions of novels like The Magus, The Sot-Weed Factor, and The Stand represent a writer’s attempt to get it right the second time. You could see the television version of Buffy the Vampire Slayer as Joss Whedon’s remake of his own original screenplay in the form that it deserved. In film, directors as different as Ozu, DeMille, Hitchcock, and Haneke have gone back to redo their earlier work with bigger stars, larger budgets, or simply a more sophisticated sense of what the story could be. (My own favorite example is probably Evil Dead 2, which is less a sequel than a remake in a style closer to Sam Raimi’s intentions.) And of course, the director’s cut, which has turned into a gimmick to sell movies on video or to restore deleted scenes that should have remained unseen, began as a way for filmmakers to make another pass on the same material. Close Encounters, Blade Runner, Apocalypse Now, and Ashes of Time have all been revised, and even if you prefer the older versions, it’s always fascinating to see a director rethink the choices he initially made.

That said, this impulse has its dark side: George Lucas has every right to tinker with the Star Wars movies, but not to withdraw the originals from circulation. But it’s an idea that deserves to happen more often. Hollywood loves remakes, but they’d be infinitely more interesting if they represented the original director’s renewed engagement with his own material. I’d love to have seen Kubrick—rather than Adrian Lyne—revisit Lolita in a more permissive decade, for instance, and to take a modern example almost at random, I’d much rather see Brian De Palma go back to one of his earlier flawed movies, like The Fury or even Dressed to Kill, than try to recapture the same magic with diminishing returns. And the prospect of David Fincher doing an Alien movie now would be considerably more enticing than what he actually managed to do with it twenty years ago. (On a somewhat different level, I’ve always thought that The X-Files, which strained repeatedly to find new stories in its later years, should have gone back to remake some of its more forgettable episodes from the first season with better visual effects and a fresh approach.) Most artists, obviously, prefer to strike out in new directions, and such projects would carry the implication that they were only repeating themselves. But if the movies are going to repeat old ideas anyway, they might as well let their creators take another shot.

The poster problem



Three years ago, while reviewing The Avengers soon after its opening weekend, I made the following remarks, which seem to have held up fairly well:

This is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many…so that a lot of the film, probably too much, is spent slotting all the components into place.

If the early reactions to Age of Ultron are any indication, I could copy and paste this text and make it the centerpiece of a review of any Avengers movie, past or future. This isn’t to say that the latest installment—which I haven’t seen—might not be fine in its way. But even the franchise’s fans, of which I’m not really one, seem to admit that much of it consists of Whedon dealing with all those moving parts, and the extent of your enjoyment depends largely on how well you feel he pulls it off.

Whedon himself has indicated that he has less control over the process than he’d like. In a recent interview with Mental Floss, he says:

But it’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant. And now I find myself with a huge crew of people and, although I’m not as bloodthirsty as some people like to pretend, I think it’s disingenuous to say we’re going to fight this great battle, but there’s not going to be any loss. So my feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.

Which, when you think about it, is a startling statement to hear from one of Hollywood’s most powerful directors. But it accurately describes the situation. Any Avengers movie will always feel less like a story in itself than like a kind of anomalous weather pattern formed at the meeting point of several huge fronts: the plot, such as it is, emerges in the transition zone, and it’s dwarfed by the masses of air behind it. Marvel has made a specialty of exceeding audience expectations just ever so slightly, and given the gigantic marketing pressures involved, it’s a marvel that it works as well as it does.


It’s fair to ask, in fact, whether any movie with that poster—with no fewer than eight names above the title, most belonging to current or potential franchise bearers—could ever be more than an exercise in crowd control. But there’s a telling counterexample, and it looks, as I’ve said elsewhere, increasingly impressive with time: Christopher Nolan’s Inception. As the years pass, Inception remains a model movie in many respects, particularly when it comes to the problem of managing narrative complexity. Nolan picks his battles in fascinating ways: he’s telling a nested story with five or more levels of reality, and like Thomas Pynchon, he selectively simplifies the material wherever he can. Once the logic of the plot has been explained, for instance, it unfolds more or less as we expect, without the twist or third-act betrayal that we’ve been trained to anticipate in most heist movies. The characters, with the exception of Cobb, are defined largely by their surfaces, with a specified role and a few identifying traits. Yet they don’t come off as thin or underdeveloped, and although the poster for Inception is even more packed than that for Age of Ultron, with nine names above the title, we don’t feel that the movie is scrambling to find room for everyone.

And a glance at the cast lists of these movies goes a long way toward explaining why. The Avengers has about fifty speaking parts; Age of Ultron has sixty; and Inception, incredibly, has only fifteen or so. Inception is, in fact, a remarkably underpopulated movie: aside from its leading actors, only a handful of other faces ever appear. Yet we don’t particularly notice this while watching. In all likelihood, there’s a threshold number of characters necessary for a movie to seem fully peopled—and to provide for enough interesting pairings—and any further increase doesn’t change our perception of the whole. If that’s the case, then it’s another shrewd simplification by Nolan, who gives us exactly the number of characters we need and no more. The Avengers movies operate on a different scale, of course: a movie full of superheroes needs some ordinary people for contrast, and there’s a greater need for extras when the stage is as big as the universe. (On paper, anyway. In practice, the stakes in a movie like this are always going to remain something of an abstraction, since we have eight more installments waiting in the wings.) But if Whedon had been more ruthless at paring down his cast at the margins, we might have ended up with a series of films that seemed, paradoxically, larger: each hero could have expanded to fill the space he or she deserved, rather than occupying one corner of a masterpiece of Photoshop.

Written by nevalalee

April 29, 2015 at 8:44 am

The killing joke



Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “What television trope aggravates you the most?”

Clichés exist for a reason. As I pointed out in my post on the cinematic baguette, whenever a trope becomes exhausted to the point of parody, it’s because it was once incredibly useful. Most of the conventions that wind up advertising a story’s unreality, like the fact that the first bill the hero pulls from his wallet is always the exact amount he needs to pay for his cab, or that people in movies rarely say “Hello” or “Goodbye” on the phone, are about saving time or conveying information to the audience. Two of my wife’s least favorite tropes fall in this category: Hollywood Gift Wrap, in which a present is wrapped so that the character can open it simply by lifting off the lid, and They Wasted a Perfectly Good Sandwich, in which one character meets another for lunch, orders, then leaves as soon as the food arrives. In both cases, there’s a pragmatic rationale—it’s a pain to rewrap a present between takes, and it’s equally hard to maintain continuity with food—but it also makes good narrative sense. The point isn’t to watch the character unwrapping the present, but to see what’s inside the box, and even if we’re annoyed by the transparent artifice of that lid with its separate ribbon, we’d probably be even more irritated if the show spent thirty seconds showing the character fumbling with the gift paper.

Television has its own set of tropes that the nature of the medium enforces, although whenever you notice a convention for the first time, you’ll also find a show that can’t wait to break it. For decades, sitcoms and procedural dramas tended to hit the reset button at the end of every episode: no matter what happened, you’d find the characters in the same familiar situations and relationships the following week. This was largely a consequence of syndication, which routinely aired episodes out of order, and the rise in serialized storytelling fueled by alternative viewing options has allowed shows of every genre to show characters evolving over time. Similarly, the concept of the character shield originates in the logistics of actors’ contracts: when the lead actors are slated to appear at least through the rest of the season, there’s little suspense over whether Mulder or Scully will survive their latest brush with the paranormal. More recently, however, shows have begun to play with the audience’s expectations on how invulnerable major characters can be. Joss Whedon is notorious for killing off fan favorites, and Game of Thrones has raised the bar for showing us the unexpected deaths of lead characters—and not once but twice.


On the surface, this seems like a positive development, since it discourages audience complacency and forces the viewer to fully commit to the drama of each episode. With occasional exceptions, a show’s lead character is still relatively safe, barring a contract dispute, but when it comes to the supporting cast, we’ve been taught that no one is immune. Yet I’ve begun to feel that this idea has become a cliché in itself, and at its worst, the storytelling it inspires can be equally lazy. One unexpected character death can be shocking; when a show piles them up over and over again, as The Vampire Diaries does, it isn’t long before we start to see diminishing returns. (It doesn’t help that nobody on The Vampire Diaries seems to stay dead forever.) Even on shows that parcel out their casualties more scrupulously, there’s a sense that this trope is becoming exhausted. When an important character was suddenly dispatched at the beginning of the second season of House of Cards, it was shocking in the moment—although I found myself more distracted by the inexplicability of it all—but the show seemed eager to dance away from confronting the consequences. These days, it’s just business as usual.

And the worst thing about the casual killing of characters is that it encourages a sort of all-or-nothing approach to writing stories. Ninety percent of the time, a show goes through the motions, but every few episodes, somebody is shoved in front of a bus—when it might be more interesting, and more difficult, to create tension and suspense while those characters were still alive. Major deaths should be honestly earned, not just a way to keep the audience awake. Of course, it’s easier to shock than to engage, and the sudden death of a character has become television’s equivalent of a jump scare, an effect that can be pulled off the shelf without thinking. I hate to keep coming back to Breaking Bad as a reference point, just because it’s what everyone else does, but I can’t help it. Few viewers had any doubt that Walt, and probably Jesse, would make it to the final episode, so the writers became agonizingly inventive at finding ways of testing them and their loved ones in the meantime, to the point where death itself seemed like a blessing. At this point, I’m no longer surprised or impressed when a character dies, but I’m actively grateful when a show puts us through the wringer in other ways. There’s an enormous spectrum of experience between life and death. And it’s far better to keep these characters alive, if you can make me care about what happens to them next.

Written by nevalalee

March 14, 2014 at 9:49 am
