Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Star Wars: The Force Awakens’

The tentpole test


Rogue One: A Star Wars Story

How do you release blockbusters like clockwork and still make each one seem special? It’s an issue that the movie industry is anxious to solve, and there’s a lot riding on the outcome. When I saw The Phantom Menace nearly two decades ago, there was an electric sense of excitement in the theater: we were pinching ourselves over the fact that we were about to see the opening crawl for a new Star Wars movie on the big screen. That air of expectancy diminished for the two prequels that followed, and not only because they weren’t very good. There’s a big difference, after all, between the accumulated anticipation of sixteen years and that of a release schedule in which the installments are only a few years apart. The decade that elapsed between Revenge of the Sith and The Force Awakens was enough to ramp it up again, as if fan excitement were a battery that recovers some of its charge after it’s allowed to rest for a while. In the past, when we’ve watched a new chapter in a beloved franchise, our experience hasn’t just been shaped by the movie itself, but by the sudden release of energy that has been bottled up for so long. That kind of prolonged wait can prevent us from honestly evaluating the result—I wasn’t the only one who initially thought that The Phantom Menace had lived up to my expectations—but that isn’t necessarily a mistake. A tentpole picture is named for the support that it offers to the rest of the studio, but it also plays a central role in the lives of fans, which have been going on since long before the film starts and will continue after it ends. As Robert Frost once wrote about a different tent, it’s “loosely bound / By countless silken ties of love and thought / To every thing on earth the compass round.”

When you have too many tentpoles coming out in rapid succession, however, the outcome—if I can switch metaphors yet again—is a kind of wave interference that can lead to a weakening of the overall system. On Christmas Eve, I went to see Rogue One, which was preceded by what felt like a dozen trailers. One was for Spider-Man: Homecoming, which left me with a perplexing feeling of indifference. I’m not the only one to observe that the constant onslaught of Marvel movies makes each installment feel less interesting, but in the case of Spider-Man, we actually have a baseline for comparison. Two baselines, really. I can’t defend every moment of the three Sam Raimi films, but there’s no question that each of those movies felt like an event. There was even enough residual excitement lingering after the franchise was rebooted to make me see The Amazing Spider-Man in the theater, and even its sequel felt, for better or worse, like a major movie. (I wonder sometimes if audiences can sense the pressure when a studio has a lot riding on a particular film: even a mediocre movie can seem significant if a company has tethered all its hopes to it.) Spider-Man: Homecoming, by contrast, feels like just one more component in the Marvel machine, and not even a particularly significant one. It has the effect of diminishing a superhero who ought to be at the heart of any universe in which he appears, relegating one of the two or three most successful comic book characters of all time to a supporting role in a larger universe. And because we still remember how central he was to no fewer than two previous franchises, it feels like a demotion, as if Spider-Man were an employee who left the company, came back, and now reports to Iron Man.

Spider-Man in Captain America: Civil War

It isn’t that I’m all that emotionally invested in the future of Spider-Man, but it’s a useful case study for what it tells us about the pitfalls of these films, which can take something that once felt like a milestone and reduce it to a midseason episode of an ongoing television series. What’s funny, of course, is that the attitude we’re now being asked to take toward these movies is actually closer to the way in which they were originally conceived. The word “episode” is right there in the title of every Star Wars movie, which George Lucas saw as an homage to classic serials, with one installment following another on a weekly basis. Superhero films, obviously, are based on comic books, which are cranked out by the month. The fact that audiences once had to wait for years between movies may turn out to have been a historical artifact caused by technological limitations and corporate inertia. Maybe the logical way to view these films is, in fact, in semiannual installments, as younger viewers are no doubt growing up to expect. In years to come, the extended gaps between these movies in prior decades will seem like a structural quirk, rather than an inherent feature of how we relate to them. This transition may not be as meaningful as, say, the shift from silent films to the talkies, but it implies a similar change in the way we relate to the film onscreen. Blockbusters used to be released with years of anticipation baked into the response from moviegoers, which is no longer something that can be taken for granted. It’s a loss, in its way, to fan culture, which had to learn how to sustain itself during the dry periods between films, but it also implies that the movies themselves face a new set of challenges.

To be fair, Disney, which controls both the Marvel and Star Wars franchises, has clearly thought a lot about this problem, and they’ve hit on approaches that seem to work pretty well. With the Marvel Universe, this means pitching most of the films at a level at which they’re just good enough, but no more, while investing real energy every few years into a movie that is first among equals. This leads to a lot of fairly mediocre installments, but also to the occasional Captain America: Civil War, which I think is the best Marvel movie yet—it pulls off the impossible task of updating us on a dozen important characters while also creating real emotional stakes in the process, which is even more difficult than it looks. Rogue One, which I also liked a lot, takes a slightly different tack. For most of the first half, I was skeptical of how heavily it was leaning on its predecessors, but by the end, I was on board, and for exactly the same reason. This is a movie that depends on our knowledge of the prior films for its full impact, but it does so with intelligence and ingenuity, and there’s a real satisfaction in how neatly it aligns with and enhances the original Star Wars, while also having the consideration to close itself off at the end. (A lot of the credit for this may be due to Tony Gilroy, the screenwriter and unbilled co-director, who pulled off much the same feat when he structured much of The Bourne Ultimatum to take place during gaps in The Bourne Supremacy.) Relying on nostalgia is a clever way to compensate for the reduced buildup between movies, as if Rogue One were drawing on the goodwill that Star Wars built up and that hasn’t yet dissipated, like a flywheel that serves as an uninterruptible power supply. Star Wars isn’t just a tentpole, but a source of energy. And it might just be powerful enough to keep the whole machine running forever.

Strange currencies


Doctor Strange

Last Monday, I took a break. I don’t normally take much time off, but I wanted to tune out of election coverage on what I feared, implausibly but correctly, might be the last entirely happy day I’d have for the next four years. My book project was in good shape, I’d finished a decent draft of a short story, and I had nothing else pressing to hold my attention. So I lit out. I treated myself to a Lyft ride into Chicago, where I dropped into two of my favorite used bookstores—Bookman’s Corner and Booklegger’s—and spent about twenty bucks. Then I took a train to the River East Theater on Illinois Street, where I met up with my wife to catch Doctor Strange, which was the first movie we’d seen together on the big screen since The Force Awakens. Afterward, we headed home just in time to put our daughter, who had spent the day with her grandparents, to bed. And if I lay out the context in such detail, it’s because I have a feeling that this is how most people in this country go to the movies. After a young adulthood in which I turned up at my local cineplex or art house theater at least once a week to see whatever blockbuster or critical darling was currently in the reviews, along with countless revivals, I’ve settled down into a routine in which I’m more likely to see two or three movies each year with my daughter and a couple of others for myself. This places me squarely in the mainstream of most moviegoers: according to a recent survey, the average American sees five movies a year, and I seem likely to hit that number exactly.

Which is both remarkable and kind of unsurprising. Hollywood releases about six hundred movies every year, a significant percentage of which are trying to appeal to as many demographic quadrants as possible. Yet even The Force Awakens, which sold over a hundred million tickets domestically, was seen by something less than a third of all Americans, even before you take multiple viewings into account. To convince the average adult to go to the movies five times in a single calendar year, you need a wide range of product, only a fraction of which is likely to entice any given individual to buy a ticket. Inevitably, however, the people who write professionally about the movies from both the artistic and business angles are inclined to try to make sense of the slate as a whole. Film critics may review two or three movies every week and go to even more—and they have to see everything, not just what appeals to their own tastes. As I learned during my own stint as a working critic, it’s a situation that has a way of altering your expectations: you realize how many movies are simply mediocre and forgettable, and you start to relish anything out of the ordinary, however misguided it might be. Needless to say, this isn’t how your typical moviegoer sees it. Someone who watches a hundred and fifty movies every year for free might as well belong to a different species from someone who pays to see fewer than five, but they have no choice but to try to understand each other, at least if we’re going to take criticism seriously from either side.

Tilda Swinton and Benedict Cumberbatch in Doctor Strange

So what does this have to do with Doctor Strange? Quite a lot, I think. I had originally hoped to write about it here last week, before the election made it hard to think about anything else, and there was a time when I wasn’t even sure whether I’d devote a post to it at all. Yet I’ve become intrigued precisely by the way it has faded in my imagination. In the moment, I liked it a lot. It stars five actors whom I’m happy to see in anything, and it actually gives two or three of them something interesting to do. When I broke it down in my head, its scenes fell into three categories. About a third were watchable in the usual Marvel way, which takes pride in being pretty good, but not great; another third achieved something like high camp; and the last third were genuinely visionary, with some of the most striking visual effects I’ve ever seen. There are scenes in Doctor Strange that get as close as a movie possibly can to the look and feel of a dream, with elaborate geometric patterns and cityscapes that break down and reform themselves before our eyes. It left me wondering how they did it. But it didn’t stick in my head in the way that Inception, its obvious inspiration, still does. In part, it’s because it uses digital rather than practical effects: an homage to Joseph Gordon-Levitt’s famous hallway fight scene only reminds us of how much more effective—and respectful of gravity—it was to stage it in camera. And even the most amazing sequences are chases or showdowns that amount to interchangeable components. The story halts for them, and they could be inserted at any point into any version of the script.

As a result, it left me with a highlight reel of memories that is basically identical to the trailer. But a movie that was wholly as weird and as distinctive as the best scenes in Doctor Strange would never have made it into theaters. It would be fundamentally out of keeping with the basic premise of the Marvel Universe, which is that no one movie can stick out from the rest, and nothing can occur that is so meaningful that it interferes with the smooth production of films being shot simultaneously by other directors. The story, ideally, should be about as little as possible, while still creating the illusion that the stakes are infinite—which leads inexorably to diminishing returns. (When you read the early space opera stories of writers like John W. Campbell, you realize that once the heroes can casually span entire galaxies, it means that nothing matters whatsoever. And the same thing happens in the Marvel films.) Doctor Strange works because it keeps its weirdness hermetically sealed off from the rest: as long as we’re watching those scenes, we’re transported into a freakier, more exhilarating film, only to be returned to the safe beats of the formula as quickly and antiseptically as possible. There’s nothing wrong with the screenplay, except to the extent that there’s something wrong with every script written according to the usual specifications. The result has flashes of something extraordinary, but it’s scaled back for the audience members who see only five movies a year. It’s big and distinctive enough to assure you that you’ve gotten your money’s worth, but not so unusual that it makes you question what you bought with it. It’s Benedict Cumberbatch with an American accent. And it’s exactly as good as that sounds.

Written by nevalalee

November 15, 2016 at 8:27 am


Choose life


Inside Out

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What show did you stop watching after a character was killed off?”

Inside Out is an extraordinary film on many levels, but what I appreciated about it the most was the reminder it provides of how to tell compelling stories on the smallest possible scale. The entire movie turns on nothing more—or less—than a twelve-year-old girl’s happiness. Riley is never in real physical danger; it’s all about how she feels. These stakes might seem relatively low, but as I watched it, I felt that the stakes were infinite, and not just because Riley reminded me so much of my own daughter. By the last scene, I was wrung out with emotion. And I think it stands as the strongest possible rebuke to the idea, so prevalent at the major studios, that mainstream audiences will only be moved or excited by stories in which the fate of the entire world hangs in the balance. As I’ve noted here before, “Raise the stakes” is probably the note that writers in Hollywood get the most frequently, right up there with “Make the hero more likable,” and its overuse has destroyed their ability to make such stories meaningful. When every superhero movie revolves around the fate of the entire planet, the death of six billion people can start to seem trivial. (The Star Trek reboot went there first, but even The Force Awakens falls into that trap: it kills off everyone in the Hosnian system for the sake of a throwaway plot point, and it moves on so quickly that it casts a pall over everything that follows.)

The more I think about this mindless emphasis on raising the stakes, the more it strikes me as a version of a phenomenon I’ve discussed a lot on this blog recently, in which big corporations tasked with making creative choices end up focusing on quantifiable but irrelevant metrics, at the expense of qualitative thinking about what users or audiences really need. For Apple, those proxy metrics are thinness and weight; for longform journalism, it’s length. And while “raising the stakes” isn’t quite as quantitative, it sort of feels that way, and it has the advantage of being the kind of rule that any midlevel studio employee can apply with minimal fear of being wrong. (It’s only when you aggregate all those decisions across the entire industry that you end up with movies that raise the stakes so high that they turn into weightless abstractions.) Saying that a script needs higher stakes is the equivalent of saying that a phone needs to be thinner: it’s a way to involve the maximum number of executives in the creative process who have no business being there in the first place. But that’s how corporations work. And the fact that Pixar has managed to avoid that trap, if not always, then at least consistently enough for the result to be more than accidental, is the most impressive thing about its legacy.

Kiefer Sutherland in 24

A television series, unlike a studio franchise, can’t blow up the world on a regular basis, but it can do much the same thing to its primary actors, who are the core building blocks of the show’s universe. As a result, the unmotivated killing of a main character has become television’s favorite way of raising the stakes—although by now, it feels just as lazy. As far as I can recall, I’ve never stopped watching a show solely because it killed off a character I liked, but I’ve often given up on a series, as I did with 24 and Game of Thrones and even The Vampire Diaries, when it became increasingly clear that it was incapable of doing anything else. Multiple shock killings emerge from a mindset that is no longer able to think itself into the lives of its characters: if you aren’t feeling your own story, you have no choice but to fall back on strategies for goosing the audience that seem to work on paper. But almost without exception, the seasons that followed would have been more interesting if those characters had been allowed to survive and develop in honest ways. Every removal of a productive cast member means a reduction of the stories that can be told, and the temporary increase in interest it generates doesn’t come close to compensating for that loss. A show that kills characters with abandon is squandering narrative capital and mortgaging its own future, so it’s no surprise if it eventually goes bankrupt.

A while back, Bryan Fuller told Entertainment Weekly that he had made an informal pledge to shun sexual violence on Hannibal, and when you replace “rape” with “murder,” you get a compelling case for avoiding gratuitous character deaths as well:   

There are frequent examples of exploiting rape as low-hanging fruit to have a canvas of upset for the audience…“A character gets raped” is a very easy story to pitch for a drama. And it comes with a stable of tropes that are infrequently elevated dramatically, or emotionally. I find that it’s not necessarily thought through in the more common crime procedurals. You’re reduced to using shorthand, and I don’t think there can be a shorthand for that violation…And it’s frequently so thinly explored because you don’t have the real estate in forty-two minutes to dig deep into what it is to be a victim of rape…All of the structural elements of how we tell stories on crime procedurals narrow the bandwidth for the efficacy of exploring what it is to go through that experience.

And I’d love to see more shows make a similar commitment to preserving their primary cast members. I’m not talking about character shields, but about finding ways of increasing the tension without taking the easy way out, as Breaking Bad did so well for so long. Death closes the door on storytelling, and the best shows are the ones that seem eager to keep that door open for as long as possible.

The time factor


Concept art for Toy Story 3

Earlier this week, my daughter saw Toy Story for the first time. Not surprisingly, she loved it—she’s asked to watch it three more times in two days—and we’ve already moved on to Toy Story 2. Seeing the two movies back to back, I was struck most of all by the contrast between them. The first installment, as lovely as it is, comes off as a sketch of things to come: the supporting cast of toys gets maybe ten minutes total of screen time, and the script still has vestiges of the villainous version of Woody who appeared in the earlier drafts. It’s a relatively limited film, compared to the sequels. Yet if you were to watch it today without any knowledge of the glories that followed, you’d come away with a sense that Pixar had done everything imaginable with the idea of toys who come to life. The original Toy Story feels like an exhaustive list of scenes and situations that emerge organically from its premise, as smartly developed by Joss Whedon and his fellow screenwriters, and in classic Pixar fashion, it exploits that core gimmick for all it’s worth. Like Finding Nemo, it amounts to an anthology of all the jokes and set pieces that its setting implies: you can practically hear the writers pitching out ideas. And taken on its own, it seems like it does everything it possibly can with that fantastic concept.

Except, of course, it doesn’t, as two incredible sequels and a series of shorts would demonstrate. Toy Story 2 may be the best example I know of a movie that takes what made its predecessor special and elevates it to a level of storytelling that you never imagined could exist. And it does this, crucially, by introducing a new element: time. If Toy Story is about toys and children, Toy Story 2 and its successor are about what happens when those kids become adults. It’s a complication that was inherent to its premise from the beginning, but the first movie wasn’t equipped to explore it—we had to get to know and care about these characters before we could worry about what would happen after Andy grew up. It’s a part of the story that had to be told, if its assumptions were to be treated honestly, and it shows that the original movie, which seemed so complete in itself, only gave us a fraction of the full picture. Toy Story 3 is an astonishing achievement on its own terms, but there’s a sense in which it only extends and trades on the previous film’s moment of insight, which turned it into a franchise of almost painful emotional resonance. If comedy is tragedy plus time, the Toy Story series knows that when you add time to comedy, you end up with something startlingly close to tragedy again.

Robert De Niro in The Godfather Part II

And thinking about the passage of time is an indispensable trick for creators of series fiction, or for those looking to expand a story’s premise beyond the obvious. Writers of all kinds tend to think in terms of unity of time and place, which means that time itself isn’t a factor in most stories: the action is confined within a safe, manageable scope. Adding more time in either direction has a way of exploding a story’s assumptions, or of exposing fissures that lead to promising conflicts. If The Godfather Part II is more powerful and complex than its predecessor, it’s largely because of its double timeline, which naturally introduces elements of irony and regret that weren’t present in the first movie: the outside world seems to break into the hermetically sealed existence of the Corleones just as the movie itself breaks out of its linear chronology. And the abrupt time jump, which television series from Fargo to Parks and Recreation have cleverly employed, is such a useful way of advancing a story and upending the status quo that it’s become a cliché in itself. Even if you don’t plan on writing more than one story or incorporating the passage of time explicitly into the plot, asking yourself how the characters would change after five or ten years allows you to see whether the story depends on a static, unchanging timeframe. And those insights can only be good for the work.

This also applies to series in which time itself has become a factor for reasons outside anyone’s control. The Force Awakens gains much of its emotional impact from our recognition, even if it’s unconscious, that Mark Hamill is older now than Alec Guinness was in the original, and the fact that decades have gone by both within the story’s universe and in our own world only increases its power. The Star Trek series became nothing less than a meditation on the aging of its own cast. And this goes a long way toward explaining why Toy Story 3 was able to close the narrative circle so beautifully: eleven years had passed since the last movie, and both Andy and his voice actor had grown to adulthood, as had so many of the original film’s fans. (It’s also worth noting that the time element seems to have all but disappeared from the current incarnation of the Toy Story franchise: Bonnie, who owns the toys now, is in no danger of growing up soon, and even if she does, it would feel as if the films were repeating themselves. I’m still optimistic about Toy Story 4, but it seems unlikely to have the same resonance as its predecessors—the time factor has already been fully exploited. Of course, I’d also be glad to be proven wrong.) For a meaningful story, time isn’t a liability, but an asset. And it can lead to discoveries that you didn’t know were possible, but only if you’re willing to play with it.

You are here


Adam Driver in Star Wars: The Force Awakens

Remember when you were watching Star Wars: The Force Awakens and Adam Driver took off his mask, and you thought you were looking at some kind of advanced alien? You don’t? That’s strange, because it says you did, right here in Anthony Lane’s review in The New Yorker:

So well is Driver cast against type here that evil may turn out to be his type, and so extraordinary are his features, long and quiveringly gaunt, that even when he removes his headpiece you still believe that you’re gazing at some form of advanced alien.

I’m picking on Lane a little here, because the use of the second person is so common in movie reviews and other types of criticism—including this blog—that we hardly notice it, any more than we notice the “we” in this very sentence. Film criticism, like any form of writing, evolves its own language, and using that insinuating “you,” as if your impressions had melded seamlessly with the critic’s, is one of its favorite conventions. (For instance, in Manohla Dargis’s New York Times review of the same film, she says: “It also has appealingly imperfect men and women whose blunders and victories, decency and goofiness remind you that a pop mythology like Star Wars needs more than old gods to sustain it.”) But who is this “you,” exactly? And why has it started to irk me so much?

The second person has been used by critics for a long time, but in its current form, it almost certainly goes back to Pauline Kael, who employed it in the service of images or insights that could have occurred to no other brain on the planet, as when she wrote of Madeline Kahn in Young Frankenstein: “When you look at her, you see a water bed at just the right temperature.” This tic of Kael’s has been noted and derided for almost four decades, going back to Renata Adler’s memorable takedown in the early eighties, in which she called it “the intrusive ‘you’” and noted shrewdly: “But ‘you’ is most often Ms. Kael’s ‘I,’ or a member or prospective member of her ‘we.’” Adam Gopnik later said: “It wasn’t her making all those judgments. It was the Pop Audience there beside her.” And “the second-person address” clearly bugged Louis Menand, too, although his dislike of it was somewhat undermined by the fact that he internalized it so completely:

James Agee, in his brief service as movie critic of The Nation, reviewed many nondescript and now long-forgotten pictures; but as soon as you finish reading one of his pieces, you want to read it again, just to see how he did it…You know what you think about Bonnie and Clyde by now, though, and so [Kael’s] insights have lost their freshness. On the other hand, she is a large part of the reason you think as you do.

Pauline Kael

Kael’s style was so influential—I hear echoes of it in almost everything I write—that it’s no surprise that her intrusive “you” has been unconsciously absorbed by the generations of film critics that followed. If it bothers you as it does me, you can quietly replace it throughout with “I” without losing much in the way of meaning. But that’s part of the problem. The “you” of film criticism conceals a neurotic distrust of the first person that prevents critics from honoring their opinions as their own. Kael said that she used “you” because she didn’t like “one,” which is fair enough, but there’s also nothing wrong with “I,” which she wasn’t shy about using elsewhere. To a large extent, Kael was forging her own language, and I’m willing to forgive that “you,” along with so much else, because of the oceanic force of the sensibilities to which it was attached. But separating the second person from Kael’s unique voice and turning it into a crutch to be indiscriminately employed by critics everywhere yields a more troubling result. It becomes a tactic that distances the writer slightly from his or her own judgments, creating an impression of objectivity and paradoxical intimacy that has no business in a serious review. Frame these observations in “I,” and the critic would feel more of an obligation to own them and make sense of them; stick them in a convenient “you,” and they’re just one more insight to be tossed off, as if the critic happened to observe it unfolding in your brain and can record it here without comment.

Obviously, there’s nothing wrong with wanting to avoid the first person in certain kinds of writing. It rarely has a place in serious reportage, for instance, despite the efforts of countless aspiring gonzo journalists who try to do what Norman Mailer, Hunter S. Thompson, and only a handful of others have ever done well. (It can even plague otherwise gifted writers: I was looking forward to Ben Lerner’s recent New Yorker piece about art conservation, but I couldn’t get past his insistent use of the first person.) But that “I” absolutely belongs in criticism, which is fundamentally a record of a specific viewer, listener, or reader’s impressions of his or her encounter with a piece of art. All great critics, whether they use that “you” or not, are aware of this, and it can be painful to read a review by an inexperienced writer that labors hard to seem “objective.” But if our best critics so often fall into the “you” trap, it’s a sign that even they aren’t entirely comfortable with giving us all of themselves, and I’ve started to see it as a tiny betrayal—meaningful or not—of what ought to be the critic’s intensely personal engagement with the work. And if it’s only a tic or a trick, then we sacrifice nothing by losing it. Replace that “you” with “I” throughout, making whatever other adjustments seem necessary, and the result is heightened and clarified, with a much better sense of who was really sitting there in the dark, feeling emotions that no other human being would ever feel in quite the same way.

The Jedi mind trick


BB-8 in Star Wars: The Force Awakens

Difficult to see. Always in motion is the future.

—Yoda, The Empire Strikes Back

At some point over the next few hours, perhaps as you’re reading this post, The Force Awakens is projected to surge past Avatar to become the highest-grossing movie in the history of the North American box office. We usually don’t adjust such figures for inflation, of course, probably because there wouldn’t be as many records broken each year if we did, and it’s all but certain that the original Star Wars will remain tops in the franchise in terms of tickets sold. Yet it’s impossible to discount this achievement. If the latest installment continues on its present trajectory, it has a good chance of cracking the adjusted top ten of all time—it would need to gross somewhere north of $948 million domestic to exceed Snow White and the Seven Dwarfs and earn a spot on that rarefied list, and this is starting to feel like a genuine possibility. Given the changes in the entertainment landscape over the last century, this is beyond flabbergasting. But even this doesn’t get at the real, singular nature of what we’re witnessing today. The most unexpected thing about the success of The Force Awakens is how expected it was. And at a time when Hollywood is moving increasingly toward a tentpole model in which a handful of blockbusters finance all the rest, it represents both a historic high point for the industry and an accomplishment that we’re unlikely to ever see again.

When you look at the lineal succession of the most successful films at the domestic box office, you have to go back more than seventy-five years to find a title that even the shrewdest industry insider could have reasonably foreseen. This list, unadjusted for inflation, consists of Gone With the Wind, The Sound of Music, The Godfather, Jaws, Star Wars, E.T., Titanic, and Avatar. Gone With the Wind, which claimed the title that The Birth of a Nation had won a quarter of a century earlier, is the one exception: there’s no doubt that David O. Selznick hoped that it could be the biggest film of its era, even before the first match had been struck for the burning of Atlanta. Every other movie here is a headscratcher. No studio insider at the time would have been willing to bet that The Sound of Music—which Pauline Kael later called The Sound of Money—would outgross not just Doctor Zhivago and Thunderball that year, but every other movie ever made. The Godfather and Jaws were both based on bestselling novels, but that’s hardly a guarantee of success, and both were troubled productions with untested directors at the helm. Star Wars itself hardly needs to be discussed here. Columbia famously passed on E.T., and Titanic was widely regarded before its release as a looming disaster. And even Avatar, which everyone thought would be huge, exceeded all expectations: when you take regression to the mean into account, the idea that James Cameron could break his own record is so implausible that I have a hard time believing it even now.

Avatar

Which is just another way of saying that these movies were all outliers: unique, idiosyncratic projects, not part of any existing franchise, that audiences discovered gradually, often to the bewilderment of the studios themselves. The Force Awakens was different. It had barely been announced before pundits were speculating that it could set the domestic record, and although Disney spent much of the buildup to its opening weekend downplaying such forecasts—with the implication that rival studios were inflating projections to make its final performance seem disappointing—it’s hard to believe that the possibility hadn’t crossed everybody’s mind. Most movie fans will remember that William Goldman said “Nobody knows anything” in Adventures in the Screen Trade, but it’s worth quoting the relevant paragraph in full. After noting that everyone in town except for Paramount turned down Raiders of the Lost Ark, he continues:

Why did Paramount say yes? Because nobody knows anything. And why did all the other studios say no? Because nobody knows anything. And why did Universal, the mightiest studio of all, pass on Star Wars, a decision that may just cost them, when all the sequels and spinoffs and toy money and book money and video-game money are totaled, over a billion dollars? Because nobody, nobody—not now, not ever—knows the least goddam thing about what is or isn’t going to work at the box office.

If Hollywood has learned anything since, it’s that you don’t pass on Star Wars. Whatever you might think of its merits as a movie, The Force Awakens marks the one and only time that somebody knew something. And it’s probably the last time, too. It may turn into the reassuring bedtime story that studio executives use to lull themselves to sleep, and Disney may plan on releasing a new installment on an annual basis forever, but the triumphant rebirth of the franchise after ten years of dormancy—or three decades, depending on how you feel about the prequels—is the kind of epochal moment that the industry is doing its best to ensure never happens again. We aren’t going to have another chance to miss Star Wars because it isn’t going to go away, and the excitement that arose around its return can’t be repeated. The Force Awakens is both the ultimate vindication of the blockbuster model and a high-water mark that will make everything that follows seem like diminishing returns. (More insidiously, it may be the Jedi mind trick that convinces the studios that they know more than they do, which can only lead to heartbreak.) Records are made to be broken, and at some point in my lifetime, another movie will take the crown, if only because inflation will proceed to a point where the mathematics become inevitable. But it won’t be a Star Wars sequel. And it won’t be a movie that anyone, not even a Jedi, can see coming.

Written by nevalalee

January 4, 2016 at 8:13 am
