Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Scott Rudin’

Go set a playwright


If you follow theatrical gossip as avidly as I do, you’re probably aware of the unexpected drama that briefly surrounded the new Broadway adaptation of Harper Lee’s To Kill a Mockingbird, which was written for the stage by Aaron Sorkin. In March, Lee’s estate sued producer Scott Rudin, claiming that the production was in breach of contract for straying drastically from the book. According to the original agreement, the new version wasn’t supposed to “depart in any manner from the spirit of the novel nor alter its characters,” which Sorkin’s interpretation unquestionably did. (Rudin said as much on the record: “I can’t and won’t present a play that feels like it was written in the year the book was written in terms of its racial politics. It wouldn’t be of interest. The world has changed since then.”) But the question isn’t quite as straightforward as it seems. As a lawyer consulted by the New York Times explains:

Does “spirit” have a definite and precise meaning, or could there be a difference of opinion as to what is “the spirit” of the novel? I do not think that a dictionary definition of “spirit” will resolve that question. Similarly, the contract states that the characters should not be altered. In its pre-action letter, Harper Lee’s estate repeatedly states that the characters “would never have” and “would not have” done numerous things; unless as a matter of historical fact the characters would not have done something…who is to say what a creature of fiction “would never have” or “would not have” done?

Now that the suit has been settled and the play is finally on Broadway, this might all seem beside the point, but there’s one aspect of the story that I think deserves further exploration. Earlier this week, Sorkin spoke to Greg Evans of Deadline about his writing process, noting that he took the initial call from Rudin for good reasons: “The last three times Scott called me and said ‘I have something very exciting to talk to you about,’ I ended up writing Social Network, Moneyball, and Steve Jobs, so I was paying attention.” His first pass was a faithful version of the original story, which took him about six months to write: “I had just taken the greatest hits of the book, the most important themes, the most necessary themes. I stood them up and dramatized them. I turned them into dialogue.” When he was finished, he had a fateful meeting with Rudin:

He had two notes. The first was, “We’ve got to get to the trial sooner.” That’s a structural note. The second was the note that changed everything. He said, “Atticus can’t be Atticus for the whole play. He’s got to become Atticus,” and of course, he was right. A protagonist has to change. A protagonist has to be put through something and change as a result, and a protagonist has to have a flaw. And I wondered how Harper Lee had gotten away with having Atticus be Atticus for the whole book, and it’s because Atticus isn’t the protagonist in the book. Scout is. But in the play, Atticus was going to be the protagonist, and I threw out that first draft. I started all over again, but this time the goal wasn’t to be as much like the book as possible. The goal wasn’t to swaddle the book in bubble wrap and then gently transfer it to a stage. I was going to write a new play.

This is fascinating stuff, but it’s worth emphasizing that while Rudin’s first piece of feedback was “a structural note,” the second one was as well. The notions that “a protagonist has to change” and “a protagonist has to have a flaw” are narrative conventions that have evolved over time, and for good reason. Like the idea of building the action around a clear sequence of objectives, they’re basically artificial constructs that have little to do with the accurate representation of life. Some people go for years without changing, and while we’re all flawed in one way or another, our faults aren’t always reflected in dramatic terms in the situations in which we find ourselves. These rules are useful primarily for structuring the audience’s experience, which comes down to the ability to process and remember information delivered over time. (As Kurt Vonnegut, who otherwise might not seem to have much in common with Harper Lee, once said to The Paris Review: “I don’t praise plots as accurate representations of life, but as ways to keep readers reading.”) Yet they aren’t essential, either, as the written and filmed versions of To Kill a Mockingbird make clear. The original novel, in particular, has a rock-solid plot and supporting characters who can change and surprise us in ways that Atticus can’t. Unfortunately, it’s hard for plot alone to carry a play, which is largely a form about character, and Atticus is obviously the star part. Sorkin doesn’t shy away from using the backbone that Lee provides—the play does indeed get to the jury trial, which is still the most reliable dramatic convention ever devised, more quickly than the book does—but he also grasped the need to turn the main character into someone who could give shape to the audience’s experience of watching the play. It was this consideration, and not the politics, that turned out to be crucial.

There are two morals to this story. One is how someone like Sorkin, who can fall into traps of his own as a writer, benefits from feedback from even stronger personalities. The other is how a note on structure, which Sorkin takes seriously, forced him to engage more deeply with the play’s real material. As all writers know, it’s harder than it looks to sequence a story as a series of objectives or to depict a change in the protagonist, but simply by thinking about such fundamental units of narrative, a writer will come up with new insights, not just about the hero, but about everyone else. As Sorkin says of his lead character in an interview with Vulture:

He becomes Atticus Finch by the end of the play, and while he’s going along, he has a kind of running argument with Calpurnia, the housekeeper, which is a much bigger role in the play I just wrote. He is in denial about his neighbors and his friends and the world around him, that it is as racist as it is, that a Maycomb County jury could possibly put Tom Robinson in jail when it’s so obvious what happened here. He becomes an apologist for these people.

In other words, Sorkin’s new perspective on Atticus also required him to rethink the roles of Calpurnia and Tom Robinson, which may turn out to be the most beneficial change of all. (This didn’t sit well with the Harper Lee estate, which protested in its complaint that black characters who “knew their place” wouldn’t behave this way at the time.) As Sorkin says of their lack of agency in the original novel: “It’s noticeable, it’s wrong, and it’s also a wasted opportunity.” That’s exactly right—and I like the last reason the best. In theater, as in any other form of narrative, the technical considerations of storytelling are more important than doing the right thing. But to any experienced writer, it’s also clear that they’re usually one and the same.

Written by nevalalee

December 14, 2018 at 8:39 am

Famous monsters of filmland


For his new book The Big Picture: The Fight for the Future of the Movies, the journalist Ben Fritz reviewed every email from the hack of Sony Pictures, all of which are still available online. Whatever you might think about the ethics of using such material, it’s a gold mine of information about how Hollywood has done business over the last decade, and Fritz has come up with some fascinating nuggets. One of the most memorable finds is an exchange between studio head Amy Pascal and the producer Scott Rudin, who was trying to convince her to take a chance on Danny Boyle’s adaptation of Steve Jobs. Pascal had expressed doubts about the project, particularly over the casting of Michael Fassbender in the lead, and after arguing that it was less risky than The Social Network, Rudin delivered a remarkable pep talk:

You ought to be doing this movie—period—and you and I both know that the cold feet you are feeling is costing you this movie that you want. Once you have cold feet, you’re done. You’re making this decision in the anticipation of how you will be looked at in failure. That’s how you fail. So you’re feeling wobbly in the job right now. Here’s the fact: nothing conventional you could do is going to change that, and there is no life-changing hit that is going to fall into your lap that is not a nervous decision, because the big obvious movies are going to go elsewhere and you don’t have the IP right now to create them from standard material. You have this. Face it…Force yourself to muster some confidence about it and do the exact thing right now for which your career will be known in movie history: be the person who makes the tough decisions and sticks with them and makes the unlikely things succeed. Fall on your sword—when you’ve lost that, it’s finished. You’re the person who does these movies. That’s—for better or worse—who you are and who you will remain. To lose that is to lose yourself.

Steve Jobs turned out to be a financial disappointment, and its failure—despite the prestige of its subject, director, and cast—feels emblematic of the move away from films driven by stars to those that depend on “intellectual property” of the kind that Sony lacked. In particular, the movie industry seems to have shifted to a model perfected by Marvel Studios, which builds a cinematic universe that can drum up excitement for future installments and generate huge grosses overseas. Yet this isn’t exactly new. In the groundbreaking book The Genius of the System, which was published three decades ago, Thomas Schatz notes that Universal did much the same in the thirties, when it pioneered the genre of cinematic horror under founder Carl Laemmle and his son:

The horror picture scarcely emerged full-blown from the Universal machinery, however. In fact, the studio had been cultivating the genre for years, precisely because it played to Universal’s strengths and maximized its resources…Over the years Carl Laemmle built a strong international distribution system, particularly in Europe…[European filmmakers] brought a fascination for the cinema’s distinctly unrealistic qualities, its capacity to depict a surreal landscape of darkness, nightmare logic, and death. This style sold well in Europe.

After noting that the aesthetic of horror lent itself to movies built out of little more than shadows and fog, which were the visual effects of its time, Schatz continues: “This rather odd form of narrative economy was vitally important to a studio with limited financial resources and no top stars to carry its pictures. And in casting, too, the studio turned a limitation into an asset, since the horror film did not require romantic leads or name stars.”

The turning point was Tod Browning’s Dracula, a movie “based on a presold property” that could serve as an entry point for other films along the same lines. It didn’t require a star, but “an offbeat character actor,” and Universal’s expectations for it eerily foreshadow the way in which studio executives still talk today. Schatz writes:

Laemmle was sure it would [succeed]—so sure, in fact, that he closed the Frankenstein deal several weeks before Dracula’s February 1931 release. The Lugosi picture promptly took off at the box office, and Laemmle was more convinced than ever that the horror film was an ideal formula for Universal, given its resources and the prevailing market conditions. He was convinced, too, that he had made the right decision with Frankenstein, which had little presold appeal but now had the success of Dracula to generate audience anticipation.

Frankenstein, in short, was sort of like the Ant-Man of the thirties, a niche property that leveraged the success of its predecessors into something like real excitement. It worked, and Universal’s approach to its monsters anticipates what Marvel would later do on a vaster scale, with “ambitious crossover events” like House of Frankenstein and House of Dracula that combined the studio’s big franchises with lesser names that seemed unable to carry a film on their own. (If Universal’s more recent attempt to do the same with The Mummy fell flat, it was partially because it was unable to distinguish between the horror genre, the star picture, and the comic book movie, resulting in a film that turned out to be none of the above. The real equivalent today would be Blumhouse Productions, which has done a much better job of building its brand—and which distributes its movies through Universal.)

And the inability of such movies to provide narrative closure isn’t a new development, either. After seeing James Whale’s Frankenstein, Carl Laemmle, Jr. reacted in much the same way that executives presumably do now:

Junior Laemmle was equally pleased with Whale’s work, but after seeing the rough cut he was certain that the end of the picture needed to be changed. His concerns were twofold. The finale, in which both Frankenstein and his monster are killed, seemed vaguely dissatisfying; Laemmle suspected that audiences might want a glimmer of hope or redemption. He also had a more pragmatic concern about killing off the characters—and thus any possibility of sequels. Laemmle now regretted letting Professor Van Helsing drive that stake through Count Dracula’s heart, since it consigned the original character to the grave…Laemmle was not about to make the same mistake by letting that angry mob do away with the mad doctor and his monster.

Whale disagreed, but he was persuaded to change the ending after a preview screening, leaving open the possibility that the monster might have survived. Over eight decades later, Joss Whedon offered a similar explanation in an interview with Mental Floss: “It’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant…My feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.” For now, we’re living in a world made by the Universal monsters—and with only a handful of viable properties, half of which are owned by Disney. Without that kind of intellectual property, it might seem impossible, as Rudin said, to create hits “from standard material.” But we’re also still waiting to be blindsided by the next great franchise. As another famous monster once put it: “A lot of times, people don’t know what they want until you show it to them.” And when it came to the movies, at least, Steve Jobs was right.

The hero paradox


[Image: Jennifer Lawrence in The Hunger Games: Catching Fire]

Every year, the Academy Awards telecast makes us sit through a bunch of pointless montages, and every year, we get to complain about it. As I mentioned last week, I’ve long since gotten over most of the weird choices made by the Oscars—I like to remind myself that the ceremony isn’t designed for the television audience, but for the movers and shakers sitting in the auditorium itself—and I’ve resigned myself to the prospect of a few pointless production numbers. But the montages always seem particularly strange. They don’t add much in the way of entertainment value, and the opportunity cost for what is already an overlong show is unforgivably high: one fewer montage, and we might have had room for Dennis Farina in the In Memoriam reel, not to mention the canceled appearance by Batkid. This year’s ceremony, with its “salute to heroes” theme, resulted in an even more random assortment of clips than usual: here’s Gandhi, and Lawrence of Arabia, and just as we start to think there’s a pattern emerging, here’s Sidney Poitier as Mr. Tibbs. (I actually had to look up In the Heat of the Night to reassure myself that it hadn’t been based on a true story.)

The result was inexplicable enough that it inspired Todd VanDerWerff of The A.V. Club to tweet: “Next year: A tribute to protagonists!” But it also raises the larger question of what a hero really is, at least in terms of what we look for in storytelling. From a producer’s point of view, the answer is simple: a hero is the actor with the greatest amount of screen time, or whose face takes up the most room on the poster. (Or as the producer Scott Rudin once said when asked what a movie was about: “It’s about two movie stars.”) A writer might put it somewhat differently. The protagonist of a movie is the character whose actions and decisions drive the plot, and if he or she happens to embody qualities that we associate with heroism—courage, integrity, selflessness, resourcefulness—it’s because these attributes lend themselves both to wishful identification from the audience and to interesting choices and behavior within the confines of the story. All things being equal, a brave, committed individual will end up doing things on camera that we’ll hopefully want to watch. It has nothing to do with morality; it’s a logistical choice that results in more entertaining narratives. Or at least it should be.

[Image: Jennifer Lawrence in The Hunger Games]

The trouble, of course, is that when you’re not sure about your own story, you tend to fixate more on what the hero is than on the more crucial matter of what he does. Screenwriters are always told to make their leading characters more heroic and likable, as if this were something that could be separated from the narrative itself. At worst, the movie simply serves up a chosen one, either explicitly or implicitly, which is often an excuse to give us a protagonist who is interesting and important just because we’re told he is. Sometimes, this problem can be a subtle one. Watching The Hunger Games: Catching Fire for the first time over the weekend, I felt that even though Jennifer Lawrence sells the hell out of the part, Katniss Everdeen herself is something of a wet blanket. This isn’t anyone’s fault: Katniss as written is almost unplayable, since she needs to be admirable enough to inspire a revolution and carry a franchise, vulnerable enough to serve as one corner of a love triangle, and a resourceful warrior who also hates the idea of killing. That’s a lot for any one character to shoulder, and it means that poor Katniss herself is often the least interesting person on the screen.

In general, though, it’s hard for a hero to come to life in the way a more incidental character can, simply because he’s under so much pressure to advance the plot. The great character actor Stephen Tobolowsky hinted at this last week on Reddit:

The difference between character actors and the leading men is that everything the leading men do is on film. Character actors have to invent that life off screen and bring that reality on screen. It’s much more imaginative work and the hours are better.

That’s why we often find ourselves wishing that we could spend more time with the supporting cast of a television show: they’re so much more full of life and vitality than the lead, whose every action is designed to carry forward a huge, creaking machine. Being a hero is a thankless role, both in fiction and in real life, and it inevitably leads to a loss of freedom, when in theory the hero should be more free than anyone else. As Harold Bloom observes of Hamlet, he could be anything in the world, but he’s doomed to play out the role in which he has been cast. Finding a way to balance a hero’s narrative burden with the freedom he needs to come alive in the imagination is one of a writer’s greatest challenges. And if the movies succeeded at this more often, those montages at the Oscars would have made a lot more sense.
