Posts Tagged ‘Vertigo’
Out of the past
You shouldn’t have been that sentimental.
About halfway through the beautiful, devastating finale of Twin Peaks—which I’ll be discussing here in detail—I began to reflect on what the figure of Dale Cooper really means. When we encounter him for the first time in the pilot, with his black suit, fastidious habits, and clipped diction, he’s the embodiment of what we’ve been taught to expect of a special agent of the Federal Bureau of Investigation. The FBI occupies a role in movies and television far out of proportion to its actual powers and jurisdiction, in part because it seems to exist on a level intriguingly beyond that of ordinary law enforcement, and it’s often been used to symbolize the sinister, the remote, or the impersonal. Yet when Cooper reveals himself to be a man of real empathy, quirkiness, and faith in the extraordinary, it comes almost as a relief. We want to believe that a person like this exists. Cooper carries a badge, he wears a tie, and he’s comfortable with a gun, but he’s here to enforce human reason in the face of a bewildering universe. The Black Lodge might be out there, but the Blue Rose task force is on it, and there’s something oddly consoling about the notion that it’s a part of the federal government. A few years later, Chris Carter took this premise and refined it into The X-Files, which, despite its paranoia, reassured us that somebody in a position of authority had noticed the weirdness in the world and was trying to make sense of it. They might rarely succeed, but it was comforting to think that their efforts had been institutionalized, complete with a basement office, a place in the org chart, and a budget. And for a lot of viewers, Mulder and Scully, like Cooper, came to symbolize law and order in stories that laugh at our attempts to impose it.
Even if you don’t believe in the paranormal, the image of the lone FBI agent—or two of them—arriving in a small town to solve a supernatural mystery is enormously seductive. It appeals to our hopes that someone in power cares enough about us to investigate problems that can’t be rationally addressed, which all stand, in one way or another, for the mystery of death. This may be why both Twin Peaks and The X-Files, despite their flaws, have sustained so much enthusiasm among fans. (No other television dramas have ever meant more to me.) But it’s also a myth. This isn’t really how the world works, and the second half of the Twin Peaks finale is devoted to tearing down, with remarkable cruelty and control, the very idea of such solutions. It can only do this by initially giving us what we think we want, and the first of last night’s two episodes misleads us with a satisfying dose of wish fulfillment. Not only is Cooper back, but he’s in complete command of the situation, and he seems to know exactly what to do at any given moment. He somehow knows all about Freddie and his magical green glove, which he utilizes to finally send Bob into oblivion. After rescuing Diane, he uses his room key from the Great Northern, like a magical item in a video game, to unlock the door that leads him to Mike and the disembodied Phillip Jeffries. He goes back in time, enters the events of Fire Walk With Me, and saves Laura on the night of her murder. The next day, Pete Martell simply goes fishing. Viewers at home even get the appearance by Julee Cruise that I’ve been awaiting since the premiere. After the credits ran, I told my wife that if it had ended there, I would have been totally satisfied.
But that was exactly what I was supposed to think, and even during the first half, there are signs of trouble. When Cooper first sees the eyeless Naido, who is later revealed to be the real Diane, his face freezes in a huge closeup that is superimposed for several minutes over the ensuing action. It’s a striking device that has the effect of putting us, for the first time, in Cooper’s head, rather than watching him with bemusement from the outside. We identify with him, and at the very end, when his efforts seemingly come to nothing, despite the fact that he did everything right, it’s more than heartbreaking—it’s like an existential crisis. It’s the side of the show that was embodied by Sheryl Lee’s performance as Laura Palmer, whose tragic life and horrifying death, when seen in its full dimension, put the lie to all the cozy, comforting stories that the series told us about the town of Twin Peaks. Nothing good could ever come out of a world in which Laura died in the way that she did, which was the message that Fire Walk With Me delivered so insistently. And seeing Laura share the screen at length with Cooper presents us with both halves of the show’s identity within a single frame. (It also gives us a second entry, after Blue Velvet, in the short list of great scenes in which Kyle MacLachlan enters a room to find a man sitting down with his brains blown out.) For a while, as Cooper drives Laura to the appointment with her mother, it seems almost possible that the series could pull off one last, unfathomable trick. Even if it means erasing the show’s entire timeline, it would be worth it to save Laura. Or so we think. In the end, they return to a Twin Peaks that neither of them recognizes, in which the events of the series presumably never took place, and Cooper’s only reward is Laura’s scream of agony.
As I tossed and turned last night, thinking about Cooper’s final, shattering moment of comprehension, a line of dialogue from another movie drifted into my head: “It’s too late. There’s no bringing her back.” It’s from Vertigo, of course, which is a movie that David Lynch and Mark Frost have been quietly urging us to revisit all along. (Madeleine Ferguson, Laura’s identical cousin, who was played by Lee, is named after the film’s two main characters, and both works of art pivot on a necklace and a dream sequence.) Along with so much else, Vertigo is about the futility of trying to recapture or change the past, and its ending, which might be the most unforgettable of any film I’ve ever seen, destroys Scottie’s delusions, which embody the assumptions of so many American movies: “One final thing I have to do, and then I’ll be rid of the past forever.” I think that Lynch and Frost are consciously harking back to Vertigo here—in the framing of the doomed couple on their long drive, as well as in Cooper’s insistence that Laura revisit the scene of the crime—and it doesn’t end well in either case. The difference is that Vertigo prepares us for it over the course of two hours, while Twin Peaks had more than a quarter of a century. Both works offer a conclusion that feels simultaneously like a profound statement of our helplessness in the face of an unfair universe and like the punchline to a shaggy dog story, and perhaps that’s the only way to express it. I’ve quoted Frost’s statement on this revival more than once: “It’s an exercise in engaging with one of the most powerful themes in all of art, which is the ruthless passage of time…We’re all trapped in time and we’re all going to die. We’re all traveling along this conveyor belt that is relentlessly moving us toward this very certain outcome.” Thirty seconds before the end, I didn’t know what he meant. But I sure do now. And I know at last why this show’s theme is called “Falling.”
The last tango
When I look back at many of my favorite movies, I’m troubled by a common thread that they share. It’s the theme of the control of a vulnerable woman by a man in a position of power. The Red Shoes, my favorite film of all time, is about artistic control, while Blue Velvet, my second favorite, is about sexual domination. Even Citizen Kane has that curious subplot about Kane’s attempt to turn Susan into an opera star, which may have originated as an unkind reference to William Randolph Hearst and Marion Davies, but which survives in the final version as an emblem of Kane’s need to collect human beings like playthings. It’s also hard to avoid the feeling that some of these stories secretly mirror the relationship between the director and his actresses on the set. Vertigo, of course, can be read as an allegory for Hitchcock’s own obsession with his leading ladies, whom he groomed and remade as meticulously as Scottie attempts to do with Madeleine. In The Shining, Jack’s abuse of Wendy feels only slightly more extreme than what we know Kubrick—who even resembles Jack a bit in the archival footage that survives—imposed on Shelley Duvall. (Duvall’s mental health issues have cast a new pall on those accounts, and the involvement of Kubrick’s daughter Vivian has done nothing to clarify the situation.) And Roger Ebert famously hated Blue Velvet because he felt that David Lynch’s treatment of Isabella Rossellini had crossed an invisible moral line.
The movie that has been subjected to this kind of scrutiny most recently is Last Tango in Paris, after interview footage resurfaced of Bernardo Bertolucci discussing its already infamous rape scene. (Bertolucci originally made these comments three years ago, and the fact that they’ve drawn attention only now is revealing in itself—it was hiding in plain sight, but it had to wait until we were collectively prepared to talk about it.) Since the story first broke, there has been some disagreement over what Maria Schneider knew on the day of the shoot. You can read all about it here. But it seems undeniable that Bertolucci and Brando deliberately withheld crucial information about the scene from Schneider until the cameras were rolling. Even the least offensive version makes me sick to my stomach, all the more so because Last Tango in Paris has been an important movie to me for most of my life. In online discussions of the controversy, I’ve seen commenters dismissing the film as an overrated relic, a vanity project for Brando, or one of Pauline Kael’s misguided causes célèbres. If anything, though, this attitude lets us off the hook too easily. It’s much harder to admit that a film that genuinely moved audiences and changed lives might have been made under conditions that taint the result beyond retrieval. It’s a movie that has meant a lot to me, as it did to many other viewers, including some I knew personally. And I don’t think I can ever watch it again.
But let’s not pretend that it ends there. It reflects a dynamic that has existed between directors and actresses since the beginning, and all too often, we’ve forgiven it, as long as it results in great movies. We write critical treatments of how Vertigo and Psycho masterfully explore Hitchcock’s ambivalence toward women, and we overlook the fact that he sexually assaulted Tippi Hedren. When we think of the chummy partnerships that existed between men like Cary Grant and Howard Hawks, or John Wayne and John Ford, and then compare them with how directors have regarded their female collaborators, the contrast couldn’t be more stark. (The great example here is Gone With the Wind: George Cukor, the original director, was fired because he made Clark Gable uncomfortable, and he was replaced by Gable’s buddy Victor Fleming. Vivien Leigh and Olivia de Havilland were forced to consult with Cukor in secret.) And there’s an unsettling assumption on the part of male directors that this is the only way to get a good performance from a woman. Bertolucci says that he and Brando were hoping to get Schneider’s raw reaction “as a girl, instead of as an actress.” You can see much the same impulse in Kubrick’s treatment of Duvall. Even Michael Powell, one of my idols, writes of how he and the other actors frightened Moira Shearer to the point of tears for the climactic scene of The Red Shoes—“This was no longer acting”—and says elsewhere: “I never let love interfere with business, or I would have made love to her. It would have improved her performance.”
So what’s a film buff to do? We can start by acknowledging that the problem exists, and that it continues to affect women in the movies, whether in the process of filmmaking itself or in the realities of survival in an industry that is still dominated by men. Sometimes it leads to abuse or worse. We can also honor the work of those directors, from Ozu to Almodóvar to Wong Kar-Wai, who have treated their actresses as partners in craft. Above all else, we can come to terms with the fact that sometimes even a masterpiece fails to make up for the choices that went into it. Thinking of Last Tango in Paris, I was reminded of Norman Mailer, who wrote one famous review of the movie and was linked to it in another. (Kael wrote: “On the screen, Brando is our genius as Mailer is our genius in literature.”) Years later, Mailer supported the release from prison of a man named Jack Henry Abbott, a gifted writer with whom he had corresponded at length. Six weeks later, Abbott stabbed a stranger to death. Afterward, Mailer infamously remarked:
I’m willing to gamble with a portion of society to save this man’s talent. I am saying that culture is worth a little risk.
But it isn’t—at least not like this. Last Tango in Paris is a masterpiece. It contains the single greatest male performance I’ve ever seen. But it wasn’t worth it.
The strange loop of Westworld
In last week’s issue of The New Yorker, the critic Emily Nussbaum delivers one of the most useful takes I’ve seen so far on Westworld. She opens with many of the same points that I made after the premiere—that this is really a series about storytelling, and, in particular, about the challenges of mounting an expensive prestige drama on a premium network during the golden age of television. Nussbaum describes her own ambivalence toward the show’s treatment of women and minorities, and she concludes:
This is not to say that the show is feminist in any clear or uncontradictory way—like many series of this school, it often treats male fantasy as a default setting, something that everyone can enjoy. It’s baffling why certain demographics would ever pay to visit Westworld…The American Old West is a logical fantasy only if you’re the cowboy—or if your fantasy is to be exploited or enslaved, a desire left unexplored…So female customers get scattered like raisins into the oatmeal of male action; and, while the cast is visually polyglot, the dialogue is color-blind. The result is a layer of insoluble instability, a puzzle that the viewer has to work out for herself: Is Westworld the blinkered macho fantasy, or is that Westworld? It’s a meta-cliffhanger with its own allure, leaving us only one way to find out: stay tuned for next week’s episode.
I agree with many of her reservations, especially when it comes to race, but I think that she overlooks or omits one important point: conscious or otherwise, it’s a brilliant narrative strategy to make a work of art partially about the process of its own creation, which can add a layer of depth even to its compromises and mistakes. I’ve drawn a comparison already to Mad Men, which was a show about advertising that ended up subliminally criticizing its own tactics—how it drew viewers into complex, often bleak stories using the surface allure of its sets, costumes, and attractive cast. If you want to stick with the Nolan family, half of Chris’s movies can be read as commentaries on themselves, whether it’s his stricken identification with the Joker as the master of ceremonies in The Dark Knight or his analysis of his own tricks in The Prestige. Inception is less about the construction of dreams than it is about making movies, with characters who stand in for the director, the producer, the set designer, and the audience. And perhaps the greatest cinematic example of them all is Vertigo, in which Scottie’s treatment of Madeleine is inseparable from the use that Hitchcock makes of Kim Novak, as he did with so many other blonde leading ladies. In each case, we can enjoy the story on its own merits, but it gains added resonance when we think of it as a dramatization of what happened behind the scenes. It’s an approach that is uniquely forgiving of flawed masterpieces, which comment on themselves better than any critic can, until we wonder about the extent to which they’re aware of their own limitations.
And this kind of thing works best when it isn’t too literal. Movies about filmmaking are often disappointing, either because they’re too close to their subject for the allegory to resonate or because the movie within the movie seems clumsy compared to the subtlety of the larger film. It’s why Being John Malkovich is so much more beguiling a statement than the more obvious Adaptation. In television, the most unfortunate recent example is UnREAL. You’d expect that a show that was so smart about the making of a reality series would begin to refer intriguingly to itself, and it did, but not in a good way. Its second season was a disappointment, evidently because of the same factors that beset its fictional show Everlasting: interference from the network, conceptual confusion, tensions between producers on the set. It seemed strange that UnREAL, of all shows, could display such a lack of insight into its own problems, but maybe it isn’t so surprising. A good analogy needs to hold us at arm’s length, both to grant some perspective and to allow for surprising discoveries in the gaps. The ballet company in The Red Shoes and the New York Inquirer in Citizen Kane are surrogates for the movie studio, and both films become even more interesting when you realize how much the lead character is a portrait of the director. Sometimes it’s unclear how much of this is intentional, but this doesn’t hurt. So much of any work of art is out of your control that you need to find an approach that automatically converts your liabilities into assets, and you can start by conceiving a premise that encourages the viewer or reader to play along at home.
Which brings us back to Westworld. In her critique, Nussbaum writes: “Westworld [is] a come-hither drama that introduces itself as a science-fiction thriller about cyborgs who become self-aware, then reveals its true identity as what happens when an HBO drama struggles to do the same.” She implies that this is a bug, but it’s really a feature. Westworld wouldn’t be nearly as interesting if it weren’t being produced with this cast, on this network, and on this scale. We’re supposed to be impressed by the time and money that have gone into the park—they’ve spared no expense, as John Hammond might say—but it isn’t all that different from the resources that go into a big-budget drama like this. In the most recent episode, “Dissonance Theory,” the show invokes the image of the maze, as we might expect from a series by a Nolan brother: get to the center of the labyrinth, it says, and you’ve won. But it’s more like what Douglas R. Hofstadter describes in I Am a Strange Loop:
What I mean by “strange loop” is—here goes a first stab, anyway—not a physical circuit but an abstract loop in which, in the series of stages that constitute the cycling-around, there is a shift from one level of abstraction (or structure) to another, which feels like an upwards movement in a hierarchy, and yet somehow the successive “upward” shifts turn out to give rise to a closed cycle. That is, despite one’s sense of departing ever further from one’s origin, one winds up, to one’s shock, exactly where one had started out.
This neatly describes both the park and the series. And it’s only through such strange loops, as Hofstadter has long argued, that any complex system—whether it’s the human brain, a robot, or a television show—can hope to achieve full consciousness.
The prop master
When we break down the stories we love into their constituent parts, we’re likely to remember the characters first. Yet the inanimate objects—or what a theater professional would call the props—are what feather that imaginary nest, providing a backdrop for the narrative and necessary focal points for the action. A prop can be so striking that it practically deserves costar status, like the rifle in The Day of the Jackal, or a modest but unforgettable grace note, like the cake of soap that Leopold Bloom carries in his pocket for much of Ulysses. It can be the MacGuffin that drives the entire plot or the lever that enables a single crucial moment, like the necklace that tips off Scottie at the end of Vertigo. Thrillers and other genre novels often use props to help us tell flat characters apart, so that an eyepatch or a pocket square is all that distinguishes a minor player, but this kind of cheap shorthand can also shade into the highest level of all, in which accessories like Sherlock Holmes’s pipe summon up an entire world of romance and emotion. And even if the props merely serve utilitarian ends, they’re still an aspect of fiction that writers could do well to study, since they can provide a path into a story or a solution to a problem that resists all other approaches.
They can also be useful at multiple stages. I’ve known for a long time that a list of props, like lists of any kind, can be an invaluable starting point for planning a story. The most eloquent expression of this I’ve ever found appears, unexpectedly, in Shamus Culhane’s nifty book Animation: From Script to Screen:
One good method of developing a story is to make a list of details. For example [for a cartoon about elves as clock cleaners in a cathedral], what architectural features come to mind—steeples, bells, windows, gargoyles? What props would the elves use—brushes, pails, mops, sponges…what else? Keep on compiling lists without stopping to think about them. Let your mind flow effortlessly, and don’t try to be neat or orderly. Scribble as fast as you can until you run out of ideas.
A list of props can be particularly useful when a story takes place within a closed universe with a finite number of possible combinations. Any good bottle episode invests much of its energy into figuring out surprising ways to utilize the set of props at hand, and I used an existing catalog of props—in the form of the items available for purchase from the commissary at Belmarsh Prison—to figure out a tricky plot point in Eternal Empire.
What I’ve discovered more recently is that a list of props also has its uses toward the end of the creative process, when a short story or novel is nearly complete. If I have a decent draft that somehow lacks overall cohesiveness, I’ll go through and systematically make a list of all the props or objects that appear over the course of the story. Whenever I find a place where a prop that appears in one chapter can be reused down the line, it binds events together that much more tightly. When we’re writing a first draft, we have so much else on our minds that we tend to forget about object permanence: a prop is introduced when necessary and discarded at once. Giving some thought to how those objects can persist makes the physical space of the narrative more credible, and there’s often something almost musically satisfying when a prop unexpectedly reappears. (One of my favorite examples occurs in Wong Kar-Wai’s Chungking Express. During the sequence in which Faye Wong breaks into Tony Leung’s apartment to surreptitiously rearrange and replace some of his possessions, she gives him a new pair of sandals, throwing the old pair behind the couch. Much later, after she floods his living room by mistake, one of the old sandals comes floating out from its hiding place. It only appears onscreen for a moment, and nobody even mentions it, but it’s an image I’ve always treasured.)
And in many cases, the props themselves aren’t even the point. I’ve said before that one of the hardest things in writing isn’t inventing new material but fully utilizing what you already have. Nine times out of ten, when you’re stuck on a story problem, you’ll find that the solution is already there, buried between the lines on a page you wrote months before. The hard part is seeing past your memories of it. A list of props, assembled as drily as if you were a claims adjuster examining a property, can provide a lens through which the overfamiliar can become new. (This may be why histories of the world in a hundred objects, or whatever, are so popular: they give us a fresh angle on old events by presenting them through props, not personalities.) When you look at it more closely, a list of props is really a list of actions, or moments in which a character expresses himself by performing a specific physical activity. Unless you’re just giving us an inventory of a room’s contents, as Donna Tartt loves to do, a prop usually appears only when it’s being used for something. Props thus represent the point in space where intention becomes action, expressed in visual or tactile terms—which is exactly what a writer should always be striving to accomplish. And a list of props is nothing less than a list of the times when the story is working more or less as it should.
The list of a lifetime
I miss Roger Ebert for a lot of reasons, but I always loved how fully he occupied the role of the celebrity critic while expanding it into something more. “Two thumbs up” has become a way of dismissing an entire category of film criticism, and Ebert was as responsible for its rise as anyone else, although he can hardly be blamed for his imitators. Yet he wouldn’t have been nearly as good at it—and he was damned good, especially when paired with Gene Siskel—if it hadn’t been built on a foundation of shrewdness, taste, and common sense that came through in every print review he wrote. He knew that a rating system was necessary, if only to give shape to his discussions with Gene, but he was also aware of its limitations. (For proof, you need only turn to his classic review of the Adam Sandler remake of The Longest Yard, which transforms, unexpectedly, into an extended essay on the absurdity of reconciling a thoughtful approach to criticism with “that vertical thumb.”) Read any critic for any length of time, whether it’s Pauline Kael or David Thomson or James Wood, and you start to see the whole business of ranking works of art, whether with thumbs or with words, as both utterly important and inherently ridiculous. Ebert understood this profoundly.
The same was true of the other major tool of the mainstream critic: the list. Making lists of the best or worst movies, like handing out awards, turns an art form into a horse race, but it’s also a necessary evil. A critic wants to be a valued guide, but more often, he ends up serving as a signpost, pointing up the road toward an interesting vista while hoping that we’ll take in other sights along the way. Lists are the most useful pointers we have, especially for viewers who are encountering the full variety of movies for the first time, and they’ve played an enormous role in my own life. And when you read Ebert’s essay on preparing his final list for the Sight & Sound poll, you sense both the melancholy nature of the task and his awareness of the power it holds. Ebert knows that adding a movie to his list naturally draws attention to it, and he pointedly includes a single “propaganda” title—here it’s Malick’s Tree of Life—to encourage viewers to seek it out. Since every addition requires a removal, he clarifies his feelings on this as well:
Once any film has ever appeared on my [Sight & Sound] list, I consider it canonized. Notorious or Gates of Heaven, for example, are still two of the ten best films of all time, no matter what a subsequent list says.
In short, he approaches the list as a game, but a serious one, and he knows that pointing one viewer toward Aguirre or The General makes all of it worthwhile.
I thought of his example repeatedly when I revised my list of my ten favorite movies. Four years had gone by since my last series of posts on the subject, and the passage of time had brought a bit of reshuffling and a pair of replacements: L.A. Confidential and Star Trek II: The Wrath of Khan had given way to Vertigo and Inception. And while it’s probably a mistake to view it as a zero-sum game, it’s hard not to see these films as commenting on one another. L.A. Confidential remains, as I said long ago, my favorite of all recent Hollywood movies, but it’s a film that invests its genre with greater fluency and complexity without challenging the rules on a deeper level, while Vertigo takes the basic outline of a sleek romantic thriller and blows it to smithereens. As much as I love them both, there’s no question in my mind as to which one achieves more. The contest between Inception and Wrath of Khan is harder to judge, and I’m not sure that the latter isn’t ultimately richer and more rewarding. But I wanted to write about Inception ever so slightly more, and after this weekend’s handwringing over the future of original ideas in movies, I have a hunch that its example is going to look even more precious with time. Inception hardly needs my help to draw attention to it, but to the extent that I had a propaganda choice this time around, it was this one.
Otherwise, my method in ranking these films was a simple one. I asked myself which movie I’d save first—solely for my own pleasure—if the last movie warehouse in the world were on fire. The answer was The Red Shoes. Next would be Blue Velvet, then Chungking Express, and so on down the line. Looking at the final roster, I don’t think I’d make any changes. Like Ebert, who kept La Dolce Vita on his list because of how it reflected the arc of his own life, I’m aware that much of the result is a veiled autobiography: Blue Velvet, in particular, galvanized me as a teenager as few other movies have, and part of the reason I rank it so highly is to acknowledge that specific debt. Other films are here largely because of the personal associations they evoke. Yet any movie that encapsulates an entire period in my life, out of all the films I was watching then, has to be extraordinary by definition: it isn’t just a matter of timing, at least not if it lasts. (You could even say that a great movie, like Vertigo, is one that convinces many different viewers that it’s secretly about them.) Ebert knew that there was no contradiction in embracing The Tree of Life as both the largest cosmic statement since 2001 and an agonizingly specific evocation of his own childhood. Any list, like any critic, lives in two worlds, and each half gains meaning from the other. And when I think of my own list and the choices it made, I can only quote Ebert one last time: “To add a title, I must remove one. Which film can I do without? Not a single one.”
My ten great movies #6: Vertigo
Like many great works of art, Vertigo lingers in the imagination—perhaps more than any other movie I’ve ever seen—because it oscillates so nervously between its surface pleasures and its darkest depths. It’s both the ultimate Hitchcock entertainment, with its flawless cinematography, iconic Edith Head costumes, and lush Bernard Herrmann score, and the most psychologically complex film ever produced in America. In many respects, it’s the most mysterious movie ever made, but whenever I watch it again, I’m struck by how much of it is grounded in specifics: the mundane details of Scottie’s life, the beautiful but realistic San Francisco settings, and the way his obsession with Madeleine manifests itself in trips to salons and department stores. Early on, it can come off as routine, even banal, which leaves us even less prepared for its climax, a sick joke or sucker punch that also breaks the heart. There’s no greater ending in all of movies, and it works because it’s so cruel, arbitrary, and unfair.
Vertigo takes so many insane, unjustifiable risks that it inevitably feels flawed in places, despite long stretches of icy perfection: the plot sometimes creaks, especially in the first half, and the dialogue scenes often feel like part of a lesser film. But all these concerns are swept away by the extraordinary third act, which may be my favorite in any work of art. I’ve noted before how the original novel keeps the big revelation for the very end, while the film puts it almost forty minutes earlier, shifting points of view and dividing the viewer’s loyalties in the process. It’s a brilliant change—arguably no other creative decision in any movie adaptation has had a greater impact—and it turns the movie from an elegant curiosity into something indescribably beautiful and painful. When Judy turns to the camera and the image is flooded with red, we’re as close to the heart of movies as we’ll ever get. As David Thomson writes: “It’s a test case. If you are moved by this film, you are a creature of cinema. But if you are alarmed by its implausibility, its hysteria, its cruelty—well, there are novels.”
Next week: The most enduring of all Hollywood films, and a bittersweet reminder of what might have been.
How to be ambiguous
Note: I’m traveling for the next few days, so I’ll be republishing a few of my favorite pieces from earlier in this blog’s run. This post originally appeared on May 22, 2013.
Writers are generally advised to avoid ambiguity. Clarity, as E.B. White observes, may not be a substitute for merit in writing, but it’s as close as we can get, so it’s good form for authors to state things as clearly as they can. It’s certainly the best rule to follow if there’s any doubt. Yet this does nothing to explain the fact that many of the works of art that affect us so deeply—from Hamlet to Vertigo to, yes, Mad Men—are founded on ambiguity. As in the case of most masterpieces, these can be dangerous examples for a writer to follow, but they’re also very tempting. Great fiction survives in the imagination because of the constellation of questions it raises in the reader’s mind, and the problem of balancing such uncertainties with a narrative that remains clear from moment to moment is one of the most difficult issues for a writer to face. And it soon becomes obvious, after writing or reading a few examples, that ambiguous language is not the best way to create a larger superimposition of interpretations.
As usual, we can get some useful insights by looking at poetry, the leading edge of language, whose lessons and innovations tend to filter down centuries later into prose. Poetry is often seen as ambiguous or obscure, but when you examine the greatest poems line by line, you find that this is an effect generated by the resonance of highly specific images—nouns, verbs, and concrete adjectives, all intelligible in themselves but mysterious as a whole. Take, for instance, the poem that I.A. Richards has called “the most mysterious poem in English,” Shakespeare’s “The Phoenix and the Turtle.” Each stanza stands with crystal clarity, and often something more, but the result has been interpreted as everything from a Catholic allegory to a veiled reference to the relationship between Sir John Salusbury and Queen Elizabeth, and as it stands, it’s a puzzle without an answer. A prefatory note spelling it out would have avoided much of this confusion, but in the process, it would have destroyed the magic.
Which leads us to a very important point: ambiguity is best created out of a network of specifics with one crucial piece removed. It’s often been observed, for instance, that much of the mystery of Shakespeare’s plays emerges from the fact that he omits part of his original source material while leaving other elements intact. In the original Amleth story, there’s no confusion about the reasons for the lead character’s madness: he believes that his uncle is plotting against his life, so in order to protect himself and mislead his enemies, he pretends to be an idiot. Hamlet takes away this detail—Claudius doesn’t seem particularly interested in killing Hamlet at all until after he starts to act like a lunatic—and creates a tantalizing ambiguity in the process. The same is true of King Lear, in which the original source more clearly explains the king’s reasons for putting his three daughters to the test. The resulting plays are filled with concrete language and action, but the mystery remains.
And this is true of many works of art. We never know the origins of Montresor’s murderous vendetta in “The Cask of Amontillado,” but the story itself is so detailed that it practically serves as a manual on how to wall a man up alive, even as Poe denies us the one piece of information that most writers would have included first. (If Poe were alive today, I suspect that his editor would have begged him to flesh out the backstory.) Vertigo is the most mysterious movie ever made, but on watching it again, I’m struck by how much of it is grounded in specifics—the mundane details of Scottie’s life, the beautiful but realistic San Francisco settings, the way his obsession with Madeleine manifests itself in trips to salons and department stores. Ambiguity, in other words, is only effective when the story itself is concrete enough to convincingly support multiple interpretations, which, in practice, usually means an even greater attention to clarity and convincing detail than if the line of the narrative were perfectly clear. A map that contains a single path can afford to leave the rest of the territory blank, but if we’re going to find our way down more than one road, we’ll need a better sense of the landscape, even, or especially, if the landmarks lead us astray.
“You have a better chance than I do…”
Note: This post is the fifteenth installment in my author’s commentary for Eternal Empire, covering Chapter 16. You can read the previous installments here.
Writers are often told that it’s a mistake to build their stories around luck, particularly if it works to the hero’s advantage. As Pixar storyboard artist Emma Coats famously said: “Coincidences to get characters into trouble are great; coincidences to get them out of it are cheating.” And it seems intuitively true that a story, whenever possible, should arise out of decisions made by the protagonist and antagonist. Yet this is a genre convention in itself, and it isn’t there, in spite of appearances, because it’s more “realistic.” Luck plays an enormous role in real life, and if we exclude it from our plotting, it isn’t for the sake of realism, but of plausibility, and those are two entirely different things. In Adventures in the Screen Trade, William Goldman describes a hypothetical scene in which the hero, tasked with entering a heavily secured castle, simply blunders in without a plan. He climbs the wall within sight of the guards, who don’t react; wanders around for a while in plain view; trips a few alarms without drawing any attention; and finally ends up, by accident, in the room he’s trying to enter. If this were a movie, we’d throw tomatoes. But it’s exactly how a man named Michael Fagan once broke into Buckingham Palace and ended up in the bedroom of the Queen.
If we rule out such moments of luck in fiction, it isn’t because they can’t happen, but because we feel that they take the writer and characters off the hook. It seems lazy, and worse, it pulls us out of the fictional dream by breaking an implied contract between author and reader, which states that events should emerge from the logical consequences of the characters’ actions. But there’s one interesting exception. Sometimes the master plan is so farfetched that only an absurdly omniscient protagonist could pull it off, anticipating every last detail with pinpoint precision. (Think, for instance, of the Saw movies, or even of the Joker’s stratagems in The Dark Knight.) This can be even less believable than a plan that hinges on luck, so constructing the plot turns into a choice between implausibilities—or, better, a balance between the two. You could see it as a problem of narrative engineering: a solution that depends solely on either luck or unerring foresight collapses under its own unlikelihood, but a combination of the two stands firm. The challenge lies in mixing these elements in the right proportions, with a little luck and a little cleverness, so that the reader or viewer doesn’t regard the result as anything less than a natural development.
And whenever luck is involved, it’s best to push it as far from the center of the story as possible, or to make it a fait accompli, so that it seems less like a stroke of fortune than a condition of the plot itself. Most movies about an impossible heist, for instance, hinge on elements of luck: there’s always a convenient air duct, or a hallway without any security cameras, or a moment when the guards change shifts. A well-constructed story will introduce these elements as soon as it can. If Danny Ocean stumbles across an unsecured ventilation shaft during the heist, we cry foul; if he mentions it beforehand, we more or less accept it, although the element of luck is exactly the same. On a higher level, the villain’s complicated plan in Vertigo depends on a single huge assumption, as Hitchcock himself admitted to François Truffaut:
The husband was planning to throw his wife down from the top of the tower. But how could he know that James Stewart wouldn’t make it up those stairs? Because he became dizzy? How could he be sure of that!
Truffaut’s response is revealingly pragmatic: “That’s true, but I saw it as one of those assumptions you felt people would accept.” Which we do—but only because it’s there in the title of the movie itself, as a kind of anthropic principle on which the whole story depends. It’s still luck, but in a form that can’t be separated from the fabric of the overall movie.
I made good use of this principle in Eternal Empire, which includes more than its fair share of wild notions. Arguably the largest involves a plot point early in the novel: Maya Asthana, my unlikely mole, has to kill a man held in solitary confinement while avoiding all suspicion. At the very least, it was necessary that she be left alone with him without any security cameras—and here, already, were two big implausibilities. I “solved” the problem by putting it entirely out of her hands. Earlier in the novel, she and Wolfe visit Rogozin in detention, and it’s Wolfe who asks that the cameras be turned off, supposedly to put the suspect at ease, but really to make it less glaring when Asthana makes the same request later on. Similarly, in Chapter 16, it’s Wolfe who tells her to visit Rogozin, saying that she’s under too much scrutiny to go herself, while unwittingly setting the stage for Asthana’s plan. Clearly, from Asthana’s point of view, these are two enormous strokes of luck. I was reasonably fine with this, though, because the alternative, in which Asthana arranges for an unobserved visit entirely on her own initiative, would be even less plausible. Like most good villains, Asthana knows how to play the hand she’s been dealt. And if the deck has been stacked in her favor, hopefully the reader won’t see this until after the trick is over…
The three kinds of surprise
In real life, most of us would be happy to deal with fewer surprises, but in fiction, they’re a delight. Or at least that’s what movies and television would like us to believe. In practice, twist endings and plot developments that arrive out of left field can be exhausting and a little annoying, if they emerge less out of the logic of the story than from a mechanical decision to jerk us around. I’ve noted before that our obsession with big twists can easily backfire: if we’re conditioned to expect a major surprise, it prevents us from engaging with the narrative as it unfolds, since we’re constantly questioning every detail. (In many cases, the mere knowledge that there is a twist counts as a spoiler in itself.) And Hitchcock was smart enough to know that suspense is often preferable to surprise, which is why he restructured the plot of Vertigo to place its big reveal much earlier than it occurs in the original novel. Writers are anxious to prevent the audience from getting ahead of the story for even a second, but you can also generate a lot of tension if viewers can guess what might be coming just slightly before the characters do. Striking that balance requires intelligence and sensitivity, and it’s easier, in general, to just keep throwing curveballs, as shows like 24 did until it became a cliché.
Still, a good surprise can be enormously satisfying. If we start from first principles, building on the concept of the unexpected, we end up with three different categories:
1. When something happens that we don’t expect.
2. When we expect something to happen, but something else happens instead.
3. When we expect something to happen, but nothing happens.
And it’s easy to come up with canonical examples of all three. For the first, you can’t do much better than the shower scene in Psycho; for the second, you can point to something like the famous fake-out in The Silence of the Lambs, in which the intercutting of two scenes misleads us into thinking that an assault team is closing in on Buffalo Bill, when Clarice is really wandering into danger on her own; and for the third, you have the scene in The Cabin in the Woods when one of the characters is dared to make out with the wolf’s head on the wall, causing us to brace ourselves for a shock that never comes. And these examples work so elegantly because they use our knowledge of the medium against us. We “know” that the protagonist won’t be killed halfway through; we “know” that intercutting implies convergence; and we “know” when to be wary of a jump scare. And none of these surprises would be nearly as effective for a viewer—if one even exists—who could approach the story in complete naiveté.
But not every surprise is equally rewarding. A totally unexpected plot development can come dangerously close—like the rain of frogs in Magnolia—to feeling like a gimmick. The example I’ve cited from The Silence of the Lambs works beautifully on first viewing, but over time, it starts to seem more like a cheat. And there’s a fine line between deliberately setting up a plot thread without paying it off and simply abandoning it. I got to thinking about this after finishing the miniseries Fargo, which I loved, but which also has a way of picking up and dropping story points almost absentmindedly. In a long interview with The A.V. Club, showrunner Noah Hawley tries to explain his thought process, with a few small spoilers:
Okay, Gus is going to arrest Malvo in episode four, and he’s going to call Molly to tell her to come, but of course, she doesn’t get to go because her boss goes. What you want is the scene of Molly and Malvo, but you’re not getting it…
In episode ten when Gus tells her to stay put, and she just can’t, and she gets her keys and goes to the car and drives toward Lester, we are now expecting a certain event to happen. Therefore, when that doesn’t happen, there’s the unpredictable nature of what’s going to happen, and you’re coming into it with an assumption…
By giving Russell that handcuff key, people were going to expect him to be out there for the last two episodes and play some kind of role in the end game, which is never a bad thing, to set some expectations [that don’t pay off].
Fargo is an interesting test case because it positions itself, like the original movie, as based on true events, when in fact it’s totally fictional. In theory, this frees it up to indulge in loose ends, coincidences, and lack of conventional climaxes, since that’s what real life is like. But as much as I enjoyed Fargo, I’m not sure I really buy it. In many respects, the show is obsessively stylized and designed; it never really feels like a story that could take place anywhere but in the Coenverse. And there are times when Hawley seems to protest too much, pointing to the lack of a payoff as a subversion when it’s really more a matter of not following through. The test, as always, is a practical one. If the scene that the audience is denied is potentially more interesting than what actually happens, it’s worth asking if the writers are being honest with themselves: after all, it’s relatively easy to set up a situation and stop, while avoiding the hard work that comes with its resolution. A surprise can’t just be there to frustrate our expectations; it needs to top them, or to give us a development that we never knew we wanted. It’s hard to do this even once, and even harder to do it consistently. But if the element of surprise is here to stay—and it doesn’t seem to be going anywhere—then it should surprise us, above all else, with how good it is.
“Is Rogozin an intelligence agent?”
Note: This post is the fifth installment in my author’s commentary for Eternal Empire, covering Chapter 4. You can read the previous installments here.
Earlier today, I quoted David Pye, the late professor of furniture design at the Royal College of Art, on the fact that every design is in certain ways a failure, since it represents a compromise between so many contradictory factors. Pye continues: “It is quite impossible for any design to be ‘the logical outcome of the requirements,’ simply because, the requirements being in conflict, their logical outcome is an impossibility.” I like this observation because it flies in the face of how we normally regard the objects around us. Unless the result is an outright fiasco—a chair that breaks, a program that crashes—it can be hard to register its many invisible compromises. For most of us, a chair is just a chair, at least at first glance. It’s only after we’ve sat in it for a long time that we start to understand the ways in which it falls short. And many designers devote as much energy to postponing that moment of realization as to addressing the underlying problems themselves. Ideally, by the time we notice the flaws, we’ll have moved on, in the way that a small kitchen gadget is more likely to be lost before it breaks.
The idea of the postponement of failure, or the user’s perception of it, is central to design. As J.E. Gordon points out in Structures, sooner or later all bridges fall down, and he concludes: “It is the purpose of medicine and engineering to postpone these occurrences for a decent interval.” In a work of narrative art, you want to push that failure beyond the bounds of the story itself. Nearly every novel includes elements that would snap or break if extended too long, and that’s as true of an intimate domestic drama as it is of the most outlandish thriller. A big part of knowing when to begin or end a story lies in identifying the chunk of the action—which in theory could be extended infinitely in either direction—that seems in the least danger of falling apart. Even a masterpiece like Vertigo depends on selective omission, elision of inconvenient details, and a focus on those pieces that serve the story’s emotional needs. Later, we might question a few points, but as Hitchcock said, they won’t occur to us until late at night, when we’re going to the refrigerator for a snack. And if we can define the boundaries of the story in a way that defers those objections until the book is closed or the credits roll, we’ve succeeded.
This applies as much to a story’s small structural decisions as to questions of logic. Every extended narrative consists of adjacent pieces that probably occurred to the author at different stages. A crucial detail in the first chapter may have only been added in the last draft, after the rest of the story had already been written—in fact, it’s very likely to have done so—but it all has to seem as if it unfolded naturally from that initial premise. It may be true, as Pye says, that no object can truly be “the logical outcome of the requirements,” but just as a chair has to look like something we can take for granted, a story has to present itself as a series of inevitable developments, even if it wasn’t. In a perfect world, we’d all proceed as David Mamet advises, with each incident flowing from a clear series of objectives, but in practice, there’s so much else that a viable novel begs to include: characters who thrust themselves on the writer’s attention, color from the world that can’t be suppressed, functional scenes that exist only to make something wonderful happen later, or just things that the author wants to talk about. This assemblage of material is so central to what makes good novels come to life that it’s a mistake to deny it. But we’re still left with the problem of how to make the result seem consistent.
In Eternal Empire, for example, my biggest narrative challenge was the fact that the three opening subplots—Maddy’s recruitment to go undercover, Wolfe’s arrest of the dissident Vitaly Rogozin, and Ilya’s ordeal in prison—had almost nothing to do with one another. Later, they’ll converge in ways that I hope are surprising and satisfying, but for most of the novel’s first half, and especially in the first few chapters, there was a real danger that they’d be perceived as three unrelated stories that I was arbitrarily cutting together. The solution was to make each chapter look like it was about the same thing, which I did by harking back to the Rogozin plot whenever I could. Chapter 3 opens with Powell asking Maddy if she’s heard what happened, while Chapter 5 starts with her reading a newspaper with an article about the Rogozin case, neither of which is strictly necessary. And in Chapter 4, I show Ilya discussing these events with his lawyer, although the two men could have been talking about nearly anything: I just needed to get them into the same room. It’s something of a cheat, and it reminds me a little of Homer’s line in “The Itchy and Scratchy and Poochie Show”: “Whenever Poochie’s not on screen, all the other characters should be asking, ‘Where’s Poochie?'” But it works. And even if the chair wobbles a little, it will hopefully stand up long enough to get you to the next page…
The broken circle of Gone Girl
Note: A few oblique spoilers follow for Gone Girl.
Gone Girl, which finally arrives on home video this week, was predictably shut out at last night’s Golden Globe Awards. But while that particular ceremony may be something of a farce—as this article hilariously reminds us, honorees are determined by the votes of eighty-seven “total randos”—it feels like an indicator of the film’s position as we enter the back end of awards season. The Golden Globes didn’t even give it a Best Picture nomination, and it has largely fallen out of the Oscar conversation: it’s a likely nominee that tops no one’s list of potential winners. And it isn’t hard to see why. Gone Girl may be a commercial hit with universal critical acclaim, but it also falls into a genre of chilly, manipulative puzzle boxes that rarely earn major awards. It took decades for Vertigo to claim its true status as the central American movie of the fifties, in part because it looks so much at first glance like an implausible toy. Gone Girl isn’t as good as Vertigo, which is admittedly the highest possible standard to which a movie like this can be held, but it’s revealing that I even feel like discussing them in the same sentence.
That said, I was halfway expecting Gillian Flynn to walk away with a win for her sharp, canny screenplay, which is a category in which similarly tricky movies, like The Usual Suspects, have sometimes eked out a consolation prize. Flynn’s original novel hinges on a conceit—a diary that only gradually reveals that it has been written by an unreliable narrator—that should be all but impossible to make work on film, and the fact that it gets even ninety percent of the way there is a considerable achievement. (I’m deducting only a few points for one scene, a fictionalized flashback, that really should have occurred later on in the movie, after the diary itself had been discovered and read. Still, the movie as a whole is so tightly constructed that I’m willing to let it pass: it’s the kind of objection that only occurs to the viewer after the fact, and it probably works better in the moment to have the scene come where it does.) David Fincher’s direction pulls off a parallel feat; he’s a filmmaker whose attention to detail and technical obsessiveness have a way of calling attention to themselves, but here, as in The Social Network, he makes it all look easy, when it really represents a solution to almost insurmountable narrative challenges.
Gone Girl jerks us around so expertly, in fact, that I’m still a bit surprised that it falters near the end, when it suddenly stops and asks us to take it all very seriously. Again, the comparison with Hitchcock is an instructive one. Hitchcock would have loved this story, almost as much as he would have loved Rosamund Pike, but he wouldn’t have made the mistake of ruining the fun with an agonized denouement. He might have given us an ironic closing image, a last little shock, or even a gag—which doesn’t mean it wouldn’t have been emotionally satisfying. The final shot of Psycho, with Anthony Perkins’s face dissolving into a few subliminal frames of his mother’s skull as he looks into the camera, is the kind of closing fillip that gets a laugh even as it burrows into our unconscious. Hitchcock films as different as Notorious and Frenzy end on a similar punchline, and The Birds came close to doing the same. At the highest level of all, you have the ending of Vertigo, a sick joke that also breaks the heart. There’s no greater ending in all of movies, and it works because it’s so cruel, arbitrary, and unfair. (The alternate ending, which was apparently shot purely to appease European censors, only reminds us of how perfect it is to leave Stewart alone on that ledge.)
If Vertigo works so well, it’s because it exists within its own sealed world, until every element seems to stand for something else in our waking life. It isn’t an allegory, exactly; it’s more like a literalization, within the conventions of the thriller, of the way in which we impose new faces on ourselves and others, or try in our doomed way to recapture the past. Gone Girl covers much of the same territory, and if it’s interesting on a level beyond that of a clinical game, it’s as a heightened vision of what any marriage threatens to be—not just Affleck and Pike’s, but everybody’s. Oddly, it’s in stepping out of that closed circle that it becomes less convincing: when it returns us to reality, the prior ordeal starts to seem less real, or like a freak outlier, when the movie would have been better off keeping us immersed in the paranoid dream it creates. (The other great comparison here is Otto Preminger’s Laura, which hints explicitly that its second half is taking place within the hero’s head, but denies us a scene when he wakes up again.) The tradition of noir, to which Gone Girl is an honorable extension, works because it presents a mirror universe of our own, with a different set of rules but equally inexorable logic. Gone Girl comes to the point of implying that the same is true of any marriage, but it ends by being about theirs, not ours.
“And they lived happily ever after…”
In old age, I accept unhappy endings in Shakespearean tragedy, Flaubert, and Tolstoy, but back away from them in lesser works. Desdemona, Cordelia, Emma Bovary, and Anna Karenina are slain by their creators, and we are compelled to absorb the greatness of the loss. Perhaps it trains us to withstand better the terrible deaths of friends, family, and lovers, and to contemplate more stoically our own dissolution. But I increasingly avoid most movies with unhappy endings, since few among them aesthetically earn the suffering they attempt to inflict upon us.
—Harold Bloom
I’m starting to feel the same way. For most of my life, I’ve never shied away from works of art with unhappy endings: in movies, the list begins and ends with Vertigo, the greatest sucker punch ever inflicted on an audience, and includes films as different as The Red Shoes, The Third Man, and Dancer in the Dark. When I’m given a choice between ambiguous interpretations, as in Inception, I’m often inclined to go with the darker reading. But as time goes on, I’ve found that I prefer happy endings, both from a purely technical standpoint and as a matter of personal taste.
Which isn’t to say that unhappy endings can’t work. Yesterday, I cited Bruno Bettelheim on the subject of fairy tales, which invariably end on an unambiguously happy note to encourage children to absorb their implicit lessons about life. As adults, our artistic needs are more complicated, if not entirely dissimilar. An unhappy ending of the sort that we find in the myth of Oedipus or Madame Bovary is psychological training of a different sort, preparing us, as Bloom notes, for the tragic losses that we all eventually experience. Just as scary movies acquaint us with feelings of terror that we’d rarely feel under ordinary circumstances, great works of art serve as a kind of exercise room for the emotions, expanding our capacity to feel in ways that would never happen if we only drew on the material of our everyday lives. If the happy endings in fairy tales prepare and encourage children to venture outside the safe confines of family into the wider world, unhappy endings in adult fiction do the opposite: they turn our attention inward, forcing us to scrutinize aspects of ourselves that we’ve been trained to avoid as we focus on our respectable adult responsibilities.
In order for this to work, though, that unhappiness has to be authentically earned, and the number of works that pull it off is vanishingly small. Endings, whether happy or unhappy, are very hard, and a lot of writers, including myself, are often unsure if they’ve found the right way to end a story. But given that uncertainty, it’s wisest, when you don’t know the answer, to err on the positive side, and to ignore the voice that insists that an unhappy ending is somehow more realistic and uncompromising. In fact, a bleak, unearned ending is just as false to the way the world works as an undeserved happy one, and at greater cost to the reader. A sentimental happy ending may leave us unsatisfied with the author’s work, but that’s nothing compared to our sense of being cheated by a dark conclusion that arises from cynicism or creative exhaustion. Simply as a matter of craft, stories work best when they’re about the restoration of order, and one that ends with the characters dead or destroyed by failure technically meets that requirement. But for most writers, I’d argue that being able to restore a positive order to the tangle of complications they’ve created is a sign of greater artistic maturity.
And while it’s nice to believe that a happy or unhappy ending should flow naturally from the events that came before, a casual look at the history of literature indicates that this isn’t the case. Anna Karenina survived in Tolstoy’s first draft. Until its final act, Romeo and Juliet isn’t so different in tone from many of Shakespeare’s comedies, and if the ending had been changed to happily reunite the two lovers, it’s likely that we’d have trouble imagining it in any other way—although it’s equally likely that we’d file it permanently among his minor plays. On the opposite end of the spectrum, The Winter’s Tale is saved from becoming a tragedy only by the most arbitrary, unconvincing, and deeply moving of authorial contrivances. In practice, the nature of an ending is determined less by the inexorable logic of the plot than by the author’s intuition when the time comes to bring the story to a close, and as we’ve seen, it can often go either way. A writer has no choice but to check his gut to see what feels right, and I don’t think it’s too much to say that the burden lies with the unhappy ending to prove that it belongs there. Any halfway competent writer can herd his characters into the nearest available chasm. But when in doubt, get them out.
The best closing shots in film
Note: Since I’m taking a deserved break for the holidays, I’m reposting a couple of my favorite entries from early in this blog’s run. This post was originally published, in a slightly different form, on January 13, 2011. Visual spoilers follow. Cover your eyes!
As I’ve noted before, the last line of a novel is almost always of interest, but the last line of a movie generally isn’t. It isn’t hard to understand why: movies are primarily a visual medium, and there’s a sense in which even the most brilliant dialogue can often seem beside the point. And as much as the writer in me wants to believe otherwise, audiences don’t go to the movies to listen to words: they go to look at pictures.
Perhaps inevitably, then, there are significantly more great closing shots in film than there are great curtain lines. Indeed, the last shot of nearly every great film is memorable, so the list of finalists can easily expand into the dozens. Here, though, in no particular order, are twelve of my favorites. Click for the titles:
Cindy Sherman and the viewer’s vertigo
Ranking artists and their works may be little more than a critical parlor game, but as games go, it’s fun and sometimes instructive, and it’s hard for me to resist a good list. I’ve been fascinated for as long as I can remember by the Sight & Sound poll of great movies, as much for its alterations over time as its current snapshot of our cultural canon: Vertigo‘s rise, Bergman’s fall, the ascent of such directors as Abbas Kiarostami and Wong Kar-Wai. Contemporary visual art doesn’t have a similar cyclical poll, but maybe it should, if only for what it tells us about our shifting tastes. Vanity Fair recently conducted a survey of art world luminaries to determine the greatest living artist, and at first glance, the results are more or less what you’d expect: Gerhard Richter at the top, followed closely by Jasper Johns, with such familiar names as Richard Serra, Bruce Nauman, and Ellsworth Kelly appearing lower down. The omissions, too, are fascinating: Damien Hirst got a paltry three votes, Julian Schnabel none at all, which only reminds us how quickly the critical consensus can change, or how little auction prices and celebrity count in the long run. Yet a poll like this says as much about the way we see art as about the artists themselves, and I’d like to focus, in particular, on the most highly ranked woman on the list, Cindy Sherman, whose career reveals so much about the kind of art that grabs and maintains our attention.
I’ve been a Sherman devotee for a long time, ever since discovering her Untitled Film Stills in my sophomore year in college. In some ways, Sherman is the secret muse behind my novels: along with Diane Arbus and Susan Sontag, she’s the artist who first got me thinking about the ambivalent relationship of photography toward its practitioners and its subjects, a thread that runs throughout The Icon Thief, in which one of her photos makes a cameo appearance. And it’s no accident that all of these artists were women. The story of women and photography is a tangled one, tinged with notions of power and powerlessness, agency and objectification, the male gaze and the need to document lives that might otherwise go unseen. Arbus is best known for her photos of others, in the most literal sense, but we’re also fascinated by her self-portraits, and Sherman has always been her only model. The Untitled Film Stills are hugely powerful—I vividly remember my first encounter with one—but they’re also beguiling acts of mimicry and disguise. Sherman, who always enjoyed dressing up as imaginary characters, began photographing herself almost as an afterthought, and the result is a set of images that remain endlessly evocative. Each suggests a single enigmatic moment in an ongoing narrative, to the extent that you could use them as a collection of story prompts, like an eroticized version of The Mysteries of Harris Burdick, and even after thirty years, they’re still compelling and seductive.
Which brings me to an uncomfortable point about Sherman, one that doesn’t always get enough emphasis in critical discussions of her career: how shrewdly her earlier pictures utilize her own sexuality to draw male viewers into her work. Sherman was and remains an attractive woman, and it’s hard to imagine her photos troubling us in the way they do if her own face weren’t so naturally suited to being transformed into a fantasy object: sex kitten, ingenue, working girl, ice queen. Taken as a whole, these metamorphoses are troubling and mesmerizing, but when you view them in isolation, they slip uneasily into the same category of image they’re meant to satirize. Even as her work plays on clichés of feminine iconography, she seizes our attention by appealing to the same part of the brain. Two of her framed prints live in my office, so familiar by now that I barely even notice them, but if I had to explain why I put them there, I’d have to admit that the impulse isn’t that different from when I’d decorate my room as a teenager with album sleeves and magazine cutouts. I like the way they look, and when I was younger, they ended up on the cover of more than one mix tape. Sherman spoofs and subverts the idea of the female art object, but in the process, she turns herself into the very thing she’s trying to undermine—a pin-up for readers of Art in America.
In her later work, Sherman would assume increasingly grotesque and horrifying masks, and her recent run of clown pictures, for instance, isn’t something you’d hang on a dorm room wall. Yet there’s no denying that her early fame and continued appeal rest on the way in which her most famous works simultaneously embody and undermine their sources: it’s no accident that the recent collection of her Untitled Film Stills used a cover image of Sherman at her most glamorous. Which may be her most enduring lesson. So many works of art stick in the imagination because they oscillate between surface appeal and darker depths: Vertigo is both the ultimate Hitchcock thriller, with its Edith Head costumes and lush Bernard Herrmann score, and also the most psychologically complex of all Hollywood movies. Like its opening shot, it starts with the eye, then plunges in deeper, and much of its power comes from how we identify first with Scotty, then with Madeline and her unwilling Shermanesque transformation. Sherman, like Hitchcock, draws us in with the story she seems to be telling, then forces us to rethink why we care about such stories at all. (It’s also true of Kara Walker, the only other woman on the Vanity Fair list, whose paper cutouts and silhouettes evoke classic children’s illustration while confronting us with the horror of slavery.) This requires both enormous technical virtuosity and a curious kind of naiveté, which we see whenever Hitchcock and Sherman discuss their work. We’re left intrigued but unsettled, unsure of how much of our response comes from the art itself or from our own misreading of it. And in the end, we realize that we’re in the picture, too, just beyond the edges of the frame.
How to be ambiguous
Writers are generally advised to avoid ambiguity. Clarity, as E.B. White observes, may not be a substitute for merit in writing, but it’s as close as we can get, so it’s good form for authors to state things as clearly as they can. It’s certainly the best rule to follow if there’s any doubt. Yet this does nothing to explain the fact that many of the works of art that affect us so deeply—from Hamlet to Vertigo to, yes, Mad Men—are founded on ambiguity. As in the case of most masterpieces, these can be dangerous examples for a writer to follow, but they’re also very tempting. Great fiction survives in the imagination because of the constellation of questions it raises in the reader’s mind, and the problem of balancing such uncertainties with a narrative that remains clear from moment to moment is one of the most difficult issues for a writer to face. And it soon becomes obvious, after writing or reading a few examples, that ambiguous language is not the best way to create a larger superimposition of interpretations.
As usual, we can get some useful insights by looking at poetry, the leading edge of language, whose lessons and innovations tend to filter down centuries later into prose. Poetry is often seen as ambiguous or obscure, but when you examine the greatest poems line by line, you find that this is an effect generated by the resonance of highly specific images—nouns, verbs, and concrete adjectives, all intelligible in themselves but mysterious as a whole. Take, for instance, the poem that I.A. Richards has called “the most mysterious poem in English,” Shakespeare’s “The Phoenix and the Turtle.” Each stanza reads with crystal clarity, and often something more, but the result has been interpreted as everything from a Catholic allegory to a veiled reference to the relationship between Sir John Salusbury and Queen Elizabeth, and as it stands, it’s a puzzle without an answer. A prefatory note spelling it out would have avoided much of this confusion, but in the process, it would have destroyed the magic.
Which leads us to a very important point, which is that ambiguity is best created out of a network of specifics with one crucial piece removed. It’s often been observed, for instance, that much of the mystery of Shakespeare’s plays emerges from the fact that he omits part of his original source material while leaving other elements intact. In the original Amleth story, there’s no confusion about the reasons for the lead character’s madness: he believes that his uncle is plotting against his life, so in order to protect himself and mislead his enemies, he pretends to be an idiot. Hamlet takes away this detail—Claudius doesn’t seem particularly interested in killing Hamlet at all until after he starts to act like a lunatic—and creates a tantalizing ambiguity in the process. The same is true of King Lear, in which the original source more clearly explains the king’s reasons for putting his three daughters to the test. The resulting plays are filled with concrete language and action, but the mystery remains.
And this is true of many works of art. We never know the origins of Montresor’s murderous vendetta in “The Cask of Amontillado,” but the story itself is so detailed that it practically serves as a manual on how to wall a man up alive, even as Poe denies us the one piece of information that most writers would have included first. (If Poe were alive today, I suspect that his editor would have begged him to flesh out the backstory.) Vertigo is the most mysterious movie ever made, but on watching it again, I’m struck by how much of it is grounded in specifics—the mundane details of Scotty’s life, the beautiful but realistic San Francisco settings, the way his obsession with Madeline manifests itself in trips to salons and department stores. Ambiguity, in other words, is only effective when the story itself is concrete enough to convincingly support multiple interpretations, which, in practice, usually means an even greater attention to clarity and convincing detail than if the line of the narrative were perfectly clear. A map that contains a single path can afford to leave the rest of the territory blank, but if we’re going to find our way down more than one road, we’ll need a better sense of the landscape, even, or especially, if the landmarks lead us astray.
Making the impossible plausible
Ideally, all stories should consist of a series of events that arise organically from the characters and their decisions, based on a rigorous understanding of how the world really works. In practice, and especially in genre fiction, it doesn’t always happen that way. Sometimes a writer just has a plot development that he really wants to write, and isn’t entirely sure how to get there from here. Frequently he’ll construct a story out of a number of unrelated ideas, and needs to cobble them together in a way that will seem inevitable after the fact. And sometimes he’ll simply paint himself into a corner, or realize that he’s overlooked a monstrous plot hole, and wants to extricate himself in a way that leaves his dignity—and most of the earlier material—intact. A purist might say that the author should throw the story out and start again, proceeding more honestly from first principles, but a working writer doesn’t always have that luxury. Better, I think, to find a way of keeping the parts that work and smoothing over the connective bits that seem implausible or unconvincing, while keeping the reader immersed in the fictional dream. And in the spirit of faking it until you make it, I offer the following suggestions:
1. Make it a fait accompli. As I’ve mentioned before, a reader is much more likely to accept a farfetched narrative development if the characters take it for granted. Usually, this means putting the weakest link in your story offstage. My favorite example is from Some Like It Hot, an incredibly contrived movie that shrewdly refuses to show the most crucial moment in the entire plot: instead of giving us a scene in which the main characters decide to go on the run in drag, it just cuts to the two of them already in skirts, rushing across the platform to catch a train to Florida. The lesson, clearly, is that if something in your story is obviously impossible, it’s better to pretend that it’s already happened. And the best strategy of all is to push the most implausible element of your story outside the boundaries of the plot itself, so it’s already in place before the story begins, which is what I’ve previously called the anthropic principle of fiction. If the viewer doesn’t see something happen, it requires an additional mental effort to rewind the story to object to it, and by then, the plot and characters have already moved on. It’s best to make like Jack Lemmon in heels, and just run with it.
2. Tell, don’t show. Normally, we’re trained to depict a turning point in the action as vividly as possible, and are taught that it’s bad form to describe an important moment indirectly or leave it offstage. When it comes to a weak point in the plot, however, that sort of scrutiny can only raise questions. It’s smarter, instead, to break a cardinal rule of fiction and get it out of the way as unobtrusively as possible. An implausible conversation, for instance, might best be rendered as indirect dialogue, leaving readers to fill in a more convincing version themselves. And if you can’t dramatize something in a credible fashion, it might be best to summarize it, in the way dramatists use the arrival of a messenger to convey developments that would be impossible to stage, although it’s best to keep this sort of thing as short as you can. There’s a particularly gorgeous example in the novel The Silence of the Lambs. Thomas Harris shows us Hannibal Lecter’s escape from federal security in loving detail, but when it comes to the most astonishing element of his getaway—the fact that he peels off another man’s face and wears it like a mask—he lets Jack Crawford describe it after the fact in a few terse sentences. It’s still hard to buy, but it’s more acceptable than if he’d allowed us to question the action as it unfolded.
3. Use misdirection. The secret of sleight of hand is that the audience’s eye is naturally drawn to action, humor, and color, allowing the performer to conduct outrageous manipulations in plain sight. Similarly, when a story is engaging enough from moment to moment, the reader is less likely to object to inconsistencies and plot holes. A film like Inception, for instance—which is probably my favorite movie of the last fifteen years—is riddled with logical problems, and even the basic workings of its premise aren’t entirely consistent, but we’re having too much fun to care. In some ways, this is the most important point of all: when a work of art is entertaining, involving, and emotionally true, we’re more likely to forgive the moments when the plot creaks. Some of our greatest books and movies, like Vertigo, are dazzling precisely because they expend a huge amount of effort to convince us of premises that, if the artist proceeded with uncompromising logic, would never make it past a rough draft. Just remember, as Aristotle pointed out, that a convincing impossibility is preferable to an unconvincing possibility, and take it from there.
“There’s something we need to talk about…”
Note: This post is the fifty-first installment in my author’s commentary for Eternal Empire, covering Chapter 50. You can read the previous installments here.
Suspense is usually the most linear of genres, but a lot of thrillers include exactly one flashback. You know the one I mean: it comes near the end, just after the big twist, to explain precisely how you were fooled. In a heist movie, it frequently involves the revelation that the plan you thought the protagonists were following was actually something else entirely, and in films that are heavily dependent on fridge logic, it can reveal that much of the movie you believed you were watching was really an elaborate mislead. At its best, as with the unforgettable flashback that occurs two-thirds of the way through Vertigo, it can singlehandedly justify the whole concept of flashbacks in general; at its worst, in a movie like Now You See Me, it can leave you asking why you bothered taking any interest in the plot at all. And these reveals seem to be becoming more common, as the need to find new variations on old surprises has caused such plots to become ever more convoluted and implausible. (We’re at a point now where a single flashback scene isn’t enough: we’re treated to entire flashback montages, replaying what seems like half of the movie from a different point of view. When handled well, as in The Illusionist, this sort of thing can be delightful, but it can also leave a viewer feeling that the film hasn’t played fair with its obligation to mislead us with what it shows, rather than what it omits.)
This sort of flashback is obviously designed to save a surprise for the end of the movie, which is where we’ve been conditioned to expect it—even if some violence has to be done to the fabric of the narrative to put the reveal in the last ten minutes, instead of where it naturally occurred. This isn’t a new strategy. Jack Woodford, the pulp writer whose instructional book Trial and Error was carefully studied by Robert A. Heinlein, thought that all stories should end with a punch ending, and he offered a very useful tip on how to artificially create one:
This is why so many stories contrive to withhold crucial information until the point where it carries the most impact, even if it doesn’t quite play fair. (You frequently see this in the early novels of Frederick Forsyth, like The Odessa File or The Dogs of War, which leave out a key element of the protagonist’s motivation, only to reveal it at the climax or on the very last page. It’s such a good trick that you can almost forgive Forsyth for reusing it three or four times.)
Another advantage to delaying the explanatory scene for as long as possible is that it turns an implausible twist into a fait accompli. I’ve noted before that if there’s a particularly weak point in the story on which the credibility of the plot depends, the best strategy for dealing with it is to act as if it has already happened, and to insert any necessary justifications after you’ve presented the situation as blandly as possible. Readers or audiences are more likely to accept a farfetched plot development after it has already been taken for granted. If they had been allowed to watch it unfold from scratch, during the fragile early stages, they would have been more likely to object. (My favorite example is how in the two great American drag comedies, Some Like It Hot and Tootsie, we never see the main characters make the decision to pose as women—we cut to them already in makeup and heels, which mostly prevents us from raising any of the obvious objections.) This explains why the expository flashback, while often ludicrously detailed, rarely shows us the one scene that we really want to see: the conversation in which one character had to explain to the rest what he wanted them to do, and why. Even a classic twist ending like the one in The Sting falls apart when we imagine the characters putting it into words. The act of speaking the plan aloud would only destroy its magic.
I put these principles to good use in Chapter 50 of Eternal Empire, which rewinds the plot slightly to replay a crucial scene in its entirety. Structuring it as a flashback was clearly meant to preserve the surprise, but also to downplay its less plausible angles. For the story to work, Maddy had to reveal herself to Tarkovsky, justify her good intentions, and propose a complicated counterplot, all in the course of a single conversation. I think that the chapter does a decent job of pulling it off, but placing the discussion here, after the effects of the decision have already been revealed, relieves it of some of the weight. The reader is already invested in the premise, simply by reading the events of the preceding chapters, and I hoped that this would carry us past any gaps in the logic. But it’s worth noting that I never actually show the crux of the conversation, in which Maddy spells out the plan she has in mind. Asking a character to fake his death for the sake of some elaborate charade is a scene that can’t possibly play well—which might be why we almost never see it, even though a similar twist seems to lie at the bottom of half of the surprise endings ever written. We don’t hear Maddy telling Tarkovsky what she wants him to do; we just see the results. It’s a form of selective omission that goes a long way toward making it all acceptable. But as the reader will soon discover, the plan hasn’t gone quite as well as they think…
Written by nevalalee
April 21, 2016 at 9:09 am
Posted in Books, Writing
Tagged with Eternal Empire commentary, Frederick Forsyth, Jack Woodford, Now You See Me, Some Like It Hot, The Illusionist, The Sting, Tootsie, Trial and Error, Vertigo