Yesterday, while writing about the pitfalls of quotation in book reviews, I mentioned the famous smackdown that Martin Amis delivered to the novel Hannibal by Thomas Harris. When I went back to look up the lines I wanted to quote, I found myself reading the whole thing over again, just for the simple pleasure of it. It’s one of the great critical slams of all time, and it checks off most of the boxes that this kind of shellacking requires. Amis begins by listing a few hyperbolic claims made by other reviewers—“A momentous achievement,” “A plausible candidate for the Pulitzer Prize”—and then skewers them systematically. He comes at the novel, significantly, from a position of real respect: Amis calls himself “a Harris fan from way back.” Writing of the earlier books in the series, he says that Harris has achieved what every popular novelist hopes to accomplish: “He has created a parallel world, a terrible antiterra, airless and arcane but internally coherent.” When Amis quotes approvingly from these previous installments, it can only make Hannibal look worse by comparison, although Harris doesn’t do himself any favors:
[Lecter] has no need of “need”: Given the choice, he—and Harris—prefer to say “require”…Out buying weapons—or, rather, out “purchasing” weapons—he tells the knife salesman, “I only require one.” Why, I haven’t felt such a frisson of sheer class since I last heard room service say “How may I assist you?” And when Lecter is guilty of forgetfulness he says “Bother”—not “Shit” or “Fuck” like the rest of us. It’s all in the details.
Reading the review again, I realized that it falls squarely in the main line of epic takedowns that begins with Mark Twain’s “Fenimore Cooper’s Literary Offenses.” This is a piece that was probably ruined for a lot of readers by being assigned to them in high school, but it deserves a fresh look: it really is one of the funniest and most valuable essays about writing we have, and I revisit it every couple of years. Like Amis, Twain begins by quoting some of his target’s puffier critical encomiums: “The five tales reveal an extraordinary fullness of invention…The craft of the woodsman, the tricks of the trapper, all the delicate art of the forest were familiar to Cooper from his youth up.” (In response, Twain proposes the following rule: “That crass stupidities shall not be played upon the reader as ‘the craft of the woodsman, the delicate art of the forest’ by either the author or the people in the tale.”) Both Twain and Amis are eager to go after their subjects with a broadsword, but they’re also alert to the nuances of language. For Amis, it’s the subtle shading of pretension that creeps in when Harris writes “purchases” instead of “buys”; for Twain, it’s the distinction between “verbal” and “oral,” “precision” and “facility,” “phenomena” and “marvels,” “necessary” and “predetermined.” His eighteen rules of writing, deduced in negative fashion from Cooper’s novels, are still among the best ever assembled. He notes that one of the main requirements of storytelling is “that the personages in a tale shall be alive, except in the case of corpses, and that always the reader shall be able to tell the corpses from the others.” Which, when you think about it, is even more relevant in Harris’s case—although that’s a subject for another post.
I’ve learned a lot from these two essays, and it made me reflect on the bad reviews that have stuck in my head over the years. In general, a literary critic should err on the side of generosity, especially when it comes to his or her contemporaries, and a negative review of a first novel that nobody is likely to read is an expense of spirit in a waste of shame. But occasionally, a bad review can be just as valuable and memorable as any other form of criticism. I may not agree with James Wood’s feelings about John le Carré, but I’ll never forget how he sums up a passage from Smiley’s People as “a clever coffin of dead conventions.” Once a year or so, I’ll find myself remembering John Updike’s review of Tom Wolfe’s A Man in Full, which notes the author’s obsession with muscular male bodies—“the latissimi dorsi,” “the trapezius muscles”—and catalogs his onomatopoetics, which are even harder to take seriously when you have to type them all out:
“Brannnnng! Brannnnng! Brannnnng!,” “Woooo-eeeeeee! Hegh-heggghhhhhh,” “Ahhhhhhhhhhh ahhhhhhhhhhhh ahhhhhhhhhhh,” “Su-puerflyyyyyyyyyyyyyyyy!,” “eye eye eye eye eye eye eye eye eye,” “Scrack scrack scrack scraccckkk scraccccck,” “glug glug glug glugglugglug,” “Awriiighhhhhhhht!”
And half of my notions as a writer seem to have been shaped by a single essay by Norman Mailer, “Some Children of the Goddess,” in which he takes careful aim at most of his rivals from the early sixties. William Styron’s Set This House on Fire is “the magnum opus of a fat spoiled rich boy who could write like an angel about landscape and like an adolescent about people”; J.D. Salinger’s four novellas about the Glass family “seem to have been written for high-school girls”; and Updike himself writes “the sort of prose which would be admired in a writing course overseen by a fussy old nance.”
So what makes a certain kind of negative review linger in the memory long after the book in question has been forgotten? It often involves one major writer taking aim at another, which is already more interesting than the sniping of a critic who knows the craft only from the outside. In most cases, it picks on a potential competitor, which is a target worthy of the writer’s efforts. And there’s usually an undercurrent of wounded love: the best negative reviews, like the one David Foster Wallace wrote on Updike’s Toward the End of Time, reflect a real disillusionment with a former idol. (Notice, too, how so many of the same names keep recurring, as if Mailer and Updike and Wolfe formed a closed circle that runs forever, like a perpetual motion machine of mixed feelings.) Even when there’s no love lost between the critic and his quarry, as with Twain and Cooper, there’s a sense of anger at the betrayal of storytelling by someone who should know better. To return to poor Thomas Harris, I’ll never forget the New Yorker review by Anthony Lane that juxtaposed a hard, clean excerpt from The Silence of the Lambs:
“Lieutenant, it looks like he’s got two six-shot .38s. We heard three rounds fired and the dump pouches on the gunbelts are still full, so he may just have nine left. Advise SWAT it’s +Ps jacketed hollowpoints. This guy favors the face.”
With this one from Hannibal Rising:
“I see you and the cricket sings in concert with my heart.”
“My heart hops at the sight of you, who taught my heart to sing.”
Lane reasonably responds: “What the hell is going on here?” And that’s what all these reviews have in common—an attempt by one smart, principled writer to figure out what the hell is going on with another.
I don’t have the numbers to back this up, but I have a hunch that most professional writers rarely go back to reread their own work. In an interview with The Paris Review, the novelist François Mauriac puts his finger on why revisiting a published story can be such an unpleasant experience:
I only reread my books when I have to in correcting proofs. The publication of my complete works condemned me to this; it is as painful as rereading old letters. It is thus that death emerges from abstraction, thus we touch it like a thing: a handful of ashes, of dust.
The more you unpack this statement, the more insightful it becomes. Reading one of your published stories is like reading an old letter in several ways: it confronts you with the image of yourself when you were younger, it makes your mistakes more visible in hindsight, and it shows you how insidiously the present has turned into the dead past. It’s the fossilized remnant of a process that used to be alive, and as soon as a work of art is locked into its final form, you see all kinds of problems with it. This isn’t necessarily because you could do any better now, but because a story on the page always seems less interesting than it did in your head. When you’re experiencing the work of other writers, you rarely dwell on how else it might have been done, but when you’re reading your own stuff, it’s hard to think about anything else.
This kind of estrangement from a work to which you devoted so much time and energy is unbearably sad—or it would be, if the writer didn’t immediately move on to the next thing. And it explains why the rare story that you can enjoy for its own sake becomes so precious. Usually, it’s something that came fairly easily, as if you were simply transcribing a moment of inspiration that descended from somewhere higher up, or rose from the depths of the subconscious. Isaac Asimov called it “writing over my head,” saying: “I occasionally write better than I ordinarily do…When I reread one of these stories or passages, I find it hard to believe that I wrote it, and I wish ardently that I could write like that all the time.” (Asimov said that he cried whenever he reread the ending of his own story “The Ugly Little Boy.”) Alternatively, you can feel safely detached from one of your own works if you were operating as an artist for hire, without much of a personal stake in the result, but did your job at a high level of technical proficiency. Steven Spielberg has said that the only one of his movies that he can watch with his kids as if he hadn’t directed it, rather than remembering what it was like on the set each day, is Raiders of the Lost Ark. You can see why: it was George Lucas’s baby, and what Spielberg brought to the project was a matchless eye and a useful degree of distance from the material. And I’m not surprised that the result delights him as much as it does me.
When it comes to my own work, there’s almost nothing that I can read now for my own pleasure. Occasionally, like Mauriac, I’ll need to correct page proofs, and I always have to gather my courage a bit: you’re strictly limited in the number of changes you can make, and you can’t imperceptibly massage the text in the way you can when you’re fiddling with a draft in Word. Reviewing proofs shortly after you’ve finished a story is even worse than reading an old letter—it’s like encountering an ex-boyfriend or girlfriend soon after a breakup, when you realize that you’ll never be able to take back what happened. (Not every writer feels this way, and some, like James Joyce, notoriously rewrote entire sections of the manuscript in galley form. But I’ve always assumed that making extensive changes at this stage will only introduce unforeseen complications, so I try to restrict myself to altering a word or a punctuation mark here and there.) Even after my feelings have cooled and a story sits on the shelf like a dead thing, it’s hard for me to look at it again: it’s like being confronted with your irrevocable life choices all at once. And if I had to make a list of the bits and pieces of my fiction that I wouldn’t mind reading again, it represents a tiny slice of the whole: maybe “The Boneless One,” most of “Kawataro” and “The Whale God,” the second half of “Ernesto,” the closing summation in The Icon Thief, and the plane crash and tunnel chase in City of Exiles. That’s about it.
In most of these cases, I was writing over my head, either because I was following up on a good idea that seemed to come out of nowhere, or because I was able to subordinate myself to the mechanics of a plot that I’d already set in motion. And of all the pages I’ve published, Chapter 58 of Eternal Empire might be my favorite—which is to say, if you forced me to pick something to read again, it’s the one I’d probably choose. It isn’t the most complex or difficult thing I’ve written: once I knew that Wolfe and Ilya would team up to take down a dacha full of gangsters and save Maddy, it was mostly just a matter of not screwing it up. But I had a great time writing it, and I still have a good time reading the result. The confluence of names I mentioned above is part of the reason why: it’s one of the few occasions when I felt that I was writing fanfic for my own creations, not because I was indulging myself, but because it combined characters for a payoff that I never would have imagined when I wrote the first book in the series. It’s obviously indebted to scenes like the shootout at the Victory Motel in both the novel and the film versions of L.A. Confidential, and if Wolfe at the climax of City of Exiles slipped into the Clarice Starling of The Silence of the Lambs, she’s closer here to the Starling of Hannibal. It’s the finest moment for my favorite character in the trilogy, which is reason enough for me to like it. Throughout this entire author’s commentary, I’d been looking forward to writing about it, but now that I’m here, I find that I don’t have much to say except that I think it’s pretty damned good. And I’m going back to read it again now…
Note: I’m on vacation this week, so I’ll be republishing a few of my favorite posts from earlier in this blog’s run. This post originally appeared, in a slightly different form, on April 22, 2014.
Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.
The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so your reaction is often how Roger Ebert felt upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.
As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.
At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the great pleasures of watching Hannibal lay in how it learned to acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history for it to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. Fargo remains the most fascinating drama on television in large part because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.
Note: Spoilers follow for the book and miniseries And Then There Were None.
Over the weekend, my wife and I caught up with the recent BBC adaptation of And Then There Were None, which aired in two parts last week on Lifetime. It’s a nice, overwrought version of Agatha Christie’s story, faithful to the novel in its outlines but cheerfully willing to depart from it in the details, and I liked it a lot. (I particularly enjoyed Maeve Dermody’s swift descent from an Emily Blunt lookalike to something like a crazy cat lady, complete with dark circles under both eyes.) And it also gives me an excuse to revisit the weirdest novel ever to sell one hundred million copies. The book reads like Christie’s attempt to see how far she could push her classic formula—a series of baffling murders in a closed setting—without alienating her audience, and as clinical as the result often feels, readers have never ceased to respond to it: by any reckoning, it’s the bestselling mystery novel of all time. With every single character serving in turn as bystander, suspect, and victim, it takes this sort of novel to its limit, and it incidentally discovers how few of the standard elements are necessary. There isn’t a sympathetic protagonist in sight, or even a detective. As Sarah Phelps, who wrote the miniseries, observes in a perceptive interview:
Within the Marple and Poirot stories somebody is there to unravel the mystery, and that gives you a sense of safety and security, of predicting what is going to happen next…In this book that doesn’t happen—no one is going to come to save you, absolutely nobody is coming to help or rescue or interpret.
In other words, the puzzle itself is the star, just as the plot is the hero in most science fiction—a genre that often overlaps with this sort of mystery. (And Then There Were None was published just a year after “Who Goes There?” by John W. Campbell, which tells much the same story, except with a shapeshifting alien as the villain.) Watching Noah Taylor in the role of the sinister servant who places the ten figurines on the table, I joked that he was playing the Tim Curry part, but there’s a hint of truth there: Christie emphasized the gamelike aspects of the genre long before there was anything like Clue, and she plants the seeds of her own future parodies so consciously that there’s hardly any point in mocking those conventions. And Then There Were None is structured like the five-minute mysteries that contemporary readers probably know best through the likes of Encyclopedia Brown: after the last victim dies, there’s a convenient summary of the relevant facts by two bewildered cops at Scotland Yard, followed by what amounts to a sealed bonus chapter with the killer’s confession, complete with a list of the clues that the reader might have missed. As the murderer writes: “It was my ambition to invent a murder mystery that no one could solve.” And if we had any doubt about the identification of the killer with Christie herself, this should put it to rest. Christie is the murderer, even if she appears in the story under a different face and name.
This, I think, is why the original novel has always been such a spectacular success: it gets closer than any other to the uneasy way in which the author and the killer, rather than the detective, turn out to be one and the same. Christie’s guilty party is one of the earliest exemplars of a character type that we recognize from John Doe in Seven, Jigsaw in the Saw movies, and even Christopher Nolan’s version of the Joker: the killer whose control of the story is so complete that he can’t be separated from the screenwriter. In my discussion of the television series Hannibal, I noted that it sometimes seemed as if Lecter himself was in the writers room, or dictating material to Thomas Harris: he was so adept at manipulating the men and women around him that he practically became the showrunner. If the detective in a mystery novel is a surrogate for the reader, who approaches the text as a series of clues, the killer can only be the writer, and by removing the detective from the story entirely, Christie makes this identity even more explicit. We’re cast in the part of an invisible sleuth, moving unseen on the island as the victims are eliminated one by one, with Christie as our ice-cold antagonist, seated at the other end of the board. (The writer selects her victims as carefully as the killer does: note that all the characters are childless and—except for the servant couple—unmarried, which allows them to be dispatched with a minimum of regret.)
And those ten figures on the dining table aren’t there by accident. They’re tokens in the game that Edward FitzGerald describes in The Rubaiyat of Omar Khayyam:
But helpless pieces of the game he plays
Upon this chequerboard of nights and days;
Hither and thither moves, and checks, and slays,
And one by one back in the closet lays.
Christie certainly knew that verse: it appears only a few lines before the stanza that she used a few years later for the title of her novel The Moving Finger. And Then There Were None confirmed her as the genre’s ultimate chess master, and one of the pleasures in reading it again comes from our knowledge of how cunningly she uses the elements of the novel itself—like the third person omniscient point of view—to mislead and ensnare us. (That’s one way in which the miniseries, for all its cleverness, can’t match the novel: Christie moves in and out of the heads of her characters, including the killer, without cheating. A televised version of the same story only has to concern itself with the surfaces, which makes its job relatively easy.) Christie tricked us here in ways that can’t be reproduced, regardless of how many other works have copied its central twist. Mysteries come and go, but And Then There Were None is where the genre begins and ends. And there can only be one.
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What show did you stop watching after a character was killed off?”
Inside Out is an extraordinary film on many levels, but what I appreciated about it the most was the reminder it provides of how to tell compelling stories on the smallest possible scale. The entire movie turns on nothing more—or less—than a twelve-year-old girl’s happiness. Riley is never in real physical danger; it’s all about how she feels. These stakes might seem relatively low, but as I watched it, I felt that the stakes were infinite, and not just because Riley reminded me so much of my own daughter. By the last scene, I was wrung out with emotion. And I think it stands as the strongest possible rebuke to the idea, so prevalent at the major studios, that mainstream audiences will only be moved or excited by stories in which the fate of the entire world hangs in the balance. As I’ve noted here before, “Raise the stakes” is probably the note that writers in Hollywood get the most frequently, right up there with “Make the hero more likable,” and its overuse has destroyed their ability to make such stories meaningful. When every superhero movie revolves around the fate of the entire planet, the death of six billion people can start to seem trivial. (The Star Trek reboot went there first, but even The Force Awakens falls into that trap: it kills off everyone on the Hosnian System for the sake of a throwaway plot point, and it moves on so quickly that it casts a pall over everything that follows.)
The more I think about this mindless emphasis on raising the stakes, the more it strikes me as a version of a phenomenon I’ve discussed a lot on this blog recently, in which big corporations tasked with making creative choices end up focusing on quantifiable but irrelevant metrics, at the expense of qualitative thinking about what users or audiences really need. For Apple, those proxy metrics are thinness and weight; for longform journalism, it’s length. And while “raising the stakes” isn’t quite as quantitative, it sort of feels that way, and it has the advantage of being the kind of rule that any midlevel studio employee can apply with minimal fear of being wrong. (It’s only when you aggregate all those decisions across the entire industry that you end up with movies that raise the stakes so high that they turn into weightless abstractions.) Saying that a script needs higher stakes is the equivalent of saying that a phone needs to be thinner: it’s a way to involve the maximum number of executives in the creative process who have no business being there in the first place. But that’s how corporations work. And the fact that Pixar has managed to avoid that trap, if not always, then at least consistently enough for the result to be more than accidental, is the most impressive thing about its legacy.
A television series, unlike a studio franchise, can’t blow up the world on a regular basis, but it can do much the same thing to its primary actors, who are the core building blocks of the show’s universe. As a result, the unmotivated killing of a main character has become television’s favorite way of raising the stakes—although by now, it feels just as lazy. As far as I can recall, I’ve never stopped watching a show solely because it killed off a character I liked, but I’ve often given up on a series, as I did with 24 and Game of Thrones and even The Vampire Diaries, when it became increasingly clear that it was incapable of doing anything else. Multiple shock killings emerge from a mindset that is no longer able to think itself into the lives of its characters: if you aren’t feeling your own story, you have no choice but to fall back on strategies for goosing the audience that seem to work on paper. But almost without exception, the seasons that followed would have been more interesting if those characters had been allowed to survive and develop in honest ways. Every removal of a productive cast member means a reduction of the stories that can be told, and the temporary increase in interest it generates doesn’t come close to compensating for that loss. A show that kills characters with abandon is squandering narrative capital and mortgaging its own future, so it’s no surprise if it eventually goes bankrupt.
A while back, Bryan Fuller told Entertainment Weekly that he had made an informal pledge to shun sexual violence on Hannibal, and when you replace “rape” with “murder,” you get a compelling case for avoiding gratuitous character deaths as well:
There are frequent examples of exploiting rape as low-hanging fruit to have a canvas of upset for the audience…“A character gets raped” is a very easy story to pitch for a drama. And it comes with a stable of tropes that are infrequently elevated dramatically, or emotionally. I find that it’s not necessarily thought through in the more common crime procedurals. You’re reduced to using shorthand, and I don’t think there can be a shorthand for that violation…And it’s frequently so thinly explored because you don’t have the real estate in forty-two minutes to dig deep into what it is to be a victim of rape…All of the structural elements of how we tell stories on crime procedurals narrow the bandwidth for the efficacy of exploring what it is to go through that experience.
And I’d love to see more shows make a similar commitment to preserving their primary cast members. I’m not talking about character shields, but about finding ways of increasing the tension without taking the easy way out, as Breaking Bad did so well for so long. Death closes the door on storytelling, and the best shows are the ones that seem eager to keep that door open for as long as possible.
At this point, it might seem like there’s nothing new to say—at least by me—about The Silence of the Lambs. I’ve discussed both the book and the movie here at length, and I’ve devoted countless posts to unpacking Hannibal Lecter’s most recent televised incarnation. Yet like all lasting works of art, and I’d argue that both the novel and the film qualify, The Silence of the Lambs continues to reveal new aspects when seen from different angles, especially now that exactly a quarter of a century has gone by since the movie’s release. Watching it again today, for instance, it’s hard not to be struck by how young Clarice Starling really is: Jodie Foster was just twenty-eight when the film was shot, and when I look at Starling from the perspective of my middle thirties, she comes off as simultaneously more vulnerable and more extraordinary. (I have an uneasy feeling that it’s close to the way Jack Crawford, not to mention Lecter, might have seen her at the time.) And it only highlights her affinities to Buffalo Bill’s chosen prey. This isn’t exactly a revelation: that sense of a dark sisterhood is a pivotal plot point in the original story. But it’s one thing to grasp this intellectually and quite another to go back and see how cannily the movie casts actresses as Bill’s victims who subtly suggest Foster’s own facial features, just a little wider. And it’s more clear than ever how Foster’s early fame, her passage into movies like Taxi Driver, her strange historical linkage to a stalker and failed assassin, and her closely guarded personal life gave her the tools and aura to evoke Starling’s odd mixture of toughness and fragility.
What’s also obvious now, unfortunately, is the extent to which Starling was—and remains—an anomaly in the genre. Starling, as embodied by Foster, has inspired countless female leads in thrillers in the decades since. (When I found myself obliged to create a similar character for my own novels, my thoughts began and ended with her.) Yet aside from Dana Scully, the results have been less than memorable. Starling has always been eclipsed by the shadow of the monster in the cell beside her, but in many ways, she was a harder character to crack, and the fact that she works so well in her written and cinematic incarnations is the result of an invisible, all but miraculous balancing act. None of the later efforts in the same direction have done as well. Christopher McQuarrie, while discussing the characters played by Emily Blunt in Edge of Tomorrow and Rebecca Ferguson in Mission: Impossible—Rogue Nation, gets close to the heart of the challenge:
They’re not men. They’re women that are not trying to be men…To me, more than anything, Rebecca is mature, elegant, confident, and at peace. Her only vulnerability in the movie is she’s just as fucked as everybody else…Usually when you want to create vulnerability for a woman, it’s about giving her a neurosis—a fear or some emotional arc that, ultimately, gets the better of her, whether it’s a need for revenge or need for redemption. You know, “Her father was killed by a twister, so she has to defeat twisters no matter what,” and I wouldn’t have any of that either. It simply was: you’re here on your own terms and you’re in a shitty situation created by people in power above you. How do you escape this situation and maintain your dignity?
Which isn’t to say that Starling didn’t suffer from her share of father issues. But those last two sentences capture her appeal as well as any I’ve ever read.
Time also offers some surprising perspectives on Lecter himself, or at least the version of him we see here. The Silence of the Lambs, like Rocky, is one of those classic movies that has been diminished in certain respects by our knowledge of the sequels that followed it. Conventional wisdom holds that Anthony Hopkins’s take on Lecter became broader and more self-indulgent with every installment, and it’s fashionable to say that the best version of the character was really Brian Cox in Manhunter, or, more plausibly, Mads Mikkelsen on Hannibal. It’s a seductively contrarian argument, but it’s also inherently ridiculous. As great as the novel is, we probably wouldn’t be talking about Lecter or Thomas Harris or The Silence of the Lambs at all if it weren’t for Hopkins’s performance. And in many ways, it’s his facile, even superficial interpretation of the character that made the result so potent. Hopkins was discovered and mentored by Laurence Olivier, whom he understudied in August Strindberg’s Dance of Death, and it helps to view his approach to Lecter through the lens of the quote from Olivier that I cited here the other week: “I’m afraid I do work mostly from the outside in. I usually collect a lot of details, a lot of characteristics, and find a creature swimming about somewhere in the middle of them.” Hopkins’s creature is the finest example I know of a classically trained stage lion slumming it in a juicy genre part, and even if it wasn’t a particularly difficult performance once Hopkins figured out the voice, still—he figured out that voice.
And once we acknowledge, or even embrace, the degree to which Lecter is a fantasy that barely survives twelve minutes onscreen, this approach starts to seem like a perfectly valid solution to this dance of death. If Lecter seemed increasingly hammy and unconvincing in the movie versions of Hannibal and Red Dragon, that isn’t a failure on Hopkins’s part: making him the main attraction only brought out the artificiality and implausibility that had been there all along, and Hopkins just did what any smart actor would have done under the circumstances—take the money and try to salvage his own sense of fun. (As it happens, Ted Tally’s script for Red Dragon is surprisingly good, a thoughtful, inventive approach to tough material that was let down by the execution. If I had to choose, I’d say he did a better job on the page than Bryan Fuller ultimately did with the same story.) With the passage of time, it’s increasingly clear that Lecter falls apart even as you look at him, and that he’s a monster like the shark in Jaws or the dinosaurs that would follow two years later in Jurassic Park: they’re only convincing when glimpsed in flashes or in darkness, and half of the director’s art lies in knowing when to cut away. Put him front and center, as the sequels did, and the magic vanishes. Asking why Hopkins is so much more effective in The Silence of the Lambs than in the films that followed is like asking why the computer effects in Jurassic Park look better than their equivalents today: it isn’t about technology or technique, but about how the film deploys them to solve particular problems. Twelve minutes over twenty-five years is about as much scrutiny as Hopkins’s wonderful Lecter could sustain. And the rest, as they say, should have been silence.
Note: Spoilers follow for the season finale of Hannibal.
When it comes to making predictions about television shows, my track record is decidedly mixed. I was long convinced, for instance, that Game of Thrones would figure out a way to keep Oberyn Martell around, just because he was such fun to watch, and to say I was wrong about this is something of an understatement. Let the record show, however, that I said here months ago that the third season of Hannibal would end with Will Graham getting a knife through his face:
In The Silence of the Lambs, Crawford says that Graham’s face “looks like damned Picasso drew it.” None of the prior cinematic versions of this story have dared to follow through on this climax, but I have a feeling, given the evidence, that Fuller would embrace it. Taking Hugh Dancy’s face away, or making it hard to look at, would be the ultimate rupture between the series and its viewers. Given the show’s cancellation, it may well end up being the very last thing we see. It would be a grim note on which to end. But it’s nothing that this series hasn’t taught us to expect.
This wasn’t the hardest prediction in the world to make. One of the most distinctive aspects of Bryan Fuller’s take on the Lecter saga is his willingness to pursue elements of the original novels that other adaptations have avoided, and the denouement of Red Dragon—with Will lying alone, disfigured, and mute in the hospital—is a downer ending that no other version of this story has been willing to touch.
Of course, that wasn’t what we got here, either. Instead of Will in his hospital bed, brooding silently on the indifference of the natural world to murder, we got a hysterical ballet of death, with Will and Hannibal teaming up to dispatch Dolarhyde like the water buffalo at the end of Apocalypse Now, followed by an operatic plunge over the edge of a cliff, with our two star-crossed lovers locked literally in each other’s arms. And it was a worthy finale for a series that has seemed increasingly indifferent to anything but that unholy love story. The details of Lecter’s escape from prison are wildly implausible, and whatever plan they reflect is hilariously undercooked, even for someone like Jack Crawford, who increasingly seems like the world’s worst FBI agent in charge. Hannibal has never been particularly interested in its procedural elements, and its final season took that contempt to its final, ludicrous extreme. In the novel Red Dragon, Will, despite his demons, is a competent, inspired investigator, and he’s on the verge of apprehending Dolarhyde through his own smarts when his quarry turns the tables. In Fuller’s version, unless I missed something along the way, Will doesn’t make a single useful deduction or take any meaningful action that isn’t the result of being manipulated by Hannibal or Jack. He’s a puppet, and dangerously close to what TV Tropes has called a Woobie: a character whom we enjoy seeing tortured so we can wish the pain away.
None of this should be taken as a criticism of the show itself, in which any narrative shortcomings can hardly be separated from Fuller’s conscious decisions. But as enjoyable as the series has always been—and I’ve enjoyed it more than any network drama I’ve seen in at least a decade—it’s something less than an honest reckoning with its material. As a rule of thumb, the stories about Lecter, including Harris’s own novels, have been the most successful when they stick most closely to their roots as police procedurals. Harris started his career as a crime reporter, and his first three books, including Black Sunday, are masterpieces of the slow accumulation of convincing detail, spiced and enriched by a layer of gothic violence. When you remove that foundation of realistic suspense, you end up with a character who is dangerously uncontrollable: it’s Lecter, not Harris, who becomes the author of his own novel. In The Annotated Dracula, Leslie S. Klinger proposes a joke theory that the real author of that book is Dracula himself, who tracked down Bram Stoker and forced him to make certain changes to conceal the fact that he was alive and well and living in Transylvania. It’s an “explanation” that rings equally true of the novels Hannibal and Hannibal Rising, which read suspiciously as if Lecter were dictating elements of his own idealized autobiography to Harris. (As far as I know, nobody has seen or heard from Harris since Hannibal Rising came out almost a decade ago. Are we sure he’s all right?)
And there are times when Hannibal, the show, plays as if Lecter had gotten an executive producer credit sometime between the second and third seasons. If anything, this is a testament to his vividness: when properly acted and written, he dominates his stories to a greater extent than any fictional character since Sherlock Holmes. (In fact, the literary agent hypothesis—in which the credited writer of a series is alleged to be simply serving as a front—originated among fans of Conan Doyle, who often seemed bewildered by the secondary lives his characters assumed.) But there’s something unsettling about how Lecter inevitably takes on the role of a hero. My favorite stretch of Hannibal was the back half of the second season, which looked unflinchingly at Lecter’s true nature as a villain, cannibal, and destroyer of lives. When he left the entire supporting cast to bleed slowly to death at the end of “Mizumono,” it seemed impossible to regard him as an appealing figure ever again. And yet here we are, with an ending that came across as the ultimate act of fan service in a show that has never been shy about appealing to its dwindling circle of devotees. I can’t exactly blame it for this, especially because the slow dance of seduction between Will and Hannibal has always been a source of sick, irresistible fascination. But we’re as far as ever from an adaptation that would force us to honestly confront why we’re so attached to a man who eats other people, or why we root for him to triumph over lesser monsters who make the mistake of not being so rich, cultured, or amusing. Lecter came into this season like a lion, but he went out, as always, like a lamb.