Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Arthur Conan Doyle’

Mycroft Holmes and the mark of genius

Sidney Paget illustration of Mycroft Holmes

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on November 2, 2016.

“Original discoveries cannot be made casually, not by anyone at any time or anywhere,” the great biologist Edward O. Wilson writes in Letters to a Young Scientist. “The frontier of scientific knowledge, often referred to as the cutting edge, is reached with maps drawn by earlier scientists…Somewhere in these vast unexplored regions you should settle.” This seems like pretty good career advice for scientists and artists alike. But then Wilson makes a striking observation:

But, you may well ask, isn’t the cutting edge a place only for geniuses? No, fortunately. Work accomplished on the frontier defines genius, not just getting there. In fact, both accomplishments along the frontier and the final eureka moment are achieved more by entrepreneurship and hard work than by native intelligence. This is so much the case that in most fields most of the time, extreme brightness may be a detriment. It has occurred to me, after meeting so many successful researchers in so many disciplines, that the ideal scientist is smart only to an intermediate degree: bright enough to see what can be done but not so bright as to become bored doing it.

At first glance, this may not seem all that different from Martin A. Schwartz’s thoughts on the importance of stupidity: “Productive stupidity means being ignorant by choice.” In fact, they’re two separate observations—although they turn out to be related in one important respect. Schwartz is talking about “absolute stupidity,” or our collective ignorance in the face of the unknown, and he takes pains to distinguish it from the “relative stupidity” that differentiates students in the same college classes. And while Wilson isn’t talking about relative stupidity here, exactly, he’s certainly discussing relative intelligence, or the idea that the best scientists might be just a little bit less bright than their smartest peers in school. As he goes on to observe:    

What, then, of certified geniuses whose IQs exceed 140, and are as high as 180 or more? Aren’t they the ones who produce the new groundbreaking ideas? I’m sure some do very well in science, but let me suggest that perhaps, instead, many of the IQ-brightest join societies like Mensa and work as auditors and tax consultants. Why should the rule of optimum medium brightness hold? (And I admit this perception of mine is only speculative.) One reason could be that IQ geniuses have it too easy in their early training. They don’t have to sweat the science courses they take in college. They find little reward in the necessarily tedious chores of data-gathering and analysis. They choose not to take the hard roads to the frontier, over which the rest of us, the lesser intellectual toilers, must travel.

Marilyn vos Savant

In other words, the real geniuses are reluctant to take on the voluntary stupidity that science demands, and they’re more likely to find sources of satisfaction that don’t require them to constantly confront their own ignorance. This is a vast generalization, of course, but it seems to square with experience. I’ve met a number of geniuses, and what many of them have in common is a highly pragmatic determination to make life as pleasant for themselves as possible. Any other decision, in fact, would call their genius into doubt. If you can rely unthinkingly on your natural intelligence to succeed in a socially acceptable profession, or to minimize the amount of work you have to do at all, you don’t have to be a genius to see that this is a pretty good deal. The fact that Marilyn vos Savant—who allegedly had the highest tested intelligence ever recorded—became a columnist for Parade might be taken as a knock against her genius, but really, it’s the most convincing proof of it that I can imagine. The world’s smartest person should be more than happy to take a cushy gig at a Sunday supplement magazine. Most of the very bright endure their share of miseries during childhood, and their reward, rather than more misery, might as well be an adult life that provides intellectual stimulation in emotional safety. This is why I’ve always felt that Mycroft Holmes, Sherlock’s smarter older brother, knew exactly how his genius ought to be used. As Sherlock notes drily in “The Adventure of the Bruce-Partington Plans”: “Mycroft draws four hundred and fifty pounds a year, remains a subordinate, has no ambitions of any kind, will receive neither honor nor title, but remains the most indispensable man in the country.” 

Yet it’s Sherlock, who was forced to leave the house to find answers to his problems, whom we love more. (He’s also been held up as an exemplar of the perfect scientist.) Mycroft is hampered by both his physical laziness and his mental quickness: when a minister comes to him with a complicated problem involving “the Navy, India, Canada, and the bimetallic question,” Mycroft can provide the answer “offhand,” which doesn’t give him much of an incentive to ever leave his office or the Diogenes Club. As Holmes puts it in “The Greek Interpreter”:

You wonder…why it is that Mycroft does not use his powers for detective work. He is incapable of it…I said that he was my superior in observation and deduction. If the art of the detective began and ended in reasoning from an armchair, my brother would be the greatest criminal agent that ever lived. But he has no ambition and no energy. He will not even go out of his way to verify his own solution, and would rather be considered wrong than take the trouble to prove himself right.

Mycroft wasn’t wrong, either. He seems to have lived a very comfortable life. But it’s revealing that Conan Doyle gave the real adventures to the brother with the slightly less scintillating intelligence. In art, just as in science, technical facility can prevent certain artists from making real discoveries. The ones who have to work at it are more likely to find something real. But we can also raise a glass to Mycroft, Marilyn, and the geniuses who are smart enough not to make it too hard on themselves.

The manufacturers of worlds

For the last few days, as part of a deliberate break from writing, I’ve been browsing contentedly through my favorite book, The Annotated Sherlock Holmes by William S. Baring-Gould. It was meant to be a comforting read that was as far removed from work as possible, but science fiction, unsurprisingly, can’t seem to let me go. Yesterday, I was looking over The Sign of the Four when I noticed a line that I’ve read countless times without really taking note of it. As Holmes leaves Baker Street to pursue a line of the investigation, he says to Watson, who has remained behind: “Let me recommend this book—one of the most remarkable ever penned. It is Winwood Reade’s Martyrdom of Man. I shall be back in an hour.” Toward the end of the novel, speaking of the difficulty in predicting what any given human being will do, Holmes elaborates:

Winwood Reade is good upon the subject…He remarks that, while the individual man is an insoluble puzzle, in the aggregate he becomes a mathematical certainty. You can, for example, never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. So says the statistician.

This is remarkably like what Isaac Asimov writes of psychohistory, a sociological version of the ideal gas law that can predict the future based on the existence of a huge number—perhaps in the trillions—of individual lives. And it seemed worth checking to see if this passage could cast any light on the origins of the imaginary science that I’ve spent so much time exploring.

It pains me to say that Holmes himself probably wasn’t a direct influence on the Foundation series. There was a considerable overlap between Sherlockians and science fiction writers—prominent members of both camps included Anthony Boucher, Poul Anderson, Fletcher Pratt, and Manly Wade Wellman—but John W. Campbell wasn’t among them, and Asimov was drafted only reluctantly into the Baker Street Irregulars. (He writes in I. Asimov: “Conan Doyle was a slapdash and sloppy writer…I am not really a Holmes enthusiast.”) For insight, we have to go back to Winwood Reade himself, a British historian, explorer, and correspondent of Charles Darwin whose discussion of the statistical predictability of the human race appears, interestingly, in an argument against the efficacy of prayer. Here’s the full passage from The Martyrdom of Man, which was published in 1872:

All phenomena, physical and moral, are subject to laws as invariable as those which regulate the rising and setting of the sun. It is in reality as foolish to pray for rain or a fair wind as it would be to pray that the sun should set in the middle of the day. It is as foolish to pray for the healing of a disease or for daily bread as it is to pray for rain or a fair wind. It is as foolish to pray for a pure heart or for mental repose as it is to pray for help in sickness or misfortune. All the events which occur upon the earth result from Law: even those actions which are entirely dependent on the caprices of the memory, or the impulse of the passions, are shown by statistics to be, when taken in the gross, entirely independent of the human will. As a single atom, man is an enigma; as a whole, he is a mathematical problem. As an individual, he is a free agent; as a species, the offspring of necessity.

At the end of the book, Reade takes his own principles to their logical conclusion, becoming, in effect, an early writer of science fiction. Its closing section, “Intellect,” sketches out a universal history that anticipates Toynbee, but Reade goes further: “When we understand the laws which regulate the complex phenomena of life, we shall be able to predict the future as we are already able to predict comets and eclipses and planetary movements.” He describes three inventions that he believes will lead to an era of global prosperity:

The first is the discovery of a motive force which will take the place of steam, with its cumbrous fuel of oil or coal; secondly, the invention of aerial locomotion which will transport labour at a trifling cost of money and of time to any part of the planet, and which, by annihilating distance, will speedily extinguish national distinctions; and thirdly, the manufacture of flesh and flour from the elements by a chemical process in the laboratory, similar to that which is now performed within the bodies of the animals and plants.

And after rhapsodizing over the utopian civilization that will result—in which “poetry and the fine arts will take that place in the heart which religion now holds”—he turns his thoughts to the stars:

And then, the earth being small, mankind will migrate into space, and will cross the airless Saharas which separate planet from planet, and sun from sun. The earth will become a Holy Land which will be visited by pilgrims from all the quarters of the universe. Finally, men will master the forces of nature; they will become themselves architects of systems, manufacturers of worlds. Man then will be perfect; he will then be a creator; he will therefore be what the vulgar worship as a god.

Reade was inevitably seen as an atheist, and although he didn’t like the label, he inclined many readers in that direction, as he did in one of the most interesting episodes in this book’s afterlife. The scene is World War II, which tested the idea of psychohistory to its limit, and the speaker is the author of the memoir The Enchanted Places:

The war was on. I was in Italy. From time to time [my father] used to send me parcels of books to read. In one of them were two in the Thinker’s Library series: Renan’s The Life of Jesus and Winwood Reade’s The Martyrdom of Man. I started with The Life of Jesus and found it quite interesting; I turned to The Martyrdom and found it enthralling…There was no God. God had not created Man in His own image. It was the other way round: Man had created God. And Man was all there was. But it was enough. It was the answer, and it was both totally convincing and totally satisfying. It convinced and satisfied me as I lay in my tent somewhere on the narrow strip of sand that divides Lake Comacchio from the Adriatic; and it has convinced and satisfied me ever since.

I wrote at once to my father to tell him so and he at once wrote back. And it was then that I learned for the first time that these were his beliefs, too, and that he had always hoped that one day I would come to share them…So he had sent me The Martyrdom. But even then he had wanted to play absolutely fair, and so he had added The Life of Jesus. And then he had been content to leave the verdict to me. Well, he said, the church had done its best. It had had twenty-four years’ start—and it had failed.

The author adds: “If I had to compile a list of books that have influenced my life, high on the list would undoubtedly be Winwood Reade’s The Martyrdom of Man. And it would probably be equally high on my father’s list too.” The father in question was A.A. Milne. And the son was named Christopher Robin.

Quote of the Day

It is one of those cases where the art of the reasoner should be used rather for the sifting of details than for the acquiring of fresh evidence. The tragedy has been so uncommon, so complete and of such personal importance to so many people, that we are suffering from a plethora of surmise, conjecture, and hypothesis. The difficulty is to detach the framework of fact—of absolute undeniable fact—from the embellishments of theorists and reporters. Then, having established ourselves upon this sound basis, it is our duty to see what inferences may be drawn and what are the special points upon which the whole mystery turns.

Arthur Conan Doyle, “Silver Blaze”

Written by nevalalee

December 6, 2017 at 7:30 am

My ten great books #1: The Annotated Sherlock Holmes

The Annotated Sherlock Holmes

Note: Four years ago, I published a series of posts here about my ten favorite works of fiction. Since then, the list has evolved, as all such rankings do, and this seems like a good time to revisit it. (I’m not including any science fiction, which I hope to cover in a separate feature later this year.) I’ll be treating them in the order of their original publication, but as it happens, we’ll be starting today with the book I love the most.

I first encountered the best book in the world in the library of St. John’s College in Annapolis, Maryland. At the time, I was seventeen, and of course I was already in love with Sherlock Holmes—I’d even been exposed to the subculture of obsessive Holmes fans through the wonderful anthology A Baker Street Dozen, which I still think is the most inviting introduction to the subject for the general reader. What I found in The Annotated Sherlock Holmes by William S. Baring-Gould was something much more: an entire universe of speculation, whimsy, and longing grown from the rich soil of Arthur Conan Doyle’s original stories. As the narrator relates in Borges’s “Tlön, Uqbar, Orbis Tertius”:

Two years before I had discovered…a superficial description of a nonexistent country; now chance afforded me something more precious and arduous. Now I held in my hands a vast methodical fragment of an unknown planet’s entire history, with its architecture and its playing cards, with the dread of its mythologies and the murmur of its languages, with its emperors and its seas, with its minerals and its birds and its fish, with its algebra and its fire, with its theological and metaphysical controversy. And all of it articulated, coherent, with no visible doctrinal intent or tone of parody.

The rules of the game were simple. Holmes, Watson, Mycroft, and the other vivid figures who populated their slice of London had been real men and women; Conan Doyle had been Watson’s literary agent; and the stories were glimpses into a larger narrative that could be reconstructed with enough patience and ingenuity. Given the scraps of information that they provided, you could figure out which building had been the model for 221B Baker Street; piece together the details of Watson’s military record, the location of his war wound, and the identities of his three, or perhaps four, wives; determine the species of the speckled band and whether “The Adventure of the Three Students” took place at Oxford or Cambridge; and pin down, with considerable accuracy, when and where each of the other adventures took place, even as Watson, or Conan Doyle, tried to divert you with “mistakes” that were deliberate misleads or red herrings.

The result of Baring-Gould’s work, which collects nearly a century’s worth of speculation into one enormous, handsomely illustrated volume, is the first book I’d save if I could own only one, and for years, it’s been living on my desk, both as a source of inspiration and as a convenient laptop stand. (Leslie Klinger’s more recent edition is lovely as well, but Baring-Gould will always be closest to my heart.) And it’s taken me a long time to realize why I care about this book so much, aside from the obvious pleasure it affords. It represents a vision of the world, and of reading, that I find immensely seductive. Each story, and often each sentence, opens onto countless others, and if Conan Doyle didn’t mean for his work to be subjected to this level of scrutiny, that’s even better: it allows us to imagine that we aren’t following a trail of clues that the author meant for us to find, but discovering something that was invisibly there all along. “Never has so much been written by so many for so few,” as the great Sherlockian Christopher Morley once said, and it’s true. All these studies are spectacularly useless, and they’re divorced from any real academic or practical value—aside, of course, from the immense benefit of allowing us to spend more time in this world and in the company of two of the most appealing characters in fiction. It’s a way for the story, and the act of reading, to go on forever, and in the end, it transforms us. In the role of a literary detective, or a tireless reader, you become Holmes, or at least a Watson to more capable investigators, thanks to the beauty of the stories themselves. What more can we ask from reading?

“Knowledge of Politics—Feeble”

Illustration by Sidney Paget for "The Five Orange Pips"

In A Study in Scarlet, the first Sherlock Holmes adventure, there’s a celebrated passage in which Watson tries to figure out his mystifying roommate. At this point in their relationship, he doesn’t even know what Holmes does for a living, and he’s bewildered by the gaps in his new friend’s knowledge, such as his ignorance of the Copernican model of the solar system. When Watson informs him that the earth goes around the sun, Holmes says: “Now that I do know it, I shall do my best to forget it.” He tells Watson that the human brain is like “a little empty attic,” and that it’s a mistake to assume that the room has elastic walls, concluding: “If we went round the moon it would not make a pennyworth of difference to me or to my work.” In fact, it’s clear that he’s gently pulling Watson’s leg: Holmes certainly shows plenty of practical astronomical knowledge in stories like “The Musgrave Ritual,” and he later refers casually to making “allowance for the personal equation, as the astronomers put it.” At the time, Watson wasn’t in on the joke, and he took it all at face value when he made his famous list of Holmes’s limitations. Knowledge of literature, philosophy, and astronomy was estimated as “nil,” while botany was “variable,” geology was “practical, but limited,” chemistry was “profound,” and anatomy—in an expression that I’ve always loved—was “accurate, but unsystematic.”

But the evaluation that has probably inspired the most commentary is “Knowledge of Politics—Feeble.” Ever since, commentators have striven mightily to reconcile this with their conception of Holmes, which usually means forcing him into the image of their own politics. In Sherlock Holmes: Fact or Fiction?, T.S. Blakeney observes that Holmes takes no interest, in “The Bruce-Partington Plans,” in “the news of a revolution, of a possible war, and of an impending change of government,” and he concludes:

It is hard to believe that Holmes, who had so close a grip on realities, could ever have taken much interest in the pettiness of party politics, nor could so strong an individualist have anything but contempt for the equalitarian ideals of much modern sociological theory.

S.C. Roberts, in “The Personality of Sherlock Holmes,” objected to the latter point, arguing that Holmes’s speech in “The Naval Treaty” on English boarding schools—“Capsules with hundreds of bright little seeds in each, out of which will spring the wiser, better England of the future”—is an expression of Victorian liberalism at its finest. Roberts writes:

It is perfectly true that the clash of political opinions and of political parties does not seem to have aroused great interest in Holmes’s mind. But, fundamentally, there can be no doubt that Holmes believed in democracy and progress.

Sidney Paget illustration of Mycroft Holmes

In reality, Holmes’s politics are far from a mystery. As the descendant of “country squires,” he rarely displayed anything less than a High Tory respect for the rights of landed gentry, and he remained loyal to the end to Queen Victoria, the “certain gracious lady in whose interests he had once been fortunate enough to carry out a small commission.” He was obviously an individualist in his personal habits, in the venerable tradition of British eccentrics, which doesn’t mean that his political views—as some have contended—were essentially libertarian. Holmes had a very low regard for the freedom of action of the average human being, and with good reason. The entire series was predicated on the notion that men and women are totally predictable, moving within their established courses so reliably that a trained detective can see into the past and forecast the future. As someone once noted, Holmes’s deductions are based on a chain of perfectly logical inferences that would have been spoiled by a single mistake on the part of the murderer. Holmes didn’t particularly want the world to change, because it was the familiar canvas on which he practiced his art. (His brother Mycroft, after all, was the British government.) The only individuals who break out of the pattern are criminals, and even then, it’s a temporary disruption. You could say that the entire mystery genre is inherently conservative: it’s all about the restoration of order, and in the case of Holmes, it means the order of a world, in Vincent Starrett’s words, “where it is always 1895.”

I love Sherlock Holmes, and in large part, it’s the nostalgia for that era—especially by those who never had to live with it or its consequences—that makes the stories so appealing. But it’s worth remembering what life was really like at the end of the nineteenth century for those who weren’t as fortunate. (Arthur Conan Doyle identified, incidentally, as a Liberal Unionist, a forgotten political party that was so muddled in its views that it inspired a joke in The Importance of Being Earnest: “What are your politics?” “Well, I am afraid I really have none. I am a Liberal Unionist.” And there’s no question that Conan Doyle believed wholeheartedly in the British Empire and all it represented.) Over the last few months, there have been times when I’ve thought approvingly of what Whitfield J. Bell says in “Holmes and History”:

Holmes’s knowledge of politics was anything but weak or partial. Of the hurly-burly of the machines, the petty trade for office and advantage, it is perhaps true that Holmes knew little. But of politics on the highest level, in the grand manner, particularly international politics, no one was better informed.

I can barely stand to look at a newspaper these days, so it’s tempting to take a page from Holmes and ignore “the petty trade for office and advantage.” And I often do. But deep down, it implies an acceptance of the way things are now. And it seems a little feeble.


The monster in the writers room

Mads Mikkelsen on Hannibal

Note: Spoilers follow for the season finale of Hannibal.

When it comes to making predictions about television shows, my track record is decidedly mixed. I was long convinced, for instance, that Game of Thrones would figure out a way to keep Oberyn Martell around, just because he was such fun to watch, and to say I was wrong about this is something of an understatement. Let the record show, however, that I said here months ago that the third season of Hannibal would end with Will Graham getting a knife through his face:

In The Silence of the Lambs, Crawford says that Graham’s face “looks like damned Picasso drew it.” None of the prior cinematic versions of this story have dared to follow through on this climax, but I have a feeling, given the evidence, that Fuller would embrace it. Taking Hugh Dancy’s face away, or making it hard to look at, would be the ultimate rupture between the series and its viewers. Given the show’s cancellation, it may well end up being the very last thing we see. It would be a grim note on which to end. But it’s nothing that this series hasn’t taught us to expect.

This wasn’t the hardest prediction in the world to make. One of the most distinctive aspects of Bryan Fuller’s take on the Lecter saga is his willingness to pursue elements of the original novels that other adaptations have avoided, and the denouement of Red Dragon—with Will lying alone, disfigured, and mute in the hospital—is a downer ending that no other version of this story has been willing to touch.

Of course, that wasn’t what we got here, either. Instead of Will in his hospital bed, brooding silently on the indifference of the natural world to murder, we got a hysterical ballet of death, with Will and Hannibal teaming up to dispatch Dolarhyde like the water buffalo at the end of Apocalypse Now, followed by an operatic plunge over the edge of a cliff, with our two star-crossed lovers locked literally in each other’s arms. And it was a worthy finale for a series that has seemed increasingly indifferent to anything but that unholy love story. The details of Lecter’s escape from prison are wildly implausible, and whatever plan they reflect is hilariously undercooked, even for someone like Jack Crawford, who increasingly seems like the world’s worst FBI agent in charge. Hannibal has never been particularly interested in its procedural elements, and its final season took that contempt to its final, ludicrous extreme. In the novel Red Dragon, Will, despite his demons, is a competent, inspired investigator, and he’s on the verge of apprehending Dolarhyde through his own smarts when his quarry turns the tables. In Fuller’s version, unless I missed something along the way, Will doesn’t make a single useful deduction or take any meaningful action that isn’t the result of being manipulated by Hannibal or Jack. He’s a puppet, and dangerously close to what TV Tropes has called a Woobie: a character whom we enjoy seeing tortured so we can wish the pain away.

Hugh Dancy on Hannibal

None of this should be taken as a criticism of the show itself, in which any narrative shortcomings can hardly be separated from Fuller’s conscious decisions. But as enjoyable as the series has always been—and I’ve enjoyed it more than any network drama I’ve seen in at least a decade—it’s something less than an honest reckoning with its material. As a rule of thumb, the stories about Lecter, including Harris’s own novels, have been the most successful when they stick most closely to their roots as police procedurals. Harris started his career as a crime reporter, and his first three books, including Black Sunday, are masterpieces of the slow accumulation of convincing detail, spiced and enriched by a layer of gothic violence. When you remove that foundation of realistic suspense, you end up with a character who is dangerously uncontrollable: it’s Lecter, not Harris, who becomes the author of his own novel. In The Annotated Dracula, Leslie S. Klinger proposes a joke theory that the real author of that book is Dracula himself, who tracked down Bram Stoker and forced him to make certain changes to conceal the fact that he was alive and well and living in Transylvania. It’s an “explanation” that rings equally true of the novels Hannibal and Hannibal Rising, which read suspiciously as if Lecter were dictating elements of his own idealized autobiography to Harris. (As far as I know, nobody has seen or heard from Harris since Hannibal Rising came out almost a decade ago. Are we sure he’s all right?)

And there are times when Hannibal, the show, plays as if Lecter had gotten an executive producer credit sometime between the second and third seasons. If anything, this is a testament to his vividness: when properly acted and written, he dominates his stories to a greater extent than any fictional character since Sherlock Holmes. (In fact, the literary agent hypothesis—in which the credited writer of a series is alleged to be simply serving as a front—originated among fans of Conan Doyle, who often seemed bewildered by the secondary lives his characters assumed.) But there’s something unsettling about how Lecter inevitably takes on the role of a hero. My favorite stretch of Hannibal was the back half of the second season, which looked unflinchingly at Lecter’s true nature as a villain, cannibal, and destroyer of lives. When he left the entire supporting cast to bleed slowly to death at the end of “Mizumono,” it seemed impossible to regard him as an appealing figure ever again. And yet here we are, with an ending that came across as the ultimate act of fan service in a show that has never been shy about appealing to its dwindling circle of devotees. I can’t exactly blame it for this, especially because the slow dance of seduction between Will and Hannibal has always been a source of sick, irresistible fascination. But we’re as far as ever from an adaptation that would force us to honestly confront why we’re so attached to a man who eats other people, or why we root for him to triumph over lesser monsters who make the mistake of not being so rich, cultured, or amusing. Lecter came into this season like a lion, but he went out, as always, like a lamb.
