Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Sherlock Holmes’

Mycroft Holmes and the mark of genius


Sidney Paget illustration of Mycroft Holmes

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on November 2, 2016.

“Original discoveries cannot be made casually, not by anyone at any time or anywhere,” the great biologist Edward O. Wilson writes in Letters to a Young Scientist. “The frontier of scientific knowledge, often referred to as the cutting edge, is reached with maps drawn by earlier scientists…Somewhere in these vast unexplored regions you should settle.” This seems like pretty good career advice for scientists and artists alike. But then Wilson makes a striking observation:

But, you may well ask, isn’t the cutting edge a place only for geniuses? No, fortunately. Work accomplished on the frontier defines genius, not just getting there. In fact, both accomplishments along the frontier and the final eureka moment are achieved more by entrepreneurship and hard work than by native intelligence. This is so much the case that in most fields most of the time, extreme brightness may be a detriment. It has occurred to me, after meeting so many successful researchers in so many disciplines, that the ideal scientist is smart only to an intermediate degree: bright enough to see what can be done but not so bright as to become bored doing it.

At first glance, this may not seem all that different from Martin A. Schwartz’s thoughts on the importance of stupidity: “Productive stupidity means being ignorant by choice.” In fact, they’re two separate observations—although they turn out to be related in one important respect. Schwartz is talking about “absolute stupidity,” or our collective ignorance in the face of the unknown, and he takes pains to distinguish it from the “relative stupidity” that differentiates students in the same college classes. And while Wilson isn’t talking about relative stupidity here, exactly, he’s certainly discussing relative intelligence, or the idea that the best scientists might be just a little bit less bright than their smartest peers in school. As he goes on to observe:    

What, then, of certified geniuses whose IQs exceed 140, and are as high as 180 or more? Aren’t they the ones who produce the new groundbreaking ideas? I’m sure some do very well in science, but let me suggest that perhaps, instead, many of the IQ-brightest join societies like Mensa and work as auditors and tax consultants. Why should the rule of optimum medium brightness hold? (And I admit this perception of mine is only speculative.) One reason could be that IQ geniuses have it too easy in their early training. They don’t have to sweat the science courses they take in college. They find little reward in the necessarily tedious chores of data-gathering and analysis. They choose not to take the hard roads to the frontier, over which the rest of us, the lesser intellectual toilers, must travel.

Marilyn vos Savant

In other words, the real geniuses are reluctant to take on the voluntary stupidity that science demands, and they’re more likely to find sources of satisfaction that don’t require them to constantly confront their own ignorance. This is a vast generalization, of course, but it seems to square with experience. I’ve met a number of geniuses, and what many of them have in common is a highly pragmatic determination to make life as pleasant for themselves as possible. Any other decision, in fact, would call their genius into doubt. If you can rely unthinkingly on your natural intelligence to succeed in a socially acceptable profession, or to minimize the amount of work you have to do at all, you don’t have to be a genius to see that this is a pretty good deal. The fact that Marilyn vos Savant—who allegedly had the highest tested intelligence ever recorded—became a columnist for Parade might be taken as a knock against her genius, but really, it’s the most convincing proof of it that I can imagine. The world’s smartest person should be more than happy to take a cushy gig at a Sunday supplement magazine. Most of the very bright endure their share of miseries during childhood, and their reward, rather than more misery, might as well be an adult life that provides intellectual stimulation in emotional safety. This is why I’ve always felt that Mycroft Holmes, Sherlock’s smarter older brother, knew exactly how his genius ought to be used. As Sherlock notes drily in “The Adventure of the Bruce-Partington Plans”: “Mycroft draws four hundred and fifty pounds a year, remains a subordinate, has no ambitions of any kind, will receive neither honor nor title, but remains the most indispensable man in the country.” 

Yet it’s Sherlock, who was forced to leave the house to find answers to his problems, whom we love more. (He’s also been held up as an exemplar of the perfect scientist.) Mycroft is hampered by both his physical laziness and his mental quickness: when a minister comes to him with a complicated problem involving “the Navy, India, Canada, and the bimetallic question,” Mycroft can provide the answer “offhand,” which doesn’t give him much of an incentive to ever leave his office or the Diogenes Club. As Holmes puts it in “The Greek Interpreter”:

You wonder…why it is that Mycroft does not use his powers for detective work. He is incapable of it…I said that he was my superior in observation and deduction. If the art of the detective began and ended in reasoning from an armchair, my brother would be the greatest criminal agent that ever lived. But he has no ambition and no energy. He will not even go out of his way to verify his own solution, and would rather be considered wrong than take the trouble to prove himself right.

Mycroft wasn’t wrong, either. He seems to have lived a very comfortable life. But it’s revealing that Conan Doyle gave the real adventures to the brother with the slightly less scintillating intelligence. In art, just as in science, technical facility can prevent certain artists from making real discoveries. The ones who have to work at it are more likely to find something real. But we can also raise a glass to Mycroft, Marilyn, and the geniuses who are smart enough not to make it too hard on themselves.

The minor key


“What keeps science fiction a minor genre, for all the brilliance of its authors and apparent pertinence of its concerns?” The critic who asked this question was none other than John Updike, in his New Yorker review of David G. Hartwell’s anthology The World Treasury of Science Fiction, which was published at the end of the eighties. Updike immediately responded to his own question with his usual assurance:

The short answer is that each science-fiction story is so busy inventing its environment that little energy is left to be invested in the human subtleties. Ordinarily, “mainstream” fiction snatches what it needs from the contemporary environment and concentrates upon surprising us with details of behavior; science fiction tends to reverse the priorities…It rarely penetrates and involves us the way the best realistic fiction can…“The writer,” Edmund Wilson wrote, “must always find expressions for something which has never yet been exposed, must master a new set of phenomena which has never yet been mastered.” Those rhapsodies, for instance, which Proust delivered upon the then-fresh inventions of the telephone, the automobile, and the airplane point up the larger relativities and magical connections of his great novel, as well as show the new century breaking upon a fin-de-siècle sensibility. The modest increments of fictional “news,” of phenomena whose presentation is unprecedented, have the cumulative weight of true science—a nudging, inching fidelity to human change ultimately far more impressive and momentous than the great glittering leaps of science fiction.

I’ll concede that Updike’s underlying point here is basically correct, and that a lot of science fiction has to spend so much time establishing the premise and the background that it has to shortchange or underplay other important qualities along the way. (At its highest level, this is less a reflection of the author’s limitations than a courtesy to the reader. It’s hard to innovate along every parameter at once, so complex works of speculative fiction as different as Gravity’s Rainbow and Inception need to strategically simplify wherever they can.) But there’s also a hidden fallacy in Updike’s description of science fiction as “a minor genre.” What, exactly, would a “major” genre look like? It’s hard to come up with a definitive list, but if we’re going to limit ourselves to a conception of genre that encompasses science fiction and not, say, modernist realism, we’d probably include fantasy, horror, western, romance, erotica, adventure, mystery, suspense, and historical fiction. When we ask ourselves whether Updike would be likely to consider any of these genres “major,” it’s pretty clear that the answer is no. Every genre, by definition, is minor, at least to many literary critics, which not only renders the distinction meaningless, but raises a host of other questions. If we honestly ask what keeps all genres—although not individual authors—in the minor category, there seem to be three possibilities. Either genre fiction fails to attract or keep major talent; it suffers from various systemic problems of the kind that Updike identified for science fiction; or there’s some other quirk in the way we think about fiction that relegates these genres to a secondary status, regardless of the quality of specific works or writers.

And while all three of these factors may play a role, it’s the third one that seems most plausible. (After all, when you average out the quality of all “literary fiction,” from Updike, Bellow, and Roth down to the work put out by the small presses and magazines, it seems fairly clear that Sturgeon’s Law applies here as much as anywhere else, and ninety percent of everything is crud. And modernist realism, like every category coherent enough to earn its own label, has plenty of clichés of its own.) In particular, if a genre writer is deemed good enough, his or her reward is to be elevated out of it entirely. You clearly see this with such authors as Jorge Luis Borges, perhaps the greatest writer of speculative fiction of the twentieth century, who was plucked out of that category to compete more effectively with Proust, Joyce, and Kafka—the last of whom was arguably also a genre writer who was forcibly promoted to the next level. It means that the genre as a whole can never win. Its best writers are promptly confiscated, freeing up critics to speculate about why it remains “minor.” As Daniel Handler noted in an interview several years ago:

I believe that children’s literature is a genre. I resisted the idea that children’s literature is just anything that children are reading. And I certainly resisted the idea that certain books should get promoted out of children’s literature just because adults are reading them. That idea is enraging too. That’s what happens to any genre, right? First you say, “Margaret Atwood isn’t really a science fiction writer.” Then you say, “There really aren’t any good science fiction writers.” That’s because you promoted them all!

And this pattern isn’t a new one. It’s revealing that Updike quoted Edmund Wilson, who in his essays “Why Do People Read Detective Stories?” and “Who Cares Who Killed Roger Ackroyd?” dismissed the entire mystery genre as minor or worse. Yet when it came to defending his fondness for one author in particular, he fell back on a familiar trick:

I will now confess, in my turn, that, since my first looking into this subject last fall, I have myself become addicted, in spells, to reading myself to sleep with Sherlock Holmes, which I had gone back to, not having looked at it since childhood, in order to see how it compared with Conan Doyle’s latest imitators. I propose, however, to justify my pleasure in rereading Sherlock Holmes on grounds entirely different from those on which the consumers of the current product ordinarily defend their taste. My contention is that Sherlock Holmes is literature on a humble but not ignoble level, whereas the mystery writers most in vogue now are not. The old stories are literature, not because of the conjuring tricks and the puzzles, not because of the lively melodrama, which they have in common with many other detective stories, but by virtue of imagination and style. These are fairy-tales, as Conan Doyle intimated in his preface to his last collection, and they are among the most amusing of fairy-tales and not among the least distinguished.

Strip away the specifics, and the outlines of the argument are clear. Sherlock Holmes is good, and mysteries are bad, so Sherlock Holmes must be something other than mystery fiction. It’s maddening, but from the point of view of a working critic, it makes perfect sense. You get to hold onto the works that you like, while keeping the rest of the genre safely minor—and then you can read yourself happily to sleep.

Brexit pursued by a bear


Over the weekend, my wife and I took our daughter to see Paddington 2, which can accurately be described as the best live-action children’s movie since Paddington. These are charming films, and the worst that can be said of them is that they’re clearly trying hard to be better than they have any right to be. Unlike an artist like Hayao Miyazaki, who constructs stories according to his own secret logic and ends up seizing the imagination of adults and children across the world, director Paul King and his collaborators are more in the tradition of Pixar, which does amazing work and never lets you forget it for a second. (If you want to reach back even further, you could say that these movies split the difference between Babe, a technically phenomenal film that somehow managed to seem effortless, and Babe: Pig in the City, an unquestioned masterpiece that often felt on the verge of flying apart under the pressure of George Miller’s ambitions.) Paddington 2, in particular, is so indebted to the work of Wes Anderson, especially The Grand Budapest Hotel, that it seems less like a pastiche than an unauthorized knockoff. Is it really an act of homage to painstakingly recreate the look of a movie that came out less than four years ago? But it also doesn’t matter. It’s as if King and his collaborators realized that Anderson’s work amounted to an industrial process that was being wasted if it wasn’t being used to make a children’s movie, so they decided to copy it before the patent expired. The result isn’t quite on the level of The Life Aquatic with Steve Zissou, a major work of art that also seems to have been made by and for twelve-year-old kids. But it’s more than enough until Anderson finally makes the Encyclopedia Brown adaptation of my dreams.

Paddington 2 also doubles as the best advertisement for Britain in film since the heyday of the Ministry of Information, with a roster of such ringers as Sally Hawkins, Hugh Bonneville, Brendan Gleeson, Julie Walters, Jim Broadbent, Peter Capaldi, and Joanna Lumley, as well as a wonderfully diverse supporting cast. (It also gives Hugh Grant—the quintessential British export of the last quarter of a century—his best role in a long time.) It’s the most loving portrait of London that any movie has provided in years, with a plot driven by an implausible treasure hunt that serves as an excuse to tour such landmarks as Tower Bridge and St. Paul’s Cathedral. Watching it is almost enough to make you forget the fact that just a few months before production began, the United Kingdom narrowly voted to effectively withdraw from its role as a global power. It might seem like a stretch to see a children’s movie through the lens of Brexit, but nearly every British film of the postwar period can be read as a commentary on the nation’s sometimes painful efforts to redefine itself in a changing world order. Nostalgia is often a strategy for dealing with harsher realities, and escapism can be more revealing than it knows, with even the James Bond series serving as a form of wishful thinking. And America should be paying close attention. A nation on the decline no longer has the luxury of having its movies stand for nothing but themselves, and Britain provides a striking case study for what happens to a culture after its period of ascendancy is over. The United States, like its nearest relation, threw away much of its credibility a year and a half ago in a fit of absentmindedness.

This partially accounts for our sudden fascination with Britain and its royal family, which seems to have risen to levels unseen since the death of Princess Diana. Part of it amounts to an accident of timing—the flurry of celebrations for Queen Elizabeth’s ninetieth birthday and sapphire jubilee generated a flood of content that was more available to American viewers than ever before, and we were unusually primed to receive it. Over the last year or so, my wife and I have watched something like three different documentaries about the Windsors, along with The Crown and The Great British Baking Show, the soothing rhythms of which make Top Chef seem frantic by comparison. Above all else, we’ve followed the saga of Prince Harry and Meghan Markle, which has often been mined for clues as to its possible social and political significance. As Rebecca Mead writes in The New Yorker:

This may be because [the engagement is] legit the only bit of non-terrible news that’s happened in the last year. But there’s more to it than that. This is a royal wedding for non-royalists, even for anti-royalists…There is another important way in which Markle’s arrival reconfigures what Prince Philip reportedly calls “the Firm.” Not only is she American, she is also of mixed race: Markle’s mother is African-American, and her father is white…Whatever else Markle brings to the gilded royal table in terms of glamour, intelligence, and charm, her experience of racial prejudice is unprecedented among members of the royal family. At a time when racial bigotry and nativism is on the rise on both sides of the Atlantic, the coming to prominence at the heart of Britain’s First Family of an American woman whose ancestors were enslaved could not be more welcome, or more salutary.

The unstated point is that even as the United Kingdom goes through convulsions of its own, at least it gets to have this. And we can’t be blamed for wanting to clutch some of it to ourselves. After quoting Princess Diana’s wish that she become “a queen of people’s hearts,” Mead adds:

For those of us horrified by the President’s imperial, autocratic instincts—by his apparent wish to reinstate a feudal system with himself at its apex, attended by a small court of plutocrats who, like him, have been even further enriched by Republican tax reform—might we not claim Harry and Meghan as the monarchs of our hearts? Might they not serve as paradoxical avatars of our own hopes for a more open, more international, more unified, and fairer world?

It’s hard to quarrel with this basically harmless desire to comfort ourselves with the images of the monarchy, and I’ve been guilty of it myself. The building blocks of so much of my inner life—from the Sherlock Holmes stories to the movies of Powell and Pressburger—reflect a nostalgia for an England, as Vincent Starrett put it, “where it is always 1895.” It’s an impulse as old as Walt Disney, a Chicago child whose studio turned into a propaganda mill in the early sixties for the values of the Edwardian era. (As much as I love Mary Poppins, it’s hard to overlook the fact that it premiered just a few weeks after the Gulf of Tonkin resolution and against a backdrop of race riots in Philadelphia.) America has nostalgic myths of its own, but it tends to fall back on its British forebears when it feels particularly insecure about its own legacy. When it becomes too difficult to look at ourselves, we close our eyes and think of England.

The manufacturers of worlds


For the last few days, as part of a deliberate break from writing, I’ve been browsing contentedly through my favorite book, The Annotated Sherlock Holmes by William S. Baring-Gould. It was meant to be a comforting read that was as far removed from work as possible, but science fiction, unsurprisingly, can’t seem to let me go. Yesterday, I was looking over The Sign of the Four when I noticed a line that I’ve read countless times without really taking note of it. As Holmes leaves Baker Street to pursue a line of the investigation, he says to Watson, who has remained behind: “Let me recommend this book—one of the most remarkable ever penned. It is Winwood Reade’s Martyrdom of Man. I shall be back in an hour.” Toward the end of the novel, speaking of the difficulty in predicting what any given human being will do, Holmes elaborates:

Winwood Reade is good upon the subject…He remarks that, while the individual man is an insoluble puzzle, in the aggregate he becomes a mathematical certainty. You can, for example, never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. So says the statistician.

This is remarkably like what Isaac Asimov writes of psychohistory, a sociological version of the ideal gas law that can predict the future based on the existence of a huge number—perhaps in the trillions—of individual lives. And it seemed worth checking to see if this passage could cast any light on the origins of the imaginary science that I’ve spent so much time exploring.

It pains me to say that Holmes himself probably wasn’t a direct influence on the Foundation series. There was a considerable overlap between Sherlockians and science fiction writers—prominent members of both camps included Anthony Boucher, Poul Anderson, Fletcher Pratt, and Manly Wade Wellman—but John W. Campbell wasn’t among them, and Asimov was drafted only reluctantly into the Baker Street Irregulars. (He writes in I. Asimov: “Conan Doyle was a slapdash and sloppy writer…I am not really a Holmes enthusiast.”) For insight, we have to go back to Winwood Reade himself, a British historian, explorer, and correspondent of Charles Darwin whose discussion of the statistical predictability of the human race appears, interestingly, in an argument against the efficacy of prayer. Here’s the full passage from The Martyrdom of Man, which was published in 1872:

All phenomena, physical and moral, are subject to laws as invariable as those which regulate the rising and setting of the sun. It is in reality as foolish to pray for rain or a fair wind as it would be to pray that the sun should set in the middle of the day. It is as foolish to pray for the healing of a disease or for daily bread as it is to pray for rain or a fair wind. It is as foolish to pray for a pure heart or for mental repose as it is to pray for help in sickness or misfortune. All the events which occur upon the earth result from Law: even those actions which are entirely dependent on the caprices of the memory, or the impulse of the passions, are shown by statistics to be, when taken in the gross, entirely independent of the human will. As a single atom, man is an enigma; as a whole, he is a mathematical problem. As an individual, he is a free agent; as a species, the offspring of necessity.

At the end of the book, Reade takes his own principles to their logical conclusion, becoming, in effect, an early writer of science fiction. Its closing section, “Intellect,” sketches out a universal history that anticipates Toynbee, but Reade goes further: “When we understand the laws which regulate the complex phenomena of life, we shall be able to predict the future as we are already able to predict comets and eclipses and planetary movements.” He describes three inventions that he believes will lead to an era of global prosperity:

The first is the discovery of a motive force which will take the place of steam, with its cumbrous fuel of oil or coal; secondly, the invention of aerial locomotion which will transport labour at a trifling cost of money and of time to any part of the planet, and which, by annihilating distance, will speedily extinguish national distinctions; and thirdly, the manufacture of flesh and flour from the elements by a chemical process in the laboratory, similar to that which is now performed within the bodies of the animals and plants.

And after rhapsodizing over the utopian civilization that will result—in which “poetry and the fine arts will take that place in the heart which religion now holds”—he turns his thoughts to the stars:

And then, the earth being small, mankind will migrate into space, and will cross the airless Saharas which separate planet from planet, and sun from sun. The earth will become a Holy Land which will be visited by pilgrims from all the quarters of the universe. Finally, men will master the forces of nature; they will become themselves architects of systems, manufacturers of worlds. Man then will be perfect; he will then be a creator; he will therefore be what the vulgar worship as a god.

Reade was inevitably seen as an atheist, and although he didn’t like the label, he inclined many readers in that direction, as he did in one of the most interesting episodes in this book’s afterlife. The scene is World War II, which tested the idea of psychohistory to its limit, and the speaker is the author of the memoir The Enchanted Places:

The war was on. I was in Italy. From time to time [my father] used to send me parcels of books to read. In one of them were two in the Thinker’s Library series: Renan’s The Life of Jesus and Winwood Reade’s The Martyrdom of Man. I started with The Life of Jesus and found it quite interesting; I turned to The Martyrdom and found it enthralling…There was no God. God had not created Man in His own image. It was the other way round: Man had created God. And Man was all there was. But it was enough. It was the answer, and it was both totally convincing and totally satisfying. It convinced and satisfied me as I lay in my tent somewhere on the narrow strip of sand that divides Lake Comacchio from the Adriatic; and it has convinced and satisfied me ever since.

I wrote at once to my father to tell him so and he at once wrote back. And it was then that I learned for the first time that these were his beliefs, too, and that he had always hoped that one day I would come to share them…So he had sent me The Martyrdom. But even then he had wanted to play absolutely fair, and so he had added The Life of Jesus. And then he had been content to leave the verdict to me. Well, he said, the church had done its best. It had had twenty-four years’ start—and it had failed.

The author adds: “If I had to compile a list of books that have influenced my life, high on the list would undoubtedly be Winwood Reade’s The Martyrdom of Man. And it would probably be equally high on my father’s list too.” The father in question was A.A. Milne. And the son was named Christopher Robin.

Quote of the Day


It is one of those cases where the art of the reasoner should be used rather for the sifting of details than for the acquiring of fresh evidence. The tragedy has been so uncommon, so complete and of such personal importance to so many people, that we are suffering from a plethora of surmise, conjecture, and hypothesis. The difficulty is to detach the framework of fact—of absolute undeniable fact—from the embellishments of theorists and reporters. Then, having established ourselves upon this sound basis, it is our duty to see what inferences may be drawn and what are the special points upon which the whole mystery turns.

Arthur Conan Doyle, “Silver Blaze”

Written by nevalalee

December 6, 2017 at 7:30 am

Solzhenitsyn’s rosary


Aleksandr Solzhenitsyn

Note: I’m taking a few days off for Thanksgiving, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on July 11, 2016.

When Aleksandr Solzhenitsyn was imprisoned in the Soviet gulag, along with so many other sufferings, he was forced to deal with a challenge that modern writers rarely have to confront—the problem of memorization. He wanted to keep writing poetry, but he was unable to put anything on paper, which would be confiscated and read by the guards. Here’s the solution that he found, as he recounts in The Gulag Archipelago:

I started breaking matches into little pieces and arranging them on my cigarette case in two rows (of ten each, one representing units and the other tens). As I recited the verses to myself, I displaced one bit of broken match from the units row for every line. When I shifted ten units I displaced one of the “tens”…Every fiftieth and every hundredth line I memorized with special care, to help me keep count. Once a month I recited all that I had written. If the wrong line came out in place of one of the hundreds and fifties, I went over it all again and again until I caught the slippery fugitives.

In the Kuibyshev Transit Prison I saw Catholics (Lithuanians) busy making themselves rosaries for prison use…I joined them and said that I, too, wanted to say my prayers with a rosary but that in my particular religion I needed a hundred beads in a ring…that every tenth bead must be cubic, not spherical, and that the fiftieth and the hundredth beads must be distinguishable at a touch.

The Lithuanians were impressed, Solzhenitsyn says, by his “religious zeal,” and they agreed to make a rosary to his specifications, fashioning the beads out of pellets of bread and coloring them with burnt rubber, tooth powder, and disinfectant. (Later, when Solzhenitsyn realized that twenty beads were enough, he made them himself out of cork.) He concludes:

I never afterward parted with the marvelous present of theirs; I fingered and counted my beads inside my wide mittens—at work line-up, on the march to and fro from work, at all waiting times; I could do it standing up, and freezing cold was no hindrance. I carried it safely through the search points, in the padding of my mittens, where it could not be felt. The warders found it on various occasions, but supposed that it was for praying and let me keep it. Until the end of my sentence (by which time I had accumulated 12,000 lines) and after that in my places of banishment, this necklace helped me write and remember.

Ever since I first read this story, I’ve been fascinated by it, and I’ve occasionally found myself browsing the rosaries or prayer beads for sale online, wondering if I should get one for myself, just in case—although in case of what, exactly, I don’t know.

Joan Didion

But you don’t need to be in prison to understand the importance of memorization. One of the side effects of our written and interconnected culture is that we’ve lost the ability to hold information in our heads, and this trend has only accelerated as we’ve outsourced more of our inner lives to the Internet. This isn’t necessarily a bad thing: there are good reasons for keeping a lot of this material where it can be easily referenced, without feeling the need to remember it all. (As Sherlock Holmes said in A Study in Scarlet: “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose…It is a mistake to think that that little room has elastic walls and can distend to any extent.” Although given the amount of obscure information that Holmes was able to produce in subsequent stories, it’s possible that he was just kidding.) But there’s also a real loss involved. Oral cultures are marked by a highly developed verbal memory, especially for those whose livelihoods depend on it: a working poet could be expected to know hundreds of songs by heart, and the conventions of poetry itself emerged, in part, as a set of mnemonic devices. Meter, rhyme, and conventional formulas allowed many lines of verse to be recited for a paying audience—or improvised on the spot. An oral poem is a vehicle for the preservation of information, and it takes advantage of the human brain’s ability to retain material in a pattern that hints at what comes next. When we neglect this, we lose touch with some of the reasons that poetry evolved in the first place.

And what makes memorization particularly valuable as a creative tool is the fact that it isn’t quite perfect. When you write something down, it tends to become fixed, both physically and psychologically. (Joan Didion gets at this when she says: “By the time you’ve laid down the first two sentences, your options are all gone.”) An idea in the brain, by contrast, remains fluid, malleable, and vital. Each time you go back to revisit it, whether using a rosary or some other indexical system, you aren’t just remembering it, but to some extent recreating it, and you’ll never get it exactly right. But just as natural selection exists because of the variations that arise from errors of transcription, a creative method that relies on memory is less accurate but more receptive to happy accidents than one that exists on the page. A line of poetry might change slightly each time we call it up, but the core idea remains, and the words that survive from one iteration to the next have persisted, by definition, because they’re memorable. We find ourselves revising and reworking the result because we have no choice, and in the process, we keep it alive. The danger, of course, is that if we don’t keep notes, any ideas we have are likely to float away without ever being realized—a phenomenon that every writer regards with dread. What we need is a structure that allows us to assign an order to the ideas in our head while preserving their ripe state of unwrittenness. Solzhenitsyn’s rosary, which was forced on him by necessity, was one possible answer, but there are others. Even if we’re diligent about keeping a pencil and paper—or a smartphone—nearby, there will be times when an idea catches us at a moment at which we can’t write it down. And when that happens, we need to be ready.

Written by nevalalee

November 23, 2017 at 9:00 am

The weight of lumber


In my discussion yesterday of huge scholarly projects that expanded to take up the lives of their authors, I deliberately left out one name. Arnold J. Toynbee was a British historian and author of the twelve volumes of A Study of History, the most ambitious attempt to date at a universal theory of the rise and fall of civilizations. Toynbee has intrigued me for as long as I can remember, but he’s a little different from such superficially similar figures as Joseph Needham and Donald Knuth. For one thing, he actually finished his magnum opus, and even though it took decades, he more or less stuck to the plan of the work that he published in the first installment, which was an achievement in itself. He also differed from the other two in reaching a wide popular audience. Thousands of sets of his book were sold, and it became a bestseller in its two-volume abridgment by D.C. Somervell. It inspired countless essays and thick tomes of commentary, argument, and response—and then, strangely, it simply went away. Toynbee’s name has all but disappeared from mainstream and academic consideration, maybe because his ideas were too abstruse for one and too grandiose for the other, and if he’s recognized today at all, it’s probably because of the mysterious Toynbee tiles. (One possible successor is the psychohistory of the Foundation series, which has obvious affinities to his work, although Isaac Asimov played down the connection. He read the first half of A Study of History in 1944, borrowing the volumes one at a time from L. Sprague de Camp, and recalled: “There are some people who, on reading my Foundation series, are sure that it was influenced basically by Toynbee. They are only partly right. The first four stories were written before I had read Toynbee. ‘Dead Hand,’ however, was indeed influenced by it.”)

At the Newberry Library Book Fair last week, I hesitated over buying a complete set of Toynbee, and by the time I made up my mind and went back to get it, it was gone—which is the kind of mistake that can haunt me for the rest of my life. As a practical matter, though, I have all the Toynbee I’ll ever need: I already own the introductory volume of A Study of History and the Somervell abridgment, and it’s frankly hard to imagine reading anything else. But I did pick up the twelfth and last volume, Reconsiderations, published seven years after the rest, which might be the most interesting of them all. It’s basically Toynbee’s reply to his critics in over seven hundred pages of small type, in the hardcover equivalent of a writer responding to all the online comments on his work one by one. Toynbee seems to have read every review of his book, and he sets out to engage them all, including a miscellaneous section of over eighty pages simply called Ad Hominem. It’s a prickly, fascinating work that is probably more interesting than the books that inspired it, and one passage in particular caught my eye:

One of my critics has compared earlier volumes of this book to a “palace” in which “the rooms…are over-furnished to the point of resembling a dealer’s warehouse.” This reviewer must also be a thought-reader; for I have often thought of myself as a man moving old furniture about. For centuries these lovely things had been lying neglected in the lumber-rooms and attics. They had been piled in there higgledy-piggledy, in utter disorder, and had been crammed so tight that nobody could even squeeze his way in to look at them and find out whether they were of any value. In the course of ages they had been accumulating there—unwanted rejects from a score of country houses. This unworthy treatment of these precious pieces came to trouble me more and more; for I knew that they were not really junk; I knew that they were heirlooms, and these so rare and fine that they were not just provincial curiosities; they were the common heritage of anyone who had any capacity for appreciating beauty in Man’s handiwork.

In speaking of “lumber-rooms and attics,” Toynbee is harking back to a long literary tradition of comparing the mind itself to a lumber-room, which originally meant a spare room in a house full of unused furniture and other junk. I owe this knowledge to Nicholson Baker’s famous essay “Lumber,” reprinted in his collection The Size of Thoughts, in which he traces the phrase’s rise and fall, in a miniature version of what Toynbee tries to do for entire civilizations. Baker claims to have chosen the word “lumber” essentially at random, writing in his introduction: “Now feels like a good time to pick a word or a phrase, something short, and go after it, using the available equipment of intellectual retrieval, to see where we get…It should be representatively out of the way; it should have seen better days. Once or twice in the past it briefly enjoyed the status of a minor cliché, but now, for one reason or another, it is ignored or forgotten.” This might be a description of A Study of History itself—and yet, remarkably, Baker doesn’t mention the passage that I’ve quoted here. I assume that this is because he wasn’t aware of it, because it fits in beautifully with the rest of his argument. The dread of the mind becoming a lumber-room, crammed with useless odds and ends, is primarily a fear of intellectuals, as expressed by their patron saint Sherlock Holmes:

I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it. Now the skillful workman is very careful indeed as to what he takes into his brain-attic…It is a mistake to think that this little room has elastic walls and can distend to any extent.

Baker explains: “This is a form of the great scholarly worry—a worry which hydroptically book-thirsty poets like Donne, Johnson, Gray, Southey, and Coleridge all felt at times—the fear that too much learning will eventually turn even an original mind into a large, putty-colored regional storage facility of mislabeled and leaking chemical drums.”

Toynbee’s solution to the problem of mental lumber, like that of Needham and Knuth, was simply to pull it out of his brain and put it down on paper, even if it took three decades and twelve volumes. It’s hard not to be stirred by his description of his efforts:

At last I found that I could not bear this shocking situation any longer, so I set my own hand to a back-breaking job. I began to drag out the pieces, one by one, and to arrange them in the hall. I could not pretend to form a final judgement on the order in which they should be placed. Indeed, there never could be a final judgement on this, for a number of attractive different orders could be imagined, each of them the right order from some particular point of view. The first thing to be done was to get as many of the pieces as possible out into the open and to assemble them in some order or other. If once I had them parked down in the hall, I could see how they looked and could shift them and re-shift them at my leisure. Perhaps I should not have the leisure; perhaps the preliminary job of extracting these treasures from the lumber-rooms and attics would turn out to be as much as I could manage with my single pair of hands. If so, this would not matter; for there would be plenty of time afterwards for other people to rearrange the pieces, and, no doubt, they would be doing this again and again as they studied them more closely and came to know more about them than would ever be known by me.

It’s through arrangement and publication that lumber becomes precious again, and from personal experience, I know how hard it can be to relinquish information that has been laboriously brought to light. But part of the process is knowing when to stop. As Baker, a less systematic but equally provocative thinker, concludes:

I have poked through verbal burial mounds, I have overemphasized minor borrowings, I have placed myself deep in the debt of every accessible work of reference, and I have overquoted and overquibbled—of course I have: that is what always happens when you pay a visit to the longbeards’ dusty chamber…All the pages I have flipped and copied and underlined will turn gray again and pull back into the shadows, and have no bearing on one another. Lumber becomes treasure only temporarily, through study, and then it lapses into lumber again. Books open, and then they close.

My ten great books #1: The Annotated Sherlock Holmes


The Annotated Sherlock Holmes

Note: Four years ago, I published a series of posts here about my ten favorite works of fiction. Since then, the list has evolved, as all such rankings do, and this seems like a good time to revisit it. (I’m not including any science fiction, which I hope to cover in a separate feature later this year.) I’ll be treating them in the order of their original publication, but as it happens, we’ll be starting today with the book I love the most.

I first encountered the best book in the world in the library of St. John’s College in Annapolis, Maryland. At the time, I was seventeen, and of course I was already in love with Sherlock Holmes—I’d even been exposed to the subculture of obsessive Holmes fans through the wonderful anthology A Baker Street Dozen, which I still think is the most inviting introduction to the subject for the general reader. What I found in The Annotated Sherlock Holmes by William S. Baring-Gould was something much more: an entire universe of speculation, whimsy, and longing grown from the rich soil of Arthur Conan Doyle’s original stories. As the narrator relates in Borges’s “Tlön, Uqbar, Orbis Tertius”:

Two years before I had discovered…a superficial description of a nonexistent country; now chance afforded me something more precious and arduous. Now I held in my hands a vast methodical fragment of an unknown planet’s entire history, with its architecture and its playing cards, with the dread of its mythologies and the murmur of its languages, with its emperors and its seas, with its minerals and its birds and its fish, with its algebra and its fire, with its theological and metaphysical controversy. And all of it articulated, coherent, with no visible doctrinal intent or tone of parody.

The rules of the game were simple. Holmes, Watson, Mycroft, and the other vivid figures who populated their slice of London had been real men and women; Conan Doyle had been Watson’s literary agent; and the stories were glimpses into a larger narrative that could be reconstructed with enough patience and ingenuity. Given the scraps of information that they provided, you could figure out which building had been the model for 221B Baker Street; piece together the details of Watson’s military record, the location of his war wound, and the identities of his three, or perhaps four, wives; determine the species of the speckled band and whether “The Adventure of the Three Students” took place at Oxford or Cambridge; and pin down, with considerable accuracy, when and where each of the other adventures took place, even as Watson, or Conan Doyle, tried to divert you with “mistakes” that were deliberate misleads or red herrings.

The result of Baring-Gould’s work, which collects nearly a century’s worth of speculation into one enormous, handsomely illustrated volume, is the first book I’d save if I could own only one, and for years, it’s been living on my desk, both as a source of inspiration and as a convenient laptop stand. (Leslie Klinger’s more recent edition is lovely as well, but Baring-Gould will always be closest to my heart.) And it’s taken me a long time to realize why I care about this book so much, aside from the obvious pleasure it affords. It represents a vision of the world, and of reading, that I find immensely seductive. Each story, and often each sentence, opens onto countless others, and if Conan Doyle didn’t mean for his work to be subjected to this level of scrutiny, that’s even better: it allows us to imagine that we aren’t following a trail of clues that the author meant for us to find, but discovering something that was invisibly there all along. “Never has so much been written by so many for so few,” as the great Sherlockian Christopher Morley once said, and it’s true. All these studies are spectacularly useless, and they’re divorced from any real academic or practical value—aside, of course, from the immense benefit of allowing us to spend more time in this world and in the company of two of the most appealing characters in fiction. It’s a way for the story, and the act of reading, to go on forever, and in the end, it transforms us. In the role of a literary detective, or a tireless reader, you become Holmes, or at least a Watson to more capable investigators, thanks to the beauty of the stories themselves. What more can we ask from reading?

The cliché factory


A few days ago, Bob Mankoff, the cartoon editor of The New Yorker, devoted his weekly email newsletter to the subject of “The Great Clichés.” A cliché, as Mankoff defines it, is a restricted comic situation “that would be incomprehensible if the other versions had not first appeared,” and he provides a list of examples that should ring bells for all readers of the magazine, from the ubiquitous “desert island” to “The-End-Is-Nigh Guy.” Here are a few of my favorites:

Atlas holding up the world; big fish eating little fish; burglars in masks; cave paintings; chalk outline at crime scene; crawling through desert; galley slaves; guru on mountain; mobsters and victim with cement shoes; man in stocks; police lineup; two guys in horse costume.

Inevitably, Mankoff’s list includes a few questionable choices, while also omitting what seem like obvious contenders. (Why “metal detector,” but not “Adam and Eve?”) But it’s still something that writers of all kinds will want to clip and save. Mankoff doesn’t make the point explicitly, but most gag artists probably keep a similar list of clichés as a starting point for ideas, as we read in Mort Gerberg’s excellent book Cartooning:

List familiar situations—clichés. You might break them down into categories, like domestic (couple at breakfast, couple watching television); business (boss berating employee, secretary taking dictation); historic (Paul Revere’s ride, Washington crossing the Delaware); even famous cartoon clichés (the desert island, the Indian snake charmer)…Then change something a little bit.

As it happened, when I saw Mankoff’s newsletter, I had already been thinking about a far more harmful kind of comedy cliché. Last week, Kal Penn went on Twitter to post some of the scripts from his years auditioning as a struggling actor, and they amount to an alternative list of clichés kept by bad comedy writers, consciously or otherwise: “Gandhi lookalike,” “snake charmer,” “foreign student.” One character has a “slight Hindi accent,” another is a “Pakistani computer geek who dresses like Beck and is in a perpetual state of perspiration,” while a third delivers dialogue that is “peppered with Indian cultural references…[His] idiomatic conversation is hit and miss.” A typical one-liner: “We are propagating like flies on elephant dung.” One script describes a South Asian character’s “spastic techno pop moves,” with Penn adding that “the big joke was an accent and too much cologne.” (It recalls the Morrissey song “Bengali in Platforms,” which included the notorious line: “Life is hard enough when you belong here.” You could amend it to read: “Being a comedy writer is hard enough when you belong here.”) Penn closes by praising shows with writers “who didn’t have to use external things to mask subpar writing,” which cuts to the real issue here. The real person in “a perpetual state of perspiration” isn’t the character, but the scriptwriter. Reading the teleplay for an awful sitcom is a deadening experience in itself, but it’s even more depressing to realize that in most cases, the writer is falling back on a stereotype to cover up the desperate unfunniness of the writing. When Penn once asked if he could play a role without an accent, in order to “make it funny on the merits,” he was told that he couldn’t, probably because everybody else knew that the merits were nonexistent.

So why is one list harmless and the other one toxic? In part, it’s because we’ve caught them at different stages of evolution. The list of comedy conventions that we find acceptable is constantly being culled and refined, and certain art forms are slightly in advance of the others. Because of its cultural position, The New Yorker is particularly subject to outside pressures, as it learned a decade ago with its Obama terrorist cover—which demonstrated that there are jokes and images that aren’t acceptable even if the magazine’s attitude is clear. Turn back the clock, and Mankoff’s list would include conventions that probably wouldn’t fly today. Gerberg’s list, like Penn’s, includes “snake charmer,” which Mankoff omits, and he leaves out “Cowboys and Indians,” a cartoon perennial that seems to be disappearing. And it can be hard to reconstruct this history, because the offenders tend to be consigned to the memory hole. When you read a lot of old magazine fiction, as I do, you inevitably find racist stereotypes that would be utterly unthinkable today, but most of the stories in which they appear have long since been forgotten. (One exception, unfortunately, is the Sherlock Holmes short story “The Adventure of the Three Gables,” which opens with a horrifying racial caricature that most Holmes fans must wish didn’t exist.) If we don’t see such figures as often today, it isn’t necessarily because we’ve become more enlightened, but because we’ve collectively agreed to remove certain figures from the catalog of stock comedy characters, while papering over their use in the past. A list of clichés is a snapshot of a culture’s inner life, and we don’t always like what it says. The demeaning parts still offered to Penn and actors of similar backgrounds have survived for longer than they should have, but sitcoms that trade in such stereotypes will be unwatchable in a decade or two, if they haven’t already been consigned to oblivion.

Of course, most comedy writers aren’t thinking in terms of decades, but about getting through the next five minutes. And these stereotypes endure precisely because they’re seen as useful, in a shallow, short-term kind of way. There’s a reason why such caricatures are more visible in comedy than in drama: comedy is simply harder to write, but we always want more of it, so it’s inevitable that writers on a deadline will fall back on lazy conventions. The really insidious thing about these clichés is that they sort of work, at least to the extent of being approved by a producer without raising any red flags. Any laughter that they inspire is the equivalent of empty calories, but they persist because they fill a cynical need. As Penn points out, most writers wouldn’t bother with them at all if they could come up with something better. Stereotypes, like all clichés, are a kind of fallback option, a cheap trick that you deploy if you need a laugh and can’t think of another way to get one. Clichés can be a precious commodity, and all writers resort to them occasionally. They’re particularly valuable for gag cartoonists, who can’t rely on a good idea from last week to fill the blank space on the page—they’ve got to produce, and sometimes that means yet another variation on an old theme. But there’s a big difference between “Two guys in a horse costume” and “Gandhi lookalike.” Being able to make that distinction isn’t a matter of political correctness, but of craft. The real solution is to teach people to be better writers, so that they won’t even be tempted to resort to such tired solutions. This might seem like a daunting task, but in fact, it happens all the time. A cliché factory operates on the principle of supply and demand. And it shuts down as soon as people no longer find it funny.

Written by nevalalee

March 20, 2017 at 11:18 am

“Knowledge of Politics—Feeble”


Illustration by Sidney Paget for "The Five Orange Pips"

In A Study in Scarlet, the first Sherlock Holmes adventure, there’s a celebrated passage in which Watson tries to figure out his mystifying roommate. At this point in their relationship, he doesn’t even know what Holmes does for a living, and he’s bewildered by the gaps in his new friend’s knowledge, such as his ignorance of the Copernican model of the solar system. When Watson informs him that the earth goes around the sun, Holmes says: “Now that I do know it, I shall do my best to forget it.” He tells Watson that the human brain is like “a little empty attic,” and that it’s a mistake to assume that the room has elastic walls, concluding: “If we went round the moon it would not make a pennyworth of difference to me or to my work.” In fact, it’s clear that he’s gently pulling Watson’s leg: Holmes certainly shows plenty of practical astronomical knowledge in stories like “The Musgrave Ritual,” and he later refers casually to making “allowance for the personal equation, as the astronomers put it.” At the time, Watson wasn’t in on the joke, and he took it all at face value when he made his famous list of Holmes’s limitations. Knowledge of literature, philosophy, and astronomy was estimated as “nil,” while botany was “variable,” geology was “practical, but limited,” chemistry was “profound,” and anatomy—in an expression that I’ve always loved—was “accurate, but unsystematic.”

But the evaluation that has probably inspired the most commentary is “Knowledge of Politics—Feeble.” Ever since, commentators have striven mightily to reconcile this with their conception of Holmes, which usually means forcing him into the image of their own politics. In Sherlock Holmes: Fact or Fiction?, T.S. Blakeney observes that Holmes takes no interest, in “The Bruce-Partington Plans,” in “the news of a revolution, of a possible war, and of an impending change of government,” and he concludes:

It is hard to believe that Holmes, who had so close a grip on realities, could ever have taken much interest in the pettiness of party politics, nor could so strong an individualist have anything but contempt for the equalitarian ideals of much modern sociological theory.

S.C. Roberts, in “The Personality of Sherlock Holmes,” objected to the latter point, arguing that Holmes’s speech in “The Naval Treaty” on English boarding schools—“Capsules with hundreds of bright little seeds in each, out of which will spring the wiser, better England of the future”—is an expression of Victorian liberalism at its finest. Roberts writes:

It is perfectly true that the clash of political opinions and of political parties does not seem to have aroused great interest in Holmes’s mind. But, fundamentally, there can be no doubt that Holmes believed in democracy and progress.

Sidney Paget illustration of Mycroft Holmes

In reality, Holmes’s politics are far from a mystery. As the descendant of “country squires,” he rarely displayed anything less than a High Tory respect for the rights of landed gentry, and he remained loyal to the end to Queen Victoria, the “certain gracious lady in whose interests he had once been fortunate enough to carry out a small commission.” He was obviously an individualist in his personal habits, in the venerable tradition of British eccentrics, which doesn’t mean that his political views—as some have contended—were essentially libertarian. Holmes had a very low regard for the freedom of action of the average human being, and with good reason. The entire series was predicated on the notion that men and women are totally predictable, moving within their established courses so reliably that a trained detective can see into the past and forecast the future. As someone once noted, Holmes’s deductions are based on a chain of perfectly logical inferences that would have been spoiled by a single mistake on the part of the murderer. Holmes didn’t particularly want the world to change, because it was the familiar canvas on which he practiced his art. (His brother Mycroft, after all, was the British government.) The only individuals who break out of the pattern are criminals, and even then, it’s a temporary disruption. You could say that the entire mystery genre is inherently conservative: it’s all about the restoration of order, and in the case of Holmes, it means the order of a world, in Vincent Starrett’s words, “where it is always 1895.”

I love Sherlock Holmes, and in large part, it’s the nostalgia for that era—especially by those who never had to live with it or its consequences—that makes the stories so appealing. But it’s worth remembering what life was really like at the end of the nineteenth century for those who weren’t as fortunate. (Arthur Conan Doyle identified, incidentally, as a Liberal Unionist, a forgotten political party that was so muddled in its views that it inspired a joke in The Importance of Being Earnest: “What are your politics?” “Well, I am afraid I really have none. I am a Liberal Unionist.” And there’s no question that Conan Doyle believed wholeheartedly in the British Empire and all it represented.) Over the last few months, there have been times when I’ve thought approvingly of what Whitfield J. Bell says in “Holmes and History”:

Holmes’s knowledge of politics was anything but weak or partial. Of the hurly-burly of the machines, the petty trade for office and advantage, it is perhaps true that Holmes knew little. But of politics on the highest level, in the grand manner, particularly international politics, no one was better informed.

I can barely stand to look at a newspaper these days, so it’s tempting to take a page from Holmes and ignore “the petty trade for office and advantage.” And I often do. But deep down, I know that ignoring it implies an acceptance of the way things are now. And that seems a little feeble.

Mycroft Holmes and the mark of genius

with 3 comments

Sidney Paget illustration of Mycroft Holmes

“Original discoveries cannot be made casually, not by anyone at any time or anywhere,” the great biologist Edward O. Wilson writes in Letters to a Young Scientist. “The frontier of scientific knowledge, often referred to as the cutting edge, is reached with maps drawn by earlier scientists…Somewhere in these vast unexplored regions you should settle.” This seems like pretty good career advice for scientists and artists alike. But then Wilson makes a striking observation:

But, you may well ask, isn’t the cutting edge a place only for geniuses? No, fortunately. Work accomplished on the frontier defines genius, not just getting there. In fact, both accomplishments along the frontier and the final eureka moment are achieved more by entrepreneurship and hard work than by native intelligence. This is so much the case that in most fields most of the time, extreme brightness may be a detriment. It has occurred to me, after meeting so many successful researchers in so many disciplines, that the ideal scientist is smart only to an intermediate degree: bright enough to see what can be done but not so bright as to become bored doing it.

At first glance, this may not seem all that different from Martin A. Schwartz’s thoughts on the importance of stupidity, which I quoted here last week. In fact, they’re two separate observations—although they turn out to be related in one important respect. Schwartz is talking about “absolute stupidity,” or our collective ignorance in the face of the unknown, and he takes pains to distinguish it from the “relative stupidity” that differentiates students in the same college classes. And while Wilson isn’t talking about relative stupidity here, exactly, he’s certainly discussing relative intelligence, or the idea that the best scientists might be just a little bit less bright than their smartest peers in school. As he goes on to observe:    

What, then, of certified geniuses whose IQs exceed 140, and are as high as 180 or more? Aren’t they the ones who produce the new groundbreaking ideas? I’m sure some do very well in science, but let me suggest that perhaps, instead, many of the IQ-brightest join societies like Mensa and work as auditors and tax consultants. Why should the rule of optimum medium brightness hold? (And I admit this perception of mine is only speculative.) One reason could be that IQ geniuses have it too easy in their early training. They don’t have to sweat the science courses they take in college. They find little reward in the necessarily tedious chores of data-gathering and analysis. They choose not to take the hard roads to the frontier, over which the rest of us, the lesser intellectual toilers, must travel.

Marilyn vos Savant

In other words, the real geniuses are reluctant to take on the voluntary stupidity that science demands, and they’re more likely to find sources of satisfaction that don’t require them to constantly confront their own ignorance. This is a vast generalization, of course, but it seems to square with experience. I’ve met a number of geniuses, and what many of them have in common is a highly pragmatic determination to make life as pleasant for themselves as possible. Any other decision, in fact, would call their genius into doubt. If you can rely unthinkingly on your natural intelligence to succeed in a socially acceptable profession, or to minimize the amount of work you have to do at all, you don’t have to be a genius to see that this is a pretty good deal. The fact that Marilyn vos Savant—who allegedly had the highest tested intelligence ever recorded—became a columnist for Parade might be taken as a knock against her genius, but really, it’s the most convincing proof of it that I can imagine. The world’s smartest person should be more than happy to take a cushy gig at a Sunday supplement magazine. Most of the very bright endure their share of miseries during childhood, and their reward, rather than more misery, might as well be an adult life that provides intellectual stimulation in emotional safety. This is why I’ve always felt that Mycroft Holmes, Sherlock’s smarter older brother, knew exactly how his genius ought to be used. As Sherlock notes drily in “The Adventure of the Bruce-Partington Plans”: “Mycroft draws four hundred and fifty pounds a year, remains a subordinate, has no ambitions of any kind, will receive neither honor nor title, but remains the most indispensable man in the country.” 

Yet it’s Sherlock, who was forced to leave the house to find answers to his problems, whom we love more. (He’s also been held up as an exemplar of the perfect scientist.) Mycroft is hampered by both his physical laziness and his mental quickness: when a minister comes to him with a complicated problem involving “the Navy, India, Canada, and the bimetallic question,” Mycroft can provide the answer “offhand,” which doesn’t give him much of an incentive to ever leave his office or the Diogenes Club. As Holmes puts it in “The Greek Interpreter”:

You wonder…why it is that Mycroft does not use his powers for detective work. He is incapable of it…I said that he was my superior in observation and deduction. If the art of the detective began and ended in reasoning from an armchair, my brother would be the greatest criminal agent that ever lived. But he has no ambition and no energy. He will not even go out of his way to verify his own solution, and would rather be considered wrong than take the trouble to prove himself right.

Mycroft wasn’t wrong, either. He seems to have lived a very comfortable life. But it’s revealing that Conan Doyle gave the real adventures to the brother with the slightly less scintillating intelligence. In art, just as in science, technical facility can prevent certain artists from making real discoveries. The ones who have to work at it are more likely to find something real. But we can also raise a glass to Mycroft, Marilyn, and the geniuses who are smart enough not to make it too hard on themselves.

Solzhenitsyn’s rosary

with 2 comments

Aleksandr Solzhenitsyn

When Aleksandr Solzhenitsyn was imprisoned in the Soviet gulag, he was forced to deal with a challenge that modern writers rarely have to confront—the problem of memorization. He wanted to keep writing, but was unable to put anything on paper, which would be confiscated and read by the guards. Here’s the solution that he found, as described in The Gulag Archipelago:

I started breaking matches into little pieces and arranging them on my cigarette case in two rows (of ten each, one representing units and the other tens). As I recited the verses to myself, I displaced one bit of broken match from the units row for every line. When I shifted ten units I displaced one of the “tens”…Every fiftieth and every hundredth line I memorized with special care, to help me keep count. Once a month I recited all that I had written. If the wrong line came out in place of one of the hundreds and fifties, I went over it all again and again until I caught the slippery fugitives.

In the Kuibyshev Transit Prison I saw Catholics (Lithuanians) busy making themselves rosaries for prison use…I joined them and said that I, too, wanted to say my prayers with a rosary but that in my particular religion I needed a hundred beads in a ring…that every tenth bead must be cubic, not spherical, and that the fiftieth and the hundredth beads must be distinguishable at a touch.

The Lithuanians were impressed, Solzhenitsyn says, by his “religious zeal,” and they agreed to make a rosary to his specifications, fashioning the beads out of pellets of bread and coloring them with burnt rubber, tooth powder, and disinfectant. (Later, when Solzhenitsyn realized that twenty beads were enough, he made them himself out of cork.) He concludes:

I never afterward parted with the marvelous present of theirs; I fingered and counted my beads inside my wide mittens—at work line-up, on the march to and fro from work, at all waiting times; I could do it standing up, and freezing cold was no hindrance. I carried it safely through the search points, in the padding of my mittens, where it could not be felt. The warders found it on various occasions, but supposed that it was for praying and let me keep it. Until the end of my sentence (by which time I had accumulated 12,000 lines) and after that in my places of banishment, this necklace helped me write and remember.

Ever since I first read this story, I’ve been fascinated by it, and I’ve occasionally found myself browsing the rosaries or prayer beads for sale online, wondering if I should get one for myself, just in case—although in case of what, exactly, I don’t know.
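As an aside—and this is purely my own gloss, not anything in Solzhenitsyn’s account beyond the passage above—the match rows amount to a two-digit decimal counter with a checkpoint every fiftieth and hundredth line. A minimal sketch in Python, with all of the names invented for illustration:

class MatchCounter:
    """Models the two rows of broken matches: one for units, one for tens."""

    def __init__(self):
        self.units = 0   # matches displaced in the units row (0-9)
        self.tens = 0    # matches displaced in the tens row (0-9)
        self.total = 0   # lines composed so far

    def compose_line(self):
        """Record one memorized line; return a note if it falls on a checkpoint."""
        self.total += 1
        self.units += 1
        if self.units == 10:          # ten units shift one match in the tens row
            self.units = 0
            self.tens = (self.tens + 1) % 10
        if self.total % 100 == 0:
            return "hundredth line: memorize with special care"
        if self.total % 50 == 0:
            return "fiftieth line: memorize with special care"
        return None

counter = MatchCounter()
for line in range(1, 151):
    note = counter.compose_line()
    if note:
        print(line, note)   # checkpoints fall at lines 50, 100, and 150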

Joan Didion

But you don’t need to be in prison to understand the importance of memorization. One of the side effects of our written and interconnected culture is that we’ve lost the ability to hold information in our heads, a trend that has only accelerated as we’ve outsourced more of our inner lives to the Internet. This isn’t necessarily a bad thing: there are good reasons for keeping a lot of this material where it can be easily referenced, without feeling the need to remember it all. (As Sherlock Holmes said in A Study in Scarlet: “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose…It is a mistake to think that that little room has elastic walls and can distend to any extent.” Although given the amount of obscure information that Holmes was able to produce in subsequent stories, it’s possible that he was just kidding.) But there’s also a real loss involved. Oral cultures are marked by a highly developed verbal memory, especially for those whose livelihoods depend on it: a working poet could be expected to know hundreds of songs by heart, and the conventions of poetry itself emerged, in part, as a set of mnemonic devices. Meter, rhyme, and conventional formulas allowed many lines of verse to be recited for a paying audience—or improvised on the spot. Like the songlines of the Aboriginal Australians, an oral poem is a vehicle for the preservation of information, and it takes advantage of the human brain’s ability to retain material in a pattern that hints at what comes next. When we neglect this, we lose touch with some of the reasons that poetry evolved in the first place.

And what makes memorization particularly valuable as a creative tool is the fact that it isn’t quite perfect. When you write something down, it tends to become fixed, both physically and psychologically. (Joan Didion must have had something like this in mind when she said: “By the time you’ve laid down the first two sentences, your options are all gone.”) An idea in the brain, by contrast, remains fluid, malleable, and vital. Each time you go back to revisit it, whether using a rosary or some other indexical system, you aren’t just remembering it, but to some extent recreating it, and you’ll never get it exactly right. But just as natural selection exists because of the variations that arise from errors of transcription, a creative method that relies on memory is less accurate but more receptive to happy accidents than one that exists on the page. A line of poetry might change slightly each time we call it up, but the core idea remains, and the words that survive from one iteration to the next have persisted, by definition, because they’re memorable. We find ourselves revising and reworking the result because we have no choice, and in the process, we keep it alive. The danger, of course, is that if we don’t keep notes, any ideas we have are likely to float away without ever being realized—a phenomenon that every writer regards with dread. What we need is a structure that allows us to assign an order to the ideas in our head while preserving their ripe state of unwrittenness. Solzhenitsyn’s rosary, which was forced on him by necessity, was one possible answer, but there are others. Tomorrow, I’ll discuss another method that I’ve been using with excellent results, and which relies on a priceless mnemonic tool that we all take for granted: the alphabet.

Written by nevalalee

July 11, 2016 at 9:06 am

The monster in the writers room

leave a comment »

Mads Mikkelsen on Hannibal

Note: Spoilers follow for the season finale of Hannibal.

When it comes to making predictions about television shows, my track record is decidedly mixed. I was long convinced, for instance, that Game of Thrones would figure out a way to keep Oberyn Martell around, just because he was such fun to watch, and to say I was wrong about this is something of an understatement. Let the record show, however, that I said here months ago that the third season of Hannibal would end with Will Graham getting a knife through his face:

In The Silence of the Lambs, Crawford says that Graham’s face “looks like damned Picasso drew it.” None of the prior cinematic versions of this story have dared to follow through on this climax, but I have a feeling, given the evidence, that Fuller would embrace it. Taking Hugh Dancy’s face away, or making it hard to look at, would be the ultimate rupture between the series and its viewers. Given the show’s cancellation, it may well end up being the very last thing we see. It would be a grim note on which to end. But it’s nothing that this series hasn’t taught us to expect.

This wasn’t the hardest prediction in the world to make. One of the most distinctive aspects of Bryan Fuller’s take on the Lecter saga is his willingness to pursue elements of the original novels that other adaptations have avoided, and the denouement of Red Dragon—with Will lying alone, disfigured, and mute in the hospital—is a downer ending that no other version of this story has been willing to touch.

Of course, that wasn’t what we got here, either. Instead of Will in his hospital bed, brooding silently on the indifference of the natural world to murder, we got a hysterical ballet of death, with Will and Hannibal teaming up to dispatch Dolarhyde like the water buffalo at the end of Apocalypse Now, followed by an operatic plunge over the edge of a cliff, with our two star-crossed lovers locked literally in each other’s arms. And it was a worthy finale for a series that has seemed increasingly indifferent to anything but that unholy love story. The details of Lecter’s escape from prison are wildly implausible, and whatever plan they reflect is hilariously undercooked, even for someone like Jack Crawford, who increasingly seems like the world’s worst FBI agent in charge. Hannibal has never been particularly interested in its procedural elements, and its final season took that contempt to a ludicrous extreme. In the novel Red Dragon, Will, despite his demons, is a competent, inspired investigator, and he’s on the verge of apprehending Dolarhyde through his own smarts when his quarry turns the tables. In Fuller’s version, unless I missed something along the way, Will doesn’t make a single useful deduction or take any meaningful action that isn’t the result of being manipulated by Hannibal or Jack. He’s a puppet, and dangerously close to what TV Tropes has called a Woobie: a character whom we enjoy seeing tortured so we can wish the pain away.

Hugh Dancy on Hannibal

None of this should be taken as a criticism of the show itself, in which any narrative shortcomings can hardly be separated from Fuller’s conscious decisions. But as enjoyable as the series has always been—and I’ve enjoyed it more than any network drama I’ve seen in at least a decade—it’s something less than an honest reckoning with its material. As a rule of thumb, the stories about Lecter, including Harris’s own novels, have been the most successful when they stick most closely to their roots as police procedurals. Harris started his career as a crime reporter, and his first three books, including Black Sunday, are masterpieces of the slow accumulation of convincing detail, spiced and enriched by a layer of gothic violence. When you remove that foundation of realistic suspense, you end up with a character who is dangerously uncontrollable: it’s Lecter, not Harris, who becomes the author of his own novel. In The New Annotated Dracula, Leslie S. Klinger proposes a joke theory that the real author of that book is Dracula himself, who tracked down Bram Stoker and forced him to make certain changes to conceal the fact that he was alive and well and living in Transylvania. It’s an “explanation” that rings equally true of the novels Hannibal and Hannibal Rising, which read suspiciously as if Lecter were dictating elements of his own idealized autobiography to Harris. (As far as I know, nobody has seen or heard from Harris since Hannibal Rising came out almost a decade ago. Are we sure he’s all right?)

And there are times when Hannibal, the show, plays as if Lecter had gotten an executive producer credit sometime between the second and third seasons. If anything, this is a testament to his vividness: when properly acted and written, he dominates his stories to a greater extent than any fictional character since Sherlock Holmes. (In fact, the literary agent hypothesis—in which the credited writer of a series is alleged to be simply serving as a front—originated among fans of Conan Doyle, who often seemed bewildered by the secondary lives his characters assumed.) But there’s something unsettling about how Lecter inevitably takes on the role of a hero. My favorite stretch of Hannibal was the back half of the second season, which looked unflinchingly at Lecter’s true nature as a villain, cannibal, and destroyer of lives. When he left the entire supporting cast to bleed slowly to death at the end of “Mizumono,” it seemed impossible to regard him as an appealing figure ever again. And yet here we are, with an ending that came across as the ultimate act of fan service in a show that has never been shy about appealing to its dwindling circle of devotees. I can’t exactly blame it for this, especially because the slow dance of seduction between Will and Hannibal has always been a source of sick, irresistible fascination. But we’re as far as ever from an adaptation that would force us to honestly confront why we’re so attached to a man who eats other people, or why we root for him to triumph over lesser monsters who make the mistake of not being so rich, cultured, or amusing. Lecter came into this season like a lion, but he went out, as always, like a lamb.

“Their journey so far had been uneventful…”

leave a comment »

"The overnight train from Paris to Munich..."

Note: This post is the twenty-ninth installment in my author’s commentary for Eternal Empire, covering Chapter 28. You can read the previous installments here.

In evolutionary theory, there’s a concept known as exaptation, in which a trait that evolved because it met a particular need turns out to be just as useful for something else. Feathers, for instance, originally provided a means of regulating body temperature, but they ended up being crucial in the development of flight, and in other cases, a trait that played a secondary or supporting role to another adaptation becomes important enough to serve an unrelated purpose of its own. We see much the same process at work in genre fiction, which is subject to selective pressures from authors, editors, and especially readers. The genres we see today, like suspense or romance, might seem inevitable, but their conventions are really just a set of the recipes or tricks that worked. Such innovations are rarely introduced as a conscious attempt to define a new category of fiction, but as solutions to the problems that a specific narrative presents. The elements we see in Jane Eyre—the isolated house, the orphaned heroine, the employer with a mysterious past—arose from Charlotte Brontë’s confrontation with that particular story, but they worked so well that they were appropriated by a cohort of other writers, working in the now defunct genre of the gothic romance. And I suspect that Brontë would be as surprised as anyone by the uses to which her ideas have been put.

It’s rare for a genre to emerge, as gothic romance did, from a single book; more often, it’s the result of small shifts in a wide range of titles, with each book accidentally providing a useful tool that is picked up and used by others. Repeat the process for a generation or two, and you end up with a set of conventions to which later writers will repeatedly return. And as with other forms of natural selection, a secondary adaptation, introduced to enable something else, can evolve to take over the whole genre. The figure of the detective or private eye is a good example. When you look at the earliest works of mystery fiction we have, from Bleak House to The Moonstone, you often find that the detective plays a minor role: he pops up toward the middle of the story, he nudges the plot along when necessary, and he defers whenever possible to the other characters. Even in A Study in Scarlet, Holmes is only one character among many, and the book drops him entirely in favor of a long flashback about the Mormons. Ultimately, though, the detective—whose initial role was purely functional—evolved to become the central attraction, with the romantic leads who were the focus of attention in Dickens or Collins reduced to the interchangeable supporting players of an Agatha Christie novel. The detective was originally just a way of feathering the story; in the end, he was what allowed the genre to take flight.

"Their journey so far had been uneventful..."

You see something similar in suspense’s obsession with modes of transportation. One of the first great attractions of escapist spy fiction lay in the range of locations it presented: it allowed readers to vicariously travel to various exotic locales. (This hasn’t changed, either: the latest Mission: Impossible movie takes us to Belarus, Cuba, Virginia, Paris, Vienna, Casablanca, and London.) The planes, trains, and automobiles that fill such novels were meant simply to get the characters from place to place. Over time, though, they became set pieces in their own right. I’ve noted elsewhere that what we call an airport novel was literally a story set largely in airports, as characters flew from one exciting setting to another, and you could compile an entire anthology of thriller scenes set on trains or planes. At first, they were little more than connective tissue—you had to show the characters going from point A to point B, and the story couldn’t always cut straight from Lisbon to Marrakesh—but these interstitial scenes ultimately evolved into a point of interest in themselves. They also play a useful structural role. Every narrative requires a few pauses or transitions to gather itself between plot points, and staging such scenes on an interesting form of transport makes it seem as if the story is advancing, even if it’s taking a breather.

In Eternal Empire, for instance, there’s an entire chapter focusing on Ilya and his minder Bogdan as they take the Cassiopeia railway from Paris to Munich. There’s no particular reason it needs to exist at all, and although it contains some meaningful tidbits of backstory, I could have introduced this material in any number of other ways. But I wanted to write a train scene, in part as an homage to the genre, in part because it seemed unrealistic to leave Ilya’s fugitive journey undescribed, and in part because it gave me the setting I needed. There’s a hint of subterfuge, with my two travelers moving from one train station to another under false passports, and a complication in the fact that neither can bring a gun onboard, leaving them both unarmed. Really, though, it’s a scene about two men sizing each other up, and thrillers have long since learned that a train is the best place for such conversations, which is why characters always seem to be coming and going at railway stations. (In the show Hannibal, Will and Chiyo spend most of an episode on an overnight train to Florence, although they easily could have flown. It ends with Chiyo shoving Will onto the tracks, but I suspect that it’s really there to give them a chance to talk, which wouldn’t play as well on a plane.) Ilya and Bogdan have a lot to talk about. And when they get to their destination, they’ll have even more to say…

Bert’s nose and the limits of memory

with one comment

Bert and Ernie on Sesame Street?

A few days ago, I was leafing through a Sesame Street coloring book with my daughter when I was hit by a startling realization: I couldn’t remember the color of Bert’s nose. I’ve watched Bert and Ernie for what has to be hundreds of hours—much of it in the last six months—and I know more about them than I do about most characters in novels. But for the life of me, I couldn’t remember what color Bert’s nose was, and I was on the point of looking up a picture in The Sesame Street Dictionary when it finally came to me. As I continued to page through the coloring book, though, I found that I had trouble recalling a lot of little details. Big Bird’s legs, for instance, are orange cylinders segmented by thin contour lines, but what color are those lines? What about Elmo’s nose? Or the stripes on Bert and Ernie’s shirts? In the end, I repeatedly found myself going online to check. And while the last thing I want is to set down rules for what crayons my daughter can and can’t use when coloring her favorite characters, as a writer, and particularly one for whom observation and accuracy of description have always been important, I was secretly chagrined.

They aren’t isolated cases, either. My memory, like everyone else’s, has areas of greater and lesser precision: I have an encyclopedic recall of movie release dates, but have trouble putting a name to a face until I’ve met a person a couple of times. Like most of us, I remember images as chunks of information, and when I try to drill down to recall particular details, I feel like Watson in his exchange with Holmes in “A Scandal in Bohemia”:

“For example, you have frequently seen the steps which lead up from the hall to this room.”
“Frequently.”
“How often?”
“Well, some hundreds of times.”
“Then how many are there?”
“How many? I don’t know.”
“Quite so! You have not observed. And yet you have seen. That is just my point. Now, I know that there are seventeen steps, because I have both seen and observed.”

And I find it somewhat peculiar—and I’m not alone here—that I was able to remember and locate this quote without any effort, while I still couldn’t tell you the number of steps that lead to the front porch of my own house.

Illustration by Sidney Paget for "A Scandal in Bohemia"

Of course, none of this is particularly surprising, if we’ve thought at all about how our own memories work. A mental image is really more of an impression that disappears like a mirage as soon as we try to get any closer, and it’s particularly true of the objects we take most for granted. When we think of our own pasts, it’s the exceptional moments that we remember, while the details of everyday routine seem to evaporate without a trace: I recall all kinds of things about my trip to Peru, but I can barely remember what my average day was like before my daughter was born. This kind of selective amnesia is so common that it doesn’t even seem worth mentioning. But it raises a legitimate question of whether this represents a handicap for a writer, or even disqualifies us from doing interesting work. In a letter to the novelist James Jones, the editor Maxwell Perkins once wrote:

I remember reading somewhere what I thought was a very true statement to the effect that anybody could find out if he was a writer. If he were a writer, when he tried to write out of some particular day, he found that he could recall exactly how the light fell and how the temperature felt, and all the quality of it. Most people cannot do it. If they can do it, they may never be successful in a pecuniary sense, but that ability is at the bottom of writing, I am sure.

For those of us who probably wouldn’t notice if someone quietly switched our toothbrushes, as Faye Wong does to Tony Leung in Chungking Express, this may seem disheartening. But I’d like to believe that memory and observation can be cultivated, like any writing skill, or that we can at least learn how to compensate for our own weaknesses. Some writers, like Nabokov or Updike, were born as monsters of noticing, but for the rest of us, some combination of good notes, close attention to the techniques of the writers we admire, and the directed observation required to solve particular narrative problems can go a long way toward making up the difference. (I emphasize specific problems because it’s more useful, in the long run, to figure out how to describe something within the context of a story than to work on self-contained writing exercises.) Revision, too, can work wonders: a full page of description distilled to a short paragraph, leaving only its essentials, can feel wonderfully packed and evocative. Our memories are selective for a reason: if we remembered everything, we’d have trouble knowing what was important. It’s better, perhaps, to muddle through as best as we can, turning on that novelistic degree of perception only when it counts—or, more accurately, when our intuition tells us that it counts. And when it really matters, we can always go back and verify that Bert’s nose, in fact, is orange.

Written by nevalalee

July 21, 2015 at 9:26 am

The prop master

with 2 comments

Edward Fox in The Day of the Jackal

When we break down the stories we love into their constituent parts, we’re likely to remember the characters first. Yet the inanimate objects—or what a theater professional would call the props—are what feather that imaginary nest, providing a backdrop for the narrative and necessary focal points for the action. A prop can be so striking that it practically deserves costar status, like the rifle in The Day of the Jackal, or a modest but unforgettable grace note, like the cake of soap that Leopold Bloom carries in his pocket for much of Ulysses. It can be the MacGuffin that drives the entire plot or the lever that enables a single crucial moment, like the necklace that tips off Scotty at the end of Vertigo. Thrillers and other genre novels often use props to help us tell flat characters apart, so that an eyepatch or a pocket square is all that distinguishes a minor player, but this kind of cheap shorthand can also shade into the highest level of all, in which an accessory like Sherlock Holmes’s pipe summons up an entire world of romance and emotion. And even if the props merely serve utilitarian ends, they’re still an aspect of fiction that writers could do well to study, since they can provide a path into a story or a solution to a problem that resists all other approaches.

They can also be useful at multiple stages. I’ve known for a long time that a list of props, like lists of any kind, can be an invaluable starting point for planning a story. The most eloquent expression of this I’ve ever found appears, unexpectedly, in Shamus Culhane’s nifty book Animation: From Script to Screen:

One good method of developing a story is to make a list of details. For example [for a cartoon about elves as clock cleaners in a cathedral], what architectural features come to mind—steeples, bells, windows, gargoyles? What props would the elves use—brushes, pails, mops, sponges…what else? Keep on compiling lists without stopping to think about them. Let your mind flow effortlessly, and don’t try to be neat or orderly. Scribble as fast as you can until you run out of ideas.

A list of props can be particularly useful when a story takes place within a closed universe with a finite number of possible combinations. Any good bottle episode invests much of its energy into figuring out surprising ways to utilize the set of props at hand, and I used an existing catalog of props—in the form of the items available for purchase from the commissary at Belmarsh Prison—to figure out a tricky plot point in Eternal Empire.

Kim Novak in Vertigo

What I’ve discovered more recently is that a list of props also has its uses toward the end of the creative process, when a short story or novel is nearly complete. If I have a decent draft that somehow lacks overall cohesiveness, I’ll go through and systematically make a list of all the props or objects that appear over the course of the story. Whenever I find a place where a prop that appears in one chapter can be reused down the line, it binds events together that much more tightly. When we’re writing a first draft, we have so much else on our minds that we tend to forget about object permanence: a prop is introduced when necessary and discarded at once. Giving some thought to how those objects can persist makes the physical space of the narrative more credible, and there’s often something almost musically satisfying when a prop unexpectedly reappears. (One of my favorite examples occurs in Wong Kar-Wai’s Chungking Express. During the sequence in which Faye Wong breaks into Tony Leung’s apartment to surreptitiously rearrange and replace some of his possessions, she gives him a new pair of sandals, throwing the old pair behind the couch. Much later, after she floods his living room by mistake, one of the old sandals comes floating out from its hiding place. It only appears onscreen for a moment, and nobody even mentions it, but it’s an image I’ve always treasured.)
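If it helps to see the bookkeeping spelled out, here is a hypothetical sketch of the sort of index I have in mind—the function, the sample chapters, and the prop names are all invented for illustration, not drawn from any actual manuscript. It simply reports which chapters mention each prop, so that anything appearing only once stands out as a candidate for reuse:

from collections import defaultdict

def prop_index(chapters, props):
    """Map each prop to the (1-based) chapter numbers that mention it."""
    index = defaultdict(list)
    for number, text in enumerate(chapters, start=1):
        lowered = text.lower()
        for prop in props:
            if prop.lower() in lowered:
                index[prop].append(number)
    return index

chapters = [
    "She slipped the necklace into her coat and left the sandals by the door.",
    "The rifle case sat unopened in the trunk of the car.",
    "One sandal came floating out from behind the couch.",
]
props = ["necklace", "sandal", "rifle"]

for prop, where in prop_index(chapters, props).items():
    status = "reused" if len(where) > 1 else "appears once, candidate for reuse"
    print(prop, where, status)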

And in many cases, the props themselves aren’t even the point. I’ve said before that one of the hardest things in writing isn’t inventing new material but fully utilizing what you already have. Nine times out of ten, when you’re stuck on a story problem, you’ll find that the solution is already there, buried between the lines on a page you wrote months before. The hard part is seeing past your memories of it. A list of props, assembled as drily as if you were a claims adjuster examining a property, can provide a lens through which the overfamiliar can become new. (This may be why histories of the world in a hundred objects, or whatever, are so popular: they give us a fresh angle on old events by presenting them through props, not personalities.) When you look at it more closely, a list of props is really a list of actions, or moments in which a character expresses himself by performing a specific physical activity. Unless you’re just giving us an inventory of a room’s contents, as Donna Tartt loves to do, a prop usually appears only when it’s being used for something. Props thus represent the point in space where intention becomes action, expressed in visual or tactile terms—which is exactly what a writer should always be striving to accomplish. And a list of props is nothing less than a list of the moments when the story is working more or less as it should.

“Her face was that of a woman with secrets…”

leave a comment »

"She had never considered herself particularly Indian..."

Note: This post is the thirteenth installment in my author’s commentary for Eternal Empire, covering Chapter 14. You can read the previous installments here.

Of all the misconceptions that frustrate aspiring writers, one of the most insidious involves the distinction between flat and round characters. As formulated by E.M. Forster in Aspects of the Novel, a flat character is one that expresses a single, unchanging idea or quality, while a round character has the ability to change or surprise us. One certainly sounds better than the other, and as a result, you’ll often find writers fretting over the fact that one character or another in their stories is flat, or wondering how to construct a suitably round character from scratch, as if it were a matter of submitting the proper design specifications. What all this misses is the fact that Forster’s original categories were descriptive, not prescriptive, and a round character isn’t inherently more desirable than a flat one: as with everything else in writing, it depends on execution and the role a particular character plays in the narrative as a whole. It’s true that Forster concludes by saying: “We must admit that flat people are not in themselves as big achievements as round ones.” But he also prefaces this with three full pages of reasons why flat characters can be useful—or essential—in even the greatest of novels.

So why should we ever prefer a flat character over a round? Forster notes that flat characters often linger in the memory more vividly after the novel is over; they can be brought onstage in full force, rather than being slowly developed; and they’re easily recognizable, which can serve as an organizing principle in a complicated story. (He even says that Russian novels could use more of them.) In the work of writers like Dickens, who gives us pretty much nothing but flat characters, or Proust, who uses almost as many, their interest arises from their interactions with one another and the events of the plot: “He is the idea, and such life as he possesses radiates from its edges and from the scintillations it strikes when other elements in the novel impinge.” If Forster had lived a little later, he might have also mentioned Thomas Pynchon, whose works are populated by caricatures and cartoons whose flatness becomes a kind of strategy for managing the novel’s complexity. Flat characters have their limitations; they’re more appealing when comic than tragic, and they work best when they set off a round character at the center. But most good novels, as Forster observes, contain a mixture of the two: “A novel that is at all complex often requires flat people as well as round, and the outcome of their collisions parallels life more accurately.”

"Her face was that of a woman with secrets..."

And a memorable flat character requires as much work and imagination as one seen in the round. A bad, unconvincing character is sometimes described as “flat,” but the problem isn’t flatness in itself—it’s the lack of energy or ingenuity devoted to rendering that one vivid quality, or the author’s failure to recognize when one or another category of character is required. A bad flat character can be unbearable, but a bad round character tends to dissolve into a big pile of nothing, an empty collection of notions without anything to hold it together, as we see in so much literary fiction. The great ideal is a round, compelling character, but in order to surprise the reader, he or she has to surprise the writer first. And in practice, what this usually means is that a character who was introduced to fill a particular role gradually begins to take on other qualities, not through some kind of magic, but simply as the part is extended through multiple incidents and situations. Sherlock Holmes is fairly flat as first introduced in A Study in Scarlet: he’s extraordinarily memorable, but also the expression of a single idea. It’s only when the element of time is introduced, in the form of a series of stories, that he acquires an inner life. Not every flat character evolves into roundness, but when one does, the result is often more interesting than if it were conceived that way from the ground up.

My own novels contain plenty of flat characters, mostly to fill a necessary function or story point, but the one who turned into something more is Maya Asthana. She began, as most flat characters do, purely as a matter of convenience. Wolfe needed to talk to somebody, so I gave her a friend, and most of her qualities were chosen to make her marginally more vivid in what I thought would be her limited time onstage: I made her South Asian, which was an idea left over from an early conception of Wolfe herself, and I decided that she’d be planning her wedding, since this would provide her with a few easy bits of business that could be introduced without much trouble. But as I’ve mentioned elsewhere, Asthana got caught up in a radical shift in the logic of the novel itself: I needed a mole and a traitor within the agency, and after my original plan turned out to be unworkable, I cast around for someone else to fill that role. Asthana happened to be handy. And by turning her into a villain without changing a word of her initial presentation in City of Exiles, I got something far more intriguing than if I’d had this in mind from the beginning. Chapter 14 of Eternal Empire represents our first extended look at Asthana from the inside, and I like how the characteristics she acquired before I knew her true nature—her vanity, her intelligence, her perfect life with her fiancé—vibrate against what she became. Not every character turns out this way; these novels are filled with minor players content to occupy their roles. But Asthana, lucky for me and unlucky for everyone else, wanted to be more…

How the Vulcan got his ears

leave a comment »

Leonard Nimoy in Star Trek II: The Wrath of Khan

When the writer and director Nicholas Meyer was first approached about the possibility of working on the sequel to Star Trek: The Motion Picture, his initial response was: “Star Trek? Is that the one with the guy with the pointy ears?” Meyer, who tells this story in his engaging memoir The View from the Bridge, went on to cleverly stage the opening scene of Wrath of Khan—which is probably the one movie, aside from Vertigo, that I’ve discussed more often on this blog than any other—so that those ears are literally the first thing we see, in a shot of a viewscreen taken from over Spock’s shoulder. Elsewhere, I’ve spoken at length about how Meyer’s detachment from the source material resulted in by far the best movie in the franchise, and one of the most entertaining movies I’ve ever seen: because he wasn’t beholden to the original series, he was free to stock it with things he liked, from the Horatio Hornblower novels to A Tale of Two Cities. But it’s revealing that he latched onto those ears first. As the reaction to Leonard Nimoy’s death last week amply proved, Spock was the keystone and entry point to that entire universe, and our love for him and what he represented had as much to do with his ears as with what was going on in the brain between them.

These days, Spock’s ears are so iconic that it can be hard to recognize how odd they once seemed. Spock was one of the few elements to survive from the original series pilot “The Cage,” and even at the time, the network was a little perturbed: it raised concerns over his allegedly satanic appearance, which executives feared “would scare the shit out of every kid in America.” (They would have cared even less for Gene Roddenberry’s earliest conceptions, in which Spock was described as having “a slightly reddish complexion.”) Accordingly, the first round of publicity photos for the show was airbrushed to give him normal ears and eyebrows. In any event, of course, Spock didn’t scare kids, or ordinary viewers—he fascinated them. And those ears were a large part of his appeal. As Meyer intuitively understood, they were a fantastic piece of narrative shorthand, a signal to anyone flipping through channels that something interesting was happening onscreen. Spock’s ears said as much about the show’s world and intentions as Kirk’s opening voiceover, and they did so without a word of dialogue.

Star Trek II: The Wrath of Khan

Yet they wouldn’t have been nearly as effective if they hadn’t served as the visual introduction to a character who revealed greater depths the moment he began to speak. Spock was ostensibly a creature of pure logic, but he was much more, as Roger Ebert noted in his original review of Wrath of Khan:

The peculiar thing about Spock is that, being half human and half Vulcan and therefore possessing about half the usual quota of human emotions, he consistently, if dispassionately, behaves as if he possessed very heroic human emotions indeed. He makes a choice in Star Trek II that would be made only by a hero, a fool, or a Vulcan.

And while Robert Anton Wilson once claimed, with a straight face, that Spock was an archetypal reincarnation of the Aztec god Mescalito, whose pointed ears also appear on Peter Pan and the Irish leprechaun, his truest predecessor is as close as Victorian London. Meyer—whose breakthrough as a novelist was The Seven-Per-Cent Solution—was the first to explicitly make the connection between Spock and Sherlock Holmes, whom Spock obliquely calls “an ancestor of mine” in The Undiscovered Country. Both were perfect reasoning machines, but they used logic to amplify, rather than undercut, their underlying qualities of humanity. “A great heart,” as Watson says, “as well as…a great brain.”

There’s a lesson here for storytellers of all kinds, and like most such examples, it’s easy to explain and all but impossible to replicate. Spock began as a visual conceit that could be grasped at once, deepened over time into a character whose basic qualities were immediately comprehensible and intriguing, and then became much more, aided in no small part by a magnificent performance by Nimoy. The autism advocate Temple Grandin has spoken of how much of herself she saw in Spock, a logical being trying to make his way in a world of more emotional creatures, and there’s no question that many Star Trek fans felt the same way. Spock, at least, carried his difference openly, and those who wear Starfleet pins on their lapels or don pointed ears at conventions are quietly allying themselves with that sense of otherness—which turns, paradoxically, into a sense of identity. “Of all the souls I have encountered in my travels, his was the most human,” Kirk says at the end of Wrath of Khan, and what feels like a contradiction gets at something more profound. Humanity, whether in reality or in fiction, is something we have to earn with every choice we make. Spock’s journey as a character was so compelling that it arguably saved Star Trek three times over, and neither the franchise nor science fiction as we know it would be the same if we hadn’t heard the story through his ears.

“A freezing horror took hold of him…”

leave a comment »

"The copilot shook his head..."

Note: This post is the forty-fifth installment in my author’s commentary for City of Exiles, covering Chapter 44. You can read the earlier installments here.

I’ve always been fascinated by horror fiction, but I’ve rarely drawn on its conventions for my own work. A few of my short stories—notably “The Boneless One” and “Cryptids”—employ horror tropes, and “Kawataro” is essentially an extended homage to the genre. In my novels, though, there’s little if any trace of it. Part of this is due to the fact that I’ve ended up working in a category that doesn’t accommodate itself easily to that style: suspense fiction, at least of the international kind that I write, operates within a narrow tonal range, with heightened events and purposeful violence described with clinical precision. This air of constraint is both the genre’s limitation and its greatest strength, but it also means that horror sits within it uncomfortably. At its best, horror fiction comes down to variations of tone, with everyday mundanity disrupted by unknown terrors, and a writer like Stephen King is so good at conveying the ordinary that the horror itself can seem less interesting by comparison. (Writers in whom the tone is steeped in dread from the beginning have trouble playing these changes: I love H.P. Lovecraft, for instance, but I can’t say that he scares me.)

The big exception is Chapter 44 of City of Exiles, in which horror comes to the forefront of the narrative to a degree that doesn’t have a parallel in the rest of the series. City of Exiles isn’t a perfect novel, and I’ve been hard on it elsewhere in this commentary, but I still think that the last ten chapters or so represent some of the strongest writing I’ve published, and the sequence kicks off here, as a neurotoxin is released inside a private plane with horrifying results. If the scene works, and I believe it does, it’s largely because of the kind of tonal shift that I describe above. It opens with Powell and Chigorin discovering that there may be a lethal device on board the plane, and for several pages, the action unfolds like something out of a Tom Clancy novel, complete with detailed specs on the ventilation system. (The couple of paragraphs spent discussing the ram system and the mix manifold were the product of a lot of tedious hours paging through aircraft manuals online.) But once the poison is released, the tone shifts abruptly into nightmare, and the result is a page or two like nothing else in these novels.

"A freezing horror took hold of him..."

In describing what Powell sees, I consciously turned back to the likes of King and Lovecraft, and there’s also a sentence or two of deliberate homage to “The Adventure of the Devil’s Foot,” a Sherlock Holmes short story that turns on a similar device. (“The Devil’s Foot” also provides the epigraph to Part III, and there are subtle allusions to it throughout the novel. Justice Roundhay, who sends Ilya to Belmarsh Prison, is named after one of Conan Doyle’s characters, and the two aliases that Karvonen uses—Dale Stern and Trevor Guinness—are nods to the names Sterndale and Tregennis.) The notion that Powell would see a monstrous version of one of the cherubim from Ezekiel’s vision of the merkabah is one of those ideas that seem obvious in retrospect, although it didn’t occur to me until fairly late in the process. It also involves a small cheat, since Powell is never directly privy to Wolfe’s conversations on the subject with Ilya, so I had to insert a short line in a previous chapter to explain why he’d have Ezekiel on his mind.

And although the result works well, at least to my eyes, I’m glad that it’s restricted to this chapter and nowhere else. Horror, as we all know well, is more effective the less it’s described, and as it stands, the description of Powell’s hallucination goes on just as long as necessary. It doesn’t feel like anything else in these books, which is part of the point: it’s a momentary disruption of the evenhanded tone I try to maintain even in scenes of great violence or intensity, and it casts a shadow over the more conventionally suspenseful scenes that follow. I’d love to write a real horror novel someday, mostly for the challenge of sustaining that kind of mood over a longer stretch of narrative: the number of novels that really pull it off would fill maybe a single shelf, and it’s no accident that King’s short stories are often so much scarier than his books. Still, I suspect that this scene works as well as it does because it’s embedded within a novel that otherwise seems so removed from the emotions that true horror evokes. And as with the poison that triggers these visions, a small dose is usually more than enough…

“I’d like to buy a ticket for today’s tournament…”

with one comment

"Ilya looked around the room..."

Note: This post is the eighteenth installment in my author’s commentary for City of Exiles, covering Chapter 17. You can read the earlier installments here.

When you write a novel, you’re hoping to control as much of the process as possible, but some of the most important factors will always be out of your hands, which is exactly how it should be. In the case of The Icon Thief, for instance, I never would have guessed that the massive stock market crash and ensuing downturn that occurred halfway through my first draft would end up deeply influencing almost every word I wrote over the next four years. I’d conceived this novel from the beginning as a thriller set in the New York art world, and for months, I’d conducted diligent research into the art market, auctions, galleries, and the underground trade in stolen paintings. The financial crisis changed everything. Art is often touted as an alternative form of wealth that will retain its value when other investments are falling, but when the entire system implodes, art suffers as much as anything else. Within a few weeks, articles were already warning of a prolonged slowdown in the art market, which meant that one of the core premises of my novel—a hedge fund that would treat art like any other asset class—no longer seemed even remotely plausible.

As it turned out, I needn’t have worried: the art market proved considerably more resilient than the rest of the economy, and by the time The Icon Thief was finally published, auction prices were back at record highs. Still, I didn’t know this at the time, so I was faced with several uncomfortable choices. I could ignore the problem; I could scrap the art fund angle entirely; or I could make the story a period piece that was set at a specific historical moment before the crisis. This last option was by far the most appealing, since it would both address my immediate narrative problem and provide a useful measure of historical irony, as the reader realized that the world in which these characters lived would change drastically in a matter of months. In the end, that’s what I ended up doing, and the approach worked just fine. Yet it also opened up another set of challenges that I couldn’t have anticipated. I’d conceived this novel as a standalone work, but it unexpectedly turned into the first book in a series. And since the first installment would serve as a template for the rest, it meant that every other novel in the trilogy would also have to take place during a specific stretch of time in the recent past.

"I'd like to buy a ticket for today's tournament..."

I’ve spoken elsewhere about the difficulties that this posed, especially for Eternal Empire, which was set against the backdrop of the London riots. For City of Exiles, it meant that I had to square the events of my novel as much as possible with the facts of the timeline that I’d chosen. Early in the process, for example, I’d conceived of an extended suspense sequence taking place at a chess tournament, following the Alfred Hitchcock rule that a story that takes place in Switzerland has to include the Alps, lakes, and chocolates: I was writing about Russia, and it seemed inevitable that I’d build part of the story around chess. (I’d already covered ballet in the previous book, and if the series had continued into a fourth novel, it’s possible that I would have ended up with a major subplot about pairs skating.) Since the story was set in London, the obvious choice was to set the scene at the London Chess Classic, which imposed certain constraints on the time of year in which the story could take place. And because the example set by the first book dictated that the events would occur at a particular point in the past, this meant that I had to stage everything at the actual tournament that was held in 2010—I was writing this, remember, in early 2011, so this was the most recent date that made sense.

I hadn’t been to the London Chess Classic myself, of course, but I was fortunate enough to find a trove of photographs, videos, and floor plans from the previous year’s tournament on its official site. Naturally, I spent hours poring over this material, taking notes on background detail, schedules, and the details of the games themselves. Later, on my research trip to London, I paid a visit to the Olympia Exhibition Center, the results of which are displayed in Chapter 17, in which Ilya walks the same route. (On the day I was there, the conference center was hosting the LondonEdge exhibition of punk, burlesque, and gothic fashion, which provided me with some of the trip’s most interesting souvenirs.) Obviously, few readers will bother verifying that the timeline of the novel matches the real tournament so closely, but as a writer, sticking to the facts as much as I could provided a useful framework for structuring the action of the story itself. I also may have been inspired by the example of Sherlock Holmes fans, who will notice if Conan Doyle happens to set a story on a balmy weekend in London in 1886 when the weather was really rainy. I probably got a lot of things wrong, but I hope that I got at least a few right. And in any case, the tournament is about to take a rather different turn from what really happened that day…

Written by nevalalee

February 13, 2014 at 9:40 am
