Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Sherlock Holmes’

Mycroft Holmes and the mark of genius

with one comment

Sidney Paget illustration of Mycroft Holmes

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on November 2, 2016.

“Original discoveries cannot be made casually, not by anyone at any time or anywhere,” the great biologist Edward O. Wilson writes in Letters to a Young Scientist. “The frontier of scientific knowledge, often referred to as the cutting edge, is reached with maps drawn by earlier scientists…Somewhere in these vast unexplored regions you should settle.” This seems like pretty good career advice for scientists and artists alike. But then Wilson makes a striking observation:

But, you may well ask, isn’t the cutting edge a place only for geniuses? No, fortunately. Work accomplished on the frontier defines genius, not just getting there. In fact, both accomplishments along the frontier and the final eureka moment are achieved more by entrepreneurship and hard work than by native intelligence. This is so much the case that in most fields most of the time, extreme brightness may be a detriment. It has occurred to me, after meeting so many successful researchers in so many disciplines, that the ideal scientist is smart only to an intermediate degree: bright enough to see what can be done but not so bright as to become bored doing it.

At first glance, this may not seem all that different from Martin A. Schwartz’s thoughts on the importance of stupidity: “Productive stupidity means being ignorant by choice.” In fact, they’re two separate observations—although they turn out to be related in one important respect. Schwartz is talking about “absolute stupidity,” or our collective ignorance in the face of the unknown, and he takes pains to distinguish it from the “relative stupidity” that differentiates students in the same college classes. And while Wilson isn’t talking about relative stupidity here, exactly, he’s certainly discussing relative intelligence, or the idea that the best scientists might be just a little bit less bright than their smartest peers in school. As he goes on to observe:    

What, then, of certified geniuses whose IQs exceed 140, and are as high as 180 or more? Aren’t they the ones who produce the new groundbreaking ideas? I’m sure some do very well in science, but let me suggest that perhaps, instead, many of the IQ-brightest join societies like Mensa and work as auditors and tax consultants. Why should the rule of optimum medium brightness hold? (And I admit this perception of mine is only speculative.) One reason could be that IQ geniuses have it too easy in their early training. They don’t have to sweat the science courses they take in college. They find little reward in the necessarily tedious chores of data-gathering and analysis. They choose not to take the hard roads to the frontier, over which the rest of us, the lesser intellectual toilers, must travel.

Marilyn vos Savant

In other words, the real geniuses are reluctant to take on the voluntary stupidity that science demands, and they’re more likely to find sources of satisfaction that don’t require them to constantly confront their own ignorance. This is a vast generalization, of course, but it seems to square with experience. I’ve met a number of geniuses, and what many of them have in common is a highly pragmatic determination to make life as pleasant for themselves as possible. Any other decision, in fact, would call their genius into doubt. If you can rely unthinkingly on your natural intelligence to succeed in a socially acceptable profession, or to minimize the amount of work you have to do at all, you don’t have to be a genius to see that this is a pretty good deal. The fact that Marilyn vos Savant—who allegedly had the highest tested intelligence ever recorded—became a columnist for Parade might be taken as a knock against her genius, but really, it’s the most convincing proof of it that I can imagine. The world’s smartest person should be more than happy to take a cushy gig at a Sunday supplement magazine. Most of the very bright endure their share of miseries during childhood, and their reward, rather than more misery, might as well be an adult life that provides intellectual stimulation in emotional safety. This is why I’ve always felt that Mycroft Holmes, Sherlock’s smarter older brother, knew exactly how his genius ought to be used. As Sherlock notes drily in “The Adventure of the Bruce-Partington Plans”: “Mycroft draws four hundred and fifty pounds a year, remains a subordinate, has no ambitions of any kind, will receive neither honor nor title, but remains the most indispensable man in the country.” 

Yet it’s Sherlock, who was forced to leave the house to find answers to his problems, whom we love more. (He’s also been held up as an exemplar of the perfect scientist.) Mycroft is hampered by both his physical laziness and his mental quickness: when a minister comes to him with a complicated problem involving “the Navy, India, Canada, and the bimetallic question,” Mycroft can provide the answer “offhand,” which doesn’t give him much of an incentive to ever leave his office or the Diogenes Club. As Holmes puts it in “The Greek Interpreter”:

You wonder…why it is that Mycroft does not use his powers for detective work. He is incapable of it…I said that he was my superior in observation and deduction. If the art of the detective began and ended in reasoning from an armchair, my brother would be the greatest criminal agent that ever lived. But he has no ambition and no energy. He will not even go out of his way to verify his own solution, and would rather be considered wrong than take the trouble to prove himself right.

Mycroft wasn’t wrong, either. He seems to have lived a very comfortable life. But it’s revealing that Conan Doyle gave the real adventures to the brother with the slightly less scintillating intelligence. In art, just as in science, technical facility can prevent certain artists from making real discoveries. The ones who have to work at it are more likely to find something real. But we can also raise a glass to Mycroft, Marilyn, and the geniuses who are smart enough not to make it too hard on themselves.

The minor key

with one comment

“What keeps science fiction a minor genre, for all the brilliance of its authors and apparent pertinence of its concerns?” The critic who asked this question was none other than John Updike, in his New Yorker review of David G. Hartwell’s anthology The World Treasury of Science Fiction, which was published at the end of the eighties. Updike immediately responded to his own question with his usual assurance:

The short answer is that each science-fiction story is so busy inventing its environment that little energy is left to be invested in the human subtleties. Ordinarily, “mainstream” fiction snatches what it needs from the contemporary environment and concentrates upon surprising us with details of behavior; science fiction tends to reverse the priorities…It rarely penetrates and involves us the way the quest of realistic fiction can…“The writer,” Edmund Wilson wrote, “must always find expressions for something which has never yet been exposed, must master a new set of phenomena which has never yet been mastered.” Those rhapsodies, for instance, which Proust delivered upon the then-fresh inventions of the telephone, the automobile, and the airplane point up the larger relativities and magical connections of his great novel, as well as show the new century breaking upon a fin-de-siècle sensibility. The modest increments of fictional “news,” of phenomena whose presentation is unprecedented, have the cumulative weight of true science—a nudging, inching fidelity to human change ultimately far more impressive and momentous than the great glittering leaps of science fiction.

I’ll concede that Updike’s underlying point here is basically correct, and that a lot of science fiction has to spend so much time establishing the premise and the background that it has to shortchange or underplay other important qualities along the way. (At its highest level, this is less a reflection of the author’s limitations than a courtesy to the reader. It’s hard to innovate along every parameter at once, so complex works of speculative fiction as different as Gravity’s Rainbow and Inception need to strategically simplify wherever they can.) But there’s also a hidden fallacy in Updike’s description of science fiction as “a minor genre.” What, exactly, would a “major” genre look like? It’s hard to come up with a definitive list, but if we’re going to limit ourselves to a conception of genre that encompasses science fiction and not, say, modernist realism, we’d probably include fantasy, horror, western, romance, erotica, adventure, mystery, suspense, and historical fiction. When we ask ourselves whether Updike would be likely to consider any of these genres “major,” it’s pretty clear that the answer is no. Every genre, by definition, is minor, at least to many literary critics, which not only renders the distinction meaningless, but raises a host of other questions. If we honestly ask what keeps all genres—although not individual authors—in the minor category, there seem to be three possibilities. Either genre fiction fails to attract or keep major talent; it suffers from various systemic problems of the kind that Updike identified for science fiction; or there’s some other quirk in the way we think about fiction that relegates these genres to a secondary status, regardless of the quality of specific works or writers.

And while all three of these factors may play a role, it’s the third one that seems most plausible. (After all, when you average out the quality of all “literary fiction,” from Updike, Bellow, and Roth down to the work put out by the small presses and magazines, it seems fairly clear that Sturgeon’s Law applies here as much as anywhere else, and ninety percent of everything is crud. And modernist realism, like every category coherent enough to earn its own label, has plenty of clichés of its own.) In particular, if a genre writer is deemed good enough, his or her reward is to be elevated out of it entirely. You clearly see this with such authors as Jorge Luis Borges, perhaps the greatest writer of speculative fiction of the twentieth century, who was plucked out of that category to compete more effectively with Proust, Joyce, and Kafka—the last of whom was arguably also a genre writer who was forcibly promoted to the next level. It means that the genre as a whole can never win. Its best writers are promptly confiscated, freeing up critics to speculate about why it remains “minor.” As Daniel Handler noted in an interview several years ago:

I believe that children’s literature is a genre. I resisted the idea that children’s literature is just anything that children are reading. And I certainly resisted the idea that certain books should get promoted out of children’s literature just because adults are reading them. That idea is enraging too. That’s what happens to any genre, right? First you say, “Margaret Atwood isn’t really a science fiction writer.” Then you say, “There really aren’t any good science fiction writers.” That’s because you promoted them all!

And this pattern isn’t a new one. It’s revealing that Updike quoted Edmund Wilson, who in his essays “Why Do People Read Detective Stories?” and “Who Cares Who Killed Roger Ackroyd?” dismissed the entire mystery genre as minor or worse. Yet when it came to defending his fondness for one author in particular, he fell back on a familiar trick:

I will now confess, in my turn, that, since my first looking into this subject last fall, I have myself become addicted, in spells, to reading myself to sleep with Sherlock Holmes, which I had gone back to, not having looked at it since childhood, in order to see how it compared with Conan Doyle’s latest imitators. I propose, however, to justify my pleasure in rereading Sherlock Holmes on grounds entirely different from those on which the consumers of the current product ordinarily defend their taste. My contention is that Sherlock Holmes is literature on a humble but not ignoble level, whereas the mystery writers most in vogue now are not. The old stories are literature, not because of the conjuring tricks and the puzzles, not because of the lively melodrama, which they have in common with many other detective stories, but by virtue of imagination and style. These are fairy-tales, as Conan Doyle intimated in his preface to his last collection, and they are among the most amusing of fairy-tales and not among the least distinguished.

Strip away the specifics, and the outlines of the argument are clear. Sherlock Holmes is good, and mysteries are bad, so Sherlock Holmes must be something other than mystery fiction. It’s maddening, but from the point of view of a working critic, it makes perfect sense. You get to hold onto the works that you like, while keeping the rest of the genre safely minor—and then you can read yourself happily to sleep.

Brexit pursued by a bear

with one comment

Over the weekend, my wife and I took our daughter to see Paddington 2, which can accurately be described as the best live-action children’s movie since Paddington. These are charming films, and the worst that can be said of them is that they’re clearly trying hard to be better than they have any right to be. Unlike an artist like Hayao Miyazaki, who constructs stories according to his own secret logic and ends up seizing the imagination of adults and children across the world, director Paul King and his collaborators are more in the tradition of Pixar, which does amazing work and never lets you forget it for a second. (If you want to reach back even further, you could say that these movies split the difference between Babe, a technically phenomenal film that somehow managed to seem effortless, and Babe: Pig in the City, an unquestioned masterpiece that often felt on the verge of flying apart under the pressure of George Miller’s ambitions.) Paddington 2, in particular, is so indebted to the work of Wes Anderson, especially The Grand Budapest Hotel, that it seems less like a pastiche than an unauthorized knockoff. Is it really an act of homage to painstakingly recreate the look of a movie that came out less than four years ago? But it also doesn’t matter. It’s as if King and his collaborators realized that Anderson’s work amounted to an industrial process that was being wasted if it wasn’t being used to make a children’s movie, so they decided to copy it before the patent expired. The result isn’t quite on the level of The Life Aquatic with Steve Zissou, a major work of art that also seems to have been made by and for twelve-year-old kids. But it’s more than enough until Anderson finally makes the Encyclopedia Brown adaptation of my dreams.

Paddington 2 also doubles as the best advertisement for Britain in film since the heyday of the Ministry of Information, with a roster of such ringers as Sally Hawkins, Hugh Bonneville, Brendan Gleeson, Julie Walters, Jim Broadbent, Peter Capaldi, and Joanna Lumley, as well as a wonderfully diverse supporting cast. (It also gives Hugh Grant—the quintessential British export of the last quarter of a century—his best role in a long time.) It’s the most loving portrait of London that any movie has provided in years, with a plot driven by an implausible treasure hunt that serves as an excuse to tour such landmarks as Tower Bridge and St. Paul’s Cathedral. Watching it is almost enough to make you forget the fact that just a few months before production began, the United Kingdom narrowly voted to effectively withdraw from its role as a global power. It might seem like a stretch to see a children’s movie through the lens of Brexit, but nearly every British film of the postwar period can be read as a commentary on the nation’s sometimes painful efforts to redefine itself in a changing world order. Nostalgia is often a strategy for dealing with harsher realities, and escapism can be more revealing than it knows, with even the James Bond series serving as a form of wishful thinking. And America should be paying close attention. A nation on the decline no longer has the luxury of having its movies stand for nothing but themselves, and Britain provides a striking case study for what happens to a culture after its period of ascendancy is over. The United States, like its nearest relation, threw away much of its credibility a year and a half ago in a fit of absentmindedness.

This partially accounts for our sudden fascination with Britain and its royal family, which seems to have risen to levels unseen since the death of Princess Diana. Part of it amounts to an accident of timing—the flurry of celebrations for Queen Elizabeth’s ninetieth birthday and sapphire jubilee generated a flood of content that was more available to American viewers than ever before, and we were unusually primed to receive it. Over the last year or so, my wife and I have watched something like three different documentaries about the Windsors, along with The Crown and The Great British Baking Show, the soothing rhythms of which make Top Chef seem frantic by comparison. Above all else, we’ve followed the saga of Prince Harry and Meghan Markle, which has often been mined for clues as to its possible social and political significance. As Rebecca Mead writes in The New Yorker:

This may be because [the engagement is] legit the only bit of non-terrible news that’s happened in the last year. But there’s more to it than that. This is a royal wedding for non-royalists, even for anti-royalists…There is another important way in which Markle’s arrival reconfigures what Prince Philip reportedly calls “the Firm.” Not only is she American, she is also of mixed race: Markle’s mother is African-American, and her father is white…Whatever else Markle brings to the gilded royal table in terms of glamour, intelligence, and charm, her experience of racial prejudice is unprecedented among members of the royal family. At a time when racial bigotry and nativism is on the rise on both sides of the Atlantic, the coming to prominence at the heart of Britain’s First Family of an American woman whose ancestors were enslaved could not be more welcome, or more salutary.

The unstated point is that even as the United Kingdom goes through convulsions of its own, at least it gets to have this. And we can’t be blamed for wanting to clutch some of it to ourselves. After quoting Princess Diana’s wish that she become “a queen of people’s hearts,” Mead adds:

For those of us horrified by the President’s imperial, autocratic instincts—by his apparent wish to reinstate a feudal system with himself at its apex, attended by a small court of plutocrats who, like him, have been even further enriched by Republican tax reform—might we not claim Harry and Meghan as the monarchs of our hearts? Might they not serve as paradoxical avatars of our own hopes for a more open, more international, more unified, and fairer world?

It’s hard to quarrel with this basically harmless desire to comfort ourselves with the images of the monarchy, and I’ve been guilty of it myself. The building blocks of so much of my inner life—from the Sherlock Holmes stories to the movies of Powell and Pressburger—reflect a nostalgia for an England, as Vincent Starrett put it, “where it is always 1895.” It’s an impulse as old as Walt Disney, a Chicago child whose studio turned into a propaganda mill in the early sixties for the values of the Edwardian era. (As much as I love Mary Poppins, it’s hard to overlook the fact that it premiered just a few weeks after the Gulf of Tonkin resolution and against a backdrop of race riots in Philadelphia.) America has nostalgic myths of its own, but it tends to fall back on its British forebears when it feels particularly insecure about its own legacy. When it becomes too difficult to look at ourselves, we close our eyes and think of England.

The manufacturers of worlds

with 2 comments

For the last few days, as part of a deliberate break from writing, I’ve been browsing contentedly through my favorite book, The Annotated Sherlock Holmes by William S. Baring-Gould. It was meant to be a comforting read that was as far removed from work as possible, but science fiction, unsurprisingly, can’t seem to let me go. Yesterday, I was looking over The Sign of the Four when I noticed a line that I’ve read countless times without really taking note of it. As Holmes leaves Baker Street to pursue a line of the investigation, he says to Watson, who has remained behind: “Let me recommend this book—one of the most remarkable ever penned. It is Winwood Reade’s Martyrdom of Man. I shall be back in an hour.” Toward the end of the novel, speaking of the difficulty in predicting what any given human being will do, Holmes elaborates:

Winwood Reade is good upon the subject…He remarks that, while the individual man is an insoluble puzzle, in the aggregate he becomes a mathematical certainty. You can, for example, never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. So says the statistician.

This is remarkably like what Isaac Asimov writes of psychohistory, a sociological version of the ideal gas law that can predict the future based on the existence of a huge number—perhaps in the trillions—of individual lives. And it seemed worth checking to see if this passage could cast any light on the origins of the imaginary science that I’ve spent so much time exploring.

It pains me to say that Holmes himself probably wasn’t a direct influence on the Foundation series. There was a considerable overlap between Sherlockians and science fiction writers—prominent members of both camps included Anthony Boucher, Poul Anderson, Fletcher Pratt, and Manly Wade Wellman—but John W. Campbell wasn’t among them, and Asimov was drafted only reluctantly into the Baker Street Irregulars. (He writes in I. Asimov: “Conan Doyle was a slapdash and sloppy writer…I am not really a Holmes enthusiast.”) For insight, we have to go back to Winwood Reade himself, a British historian, explorer, and correspondent of Charles Darwin whose discussion of the statistical predictability of the human race appears, interestingly, in an argument against the efficacy of prayer. Here’s the full passage from The Martyrdom of Man, which was published in 1872:

All phenomena, physical and moral, are subject to laws as invariable as those which regulate the rising and setting of the sun. It is in reality as foolish to pray for rain or a fair wind as it would be to pray that the sun should set in the middle of the day. It is as foolish to pray for the healing of a disease or for daily bread as it is to pray for rain or a fair wind. It is as foolish to pray for a pure heart or for mental repose as it is to pray for help in sickness or misfortune. All the events which occur upon the earth result from Law: even those actions which are entirely dependent on the caprices of the memory, or the impulse of the passions, are shown by statistics to be, when taken in the gross, entirely independent of the human will. As a single atom, man is an enigma; as a whole, he is a mathematical problem. As an individual, he is a free agent; as a species, the offspring of necessity.

At the end of the book, Reade takes his own principles to their logical conclusion, becoming, in effect, an early writer of science fiction. Its closing section, “Intellect,” sketches out a universal history that anticipates Toynbee, but Reade goes further: “When we understand the laws which regulate the complex phenomena of life, we shall be able to predict the future as we are already able to predict comets and eclipses and planetary movements.” He describes three inventions that he believes will lead to an era of global prosperity:

The first is the discovery of a motive force which will take the place of steam, with its cumbrous fuel of oil or coal; secondly, the invention of aerial locomotion which will transport labour at a trifling cost of money and of time to any part of the planet, and which, by annihilating distance, will speedily extinguish national distinctions; and thirdly, the manufacture of flesh and flour from the elements by a chemical process in the laboratory, similar to that which is now performed within the bodies of the animals and plants.

And after rhapsodizing over the utopian civilization that will result—in which “poetry and the fine arts will take that place in the heart which religion now holds”—he turns his thoughts to the stars:

And then, the earth being small, mankind will migrate into space, and will cross the airless Saharas which separate planet from planet, and sun from sun. The earth will become a Holy Land which will be visited by pilgrims from all the quarters of the universe. Finally, men will master the forces of nature; they will become themselves architects of systems, manufacturers of worlds. Man then will be perfect; he will then be a creator; he will therefore be what the vulgar worship as a god.

Reade was inevitably seen as an atheist, and although he didn’t like the label, he inclined many readers in that direction, as he did in one of the most interesting episodes in this book’s afterlife. The scene is World War II, which tested the idea of psychohistory to its limit, and the speaker is the author of the memoir The Enchanted Places:

The war was on. I was in Italy. From time to time [my father] used to send me parcels of books to read. In one of them were two in the Thinker’s Library series: Renan’s The Life of Jesus and Winwood Reade’s The Martyrdom of Man. I started with The Life of Jesus and found it quite interesting; I turned to The Martyrdom and found it enthralling…There was no God. God had not created Man in His own image. It was the other way round: Man had created God. And Man was all there was. But it was enough. It was the answer, and it was both totally convincing and totally satisfying. It convinced and satisfied me as I lay in my tent somewhere on the narrow strip of sand that divides Lake Comacchio from the Adriatic; and it has convinced and satisfied me ever since.

I wrote at once to my father to tell him so and he at once wrote back. And it was then that I learned for the first time that these were his beliefs, too, and that he had always hoped that one day I would come to share them…So he had sent me The Martyrdom. But even then he had wanted to play absolutely fair, and so he had added The Life of Jesus. And then he had been content to leave the verdict to me. Well, he said, the church had done its best. It had had twenty-four years’ start—and it had failed.

The author adds: “If I had to compile a list of books that have influenced my life, high on the list would undoubtedly be Winwood Reade’s The Martyrdom of Man. And it would probably be equally high on my father’s list too.” The father in question was A.A. Milne. And the son was named Christopher Robin.

Quote of the Day

leave a comment »

It is one of those cases where the art of the reasoner should be used rather for the sifting of details than for the acquiring of fresh evidence. The tragedy has been so uncommon, so complete and of such personal importance to so many people, that we are suffering from a plethora of surmise, conjecture, and hypothesis. The difficulty is to detach the framework of fact—of absolute undeniable fact—from the embellishments of theorists and reporters. Then, having established ourselves upon this sound basis, it is our duty to see what inferences may be drawn and what are the special points upon which the whole mystery turns.

Arthur Conan Doyle, “Silver Blaze”

Written by nevalalee

December 6, 2017 at 7:30 am

Solzhenitsyn’s rosary

leave a comment »

Aleksandr Solzhenitsyn

Note: I’m taking a few days off for Thanksgiving, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on July 11, 2016.

When Aleksandr Solzhenitsyn was imprisoned in the Soviet gulag, he was forced to deal, along with so many other sufferings, with a challenge that modern writers rarely have to confront—the problem of memorization. He wanted to keep writing poetry, but he was unable to put anything on paper, which would be confiscated and read by the guards. Here’s the solution that he found, as he recounts in The Gulag Archipelago:

I started breaking matches into little pieces and arranging them on my cigarette case in two rows (of ten each, one representing units and the other tens). As I recited the verses to myself, I displaced one bit of broken match from the units row for every line. When I shifted ten units I displaced one of the “tens”…Every fiftieth and every hundredth line I memorized with special care, to help me keep count. Once a month I recited all that I had written. If the wrong line came out in place of one of the hundreds and fifties, I went over it all again and again until I caught the slippery fugitives.

In the Kuibyshev Transit Prison I saw Catholics (Lithuanians) busy making themselves rosaries for prison use…I joined them and said that I, too, wanted to say my prayers with a rosary but that in my particular religion I needed a hundred beads in a ring…that every tenth bead must be cubic, not spherical, and that the fiftieth and the hundredth beads must be distinguishable at a touch.

The Lithuanians were impressed, Solzhenitsyn says, by his “religious zeal,” and they agreed to make a rosary to his specifications, fashioning the beads out of pellets of bread and coloring them with burnt rubber, tooth powder, and disinfectant. (Later, when Solzhenitsyn realized that twenty beads were enough, he made them himself out of cork.) He concludes:

I never afterward parted with the marvelous present of theirs; I fingered and counted my beads inside my wide mittens—at work line-up, on the march to and fro from work, at all waiting times; I could do it standing up, and freezing cold was no hindrance. I carried it safely through the search points, in the padding of my mittens, where it could not be felt. The warders found it on various occasions, but supposed that it was for praying and let me keep it. Until the end of my sentence (by which time I had accumulated 12,000 lines) and after that in my places of banishment, this necklace helped me write and remember.

Ever since I first read this story, I’ve been fascinated by it, and I’ve occasionally found myself browsing the rosaries or prayer beads for sale online, wondering if I should get one for myself, just in case—although in case of what, exactly, I don’t know.

Joan Didion

But you don’t need to be in prison to understand the importance of memorization. One of the side effects of our written and interconnected culture is that we’ve lost the ability to hold information in our heads, and this trend has only accelerated as we’ve outsourced more of our inner lives to the Internet. This isn’t necessarily a bad thing: there are good reasons for keeping a lot of this material where it can be easily referenced, without feeling the need to remember it all. (As Sherlock Holmes said in A Study in Scarlet: “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose…It is a mistake to think that that little room has elastic walls and can distend to any extent.” Although given the amount of obscure information that Holmes was able to produce in subsequent stories, it’s possible that he was just kidding.) But there’s also a real loss involved. Oral cultures are marked by a highly developed verbal memory, especially for those whose livelihoods depend on it: a working poet could be expected to know hundreds of songs by heart, and the conventions of poetry itself emerged, in part, as a set of mnemonic devices. Meter, rhyme, and conventional formulas allowed many lines of verse to be recited for a paying audience—or improvised on the spot. An oral poem is a vehicle for the preservation of information, and it takes advantage of the human brain’s ability to retain material in a pattern that hints at what comes next. When we neglect this, we lose touch with some of the reasons that poetry evolved in the first place.

And what makes memorization particularly valuable as a creative tool is the fact that it isn’t quite perfect. When you write something down, it tends to become fixed, both physically and psychologically. (Joan Didion gets at this when she says: “By the time you’ve laid down the first two sentences, your options are all gone.”) An idea in the brain, by contrast, remains fluid, malleable, and vital. Each time you go back to revisit it, whether using a rosary or some other indexical system, you aren’t just remembering it, but to some extent recreating it, and you’ll never get it exactly right. But just as natural selection exists because of the variations that arise from errors of transcription, a creative method that relies on memory is less accurate but more receptive to happy accidents than one that exists on the page. A line of poetry might change slightly each time we call it up, but the core idea remains, and the words that survive from one iteration to the next have persisted, by definition, because they’re memorable. We find ourselves revising and reworking the result because we have no choice, and in the process, we keep it alive. The danger, of course, is that if we don’t keep notes, any ideas we have are likely to float away without ever being realized—a phenomenon that every writer regards with dread. What we need is a structure that allows us to assign an order to the ideas in our head while preserving their ripe state of unwrittenness. Solzhenitsyn’s rosary, which was forced on him by necessity, was one possible answer, but there are others. Even if we’re diligent about keeping a pencil and paper—or a smartphone—nearby, there will be times when an idea catches us at a moment at which we can’t write it down. And when that happens, we need to be ready.

Written by nevalalee

November 23, 2017 at 9:00 am

The weight of lumber

with 4 comments

In my discussion yesterday of huge scholarly projects that expanded to take up the lives of their authors, I deliberately left out one name. Arnold J. Toynbee was a British historian and author of the twelve volumes of A Study of History, the most ambitious attempt to date at a universal theory of the rise and fall of civilizations. Toynbee has intrigued me for as long as I can remember, but he’s a little different from such superficially similar figures as Joseph Needham and Donald Knuth. For one thing, he actually finished his magnum opus, and even though it took decades, he more or less stuck to the plan of the work that he published in the first installment, which was an achievement in itself. He also differed from the other two in reaching a wide popular audience. Thousands of sets of his book were sold, and it became a bestseller in its two-volume abridgment by D.C. Somervell. It inspired countless essays and thick tomes of commentary, argument, and response—and then, strangely, it simply went away. Toynbee’s name has all but disappeared from mainstream and academic consideration, maybe because his ideas were too abstruse for one and too grandiose for the other, and if he’s recognized today at all, it’s probably because of the mysterious Toynbee tiles. (One possible successor is the psychohistory of the Foundation series, which has obvious affinities to his work, although Isaac Asimov played down the connection. He read the first half of A Study of History in 1944, borrowing the volumes one at a time from L. Sprague de Camp, and recalled: “There are some people who, on reading my Foundation series, are sure that it was influenced basically by Toynbee. They are only partly right. The first four stories were written before I had read Toynbee. ‘Dead Hand,’ however, was indeed influenced by it.”)

At the Newberry Library Book Fair last week, I hesitated over buying a complete set of Toynbee, and by the time I made up my mind and went back to get it, it was gone—which is the kind of mistake that can haunt me for the rest of my life. As a practical matter, though, I have all the Toynbee I’ll ever need: I already own the introductory volume of A Study of History and the Somervell abridgment, and it’s frankly hard to imagine reading anything else. But I did pick up the twelfth and last volume, Reconsiderations, published seven years after the rest, which might be the most interesting of them all. It’s basically Toynbee’s reply to his critics in over seven hundred pages of small type, in the hardcover equivalent of a writer responding to all the online comments on his work one by one. Toynbee seems to have read every review of his book, and he sets out to engage them all, including a miscellaneous section of over eighty pages simply called Ad Hominem. It’s a prickly, fascinating work that is probably more interesting than the books that inspired it, and one passage in particular caught my eye:

One of my critics has compared earlier volumes of this book to a “palace” in which “the rooms…are over-furnished to the point of resembling a dealer’s warehouse.” This reviewer must also be a thought-reader; for I have often thought of myself as a man moving old furniture about. For centuries these lovely things had been lying neglected in the lumber-rooms and attics. They had been piled in there higgledy-piggledy, in utter disorder, and had been crammed so tight that nobody could even squeeze his way in to look at them and find out whether they were of any value. In the course of ages they had been accumulating there—unwanted rejects from a score of country houses. This unworthy treatment of these precious pieces came to trouble me more and more; for I knew that they were not really junk; I knew that they were heirlooms, and these so rare and fine that they were not just provincial curiosities; they were the common heritage of anyone who had any capacity for appreciating beauty in Man’s handiwork.

In speaking of “lumber-rooms and attics,” Toynbee is harking back to a long literary tradition of comparing the mind itself to a lumber-room, which originally meant a spare room in a house full of unused furniture and other junk. I owe this knowledge to Nicholson Baker’s famous essay “Lumber,” reprinted in his collection The Size of Thoughts, in which he traces the phrase’s rise and fall, in a miniature version of what Toynbee tries to do for entire civilizations. Baker claims to have chosen the word “lumber” essentially at random, writing in his introduction: “Now feels like a good time to pick a word or a phrase, something short, and go after it, using the available equipment of intellectual retrieval, to see where we get…It should be representatively out of the way; it should have seen better days. Once or twice in the past it briefly enjoyed the status of a minor cliché, but now, for one reason or another, it is ignored or forgotten.” This might be a description of A Study of History itself—and yet, remarkably, Baker doesn’t mention the passage that I’ve quoted here. I assume that this is because he wasn’t aware of it, because it fits in beautifully with the rest of his argument. The dread of the mind becoming a lumber-room, crammed with useless odds and ends, is primarily a fear of intellectuals, as expressed by their patron saint Sherlock Holmes:

I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. A fool takes in all the lumber of every sort that he comes across, so that the knowledge which might be useful to him gets crowded out, or at best is jumbled up with a lot of other things, so that he has a difficulty in laying his hands upon it. Now the skillful workman is very careful indeed as to what he takes into his brain-attic…It is a mistake to think that this little room has elastic walls and can distend to any extent.

Baker explains: “This is a form of the great scholarly worry—a worry which hydroptically book-thirsty poets like Donne, Johnson, Gray, Southey, and Coleridge all felt at times—the fear that too much learning will eventually turn even an original mind into a large, putty-colored regional storage facility of mislabeled and leaking chemical drums.”

Toynbee’s solution to the problem of mental lumber, like that of Needham and Knuth, was simply to pull it out of his brain and put it down on paper, even if it took three decades and twelve volumes. It’s hard not to be stirred by his description of his efforts:

At last I found that I could not bear this shocking situation any longer, so I set my own hand to a back-breaking job. I began to drag out the pieces, one by one, and to arrange them in the hall. I could not pretend to form a final judgement on the order in which they should be placed. Indeed, there never could be a final judgement on this, for a number of attractive different orders could be imagined, each of them the right order from some particular point of view. The first thing to be done was to get as many of the pieces as possible out into the open and to assemble them in some order or other. If once I had them parked down in the hall, I could see how they looked and could shift them and re-shift them at my leisure. Perhaps I should not have the leisure; perhaps the preliminary job of extracting these treasures from the lumber-rooms and attics would turn out to be as much as I could manage with my single pair of hands. If so, this would not matter; for there would be plenty of time afterwards for other people to rearrange the pieces, and, no doubt, they would be doing this again and again as they studied them more closely and came to know more about them than would ever be known by me.

It’s through arrangement and publication that lumber becomes precious again, and from personal experience, I know how hard it can be to relinquish information that has been laboriously brought to light. But part of the process is knowing when to stop. As Baker, a less systematic but equally provocative thinker, concludes:

I have poked through verbal burial mounds, I have overemphasized minor borrowings, I have placed myself deep in the debt of every accessible work of reference, and I have overquoted and overquibbled—of course I have: that is what always happens when you pay a visit to the longbeards’ dusty chamber…All the pages I have flipped and copied and underlined will turn gray again and pull back into the shadows, and have no bearing on one another. Lumber becomes treasure only temporarily, through study, and then it lapses into lumber again. Books open, and then they close.
