Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Isaac Asimov’

The Road to Foundation


As I’ve recounted here before, on August 1, 1941, Isaac Asimov was riding the subway to John W. Campbell’s office in New York when the history of science fiction changed forever. In his memoir In Memory Yet Green, Asimov, who was twenty-one at the time, recalls the moment at which he first conceived of what became the Foundation series:

On the way down I racked my brain for a story idea. Failing, I tried a device I sometimes used. I opened a book at random and then tried free association, beginning with whatever I saw. The book I had with me was a collection of the Gilbert and Sullivan plays. I opened it to Iolanthe—to the picture of the Fairy Queen throwing herself at the feet of Private Willis, the sentry. Thinking of sentries, I thought of soldiers, of military empires, of the Roman Empire—of the Galactic Empire—aha!

For reasons that I’ll discuss below, I’m reasonably sure that the illustration that Asimov describes is the one reproduced above, which was drawn by the lyricist W.S. Gilbert himself. And what strikes me the most about this anecdote now is the fact that Asimov looked at this particular picture, ignored the Fairy Queen entirely, and turned it into a series in which no women of any consequence would appear for years. To make a slightly facetious comparison, if I were a therapist giving Asimov the Thematic Apperception Test, in which the subject is asked to look at a picture and make up a story about it, this is the point at which I would sit up slightly in my chair.

Recently, it occurred to me to try to figure out which book Asimov was carrying on the train that day, if only because it’s interesting to dig into what a writer might have been reading at a given moment. The great model here is John Livingston Lowes’s The Road to Xanadu, which obsessively connects the imagery of “Kubla Khan” and “The Rime of the Ancient Mariner” to the travel narratives that Samuel Coleridge was studying at the time. Asimov, it’s worth noting, was skeptical of Lowes’s approach:

I tried reading the book in my youth, but gave up. It could only interest another Coleridge scholar. Besides, I saw no point to it. Granted that the phrases already existed scattered through a dozen books, they existed for everybody. It was only Coleridge who thought of putting them together, with the necessary modifications, to form one of the great poems of the English language. Coleridge might not have been a hundred percent original but he was original enough to make the poem a work of genius.

But this kind of search can be diverting in itself, and it didn’t take me long to conclude that Asimov’s book was likely to have been Plays and Poems of W.S. Gilbert, which was published by Random House in 1932. As far as I can tell, it’s one of only two books available at the time that included both the lyrics to Iolanthe and the illustrations by Gilbert, and it would have been easy to find. (The other is a book titled Authentic Libretti of the Gilbert and Sullivan Operas, which was published a few years later to coincide with a tour by the D’Oyly Carte Opera Company, and it doesn’t look like something that Asimov would have brought on the subway.)

The edition, as it happens, is available online for free, and it can be amusing to leaf through it while keeping the young Asimov in mind. This isn’t literary criticism, exactly, but a kind of scholarly reverie, and it’s valuable primarily for the chain of associations that it evokes. The book opens with a lengthy introduction by Deems Taylor, a music critic and occasional member of the Algonquin Round Table, and I’d like to think that Asimov would have seen aspects of himself in it. For example, here’s Taylor on Gilbert’s early years as a writer:

For a time, his writings, although voluminous, attracted no attention whatsoever. He tried everything—reporting, dramatic criticism, editorials, weekly news letters to provincial papers, political polemics, essays—all the forms of quotidian literature that flow from the pen of any young person who vaguely “wants to write” (a sentence that, appropriately, has no object). The results were financially negligible. Nor did he have the meagre satisfaction of knowing that there were those who were watching him, believing in him. Nobody was watching a young journalistic hack who was no different from scores of his fellows except that he combined a gift for saying cutting things with a complete inability to refrain from saying them.

This sounds a lot like Asimov in the days when he was trying to break into Astounding, and as I thought more about Gilbert and Sullivan themselves, who brought out the best in each other, I saw them for the first time as shadows of Asimov and Campbell in the thirties, of whose partnership the former once wrote: “Campbell and I, in those first three years of my writing career—the crucial and formative ones—were a symbiotic organism.”

But the section that intrigues me the most comes near the end of the introduction. Speaking fondly of the characters of HMS Pinafore, The Mikado, and all the rest, Taylor writes:

As this gay, silly, endearing crew skip upon the stage, the sum of all that they say is always the same thing; and it is a romantic thing: That the light of pure reason casts grotesque shadows; that a world in which there is nothing but the letter of the law, and the logical conclusion, and the inevitable deduction, and the axiomatic fact, and the rational course of conduct, is, in the last account, a ridiculous one. Looking at their world, in which there is everything but the truth that lies beyond logic, we perceive that it is, in more ways than one, an impossible world.

It’s hard for me to read this now without reflecting that Asimov was just moments away, as he rode the train to Campbell’s office, from conceiving nothing less than “a world in which there is nothing but the letter of the law, and the logical conclusion, and the inevitable deduction, and the axiomatic fact, and the rational course of conduct,” which would end up dominating much of the rest of his life. And while I’m no expert on Gilbert and Sullivan, viewing the Foundation series through that lens seems like a promising approach. Asimov, as I’ve noted elsewhere, never seems to have been particularly interested in psychohistory, which was mostly Campbell’s invention, and he was more conscious of its limitations than many of its fans are. (In The End of Eternity, Asimov describes a similar group of scientists as a collection of “psychopaths.”) And what Taylor writes of these operettas applies just as well to many of the stories that they inspired: “The sky has cleared, the problems solve themselves, and everything has suddenly turned out all right. Every fundamental axiom of human motive and conduct has been outraged, and we are delighted.”

The unique continent


Early in 1939, the science fiction editor John W. Campbell wrote to Lester del Rey to propose a story idea. As del Rey recalled years later: “The idea was that maybe [Neanderthals weren’t] killed off fighting Cro-Magnon, but rather died of frustration from meeting a race with a superior culture. I didn’t exactly accept it as good anthropology, but the story took shape easily.” The result, “The Day Is Done,” appeared in the May 1939 issue of Astounding Science Fiction, and it moved Isaac Asimov so much that he wept as he read it on the subway. To a modern reader, the most striking thing about it is probably the unsigned editorial note—clearly written by Campbell—that followed the story on its original publication. The magazine didn’t usually provide this kind of extended commentary on specific works of fiction, so many readers must have read it closely, including the following passage:

Anthropologists believe today that, as Lester del Rey has here portrayed, the Neanderthal man died out due to heartbreak…Incredible? Senseless to attribute such feelings to them? We have on earth today an exact and frightening duplication of that cosmic tragedy. The Bushmen of Tasmania are gone; the aboriginal race of Australia are going, become useless beggars without self-respect hanging on the fringes of the white man’s civilization, unable to reach understanding of man’s higher intelligence, and paralyzed to hopelessness thereby. Those who have not contacted white men continue in their own ways, but any missionary, any government protector sent to them—brings death by hopelessness! There is no help for them, for help is death.

I was reminded of these lines after Susan Goldberg, the editor of National Geographic, published a remarkable essay headlined “For Decades, Our Coverage Was Racist.” The magazine has taken a commendable first step—although not the last—toward coming to grips with its legacy, and Goldberg outlines its history of reinforcing or creating racial stereotypes in devastating detail. As an example of the ideas that were quietly passed along to its readers, Goldberg cites an article about Australia from 1916, in which pictures of two Aboriginal people carry the stark caption: “South Australian Blackfellows: These savages rank lowest in intelligence of all human beings.” And when you examine the article itself, which is available elsewhere online, you find language that is so reminiscent of Campbell that I wonder if he might not have read it:

The blackfellow is not a “degraded savage,” but rather a primitive man placed in an unfavorable environment. When food and water are abundant the aboriginal is kind to the infirm, and even shows traits of generosity and gratitude. When the struggle for existence is severe he becomes an animal searching for its prey. Mentally he is a weak child, with uncontrolled feelings, without initiative or sense of responsibility. In many respects he is intelligent and profits by education, but abstract ideas are apparently beyond his reach. His ignorance, superstition, and fear, rather than viciousness and evil intentions, make him dangerous to strangers.

And as an excellent article by Gavin Evans in The Guardian recently pointed out, this kind of “race science” has never disappeared—it just evolved with the times.

Goldberg doesn’t identify the author of “Lonely Australia: The Unique Continent,” but his background might be the most notable point of all. His name was Herbert E. Gregory, and he was nothing less than the director of the geology department at Yale University. He was an expert on the geography of “the Navajo country” of Arizona, New Mexico, and Utah; he extensively documented his studies of “the Indians and geology of Peru”; and shortly after the article was published, he became the director of the Bishop Museum in Honolulu, the home of the largest collection of Polynesian artifacts in the world. Gregory, in other words, was “interested in Indians,” to use Paul Chaat Smith’s devastating phrase, but he was also a distinguished scholar whose career amounted to a guided tour of the areas where contact between native and colonizing peoples took place. In a government publication titled The Navajo Country, which appeared the same year as his piece for National Geographic, Gregory wrote:

To my mind the period of direct contact with nature is the true “heroic age” of human history, an age in which heroic accomplishment and heroic endurance are parts of the daily routine. The activities of people on this stage of progress deserve a place among the cherished traditions of the human race. I believe also that the sanest missionary effort includes an endeavor to assist the uncivilized man in his adjustment to natural laws…This country is also the home of the vigorous and promising Navajos—a tribe in remarkably close adjustment to their physical surroundings. To improve the condition of this long-neglected but capable race, to render their life more intelligently wholesome by applying scientific knowledge, gives pleasures in no degree less than that obtained by the study of the interesting geologic problems which this country affords.

There’s a lot to unpack here, and I know only as much about Gregory as I’ve been able to find in a morning of research. But I know something about Campbell, and I feel justified in pointing out a common pattern. Both Campbell and Gregory were intelligent, educated men in positions of authority who were trusted by their readers to provide information about how the world worked. Astounding was the news of the future, while National Geographic, in the minds of many subscribers, represented the past, and you could probably perform a similar analysis of the magazines on which people relied for their understanding of the present. For all their accomplishments, both men had unexamined ideas about race that quietly undermined the stated goals of the publications in which their work appeared. Campbell undeniably did a great deal for science fiction, but by failing to see that his views were excluding voices that could have elevated the entire genre, he arguably did just as much to hold it back. Try to imagine an editor in the thirties who believed that he had the ability and the obligation to develop a diverse range of writers, and you end up with a revolution in science fiction that would have dwarfed everything that Campbell actually accomplished. And this isn’t a matter of projecting our own values onto an earlier time—Campbell actively conceived of himself as an innovator, and he deserves to be judged by his own high standards. The same holds true for the National Geographic Society, which was founded “to increase and diffuse geographic knowledge,” but often settled for received notions, even if it expanded the horizons of its audience in other ways. Goldberg quotes John Edwin Mason, a professor at the University of Virginia: “It’s possible to say that a magazine can open people’s eyes at the same time it closes them.” This was equally true of Astounding, which defined itself and its readers in ways that we have yet to overcome. And these definitions still matter. As the tagline once read on all of the ads in National Geographic: “Mention the Geographic—it identifies you.”

The stories of our lives


Last week, I mentioned the evocative challenge that the writer and literary agent John Brockman recently posed to a group of scientists and intellectuals: “Ask the question for which you will be remembered.” I jokingly said that my own question would probably resemble the one submitted by the scholar Jonathan Gottschall: “Are stories bad for us?” As often happens with such snap decisions, however, this one turned out to be more revealing than I had anticipated. When I look back at my work as a writer, it’s hard to single out any overarching theme, but I do seem to come back repeatedly to the problem of reading the world as a text. My first novel, The Icon Thief, was openly inspired by Foucault’s Pendulum by Umberto Eco, which inverts the conventions of the conspiracy thriller to explore how we tell ourselves stories about history and reality. I didn’t go quite as far as Eco did, but it was a subject that I enjoyed, and it persisted to a lesser extent in my next two books. My science fiction stories tend to follow a formula that I’ve described as The X-Files in reverse, in which a paranormal interpretation of a strange event is supplanted by another that fits the same facts into a more rational pattern. And I’m currently finishing up a book that is secretly about how the stories that we read influence our behavior in the real world. As Isaac Asimov pointed out in his essay “The Sword of Achilles,” most readers are drawn to science fiction at a young age, and its values and assumptions subtly affect how they think and feel. If there’s a single thread that runs through just about everything I’ve written, then, it’s the question of how our tendency to see the world as a story—or a text—can come back to haunt us in unexpected ways.

As it happens, we’re all living right now through a vast social experiment that might have been designed to test this very principle. I got to thinking about this soon after reading an excellent essay, “The Weight of the Words,” by the political scientist Jacob T. Levy. He begins with a discussion of Trump’s “shithole countries” remark, which led a surprising number of commentators—on both the right and the left—to argue that the president’s words were less important than his actions. Levy summarizes this view: “Ignore the tweets. Ignore Trump’s inflammatory language. Ignore the words. What counts is the policy outcomes.” He continues:

I have a hard time believing that anyone really thinks like this as a general proposition…The longstanding view among conservatives was that Churchill’s “Iron Curtain” speech and Reagan’s call to “tear down this wall” were important events, words that helped to mobilize western resistance to Communism and to provide moral clarity about the stakes of that resistance.

On a more basic level, since it’s impossible for the government to accomplish everything by force, much of politics lies in emotional coercion, which suggests that words have power in themselves. Levy refers to Hannah Arendt’s argument in The Human Condition, in which a familiar figure appears:

The stature of the Homeric Achilles can be understood only if one sees him as “the doer of great deeds and the speaker of great words”…Thought was secondary to speech, but speech and action were considered to be coeval and coequal, of the same rank and the same kind; and this originally meant not only that most political action, in so far as it remains outside the sphere of violence, is indeed transacted in words, but more fundamentally that finding the right words at the right moment, quite apart from the information or communication they may convey, is action.

Levy then lists many of the obvious ways in which Trump’s words have had tangible effects—the erosion of America’s stature abroad, the undermining of trust in law enforcement and the civil service, the growth of tribalism and xenophobia, and the redefinition of what it means to be a Republican. (As Levy notes of Trump’s relationship to his supporters: “He doesn’t speak for them; how many of them had a view about ‘the deep state’ two years ago? He speaks to them, and it matters.”) Trump routinely undercuts the very notion of truth, in what seems like the ultimate example of the power of speech over the world of fact. And Levy’s conclusion deserves to be read whenever we need to be reminded of how this presidency differs from all the others that have come before:

The alleged realism of those who want to ignore words will often point to some past president whose lofty rhetoric obscured ugly policies. Whether those presidents are named “Reagan and George W. Bush” or “JFK and Barack Obama” varies in the obvious way, but the deflationary accounts are similar; there are blunders, crimes, abuses, and atrocities enough to find in the record of every American president. But all those presidents put forward a public rhetorical face that was better than their worst acts. This inevitably drives political opponents crazy: they despise the hypocrisy and the halo that good speeches put on undeserving heads. I’ve had that reaction to, well, every previous president in my living memory, at one time or another. But there’s something important and valuable in the fact that they felt the need to talk about loftier ideals than they actually governed by. They kept the public aspirations of American political culture pointed toward Reagan’s “shining city on a hill.”

He concludes of all of our previous presidents: “In words, even if not in deeds, they championed a free and fair liberal democratic order, the protection of civil liberties, openness toward the world, rejection of racism at home, and defiance against tyranny abroad. And their words were part of the process of persuading each generation of Americans that those were constitutively American ideals.” America, in short, is a story that Americans tell one another—and the world—about themselves, and when we change the assumptions behind this narrative, it has profound implications in practice. We treat others according to the roles that we’ve imagined for ourselves, or, more insidiously, that our society has imagined for us. Those roles are often restrictive, but they can also be liberating, for good or ill. (Levy perceptively notes that the only federal employees who don’t feel devalued these days are immigration and border agents.) And Levy sounds a warning that we would all do well to remember:

“Ignore the tweets, ignore the language, ignore the words” is advice that affects a kind of sophistication: don’t get distracted by the circus, keep your eye on what’s going on behind the curtain. This is faux pragmatism, ignoring what is being communicated to other countries, to actors within the state, and to tens of millions of fellow citizens. It ignores how all those actors will respond to the speech, and how norms, institutions, and the environment for policy and coercion will be changed by those responses. Policy is a lagging indicator; ideas and the speech that expresses them pave the way.

“Trump has spent a year on the campaign trail and a year in office telling us where he intends to take us,” Levy concludes. And we’re all part of this story now. But we should be even more worried if the words ever stop. As Arendt wrote more than half a century ago: “Only sheer violence is mute.”

The lantern battery and the golem


Science means simply the aggregate of all the recipes that are always successful. All the rest is literature.

—Paul Valéry

Yesterday morning, my wife asked me: “Have you seen the illustration for Michael Chabon’s new essay?” She thrust the latest issue of The New Yorker in my direction, and when I looked down, I saw a drawing by Greg Clarke of a little boy reading what was unmistakably a copy of Astounding Science Fiction. The kid is evidently meant to be Chabon himself, and his article, “The Recipe for Life,” is about nothing less than how his inner life was shaped by his father’s memories of an earlier era. Chabon writes:

He talked about comic books, radio dramas, Astounding magazine, and the stories they’d all told: of rocket-powered heroes, bug-eyed monsters, mad scientists bent on ruling the world. He described to me how he had saved box tops from cold cereals like Post Toasties, and redeemed them by mail for Junior G-Man badges or cardboard Flying Fortresses that carried payloads of black marbles. He told me about playing games like potsy, stickball, handball, and ringolevio, and, for the first time but by no means the last, about an enchanted pastry called a charlotte russe, a rosette of whipped cream on a disk of sponge cake served in a scalloped paper cup, topped with a Maraschino cherry. He described having spent weeks in the cellar of his Flatbush apartment building as a young teen-ager, with some mail-order chemicals, five pounds of kosher salt, and a lantern battery, trying to re-create “the original recipe for life on earth,” as detailed in the pages of Astounding.

The younger Chabon listened to his father intently, and all the while, he was “riding the solitary rails of my imagination into our mutual story, into the future we envisioned and the history we actually accumulated; into the vanished world that he once inhabited.”

Chabon’s father seems to have been born around 1938, or right around the time that John W. Campbell took over Astounding, positioning him to barely catch the tail end of the golden age. He would have been about twelve when the article “Dianetics: The Evolution of a Science” appeared in the May 1950 issue, which means that he snuck in right under the wire. (As the fan Peter Graham once said: “The golden age of science fiction is twelve.”) In fact, when you account for a gap in age of about eighteen years, the fragments of his childhood that we glimpse here are intriguingly reminiscent of Isaac Asimov. Both were bright Jewish boys growing up in Brooklyn—otherwise known as the center of the universe—and they shared the same vocabulary of nostalgia. Robert Chabon reminisced about stickball and the charlotte russe; Asimov lamented the disappearance of the egg cream and wrote in his memoirs:

We used to play “punchball,” for instance. This was a variant of baseball, played without a lot and without a bat. All you needed was a street (we called it a “gutter”) and a rubber ball. You hit the ball with your clenched fist and from then on it was pretty much like baseball.

I don’t know if kids these days still play punchball, but it survived for long enough to be fondly remembered by Stephen Jay Gould, who was born in 1941 in Queens. For Gould, punchball was nothing less than “the canonical ‘recess’ game…It was the game we would play unless kids specifically called for another form.”

Like many polymaths who thrived at the intersection between science and the arts, Gould and Asimov were raised in secular Jewish households, and Chabon’s essay unfolds against a similar, largely unstated cultural background. He notes that his father knew “the birth names of all five Marx Brothers,” as well as the rather startling fact that Underdog’s archenemy was named Simon Bar Sinister. Recalling his father’s “expression of calm intensity,” Chabon links it to another Jewish icon: “A few years later, I will watch Leonard Nimoy, as Mr. Spock, look up from his scanner on the bridge of the USS Enterprise, and catch an echo of my father’s face.” As he silently watches Fritz Lang’s science fiction epic Metropolis in his ailing father’s bedroom, he imagines the conversation that might have unfolded between them under happier circumstances: “Lang’s mother was Jewish. His wife was a member of the Nazi Party.” “Hey, that would make a great sitcom.” Chabon doesn’t emphasize these connections, perhaps because he’s explored them endlessly elsewhere. In his earlier essay “Imaginary Homelands,” he writes:

For a long time now I’ve been busy, in my life and in my work, with a pair of ongoing, overarching investigations: into my heritage—rights and privileges, duties and burdens—as a Jew and as a teller of Jewish stories; and into my heritage as a lover of genre fiction…Years spent writing novels and stories about golems and the Jewish roots of American superhero comic books, Sherlock Holmes and the Holocaust, medieval Jewish freebooters, Passover Seders attended by protégés of forgotten Lovecraftian horror writers, years of writing essays, memoirs, and nervous manifestos about genre fiction or Jewishness.

This is one of the richest veins imaginable for cultural exploration, and Chabon has worked it so expertly for so long that he can trust us to make many of the associations for ourselves. Revealingly, this is actually the second essay that he has written under the title “The Recipe for Life.” The first, published almost two decades ago, was a meditation on the myth of the golem, a prototypical science fiction story with anticipatory shades of Frankenstein. In his earlier piece, Chabon quotes the philosopher Gershom Scholem: “Golem-making is dangerous; like all major creation it endangers the life of the creator—the source of danger, however, is not the golem…but the man himself.” Chabon continues:

When I read these words, I saw at once a connection to my own work. Anything good that I have written has, at some point during its composition, left me feeling uneasy and afraid. It has seemed, for a moment at least, to put me at risk…I have come to see this fear, this sense of my own imperilment by my creations, as not only an inevitable, necessary part of writing fiction but as virtual guarantor, insofar as such a thing is possible, of the power of my work: as a sign that I am on the right track, that I am following the recipe correctly, speaking the proper spells.

The recipe, Chabon implies, can come from either “The Idea of the Golem” or Astounding, and we owe much of his remarkable career to that insight, which he implicitly credits, in turn, to his father: “The past and the future became alloyed in my imagination: magic and science, heroes and villains, brick-and-steel Brooklyn and the chromium world of tomorrow.”

Going with the flow


On July 13, 1963, New York University welcomed a hundred attendees to an event called the Conference on Education for Creativity in the Sciences. The gathering, which lasted for three days, was inspired by the work of Dr. Myron A. Coler, the director of the school’s Creative Science Program. There isn’t a lot of information available online about Coler, who was trained as an electrical engineer, and the best source I’ve found is an unsigned Talk of the Town piece that ran earlier that week in The New Yorker. It presents Coler as a scholar who was interested in the problem of scientific creativity long before it became fashionable: “What is it, how does it happen, how is it fostered—can it be isolated, measured, nurtured, predicted, directed, and so on…By enhancing it, you produce more from what you have of other resources. The ability to exploit a resource is in itself a resource.” He conducted monthly meetings for years with a select group of scientists, writing down everything that they had to say on the subject, including a lot of wild guesses about how to identify creative or productive people. Here’s my favorite:

One analyst claims that one of the best ways that he knows to test an individual is to take him out to dinner where lobster or crab is served. If the person uses his hands freely and seems to enjoy himself at the meal, he is probably well adjusted. If, on the other hand, he has trouble in eating the crab, he probably will have trouble in his relations with people also.

The conference was overseen by Jerome B. Wiesner, another former electrical engineer, who was appointed by John F. Kennedy to chair the President’s Science Advisory Committee. Wiesner’s interest lay in education, and particularly in identifying and training children who showed an early aptitude for science. In an article that was published a few years later in the journal Daedalus, Wiesner listed some of the attributes that were often seen in such individuals, based on the work of the pioneering clinical psychologist Anne Roe:

A childhood environment in which knowledge and intellectual effort were so highly valued for themselves that an addiction to reading and study was firmly established at an early age; an unusual degree of independence which, among other things, led them to discover early that they could satisfy their curiosity by personal efforts; an early dependence on personal resources, and on the necessity to think for oneself; an intense drive that generated concentrated, persistent, time-ignoring efforts in their studies and work; a secondary-school training that tended to emphasize science rather than the humanities; and high, but not necessarily remarkably high, intelligence.

But Wiesner also closed on a note of caution: “We do not now have useful techniques for predicting with comfortable reliability which individuals will turn out to be creative in the sciences or in any other field, no matter how great an investment we make in their education. Nor does it appear likely that such techniques will be developed in the immediate future.”

As it happened, one of the attendees at the conference was Isaac Asimov, who took the bus down to New York from Boston. Years afterward, he said that he couldn’t remember much about the experience—he was more concerned by the fact that he lost the wad of two hundred dollars that he had brought as emergency cash—and that his contributions to the discussion weren’t taken seriously. When the question came up of how to identify potentially creative individuals at a young age, he said without hesitation: “Keep an eye peeled for science-fiction readers.” No one else paid much attention, but Asimov didn’t forget the idea, and he wrote it up later that year in his essay “The Sword of Achilles,” which was published by The Bulletin of the Atomic Scientists. His views on the subject were undoubtedly shaped by his personal preferences, but he was also probably right. (He certainly met most of the criteria listed by Wiesner, aside from “an unusual degree of independence,” since he was tied down for most of his adolescence to his father’s candy store.) And science fiction had more in common with Coler and Wiesner’s efforts than they might have appreciated. The editor John W. Campbell had always seen the genre as a kind of training program that taught its readers how to survive in the future, and Wiesner described “tomorrow’s world” in terms that might have been pulled straight from Astounding: “That world will be more complex than it is today, will be changing more rapidly than now, and it will have jobs only for the well trained.” Wiesner closed with a quotation from the philosopher Alfred North Whitehead:

In the conditions of modern life, the rule is absolute, the race which does not value trained intelligence is doomed…Today we maintain ourselves. Tomorrow science will have moved forward one more step, and there will be no appeal from the judgment which will then be pronounced on the uneducated.

These issues tend to come to the forefront during times of national anxiety, and it’s no surprise that we’re seeing a resurgence in them today. In last week’s issue of The New Yorker, Adam Gopnik rounded up a few recent titles on education and child prodigies, which reflect “the sense that American parents have gone radically wrong, making themselves and their kids miserable in the process, by hovering over them like helicopters instead of observing them from a watchtower, at a safe distance.” The catch is that while the current wisdom says that we should maximize our children’s independence, most child prodigies were the result of intensive parental involvement, which implies that the real secret to creative achievement lies somewhere else. And the answer may be right in front of us. As Gopnik writes of the author Ann Hulbert’s account of the piano prodigy Lang Lang:

Lang Lang admits to the brutal pressures placed on him by his father…He was saved because he had, as Hulbert writes, “carved out space for a version of the ‘autotelic experience’—absorption in an activity purely for its own sake, a specialty of childhood.” Following the psychologist Mihaly Csikszentmihalyi, Hulbert maintains that it was being caught in “the flow,” the feeling of the sudden loss of oneself in an activity, that preserved Lang Lang’s sanity: “The prize always beckoned, but Lang was finding ways to get lost in the process.”

This is very close to the “concentrated, persistent, time-ignoring efforts” that Wiesner described fifty years ago, as well as his characterization of learning as “an addiction.” Gopnik concludes: “Accomplishment, the feeling of absorption in the flow, of mastery for its own sake, of knowing how to do this thing, is what keeps all of us doing what we do, if we like what we do at all.” And it seems to have been this sense of flow, above all else, that led Asimov to write more than four hundred books. He was addicted to it. As he once wrote to Robert A. Heinlein: “I like it in the attic room with the wallpaper. I’ve been all over the galaxy. What’s left to see?”

The manufacturers of worlds


For the last few days, as part of a deliberate break from writing, I’ve been browsing contentedly through my favorite book, The Annotated Sherlock Holmes by William S. Baring-Gould. It was meant to be a comforting read that was as far removed from work as possible, but science fiction, unsurprisingly, can’t seem to let me go. Yesterday, I was looking over The Sign of the Four when I noticed a line that I’ve read countless times without really taking note of it. As Holmes leaves Baker Street to pursue a line of the investigation, he says to Watson, who has remained behind: “Let me recommend this book—one of the most remarkable ever penned. It is Winwood Reade’s Martyrdom of Man. I shall be back in an hour.” Toward the end of the novel, speaking of the difficulty in predicting what any given human being will do, Holmes elaborates:

Winwood Reade is good upon the subject…He remarks that, while the individual man is an insoluble puzzle, in the aggregate he becomes a mathematical certainty. You can, for example, never foretell what any one man will do, but you can say with precision what an average number will be up to. Individuals vary, but percentages remain constant. So says the statistician.

This is remarkably like what Isaac Asimov writes of psychohistory, a sociological version of the ideal gas law that can predict the future based on the existence of a huge number—perhaps in the trillions—of individual lives. And it seemed worth checking to see if this passage could cast any light on the origins of the imaginary science that I’ve spent so much time exploring.

It pains me to say that Holmes himself probably wasn’t a direct influence on the Foundation series. There was a considerable overlap between Sherlockians and science fiction writers—prominent members of both camps included Anthony Boucher, Poul Anderson, Fletcher Pratt, and Manly Wade Wellman—but John W. Campbell wasn’t among them, and Asimov was drafted only reluctantly into the Baker Street Irregulars. (He writes in I. Asimov: “Conan Doyle was a slapdash and sloppy writer…I am not really a Holmes enthusiast.”) For insight, we have to go back to Winwood Reade himself, a British historian, explorer, and correspondent of Charles Darwin whose discussion of the statistical predictability of the human race appears, interestingly, in an argument against the efficacy of prayer. Here’s the full passage from The Martyrdom of Man, which was published in 1872:

All phenomena, physical and moral, are subject to laws as invariable as those which regulate the rising and setting of the sun. It is in reality as foolish to pray for rain or a fair wind as it would be to pray that the sun should set in the middle of the day. It is as foolish to pray for the healing of a disease or for daily bread as it is to pray for rain or a fair wind. It is as foolish to pray for a pure heart or for mental repose as it is to pray for help in sickness or misfortune. All the events which occur upon the earth result from Law: even those actions which are entirely dependent on the caprices of the memory, or the impulse of the passions, are shown by statistics to be, when taken in the gross, entirely independent of the human will. As a single atom, man is an enigma; as a whole, he is a mathematical problem. As an individual, he is a free agent; as a species, the offspring of necessity.

At the end of the book, Reade takes his own principles to their logical conclusion, becoming, in effect, an early writer of science fiction. Its closing section, “Intellect,” sketches out a universal history that anticipates Toynbee, but Reade goes further: “When we understand the laws which regulate the complex phenomena of life, we shall be able to predict the future as we are already able to predict comets and eclipses and planetary movements.” He describes three inventions that he believes will lead to an era of global prosperity:

The first is the discovery of a motive force which will take the place of steam, with its cumbrous fuel of oil or coal; secondly, the invention of aerial locomotion which will transport labour at a trifling cost of money and of time to any part of the planet, and which, by annihilating distance, will speedily extinguish national distinctions; and thirdly, the manufacture of flesh and flour from the elements by a chemical process in the laboratory, similar to that which is now performed within the bodies of the animals and plants.

And after rhapsodizing over the utopian civilization that will result—in which “poetry and the fine arts will take that place in the heart which religion now holds”—he turns his thoughts to the stars:

And then, the earth being small, mankind will migrate into space, and will cross the airless Saharas which separate planet from planet, and sun from sun. The earth will become a Holy Land which will be visited by pilgrims from all the quarters of the universe. Finally, men will master the forces of nature; they will become themselves architects of systems, manufacturers of worlds. Man then will be perfect; he will then be a creator; he will therefore be what the vulgar worship as a god.

Reade was inevitably seen as an atheist, and although he didn’t like the label, he inclined many readers in that direction, as he did in one of the most interesting episodes in this book’s afterlife. The scene is World War II, which tested the idea of psychohistory to its limit, and the speaker is the author of the memoir The Enchanted Places:

The war was on. I was in Italy. From time to time [my father] used to send me parcels of books to read. In one of them were two in the Thinker’s Library series: Renan’s The Life of Jesus and Winwood Reade’s The Martyrdom of Man. I started with The Life of Jesus and found it quite interesting; I turned to The Martyrdom and found it enthralling…There was no God. God had not created Man in His own image. It was the other way round: Man had created God. And Man was all there was. But it was enough. It was the answer, and it was both totally convincing and totally satisfying. It convinced and satisfied me as I lay in my tent somewhere on the narrow strip of sand that divides Lake Comacchio from the Adriatic; and it has convinced and satisfied me ever since.

I wrote at once to my father to tell him so and he at once wrote back. And it was then that I learned for the first time that these were his beliefs, too, and that he had always hoped that one day I would come to share them…So he had sent me The Martyrdom. But even then he had wanted to play absolutely fair, and so he had added The Life of Jesus. And then he had been content to leave the verdict to me. Well, he said, the church had done its best. It had had twenty-four years’ start—and it had failed.

The author adds: “If I had to compile a list of books that have influenced my life, high on the list would undoubtedly be Winwood Reade’s The Martyrdom of Man. And it would probably be equally high on my father’s list too.” The father in question was A.A. Milne. And the son was named Christopher Robin.

To the stars


In a few hours, if all goes according to plan, I’ll be delivering the contracted draft of Astounding: John W. Campbell, Isaac Asimov, Robert A. Heinlein, L. Ron Hubbard, and the Golden Age of Science Fiction to my publisher. Last night, I had trouble sleeping, and I found myself remembering a passage from an essay by Algis Budrys that I read at the beginning of this project:

It’s becoming increasingly obvious that we need a long, objective look at John W. Campbell, Jr. But we’re not likely to get one…Obviously, no one who knew him well enough to work for him at any length could have retained an objective view of him; the most we can hope for from that quarter would be a series of memoirs which, taken all together and read by some ideally situated observer, might distill down into some single resultant—which all its parents would disown…But, obviously, no one who failed to feel his effect, or who rebelled against his effect, or lost interest in his effect, is apt to understand matters well enough to tell us exactly what he did and how he did it. At best, we’ll hear he had feet of clay. How those feet are described by each expositor may eventually produce some sort of resultant.

Budrys wrote these words more than forty years ago, and while I can’t say that I’ve always managed to be an “ideally situated observer,” I’d like to think that I’ve occasionally come close, thanks largely to the help that I’ve received from the friends of this book, who collectively—and often individually—know far more about the subject than I ever will.

Along the way, there have also been moments when the central figures seemed to reach out and speak to me directly. In a footnote in In Memory Yet Green, the first volume of his gargantuan memoir, which I still manage to enjoy even after immersing myself in it for most of the last two years, Isaac Asimov writes:

You wouldn’t think that with this autobiography out there’d be any need for a biography, but undoubtedly there’ll be someone who will consider this record of mine so biased, so self-serving, so ridiculous that there will be need for a scholarly, objective biography to set the record straight. Well, I wish him luck.

And in a letter to Syracuse University, Campbell wrote: “Sorry, but any scholarly would-be biographers are going to have a tough time finding any useful documentation on me! I just didn’t keep the records!” (Luckily for me, he was wrong.) Heinlein probably wouldn’t have cared for this project, either. As he said of a proposed study of his career by Alexei Panshin: “I preferred not to have my total corpus of work evaluated in print until after I was dead…but in any case, I did not want a book published about me written by a kid less than half my age and one who had never written a novel himself—and especially one who had tried to pick a fight with me in the past.” And we’re not even going to talk about Hubbard yet. For now, I’m going to treat myself to a short break, wait for notes, and take a few tentative steps toward figuring out what comes next. In the meantime, I can only echo what Martin Amis wrote over three decades ago: “I knew more about Isaac Asimov than I knew about anyone else alive. What could there be left to add?”

