Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Arthur C. Clarke’

A Hawk From a Handsaw, Part 1

Note: My article “The Campbell Machine,” which describes one of the strangest episodes in the history of Astounding Science Fiction, is now available online and in the July/August issue of Analog. To celebrate its publication, I’m republishing a series about an equally curious point of intersection between science fiction and the paranormal. This post originally appeared, in a slightly different form, on February 15, 2017. 

I am but mad north-north-west. When the wind is southerly, I know a hawk from a handsaw.

Hamlet

In the summer of 1974, the Israeli magician and purported psychic Uri Geller arrived at Birkbeck College in Bloomsbury, London, where the physicist David Bohm planned to subject him to a series of tests. Two of the appointed observers were the authors Arthur Koestler and Arthur C. Clarke, of whom Geller writes in his autobiography:

Arthur Clarke…would be particularly important because he was highly skeptical of anything paranormal. His position was that his books, like 2001 and Childhood’s End, were pure science fiction, and it would be highly unlikely that any of their fantasies would come true, at least in his own lifetime.

He met the group in a conference room, where Koestler was outwardly polite, even as Geller sensed that he “really wasn’t getting through to Arthur C. Clarke.” A demonstration seemed to be in order, so Geller asked Clarke to hold one of his own house keys in one hand, watching closely to make sure that it wasn’t switched, handled, or subjected to any trickery. Soon enough, the key began to bend. Clarke cried out, in what I like to think was an inadvertent echo of his most famous story: “My God, my eyes are seeing it! It’s bending!”

Geller went on to display his talents in a number of other ways, including forcing a Geiger counter to click at an accelerated rate merely by concentrating on it. (The skeptic James Randi has suggested that Geller had a magnet taped to his leg.) “By that time,” Geller writes, “Arthur Clarke seemed to have lost all his skepticism. He said something like, ‘My God! It’s all coming true! This is what I wrote about in Childhood’s End. I can’t believe it.'” Geller continues:

Clarke was not there just to scoff. He had wanted things to happen. He just wanted to be completely convinced that everything was legitimate. When he saw that it was, he told the others: “Look, the magicians and the journalists who are knocking this better put up or shut up now. Unless they can repeat the same things Geller is doing under the same rigidly controlled conditions, they have nothing further to say.”

Clarke also described the plot of Childhood’s End, which Geller evidently hadn’t read: “It involves a UFO that is hovering over the earth and controlling it. He had written the book about twenty years ago. He said that, after being a total skeptic about these things, his mind had really been changed by observing these experiments.”

The Horus Errand

It’s tempting to think that Geller is exaggerating the extent of the author’s astonishment, but here’s what Clarke himself said of it much later:

Although it’s hard to focus on that hectic and confusing day at Birkbeck College in 1974…I suspect that Uri Geller’s account in My Story is all too accurate…In view of the chaos at the hastily arranged Birkbeck encounter, the phrase “rigidly controlled conditions” is hilarious. But that last sentence is right on target, for [the reproduction of Geller’s effects by stage magicians] is precisely what happened…Nevertheless, I must confess a sneaking fondness for Uri; though he left a trail of bent cutlery and fractured reputations round the world, he provided much-needed entertainment at a troubled and unhappy time.

Geller has largely faded from the public consciousness, but Clarke—who continued to believe long afterward that paranormal phenomena “can’t all be nonsense”—wasn’t the only prominent science fiction writer to find him intriguing. Robert Anton Wilson, one of my intellectual heroes, discusses him at length in the book Cosmic Trigger, in which he recounts a strange incident that was experienced by his friend Saul-Paul Sirag. The year before the Birkbeck tests, Sirag allegedly saw Geller’s head turn into that of a “bird of prey,” like a hawk: “His nose became a beak, and his entire head sprouted feathers, down to his neck and shoulders.” (Wilson neglects to mention that Sirag was also taking LSD at the time.) The hawk, Sirag thought, was the form assumed by an alien intelligence that was supposedly in contact with Geller, and he didn’t know that it had appeared in the same shape to two other witnesses, including a psychic named Ray Stanford and another man who nicknamed it “Horus,” after the Egyptian god with a hawk’s head.

And it gets even weirder. A few months later, Sirag saw the January 1974 issue of Analog, which featured the story “The Horus Errand” by William E. Cochrane. The cover illustration depicted a man wearing a hawklike helmet, with the name “Stanford” written over his breast pocket. According to one of Sirag’s friends, the occultist Alan Vaughan, the character in the painting even looked a little like Ray Stanford, and you can judge the resemblance for yourself. Vaughan was interested enough to write to the artist, the legendary Frank Kelly Freas, for more information. (Freas, incidentally, was close friends with John W. Campbell, to the point where Campbell even asked him to serve as the guardian for his daughters if anything ever happened to him or his wife.) Freas replied that he had never met Stanford in person and had no idea what he looked like, but that he had once received a psychic consultation from him by mail, in which Stanford told Freas that he had been “some sort of illustrator in a past life in ancient Egypt.” As a result, Freas began to consciously employ Egyptian imagery in his work, and the design of the helmet on the cover was entirely his own, without any reference to the story. At that point, the whole thing kind of peters out, aside from serving as an example of the kind of absurd coincidence that was so close to Wilson’s heart. But the intersection of Arthur C. Clarke, Uri Geller, and Robert Anton Wilson at that particular moment in time is a striking one, and it points toward an important thread in the history of science fiction that tends to be overlooked or ignored—perhaps because it’s often guarded by ominous hawks. I’ll be digging into this more deeply tomorrow.

The Big One

In a heartfelt appreciation of the novelist Philip Roth, who died earlier this week, the New York Times critic Dwight Garner describes him as “the last front-rank survivor of a generation of fecund and authoritative and, yes, white and male novelists…[that] included John Updike, Norman Mailer and Saul Bellow.” These four names seem fated to be linked together for as long as any of them is still read and remembered, and they’ve played varying roles in my own life. I was drawn first to Mailer, who for much of my adolescence was my ideal of what a writer should be, less because of his actual fiction than thanks to my repeated readings of the juiciest parts of Peter Manso’s oral biography. (If you squint hard and think generously, you can even see Mailer’s influence in the way I’ve tried to move between fiction and nonfiction, although in both cases it was more a question of survival.) Updike, my favorite, was a writer I discovered after college. I agree with Garner that he probably had the most “sheer talent” of them all, and he represents my current model, much more than Mailer, of an author who could apparently do anything. Bellow has circled in and out of my awareness over the years, and it’s only recently that I’ve started to figure out what he means to me, in part because of his ambiguous status as a subject of biography. And Roth was the one I knew least. I’d read Portnoy’s Complaint and one or two of the Zuckerman novels, but I always felt guilty over having never gotten around to such late masterpieces as American Pastoral—although the one that I should probably check out first these days is The Plot Against America.

Yet I’ve been thinking about Roth for about as long as I’ve wanted to be a writer, largely because he came as close as anyone ever could to having the perfect career, apart from the lack of the Nobel Prize. He won the National Book Award for his debut at the age of twenty-six; he had a huge bestseller at an age when he was properly equipped to enjoy it; and he closed out his oeuvre with a run of major novels that critics seemed to agree were among the best that he, or anyone, had ever written. (As Garner nicely puts it: “He turned on the afterburners.”) But he never seemed satisfied by his achievement, which you can take as an artist’s proper stance toward his work, a reflection of the fleeting nature of such rewards, a commentary on the inherent bitterness of the writer’s life, or all of the above. Toward the end of his career, Roth actively advised young writers not to become novelists, and in his retirement announcement, which he delivered almost casually to a French magazine, he quoted Joe Louis: “I did the best I could with what I had.” A month later, in an interview with Charles McGrath of the New York Times, he expanded on his reasoning:

I know I’m not going to write as well as I used to. I no longer have the stamina to endure the frustration. Writing is frustration—it’s daily frustration, not to mention humiliation. It’s just like baseball: you fail two-thirds of the time…I can’t face any more days when I write five pages and throw them away. I can’t do that anymore…I knew I wasn’t going to get another good idea, or if I did, I’d have to slave over it.

And on his computer, he posted a note that gave him strength when he looked at it each day: “The struggle with writing is over.”

Roth’s readers, of course, rarely expressed the same disillusionment, and he lives most vividly in my mind as a reference point against which other authors could measure themselves. In an interview with The Telegraph, John Updike made one of the most quietly revealing statements that I’ve ever heard from a writer, when asked if he felt that he and Roth were in competition:

Yes, I can’t help but feel it somewhat. Especially since Philip really has the upper hand in the rivalry as far as I can tell. I think in a list of admirable novelists there was a time when I might have been near the top, just tucked under Bellow. But since Bellow died I think Philip has…he’s certainly written more novels than I have, and seems more dedicated in a way to the act of writing as a means of really reshaping the world to your liking. But he’s been very good to have around as far as goading me to become a better writer.

I think about that “list of admirable novelists” all the time, and it wasn’t just a joke. In an excellent profile in The New Yorker, Claudia Roth Pierpont memorably sketched in all the ways in which other writers warily circled Roth. When asked if the two of them were friends, Updike said, “Guardedly,” and Bellow seems to have initially held Roth at arm’s length, until his wife convinced him to give the younger writer a chance. Pierpont concludes of the relationship between Roth and Updike: “They were mutual admirers, wary competitors who were thrilled to have each other in the world to up their game: Picasso and Matisse.”

And they also remind me of another circle of writers whom I know somewhat better. If Bellow, Mailer, Updike, and Roth were the Big Four of the literary world, they naturally call to mind the Big Three of science fiction—Heinlein, Asimov, and Clarke. In each case, the group’s members were perfectly aware of how exceptional they were, and they carefully guarded their position. (Once, in a conference call with the other two authors, Asimov jokingly suggested that one of them should die to make room for their successors. Heinlein responded: “Fuck the other writers!”) Clarke and Asimov seem to have been genuinely “thrilled to have each other in the world,” but their relationship with the third point of the triangle was more fraught. Toward the end, Asimov started to “avoid” the combative Heinlein, who had a confrontation with Clarke over the Strategic Defense Initiative that effectively ended their friendship. In public, they remained cordial, but you can get a hint of their true feelings in a remarkable passage from the memoir I. Asimov:

[Clarke] and I are now widely known as the Big Two of science fiction. Until early 1988, as I’ve said, people spoke of the Big Three, but then Arthur fashioned a little human figurine of wax and with a long pin— At least, he has told me this. Perhaps he’s trying to warn me. I have made it quite plain to him, however, that if he were to find himself the Big One, he would be very lonely. At the thought of that, he was affected to the point of tears, so I think I’m safe.

As it turned out, Clarke, like Roth, outlived all the rest, and perhaps they felt lonely in the end. Longevity can amount to a kind of victory in itself. But it must be hard to be the Big One.

The dawn of man

Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

Almost from the moment that critics began to write about 2001, it became fashionable to observe that the best performance in the movie was by an actor playing a computer. In his review in Analog, for example, P. Schuyler Miller wrote:

The actors, except for the gentle voice of HAL, are thoroughly wooden and uninteresting, and I can’t help wondering whether this isn’t Kubrick’s subtle way of suggesting that the computer is really more “human” than they and fully justified in trying to get rid of them before they louse up an important mission. Someday we may know whether the theme of this part is a Clarke or a Kubrick contribution. I suspect it was the latter…perhaps just because Stanley Kubrick is said to like gadgets.

This criticism is often used to denigrate the other performances or the film’s supposed lack of humanity, but I prefer to take it as a tribute to the work of actor Douglas Rain, Kubrick and Clarke’s script, and the brilliant design of HAL himself. The fact that a computer is the character we remember best isn’t a flaw in the movie, but a testament to its skill and imagination. And as I’ve noted elsewhere, the acting is excellent—it’s just so understated and naturalistic that it seems vaguely incongruous in such spectacular settings. (Compare it to the performances in Destination Moon, for instance, and you see how good Keir Dullea and William Sylvester really are here.)

But I also think that the best performance in 2001 isn’t by Douglas Rain at all, but by Vivian Kubrick, in her short appearance on the phone as Heywood Floyd’s daughter. It’s a curious scene that breaks many of the rules of good storytelling—it doesn’t lead anywhere, it’s evidently designed to do nothing but show off a piece of hardware, and it peters out even as we watch it. The funniest line in the movie may be Floyd’s important message:

Listen, sweetheart, I want you to tell mommy something for me. Will you remember? Well, tell mommy that I telephoned. Okay? And that I’ll try to telephone tomorrow. Now will you tell her that?

But that’s oddly true to life as well. And when I watch the scene today, with a five-year-old daughter of my own, it seems to me that there’s no more realistic little girl in all of movies. (Kubrick shot the scene himself, asking the questions from offscreen, and there’s a revealing moment when the camera rises to stay with Vivian as she stands. This is sometimes singled out as a goof, although there’s no reason why a sufficiently sophisticated video phone wouldn’t be able to track her automatically.) It’s a scene that few other films would have even thought to include, and now that video chat is something that we all take for granted, we can see through the screen to the touchingly sweet girl on the other side. On some level, Kubrick simply wanted his daughter to be in the movie, and you can’t blame him.

At the time, 2001 was criticized as a soulless hunk of technology, but now it seems deeply human, at least compared to many of its imitators. Yesterday in the New York Times, Bruce Handy shared a story from Keir Dullea, who explained why he breaks the glass in the hotel room at the end, just before he comes face to face with himself as an old man:

Originally, Stanley’s concept for the scene was that I’d just be eating and hear something and get up. But I said, “Stanley, let me find some slightly different way that’s kind of an action where I’m reaching—let me knock the glass off, and then in mid-gesture, when I’m bending over to pick it up, let me hear the breathing from that bent-over position.” That’s all. And he says, “Oh, fine. That sounds good.” I just wanted to find a different way to play the scene than blankly hearing something. I just thought it was more interesting.

I love this anecdote, not just because it’s an example of an evocative moment that arose from an actor’s pragmatic considerations, but because it feels like an emblem of the production of the movie as a whole. 2001 remains the most technically ambitious movie of all time, but it was also a project in which countless issues were being figured out on the fly. Every solution was a response to a specific problem, and it covered a dizzying range of challenges—from the makeup for the apes to the air hostess walking upside down—that might have come from different movies entirely.

2001, in short, was made by hand—and it’s revealing that many viewers assume that computers had to be involved, when they didn’t figure in the process at all. (All of the “digital” readouts on the spacecraft, for instance, were individually animated, shot on separate reels of film, and projected onto those tiny screens on set, which staggers me even to think about it. And even after all these years, I still can’t get my head around the techniques behind the Star Gate sequence.) It reminds me, in fact, of another movie that happens to be celebrating an anniversary this year. As a recent video essay pointed out, if the visual effects in Jurassic Park have held up so well, it’s because most of them aren’t digital at all. The majority consist of a combination of practical effects, stop motion, animatronics, raptor costumes, and a healthy amount of misdirection, with computers used only when absolutely necessary. Each solution is targeted at the specific problems presented by a snippet of film that might last just for a few seconds, and it moves so freely from one trick to another that we rarely have a chance to see through it. It’s here, not in A.I., that Spielberg got closest to Kubrick, and it hints at something important about the movies that push the technical aspects of the medium. They’re often criticized for an absence of humanity, but in retrospect, they seem achingly human, if only because of the total engagement and attention that was required for every frame. Most of their successors lack the same imaginative intensity, which is a greater culprit than the use of digital tools themselves. Today, computers are used to create effects that are perfect, but immediately forgettable. And one of the wonderful ironies of 2001 is that it used nothing but practical effects to create a computer that no viewer can ever forget.

The psychedelic nightmare

Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

On June 24, 1968, Ron Hopkins, an officer of the Church of Scientology, issued a secret policy statement to all members under his jurisdiction in the United Kingdom. It read in full: “No staff or current students are to see the film 2001: A Space Odyssey. The film produces heavy and unnecessary restimulation.” A few months later, the author William S. Burroughs wrote to his friend Brion Gysin: “Incidentally I thoroughly enjoyed 2001. More fun than a roller coaster. I knew I wanted to see it when all Scientologists were told it was off limits.” To the best of my knowledge, we don’t know precisely why the movie troubled the church, although it isn’t hard to guess. In dianetics, “restimulation” refers to the awakening of traumatic memories, often from past lives, and even the experience of seeing this film in a theater might have seemed like an unnecessary risk. In a lecture on the story of Xenu, L. Ron Hubbard explained the sad fate of the thetans, the disembodied souls who have clung for millions of years to unsuspecting humans:

[The thetans] were brought down, packed up, and put in front of projection machines, which were sound and color pictures. First [it] gave them the implant which you know as “clearing course.” And then a whole track implanted which you know as OT II. After this however, about the remainder of the thirty-six days, which is the bulk of them, is taken up with a 3D super colossal motion picture, which has to do with God, the Devil, space opera, etc.

And the uneasiness that Scientologists felt toward 2001 was only an extreme version of the ambivalence of many fans toward a movie that represented the most ambitious incursion that the genre had ever made into the wider culture.

As far as I can determine, we don’t know what Robert A. Heinlein thought of the film, although he presumably saw it—it was screened one night on the S.S. Statendam, the ocean liner on which he sailed on the ill-fated Voyage Beyond Apollo cruise in 1972. And Isaac Asimov had a few surprising brushes with the production itself. Arthur C. Clarke called him to discuss a plot point about the evolution of vegetarians into omnivores, and a year and a half later, Asimov came close to actually being in the movie:

Arthur Clarke was working with Stanley Kubrick to put out a motion picture called 2001: A Space Odyssey, and Kubrick, who was investing millions in what might have seemed a very dubious venture…was searching for ways to promote it properly. One way was to get a group of high-prestige individuals to make the movie respectable by having them submit to movie-camera interviews in which they would speak on such subjects as the possibility of extraterrestrial life. I was one of those approached, and I spent hours on May 18, 1966 doing the interview in one of the rooms in the Anatomy Department [at Boston University]…Afterward I heard that Carl Sagan had been approached and had refused to cooperate since no money was involved. It made me uneasily aware that I had given myself away for nothing and had exposed myself as valueless by the only measure Hollywood valued—money. But it was for Arthur Clarke, I told myself, and you can’t let a pal down.

Ultimately, the idea of the talking heads was dropped, and none of his footage made it into the finished film. Asimov later wrote approvingly of the movie’s “realistic portrayal of space travel” and called it a “classic,” but although he praised its special effects, he never seems to have said much about its merits as entertainment.

As far as John W. Campbell is concerned, I haven’t been able to find any opinions that he expressed on it in public—an unusual omission for an editor who was seldom reluctant to speak his mind about anything. In 1968, however, Analog took the unusual step of running what amounted to two reviews of the film, one by G. Harry Stine, the other by book critic P. Schuyler Miller. Stine, an author and rocket scientist who was close to both Campbell and Heinlein, hated the movie:

We thought that here, perhaps, would be a suitable sequel to the fabulous Destination Moon made twenty years ago…When the final title credits were flashed across the Cinerama screen after the New York premiere, I sat there with the feeling that I’d been had. It’s too bad that the film is billed as science fiction, because it isn’t. It is ninety percent “gee whiz” science gadgetry and ten percent fantasy nonsense…[Audiences] will believe that it is a solid look at the technology of the future. They will instead see a film that is the most cleverly made, subtly done attack on science and technology that has ever been made…It disintegrates into an unexplainable, nonscientific, anti-intellectual psychedelic nightmare.

Stine criticized the HAL subplot “because Kubrick and Clarke did not use or recall Asimov’s Three Laws of Robotics,” and he lamented the film’s lack of characterization and conflict, adding without irony that these were qualities “rarely lacking in [the] pages” of Analog. A month later, in a combined review of the book and the movie, which he called “tantalizing,” Miller was slightly kinder to the latter: “Technically, it is certainly the most advanced science fiction film we have ever had…The film will be remembered; the book won’t.”

None of these criticisms are necessarily wrong, although I’d argue that the performances, which Miller called “wooden,” have held up better than anybody could have expected. But much of the response feels like an attempt by lifelong fans to grapple with a major effort by an outsider. Three decades earlier, Campbell had reacted in a similar way to a surprise move into science fiction by Kubrick’s most noteworthy precursor. In 1938, after the airing of the Mercury Theatre’s radio adaptation of The War of the Worlds, Campbell wrote to his friend Robert Swisher: “So far as sponsoring that War of [the] Worlds thing—I’m damn glad we didn’t! The thing is going to cost CBS money, what with suits, etc., and we’re better off without it.”  In Astounding, he said that the ensuing panic demonstrated the need for “wider appreciation” of science fiction, in order to educate the public about what was and wasn’t real:

I have long been an exponent of the belief that, should interplanetary visitors actually arrive, no one could possibly convince the public of the fact. These stories wherein the fact is suddenly announced and widespread panic immediately ensues have always seemed to me highly improbable, simply because the average man did not seem ready to visualize and believe such a statement. Undoubtedly, Mr. Orson Welles felt the same way.

Campbell, who was just a few years older than Welles, seems to have quickly tired of being asked about The War of the Worlds, which he evidently saw as an encroachment on his turf. 2001 felt much the same to many fans. Fifty years later, it’s easier to see it as an indispensable part of the main line of hard science fiction—and perhaps even as its culmination. But it didn’t seem that way at the time.

The cosmic order

Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

On April 2, 1968, the world premiere of 2001: A Space Odyssey was held at the Uptown Theater, a movie palace in the Cleveland Park neighborhood of Washington, D.C. Two days later, Martin Luther King, Jr. was assassinated in Memphis, sparking riots throughout the country, including the nation’s capital. At first, this might seem like another reminder of how we unconsciously compartmentalize the past, filing events into separate categories, like the moon landing and the Manson killings, that actually unfolded in a confused present tense. Three years ago, the artist Edgar Arceneaux released an experimental film, A Time to Break Silence, that tried to go deeper, explaining in an interview:

Stanley Kubrick, Arthur C. Clarke and Dr. King were formulating their ideas about the duality of technology, which can be used as both a weapon and tool, during the same time period. As the psychic trauma of Dr. King’s death had the nation in a raw state of anger and uncertainty, a film chronicling the genealogy of humanity’s troubled future with technology is released in theaters.

More often, however, we tend to picture the political upheavals of the sixties as moving along a separate track from the decade’s scientific and technological achievements. In his book on the making of 2001, the journalist Piers Bizony writes of its visual effects team: “The optimism of Kubrick’s technologists seemed unquenchable. Perhaps, like their counterparts at Cape Kennedy, they were just too busy in their intense and closed-off little world to notice Vietnam, Martin Luther King, LSD, the counterculture?”

But that isn’t really true. John W. Campbell liked to remind his authors: “The future doesn’t happen one at a time.” And neither does the past or the present. We find, for instance, that King himself—who was a man who thought about everything—spoke and wrote repeatedly about the space program. At first, like many others, he saw it through the lens of national security, saying in a speech on February 2, 1959: “In a day when Sputniks and Explorers dash through outer space and guided ballistic missiles are carving highways of death through the stratosphere, nobody can win a war.” Yet it remained on his mind, and images of space began to appear more often in his public statements over the following year. A few months later, in a sermon titled “Unfulfilled Hopes,” he said:

We look out at the stars; we find ourselves saying that these stars shine from their cold and serene and passionless height, totally indifferent to the joys and sorrows of men. We begin to ask, is man a plaything of a callous nature, sometimes friendly and sometimes inimical? Is man thrown out as a sort of orphan in the terrifying immensities of space, with nobody to guide him on and nobody concerned about him? These are the questions we ask, and we ask them because there is an element of tragedy in life.

And King proclaimed in a commencement speech at Morehouse College in June: “Man through his scientific genius has been able to dwarf distance and place time in chains. He has been able to carve highways through the stratosphere, and is now making preparations for a trip to the moon. These revolutionary changes have brought us into a space age. The world is now geographically one.”

King’s attitude toward space was defined by a familiar tension. On one hand, space travel is a testament to our accomplishments as a species; on the other, it diminishes our achievements by forcing us to confront the smallness of our place in the universe. On December 11, 1960, King emphasized this point in a sermon at the Unitarian Church of Germantown, Pennsylvania:

All of our new developments can banish God neither from the microcosmic compass of the atom nor from the vast unfathomable ranges of interstellar space, living in a universe in which we are forced to measure stellar distance by light years, confronted with the illimitable expanse of the universe in which stars are five hundred million billion miles from the Earth, which heavenly bodies travel at incredible speed and in which the ages of planets are reckoned in terms of billions of years. Modern man is forced to cry out with the psalmist of old: “When I behold the heavens, the work of thy hands, the moon, the stars, and all that thou hast created, what is man that thou art mindful of him and the son of man that thou remembereth him?”

In 1963, King made the comparison more explicit in his book The Strength to Love: “Let us notice, first, that God is able to sustain the vast scope of the physical universe. Here again, we are tempted to feel that man is the true master of the physical universe. Manmade jet planes compress into minutes distances that formerly required weeks of tortuous effort. Manmade spaceships carry cosmonauts through outer space at fantastic speeds. Is God not being replaced in the mastery of the cosmic order?” But after reminding us of the scale of the distances involved, King concludes: “We are forced to look beyond man and affirm anew that God is able.”

This seems very much in the spirit of 2001, which is both a hymn to technology and a meditation on human insignificance. For King, however, the contrast between the triumphs of engineering and the vulnerability of the individual wasn’t just an abstract notion, but a reflection of urgent practical decisions that had to be made here and now. Toward the end of his life, he framed it as a choice of priorities, as he did in a speech in 1967: “John Kenneth Galbraith said that a guaranteed national income could be done for about twenty billion dollars a year. And I say to you today, that if our nation can spend thirty-five billion dollars to fight an unjust, evil war in Vietnam, and twenty billion dollars to put a man on the moon, it can spend billions of dollars to put God’s children on their own two feet right here on Earth.” The following year, speaking to the Rabbinical Assembly in the Catskills, he was even more emphatic: “It must be made clear now that there are some programs that we can cut back on—the space program and certainly the war in Vietnam—and get on with this program of a war on poverty.” And on March 18, 1968, King said to the striking sanitation workers in Memphis, whom he would visit again on the day before he died:

I will hear America through her historians, years and generations to come, saying, “We built gigantic buildings to kiss the skies. We built gargantuan bridges to span the seas. Through our spaceships we were able to carve highways through the stratosphere. Through our airplanes we are able to dwarf distance and place time in chains. Through our submarines we were able to penetrate oceanic depths.” It seems that I can hear the God of the universe saying, “Even though you have done all of that, I was hungry and you fed me not, I was naked and you clothed me not. The children of my sons and daughters were in need of economic security and you didn’t provide it for them. And so you cannot enter the kingdom of greatness.”

How the solar system was won

Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

When Stanley Kubrick hired Arthur C. Clarke to work on the project that became 2001: A Space Odyssey, they didn’t have a title, a plot, or even much in the way of a premise. In Kubrick’s introductory letter to the author, he had written only that his interest lay in “these broad areas, naturally assuming great plot and character”:

1. The reasons for believing in the existence of intelligent extraterrestrial life.
2. The impact (and perhaps even lack of impact in some quarters) such discovery would have on earth in the near future.
3. A space probe with a landing and exploration of the moon and Mars.

If you’ve seen the movie, you know that almost none of what Kubrick describes here ended up in the finished film. The existence of extraterrestrial life is the anthropic assumption on which the entire story rests; there’s no real attempt to sketch in the larger social context; and the discovery of the alien artifact—far from having any impact on society—remains a secret until the end to all but a few scientists. There’s already a thriving colony on the moon when the main action of the story really starts, and Heywood Floyd only turns up after the monolith has been found. All that remains of Kubrick’s original conception, in fact, is a vague feeling that he tried to convey early in their partnership, which Clarke remembered later as the desire to make “a movie about man’s relation to the universe—something which had never been attempted, still less achieved, in the history of motion pictures.”

In this respect, they undoubtedly succeeded, and a lot of it had to do with Kubrick’s choice of collaborator. Yesterday, I suggested that Kubrick settled on Clarke because he was more likely than the other obvious candidates to be available for the extended writing process that the director had in mind. (This was quite an assumption, since it meant that Clarke had to be away from his home in Ceylon for more than a year, but it turned out to be right.) Yet Clarke was also uniquely qualified to write about “man’s relation to the universe,” and in particular about aliens who were far in advance of the human race. As Isaac Asimov has memorably explained, this was a plot point that was rarely seen in Astounding, mostly because of John W. Campbell’s personal prejudices:

[Campbell] was a devout believer in the inequality of man and felt that the inequality could be detected by outer signs such as skin and hair coloring…In science fiction, this translated itself into the Campbellesque theory that earthmen (all of whom, in the ideal Campbell story, resembled people of northwestern European extraction) were superior to all other intelligent races.

Clarke had broken through in Astounding after the war—his stories “Loophole” and “Rescue Party” appeared in 1946—but geographical distance and foreign rights issues had kept him from being shaped by Campbell to any real extent. As a result, he was free to indulge in such works as Childhood’s End, the ultimate story about superior aliens, which was inspired by Campbell’s novel The Mightiest Machine but ran its first installment in the British magazine New Worlds.

Clarke, in short, was unquestionably part of the main sequence of hard science fiction that Campbell had inaugurated, but he was also open to exploring enormous, borderline mystical questions that emphasized mankind’s insignificance. (At his best, in such stories as “The Star” and “The Nine Billion Names of God,” he managed to combine clever twist endings with a shattering sense of scale in a way that no other writer has ever matched.) It was this unlikely combination of wit, technical rigor, and awareness of the infinite that made him ideally suited to Kubrick, and they promptly embarked on one of the most interesting collaborations in the history of the genre. The only comparable example of such a symbiotic partnership is that of Campbell and the young Asimov, except that Clarke and Kubrick were both mature artists at the peak of their talents. Fortunately for us, Clarke kept a journal, and he provided excerpts in two fascinating essays, “Christmas, Shepperton” and “Monoliths and Manuscripts,” which were published in the collection The Lost Worlds of 2001. The entries offer a glimpse of a process that ranged freely in all directions, with both men pursuing trains of thought as far as they would go before abandoning them for something better. As Clarke writes:

It was [Kubrick’s] suggestion that, before embarking on the drudgery of the script, we let our imaginations soar freely by developing the story in the form of a complete novel…After various false starts and twelve-hour talkathons, by early May 1964 Stanley agreed that [Clarke’s short story] “The Sentinel” would provide good story material. But our first concept—and it is hard now for me to focus on such an idea, though it would have been perfectly viable—involved working up to the discovery of an extraterrestrial artifact as the climax, not the beginning, of the story. Before that, we would have a series of incidents or adventures devoted to the exploration of the moon and planets…[for which] our private title (never of course intended for public use) was How the Solar System Was Won.

And while 2001 arguably made its greatest impact on audiences with its meticulous art direction and special effects, Kubrick’s approach to writing was equally obsessive. He spent a full year developing the story with Clarke before approaching the studio for financing, and although they soon realized that the premise of “The Sentinel” would work better as an inciting incident, rather than as the ending, the notion of “incidents or adventures” persisted in the finished script. The film basically consists of four loosely connected episodes, the most memorable of which—the story of HAL 9000—could be eliminated without fundamentally affecting the others. But if it feels like an organic whole, this is largely thanks to the decision to develop far more material than could ever fit into a novel, much less a movie. (Clarke’s diary entries are filled with ideas that were dropped or transformed in the final version: “The people we meet on the other star system are humans who were collected from earth a hundred thousand years ago, and hence are virtually identical to us.” “What if our E.T.s are stranded on earth and need the ape-men to help them?” And then there’s the startling line, which Clarke, who was discreetly gay, records without comment: “Stanley has invented the wild idea of slightly fag robots who create a Victorian environment to put our heroes at their ease.”) It verged on a private version of development hell, without any studio notes or interference, and it’s hard to imagine any other director who could have done it. 2001 started a revolution in visual effects, but its writing process was just as remarkable, and we still haven’t caught up to it yet. Even Clarke, whose life it changed, found Kubrick’s perfectionism hard to take, and he concluded: “In the long run, everything came out all right—exactly as Stanley had predicted. But I can think of easier ways of earning a living.”

When Clarke Met Kubrick

Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

“I’m reading everything by everybody,” Stanley Kubrick said one day over lunch in New York. It was early 1964, and he was eating at Trader Vic’s with Roger A. Caras, a wildlife photographer and studio publicist who was working at the time for Columbia Pictures. Dr. Strangelove had just been released, and after making small talk about their favorite brand of telescope, Caras asked the director what he had in mind for his next project. Kubrick replied that he was thinking about “something on extraterrestrials,” but he didn’t have a writer yet, and in the meantime, he was consuming as much science fiction as humanly possible. Unfortunately, we don’t know much about what he was reading, which is a frustrating omission in the career of a filmmaker whose archives have been the subject of so many exhaustive studies. In his biography of Kubrick, Vincent Lobrutto writes tantalizingly of this period: “Every day now boxes of science fiction and fact books were being delivered to his apartment. Kubrick was immersing himself in a subject he would soon know better than most experts. His capacity to grasp and disseminate information stunned many who worked with him.” Lobrutto notes that Kubrick took much the same approach a decade later on the project that became The Shining, holing up in his office with “stacks of horror books,” and the man with whom he would eventually collaborate on 2001 recalled of their first meeting: “[Kubrick] had already absorbed an immense amount of science fact and science fiction, and was in some danger of believing in flying saucers.” At their lunch that day at Trader Vic’s, however, Caras seemed to think that all of this work was unnecessary, and he told this to Kubrick in no uncertain terms: “Why waste your time? Why not just start with the best?”

Let’s pause the tape here for a moment to consider what other names Caras might plausibly have said. A year earlier, in his essay “The Sword of Achilles,” Isaac Asimov provided what we can take as a fairly representative summary of the state of the genre:

Robert A. Heinlein is usually considered the leading light among good science fiction writers. Others with a fine grasp of science and a fascinatingly imaginative view of its future possibilities are Arthur C. Clarke, Frederik Pohl, Damon Knight, James Blish, Clifford D. Simak, Poul Anderson, L. Sprague de Camp, Theodore Sturgeon, Walter Miller, A.J. Budrys…These are by no means all.

Even accounting for the writer and the time period, there are a few noticeable omissions—it’s surprising not to see Lester del Rey, for instance, and A.E. van Vogt, who might not have qualified as what Asimov saw as “good science fiction,” had been voted one of the top four writers in the field in a pair of polls a few years earlier. It’s also necessary to add Asimov himself, who at the time was arguably the science fiction writer best known to general readers. (In 1964, he would even be mentioned briefly in Saul Bellow’s novel Herzog, which was the perfect intersection of the highbrow and the mainstream.) Arthur C. Clarke’s high ranking wasn’t just a matter of personal affection, either—he and Asimov later became good friends, but when the article was published, they had only met a handful of times. Clarke, in other words, was clearly a major figure. But it seems fair to say that anyone claiming to name “the best” science fiction writer in the field might very well have gone with Asimov or Heinlein instead.

Caras, of course, recommended Clarke, whom he had first met five years earlier at a weekend in Boston with Jacques Cousteau. Kubrick was under the impression that Clarke was a recluse, “a nut who lives in a tree in India someplace,” and after being reassured that he wasn’t, the director became excited: “Jesus, get in touch with him, will you?” Caras sent Clarke a telegram to ask about his availability, and when the author said that he was “frightfully interested,” Kubrick wrote him a fateful letter:

It’s a very interesting coincidence that our mutual friend Caras mentioned you in a conversation we were having about a Questar telescope. I had been a great admirer of your books for quite a time and had always wanted to discuss with you the possibility of doing the proverbial “really good” science-fiction movie…Roger tells me you are planning to come to New York this summer. Do you have an inflexible schedule? If not, would you consider coming sooner with a view to a meeting, the purpose of which would be to determine whether an idea might exist or arise which could sufficiently interest both of us enough to want to collaborate on a screenplay?

This account of the conversation differs slightly from Caras’s recollection—Kubrick doesn’t say that they were actively discussing potential writers for a film project, and he may have been flattering Clarke slightly with the statement that he had “always wanted” to talk about a movie with him. But it worked. Clarke wrote back to confirm his interest, and the two men finally met in New York on April 22, where the author did his best to talk Kubrick out of his newfound interest in flying saucers.

But why Clarke? At the time, Kubrick was living on the Upper East Side, which placed him within walking distance of many science fiction authors who were considerably closer than Ceylon, and it’s tempting to wonder what might have happened if he had approached Heinlein or Asimov, both of whom would have been perfectly sensible choices. A decade earlier, Heinlein made a concerted effort to break into Hollywood with the screenplays for Destination Moon and Project Moon Base, and the year before, he had written an unproduced teleplay for a proposed television show called Century XXII. (Kubrick studied Destination Moon for its special effects, if not for its story, as we learn from the correspondence of none other than Roger Caras, who had gone to work for Kubrick’s production company.) Asimov, for his part, was more than willing to explore such projects—in years to come, he would meet to discuss movies with Woody Allen and Paul McCartney, and I’ve written elsewhere about his close encounter with Steven Spielberg. But if Kubrick went with Clarke instead, it wasn’t just because they had a friend in common. At that point, Clarke was a highly respected writer, but not yet a celebrity outside the genre, and the idea of a “Big Three” consisting of Asimov, Clarke, and Heinlein was still a decade away. His talent was undeniable, but he was also a more promising candidate for the kind of working relationship that the director had in mind, which Kubrick later estimated as “four hours a day, six days a week” for more than three years. I suspect that Kubrick recognized what might best be described as a structural inefficiency in the science fiction market. The time and talents of one of the most qualified writers imaginable happened to be undervalued and available at just the right moment. When the opportunity came, Kubrick seized it. And it turned out to be one hell of a bargain.

The ultimate trip

On Saturday, I was lucky enough to see 2001: A Space Odyssey on the big screen at the Music Box Theatre in Chicago. I’ve seen this movie well over a dozen times, but watching it on a pristine new print from the fourth row allowed me to pick up on tiny details that I’d never noticed before, such as the fact that David Bowman, stranded at the end in his celestial hotel room, ends up wearing a blue velvet robe startlingly like Isabella Rossellini’s. I was also struck by the excellence of the acting, which might sound like a joke, but it isn’t. Its human protagonists have often been dismissed—Roger Ebert, who thought it was one of the greatest films of all time, called it “a bloodless movie with faceless characters”—and none of the actors, aside from Douglas Rain as the voice of HAL, are likely to stick in the memory. (As Noël Coward reputedly said: “Keir Dullea, gone tomorrow.”) But on an objective level, these are nothing less than the most naturalistic performances of any studio movie of the sixties. There isn’t a trace of the affectation or overacting that you see in so much science fiction, and Dullea, Gary Lockwood, and particularly William Sylvester, in his nice dry turn as Heywood Floyd, are utterly believable. You could make a strong case that their work here has held up better than most of the more conventionally acclaimed performances from the same decade. This doesn’t make them any better or worse, but it gives you a sense of what Kubrick, who drew his characters as obsessively as his sets and special effects, was trying to achieve. He wanted realism in his acting, along with everything else, and this is how it looks, even if we aren’t used to seeing it in space.

The result is still the most convincing cinematic vision of space exploration that we have, as well as the most technically ambitious movie ever made, and its impact, like that of all great works of art, appears in surprising places. By coincidence, I went to see 2001 the day after Donald Trump signed an executive order to reinstate the National Space Council, at a very peculiar ceremony that was held with a minimum of fanfare. The event was attended by Buzz Aldrin, who has played scenes across from Homer Simpson and Optimus Prime, and I can’t be sure that this didn’t strike him as the strangest stage he had ever shared. Here are a few of Trump’s remarks, pulled straight from the official transcript:

Security is going to be a very big factor with respect to space and space exploration.  At some point in the future, we’re going to look back and say, how did we do it without space? The Vice President will serve as the council’s chair….Some of the most successful people in the world want to be on this board…Our journey into space will not only make us stronger and more prosperous, but will unite us behind grand ambitions and bring us all closer together. Wouldn’t that be nice? Can you believe that space is going to do that? I thought politics would do that. Well, we’ll have to rely on space instead…We will inspire millions of children to carry on this proud tradition of American space leadership—and they’re excited—and to never stop wondering, hoping, and dreaming about what lies beyond the stars.

Taking a seat, Trump opened the executive order, exclaiming: “I know what this is. Space!” Aldrin then piped up with what was widely reported as a reference to Toy Story: “Infinity and beyond!” Trump seemed pleased: “This is infinity here. It could be infinity. We don’t really don’t know. But it could be. It has to be something—but it could be infinity, right?”

As HAL 9000 once said: “Yes, it’s puzzling.” Aldrin may have been quoting Toy Story, but he might well have been thinking of 2001, too, the last section of which is titled “Jupiter and Beyond the Infinite.” (As an aside, I should note that the line “To infinity and beyond” makes its first known appearance, as far as I can tell, in John W. Campbell’s 1934 serial The Mightiest Machine.) It’s an evocative but meaningless phrase, with the same problems that led Arthur C. Clarke to express doubts about Kubrick’s working title, Journey Beyond the Stars—which Trump, you’ll notice, also echoed. Its semantic content is nonexistent, which is only fitting for a ceremony that underlined the intellectual bankruptcy of this administration’s approach to space. I don’t think I’m overstating the matter when I say that Trump and Mike Pence have shown nothing but contempt for other forms of science. The science division of the Office of Science and Technology Policy lies empty. Pence has expressed bewilderment at the fact that climate change has emerged, “for some reason,” as an issue on the left. And Trump has proposed significant cuts to science and technology funding agencies. Yet his excitement for space seems unbounded and apparently genuine. He asked eagerly of astronaut Peggy Whitson: “Tell me, Mars, what do you see a timing for actually sending humans to Mars? Is there a schedule and when would you see that happening?” And the reasons behind his enthusiasm are primarily aesthetic and emotional. One of his favorite words is “beautiful,” in such phrases as “big, beautiful wall” and “beautiful military equipment,” and it was much in evidence here: “It is America’s destiny to be at the forefront of humanity’s eternal quest for knowledge and to be the leader amongst nations on our adventure into the great unknown. And I could say the great and very beautiful unknown. Nothing more beautiful.”

But the truly scary thing is that if Trump believes that the promotion of space travel can be divorced from any concern for science itself, he’s absolutely right. As I’ve said here before, in the years when science fiction was basically a subcategory of adventure fiction, with ray guns instead of revolvers, space was less important in itself than as the equivalent of the unexplored frontier of the western: it stood for the unknown, and it was a perfect backdrop for exciting plots. Later, when the genre began to take itself more seriously as a predictive literature, outer space was grandfathered in as a setting, even if it had little to do with any plausible vision of the future. Space exploration seemed like an essential part of our destiny as a species because it happened to be part of the genre already. As a result, you can be excited by the prospect of going to Mars while actively despising or distrusting everything else about science—which may be the only reason that we managed to get to the moon at all. (These impulses may have less to do with science than with religion. The most haunting image from the Apollo 11 mission, all the more so because it wasn’t televised, may be that of Aldrin taking communion on the lunar surface.) Science fiction made it possible, and part of the credit, or blame, falls on Kubrick. Watching 2001, I had tears in my eyes, and I felt myself filled with all my old emotions of longing and awe. As Kubrick himself stated: “If 2001 has stirred your emotions, your subconscious, your mythological yearnings, then it has succeeded.” And it did, all too well, at the price of separating our feelings for space even further from science, and of providing a place for those subconscious urges to settle while leaving us consciously indifferent to problems closer to home. Kubrick might not have faked the moon landing, but he faked a Jupiter mission, and he did it beautifully. And maybe, at least for now, it should save us the expense of doing it for real.

From Sputnik to WikiLeaks

In Toy Story 2, there’s a moment in which Woody discovers that his old television series, Woody’s Roundup, was abruptly yanked off the air toward the end of the fifties. He asks: “That was a great show. Why cancel it?” The Prospector replies bitterly: “Two words: Sput-nik. Once the astronauts went up, children only wanted to play with space toys.” And while I wouldn’t dream of questioning the credibility of a man known as Stinky Pete, I feel obliged to point out that his version of events isn’t entirely accurate. The space craze among kids really began more than half a decade earlier, with the premiere of Tom Corbett, Space Cadet, and the impact of Sputnik on science fiction was far from a positive one. Here’s what John W. Campbell wrote about it in the first issue of Astounding to be printed after the satellite’s launch:

Well, we lost that race; Russian technology achieved an important milestone in human history—one that the United States tried for, talked about a lot, and didn’t make…One of the things Americans have long been proud of—and with sound reason—is our ability to convert theoretical science into practical, working engineering…This time we’re faced with the uncomfortable realization that the Russians have beaten us in our own special field; they solved a problem of engineering technology faster and better than we did.

And while much of the resulting “Sputnik crisis” was founded on legitimate concerns—Sputnik was as much a triumph of ballistic rocketry as it was of satellite technology—it also arose from the notion that the United States had been beaten at its own game. As Arthur C. Clarke is alleged to have said, America had become “a second-rate power.”

Campbell knew right away that he had reason to worry. Lester del Rey writes in The World of Science Fiction:

Sputnik simply convinced John Campbell that he’d better watch his covers and begin cutting back on space scenes. (He never did, but the art director of the magazine and others were involved in that decision.) We agreed in our first conversation after the satellite went up that people were going to react by deciding science had caught up with science fiction, and with a measure of initial fear. They did. Rather than helping science fiction, Sputnik made it seem outmoded.

And that’s more or less exactly what happened. There was a brief spike in sales, followed by a precipitous fall as mainstream readers abandoned the genre. I haven’t been able to find specific numbers for this period, but one source, the Australian fan Wynne Whitford, states that the circulation of Astounding fell by half after Sputnik—which seems high, but probably reflects a real decline. In a letter written decades later, Campbell said of Sputnik: “Far from encouraging the sales of science fiction magazines—half the magazines being published lost circulation so drastically they went out of business!” An unscientific glance at a list of titles appears to support this. In 1958, the magazines Imagination, Imaginative Tales, Infinity Science Fiction, Phantom, Saturn, Science Fiction Adventures, Science Fiction Quarterly, Star Science Fiction, and Vanguard Science Fiction all ceased publication, followed by three more over the next twelve months. The year before, just four magazines had folded. There was a bubble, and after Sputnik, it burst.

At first, this might seem like a sort of psychological self-care, of the same kind that motivated me to scale back my news consumption after the election. Americans were simply depressed, and they didn’t need any reminders of the situation they were in. But it also seems to have affected the public’s appetite for science fiction in particular, rather than science as a whole. In fact, the demand for nonfiction science writing actually increased. As Isaac Asimov writes in his memoir In Joy Still Felt:

The United States went into a dreadful crisis of confidence over the fact that the Soviet Union had gotten there first and berated itself for not being interested enough in science. And I berated myself for spending too much time on science fiction when I had the talent to be a great science writer…Sputnik also served to increase the importance of any known public speaker who could talk on science and, particularly, on space, and that meant me.

What made science fiction painful to read, I think, was its implicit assumption of American superiority, which had been disproven so spectacularly. Campbell later compared it to the reaction after the bomb fell, claiming that it was the moment when people realized that science fiction wasn’t a form of escapism, but a warning:

The reactions to Sputnik have been more rapid, and, therefore, more readily perceptible and correlatable. There was, again, a sudden rise in interest in science fiction…and there is, now, an even more marked dropping of the science-fiction interest. A number of the magazines have been very heavily hit…I think the people of the United States thought we were kidding.

And while Campbell seemed to believe that readers had simply misinterpreted science fiction’s intentions, the conventions of the genre itself clearly bore part of the blame.

In his first editorials after Sputnik, Campbell drew a contrast between the American approach to engineering, which proceeded logically and with vast technological resources, and the quick and dirty Soviet program, which was based on rules of thumb, trial and error, and the ability to bull its way through on one particular point of attack. It reminds me a little of the election. Like the space race, last year’s presidential campaign could be seen as a kind of proxy war between the American and Russian administrations, and regardless of what you believe about the Trump camp’s involvement, which I suspect was probably a tacit one, there’s no question as to which side Putin favored. On one hand, you had a large, well-funded political machine, and on the other, one that often seemed comically inept. Yet it was the quick and dirty approach that triumphed. “The essence of ingenuity is the ability to get precision results without precision equipment,” Campbell wrote, and that’s pretty much what occurred. A few applications of brute force in the right place made all the difference, and they were aided, to some extent, by a similar complacency. The Americans saw the Soviets as bunglers, and they never seriously considered the possibility that they might be beaten by a bunch of amateurs. As Campbell put it: “We earned what we got—fully, and of our own efforts. The ridicule we’ve collected is our just reward for our consistent efforts.” Sometimes I feel the same way. Right now, we’re entering a period in which the prospect of becoming a second-rate power is far more real than it was when Clarke made his comment. It took a few months for the implications of Sputnik to really sink in. And if history is any indication, we haven’t even gotten to the crisis yet.

A Hawk From a Handsaw, Part 3

with 10 comments

Hermann Göring with falcon

Over the last few days, I’ve been doing my best Robert Anton Wilson impression, and, like him, I’ve been seeing hawks everywhere. Science fiction is full of them. The Skylark of Space, which is arguably the story that kicked off the whole business in the first place, was written by E.E. Smith and his friend Lee Hawkins Garby, who is one of those women who seem to have largely fallen out of the history of the genre. Then there’s Hawk Carse, the main character of a series of stories, written for Astounding by editors Harry Bates and Desmond W. Hall, that have become synonymous with bad space opera. And you’ve got John W. Campbell himself, who was described as having “hawklike” features by the fan historian Sam Moskowitz, and who once said of his own appearance: “I haven’t got eyes like a hawk, but the nose might serve.” (Campbell also compared his looks to those of The Shadow and, notably, Hermann Göring, an enthusiastic falconer who loved hawks.) It’s all a diverting game, but it gets at a meaningful point. When Wilson’s wife objected to his obsession with the 23 enigma, pointing out that he was just noticing that one number and ignoring everything else, Wilson could only reply: “Of course.” But he continued to believe in it as an “intuitive signal” that would guide him in useful directions, as well as an illustration of the credo that guided his entire career:

Our models of “reality” are very small and tidy, the universe of experience is huge and untidy, and no model can ever include all the huge untidiness perceived by uncensored consciousness.

We’re living at a time in which the events of the morning can be spun into two contradictory narratives by early afternoon, so it doesn’t seem all that original to observe that you can draw whatever conclusion you like from a sufficiently rich and random corpus of facts. On some level, all too many mental models come down to looking for hawks, noting their appearances, and publishing a paper about the result. And when you’re talking about something like the history of science fiction, which is an exceptionally messy body of data, it’s easy to find the patterns that you want. You could write an overview of the genre that draws a line from A.E. van Vogt to Alfred Bester to Philip K. Dick that would be just as persuasive and consistent as one that ignores them entirely. The same is true of individuals like Campbell and Heinlein, who, like all of us, contained multitudes. It can be hard to reconcile the Campbell who took part in parapsychological experiments at Duke and was editorializing in the thirties about the existence of telepathy in Unknown with the founder of whatever we want to call Campbellian science fiction, just as it can be difficult to make sense of the contradictory aspects of Heinlein’s personality, which is something I haven’t quite managed to do yet. As Borges writes:

Let us greatly simplify, and imagine that a life consists of 13,000 facts. One of the hypothetical biographies would record the series 11, 22, 33…; another, the series 9, 13, 17, 21…; another, the series 3, 12, 21, 30, 39…A history of a man’s dreams is not inconceivable; another, of the organs of his body; another, of the mistakes he made; another, of all the moments when he thought about the Pyramids; another, of his dealings with the night and the dawn.

It’s impossible to keep all those facts in mind at once, so we make up stories about people that allow us to extrapolate the rest, in a kind of lossy compression. The story of Arthur C. Clarke’s encounter with Uri Geller is striking mostly because it doesn’t fit our image of Clarke as the paradigmatic hard science fiction writer, but of course, he was much more than that.

The Falcon Killer

I’ve been focusing on places where science fiction intersects with the mystical because there’s a perfectly valid history to be written about it, and it’s a thread that tends to be overlooked. But perhaps the most instructive paranormal encounter of all happened to none other than Isaac Asimov. In July 1966, Asimov and his family were spending two weeks at a summer house in Concord, Massachusetts. One evening, his daughter ran into the house shouting: “Daddy, Daddy, a flying saucer! Come look!” Here’s how he describes what happened next:

I rushed out of the house to see…It was a cloudless twilight. The sun had set and the sky was a uniform slate gray, still too light for any stars to be visible; and there, hanging in the sky, like an oversize moon, was a perfect featureless metallic circle of something like aluminum.

I was thunderstruck, and dashed back into the house for my glasses, moaning, “Oh no, this can’t happen to me. This can’t happen to me.” I couldn’t bear the thought that I would have to report something that really looked as though it might conceivably be an extraterrestrial starship.

When Asimov went back outside, the object was still there. It slowly began to turn, becoming gradually more elliptical, until the black markings on its side came into view—and it turned out to be the Goodyear blimp. Asimov writes: “I was incredibly relieved!” Years later, his daughter told the New York Times: “He nearly had a heart attack. He thought he saw his career going down the drain.”

It’s a funny story in itself, but let’s compare it to what Geller writes about Clarke: “Clarke was not there just to scoff. He had wanted things to happen. He just wanted to be completely convinced that everything was legitimate.” The italics are mine. Asimov, alone of all the writers I’ve mentioned, never had any interest in the paranormal, and he remained a consistent skeptic throughout his life. As a result, unlike the others, he was very rarely wrong. But I have a hunch that it’s also part of the reason why he sometimes seems like the most limited of all major science fiction writers—undeniably great within a narrow range—while simultaneously the most important to the culture as a whole. Asimov became the most famous writer the genre has ever seen because you could basically trust him: it was his nonfiction, not his fiction, that endeared him to the public, and his status as an explainer depended on maintaining an appearance of unruffled rationality. It allowed him to assume a very different role than Campbell, who manifestly couldn’t be trusted on numerous issues, or even Heinlein, who convinced a lot of people to believe him while alienating countless others. But just as W.B. Yeats drew on his occult beliefs as a sort of battery to drive his poetry, Campbell and Heinlein were able to go places where Asimov politely declined to follow, simply because he had so much invested in not being wrong. Asimov was always able to tell the difference between a hawk and a handsaw, no matter which way the wind was blowing, and in some ways, he’s the best model for most of us to emulate. But it’s hard to write science fiction, or to live in it, without seeing patterns that may or may not be there.

A Hawk From a Handsaw, Part 1

with one comment

Uri Geller

I am but mad north-north-west. When the wind is southerly, I know a hawk from a handsaw.

Hamlet

In the summer of 1974, the Israeli magician and purported psychic Uri Geller arrived at Birkbeck College in Bloomsbury, London, where the physicist David Bohm planned to subject him to a series of tests. Two of the scheduled observers were the writers Arthur Koestler and Arthur C. Clarke, of whom Geller writes in his autobiography:

Arthur Clarke…would be particularly important because he was highly skeptical of anything paranormal. His position was that his books, like 2001 and Childhood’s End, were pure science fiction, and it would be highly unlikely that any of their fantasies would come true, at least in his own lifetime.

Geller met the group in a conference room, where Koestler was cordial, although, Geller says, “I sensed that I really wasn’t getting through to Arthur C. Clarke.” A demonstration seemed to be in order, so Geller asked Clarke to hold one of his own house keys in one hand, watching it closely to make sure that it wasn’t being swapped out, handled, or subjected to any trickery. Sure enough, the key began to bend. Clarke cried out, in what I like to think was an inadvertent echo of one of his most famous stories: “My God, my eyes are seeing it! It’s bending!”

Geller went on to display his talents in a number of other ways, including forcing a Geiger counter to click at an accelerated rate merely by concentrating on it. (It has been suggested by the skeptic James Randi that Geller had a magnet taped to his leg.) “By that time,” Geller writes, “Arthur Clarke seemed to have lost all his skepticism. He said something like, ‘My God! It’s all coming true! This is what I wrote about in Childhood’s End. I can’t believe it.’” Geller continues:

Clarke was not there just to scoff. He had wanted things to happen. He just wanted to be completely convinced that everything was legitimate. When he saw that it was, he told the others: “Look, the magicians and the journalists who are knocking this better put up or shut up now. Unless they can repeat the same things Geller is doing under the same rigidly controlled conditions, they have nothing further to say.”

Clarke also told him about the plot of Childhood’s End, which Geller evidently hadn’t read: “It involves a UFO that is hovering over the earth and controlling it. He had written the book about twenty years ago. He said that, after being a total skeptic about these things, his mind had really been changed by observing these experiments.”

The Horus Errand

It’s tempting to think that Geller is exaggerating the extent of the author’s astonishment, but here’s what Clarke himself wrote about it:

Although it’s hard to focus on that hectic and confusing day at Birkbeck College in 1974…I suspect that Uri Geller’s account in My Story is all too accurate…In view of the chaos at the hastily arranged Birkbeck encounter, the phrase “rigidly controlled conditions” is hilarious. But that last sentence is right on target, for [the reproduction of Geller’s effects by stage magicians] is precisely what happened…Nevertheless, I must confess a sneaking fondness for Uri; though he left a trail of bent cutlery and fractured reputations round the world, he provided much-needed entertainment at a troubled and unhappy time.

Geller has largely faded from the public consciousness, but Clarke—who continued to believe long afterward that paranormal phenomena “can’t all be nonsense”—wasn’t the only science fiction writer to be intrigued by him. Robert Anton Wilson, one of my intellectual heroes, discusses him at length in the book Cosmic Trigger, in which he recounts the strange experience of his friend Saul-Paul Sirag. The year before the Birkbeck tests, Sirag was speaking to Geller when he saw the other man’s head turn into a “bird of prey,” like a hawk: “His nose became a beak, and his entire head sprouted feathers, down to his neck and shoulders.” (Sirag was also taking LSD at the time, which Wilson neglects to mention.) The hawk, Sirag thought, was the form assumed by an extraterrestrial intelligence that was allegedly in contact with Geller, and he didn’t know then that it had appeared in the same shape to two other men: a psychic named Ray Stanford and another who had nicknamed it “Horus,” after the Egyptian god with a hawk’s head.

It gets weirder. A few months later, Sirag saw the January 1974 issue of Analog, which featured the story “The Horus Errand” by William E. Cochrane. The cover illustration depicted a man wearing a hawklike helmet, with the name “Stanford” written over his breast pocket. According to one of Sirag’s friends, the occultist Alan Vaughan, the character even looked a little like Ray Stanford—and you can judge the resemblance for yourself. Vaughan was interested enough to write to the artist, the legendary Kelly Freas, for more information. (Freas, incidentally, was close friends with John W. Campbell, to the point where Campbell even asked him to serve as the guardian for his two daughters if anything ever happened to him or his wife.) Freas replied that he had never met Stanford in person and had no idea what he looked like, but that he had once received a psychic consultation from him by mail, in which Stanford said that “Freas had been some sort of illustrator in a past life in ancient Egypt.” As a result, Freas began to employ Egyptian imagery more consciously in his work, and the design of the helmet on the cover was entirely his own, without any reference to the story. At that point, the whole thing kind of peters out, aside from serving as an example of the kind of absurd coincidence that was so close to Wilson’s heart. But the intersection of Arthur C. Clarke, Uri Geller, and Robert Anton Wilson at that particular moment in time is a striking one, and it points toward an important thread in the history of science fiction that tends to be overlooked or ignored. Tomorrow, I’ll be writing more about what it all means, along with a few other ominous hawks.

Santa Claus conquers the Martians

leave a comment »

Santa Claus by Mauri Kunnas

Like most households, my family has a set of traditions that we like to observe during the holiday season. A vinyl copy of A Charlie Brown Christmas spends most of December on our record player, and I never feel as if I’m really in the spirit of things until I’ve listened to Kokomo Jo’s Caribbean Christmas—a staple of my own childhood—and The Ventures’ Christmas Album. My wife and I have started watching the Mystery Science Theater 3000 episode Santa Claus, not to be confused with Santa Claus Conquers the Martians, on an annual basis: it’s one of the best episodes that the show ever did, and I’m still tickled by it after close to a dozen viewings. (My favorite line, as Santa deploys a massive surveillance system to spy on the world’s children: “Increasingly paranoid, Santa’s obsession with security begins to hinder everyday operations.”) But my most beloved holiday mainstay is the book Santa Claus and His Elves by the cartoonist and children’s author Mauri Kunnas. If you aren’t Finnish, you probably haven’t heard of it, and readers from other countries might be momentarily bemused by its national loyalties: Santa’s workshop is explicitly located on Mount Korvatunturi in Lapland. As Kunnas writes: “So far away from human habitation is this village that no one is known to have seen it, except for a couple of old Lapps who stumbled across it by accident on their travels.”

I’ve been fascinated by this book ever since I was a child, and I was saddened when it inexplicably went missing for years, probably stashed away in a Christmas box in my parents’ garage. When my mother bought me a new copy, I was overjoyed, and as I began to read it to my own daughter, I was relieved to find that it holds up as well as always. The appeal of Kunnas’s book lies in its marvelous specificity: it treats Santa’s village as a massive industrial operation, complete with print shops, factories, and a fleet of airplanes. Santa Claus himself barely figures in the story at all. The focus is much more on the elves: where they work and sleep, their schools, their hobbies, and above all how they coordinate the immense task of tracking wish lists, making toys, and delivering presents. (Looking at Kunnas’s lovingly detailed illustrations of their warehouses and machine rooms, it’s hard not to be reminded of an Amazon fulfillment center—and although Jeff Bezos comes closer than anyone in history to realizing Santa’s workshop for real, complete with proposed deliveries by air, I’d like to think that the elves get better benefits.) As you leaf through the book, Santa’s operation starts to feel weirdly plausible, and everything from the “strong liniment” that he puts on his back to the sauna that he and the elves enjoy on their return adds up to a picture that could convince even the most skeptical adult.

Santa Claus by Mauri Kunnas

The result is nothing less than a beautiful piece of speculative fiction, enriched by the tricks that all such writers use: the methodical working out of a seemingly impossible premise, governed by perfect internal logic and countless persuasive details. Kunnas pulls it off admirably. In the classic study Pilgrims Through Space and Time, J.O. Bailey has an entire section titled “Probability Devices,” in which he states: “The greatest technical problem facing the writer of science fiction is that of securing belief…The oldest and perhaps the soundest method for securing suspension of disbelief is that of embedding the strange event in realistic detail about normal, everyday events.” He continues:

[Jules] Verne, likewise, offers minute details. Five Weeks in a Balloon, for instance, figures every pound of hydrogen and every pound of air displaced by it in the filling of the balloon, lists every article packed into the car, and states every detail of date, time (to the minute), and topography.

Elsewhere, I’ve noted that this sort of careful elaboration of hardware is what allows the reader to accept the more farfetched notions that govern the story as a whole—which might be the only thing that my suspense fiction and my short science fiction have in common. Filling out the world I’ve invented with small, accurate touches might be my single favorite part of being a writer, and the availability of such material often makes the difference between a finished story and one that never leaves the conceptual stage.

And when I look back, I wonder if I might not have imbibed much of this from the Santa Claus story, and in particular from Kunnas. Santa, in a way, is one of the first exposures to speculative fiction that any child gets: it inherently strains credulity, but you can’t argue with the gifts that appear under the tree on Christmas Day, and reconciling the implausibility of that story with the concrete evidence requires a true leap of imagination. Speculating that it might be the result of an organized conspiracy of adults is, if anything, an even bigger stretch—just as maintaining secrecy about a faked moon landing for decades would have been a greater achievement than going to the moon for real. Santa Claus, oddly enough, has rarely been a popular subject in science fiction, the Robot Santa on Futurama aside. As Gary Westfahl notes in The Greenwood Encyclopedia of Science Fiction and Fantasy: “As a literature dedicated by its very nature to breaking new ground, perhaps, science fiction is not well suited as a vehicle for ancient time-honored sentiments about the virtues of love and family life.” (It’s no accident that the genre’s most famous treatment of Christmas lies in the devastating ending of Arthur C. Clarke’s “The Star,” which you should read right now if you haven’t before.) But I suspect that those impulses have simply been translated into another form. Robert Anton Wilson once commented on the prevalence of the “greenish-skinned, pointy-eared man” in science fiction and folklore, and he thought they might be manifestations of the peyote god Mescalito. But I prefer to think that most writers are secretly wondering what the elves have been doing all this time…

The good idea trap

with 3 comments

Raymond Chandler

Ideas are poison. The more you reason, the less you create.

—Raymond Chandler

As I’ve noted on this blog many times before, good ideas are cheap. Today, I’d like to make the case that they’re also dangerous, at least when it comes to bringing a story to its full realization. And I say this as someone who has a lot of good ideas. Nearly every novel or short story I’ve written hinges on a clever twist, some of them better than others. (I’m still pleased by the twist in “Kawataro,” and wish I’d done a slightly better job with the one in “The Voices.”) It’s partly for this reason that I tend to focus on suspense and science fiction, which are genres in which conceptual ingenuity is disproportionately rewarded. In some cases, as in many locked-room mysteries and the kind of hard science fiction we find in my beloved Analog, the idea or twist is all there is, and I’m probably not alone in occasionally saving time by skipping ahead to the surprise at once, without having to sit through all the bother of plot or characterization.

Which isn’t to say that a dynamite idea is always a bad thing. A story like Arthur C. Clarke’s “The Star,” for instance, turns almost entirely on the revelation in its final sentence, but that doesn’t make the rest of it any less satisfying—although it also doesn’t hurt that the story itself is relatively short. The real mistake is to assume that the creative process hinges on the idea. As I mentioned in my recent post on Shakespeare, a story’s premise is often the least interesting thing about it: nearly every idea has been done before, and the more it lends itself to being expressed in a single knockout sentence, the more likely someone else has written it already. As a result, an artist who commits himself to the cult of the idea, rather than its execution and elaboration, will eventually start to seem desperate, which goes a long way toward explaining the curious downward arc of a man like M. Night Shyamalan, a director with a sensational eye and considerable talent long since betrayed by his ideas.

M. Night Shyamalan

It should come as no surprise, then, that good ideas can be the most dangerous, since they’re inherently seductive. A writer with a great original idea is more likely to overlook problems of plot, structure, or language, while a merely decent idea that demands flawless execution may ultimately result in a more satisfying story. I’ve said before that a writer is best advised to start out from a position of neutrality toward his own material, and to allow his passion to flow from the process, and I still think that’s good advice. I’ve learned to be very suspicious of ideas that grab me at once, knowing that it’s going to be hard for me to remain objective. And I’ve found that sustained detachment, which allows me to evaluate each link of the chain on its own merits, is much more valuable than an early rush of excitement. Otherwise, I run the risk of turning into the producer described by David Mamet in On Directing Film, who “sees all ideas as equal and his own as first among them, for no reason other than he has thought of it.”

And the more talented the writer, the greater the risk. All writers have their moments of cleverness and ingenuity; the labor of turning a bad sentence into a good one is the sort of work that encourages the development of all kinds of tricks, and a writer who knows how to get published consistently can only get there with a lot of shrewdness. It’s worth remembering, then, that there are two sides to craft. The word evokes a set of proven tools, but it also carries a negative connotation: when we describe a person as “crafty,” that isn’t necessarily a compliment. The real point of craft is to cultivate the ability to treat all premises as fundamentally equal, so that they rise or fall based only on how honestly the author follows through. It treats the best premise in the world as if it were the worst, or at least as if it required the same amount of time and effort to reach its full realization—which it does. It’s the author, not the idea, that makes the difference. And it’s frightening how often a great idea can turn a good writer into a bad one.

Written by nevalalee

May 29, 2013 at 9:12 am

Freeman Dyson and the closing of the science-fictional mind

with 2 comments

Arthur C. Clarke famously argued that our politicians should read science fiction, instead of westerns and detective stories, and Isaac Asimov, as we’ve seen, thought that an early interest in good science fiction was the best predictor of children who would become the great scientists of tomorrow. As I look around the world today, though, I worry that we’re suffering from a lack of science-fictional thinking. And it isn’t just the fact that America can no longer go into space. It’s that our dreams have grown smaller, and the most ambitious visions are greeted with a dismissive tweet. George W. Bush’s proposal to go to Mars was admittedly hard to take seriously, given its complete lack of specifics, but when the timeline of DARPA’s 100-year Starship Study makes it clear that nobody expects to go to the stars within the next century, I have to wonder what happened to the national will that put a man on the moon using computers like this. And my greatest fear is that we’ve lost the ability to even talk about such issues in suitably cosmic terms.

These days, only a handful of public intellectuals seem willing to talk about the future in ways designed to expand our sense of the possible. One is Ray Kurzweil, whose concept of the singularity, perhaps the most exciting—and lunatic—of all forms of futurism, has finally crossed over into the mainstream. Another is Freeman Dyson, the legendary physicist and mathematician who made several practical, lasting contributions to speculative fiction, notably the concept of the Dyson sphere, almost in passing. Both men are geniuses, and both are willing to take outlandish positions. As a result, both often seem faintly ridiculous themselves. Kurzweil, with his line of longevity supplements and obsession with the idea of his own immortality, can occasionally come off as a snake oil salesman, while Dyson has been roundly attacked as a global warming skeptic. And although Dyson’s arguments deserve to be taken seriously, there doesn’t seem to be a place for them in the mainstream dialogue on climate change, which reflects less on his ideas themselves than on the limitations we’ve subconsciously imposed on the debate.

Dyson’s treatment in the media has been particularly sobering. He doesn’t deny that global warming exists, or that it’s primarily caused by human activity, but questions whether it’s possible to predict the consequences using existing models of climate change, and believes that the danger is overblown compared to other risks, such as global poverty and disease. Dyson also argues that the problem of climate change isn’t social or political, but scientific, and has proposed a number of seemingly farfetched solutions, such as planting a trillion trees to absorb excess carbon dioxide. Perhaps most notoriously, he believes that global warming itself might not be entirely a bad thing. Rather, it will be good for some species and bad for others, a general “evening out” of the climate in a post-Darwinian world driven less by natural selection than by human activity. As a result, he has been widely accused of being oblivious, uncaring, or demented, notably in a fascinating but profoundly disingenuous piece by Kenneth Brower in the Atlantic.

Many of Dyson’s ideas are impractical, or simply incorrect, but it doesn’t seem wise to dismiss a scientist universally regarded by his colleagues as one of the smartest men in the world. And the more one looks at Dyson’s opinions, the more obvious it becomes that they need to be part of the conversation. This isn’t a politically motivated “skeptic” whose ideas are so far off the map that they don’t even deserve refutation; it’s a profoundly original mind approaching the problem from a novel perspective, drawing conclusions that have the power to shake us into new ways of thinking, and as such, he deserves to be celebrated—and, when necessary, refuted, but only by critics willing to meet him on equal terms. He may come up with outlandish proposals, but that’s what science-fictional minds do. Dyson may not have the answers, but only a system of public discussion capable of engaging his ideas will result in the answers we need. And if we can’t talk about his ideas at all, it’s our loss.

Written by nevalalee

October 10, 2011 at 9:42 am

Quote of the Day

leave a comment »

When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

—Arthur C. Clarke, “Clarke’s Three Laws”

Written by nevalalee

June 7, 2011 at 7:36 am

Posted in Quote of the Day

Why science fiction?

with 2 comments

When I look at my writing from the past few years, I’m struck by a sharp division in my work, which sometimes resembles the output of two different authors. On the novel side, I’ve focused almost exclusively on suspense fiction, with the occasional literary touch: The Icon Thief is basically my take on the paranoid conspiracy novel, while its sequel—once called Midrash, currently untitled—is even more of a straight thriller. I love writing books like this, and one of the great pleasures of my recent life has been exploring the genre’s conventions and learning what makes such novels tick. But at the same time, I’ve been living an alternate, almost entirely separate life as a writer of short science fiction. And now that my novelette “Kawataro” is in stores, it’s probably worth asking why I write this stuff in the first place.

Because it certainly isn’t for the money. Analog‘s payment rate is pretty modest—at the moment, for a novelette, it’s between five and six cents a word—and while it still pays better than most other magazines, where payment can consist of nothing but a few contributors’ copies, devoting two or more weeks to writing a 12,000-word novelette isn’t an especially lucrative way of spending one’s time. And while I’m always immensely gratified to read reviews of my short fiction online, the fact remains that a writer can make a bigger impression with a single novel than with a dozen short stories. There doesn’t seem to be any rational reason, then, why I should spend my time writing stuff for Analog. And yet I still try to write at least a couple of short stories a year, and whenever I’m not writing one, I really miss it.

So why is that? The real question, I suppose, is why I write science fiction at all, instead of some other genre. (Mystery fiction, for one, has an honorable history, and there are still a couple of good genre magazines on the market.) Writers, not surprisingly, are drawn to science fiction for all sorts of reasons. Many of the writers in Analog, which remains the leading voice of hard science fiction, seem to have been brought to it by a deep love of science itself, with stories that methodically work out the details of a particular scientific problem. Other authors write science fiction because it gives them the opportunity to discuss major issues involving humanity’s future, to build entire worlds, or to allegorize a contemporary issue (as in Children of Men, which takes our reluctance to plan for the future and turns it into a world in which there is no future). Others, maybe most, are drawn to science fiction simply because it was the kind of fiction they loved best growing up.

This last reason comes fairly close for me, although it isn’t the whole story. Growing up, one of my favorite books was the wonderful anthology 100 Great Science Fiction Short Short Stories, which I highly recommend if you can track down a copy. Reading these and similar stories—many of which I still know practically by heart—I was deeply impressed by their clarity, their precision, and above all their ingenuity. On the cinematic side, my favorite movie for many years was 2001, which, in turn, served as a gateway to such authors as Arthur C. Clarke, Orson Scott Card, and Robert Anton Wilson—the last of whom, in particular, remains one of my intellectual heroes. And I’ve already spoken of my love for The X-Files, which has given my stories much of their overall tone and shape.

Above all else, though, I love science fiction because it gives me a chance to make beautiful toys. The toymaking aspect of fiction has always been important to me, and hopefully this comes through in my novels, which I like to think of as intricate games between myself and the reader. And the science fiction short story—because of its love of ideas, its range of possible subjects, and the rewards it offers to ingenuity—has always been an ideal medium for play. While my stories occasionally tackle larger social themes, the motivation for writing them in the first place is always one of playfulness: I have an idea, an image, a twist, and want to see how far I can mislead the reader while still making the story an exciting one.  Writing novels is joyous work, but it’s still work. Writing short fiction, especially science fiction, is closer to a game. And as far as I’m concerned, it’s the greatest game in the world.

Quote of the Day

leave a comment »

One of the biggest roles of science fiction is to prepare people to accept the future without pain and to encourage a flexibility of mind. Politicians should read science fiction, not westerns and detective stories.

—Arthur C. Clarke

Written by nevalalee

April 14, 2011 at 7:45 am