In this week’s issue of The New York Times Magazine, there’s a profile of Mark Zuckerberg by Farhad Manjoo, who describes how the founder of Facebook is coming to terms with his role in the world in the aftermath of last year’s election. I find myself thinking about Zuckerberg a lot these days, arguably even more than I use Facebook itself. We just missed overlapping in college, and with one possible exception, which I’ll mention later, he’s the most influential figure to emerge from those ranks in the last two decades. Manjoo depicts him as an intensely private man obliged to walk a fine line in public, leading him to be absurdly cautious about what he says: “When I asked if he had chatted with Obama about the former president’s critique of Facebook, Zuckerberg paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Zuckerberg is trying to figure out what he believes—and how to act—under conditions of enormous scrutiny, but he also has more resources at his disposal than just about anyone else in history. Here’s the passage in the article that stuck with me the most:
The people who work on News Feed aren’t making decisions that turn on fuzzy human ideas like ethics, judgment, intuition, or seniority. They are concerned only with quantifiable outcomes about people’s actions on the site. That data, at Facebook, is the only real truth…This ideal runs so deep that the people who make News Feed often have to put aside their own notions of what’s best. “One of the things we’ve all learned over the years is that our intuition can be wrong a fair amount of the time,” John Hegeman, the vice president of product management and a News Feed team member, told me. “There are things you don’t expect will happen. And we learn a lot from that process: Why didn’t that happen, and what might that mean?”
Reading this, I began to reflect on how rarely we actually test our intuitions. I’ve spoken a lot on this blog about the role of intuitive thinking in the arts and sciences, mostly because it doesn’t get the emphasis it deserves, but there’s also no guarantee that intuition will steer us in the right direction. The psychologist Daniel Kahneman has devoted his career to showing how we tend to overvalue our gut reactions, particularly if we’ve been fortunate enough to be right in the past, and the study of human irrationality has become a rich avenue of research in the social sciences, which are often undermined by poor hunches of their own. It may not even be a matter of right or wrong. An intuitive choice may be better or worse than the alternative, but for the most part, we’ll never know. One of the quirks of Silicon Valley culture is that it claims to base everything on raw data, but it’s often in the service of notions that are outlandish, untested, and easy to misrepresent. Facebook comes closer than any company in existence to the ideal of an endless A/B test, in which the user base is randomly divided into two or more groups to see which approaches are the most effective. It’s the best lab ever developed for testing our hunches about human behavior. (Most controversially, Facebook modified the news feeds of hundreds of thousands of users to adjust the number of positive or negative posts, in order to gauge the emotional impact, and it has conducted similar tests on voter turnout.) And it shouldn’t surprise us if many of our intuitions turn out to be mistaken. If anything, we should expect them to be right about half the time—and if we can nudge that percentage just a little bit upward, in theory, it should give us a significant competitive advantage.
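The mechanics of such a test are simple enough to sketch. What follows is a minimal, purely illustrative example in Python of the random assignment at the heart of any A/B test; the function name and the user base are invented for the purpose of the sketch, and bear no relation to how Facebook actually implements it:

```python
import random

def assign_variant(user_id, variants=("A", "B"), seed=42):
    """Deterministically assign a user to one test group (illustrative only).

    Seeding a private generator with the user ID means each user
    always lands in the same group, so the comparison is stable.
    """
    rng = random.Random(f"{seed}:{user_id}")
    return rng.choice(variants)

# Split a hypothetical user base and tally the group sizes.
users = range(10_000)
groups = {"A": 0, "B": 0}
for u in users:
    groups[assign_variant(u)] += 1
```

Because the assignment is a deterministic function of the user ID, every user sees the same variant on every visit, which is what makes the measured difference between groups meaningful rather than noise.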
So what good is intuition, anyway? I like to start with William Goldman’s story about the Broadway producer George Abbott, who once passed a choreographer holding his head in his hands while the dancers stood around doing nothing. When Abbott asked what was wrong, the choreographer said that he couldn’t figure out what to do next. Abbott shot back: “Well, have them do something! That way we’ll have something to change.” Intuition, as I’ve argued before, is mostly about taking you from zero ideas to one idea, which you can then start to refine. John W. Campbell makes much the same argument in what might be his single best editorial, “The Value of Panic,” which begins with a maxim from the Harvard professor Wayne Batteau: “In total ignorance, try anything. Then you won’t be so ignorant.” Campbell argues that this provides an evolutionary rationale for panic, in which an animal acts “in a manner entirely different from the normal behavior patterns of the organism.” He continues:
Given: An organism with N characteristic behavior modes available. Given: An environmental situation which cannot be solved by any of the N available behavior modes, but which must be solved immediately if the organism is to survive. Logical conclusion: The organism will inevitably die. But…if we introduce Panic, allowing the organism to generate a purely random behavior mode not a member of the N modes characteristically available?
Campbell concludes: “When the probability of survival is zero on the basis of all known factors—it’s time to throw in an unknown.” In extreme situations, the result is panic; under less intense circumstances, it’s a blind hunch. You can even see them as points on a spectrum, the purpose of which is to provide us with a random action or idea that can then be revised into something better, assuming that we survive for long enough. But sometimes the animal just gets eaten.
The idea of refinement, revision, or testing is inseparable from intuition, and Zuckerberg has been granted the most powerful tool imaginable for asking hard questions and getting quantifiable answers. What he does with it is another matter entirely. But it’s also worth looking at his only peer from college who could conceivably challenge him in terms of global influence. On paper, Mark Zuckerberg and Jared Kushner have remarkable similarities. Both are young Jewish men—although Kushner is more observant—who were born less than four years and sixty miles apart. Kushner, whose acceptance to Harvard was so manifestly the result of his family’s wealth that it became a case study in a book on the subject, was a member of the final clubs that Zuckerberg badly wanted to join, or so Aaron Sorkin would have us believe. Both ended up as unlikely media magnates of a very different kind: Kushner, like Charles Foster Kane, took over a New York newspaper from a man named Carter. Yet their approaches to their newfound positions couldn’t be more different. Kushner has been called “a shadow secretary of state” whose portfolio includes Mexico, China, the Middle East, and the reorganization of the federal government, but it feels like one long improvisation, on the apparent assumption that he can wing it and succeed where so many others have failed. As Bruce Bartlett writes in the New York Times, without a staff, Kushner “is just a dilettante meddling in matters he lacks the depth or the resources to grasp,” and we may not have a chance to recover if his intuitions are wrong. In other words, he resembles his father-in-law, as Frank Bruni notes:
I’m told by insiders that when Trump’s long-shot campaign led to victory, he and Kushner became convinced not only that they’d tapped into something that everybody was missing about America, but that they’d tapped into something that everybody was missing about the two of them.
Zuckerberg and Kushner’s lives ran roughly in parallel for a long time, but now they’re diverging at a point at which they almost seem to be offering us two alternate versions of the future, like an A/B test with only one possible outcome. Neither is wholly positive, but that doesn’t make the choice any less stark. And if you think this sounds farfetched, bookmark this post, and read it again in about six years.
I think drugs are interesting principally as chemical means of altering metabolism and thereby altering what we call reality, which I would define as a more or less constant scanning pattern.
—William S. Burroughs, to The Paris Review
On September 7, 1967, the editor John W. Campbell, who had just returned from the World Science Fiction Convention in New York, wrote to the author Poul Anderson about how fantasy—as typified by the works of J.R.R. Tolkien—seemed to be taking over the fandom. Campbell weighed the various reasons why one genre might be on the rise and the other on the decline, but he was particularly dismissive of one possible factor:
One I do not intend to yield to—the escape-from-harsh-reality motivation that underlies the LSD craze among the younger group in colleges…No need for learning a discipline, no need to recognize that “my opinion” and “truth” are in conflict…Which makes for happy little self-satisfaction. But unfortunately overlooks that the Universe’s opinion has a somewhat special place in that scheme of things.
A few weeks later, in response to a letter from a reader, Campbell agreed with the notion that there was no substitute for “experience” when it came to the effects of LSD, but added: “The statement applies equally, however, to taking heroin, becoming a quadriplegic, or committing suicide.” Campbell proposed that as an alternative to drugs, his correspondent try inducing anoxia, by breathing air from which most of the oxygen had been removed:
In just a minute or two, you’ll discover a vast increase in your mental abilities—a sureness of thought, a breadth of understanding, and a rapidity and sureness of reasoning you never achieved before…Of course your brilliant realizations and mighty discoveries somehow seem to misfire when you come down off that jag, and your judgment faculty gets back on the job. But it’s a great trip while it lasts!
It’s worth noting that while Campbell was pointedly uninterested in exploring drugs in the science fiction that he published, he wasn’t exactly puritanical. In addition to his own habitual use of cigarettes, Benzedrine, and occasionally alcohol, he sampled marijuana and even “an African witch doctor drug” that one of his chemist friends was developing. He didn’t much care for pot, which made him “uncomfortable,” but he also had a take on the subject that might strike readers as surprising:
Marijuana serves to demonstrate [to teenagers] that the older generation is stupid, ignorant, hypocritical, and unwilling to learn anything. They do reject learning the simple facts about marijuana, and give violently emotional lectures on the Awful Evils of That Hideous Drug—without knowing the first things about it…Any intelligent teenager who’s experienced the effects of marijuana, and discussed it with friends, knows the average family doctor does not know what he’s talking about…Marijuana is a damn sight less dangerous than alcohol. It’s less addictive, less toxic, and less dangerous for a “high” driver to be high on marijuana than on alcohol. It is not an aphrodisiac, nor does it have alcohol’s tendency to anesthetize the censor mechanisms of the mind.
Campbell believed that the real problem with marijuana is that a teenager who learns to doubt what adults say on the subject is likely to become equally skeptical when it comes to cocaine, heroin, and LSD: “So long as parents and doctors deny the facts about marijuana, and insist on classing it with hard drugs, the kid who knows they’re wrong about marijuana feels they’re wrong about heroin…Marijuana can be legalized—and thus separated, as it must be, from the problem of the hard drugs.”
When it came to LSD, Campbell’s attitudes were more or less in line with those of the three other authors who have been on my mind these days. L. Ron Hubbard warned gravely against its use—LSD and PCP were the only drugs that disqualified potential applicants for the Sea Org—and he described its effects in a bulletin of which one follower recalled: “All the information came from one person who had taken LSD once. That was how he did his research.” Isaac Asimov doesn’t appear to have written on the topic at length, although he refers in passing in More Words of Science to “young people foolishly [beginning] to play games with their minds by taking LSD,” and he writes in his memoirs:
Most people, when I tell them [how I get ideas], are dreadfully disappointed. They would be far readier to believe that I had to use LSD or something like that so that ideas would come to me in an altered state of consciousness. If all one has to do is think, where’s the glamour?
Asimov concludes: “Try thinking. You’ll find it’s a lot harder than taking LSD.” This echoes Robert A. Heinlein, who wrote in a letter in 1967:
LSD and pot? Marijuana has been readily available to anyone who wanted it throughout my lifetime and apparently for centuries before I was born. LSD is new but the hippies didn’t develop it; they simply use it. But it seems to me that the outstanding objective fact about LSD (despite the claims of Leary and others) is that it is as much of a failure as other drugs in producing any results of any value other than to the user—i.e., I know of no work of art, essay, story, discovery, or anything else of value created as a result of LSD. When the acid-droppers start outdistancing the squares in any field, I’ll sit up and take notice. Until that day I’ll regard it just as I do all other euphoric drugs: a sterile, subjective, sensory pleasure holding considerable hazard to the user.
Aside from Hubbard, these writers objected to LSD primarily in its role as a kind of shortcut to enlightenment, leading to subjectively meaningful results that aren’t useful to anyone else. On the other side, you can set the testimony of such writers as Aldous Huxley and Robert Anton Wilson, not to mention Stewart Brand, Douglas Engelbart, and Steve Jobs, who believed that they had emerged from their experiences with valuable insights. I think it’s fairly obvious that both sides have a point, and that you get out of LSD exactly what you put into it. If you lack any creative skills, you aren’t likely to produce anything interesting to others, but if you’ve taken the trouble of cultivating those talents in the usual boring way, it can push you along unexpected lines of development. Whether these directions are different from the ones that you would have taken anyway is a separate question, and probably an unanswerable one. My own hunch is that the connection, for instance, between Silicon Valley and the psychedelic culture was mostly a question of timing: it wasn’t that these drugs produced unusually smart or unconventional people, but that many of the smart, unconventional people of that time and place happened to be taking drugs. Many of them regarded it as a turning point in their lives, but I’m inclined to agree with what W.H. Auden said of transformative experiences in childhood:
The so-called traumatic experience is not an accident, but the opportunity for which the child has been patiently waiting—had it not occurred, it would have found another, equally trivial—in order to find a necessity and direction for its existence, in order that its life may become a serious matter.
At a moment of renewed interest in microdosing, at least among young professionals with the resources and security in their own social position to try it, it’s worth remembering that the evidence suggests that drugs pay off in visible ways only for people who have already put in the hard work of figuring out how to make and do interesting things. Norman Mailer compared it to borrowing on the future. And as Heinlein himself might have put it, there’s no such thing as a free Naked Lunch.
A few days ago, I was struck by the fact that a mere thirty-one years separated The Thing From Another World from John Carpenter’s The Thing. The former was released on April 21, 1951, the latter on June 25, 1982, and another remake, which I haven’t yet seen, arrived right on schedule in 2011. Three decades might have once seemed like a long time to me, but now, it feels like the blink of an eye. It’s the equivalent of the upcoming remake of David Cronenberg’s The Fly, which was itself a reimagining of a movie that had been around for about the same amount of time. I picked these examples at random, and while there isn’t anything magical about a thirty-year cycle, it isn’t hard to understand. It’s enough time for a new generation of viewers to come of age, but not quite long enough for the memory of the earlier movie to fade entirely. (From my perspective, the films of the eighties seem psychologically far closer than those of the seventies, and not just for reasons of style.) It’s also long enough for the original reaction to a movie to be largely forgotten, so that it settles at what feels like its natural level. When The Thing From Another World first premiered, Isaac Asimov thought that it was one of the worst movies ever made. John W. Campbell, on whose original story it was based, was more generous, writing of the filmmakers: “I think they may be right in feeling that the proposition in ‘Who Goes There?’ is a little strong if presented literally in the screen.” Elsewhere, he noted:
I have an impression that the original version directed and acted with equal restraint would have sent some ten percent of the average movie audience into genuine, no-kidding, semi-permanent hysterical screaming meemies…You think that [story] wouldn’t tip an insipid paranoid psychotic right off the edge if it were presented skillfully?
For once, Campbell, whose predictions were only rarely on the mark, was entirely prescient. By the time John Carpenter’s The Thing came out, The Thing From Another World was seen as a classic, and the remake, which tracked the original novella much more closely, struck many viewers as an assault on its legacy. One of its most vocal detractors, curiously, was Harlan Ellison, who certainly couldn’t be accused of squeamishness. In a column for L.A. Weekly, Ellison wrote that Carpenter “showed some stuff with Halloween,” but dismissed his later movies as “a swan dive into the potty.” He continued:
The Thing…[is a] depredation [Carpenter] attempts to validate by saying he wanted to pull out of the original John W. Campbell story those treasures undiscovered by the original creators…One should not eat before seeing it…and one cannot eat after having seen it.
If the treasures Carpenter sought to unearth are contained in the special effects lunacy of mannequins made to look like men, splitting open to disgorge sentient lasagna that slaughters for no conceivable reason, then John Carpenter is a raider of the lost ark of Art who ought to be sentenced to a lifetime of watching Neil Simon plays and films.
The Thing did not need to be remade, if the best this fearfully limited director could bring forth was a ripoff of Alien in the frozen tundra, this pointless, dehumanized freeway smashup of grisly special effects dreck, flensed of all characterization, philosophy, subtext, or rationality.
Thirty years later, pop culture has come full circle, and it’s fair to say that Carpenter’s movie has eclipsed not just Howard Hawks and Christian Nyby, but even Campbell himself. (Having spent the last year trying to explain what I’m doing to people who aren’t science fiction fans, I can testify that if Campbell’s name resonates with them at all, it’s thanks solely to the 1982 version of The Thing.) Yet the two movies also share surprising affinities, and not simply because Carpenter idolized Hawks. Both seem interested in Campbell’s premise mostly for the visual possibilities that it suggests. In the late forties, the rights to “Who Goes There?” were purchased by RKO at the urging of Ben Hecht and Charles Lederer, the latter of whom wrote the script, with uncredited contributions from Hecht and Hawks. The direction was credited to Nyby, Hawks’s protégé, but Hawks was always on the set and later claimed most of the director’s fee, leading to much disagreement over who was responsible for the result. In the end, the film threw out nearly all of Campbell’s story, keeping only the basic premise of an alien spacecraft discovered by researchers in an icy environment, while shifting the setting from Antarctica to Alaska. The filmmakers were clearly more drawn to the idea of a group of men facing danger in isolation, one of Hawks’s favorite themes, and they lavished greater attention on the stock types that they understood—the pilot, the journalist, the girl—than on the scientists, who were reduced to thankless foils. David Thomson has noted that the central principle of Hawks’s work is that “men are more expressive rolling a cigarette than saving the world,” and the contrast has never been more evident than it is here.
And while Hawks isn’t usually remembered as a visual director, The Thing From Another World exists almost entirely as a series of images: the opening titles burning through the screen, the crew standing in a circle on the ice to reveal the shape of the flying saucer underneath, the shock reveal of the alien itself in the doorway. When you account for the passage of time, Carpenter’s version rests on similar foundations. His characters and dialogue are less distinct than Hawks’s, but he also seems to have regarded Campbell’s story primarily as a source of visual problems and solutions. I don’t think I’m alone in saying that the images that are burned into my brain from The Thing probably add up to a total of about five minutes: the limits of its technology mean that we only see it in action for a few seconds at a time. But those images, most of which were the work of the special effects prodigy Rob Bottin, are still the best practical effects I’ve ever seen. (It also includes the single best jump scare in the movies, which is taken all but intact from Campbell.) Even after thirty years, its shock moments are so unforgettable that they have a way of overpowering the rest, as they did for Ellison, and neither version ever really approximates the clean narrative momentum of “Who Goes There?” But maybe that’s how it should be. Campbell, for all his gifts, wasn’t primarily a visual writer, and the movies are a visual medium, particularly in horror and science fiction. Both of the classic versions of The Thing are translations from one kind of storytelling to another, and they stick in the imagination precisely to the extent that they depart from the original. They’re works for the eye, not the mind, which may be why the only memorable line in either movie is the final warning in Hawks’s version, broadcast over the airwaves to the world, telling us to watch the skies.
If you wanted to construct the most prolific writer who ever lived, working from first principles, what features would you include? (We’ll assume, for the purposes of this discussion, that he’s a man.) Obviously, he would need to be capable of turning out clean, publishable prose at a fast pace and with a minimum of revision. He would be contented—even happy—within the physical conditions of writing itself, which requires working indoors at a desk alone for hours on end. Ideally, he would operate within a genre, either fiction or nonfiction, that lent itself to producing pages fairly quickly, but with enough variety to prevent burnout, since he’d need to maintain a steady clip every day for years. His most productive period would coincide with an era that gave him steady demand for his work, and he would have a boundless energy that was diverted early on toward the goal of producing more books. If you were particularly clever, you’d even introduce a psychological wrinkle: the act of writing would become his greatest source of satisfaction, as well as an emotional refuge, so that he would end up taking more pleasure in it than almost anything else in life. Finally, you’d provide him with cooperative publishers and an enthusiastic, although not overwhelming, readership, granting him a livelihood that was comfortable but not quite lavish enough to be distracting. Wind him up, let him run unimpeded for three or four decades, and how many books would you get? In the case of Isaac Asimov, the total comes to something like five hundred. Even if it isn’t quite enough to make him the most productive writer of all time, it certainly places him somewhere in the top ten. And it’s a career that followed all but axiomatically from the characteristics that I’ve listed above.
Let’s take these points one at a time. Asimov, like all successful pulp writers, learned how to crank out decent work on deadline, usually limiting himself to a first draft and a clean copy, with very little revision that wasn’t to editorial order. (And he wasn’t alone here. The pulps were an unforgiving school, and they quickly culled authors who weren’t able to write a sentence well enough the first time.) From a young age, Asimov was also drawn to enclosed, windowless spaces, like the kitchen at the back of his father’s candy store, and he had a persistent daydream about running a newsstand in the subway, where he could put up the shutter and read magazines in peace. After he began to write for a living, he was equally content to work in his attic office for up to ten hours a day. Yet it wasn’t fiction that accounted for the bulk of his output—which is a common misconception about his career—but a specific kind of nonfiction. Asimov was a prolific fiction writer, but no more so than many of his contemporaries. It was in nonfiction for general readers that he really shone, initially with such scientific popularizations as The Chemicals of Life and Inside the Atom. At first, his work drew on his academic and professional background in chemistry and biochemistry, but before long, he found that he was equally adept at explaining concepts from the other sciences, as well as such unrelated fields as history and literature. His usual method was to work straight from reference books, dictionaries, and encyclopedias, translating and organizing their concepts for a lay audience. As he once joked to Martin Gardner: “You mean you’re in the same racket I am? You just read books by the professors and rewrite them?”
This kind of writing is harder than it sounds. Asimov noted, correctly, that he added considerable value in arranging and presenting the material, and he was better at it than just about anyone else. (A faculty member at Boston University once observed him at work and exclaimed: “Why, you’re just copying the dictionary!” Asimov, annoyed, handed the dictionary to him and said: “Here. The dictionary is yours. Now go write the book.”) But it also lent itself admirably to turning out a lot of pages in a short period of time. Unlike fiction, it didn’t require him to come up with original ideas from scratch. As soon as he had enough projects in the hopper, he could switch between them freely to avoid becoming bored by any one subject. He could write treatments of the same topic for different audiences and cannibalize unsold material for other venues. In the years after Sputnik, there was plenty of demand for what he had to offer, and he had a ready market for short articles that could be collected into books. And since these were popular treatments of existing information, he could do all of the work from the comfort of his own office. Asimov hated to fly, and he actively avoided assignments that would require him to travel or do research away from home. Before long, his productivity became a selling point in itself, and when his wife told him that life was passing him by, Asimov responded: “If I do manage to publish a hundred books, and if I then die, my last words are likely to be, ‘Only a hundred!’” Writing became a place of security, both from life’s small crises and as an escape from an unhappy marriage, and it was also his greatest source of pleasure. When his daughter asked him what he would do if he had to choose between her and writing, Asimov said: “Why, I would choose you, dear.” He added: “But I hesitated—and she noticed that, too.”
Asimov was a complicated man—certainly more so than in the version of himself that he presented to the public—and he can’t be reduced to a neat set of factors. He wasn’t a robot. But those five hundred books represent an achievement so overwhelming that it cries out for explanation, and it wouldn’t exist if certain variables, both external and internal, hadn’t happened to align. In terms of his ability and ambition, Asimov was the equal of Campbell, Heinlein, or Hubbard, but in place of their public entanglements, he channeled his talents into a safer direction, where they grew to gargantuan proportions that only hint at how monstrous that energy and passion really were. (He was also considerably younger than the others, as well as more naturally cautious, and I’d like to believe that he drew a negative lesson from their example.) The result, remarkably, made him the most beloved writer of them all. It was a cultural position, outside the world of science fiction, that was due almost entirely to his nonfiction work as a whole. He never had a bestseller until late in his career, but the volume and quality of his overall output were enough to make him famous. Asimov was the Mule, the unassuming superman of the Foundation series, but he conquered a world from his typewriter. He won the game. And when I think of how his talent, productivity, and love of enclosed spaces combined to produce a fortress made of books, I think of what David Mamet once said to The Paris Review. When asked to explain why he wrote, Mamet replied: “I’ve got to do it anyway. Like beavers, you know. They chop, they eat wood, because if they don’t, their teeth grow too long and they die. And they hate the sound of running water. Drives them crazy. So, if you put those two ideas together, they are going to build dams.”
Over the last year or so, I’ve found myself repeatedly struck by the parallels between the careers of John W. Campbell and Orson Welles. At first, the connection might seem tenuous. Campbell and Welles didn’t look anything alike, although they were about the same height, and their politics couldn’t have been more different—Welles was a staunch progressive and defender of civil rights, while Campbell, to put it mildly, wasn’t. Welles was a wanderer, while Campbell spent most of his life within driving distance of his birthplace in New Jersey. But they’re inextricably linked in my imagination. Welles was five years younger than Campbell, but they flourished at exactly the same time, with their careers peaking roughly between 1937 and 1942. Both owed significant creative breakthroughs to the work of H.G. Wells, who inspired Campbell’s story “Twilight” and Welles’s Mercury Theatre adaptation of The War of the Worlds. In 1938, Campbell saw Welles’s famous modern-dress production of Julius Caesar with the writer L. Sprague de Camp, of which he wrote in a letter:
It represented, in a way, what I’m trying to do in the magazine. Those humans of two thousand years ago thought and acted as we do—even if they did dress differently. Removing the funny clothes made them more real and understandable. I’m trying to get away from funny clothes and funny-looking people in the pictures of the magazine. And have more humans.
And I suspect that the performance started a train of thought in both men’s minds that led to de Camp’s novel Lest Darkness Fall, which is about a man from the present who ends up in ancient Rome.
Campbell was less pleased by Welles’s most notable venture into science fiction, which he must have seen as an incursion on his turf. He wrote to his friend Robert Swisher: “So far as sponsoring that War of [the] Worlds thing—I’m damn glad we didn’t! The thing is going to cost CBS money, what with suits, etc., and we’re better off without it.” In Astounding, he said that the ensuing panic demonstrated the need for “wider appreciation” of science fiction, in order to educate the public about what was and wasn’t real:
I have long been an exponent of the belief that, should interplanetary visitors actually arrive, no one could possibly convince the public of the fact. These stories wherein the fact is suddenly announced and widespread panic immediately ensues have always seemed to me highly improbable, simply because the average man did not seem ready to visualize and believe such a statement.
Undoubtedly, Mr. Orson Welles felt the same way.
Their most significant point of intersection was The Shadow, who was created by an advertising agency for Street & Smith, the publisher of Astounding, as a fictional narrator for the radio series Detective Story Hour. Before long, he became popular enough to star in his own stories. Welles, of course, voiced The Shadow from September 1937 to October 1938, and Campbell plotted some of the magazine installments in collaboration with the writer Walter B. Gibson and the editor John Nanovic, who worked in the office next door. And his identification with the character seems to have run even deeper. In a profile published in the February 1946 issue of Pic magazine, the reporter Dickson Hartwell wrote of Campbell: “You will find him voluble, friendly and personally depressing only in what his friends claim is a startling physical resemblance to The Shadow.”
It isn’t clear if Welles was aware of Campbell, although it would be more surprising if he wasn’t. Welles flitted around science fiction for years, and he occasionally crossed paths with other authors in that circle. To my lasting regret, he never met L. Ron Hubbard, which would have been an epic collision of bullshitters—although Philip Seymour Hoffman claimed that he based his performance in The Master mostly on Welles, and Theodore Sturgeon once said that Welles and Hubbard were the only men he had ever met who could make a room seem crowded simply by walking through the door. In 1946, Isaac Asimov received a call from a lawyer whose client wanted to buy all rights to his robot story “Evidence” for $250. When he asked Campbell for advice, the editor said that he thought it seemed fair, but Asimov’s wife told him to hold out for more. Asimov called back to ask for a thousand dollars, adding that he wouldn’t discuss it further until he found out who the client was. When the lawyer told him that it was Welles, Asimov agreed to the sale, delighted, but nothing ever came of it. (Welles also owned the story in perpetuity, making it impossible for Asimov to sell it elsewhere, a point that Campbell, who took a notoriously casual attitude toward rights, had neglected to raise.) Twenty years later, Welles made inquiries into the rights for Heinlein’s The Puppet Masters, which were tied up at the time with Roger Corman, but never followed up. And it’s worth noting that both stories are concerned with the problem of knowing whether other people are what they claim to be, which Campbell had brilliantly explored in “Who Goes There?” It’s a theme to which Welles obsessively returned, and it’s fascinating to speculate what he might have done with it if Howard Hawks and Christian Nyby hadn’t gotten there first with The Thing From Another World. Who knows what evil lurks in the hearts of men?
But their true affinities were spiritual ones. Both Campbell and Welles were child prodigies who reinvented an art form largely by being superb organizers of other people’s talents—although Campbell always downplayed his own contributions, while Welles appears to have done the opposite. Each had a spectacular early success followed by what was perceived as decades of decline, which they seem to have seen coming. (David Thomson writes: “As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent.” And you could say much the same thing about “Twilight.”) Both had a habit of abandoning projects as soon as they realized that they couldn’t control them, and they both managed to seem isolated while occupying the center of attention in any crowd. They enjoyed staking out unreasonable positions in conversation, just to get a rise out of listeners, and they ultimately drove away their most valuable collaborators. What Pauline Kael writes of Welles in “Raising Kane” is equally true of Campbell:
He lost the collaborative partnerships that he needed…He was alone, trying to be “Orson Welles,” though “Orson Welles” had stood for the activities of a group. But he needed the family to hold him together on a project and to take over for him when his energies became scattered. With them, he was a prodigy of accomplishments; without them, he flew apart, became disorderly.
Both men were alone when they died, and both filled their friends, admirers, and biographers with intensely mixed feelings. I’m still coming to terms with Campbell. But I have a hunch that I’ll end up somewhere close to Kael’s ambivalence toward Welles, who, at the end of an essay that was widely seen as puncturing his myth, could only conclude: “In a less confused world, his glory would be greater than his guilt.”
In Toy Story 2, there’s a moment in which Woody discovers that his old television series, Woody’s Roundup, was abruptly yanked off the air toward the end of the fifties. He asks: “That was a great show. Why cancel it?” The Prospector replies bitterly: “Two words: Sput-nik. Once the astronauts went up, children only wanted to play with space toys.” And while I wouldn’t dream of questioning the credibility of a man known as Stinky Pete, I feel obliged to point out that his version of events isn’t entirely accurate. The space craze among kids really began more than half a decade earlier, with the premiere of Tom Corbett, Space Cadet, and the impact of Sputnik on science fiction was far from a positive one. Here’s what John W. Campbell wrote about it in the first issue of Astounding to be printed after the satellite’s launch:
Well, we lost that race; Russian technology achieved an important milestone in human history—one that the United States tried for, talked about a lot, and didn’t make…One of the things Americans have long been proud of—and with sound reason—is our ability to convert theoretical science into practical, working engineering…This time we’re faced with the uncomfortable realization that the Russians have beaten us in our own special field; they solved a problem of engineering technology faster and better than we did.
And while much of the resulting “Sputnik crisis” was founded on legitimate concerns—Sputnik was as much a triumph of ballistic rocketry as it was of satellite technology—it also arose from the notion that the United States had been beaten at its own game. As Arthur C. Clarke is alleged to have said, America had become “a second-rate power.”
Campbell knew right away that he had reason to worry. Lester del Rey writes in The World of Science Fiction:
Sputnik simply convinced John Campbell that he’d better watch his covers and begin cutting back on space scenes. (He never did, but the art director of the magazine and others were involved in that decision.) We agreed in our first conversation after the satellite went up that people were going to react by deciding science had caught up with science fiction, and with a measure of initial fear. They did. Rather than helping science fiction, Sputnik made it seem outmoded.
And that’s more or less exactly what happened. There was a brief spike in sales, followed by a precipitous fall as mainstream readers abandoned the genre. I haven’t been able to find specific numbers for this period, but one source, the Australian fan Wynne Whitford, states that the circulation of Astounding fell by half after Sputnik—which seems high, but probably reflects a real decline. In a letter written decades later, Campbell said of Sputnik: “Far from encouraging the sales of science fiction magazines—half the magazines being published lost circulation so drastically they went out of business!” An unscientific glance at a list of titles appears to support this. In 1958, the magazines Imagination, Imaginative Tales, Infinity Science Fiction, Phantom, Saturn, Science Fiction Adventures, Science Fiction Quarterly, Star Science Fiction, and Vanguard Science Fiction all ceased publication, followed by three more over the next twelve months. The year before, just four magazines had folded. There was a bubble, and after Sputnik, it burst.
At first, this might seem like a kind of psychological self-care, of the same sort that motivated me to scale back my news consumption after the election. Americans were simply depressed, and they didn’t need any reminders of the situation they were in. But the shift seems to have affected the public’s appetite for science fiction in particular, rather than for science as a whole. In fact, the demand for nonfiction science writing actually increased. As Isaac Asimov writes in his memoir In Joy Still Felt:
The United States went into a dreadful crisis of confidence over the fact that the Soviet Union had gotten there first and berated itself for not being interested enough in science. And I berated myself for spending too much time on science fiction when I had the talent to be a great science writer…Sputnik also served to increase the importance of any known public speaker who could talk on science and, particularly, on space, and that meant me.
What made science fiction painful to read, I think, was its implicit assumption of American superiority, which had been disproven so spectacularly. Campbell later compared it to the reaction after the bomb fell, claiming that it was the moment when people realized that science fiction wasn’t a form of escapism, but a warning:
The reactions to Sputnik have been more rapid, and, therefore, more readily perceptible and correlatable. There was, again, a sudden rise in interest in science fiction…and there is, now, an even more marked dropping of the science-fiction interest. A number of the magazines have been very heavily hit…I think the people of the United States thought we were kidding.
And while Campbell seemed to believe that readers had simply misinterpreted science fiction’s intentions, the conventions of the genre itself clearly bore part of the blame.
In his first editorials after Sputnik, Campbell drew a contrast between the American approach to engineering, which proceeded logically and with vast technological resources, and the quick and dirty Soviet program, which was based on rules of thumb, trial and error, and the ability to bull its way through on one particular point of attack. It reminds me a little of the election. Like the space race, last year’s presidential campaign could be seen as a kind of proxy war between the American and Russian administrations, and regardless of what you believe about the Trump camp’s involvement, which I suspect was a tacit one, there’s no question as to which side Putin favored. On one hand, you had a large, well-funded political machine, and on the other, one that often seemed comically inept. Yet it was the quick and dirty approach that triumphed. “The essence of ingenuity is the ability to get precision results without precision equipment,” Campbell wrote, and that’s pretty much what occurred. A few applications of brute force in the right place made all the difference, and they were aided, to some extent, by a similar complacency. The Americans saw the Soviets as bunglers, and they never seriously considered the possibility that they might be beaten by a bunch of amateurs. As Campbell put it: “We earned what we got—fully, and of our own efforts. The ridicule we’ve collected is our just reward for our consistent efforts.” Sometimes I feel the same way. Right now, we’re entering a period in which the prospect of becoming a second-rate power is far more real than it was when Clarke made his comment. It took a few months for the implications of Sputnik to really sink in. And if history is any indication, we haven’t even gotten to the crisis yet.
The life of Napoleon Bonaparte, Ralph Waldo Emerson writes in his book Representative Men, “showed us how much may be accomplished by the mere force of such virtues as all men possess in less degrees; namely, by punctuality, by personal attention, by courage and thoroughness.” I’ve never forgotten this sentence, in large part because the qualities that Emerson lists—apart from courage—are all so boring and mundane. Emerson, I think, is being deliberately provocative in explaining the career of Napoleon, the most overwhelming public figure who ever lived, in terms of qualities that we’d like to see in a certified public accountant. But he’s also right in noting that Napoleon’s fascination is rooted in his “very intelligible merits,” which give us the idea, which seems more plausible when we’re in our early twenties, that we might have done the same thing in his position. It’s an observation that must have seemed even more striking to Emerson’s audience than it does to us now. Napoleon rose from virtually nothing to become an emperor, and he emerged at a moment, just after the fall of a hereditary monarchy, in which such examples were still rare. A commoner could never hope to become a king, but every citizen could fantasize about being Napoleon. These days, when we tell our children that anyone can become president, we’re more likely to take such dreams for granted. (It’s noteworthy that Emerson delivered this lecture a decade before the election of Abraham Lincoln, who fills exactly that role in the American imagination.) As Emerson says: “If Napoleon is Europe, it is because the people whom he sways are little Napoleons.”
This is true of other forms of achievement, too. I’ve been thinking about this passage a lot recently, because it also seems like a list of the qualities that characterize a certain kind of writer, particularly one who works in nonfiction. I can’t speak for the extent to which courage enters into it, aside from the ordinary kind that is required to write anything at all—although some writers, now more than ever, display far greater courage than others. But the more you write, the more you come to value the homely virtues that Emerson catalogs here, both in yourself and in the books you read. Even fiction, which might seem to draw more on creativity and inspiration, is an act of sustained organization, and the best novels tend to be the ones that are so superbly organized that the writer can take the time to see clearly into every part. To stretch the military analogy even further, there’s a fog of war that descends on any extended writing project: it’s hard to keep both the details and the big picture in your head at once, and you don’t have time to follow up on every line of investigation. All books inevitably leave certain things undone. For a writer, personal attention and thoroughness come down to the ability to keep everything straight for long enough to develop every element exactly as far as it needs to extend. One of the attractions of a book like The Power Broker by Robert Caro is the sense that every paragraph represents the fruits of maximal thoroughness. The really funny thing is that Caro thought it would take him just nine months to write. But maybe that’s what all writers need to tell themselves before they start.
There’s a place, obviously, for inspiration, insight, and other factors that can’t be reduced to mere diligence. But organization is the essential backdrop from which ideas emerge, exactly as it was for Napoleon. It may not be sufficient, but it’s certainly necessary. Our university libraries are filled with monuments to thoroughness that went nowhere, but there’s also something weirdly logical about the notion of giving a doctoral candidate the chance to spend a few years thoroughly investigating a tiny slice of knowledge that hasn’t been explored before, on the off chance that something useful might come of it. Intuition is often described as a shortcut that allows the thinker to skip the intermediate steps of an argument, which suggests to me that the opposite should also be true: a year of patiently gathering data can yield a result that a genius would get in an instant. The tradeoff may not always be worth it for any one individual, but it’s certainly worth it for society as a whole. We suffer from a shortage of geniuses, but we’ve got plenty of man-hours in our graduate schools. Both are indispensable in their own way. To some extent, thoroughness can be converted into genius, just as one currency can be exchanged for another—it’s just that the exchange rate is sometimes unfavorable. And it’s even more accurate to say that insight is the paycheck you get for the hard daily work of thoroughness. (Which just reminds me of the fact that “earning a living” as an artist is both about putting a roof over your head and about keeping yourself in a position to utilize good ideas when they come.)
And it gives me hope for my current project. John W. Campbell, of all people, put it best. On July 5, 1967, he wrote to Larry Niven: “The readers lay their forty cents on the counter to employ me to think things through for them with more depth, more detail, and more ingenuity than they can, or want to bother achieving.” This is possibly my favorite thing that Campbell ever said—although it’s important to note that it dates from a period when his thinking was hideously wrong on countless matters. A writer is somebody you hire to be thorough about something when you don’t have the time or the inclination. (Journalism amounts to a kind of outsourcing of our own efforts to remain informed about the world, which makes it all the more important to choose our sources wisely.) I’m about halfway through this book, and it’s already clear that there are plenty of other people who would be more qualified than I am to write it. My only advantage is that I’m available. I can think about this subject every day for two to three years, and I can afford to spend my time chasing down details that even a diligent writer who only touches on the topic tangentially wouldn’t be able to investigate. All writing comes down to a process of triage, and as I work, I’m aware of potential avenues that I’ll need to leave unexplored or assertions that I’ll have to take on faith, trusting that someone else will look into them one day. The most I can do is flag them and move on. There are also days when even the humdrum qualities that Emerson lists seem impossibly out of reach, and I’m confronted by the physical limits to how thorough I can be, just as I’m aware of the limits to my insight. As a writer, you hope that these limitations will cancel each other out over a long enough period of time, but there’s no way of knowing until you’re finished. And maybe that’s where the courage comes in.