Posts Tagged ‘John W. Campbell’
If you wanted to construct the most prolific writer who ever lived, working from first principles, what features would you include? (We’ll assume, for the purposes of this discussion, that he’s a man.) Obviously, he would need to be capable of turning out clean, publishable prose at a fast pace and with a minimum of revision. He would be contented—even happy—within the physical conditions of writing itself, which requires working indoors at a desk alone for hours on end. Ideally, he would operate within a genre, either fiction or nonfiction, that lent itself to producing pages fairly quickly, but with enough variety to prevent burnout, since he’d need to maintain a steady clip every day for years. His most productive period would coincide with an era that gave him steady demand for his work, and he would have a boundless energy that was diverted early on toward the goal of producing more books. If you were particularly clever, you’d even introduce a psychological wrinkle: the act of writing would become his greatest source of satisfaction, as well as an emotional refuge, so that he would end up taking more pleasure in it than almost anything else in life. Finally, you’d provide him with cooperative publishers and an enthusiastic, although not overwhelming, readership, granting him a livelihood that was comfortable but not quite lavish enough to be distracting. Wind him up, let him run unimpeded for three or four decades, and how many books would you get? In the case of Isaac Asimov, the total comes to something like five hundred. Even if it isn’t quite enough to make him the most productive writer of all time, it certainly places him somewhere in the top ten. And it’s a career that followed all but axiomatically from the characteristics that I’ve listed above.
Let’s take these points one at a time. Asimov, like all successful pulp writers, learned how to crank out decent work on deadline, usually limiting himself to a first draft and a clean copy, with very little revision that wasn’t to editorial order. (And he wasn’t alone here. The pulps were an unforgiving school, and they quickly culled authors who weren’t able to write a sentence well enough the first time.) From a young age, Asimov was also drawn to enclosed, windowless spaces, like the kitchen at the back of his father’s candy store, and he had a persistent daydream about running a newsstand in the subway, where he could put up the shutter and read magazines in peace. After he began to write for a living, he was equally content to work in his attic office for up to ten hours a day. Yet it wasn’t fiction that accounted for the bulk of his output—which is a common misconception about his career—but a specific kind of nonfiction. Asimov was a prolific fiction writer, but no more so than many of his contemporaries. It was in nonfiction for general readers that he really shone, initially with such scientific popularizations as The Chemicals of Life and Inside the Atom. At first, his work drew on his academic and professional background in chemistry and biochemistry, but before long, he found that he was equally adept at explaining concepts from the other sciences, as well as such unrelated fields as history and literature. His usual method was to work straight from reference books, dictionaries, and encyclopedias, translating and organizing their concepts for a lay audience. As he once joked to Martin Gardner: “You mean you’re in the same racket I am? You just read books by the professors and rewrite them?”
This kind of writing is harder than it sounds. Asimov noted, correctly, that he added considerable value in arranging and presenting the material, and he was better at it than just about anyone else. (A faculty member at Boston University once observed him at work and exclaimed: “Why, you’re just copying the dictionary!” Asimov, annoyed, handed the dictionary to him and said: “Here. The dictionary is yours. Now go write the book.”) But it also lent itself admirably to turning out a lot of pages in a short period of time. Unlike fiction, it didn’t require him to come up with original ideas from scratch. As soon as he had enough projects in the hopper, he could switch between them freely to avoid becoming bored by any one subject. He could write treatments of the same topic for different audiences and cannibalize unsold material for other venues. In the years after Sputnik, there was plenty of demand for what he had to offer, and he had a ready market for short articles that could be collected into books. And since these were popular treatments of existing information, he could do all of the work from the comfort of his own office. Asimov hated to fly, and he actively avoided assignments that would require him to travel or do research away from home. Before long, his productivity became a selling point in itself, and when his wife told him that life was passing him by, Asimov responded: “If I do manage to publish a hundred books, and if I then die, my last words are likely to be, ‘Only a hundred!’” Writing became a place of security, both from life’s small crises and as an escape from an unhappy marriage, and it was also his greatest source of pleasure. When his daughter asked him what he would do if he had to choose between her and writing, Asimov said: “Why, I would choose you, dear.” He added: “But I hesitated—and she noticed that, too.”
Asimov was a complicated man—certainly more so than in the version of himself that he presented to the public—and he can’t be reduced to a neat set of factors. He wasn’t a robot. But those five hundred books represent an achievement so overwhelming that it cries out for explanation, and it wouldn’t exist if certain variables, both external and internal, hadn’t happened to align. In terms of his ability and ambition, Asimov was the equal of Campbell, Heinlein, or Hubbard, but in place of their public entanglements, he channeled his talents in a safer direction, where they grew to gargantuan proportions that only hint at how monstrous that energy and passion really were. (He was also considerably younger than the others, as well as more naturally cautious, and I’d like to believe that he drew a negative lesson from their example.) The result, remarkably, made him the most beloved writer of them all. It was a cultural position, outside the world of science fiction, that was due almost entirely to the body of his nonfiction work. He never had a bestseller until late in his career, but the volume and quality of his overall output were enough to make him famous. Asimov was the Mule, the unassuming superman of the Foundation series, but he conquered a world from his typewriter. He won the game. And when I think of how his talent, productivity, and love of enclosed spaces combined to produce a fortress made of books, I think of what David Mamet once said to The Paris Review. When asked to explain why he wrote, Mamet replied: “I’ve got to do it anyway. Like beavers, you know. They chop, they eat wood, because if they don’t, their teeth grow too long and they die. And they hate the sound of running water. Drives them crazy. So, if you put those two ideas together, they are going to build dams.”
Over the last year or so, I’ve found myself repeatedly struck by the parallels between the careers of John W. Campbell and Orson Welles. At first, the connection might seem tenuous. Campbell and Welles didn’t look anything alike, although they were about the same height, and their politics couldn’t have been more different—Welles was a staunch progressive and defender of civil rights, while Campbell, to put it mildly, wasn’t. Welles was a wanderer, while Campbell spent most of his life within driving distance of his birthplace in New Jersey. But they’re inextricably linked in my imagination. Welles was five years younger than Campbell, but they flourished at exactly the same time, with their careers peaking roughly between 1937 and 1942. Both owed significant creative breakthroughs to the work of H.G. Wells, who inspired Campbell’s story “Twilight” and Welles’s Mercury Theater adaptation of The War of the Worlds. In 1938, Campbell saw Welles’s famous modern-dress production of Julius Caesar with the writer L. Sprague de Camp, of which he wrote in a letter:
It represented, in a way, what I’m trying to do in the magazine. Those humans of two thousand years ago thought and acted as we do—even if they did dress differently. Removing the funny clothes made them more real and understandable. I’m trying to get away from funny clothes and funny-looking people in the pictures of the magazine. And have more humans.
And I suspect that the performance started a train of thought in both men’s minds that led to de Camp’s novel Lest Darkness Fall, which is about a man from the present who ends up in ancient Rome.
Campbell was less pleased by Welles’s most notable venture into science fiction, which he must have seen as an incursion on his turf. He wrote to his friend Robert Swisher: “So far as sponsoring that War of [the] Worlds thing—I’m damn glad we didn’t! The thing is going to cost CBS money, what with suits, etc., and we’re better off without it.” In Astounding, he said that the ensuing panic demonstrated the need for “wider appreciation” of science fiction, in order to educate the public about what was and wasn’t real:
I have long been an exponent of the belief that, should interplanetary visitors actually arrive, no one could possibly convince the public of the fact. These stories wherein the fact is suddenly announced and widespread panic immediately ensues have always seemed to me highly improbable, simply because the average man did not seem ready to visualize and believe such a statement.
Undoubtedly, Mr. Orson Welles felt the same way.
Their most significant point of intersection was The Shadow, who was created by an advertising agency for Street & Smith, the publisher of Astounding, as a fictional narrator for the radio series Detective Story Hour. Before long, he became popular enough to star in his own stories. Welles, of course, voiced The Shadow from September 1937 to October 1938, and Campbell plotted some of the magazine installments in collaboration with the writer Walter B. Gibson and the editor John Nanovic, who worked in the office next door. And his identification with the character seems to have run even deeper. In a profile published in the February 1946 issue of Pic magazine, the reporter Dickson Hartwell wrote of Campbell: “You will find him voluble, friendly and personally depressing only in what his friends claim is a startling physical resemblance to The Shadow.”
It isn’t clear if Welles was aware of Campbell, although it would be more surprising if he wasn’t. Welles flitted around science fiction for years, and he occasionally crossed paths with other authors in that circle. To my lasting regret, he never met L. Ron Hubbard, which would have been an epic collision of bullshitters—although Philip Seymour Hoffman claimed that he based his performance in The Master mostly on Welles, and Theodore Sturgeon once said that Welles and Hubbard were the only men he had ever met who could make a room seem crowded simply by walking through the door. In 1946, Isaac Asimov received a call from a lawyer whose client wanted to buy all rights to his robot story “Evidence” for $250. When he asked Campbell for advice, the editor said that he thought it seemed fair, but Asimov’s wife told him to hold out for more. Asimov called back to ask for a thousand dollars, adding that he wouldn’t discuss it further until he found out who the client was. When the lawyer told him that it was Welles, Asimov agreed to the sale, delighted, but nothing ever came of it. (Welles also owned the story in perpetuity, making it impossible for Asimov to sell it elsewhere, a point that Campbell, who took a notoriously casual attitude toward rights, had neglected to raise.) Twenty years later, Welles made inquiries into the rights for Heinlein’s The Puppet Masters, which were tied up at the time with Roger Corman, but never followed up. And it’s worth noting that both stories are concerned with the problem of knowing whether other people are what they claim to be, which Campbell had brilliantly explored in “Who Goes There?” It’s a theme to which Welles obsessively returned, and it’s fascinating to speculate what he might have done with it if Howard Hawks and Christian Nyby hadn’t gotten there first with The Thing From Another World. Who knows what evil lurks in the hearts of men?
But their true affinities were spiritual ones. Both Campbell and Welles were child prodigies who reinvented an art form largely by being superb organizers of other people’s talents—although Campbell always downplayed his own contributions, while Welles appears to have done the opposite. Each had a spectacular early success followed by what was perceived as decades of decline, which they seem to have seen coming. (David Thomson writes: “As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent.” And you could say much the same thing about “Twilight.”) Both had a habit of abandoning projects as soon as they realized that they couldn’t control them, and they both managed to seem isolated while occupying the center of attention in any crowd. They enjoyed staking out unreasonable positions in conversation, just to get a rise out of listeners, and they ultimately drove away their most valuable collaborators. What Pauline Kael writes of Welles in “Raising Kane” is equally true of Campbell:
He lost the collaborative partnerships that he needed…He was alone, trying to be “Orson Welles,” though “Orson Welles” had stood for the activities of a group. But he needed the family to hold him together on a project and to take over for him when his energies became scattered. With them, he was a prodigy of accomplishments; without them, he flew apart, became disorderly.
Both men were alone when they died, and both filled their friends, admirers, and biographers with intensely mixed feelings. I’m still coming to terms with Campbell. But I have a hunch that I’ll end up somewhere close to Kael’s ambivalence toward Welles, who, at the end of an essay that was widely seen as puncturing his myth, could only conclude: “In a less confused world, his glory would be greater than his guilt.”
In Toy Story 2, there’s a moment in which Woody discovers that his old television series, Woody’s Roundup, was abruptly yanked off the air toward the end of the fifties. He asks: “That was a great show. Why cancel it?” The Prospector replies bitterly: “Two words: Sput-nik. Once the astronauts went up, children only wanted to play with space toys.” And while I wouldn’t dream of questioning the credibility of a man known as Stinky Pete, I feel obliged to point out that his version of events isn’t entirely accurate. The space craze among kids really began more than half a decade earlier, with the premiere of Tom Corbett, Space Cadet, and the impact of Sputnik on science fiction was far from a positive one. Here’s what John W. Campbell wrote about it in the first issue of Astounding to be printed after the satellite’s launch:
Well, we lost that race; Russian technology achieved an important milestone in human history—one that the United States tried for, talked about a lot, and didn’t make…One of the things Americans have long been proud of—and with sound reason—is our ability to convert theoretical science into practical, working engineering…This time we’re faced with the uncomfortable realization that the Russians have beaten us in our own special field; they solved a problem of engineering technology faster and better than we did.
And while much of the resulting “Sputnik crisis” was founded on legitimate concerns—Sputnik was as much a triumph of ballistic rocketry as it was of satellite technology—it also arose from the notion that the United States had been beaten at its own game. As Arthur C. Clarke is alleged to have said, America had become “a second-rate power.”
Campbell knew right away that he had reason to worry. Lester del Rey writes in The World of Science Fiction:
Sputnik simply convinced John Campbell that he’d better watch his covers and begin cutting back on space scenes. (He never did, but the art director of the magazine and others were involved in that decision.) We agreed in our first conversation after the satellite went up that people were going to react by deciding science had caught up with science fiction, and with a measure of initial fear. They did. Rather than helping science fiction, Sputnik made it seem outmoded.
And that’s more or less exactly what happened. There was a brief spike in sales, followed by a precipitous fall as mainstream readers abandoned the genre. I haven’t been able to find specific numbers for this period, but one source, the Australian fan Wynne Whitford, states that the circulation of Astounding fell by half after Sputnik—which seems high, but probably reflects a real decline. In a letter written decades later, Campbell said of Sputnik: “Far from encouraging the sales of science fiction magazines—half the magazines being published lost circulation so drastically they went out of business!” An unscientific glance at a list of titles appears to support this. In 1958, the magazines Imagination, Imaginative Tales, Infinity Science Fiction, Phantom, Saturn, Science Fiction Adventures, Science Fiction Quarterly, Star Science Fiction, and Vanguard Science Fiction all ceased publication, followed by three more over the next twelve months. The year before, just four magazines had folded. There was a bubble, and after Sputnik, it burst.
At first, this might seem like a sort of psychological self-care, of the same kind that motivated me to scale back my news consumption after the election. Americans were simply depressed, and they didn’t need any reminders of the situation they were in. But it also seems to have affected the public’s appetite for science fiction in particular, rather than science as a whole. In fact, the demand for nonfiction science writing actually increased. As Isaac Asimov writes in his memoir In Joy Still Felt:
The United States went into a dreadful crisis of confidence over the fact that the Soviet Union had gotten there first and berated itself for not being interested enough in science. And I berated myself for spending too much time on science fiction when I had the talent to be a great science writer…Sputnik also served to increase the importance of any known public speaker who could talk on science and, particularly, on space, and that meant me.
What made science fiction painful to read, I think, was its implicit assumption of American superiority, which had been disproven so spectacularly. Campbell later compared it to the reaction after the bomb fell, claiming that it was the moment when people realized that science fiction wasn’t a form of escapism, but a warning:
The reactions to Sputnik have been more rapid, and, therefore, more readily perceptible and correlatable. There was, again, a sudden rise in interest in science fiction…and there is, now, an even more marked dropping of the science-fiction interest. A number of the magazines have been very heavily hit…I think the people of the United States thought we were kidding.
And while Campbell seemed to believe that readers had simply misinterpreted science fiction’s intentions, the conventions of the genre itself clearly bore part of the blame.
In his first editorials after Sputnik, Campbell drew a contrast between the American approach to engineering, which proceeded logically and with vast technological resources, and the quick and dirty Soviet program, which was based on rules of thumb, trial and error, and the ability to bull its way through on one particular point of attack. It reminds me a little of the election. Like the space race, last year’s presidential campaign could be seen as a kind of proxy war between the American and Russian administrations, and regardless of what you believe about the Trump camp’s involvement, which I suspect was probably a tacit one, there’s no question as to which side Putin favored. On one hand, you had a large, well-funded political machine, and on the other, one that often seemed comically inept. Yet it was the quick and dirty approach that triumphed. “The essence of ingenuity is the ability to get precision results without precision equipment,” Campbell wrote, and that’s pretty much what occurred. A few applications of brute force in the right place made all the difference, and they were aided, to some extent, by a similar complacency. The Americans saw the Soviets as bunglers, and they never seriously considered the possibility that they might be beaten by a bunch of amateurs. As Campbell put it: “We earned what we got—fully, and of our own efforts. The ridicule we’ve collected is our just reward for our consistent efforts.” Sometimes I feel the same way. Right now, we’re entering a period in which the prospect of becoming a second-rate power is far more real than it was when Clarke made his comment. It took a few months for the implications of Sputnik to really sink in. And if history is any indication, we haven’t even gotten to the crisis yet.
The life of Napoleon Bonaparte, Ralph Waldo Emerson writes in his book Representative Men, “showed us how much may be accomplished by the mere force of such virtues as all men possess in less degrees; namely, by punctuality, by personal attention, by courage and thoroughness.” I’ve never forgotten this sentence, in large part because the qualities that Emerson lists—apart from courage—are all so boring and mundane. Emerson, I think, is being deliberately provocative in explaining the career of Napoleon, the most overwhelming public figure who ever lived, in terms of qualities that we’d like to see in a certified public accountant. But he’s also right in noting that Napoleon’s fascination is rooted in his “very intelligible merits,” which give us the idea, which seems more plausible when we’re in our early twenties, that we might have done the same thing in his position. It’s an observation that must have seemed even more striking to Emerson’s audience than it does to us now. Napoleon rose from virtually nothing to become an emperor, and he emerged at a moment, just after the fall of a hereditary monarchy, in which such examples were still rare. A commoner could never hope to become a king, but every citizen could fantasize about being Napoleon. These days, when we tell our children that anyone can become president, we’re more likely to take such dreams for granted. (It’s noteworthy that Emerson delivered this lecture a decade before the election of Abraham Lincoln, who fills exactly that role in the American imagination.) As Emerson says: “If Napoleon is Europe, it is because the people whom he sways are little Napoleons.”
This is true of other forms of achievement, too. I’ve been thinking about this passage a lot recently, because it also seems like a list of the qualities that characterize a certain kind of writer, particularly one who works in nonfiction. I can’t speak for the extent to which courage enters into it, aside from the ordinary kind that is required to write anything at all—although some writers, now more than ever, display far greater courage than others. But the more you write, the more you come to value the homely virtues that Emerson catalogs here, both in yourself and in the books you read. Even fiction, which might seem to draw more on creativity and inspiration, is an act of sustained organization, and the best novels tend to be the ones that are so superbly organized that the writer can take the time to see clearly into every part. To stretch the military analogy even further, there’s a fog of war that descends on any extended writing project: it’s hard to keep both the details and the big picture in your head at once, and you don’t have time to follow up on every line of investigation. All books inevitably leave certain things undone. For a writer, personal attention and thoroughness come down to the ability to keep everything straight for long enough to develop every element exactly as far as it needs to extend. One of the attractions of a book like The Power Broker by Robert Caro is the sense that every paragraph represents the fruits of maximal thoroughness. The really funny thing is that Caro thought it would take him just nine months to write. But maybe that’s what all writers need to tell themselves before they start.
There’s a place, obviously, for inspiration, insight, and other factors that can’t be reduced to mere diligence. But organization is the essential backdrop from which ideas emerge, exactly as it was for Napoleon. It may not be sufficient, but it’s certainly necessary. Our university libraries are filled with monuments to thoroughness that went nowhere, but there’s also something weirdly logical about the notion of giving a doctoral candidate the chance to spend a few years thoroughly investigating a tiny slice of knowledge that hasn’t been explored before, on the off chance that something useful might come of it. Intuition is often described as a shortcut that allows the thinker to skip the intermediate steps of an argument, which suggests to me that the opposite should also be true: a year of patiently gathering data can yield a result that a genius would get in an instant. The tradeoff may not always be worth it for any one individual, but it’s certainly worth it for society as a whole. We suffer from a shortage of geniuses, but we’ve got plenty of man-hours in our graduate schools. Both are indispensable in their own way. To some extent, thoroughness can be converted into genius, just as one currency can be exchanged for another—it’s just that the exchange rate is sometimes unfavorable. And it’s even more accurate to say that insight is the paycheck you get for the hard daily work of thoroughness. (Which just reminds me of the fact that “earning a living” as an artist is both about putting a roof over your head and about keeping yourself in a position to utilize good ideas when they come.)
And it gives me hope for my current project. John W. Campbell, of all people, put it best. On July 5, 1967, he wrote to Larry Niven: “The readers lay their forty cents on the counter to employ me to think things through for them with more depth, more detail, and more ingenuity than they can, or want to bother achieving.” This is possibly my favorite thing that Campbell ever said—although it’s important to note that it dates from a period when his thinking was hideously wrong on countless matters. A writer is somebody you hire to be thorough about something when you don’t have the time or the inclination. (Journalism amounts to a kind of outsourcing of our own efforts to remain informed about the world, which makes it all the more important to choose our sources wisely.) I’m about halfway through this book, and it’s already clear that there are plenty of other people who would be more qualified than I am to write it. My only advantage is that I’m available. I can think about this subject every day for two to three years, and I can afford to spend my time chasing down details that even a diligent writer who only touches on the topic tangentially wouldn’t be able to investigate. All writing comes down to a process of triage, and as I work, I’m aware of potential avenues that I’ll need to leave unexplored or assertions that I’ll have to take on faith, trusting that someone else will look into them one day. The most I can do is flag them and move on. There are also days when even the humdrum qualities that Emerson lists seem impossibly out of reach, and I’m confronted by the physical limits to how thorough I can be, just as I’m aware of the limits to my insight. As a writer, you hope that these limitations will cancel each other out over a long enough period of time, but there’s no way of knowing until you’re finished. And maybe that’s where the courage comes in.
Two groups of very smart people are looking at the exact same data and coming to wildly different conclusions. Science hates that.
—Katie M. Palmer, Wired
In the early thirties, the parapsychologist J.B. Rhine conducted a series of experiments at Duke University to investigate the existence of extrasensory perception. His most famous test involved a deck of Zener cards, variously printed with the images of a star, a square, three waves, a circle, or a cross, in which subjects were invited to guess the symbol on a card drawn at random. The participants in the study, most of whom were college students, included the young John W. Campbell, who displayed no particular psychic ability. At least two, however, Adam Linzmayer and Hubert Pearce, were believed by Rhine to have consistently named the correct cards at a higher rate than chance alone would predict. Rhine wrote up his findings in a book titled Extrasensory Perception, which was published in 1934, and I’m not going to try to evaluate its merits here. What I will note is that attempts to replicate his work were made almost at once, and they failed to reproduce his results. Within two years, W.S. Cox of Princeton University had conducted a similar run of experiments, of which he concluded: “There is no evidence of extrasensory perception either in the ‘average man’ or of the group investigated or in any particular individual of that group. The discrepancy between these results and those obtained by Rhine is due either to uncontrollable factors in experimental procedure or to the difference in the subjects.” By 1938, four other studies had taken place, to similar effect. Rhine’s results were variously attributed to methodological flaws, statistical misinterpretation, sensory leakage, or outright cheating, and in consequence, fairly or not, parapsychological research was all but banished from academic settings.
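For readers who want to see what “a higher rate than chance alone would predict” actually means here: with five Zener symbols, pure guessing yields one hit in five, and the question of whether a subject’s score is surprising is a simple binomial tail calculation. The sketch below is my own illustration, not anything from Rhine’s book, and the run of 32 hits in 100 guesses is a hypothetical figure chosen only to show the arithmetic:

```python
import math

def chance_tail_probability(hits, trials, p=1/5):
    """Probability of scoring `hits` or more correct guesses in
    `trials` Zener-card draws by pure chance (one symbol in five)."""
    return sum(
        math.comb(trials, k) * p**k * (1 - p) ** (trials - k)
        for k in range(hits, trials + 1)
    )

# Chance predicts about 20 hits per 100 guesses. A hypothetical
# subject who scores 32 out of 100 sits roughly three standard
# deviations out in the tail, which guessing almost never produces.
print(f"{chance_tail_probability(32, 100):.4f}")
```

Sustained scoring at that kind of level is what Rhine believed he saw in Linzmayer and Pearce, and what Cox and the other replicators were unable to find.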
Decades later, another study was conducted, and its initial reception was very different. Its subject was ego depletion, or the notion that willpower draws on a finite reservoir of internal resources that can be reduced with overuse. In its most famous demonstration, the psychologists Roy Baumeister and Dianne Tice of Case Western Reserve University baked chocolate chip cookies, set them on a plate next to a bowl of radishes, and brought a series of participants into the room. All were told to wait there, but some were allowed to eat the cookies, while the others were instructed to snack only on the radishes. Then they were all given the same puzzle to complete—although they weren’t told that it was impossible to solve. According to the study, students who had been asked to stick to the radishes spent an average of just eight minutes on the puzzle, while those who had been allowed to eat the cookies spent nineteen minutes. The researchers concluded that our willpower is a limited quantity, and it can even be exhausted, like a muscle. Their work was enormously influential, and dozens of subsequent studies seemed to confirm it. In 2010, however, an analysis of published papers on the subject was unable to find any ego depletion effect, and last year, it got even worse: an attempt to replicate the core findings, led by the psychologist Martin Hagger, found zero evidence to support its existence. And this is just the most notable instance of what has been called a replication crisis in the sciences, particularly psychology, with one ambitious attempt to duplicate the results of psychological studies, the Reproducibility Project, finding that only about a third could be reproduced.
But let’s consider the timelines involved. With Rhine, it took only two years before an attempt was made to duplicate his work, and two more years for the consensus in the field to turn against it decisively. In the case of ego depletion, twelve years passed before any questions were raised, and close to two decades before the first comprehensive effort to replicate it. And you don’t need to be a psychologist to understand why. Rhine’s results cut so radically against what was known about the brain—and the physical universe—that accepting them would have required a drastic overhaul of multiple disciplines. Not surprisingly, they inspired immediate skepticism, and they were subjected to intense scrutiny right away. Ego depletion, by contrast, was an elegant theory that seemed to confirm ordinary common sense. It came across as an experimental verification of something that we all know instinctively, and it was widely accepted almost at once. Many successful studies also followed in its wake, in large part because experiments that seemed to confirm it were more likely to be submitted for publication, while those that failed to produce interesting results simply disappeared. (When it came to Rhine, a negative result wouldn’t be discarded, but embraced as a sign that the system was working as intended.) Left to itself, the lag time between a study and any serious attempt to reproduce it seems to be much longer when the answer is intuitively acceptable. As the Reproducibility Project has shown, however, when we dispassionately pull studies from psychological journals and try to replicate them without regard to their inherent interest or plausibility, the results are often no better than they were with Rhine. It can leave psychologists sounding a lot like parapsychologists suffering through a crisis of faith. As the psychologist Michael Inzlicht wrote: “Have I been chasing puffs of smoke for all these years?”
I’m not saying that Rhine’s work didn’t deserve to be scrutinized closely, because it did. And I’m also not trying to argue that social psychology is a kind of pseudoscience. But I think it’s worth considering whether psychology and parapsychology might have more in common than we’d like to believe. This isn’t meant to be a knock against either one, but an attempt to nudge them a little closer together. As Alex Holcombe of the University of Sydney put it: “The more optimistic interpretation of failures to replicate is that many of the results are true, but human behavior is so variable that the original researchers had to get lucky to find the result.” Even Martin Hagger says much the same thing: “I think ego-depletion effect is probably real, but current methods and measures are problematic and make it difficult to find.” The italics, as usual, are mine. Replace “human behavior” and “ego depletion” with “extrasensory perception,” and you end up with a concise version of the most widely cited justification for the resistance of such abilities to scientific verification, which is that these phenomena are real, but difficult to reproduce. You could call this wishful thinking, and in most cases, it probably is. But it also raises the question of whether it’s possible to have a meaningful phenomenon that can’t be reproduced in a laboratory setting. Regardless of where you come down on the issue, the answer shouldn’t be obvious. Intuition, for instance, is often described as a real phenomenon that can’t be quantified or replicated, and whether or not you believe this, it’s worth taking seriously. A kind of collective intuition—or a hunch—is exactly what determines what results the scientific community is likely to accept. And the fact that this intuition is so often wrong means that we need to come to terms with it, even if it isn’t in a lab.
In the May 1947 issue of the nonfiction magazine Air Trails and Science Frontiers, which was edited at the time by John W. Campbell, the cover story was an article titled “Fortress in the Sky.” Its author, credited as “Capt. B. A. Northrop,” argues that the existence of the atomic bomb has rendered the notion of a defensible land or naval base obsolete. The only truly impregnable military position, he writes, is the moon, which will soon be “conquered” by man: “It will probably be reached in five years and completed in ten. Its possessor will be supreme over all nations and peoples of Earth.” Northrop discusses the possible technologies that could be used for a moon landing, and he goes on to make a very peculiar claim:
Here and there throughout the world many men have been thinking about rockets for some time. I recall that in 1930, L. Ron Hubbard, a writer and engineer, developed and tested—but without fanfare—a rocket motor considerably superior to the V-2 instrument of propulsion and rather less complicated.
In fact, “Northrop” was none other than Hubbard himself, and his pseudonym, which is spelled “Northorp” elsewhere in the issue, was evidently inspired by Northrop Aircraft, a frequent presence in the magazine’s pages. (At the time that he was allegedly conducting rocket research in 1930, incidentally, Hubbard was just nineteen years old.) “Fortress in the Sky” was Hubbard’s first major publication after the war. He had been suffering from depression and, unusually, writer’s block, and before Campbell gave him the assignment, he had contributed just a few poems and short articles to the Catalina Islander. As such, the piece was a turning point in his career, but as far as I know, it has never been reprinted in its entirety. Recently, however, I got my hands on a copy of the original issue of Air Trails in which it appeared, and I was able to read the whole thing. Aside from Hubbard’s gratuitous reference to himself, which Campbell either believed or was willing to let slide, it’s a surprisingly plausible piece of futuristic speculation. Campbell appears to have provided much of the science, and a lot of the material relating to a future moon colony seems to have been drawn directly from the editor’s unpublished novel The Moon is Hell.
If you’re a science fiction fan, however, the most fascinating section comes about a third of the way through. While listing the moon’s strategic advantages as an atomic missile base, Hubbard notes:
The first [factor] might be termed the “gravity gauge” comparable to the weather gauge so desirable in the days of sailing ships of the line…The gravity gauge is important in the ratio of six to one, in that a missile would have to travel with an initial velocity of six miles per second to leave Earth, but would only have to travel with a velocity of one mile per second to leave the Moon. Such a missile, leaving Earth, would have to go nine-tenths of the way to the Moon on power before the latter body would begin to pull it in by its own gravity, whereas it would only have to travel one-tenth of the distance to escape the Moon and begin to ride down on Earth gravity.
On the next page, a diagram is provided to illustrate this point, with a caption that was presumably written by Campbell:
Skippers of old-time men-o’-war early learned the advantage of the “weather gauge,” which meant being up-wind from the enemy. The Moon affords a similar “gravity gauge” over the Earth by reason of its weaker pull. A missile shot from Earth to Moon must work against the stronger Earth gravity to the point where the pulls balance, nine-tenths of the way out. Missiles fired from the Moon work against one-sixth the pull through one-tenth the total distance, and ride free the rest of the way. Also, the moon has no retarding blanket of atmosphere to slow the take-off.
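The caption’s “nine-tenths” figure, incidentally, checks out. Since the pull of gravity falls off with the square of distance, and the Earth is about eighty-one times as massive as the moon, the point where the two pulls balance sits nine times farther from the Earth than from the moon. A minimal sketch of the arithmetic, in Python (my notation, not the article’s):

```python
import math

# The pulls balance where M_earth / d_e**2 == M_moon / d_m**2,
# which rearranges to d_e / d_m == sqrt(M_earth / M_moon).
mass_ratio = 81.0                    # Earth's mass / moon's mass (approximate)
d_ratio = math.sqrt(mass_ratio)      # distance from Earth : distance from moon = 9 : 1

# Fraction of the total Earth-moon distance covered before the balance point:
fraction_from_earth = d_ratio / (d_ratio + 1)
print(fraction_from_earth)           # 0.9 — nine-tenths of the way out
```

Which is exactly the proportion that Hubbard and Campbell cite.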
If I had to guess, I’d say the concept itself probably came from Campbell, while the term “gravity gauge” sounds more like Hubbard, a nautical enthusiast who casually uses the term “weather gauge” in one of his later novels, Masters of Sleep. But this is pure speculation.
Yet if this all sounds a little familiar, it might be because you’ve read Robert A. Heinlein’s The Moon is a Harsh Mistress, which was first published in 1966. The novel recounts the revolt of a lunar colony much like the one that Hubbard describes, and in a crucial plot point, its narrator quickly figures out the advantage that gravity provides, with the help of a computer named Mike: “Luna…has energy of position; she sits at top of gravity well eleven kilometers per second deep and kept from falling in by curb only two and a half km/s high. Mike knew that curb; daily he tossed grain freighters over it, let them slide downhill to Terra.” And they don’t need to use missiles at all. A hundred tons of anything, falling to earth from the moon, would generate six trillion joules of kinetic energy, or the equivalent of a two-kiloton atomic bomb. All they have to do is throw rocks: “That terrible speed results from gravity well shaped by Terra’s mass, eighty times that of Luna, and made no real difference whether Mike pushed a missile gently over well curb or flipped it briskly. Was not muscle that counted but great depth of that well.” Heinlein had also mentioned this concept before—although, revealingly, not in Rocket Ship Galileo, which is actually about a Nazi military base on the moon, but was written before Hubbard’s piece was published. In late 1947, a few months after the article appeared, Heinlein began work on the juvenile novel Hayworth Hall, later retitled Space Cadet, in which we find the exchange:
“The spaceship is the perfect answer in a military sense to the atom bomb, and to germ warfare and weather warfare. It can deliver an attack that can’t be stopped—and it is utterly impossible to attack that spaceship from the surface of a planet.”
Matt nodded. “The gravity gauge.”
“Yes, the gravity gauge. Men on the surface of a planet are as helpless against men in spaceships as a man would be trying to conduct a rock-throwing fight from the bottom of a well. The man at the top of the well has gravity working for him.”
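Heinlein’s numbers, for what it’s worth, hold up to a back-of-the-envelope check: an object dropped down Earth’s gravity well from the moon arrives at roughly Earth’s escape velocity, and the resulting energy for a hundred tons is on the order of the figure he cites. A minimal sketch in Python (the constants are standard values, not drawn from the novel):

```python
# Kinetic energy of 100 metric tons arriving at roughly
# Earth's escape velocity, compared against a kiloton of TNT.
mass_kg = 100_000             # 100 metric tons
v_impact = 11_200             # Earth escape velocity in m/s (~11.2 km/s)
kt_tnt = 4.184e12             # joules in one kiloton of TNT

kinetic_energy = 0.5 * mass_kg * v_impact ** 2
print(f"{kinetic_energy:.2e} J")                     # ~6.3e12 — "six trillion joules"
print(f"{kinetic_energy / kt_tnt:.1f} kilotons")     # ~1.5 kilotons
```

Which lands within rounding distance of Heinlein’s “six trillion joules” and his comparison to a small atomic bomb.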
Heinlein first publicly made the connection to the moon at a meeting of the County Librarians’ Association in Los Angeles on May 5, 1948. (One of the other writers in attendance, incidentally, was Theodor Geisel, who would later become famous as Dr. Seuss.) A writeup in the Los Angeles Times quoted Heinlein as saying that the moon would make an ideal military base: “A power on the moon would have the gravity gauge. The moon has one-sixth the normal earth gravity. It would be like throwing rocks downhill.”
Decades later, in 1965, Heinlein expanded on this point in a comment on his earlier essay “Pandora’s Box,” which was published in the collection Expanded Universe:
The disadvantage in being at the bottom of a deep “gravity well” is very great; gravity gauge will be as crucial in the coming years as wind gauge was in the days when sailing ships controlled empires. The nation that controls the Moon will control the Earth—but no one seems willing these days to speak that nasty fact out loud.
The italics are mine. Given the timing of its appearance in Space Cadet, Heinlein’s remarks on the subject in 1948, and his use of the term “wind gauge” in 1965, it seems clear that he read Hubbard’s “Fortress in the Sky” and was influenced by it while writing The Moon is a Harsh Mistress. We know for a fact that Heinlein read Air Trails. Campbell had approached him about writing an article for the magazine, and Heinlein wrote back to the editor on November 11, 1946: “Air Trails shows distinct improvement under your editing.” He was also friends with Hubbard, and there’s no question that he would have found this article intensely interesting. The fact that the three men had all spent time together over the previous couple of years means that we can’t rule out the possibility that Heinlein came up with the notion first and passed it along to one or both of the others. But I think that the simplest explanation—that Heinlein borrowed the concept from Hubbard’s article—is also the most likely one. It doesn’t seem to have occurred to Hubbard or Campbell that the gravity well could be used to deliver anything other than missiles, and utilizing it to “throw rocks” is, frankly, a much better idea. This implies that Heinlein read the article, mentioned it in passing in Space Cadet, thought up a distinct improvement, and then simply set it aside until he was ready to use it. (As far as I can tell, Hubbard was the first to use the term “gravity gauge.”) Hubbard’s official biographies refer to Heinlein as his “protégé,” which is a stretch even by their uncritical standards, but this is one case in which the direction of influence genuinely appears to have run the other way. As Heinlein wrote in Glory Road, using an image that he liked so much that he returned to it repeatedly: “That’s the way with writers; they’ll steal anything, file off the serial numbers, and claim it for their own.”
In my recent piece on Longreads about L. Ron Hubbard and the origins of Scientology, I note that Hubbard initially didn’t want the first important article on dianetics to appear in Astounding Science Fiction at all. In April of 1949, he made efforts to reach out to such organizations as the American Psychiatric Association, the American Psychological Association, and the Gerontological Society in Baltimore, and he only turned to the science fiction editor John W. Campbell after all of these earlier attempts had failed. Most of the standard biographies of Hubbard mention this fact, but what isn’t always emphasized is that even Campbell, who became one of Hubbard’s most passionate supporters, didn’t seem all that eager to publish the piece in Astounding. Campbell knew perfectly well that printing this material in a pulp magazine would make it hard for it to be taken seriously, and he was also concerned that it would be mistaken for a hoax article, like Isaac Asimov’s story about the fictional compound thiotimoline. As a result, even as Campbell served as a key member of the team that was developing dianetics in Bay Head, New Jersey, he continued to push for it to make its first appearance in a professional journal. Later that year, Dr. Joseph Winter, their third crucial collaborator, reached out “informally” about a paper to the Journal of the American Medical Association, only to be told that it lacked sufficient evidence, and he got much the same response from the American Journal of Psychiatry. It was only after they had exhausted these avenues that they decided to publish “Dianetics: The Evolution of a Science” in the magazine that Campbell himself edited—which tells us a lot about how they had originally wanted their work to be received.
At that point, Campbell was hardly in a position to be objective, but he wanted to present the article to his readers in a way that at least gave the appearance of balance. Accordingly, he proposed that they find a psychiatrist to write a critical treatment of dianetics, presumably to run alongside Hubbard’s piece—but he was doomed to be disappointed in this, too. On December 9, 1949, Hubbard wrote: “In view of the fact that no psychiatrist to date has been able to look at Dianetics and listen long enough to find out the fundamentals, Dianetic explanations being dinned out by his educational efforts about Freud, we took it upon ourselves to compose the rebuttal.” Incredibly, Hubbard and Winter wrote up an entire article, “A Criticism of Dianetics,” that spent over five thousand words laying out the case against the new therapy, credited to the nonexistent “Irving R. Kutzman, M.D.” (In his letter, Hubbard argued that the “M.D.” was justified, since it reflected the contributions of Winter, a general practitioner and endocrinologist from Michigan.) Hubbard claimed that the essay consisted of the verbatim comments of four psychiatrists he had consulted on the subject, including one he had met while living in Savannah, Georgia, and that he had “played them back very carefully,” using the perfect memory that a dianetic “clear” possessed. He also described setting up “a psychiatric demon” to write the piece, which refers to the notion that a clear can deliberately create and break down temporary delusions for his private amusement. To the best of my knowledge, this paper, which I discovered among Campbell’s correspondence, hasn’t been published or discussed anywhere else, and it provides some fascinating insights into Hubbard’s thinking at the time.
The most interesting thing about “A Criticism of Dianetics” is how straightforward it is. Hubbard told Campbell that “it is in no sense an effort to be funny and it is not funny,” and for most of the piece, there’s little trace of burlesque. Notably, it anticipates many of the objections that would be raised against dianetics, including the idea that it merely repackaged existing psychological concepts. As “Kutzman” writes: “Further examination…disclosed that scraps of Dianetics have been known for thousands of years. Except for one or two relatively minor matters, all of them are known to the modern psychologist.” He also observes that Hubbard has only thirteen months of data—which is actually generous, given how little he disclosed about any of his alleged cases—and that there’s no evidence that any perceived improvements will last. It’s only toward the end that the mask begins to slip. “Kutzman” speaks glowingly of “the new technique of trans-orbital leukotomy and the older and more reliable technique of pre-frontal lobotomy,” with which “patients can be treated more swiftly and will be less of a menace to society than heretofore.” He concludes: “By such operations…[the neurosurgeon] can get rid of that part of your personality which is causing all your trouble.” (Even the name “Kutzman,” I suspect, is a bad pun.) The piece dismisses General Semantics and cybernetics, the latter of which it attributes to a “Dr. Werner [sic],” and closes with an odd account of the fictional Kutzman being audited by Hubbard, in which he explains away the prenatal and childhood memories that he recovered as delusions: “I had eaten excessively at supper and…my ulcer had been troubling me for some time.” It ends: “Discoveries not solidly founded in classical psychoanalysis are not likely to be easily accepted by a social world which already comprehends all the basic problems of the human mind.”
In any event, it was never published, and it isn’t clear whether Hubbard or Winter ever thought that it would be. Hubbard wrote to Campbell: “Any article you receive will, I know, run something on this order if written by a psychiatrist…May I invite you to peruse same, not in any misguided spirit of levity, but as a review of the composite and variously confirmed attitudes Dianetics meets in the field of those great men who guide our minds.” No actual rebuttal ever materialized, and dianetics was presented in the pages of Astounding without any critical analysis whatsoever. (Interestingly, Hubbard did contribute to a point/counterpoint discussion on at least two other occasions. One was in the November 1950 issue of Why Magazine, which ran Hubbard’s “The Case For It” with “The Case Against It” by Dr. Oscar Sachs of Mount Sinai, and the other was in the May 1951 installment of Marvel Science Stories, which contained positive articles on dianetics from Hubbard and Theodore Sturgeon and a critical one from Lester del Rey. Campbell could have arranged for something similar in Astounding, if he had really wanted it.) But it provides a valuable glimpse into a transitional moment in Hubbard’s career. Compared to the author’s later attacks on psychiatry, its tone is restrained, even subtle—which isn’t a description that usually comes to mind for Hubbard’s work. Yet it’s equally clear that he had already given up on reaching mainstream psychologists and psychiatrists, even to the extent of convincing one to compose an objective response. Campbell, for his part, still clung to the hope of obtaining academic or scientific recognition. Much of the tragicomedy of what happened over the next eighteen months emerged from that basic misunderstanding. And the seeds of it are visible here.