Posts Tagged ‘Ben Hecht’
The uranium in the wine bottle
In the March 1944 issue of Astounding Science Fiction, readers were treated to the story “Deadline” by Cleve Cartmill, which was set on an alien planet consumed by a war between two factions known as the “Sixa” and the “Seilla.” Its hero was a spy, complete with a prehensile tail, whose mission was to fly into enemy territory and destroy the ultimate weapon before it could be detonated. The story itself was undeniably mediocre, and it would be utterly forgotten today if it weren’t for its description of the weapon in question, an atomic bomb, which Cartmill based almost verbatim on letters from the editor John W. Campbell, who had pitched the idea in the first place. According to the physicist Edward Teller, it was plausible enough to cause “astonishment” at the Manhattan Project, which counted many readers of the magazine among its scientists, and after it was brought to the attention of the Counterintelligence Corps, both Campbell and Cartmill were interviewed to investigate the possibility of a leak. In reality, “Deadline” wasn’t even much of a prediction—Campbell, who was feeling frustrated about his lack of involvement in war research, had a hunch that an atomic bomb was in the works, and he packed the story with technical information that was already in the public domain. He evidently hoped that it would draw official interest that might lead to a real defense role, which failed to materialize. After the war, however, it paid off immensely, and Campbell found himself hailed as a prophet. Cartmill, the credited author, neatly fell out of the picture, and the fact that the story hadn’t predicted much of anything was lost on most readers. Campbell had essentially orchestrated the most famous anecdote of his career, planting “Deadline” in the magazine expressly so that he could point to it later, and across multiple retellings, the details of the ensuing investigation were exaggerated beyond recognition. As the historian Donald Spoto aptly puts it: “[His] calculated image of himself as a prophet does not coincide with the truth; inspired by his sense of publicity, he told a better story than the facts reveal.”
Yet Spoto isn’t writing about Campbell at all, but about Alfred Hitchcock, in his classic biography The Dark Side of Genius, and the story in question isn’t “Deadline,” but the great romantic thriller Notorious. As legend has it, when Hitchcock had to come up with the MacGuffin, or the plot point that would drive the rest of the movie, he proposed a sample of uranium hidden in a wine bottle by a group of Nazis in Brazil. As he said to François Truffaut in their famous book-length interview:
The producer said, “What in the name of goodness is that?” I said, “This is uranium; it’s the thing they’re going to make an atom bomb with.” And he asked, “What atom bomb?” This, you must remember, was in 1944, a year before Hiroshima. I had only one clue. A writer friend of mine had told me that scientists were working on a secret project someplace in New Mexico. It was so secret that once they went into the plant, they never emerged again. I was also aware that the Germans were conducting experiments with heavy water in Norway. So these clues brought me to the uranium MacGuffin. The producer was skeptical, and he felt it was absurd to use the idea of an atom bomb as the basis for our story. I told him that it wasn’t the basis for the story, but only the MacGuffin, and I explained that there was no need to attach too much importance to it.
In the end, the idea was approved, and Hitchcock and screenwriter Ben Hecht allegedly went to Pasadena to get background information from the physicist Robert A. Millikan. According to Hitchcock, Millikan responded: “You want to have yourselves arrested and have me arrested as well?” After this outburst, Millikan informed them—in something of a non sequitur—that the idea was impossible anyway, although others evidently felt that they had come too close for comfort. As Hitchcock confided in Truffaut: “I learned later that afterward the FBI had me under surveillance for three months.”
Like many movie buffs, I accepted this story without question for years, but when you encounter it after the “Deadline” incident, it starts to seem too good to be true, which it was. As Spoto writes in The Dark Side of Genius: “The business of the uranium remained a considerable source of publicity for Hitchcock to the end of his life. To François Truffaut, to this writer, and to many others, he always insisted that he had chosen the device of uranium ore in Nazi experiments quite coincidentally, far in advance of the detonation of the atomic bomb in Japan in August 1945…He always emphasized, in every discussion of Notorious, that he was virtually a prophet.” The truth, Spoto continues, was very different:
By the time Notorious actually began filming, in October 1945, Hitchcock had made yet another trip to London…and he had returned to Los Angeles for final script work in September—after the bombings of Japan, and after he had spent several weeks in New York testing actors, among whom were several famous German refugees he finally cast in the film. On the basis of news from these German contacts, and from the accounts that flooded the world press…Hitchcock and Hecht refined the last addenda to their script just before the first day of production…All the evidence suggests that in truth the uranium was included after the fact.
As for the allegation of government surveillance, it was evidently based on a general directive from the FBI that the producer David O. Selznick received in May, which cautioned that any movie that featured American intelligence would have to be cleared by the State Department. Like Campbell, Hitchcock liked to make people think that he had been given special attention, and over the years, in both cases, the stories only grew.
There are obvious similarities between these two incidents, as well as equally noteworthy differences. With “Deadline,” the description of the bomb is the story’s sole reason for existing, while Notorious would still be a masterpiece even if the MacGuffin had been something else entirely. (As Hitchcock allegedly told his producer: “Look, if you don’t like uranium, let’s make it industrial diamonds, which the Germans need to cut their tools with.” He claimed to have later told a movie executive who had objected to the screenplay on grounds of its implausibility: “You were wrong to attach any importance to the MacGuffin. Notorious was simply the story of a man in love with a girl who, in the course of her official duties, had to go to bed with another man and even had to marry him. That’s the story.” And even if he invented the conversation, his point still stands.) The other difference is the use to which each anecdote was put. For Hitchcock, the uranium incident, and the reputation that it gave him as a “prophet,” was just another way of burnishing his image, and although he enjoyed dining out on it, it was a minor part of his legend. Campbell, by contrast, used it as the basis for his entire postwar career. Just two weeks after Hiroshima, The New Yorker profiled him in a Talk of the Town piece titled “1945 Cassandra,” in which it credulously wrote:
If you want to keep up with, or possibly stay ahead of, the development of secret weapons in time of war, you had better…go to the pulps, preferably Astounding. One reason is that Astounding, which has for the past ten years or so been predicting atomic bombs and using them to liven up its stories, has been permitted to duck some of the security rules that made high-echelon government officials such halting conversationalists in recent months.
And that reputation hinged largely on the myth of “Deadline” and its creation. It bought Campbell tremendous credibility after the war, earned or otherwise, and it played a significant role in science fiction’s big push into the mainstream. Eventually, the editor would stake—and lose—all of that goodwill on dianetics. But for a few years, Campbell, like Hitchcock, got to play his audience like a piano, and both men liked to pretend that they had once been notorious.
Playing the game
Yesterday, the magazine PC Gamer published an article by Alex Wiltshire on the challenges of writing for blockbuster video games. It’s an illuminating piece, especially if you haven’t given much thought to the subject before, and it’s loaded with interesting insights. What struck me the most, though, was the way in which the writers tend to talk about themselves and their craft. Walt Williams, the author of a memoir about game development that I clearly need to read, says of the trade: “As much as we like to say that video games can be a narrative medium, financially they’re really not…Writing is expendable.” Tom Bissell, who has worked on games in the Gears of War and Uncharted series, has similar views, as Wiltshire writes:
Bissell says that games have “shitty stories” because games are often simply absurd. “That’s not a criticism, it’s an acknowledgment of the reality that stares anyone working on an action game right in the face…The only way you escape the absurdity problem is through sheer force of will, and you can do that only when the prime creative force behind the game is also overseeing virtually every aspect of it…That’s not a position most game writers will ever find themselves in, obviously.”
And Williams concludes: “Our biggest mistake is that we’ve decided to consider AAA [blockbuster] games as something better than they are. We like to think our super-silly destruction derby arena is a piece of serious art that can say something meaningful.”
As I read this, I was strongly reminded of what another writer says about an art form that had been around for decades, but was still in its formative stages at the height of his career:
The movies are one of the bad habits that corrupted our century…The persistent banality of the movies is due to the “vision” of their manufacturers. I do not mean by manufacturers, writers or directors. These harassed toilers are no more than the lowest of Unteroffizieren in movieland. The orders come from the tents of a dozen invisible generals. The “vision” is theirs. They keep a visionary eye glued to the fact that the lower in class an entertainment product is, the more people will buy it…[The studio head] must examine every idea, plot, or venture submitted to him from the single point of view of whether it is trite enough to appeal to the masses.
The writer here is the screenwriter Ben Hecht, whose memoir A Child of the Century is filled with what Pauline Kael describes in “Raising Kane” as his “frivolously cynical” view of filmmaking. In 1925, Hecht, who had only seen “a few movies” at the time, said confidently to his friend Herman J. Mankiewicz: “Anybody with a good memory for clichés and unafraid to write like a child can bat out a superb movie in a few days.” A year later, Mankiewicz—who would go on to win an Oscar for Citizen Kane—took him at his word, and he cabled Hecht from Hollywood: “Will you accept three hundred per week to work for Paramount Pictures? All expenses paid. The three hundred is peanuts. Millions are to be grabbed out here and your only competition is idiots. Don’t let this get around.”
Hecht went on to a legendary career, much of which was spent serving as what Tomb Raider writer Rhianna Pratchett calls a “narrative paramedic” on movies like Gone With the Wind. And while I doubt that any video game writers are earning millions from their work, their attitude toward their medium seems largely the same as Hecht’s, even before you account for the intervening ninety years. Hecht writes of the films of the thirties:
One basic plot only has appeared daily in their fifteen thousand theaters—the triumph of virtue and the overthrow of wickedness…Not only was the plot the same, but the characters in it never varied. These characters must always be good or bad (and never human) in order not to confuse the plot of Virtue Triumphing. This denouement could be best achieved by stereotypes a fraction removed from those in the comic strips.
Despite their occasional stabs at moral ambiguity, most games operate under similar constraints, and the situation is only exacerbated by the money and resources at stake. Hecht writes that “millions of dollars and not mere thousands were involved,” while Bissell says that video games are “possibly the most complicated popular art form ever created,” which only decreases any tolerance for risk. Invariably, it’s the writers who lose. Hecht says that he ultimately lost every fight that he had with his producers, adding mordantly: “Months later, watching ‘my’ movie in a theater, I realized that not much damage had actually been done. A movie is basically so trite and glib that the addition of a half dozen miserable inanities does not cripple it.”
You might think that the solution would be to give the writers more control, but those on the inside seem unconvinced. Wiltshire writes:
For Bissell it’s a misconception that they’d improve if only writers were more integral with development. “Sorry, but that’s just not true in my experience. Games can go wrong in so many ways that have nothing to do with who the writer is or how well or poorly he or she or they are treated. Sometimes cleaning up the mess in a wayward game falls on level design and sometimes art and sometimes narrative, but this idea that games have ‘shitty stories’ because there aren’t good writers in the industry, or that writers aren’t listened to, is, to be perfectly frank, a deflection.”
Hecht makes much the same observation: “In a curious way, there is not much difference between the product of a good writer and a bad one. They both have to toe the same mark.” Which seems to be the real point in common. Movies and video games can both produce masterpieces, even at their most commercial, but on the blockbuster level, they tend to be the sum of a pattern of forces, with the writer serving as a kind of release valve for the rest, even if his or her contributions are usually undervalued. (“Everyone writes, whereas not everyone designs or codes, and I think people feel they have a stake in it,” says Phil Huxley, a former writer for Rocksteady.) In both cases, success or failure can be a matter of luck, and in the meantime, the game has to be its own reward, as Hecht knows well: “Making movies is a game played by a few thousand toy-minded folk. It is obsessive, exhausting, and jolly, as a good game should be. Played intently, it divorces you from life, as a good game will do.”
The vision thing
A few days ago, I was struck by the fact that a mere thirty-one years separated The Thing From Another World from John Carpenter’s The Thing. The former was released on April 21, 1951, the latter on June 25, 1982, and another remake, which I haven’t yet seen, arrived right on schedule in 2011. Three decades might have once seemed like a long time to me, but now, it feels like the blink of an eye. It’s the equivalent of the upcoming remake of David Cronenberg’s The Fly, which was itself a reimagining of a movie that had been around for about the same amount of time. I picked these examples at random, and while there isn’t anything magical about a thirty-year cycle, it isn’t hard to understand. It’s enough time for a new generation of viewers to come of age, but not quite long enough for the memory of the earlier movie to fade entirely. (From my perspective, the films of the eighties seem psychologically far closer than those of the seventies, and not just for reasons of style.) It’s also long enough for the original reaction to a movie to be largely forgotten, so that it settles at what feels like its natural level. When The Thing From Another World first premiered, Isaac Asimov thought that it was one of the worst movies ever made. John W. Campbell, on whose original story it was based, was more generous, writing of the filmmakers: “I think they may be right in feeling that the proposition in ‘Who Goes There?’ is a little strong if presented literally in the screen.” Elsewhere, he noted:
I have an impression that the original version directed and acted with equal restraint would have sent some ten percent of the average movie audience into genuine, no-kidding, semi-permanent hysterical screaming meemies…You think that [story] wouldn’t tip an incipient paranoid psychotic right off the edge if it were presented skillfully?
For once, Campbell, whose predictions were only rarely on the mark, was entirely prescient. By the time John Carpenter’s The Thing came out, The Thing From Another World was seen as a classic, and the remake, which tracked the original novella much more closely, struck many viewers as an assault on its legacy. One of its most vocal detractors, curiously, was Harlan Ellison, who certainly couldn’t be accused of squeamishness. In a column for L.A. Weekly, Ellison wrote that Carpenter “showed some stuff with Halloween,” but dismissed his later movies as “a swan dive into the potty.” He continued:
The Thing…[is a] depredation [Carpenter] attempts to validate by saying he wanted to pull out of the original John W. Campbell story those treasures undiscovered by the original creators…One should not eat before seeing it…and one cannot eat after having seen it.
If the treasures Carpenter sought to unearth are contained in the special effects lunacy of mannequins made to look like men, splitting open to disgorge sentient lasagna that slaughters for no conceivable reason, then John Carpenter is a raider of the lost ark of Art who ought to be sentenced to a lifetime of watching Neil Simon plays and films.
The Thing did not need to be remade, if the best this fearfully limited director could bring forth was a ripoff of Alien in the frozen tundra, this pointless, dehumanized freeway smashup of grisly special effects dreck, flensed of all characterization, philosophy, subtext, or rationality.
Thirty years later, pop culture has come full circle, and it’s fair to say that Carpenter’s movie has eclipsed not just Howard Hawks and Christian Nyby, but even Campbell himself. (Having spent the last year trying to explain what I’m doing to people who aren’t science fiction fans, I can testify that if Campbell’s name resonates with them at all, it’s thanks solely to the 1982 version of The Thing.) Yet the two movies also share surprising affinities, and not simply because Carpenter idolized Hawks. Both seem interested in Campbell’s premise mostly for the visual possibilities that it suggests. In the late forties, the rights to “Who Goes There?” were purchased by RKO at the urging of Ben Hecht and Charles Lederer, the latter of whom wrote the script, with uncredited contributions from Hecht and Hawks. The direction was credited to Nyby, Hawks’s protégé, but Hawks was always on the set and later claimed most of the director’s fee, leading to much disagreement over who was responsible for the result. In the end, the finished film threw out nearly all of Campbell’s story, keeping only the basic premise of an alien spacecraft discovered by researchers in an icy environment, while shifting the setting from Antarctica to Alaska. The filmmakers were clearly more drawn to the idea of a group of men facing danger in isolation, one of Hawks’s favorite themes, and they lavished greater attention on the stock types that they understood—the pilot, the journalist, the girl—than on the scientists, who were reduced to thankless foils. David Thomson has noted that the central principle of Hawks’s work is that “men are more expressive rolling a cigarette than saving the world,” and the contrast has never been more evident than it is here.
And while Hawks isn’t usually remembered as a visual director, The Thing From Another World exists almost entirely as a series of images: the opening titles burning through the screen, the crew standing in a circle on the ice to reveal the shape of the flying saucer underneath, the shock reveal of the alien itself in the doorway. When you account for the passage of time, Carpenter’s version rests on similar foundations. His characters and dialogue are less distinct than Hawks’s, but he also seems to have regarded Campbell’s story primarily as a source of visual problems and solutions. I don’t think I’m alone in saying that the images that are burned into my brain from The Thing probably add up to a total of about five minutes: the limits of its technology mean that we only see it in action for a few seconds at a time. But those images, most of which were the work of the special effects prodigy Rob Bottin, are still the best practical effects I’ve ever seen. (It also includes the single best jump scare in the movies, which is taken all but intact from Campbell.) Even after thirty years, its shock moments are so unforgettable that they have a way of overpowering the rest, as they did for Ellison, and neither version ever really approximates the clean narrative momentum of “Who Goes There?” But maybe that’s how it should be. Campbell, for all his gifts, wasn’t primarily a visual writer, and the movies are a visual medium, particularly in horror and science fiction. Both of the classic versions of The Thing are translations from one kind of storytelling to another, and they stick in the imagination precisely to the extent that they depart from the original. They’re works for the eye, not the mind, which may be why the only memorable line in either movie is the final warning in Hawks’s version, broadcast over the airwaves to the world, telling us to watch the skies.
Looking forward, looking back
If you ask a man how many times he has loved—unless there is love in his heart at the moment—he is likely to answer, “Never.”
—Ben Hecht, A Child of the Century
Every now and then, I’ll go over to the bookshelf, pull down a copy of one of my own novels, and idly leaf through the pages. Whenever I do, my first thought is usually, Hey, this isn’t bad. But I can’t say that I’m all that tempted to read them over again. Finished works are like the old girlfriends or boyfriends of the writing life: they’ve left you with some lasting memories and some regrets, but now that it’s all over and done, you don’t necessarily want to go poking around to see what might be there today. I don’t think anyone who hasn’t written a novel can understand the ambivalence with which a writer regards a story that used to be a living, growing entity, and now is something closer to a dead thing, with its mistakes and typos still intact. I like my novels; they were always books that I wanted to read myself. But going back to revisit them again now feels a little like digging around into matters that shouldn’t be disturbed. As the members of Spinal Tap say about their first drummer, who died in a bizarre gardening accident: “The authorities said it was best to leave it unsolved.”
John Updike says somewhere in Self-Consciousness that it doesn’t make sense to be afraid of death, since we’ve all successively taken on and given up a series of selves that might as well be other people entirely. I have a feeling that he was pushed into that insight by his work as a novelist, which superimposes a second layer of reinvention on the changes that we all undergo. A writer is never quite the same person he was while writing a particular novel: you immerse yourself for a year or so in a web of lives that feel very real in the moment, but they’re diminished the second you turn to the next story. I’ve always said that a draft of any novel amounts to a message from my past self to the future, and that’s doubly true of everything that ended up in print. I vaguely remember the months of work that each book required, and certain moments in the creative process are indelibly vivid, but a lot of it has faded into a kind of creative haze. Keeping focused on the work at hand is hard enough; if you want to give the current story everything you have, you need to kill all those old darlings.
But that can be its own kind of trap. I’ve often thought that the secret to living a fulfilling life, not that I’ve managed to do this myself, is less about transforming into something better than about fully integrating all the old selves that we’ve left behind. If we could face each day as the sum of our experiences from childhood to adolescence to adulthood, with all those strange byways and fleeting obsessions and forgotten loves and hates organized into one person, we’d emerge as beings of incredible complexity, no matter how mundane the individual pieces might be. In practice, that’s not how we approach life: we’re more concerned with the little dilemmas that confront us every morning than with finding a shape for the whole. (Say what you will about psychoanalysis, but its underlying project—to understand the present in terms of the past—is hugely important, and it’s no surprise that it can require a lifetime of talk just to process what has already happened.) That’s true of writing, too. You do a better job of solving the problems in front of you if you have some sense of where you’ve been before, which means fighting against the amnesia that descends once you’ve moved on from an old story.
That’s a big part of the reason why I’ve spent so much time on the writer’s commentaries on The Icon Thief and City of Exiles. Like a lot of features on this blog, they’re really something I do for myself, even if I’d like to think that other readers—even those who haven’t picked up any of the novels—might get something out of it as well. They’re an excuse to confront old pages, gleaning any lessons I can from whatever I find there, while always remaining honest about their shortcomings: otherwise, there wouldn’t be much of a point. Sometimes I’m a little confused by my own conclusions; I still can’t decide if City of Exiles is the strongest novel in the series or the weakest. But even that confusion has its place. Next week, I’m going to start the process all over again for Eternal Empire, the final novel in the trilogy, and the one that I probably know the least well. If it inspires you to purchase a copy, that’s fantastic, but selling books was never really the point here. It’s more a way of setting down certain impressions for my own edification before time and distance erase them all. That may seem like a lot to put on three books that were never intended to be much more than smart, diverting thrillers. But that’s how it always feels when you look up an old flame.
Living with dissatisfaction
Novelists come in all shapes and sizes, but if they’re united by anything, it’s two qualities of mind. The first is an irrational optimism, a sense that despite all evidence to the contrary, they’ll beat the odds and become one of the thousand or fewer authors on the planet to make a living from writing fiction alone. The second is ambition, which can be both a blessing and a curse. Ambition is the only force that can carry any sane person through the effort of writing an entire novel: a more rational being would have given up long before, and many do. We have ambition to thank for the novels that stand as towering works of the human spirit, for the most crassly calculated commercial fiction, and for everything in between: if money were the only thing on a writer’s mind, after all, there are easier ways of making a living. It all comes down to a desire to be known as a writer, or to leave something meaningful behind when we’re gone, and while other factors play their part—an urgent story to tell, the need to express our innermost thoughts and feelings, a sense of emptiness when we contemplate a life without some sustaining project—it’s ambition that keeps it going and carries it home.
That’s the good thing about ambition: it comes out of nowhere and rarely leaves a true writer entirely, and without it, the shelves of our homes and libraries would be bare. But there’s a dark side to it as well. It’s the voice in a writer’s head that tells him that he’ll never be as good as he has the potential to be, and one that quickly takes for granted what he’s already accomplished. Speaking of romantic love, the great screenwriter Ben Hecht once wrote:
If you ask a man how many times he has loved—unless there is love in his heart at the moment—he is likely to answer, “Never.” He will say, if his heart is loveless, that often he had thought he loved, but that, victim or hero of love, he was mistaken. For only love can believe in love—or even remember it.
Change a word here and there, and that sums up how most writers feel about what they’ve done in the past. When you’re working on a novel, it seems urgent, inevitable, the most important thing in the world; when it’s done, even in published form, it starts to feel a little dead, and whatever pleasure it once gave you is quickly swallowed up by the drive to move on to the next big thing. That voice in your head is implacable and coolly rational: What you’ve done so far is all very well and good, it says, but what have you done for me lately?
Learning to live with those two sides of ambition is one of the hardest challenges faced by a writer, or any creative artist. I’ve been living with an unquenchable ambition for as long as I can remember, and brother, it’s exhausting. I love writing, but as with so many other authors before me, the act itself gets tied up with other, less wholesome emotions, like competition or dissatisfaction. It doesn’t help to remind myself that if I’m dissatisfied with what I’ve done so far, it has less to do with my achievements themselves than with an ingrained state of mind: I’m the kind of person who is never going to be entirely satisfied, even if I tick off every item on my literary bucket list. I even catch myself wondering what it would be like to turn it all off. If there were a switch I could press to take my ambitions away, leaving me content with what I’ve accomplished and willing to live in relative peace, there are times when I’d be tempted to flip it. When someone like Philip Roth decides that it’s no longer worth the trouble and walks away, it makes headlines, but Roth only did what most writers, in their heart of hearts, often wish they could do, if only the voices in their heads would allow it.
And the only solution I’ve ever found is to refocus that ambition on the one place where it can do a bit of good, regardless of its external results: on the way you spend your time from one minute to the next. We may not be able to control what happens to our work once we’re done with it, or how we’ll feel about it if we ever see it in print, but we can at least make sure that our free time is spent thinking about the things we care about and pursuing the activities that matter to us. In some ways, that’s the most worthwhile ambition of all—the determination to own the time that we’re afforded, not just on the level of constructing a body of work that will outlive us, but on the level of spending the next available hour doing something we find interesting. It’s quite possible that, like Hecht’s hypothetical lover, we won’t be satisfied with what we’ve produced, but the time invested in that pursuit can’t be wasted, however many mistakes we make along the way. Whenever I feel less than content with something I’ve done, I stop and ask myself a slightly different question: Am I happy with the way I spent the time it took? And if the response is yes, then I’ve got my real answer.
Quote of the Day
Will you accept three hundred per week to work for Paramount Pictures? All expenses paid. The three hundred is peanuts. Millions are to be grabbed out here and your only competition is idiots. Don’t let this get around.
—Ben Hecht, in a telegram to Herman J. Mankiewicz