Posts Tagged ‘Howard Hawks’
A few days ago, I was struck by the fact that a mere thirty-one years separated The Thing From Another World from John Carpenter’s The Thing. The former was released on April 21, 1951, the latter on June 25, 1982, and another remake, which I haven’t yet seen, arrived right on schedule in 2011. Three decades might have once seemed like a long time to me, but now, it feels like the blink of an eye. It’s the equivalent of the upcoming remake of David Cronenberg’s The Fly, which was itself a reimagining of a movie that had been around for about the same amount of time. I picked these examples at random, and while there isn’t anything magical about a thirty-year cycle, it isn’t hard to understand. It’s enough time for a new generation of viewers to come of age, but not quite long enough for the memory of the earlier movie to fade entirely. (From my perspective, the films of the eighties seem psychologically far closer than those of the seventies, and not just for reasons of style.) It’s also long enough for the original reaction to a movie to be largely forgotten, so that it settles at what feels like its natural level. When The Thing From Another World first premiered, Isaac Asimov thought that it was one of the worst movies ever made. John W. Campbell, on whose original story it was based, was more generous, writing of the filmmakers: “I think they may be right in feeling that the proposition in ‘Who Goes There?’ is a little strong if presented literally on the screen.” Elsewhere, he noted:
I have an impression that the original version directed and acted with equal restraint would have sent some ten percent of the average movie audience into genuine, no-kidding, semi-permanent hysterical screaming meemies…You think that [story] wouldn’t tip an insipid paranoid psychotic right off the edge if it were presented skillfully?
For once, Campbell, whose predictions were only rarely on the mark, was entirely prescient. By the time John Carpenter’s The Thing came out, The Thing From Another World was seen as a classic, and the remake, which tracked the original novella much more closely, struck many viewers as an assault on its legacy. One of its most vocal detractors, curiously, was Harlan Ellison, who certainly couldn’t be accused of squeamishness. In a column for L.A. Weekly, Ellison wrote that Carpenter “showed some stuff with Halloween,” but dismissed his later movies as “a swan dive into the potty.” He continued:
The Thing…[is a] depredation [Carpenter] attempts to validate by saying he wanted to pull out of the original John W. Campbell story those treasures undiscovered by the original creators…One should not eat before seeing it…and one cannot eat after having seen it.
If the treasures Carpenter sought to unearth are contained in the special effects lunacy of mannequins made to look like men, splitting open to disgorge sentient lasagna that slaughters for no conceivable reason, then John Carpenter is a raider of the lost ark of Art who ought to be sentenced to a lifetime of watching Neil Simon plays and films.
The Thing did not need to be remade, if the best this fearfully limited director could bring forth was a ripoff of Alien in the frozen tundra, this pointless, dehumanized freeway smashup of grisly special effects dreck, flensed of all characterization, philosophy, subtext, or rationality.
Thirty years later, the cycle of pop culture has come full circle, and it’s fair to say that Carpenter’s movie has eclipsed not just Howard Hawks and Christian Nyby, but even Campbell himself. (Having spent the last year trying to explain what I’m doing to people who aren’t science fiction fans, I can testify that if Campbell’s name resonates with them at all, it’s thanks solely to the 1982 version of The Thing.) Yet the two movies also share surprising affinities, and not simply because Carpenter idolized Hawks. Both seem interested in Campbell’s premise mostly for the visual possibilities that it suggests. In the late forties, the rights to “Who Goes There?” were purchased by RKO at the urging of Ben Hecht and Charles Lederer, the latter of whom wrote the script, with uncredited contributions from Hecht and Hawks. The direction was credited to Nyby, Hawks’s protégé, but Hawks was always on the set and later claimed most of the director’s fee, leading to much disagreement over who was responsible for the result. In the end, the film threw out nearly all of Campbell’s story, keeping only the basic premise of an alien spacecraft discovered by researchers in an icy environment, while shifting the setting from Antarctica to Alaska. The filmmakers were clearly more drawn to the idea of a group of men facing danger in isolation, one of Hawks’s favorite themes, and they lavished greater attention on the stock types that they understood—the pilot, the journalist, the girl—than on the scientists, who were reduced to thankless foils. David Thomson has noted that the central principle of Hawks’s work is that “men are more expressive rolling a cigarette than saving the world,” and the contrast has never been more evident than it is here.
And while Hawks isn’t usually remembered as a visual director, The Thing From Another World exists almost entirely as a series of images: the opening titles burning through the screen, the crew standing in a circle on the ice to reveal the shape of the flying saucer underneath, the shock reveal of the alien itself in the doorway. When you account for the passage of time, Carpenter’s version rests on similar foundations. His characters and dialogue are less distinct than Hawks’s, but he also seems to have regarded Campbell’s story primarily as a source of visual problems and solutions. I don’t think I’m alone in saying that the images that are burned into my brain from The Thing probably add up to a total of about five minutes: the limits of its technology mean that we only see it in action for a few seconds at a time. But those images, most of which were the work of the special effects prodigy Rob Bottin, are still the best practical effects I’ve ever seen. (It also includes the single best jump scare in the movies, which is taken all but intact from Campbell.) Even after thirty years, its shock moments are so unforgettable that they have a way of overpowering the rest, as they did for Ellison, and neither version ever really approximates the clean narrative momentum of “Who Goes There?” But maybe that’s how it should be. Campbell, for all his gifts, wasn’t primarily a visual writer, and the movies are a visual medium, particularly in horror and science fiction. Both of the classic versions of The Thing are translations from one kind of storytelling to another, and they stick in the imagination precisely to the extent that they depart from the original. They’re works for the eye, not the mind, which may be why the only memorable line in either movie is the final warning in Hawks’s version, broadcast over the airwaves to the world, telling us to watch the skies.
Over the last year or so, I’ve found myself repeatedly struck by the parallels between the careers of John W. Campbell and Orson Welles. At first, the connection might seem tenuous. Campbell and Welles didn’t look anything alike, although they were about the same height, and their politics couldn’t have been more different—Welles was a staunch progressive and defender of civil rights, while Campbell, to put it mildly, wasn’t. Welles was a wanderer, while Campbell spent most of his life within driving distance of his birthplace in New Jersey. But they’re inextricably linked in my imagination. Welles was five years younger than Campbell, but they flourished at exactly the same time, with their careers peaking roughly between 1937 and 1942. Both owed significant creative breakthroughs to the work of H.G. Wells, who inspired Campbell’s story “Twilight” and Welles’s Mercury Theater adaptation of The War of the Worlds. In 1938, Campbell saw Welles’s famous modern-dress production of Julius Caesar with the writer L. Sprague de Camp, of which he wrote in a letter:
It represented, in a way, what I’m trying to do in the magazine. Those humans of two thousand years ago thought and acted as we do—even if they did dress differently. Removing the funny clothes made them more real and understandable. I’m trying to get away from funny clothes and funny-looking people in the pictures of the magazine. And have more humans.
And I suspect that the performance started a train of thought in both men’s minds that led to de Camp’s novel Lest Darkness Fall, which is about a man from the present who ends up in ancient Rome.
Campbell was less pleased by Welles’s most notable venture into science fiction, which he must have seen as an incursion on his turf. He wrote to his friend Robert Swisher: “So far as sponsoring that War of [the] Worlds thing—I’m damn glad we didn’t! The thing is going to cost CBS money, what with suits, etc., and we’re better off without it.” In Astounding, he said that the ensuing panic demonstrated the need for “wider appreciation” of science fiction, in order to educate the public about what was and wasn’t real:
I have long been an exponent of the belief that, should interplanetary visitors actually arrive, no one could possibly convince the public of the fact. These stories wherein the fact is suddenly announced and widespread panic immediately ensues have always seemed to me highly improbable, simply because the average man did not seem ready to visualize and believe such a statement.
Undoubtedly, Mr. Orson Welles felt the same way.
Their most significant point of intersection was The Shadow, who was created by an advertising agency for Street & Smith, the publisher of Astounding, as a fictional narrator for the radio series Detective Story Hour. Before long, he became popular enough to star in his own stories. Welles, of course, voiced The Shadow from September 1937 to October 1938, and Campbell plotted some of the magazine installments in collaboration with the writer Walter B. Gibson and the editor John Nanovic, who worked in the office next door. And his identification with the character seems to have run even deeper. In a profile published in the February 1946 issue of Pic magazine, the reporter Dickson Hartwell wrote of Campbell: “You will find him voluble, friendly and personally depressing only in what his friends claim is a startling physical resemblance to The Shadow.”
It isn’t clear if Welles was aware of Campbell, although it would be more surprising if he wasn’t. Welles flitted around science fiction for years, and he occasionally crossed paths with other authors in that circle. To my lasting regret, he never met L. Ron Hubbard, which would have been an epic collision of bullshitters—although Philip Seymour Hoffman claimed that he based his performance in The Master mostly on Welles, and Theodore Sturgeon once said that Welles and Hubbard were the only men he had ever met who could make a room seem crowded simply by walking through the door. In 1946, Isaac Asimov received a call from a lawyer whose client wanted to buy all rights to his robot story “Evidence” for $250. When he asked Campbell for advice, the editor said that he thought it seemed fair, but Asimov’s wife told him to hold out for more. Asimov called back to ask for a thousand dollars, adding that he wouldn’t discuss it further until he found out who the client was. When the lawyer told him that it was Welles, Asimov agreed to the sale, delighted, but nothing ever came of it. (Welles also owned the story in perpetuity, making it impossible for Asimov to sell it elsewhere, a point that Campbell, who took a notoriously casual attitude toward rights, had neglected to raise.) Twenty years later, Welles made inquiries into the rights for Heinlein’s The Puppet Masters, which were tied up at the time with Roger Corman, but never followed up. And it’s worth noting that both stories are concerned with the problem of knowing whether other people are what they claim to be, which Campbell had brilliantly explored in “Who Goes There?” It’s a theme to which Welles obsessively returned, and it’s fascinating to speculate what he might have done with it if Howard Hawks and Christian Nyby hadn’t gotten there first with The Thing From Another World. Who knows what evil lurks in the hearts of men?
But their true affinities were spiritual ones. Both Campbell and Welles were child prodigies who reinvented an art form largely by being superb organizers of other people’s talents—although Campbell always downplayed his own contributions, while Welles appears to have done the opposite. Each had a spectacular early success followed by what was perceived as decades of decline, which they seem to have seen coming. (David Thomson writes: “As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent.” And you could say much the same thing about “Twilight.”) Both had a habit of abandoning projects as soon as they realized that they couldn’t control them, and they both managed to seem isolated while occupying the center of attention in any crowd. They enjoyed staking out unreasonable positions in conversation, just to get a rise out of listeners, and they ultimately drove away their most valuable collaborators. What Pauline Kael writes of Welles in “Raising Kane” is equally true of Campbell:
He lost the collaborative partnerships that he needed…He was alone, trying to be “Orson Welles,” though “Orson Welles” had stood for the activities of a group. But he needed the family to hold him together on a project and to take over for him when his energies became scattered. With them, he was a prodigy of accomplishments; without them, he flew apart, became disorderly.
Both men were alone when they died, and both filled their friends, admirers, and biographers with intensely mixed feelings. I’m still coming to terms with Campbell. But I have a hunch that I’ll end up somewhere close to Kael’s ambivalence toward Welles, who, at the end of an essay that was widely seen as puncturing his myth, could only conclude: “In a less confused world, his glory would be greater than his guilt.”
“The history of the world is but the biography of great men,” Thomas Carlyle once wrote, and although this statement was criticized almost at once, it accurately captures the way many of us continue to think about historical events, both large and small. There’s something inherently appealing about the idea that certain exceptional personalities—Alexander the Great, Julius Caesar, Napoleon—can seize and turn the temper of their time, and we see it today in attempts to explain, say, the personal computing revolution through the life of someone like Steve Jobs. The alternate view, which was expressed forcefully by Herbert Spencer, is that history is the outcome of impersonal social and economic forces, in which a single man or woman can do little more than catalyze trends that are already there. If Napoleon had never lived, the theory goes, someone very much like him would have taken his place. It’s safe to say that any reasonable view of history has to take both theories into account: Napoleon was extraordinary in ways that can’t be fully explained by his environment, even if he was inseparably a part of it. But it’s also worth remembering that much of our fascination with such individuals arises from our craving for narrative structures, which demand a clear hero or villain. (The major exception, interestingly, is science fiction, in which the “protagonist” is often humanity as a whole. And the transition from the hard science fiction of the golden age to messianic stories like Dune, in which the great man reasserts himself with a vengeance, is a critical turning point in the genre’s development.)
You can see a similar divide in storytelling, too. One school of thought implicitly assumes that a story is a delivery system for great scenes, with the rest of the plot serving as a scaffold to enable a handful of awesome moments. Another approach sees a narrative as a series of small, carefully chosen details designed to create an emotional effect greater than the sum of its parts. When it comes to the former strategy, it’s hard to think of a better example than Game of Thrones, a television series that often seems to be marking time between high points: it can test a viewer’s patience, but to the extent that it works, it’s because it constantly promises a big payoff around the corner, and we can expect two or three transcendent set pieces per season. Mad Men took the opposite tack: it was made up of countless tiny but riveting choices that gained power from their cumulative impact. Like the theories of history I mentioned above, neither type of storytelling is necessarily correct or complete in itself, and you’ll find plenty of exceptions, even in works that seem to fall clearly into one category or the other. It certainly doesn’t mean that one kind of story is “better” than the other. But it provides a useful way to structure our thinking, especially when we consider how subtly one theory shades into the other in practice. The director Howard Hawks famously said that a good movie consisted of three great scenes and no bad scenes, which seems like a vote for the Game of Thrones model. Yet a great scene doesn’t exist in isolation, and the closer we look at stories that work, the more important those nonexistent “bad scenes” start to become.
I got to thinking about this last week, shortly after I completed the series about my alternative movie canon. Looking back at those posts, I noticed that I singled out three of these movies—The Night of the Hunter, The Limey, and Down with Love—for the sake of one memorable scene. But these scenes also depend in tangible ways on their surrounding material. The river sequence in The Night of the Hunter comes out of nowhere, but it’s also the culmination of a language of dreams that the rest of the movie has established. Terence Stamp’s unseen revenge in The Limey works only because we’ve been prepared for it by a slow buildup that lasts for more than twenty minutes. And Renée Zellweger’s confessional speech in Down with Love is striking largely because of how different it is from the movie around it: the rest of the film is relentlessly active, colorful, and noisy, and her long, unbroken take stands out for how emphatically it presses the pause button. None of the scenes would play as well out of context, and it’s easy to imagine a version of each movie in which they didn’t work at all. We remember them, but only because of the less showy creative decisions that have already been made. And at a time when movies seem more obsessed than ever with “trailer moments” that can be spliced into a highlight reel, it’s important to honor the kind of unobtrusive craft required to make a movie with no bad scenes. (A plot that consists of nothing but high points can be exhausting, and a good story both delivers on the obvious payoffs and maintains our interest in the scenes when nothing much seems to be happening.)
Not surprisingly, writers have spent a lot of time thinking about these issues, and it’s noteworthy that one of the most instructive examples comes from Leo Tolstoy. War and Peace is nothing less than an extended criticism of the great man theory of history: Tolstoy brings Napoleon onto the scene expressly to emphasize how insignificant he actually is, and the novel concludes with a lengthy epilogue in which the author lays out his objections to how history is normally understood. History, he argues, is a pattern that emerges from countless unobservable human actions, like the sum of infinitesimals in calculus, and because we can’t see the components in isolation, we have to content ourselves with figuring out the laws of their behavior in the aggregate. But of course, this also describes Tolstoy’s strategy as a writer: we remember the big set pieces in War and Peace and Anna Karenina, but they emerge from the diligent, seemingly impersonal collation of thousands of tiny details, recorded with what seems like a minimum of authorial interference. (As Victor Shklovsky writes: “[Tolstoy] describes the object as if he were seeing it for the first time, an event as if it were happening for the first time.”) And the awesome moments in his novels gain their power from the fact that they arise, as if by historical inevitability, from the details that came before them. Anna Karenina was still alive at the end of the first draft, and it took her author a long time to reconcile himself to the tragic climax toward which his story was driving him. Tolstoy had good reason to believe that great scenes, like great men, are the product of invisible forces. But it took a great writer to see this.
Note: Spoilers follow for the series finale of Glee.
“The best way to criticize a movie,” Jean-Luc Godard once said, “is to make another movie.” Intentional or not, we find apparent examples of this everywhere: the works of art we experience are constantly commenting on one another, often because similar ideas are in the air at the same time. And two parallel approaches viewed side by side can be more enlightening than either one on its own. Take, for instance, the series finales of Parks and Recreation and Glee, which aired less than a month apart. Both are built around an identical formal conceit—a series of self-contained flashforwards that tell us what happened to all the characters after the bulk of the story was over—and both are essentially exercises in wish fulfillment, in which everyone gets more or less exactly what they want. Yet the Parks and Rec finale was one of the best of its kind ever made, while the conclusion of Glee was yet another misfire, even as it offered a few small pleasures along the way. And the comparison is telling. On Parks and Rec, the characters get what they need, but it isn’t what they thought they wanted: Ron ends up working happily in a government job, while April settles down into marriage and family, even if her firstborn son’s name happens to be Burt Snakehole Ludgate Karate Dracula Macklin Demon Jack-o-Lantern Dwyer. It’s sweet, but it’s also the endpoint of a journey that lasted for six seasons.
On Glee, by contrast, Rachel wins a Tony for Best Lead Actress in a Musical—or exactly what she told us she wanted within five minutes of appearing onscreen in the pilot. Yet we shouldn’t be surprised. Glee always approached characterization as a variable that could be altered at will, or by Will, from one moment to the next, cheerfully dumping entire story arcs for the sake of a cheap gag or a musical number. When you can’t be bothered to sustain anyone’s emotional growth for more than an episode at a time, it’s no wonder that each student or teacher’s ultimate fulfillment takes a form that could have been predicted from a few lines of character description written before the pilot was even shot. Those capsule summaries are all we ever learned about these people, so when it came to write endings for them all, the show had no choice but to fall back on what it had originally jotted down. For a show that always seemed endlessly busy, it’s startling how little happened in the meantime, or how much it sacrificed its long game for the sake of a minute of momentum. It was ostensibly about the collision of dreams with reality—or about how hard it can be to escape the small town in which you were born—but in its final, crucial scenes, it seemed to say that happiness lies in getting everything you wanted in high school, and within five years, no less.
There’s one large exception, of course, and it’s a reminder that however haphazard Glee could be, it was also forced to deal with factors outside its control. Cory Monteith’s death was a tragedy on many levels, and it crippled whatever hope the show might have had for honoring its own premise. From the start, it was clear that Finn was the one character who might be forced to confront the reality behind his own dreams, looking for a form of meaning and contentment that didn’t resemble what he wanted when he was a teenager. His absence meant that the show had to recalibrate its endgame on the fly, and there’s a sense in which its decision to give everyone else outsized forms of happiness feels like a reaction to the real loss that the cast and crew endured. (It reminds me a little of The West Wing: originally, the Democratic candidate was supposed to lose the election in the final season, but after John Spencer’s sudden passing, the storyline was altered, since a political defeat on top of Leo’s death felt like just too much to bear.) I can understand the impulse, but I wish that it had been handled in a way that lived up to what Finn represented. His most memorable number expressed a sentiment that Glee seemed to have forgotten at the end: you can’t always get what you want, but sometimes you get what you need.
And by trying to be all things, Glee ended up as less than it could have been. Last week, while writing about three recent sitcoms, I pointed out that for all their surface similarity, they’re very different on the inside. What set Glee apart is that it wanted to have it all: the flyover sentimentality of Parks and Rec, the genre-bending of Community, the rapid succession of throwaway jokes we see in the likes of Unbreakable Kimmy Schmidt. That’s a lot for one show to handle, and Glee never lacked for ambition; unfortunately, it just wasn’t very competent or consistent, although its good intentions carried it surprisingly far. After the finale, my wife pointed out that the show’s most lasting legacy might be in the inner lives of teenagers coming to terms with their own sexuality, which can’t be denied. But it could have done all this and been a good show. I’m grateful to it for a handful of unforgettable moments, but that’s true of any television series, which time and memory tend to reduce to little more than a single look on an actor’s face. As Howard Hawks, one of Godard’s idols, said: “A good movie is three great scenes and no bad scenes.” For television, you can multiply that number by five. Glee had all the great scenes we could ever need, but it racked up countless bad scenes and diminished itself as it tried to be everything to everyone. And it got the finale that it wanted, even if Finn deserved more.
If there’s one note that nearly every writer gets from an editor or reader at one point or another, it’s this: “Raise the stakes.” What makes this note so handy from a reader’s point of view—and beyond infuriating for the writer who receives it—is that it’s never wrong, and it doesn’t require much in the way of close reading or analysis of the story itself. The stakes in a story could always be a little higher, and it’s hard for an author to make a case that he’s calibrated the stakes just right, or that the story wouldn’t benefit from some additional risk or tension. It’s such a common note, in fact, that it’s turned into a running joke among screenwriters. In the commentary track for the Simpsons episode “Natural Born Kissers,” for instance, the legendary comedy writer George Meyer watches a scene in which Homer and Marge need to drive to the store to buy a new motor for their broken refrigerator, and he drily notes: “This is what’s known as ‘raising the stakes.’”
And the fact that development executives can give this note so unthinkingly explains a lot about the movies. Recently, the New York Times reporter Brooks Barnes circulated a fake proposal for an action movie called Red, White and Blood to a number of Hollywood insiders to see what they had to say. The response from producer Lynda Obst is particularly interesting:
The stakes need to be much, much higher. A gun battle? How cute. We need hotter weapons. Huge, big battle weapons—maybe an end-of-the-world device.
Hence the fact that every superhero movie seems to end with a crisis that threatens to wipe out all of humanity, or at least most of Gotham City. In itself, this isn’t necessarily a bad thing: the lack of a credible threat is part of what makes Superman Returns, for all its good intentions, a bit of a snooze. But after a while, the stakes become so high that they’re almost abstract. The final battle in The Avengers is theoretically supposed to determine the fate of the world, but it still comes down to our heroes fighting a bunch of aliens on flying scooters outside Grand Central Station.
Really, though, the problem isn’t raising the stakes, but finding ways to express them in immediate human terms. Take the ending of Man of Steel. After an epic fistfight that destroys entire skyscrapers and probably costs thousands of lives, the struggle between Zod and Superman comes down to the fate of a handful of innocent bystanders—also staged, interestingly enough, in Grand Central Station. In principle, a few more casualties shouldn’t matter much either way, but they do: it’s an undeniably powerful moment in a movie in which the emotional side is often puzzlingly opaque. And it isn’t hard to see why. Instead of the legions of digitized fatalities in a Michael Bay movie, we’re given a good look at a handful of real people. We’re close enough to see the fear on their faces, and we care. (One suspects that Snyder and Nolan took a cue from Richard Donner’s original Superman movie, in which the destruction of most of California seems insignificant compared to what happens to Lois Lane.)
And maybe it’s time filmmakers—and other storytellers—gave the world a break. In his great Biographical Dictionary of Film, David Thomson notes of Howard Hawks:
Like Monet forever painting lilies or Bonnard always re-creating his wife in her bath, Hawks made only one artwork. It is the principle of that movie that men are more expressive rolling a cigarette than saving the world.
Aside from the fact that Disney isn’t likely to show any of its Marvel characters smoking, this is still good advice to follow. You can raise the stakes as high as you want, but as disaster movies like 2012 have shown, you can destroy the entire planet and we still won’t care if you don’t give us characters to care about. Like most notes from readers, “raising the stakes” is less a way of solving a problem than an indication that deeper issues may lie elsewhere. And the real solution isn’t to blow up the world, or introduce hotter weapons, but to slow things down, show us a recognizable human being with needs we can understand, and maybe even let him roll a cigarette or two.