Posts Tagged ‘2001: A Space Odyssey’
The soul of a new machine
Over the weekend, I took part in a panel at Windycon titled “Evil Computers: Why Didn’t We Just Pull the Plug?” Naturally, my mind turned to the most famous evil computer in all of fiction, so I’ve been thinking a lot about HAL, which made me all the more sorry to learn yesterday of the death of voice actor Douglas Rain. (Stan Lee also passed away, of course, which is a subject for a later post.) I knew that Rain had been hired to record the part after Stanley Kubrick was dissatisfied with an earlier attempt by Martin Balsam, but I wasn’t aware that the director had a particular model in mind for the elusive quality that he was trying to evoke, as Kate McQuiston reveals in the book We’ll Meet Again:
Would-be HALs included Alistair Cooke and Martin Balsam, who read for the part but was deemed too emotional. Kubrick set assistant Benn Reyes to the task of finding the right actor, and expressly not a narrator, to supply the voice. He wrote, “I would describe the quality as being sincere, intelligent, disarming, the intelligent friend next door, the Winston Hibler/Walt Disney approach. The voice is neither patronizing, nor is it intimidating, nor is it pompous, overly dramatic, or actorish. Despite this, it is interesting. Enough said, see what you can do.” Even Kubrick’s U.S. lawyer, Louis Blau, was among those making suggestions, which included Richard Basehart, José Ferrer, Van Heflin, Walter Pidgeon, and Jason Robards. In Douglas Rain, who had experience both as an actor and a narrator, Kubrick found just what he was looking for: “I have found a narrator…I think he’s perfect, he’s got just the right amount of the Winston Hibler, the intelligent friend next door quality, with a great deal of sincerity, and yet, I think, an arresting quality.”
Who was Winston Hibler? He was the producer and narrator for Disney who provided voiceovers for such nature documentaries as Seal Island, In Beaver Valley, and White Wilderness, and the fact that Kubrick used him as a touchstone is enormously revealing. On one level, the initial characterization of HAL as a reassuring, friendly voice of information has obvious dramatic value, particularly as the situation deteriorates. (It’s the same tactic that led Richard Kiley to figure in both the novel and movie versions of Jurassic Park. And I have to wonder whether Kubrick ever weighed the possibility of hiring Hibler himself, since in other ways, he clearly spared no expense.) But something more sinister is also at play. As I’ve mentioned before, Disney and its aesthetic feel weirdly central to the problem of modernity, with its collision between the sentimental and the calculated, and the way in which its manufactured feeling can lead to real memories and emotion. Kubrick, a famously meticulous director who looked everywhere for insights into craft, seems to have understood this. And I can’t resist pointing out that Hibler did the voiceover for White Wilderness, which won the Academy Award for Best Documentary Feature, but also included a scene in which the filmmakers deliberately herded lemmings off a cliff into the water in a staged mass suicide. As Hibler smoothly narrates in the original version: “A kind of compulsion seizes each tiny rodent and, carried along by an unreasoning hysteria, each falls into step for a march that will take them to a strange destiny. That destiny is to jump into the ocean. They’ve become victims of an obsession—a one-track thought: ‘Move on! Move on!’ This is the last chance to turn back, yet over they go, casting themselves out bodily into space.”
And I think that Kubrick’s fixation on Hibler’s voice, along with the version later embodied by Rain, gets at something important about our feelings toward computers and their role in our lives. In 2001, the astronauts are placed in an artificial environment in which their survival depends on the outwardly benevolent HAL, and one of the central themes of science fiction is what happens when this situation expands to encompass an entire civilization. It’s there at the very beginning of the genre’s modern era, in John W. Campbell’s “Twilight,” which depicts a world seven million years in the future in which “perfect machines” provide for our every need, robbing the human race of all initiative. (Campbell would explore this idea further in “The Machine,” and he even offered an early version of the singularity—in which robots learn to build better versions of themselves—in “The Last Evolution.”) Years later, Campbell and Asimov put that relationship at the heart of the Three Laws of Robotics, the first of which states: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” This sounds straightforward enough, but as writers realized almost right away, it hinges on the definition of certain terms, including “human being” and “harm,” that are slipperier than they might seem. Its ultimate expression was Jack Williamson’s story “With Folded Hands,” which carried the First Law to its terrifying conclusion. His superior robots believe that their Prime Directive is to prevent all forms of unhappiness, which prompts them to drug or lobotomize any human beings who seem less than content. As Williamson said much later in an interview with Larry McCaffery: “The notion I was consciously working on specifically came out of a fragment of a story I had worked on for a while about an astronaut in space who is accompanied by a robot obviously superior to him physically…Just looking at the fragment gave me the sense of how inferior humanity is in many ways to mechanical creations.”
Which brings us back to the singularity. Its central assumption was vividly expressed by the mathematician I.J. Good, who also served as a consultant on 2001:
Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.
That last clause is a killer, but even if we accept that such a machine would be “docile,” the scenario also embodies the fear, which Campbell was already exploring in the early thirties, of a benevolent dictatorship of machines. And the very Campbellian notion of “the last invention” should be frightening in itself. The prospect of immortality may be enticing, but not if it emerges through a technological singularity that leaves us unprepared to deal with the social consequences, rather than through the incremental scientific and medical progress—and the public debate that it ought to inspire—that human beings have earned for themselves. I can’t imagine anything more nightmarish than a world in which we can all live forever without having gone through the necessary ethical, political, and ecological stages to make such a situation sustainable. (When I contemplate living through the equivalent of the last two years over the course of millennia, the notion of eternal life becomes considerably less attractive.) Our fear of computers taking over our lives, whether on a spacecraft or in society as a whole, is really about the surrender of control, even in the benevolent form embodied by Disney. And when I think of the singularity now, I seem to hear it speaking with Winston Hibler’s voice: “Move on! Move on!”
The dawn of man
Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.
Almost from the moment that critics began to write about 2001, it became fashionable to observe that the best performance in the movie was by an actor playing a computer. In his review in Analog, for example, P. Schuyler Miller wrote:
The actors, except for the gentle voice of HAL, are thoroughly wooden and uninteresting, and I can’t help wondering whether this isn’t Kubrick’s subtle way of suggesting that the computer is really more “human” than they and fully justified in trying to get rid of them before they louse up an important mission. Someday we may know whether the theme of this part is a Clarke or a Kubrick contribution. I suspect it was the latter…perhaps just because Stanley Kubrick is said to like gadgets.
This criticism is often used to denigrate the other performances or to point to the film’s supposed lack of humanity, but I prefer to take it as a tribute to the work of actor Douglas Rain, Kubrick and Clarke’s script, and the brilliant design of HAL himself. The fact that a computer is the character we remember best isn’t a flaw in the movie, but a testament to its skill and imagination. And as I’ve noted elsewhere, the acting is excellent—it’s just so understated and naturalistic that it seems vaguely incongruous in such spectacular settings. (Compare it to the performances in Destination Moon, for instance, and you see how good Keir Dullea and William Sylvester really are here.)
But I also think that the best performance in 2001 isn’t by Douglas Rain at all, but by Vivian Kubrick, in her short appearance on the phone as Heywood Floyd’s daughter. It’s a curious scene that breaks many of the rules of good storytelling—it doesn’t lead anywhere, it’s evidently designed to do nothing but show off a piece of hardware, and it peters out even as we watch it. The funniest line in the movie may be Floyd’s important message:
Listen, sweetheart, I want you to tell mommy something for me. Will you remember? Well, tell mommy that I telephoned. Okay? And that I’ll try to telephone tomorrow. Now will you tell her that?
But that’s oddly true to life as well. And when I watch the scene today, with a five-year-old daughter of my own, it seems to me that there’s no more realistic little girl in all of movies. (Kubrick shot the scene himself, asking the questions from offscreen, and there’s a revealing moment when the camera rises to stay with Vivian as she stands. This is sometimes singled out as a goof, although there’s no reason why a sufficiently sophisticated video phone wouldn’t be able to track her automatically.) It’s a scene that few other films would have even thought to include, and now that video chat is something that we all take for granted, we can see through the screen to the touchingly sweet girl on the other side. On some level, Kubrick simply wanted his daughter to be in the movie, and you can’t blame him.
At the time, 2001 was criticized as a soulless hunk of technology, but now it seems deeply human, at least compared to many of its imitators. Yesterday in the New York Times, Bruce Handy shared a story from Keir Dullea, who explained why he breaks the glass in the hotel room at the end, just before he comes face to face with himself as an old man:
Originally, Stanley’s concept for the scene was that I’d just be eating and hear something and get up. But I said, “Stanley, let me find some slightly different way that’s kind of an action where I’m reaching—let me knock the glass off, and then in mid-gesture, when I’m bending over to pick it up, let me hear the breathing from that bent-over position.” That’s all. And he says, “Oh, fine. That sounds good.” I just wanted to find a different way to play the scene than blankly hearing something. I just thought it was more interesting.
I love this anecdote, not just because it’s an example of an evocative moment that arose from an actor’s pragmatic considerations, but because it feels like an emblem of the production of the movie as a whole. 2001 remains the most technically ambitious movie of all time, but it was also a project in which countless issues were being figured out on the fly. Every solution was a response to a specific problem, and the production covered a dizzying range of challenges—from the makeup for the apes to the air hostess walking upside down—that might have come from different movies entirely.
2001, in short, was made by hand—and it’s revealing that many viewers assume that computers had to be involved, when they didn’t figure in the process at all. (All of the “digital” readouts on the spacecraft, for instance, were individually animated, shot on separate reels of film, and projected onto those tiny screens on set, which staggers me even to think about. And even after all these years, I still can’t get my head around the techniques behind the Star Gate sequence.) It reminds me, in fact, of another movie that happens to be celebrating an anniversary this year. As a recent video essay pointed out, if the visual effects in Jurassic Park have held up so well, it’s because most of them aren’t digital at all. The majority consist of a combination of practical effects, stop motion, animatronics, raptor costumes, and a healthy amount of misdirection, with computers used only when absolutely necessary. Each solution is targeted at the specific problems presented by a snippet of film that might last just for a few seconds, and it moves so freely from one trick to another that we rarely have a chance to see through it. It’s here, not in A.I., that Spielberg got closest to Kubrick, and it hints at something important about the movies that push the technical aspects of the medium. They’re often criticized for an absence of humanity, but in retrospect, they seem achingly human, if only because of the total engagement and attention that was required for every frame. Most of their successors lack the same imaginative intensity, which is a greater culprit than the digital tools themselves. Today, computers are used to create effects that are perfect, but immediately forgettable. And one of the wonderful ironies of 2001 is that it used nothing but practical effects to create a computer that no viewer can ever forget.
The cosmic order
Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.
On April 2, 1968, the world premiere of 2001: A Space Odyssey was held at the Uptown Theater, a movie palace in the Cleveland Park neighborhood of Washington, D.C. Two days later, Martin Luther King, Jr. was assassinated in Memphis, sparking riots throughout the country, including the nation’s capital. At first, this might seem like another reminder of how we unconsciously compartmentalize the past, filing events into separate categories, like the moon landing and the Manson killings, that actually unfolded in a confused present tense. Three years ago, the artist Edgar Arceneaux released an experimental film, A Time to Break Silence, that tried to go deeper, as he explained in an interview:
Stanley Kubrick, Arthur C. Clarke and Dr. King were formulating their ideas about the duality of technology, which can be used as both a weapon and tool, during the same time period. As the psychic trauma of Dr. King’s death had the nation in a raw state of anger and uncertainty, a film chronicling the genealogy of humanity’s troubled future with technology is released in theaters.
More often, however, we tend to picture the political upheavals of the sixties as moving along a separate track from the decade’s scientific and technological achievements. In his book on the making of 2001, the journalist Piers Bizony writes of its visual effects team: “The optimism of Kubrick’s technologists seemed unquenchable. Perhaps, like their counterparts at Cape Kennedy, they were just too busy in their intense and closed-off little world to notice Vietnam, Martin Luther King, LSD, the counterculture?”
But that isn’t really true. John W. Campbell liked to remind his authors: “The future doesn’t happen one at a time.” And neither does the past or the present. We find, for instance, that King himself—who was a man who thought about everything—spoke and wrote repeatedly about the space program. At first, like many others, he saw it through the lens of national security, saying in a speech on February 2, 1959: “In a day when Sputniks and Explorers dash through outer space and guided ballistic missiles are carving highways of death through the stratosphere, nobody can win a war.” Yet it remained on his mind, and images of space began to appear more often in his public statements over the following year. A few months later, in a sermon titled “Unfulfilled Hopes,” he said:
We look out at the stars; we find ourselves saying that these stars shine from their cold and serene and passionless height, totally indifferent to the joys and sorrows of men. We begin to ask, is man a plaything of a callous nature, sometimes friendly and sometimes inimical? Is man thrown out as a sort of orphan in the terrifying immensities of space, with nobody to guide him on and nobody concerned about him? These are the questions we ask, and we ask them because there is an element of tragedy in life.
And King proclaimed in a commencement speech at Morehouse College in June: “Man through his scientific genius has been able to dwarf distance and place time in chains. He has been able to carve highways through the stratosphere, and is now making preparations for a trip to the moon. These revolutionary changes have brought us into a space age. The world is now geographically one.”
King’s attitude toward space was defined by a familiar tension. On one hand, space travel is a testament to our accomplishments as a species; on the other, it diminishes our achievements by forcing us to confront the smallness of our place in the universe. On December 11, 1960, King emphasized this point in a sermon at the Unitarian Church of Germantown, Pennsylvania:
All of our new developments can banish God neither from the microcosmic compass of the atom nor from the vast unfathomable ranges of interstellar space, living in a universe in which we are forced to measure stellar distance by light years, confronted with the illimitable expanse of the universe in which stars are five hundred million billion miles from the Earth, in which heavenly bodies travel at incredible speed and in which the ages of planets are reckoned in terms of billions of years. Modern man is forced to cry out with the psalmist of old: “When I behold the heavens, the work of thy hands, the moon, the stars, and all that thou hast created, what is man that thou art mindful of him and the son of man that thou remembereth him?”
In 1963, King made the comparison more explicit in his book The Strength to Love: “Let us notice, first, that God is able to sustain the vast scope of the physical universe. Here again, we are tempted to feel that man is the true master of the physical universe. Manmade jet planes compress into minutes distances that formerly required weeks of tortuous effort. Manmade spaceships carry cosmonauts through outer space at fantastic speeds. Is God not being replaced in the mastery of the cosmic order?” But after reminding us of the scale of the distances involved, King concludes: “We are forced to look beyond man and affirm anew that God is able.”
This seems very much in the spirit of 2001, which is both a hymn to technology and a meditation on human insignificance. For King, however, the contrast between the triumphs of engineering and the vulnerability of the individual wasn’t just an abstract notion, but a reflection of urgent practical decisions that had to be made here and now. Toward the end of his life, he framed it as a choice of priorities, as he did in a speech in 1967: “John Kenneth Galbraith said that a guaranteed national income could be done for about twenty billion dollars a year. And I say to you today, that if our nation can spend thirty-five billion dollars to fight an unjust, evil war in Vietnam, and twenty billion dollars to put a man on the moon, it can spend billions of dollars to put God’s children on their own two feet right here on Earth.” The following year, speaking to the Rabbinical Assembly in the Catskills, he was even more emphatic: “It must be made clear now that there are some programs that we can cut back on—the space program and certainly the war in Vietnam—and get on with this program of a war on poverty.” And on March 18, 1968, King said to the striking sanitation workers in Memphis, whom he would visit again on the day before he died:
I will hear America through her historians, years and generations to come, saying, “We built gigantic buildings to kiss the skies. We built gargantuan bridges to span the seas. Through our spaceships we were able to carve highways through the stratosphere. Through our airplanes we are able to dwarf distance and place time in chains. Through our submarines we were able to penetrate oceanic depths.” It seems that I can hear the God of the universe saying, “Even though you have done all of that, I was hungry and you fed me not, I was naked and you clothed me not. The children of my sons and daughters were in need of economic security and you didn’t provide it for them. And so you cannot enter the kingdom of greatness.”
How the solar system was won
Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.
When Stanley Kubrick hired Arthur C. Clarke to work on the project that became 2001: A Space Odyssey, they didn’t have a title, a plot, or even much in the way of a premise. In Kubrick’s introductory letter to the author, he had written only that his interest lay in “these broad areas, naturally assuming great plot and character”:
1. The reasons for believing in the existence of intelligent extraterrestrial life.
2. The impact (and perhaps even lack of impact in some quarters) such discovery would have on earth in the near future.
3. A space probe with a landing and exploration of the moon and Mars.
If you’ve seen the movie, you know that almost none of what Kubrick describes here ended up in the finished film. The existence of extraterrestrial life is the anthropic assumption on which the entire story rests; there’s no real attempt to sketch in the larger social context; and the discovery of the alien artifact—far from having any impact on society—remains a secret until the end to all but a few scientists. There’s already a thriving colony on the moon when the main action of the story really starts, and Heywood Floyd only turns up after the monolith has been found. All that remains of Kubrick’s original conception, in fact, is a vague feeling that he tried to convey early in their partnership, which Clarke remembered later as the desire to make “a movie about man’s relation to the universe—something which had never been attempted, still less achieved, in the history of motion pictures.”
In this respect, they undoubtedly succeeded, and a lot of it had to do with Kubrick’s choice of collaborator. Yesterday, I suggested that Kubrick settled on Clarke because he was more likely than the other obvious candidates to be available for the extended writing process that the director had in mind. (This was quite an assumption, since it meant that Clarke had to be away from his home in Ceylon for more than a year, but it turned out to be right.) Yet Clarke was also uniquely qualified to write about “man’s relation to the universe,” and in particular about aliens who were far in advance of the human race. As Isaac Asimov has memorably explained, this was a plot point that was rarely seen in Astounding, mostly because of John W. Campbell’s personal prejudices:
[Campbell] was a devout believer in the inequality of man and felt that the inequality could be detected by outer signs such as skin and hair coloring…In science fiction, this translated itself into the Campbellesque theory that earthmen (all of whom, in the ideal Campbell story, resembled people of northwestern European extraction) were superior to all other intelligent races.
Clarke had broken through in Astounding after the war—his stories “Loophole” and “Rescue Party” appeared in 1946—but geographical distance and foreign rights issues had kept him from being shaped by Campbell to any real extent. As a result, he was free to indulge in such works as Childhood’s End, the ultimate story about superior aliens, which was inspired by Campbell’s novel The Mightiest Machine but ran its first installment in the British magazine New Worlds.
Clarke, in short, was unquestionably part of the main sequence of hard science fiction that Campbell had inaugurated, but he was also open to exploring enormous, borderline mystical questions that emphasized mankind’s insignificance. (At his best, in such stories as “The Star” and “The Nine Billion Names of God,” he managed to combine clever twist endings with a shattering sense of scale in a way that no other writer has ever matched.) It was this unlikely combination of wit, technical rigor, and awareness of the infinite that made him ideally suited to Kubrick, and they promptly embarked on one of the most interesting collaborations in the history of the genre. The only comparable example of such a symbiotic partnership is Campbell and the young Asimov, except that Clarke and Kubrick were both mature artists at the peak of their talents. Fortunately for us, Clarke kept a journal, and he provided excerpts in two fascinating essays, “Christmas, Shepperton” and “Monoliths and Manuscripts,” which were published in the collection The Lost Worlds of 2001. The entries offer a glimpse of a process that ranged freely in all directions, with both men pursuing trains of thought as far as they would go before abandoning them for something better. As Clarke writes:
It was [Kubrick’s] suggestion that, before embarking on the drudgery of the script, we let our imaginations soar freely by developing the story in the form of a complete novel…After various false starts and twelve-hour talkathons, by early May 1964 Stanley agreed that [Clarke’s short story] “The Sentinel” would provide good story material. But our first concept—and it is hard now for me to focus on such an idea, though it would have been perfectly viable—involved working up to the discovery of an extraterrestrial artifact as the climax, not the beginning, of the story. Before that, we would have a series of incidents or adventures devoted to the exploration of the moon and planets…[for which] our private title (never of course intended for public use) was How the Solar System Was Won.
And while 2001 arguably made its greatest impact on audiences with its meticulous art direction and special effects, Kubrick’s approach to writing was equally obsessive. He spent a full year developing the story with Clarke before approaching the studio for financing, and although they soon realized that the premise of “The Sentinel” would work better as an inciting incident, rather than as the ending, the notion of “incidents or adventures” persisted in the finished script. The film basically consists of four loosely connected episodes, the most memorable of which—the story of HAL 9000—could be eliminated without fundamentally affecting the others. But if it feels like an organic whole, this is largely thanks to the decision to develop far more material than could ever fit into a novel, much less a movie. (Clarke’s diary entries are filled with ideas that were dropped or transformed in the final version: “The people we meet on the other star system are humans who were collected from earth a hundred thousand years ago, and hence are virtually identical to us.” “What if our E.T.s are stranded on earth and need the ape-men to help them?” And then there’s the startling line, which Clarke, who was discreetly gay, records without comment: “Stanley has invented the wild idea of slightly fag robots who create a Victorian environment to put our heroes at their ease.”) It verged on a private version of development hell, without any studio notes or interference, and it’s hard to imagine any other director who could have done it. 2001 started a revolution in visual effects, but its writing process was just as remarkable, and we still haven’t caught up to it yet. Even Clarke, whose life it changed, found Kubrick’s perfectionism hard to take, and he concluded: “In the long run, everything came out all right—exactly as Stanley had predicted. But I can think of easier ways of earning a living.”
When Clarke Met Kubrick
Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.
“I’m reading everything by everybody,” Stanley Kubrick said one day over lunch in New York. It was early 1964, and he was eating at Trader Vic’s with Roger A. Caras, a wildlife photographer and studio publicist who was working at the time for Columbia Pictures. Dr. Strangelove had just been released, and after making small talk about their favorite brand of telescope, Caras asked the director what he had in mind for his next project. Kubrick replied that he was thinking about “something on extraterrestrials,” but he didn’t have a writer yet, and in the meantime, he was consuming as much science fiction as humanly possible. Unfortunately, we don’t know much about what he was reading, which is a frustrating omission in the career of a filmmaker whose archives have been the subject of so many exhaustive studies. In his biography of Kubrick, Vincent LoBrutto writes tantalizingly of this period: “Every day now boxes of science fiction and fact books were being delivered to his apartment. Kubrick was immersing himself in a subject he would soon know better than most experts. His capacity to grasp and disseminate information stunned many who worked with him.” LoBrutto notes that Kubrick took much the same approach a decade later on the project that became The Shining, holing up in his office with “stacks of horror books,” and the man with whom he would eventually collaborate on 2001 recalled of their first meeting: “[Kubrick] had already absorbed an immense amount of science fact and science fiction, and was in some danger of believing in flying saucers.” At their lunch that day at Trader Vic’s, however, Caras seemed to think that all of this work was unnecessary, and he told this to Kubrick in no uncertain terms: “Why waste your time? Why not just start with the best?”
Let’s pause the tape here for a moment to consider what other names Caras might plausibly have said. A year earlier, in his essay “The Sword of Achilles,” Isaac Asimov provided what we can take as a fairly representative summary of the state of the genre:
Robert A. Heinlein is usually considered the leading light among good science fiction writers. Others with a fine grasp of science and a fascinatingly imaginative view of its future possibilities are Arthur C. Clarke, Frederik Pohl, Damon Knight, James Blish, Clifford D. Simak, Poul Anderson, L. Sprague de Camp, Theodore Sturgeon, Walter Miller, A.J. Budrys…These are by no means all.
Even accounting for the writer and the time period, there are a few noticeable omissions—it’s surprising not to see Lester del Rey, for instance, and A.E. van Vogt, whose work might not have qualified as what Asimov saw as “good science fiction,” had been voted one of the top four writers in the field in a pair of polls a few years earlier. It’s also necessary to add Asimov himself, who at the time was arguably the science fiction writer best known to general readers. (In 1964, he would even be mentioned briefly in Saul Bellow’s novel Herzog, which was the perfect intersection of the highbrow and the mainstream.) Arthur C. Clarke’s high ranking wasn’t just a matter of personal affection, either—he and Asimov later became good friends, but when the article was published, they had only met a handful of times. Clarke, in other words, was clearly a major figure. But it seems fair to say that anyone claiming to name “the best” science fiction writer in the field might very well have gone with Asimov or Heinlein instead.
Caras, of course, recommended Clarke, whom he had first met five years earlier at a weekend in Boston with Jacques Cousteau. Kubrick was under the impression that Clarke was a recluse, “a nut who lives in a tree in India someplace,” and after being reassured that he wasn’t, the director became excited: “Jesus, get in touch with him, will you?” Caras sent Clarke a telegram to ask about his availability, and when the author said that he was “frightfully interested,” Kubrick wrote him a fateful letter:
It’s a very interesting coincidence that our mutual friend Caras mentioned you in a conversation we were having about a Questar telescope. I had been a great admirer of your books for quite a time and had always wanted to discuss with you the possibility of doing the proverbial “really good” science-fiction movie…Roger tells me you are planning to come to New York this summer. Do you have an inflexible schedule? If not, would you consider coming sooner with a view to a meeting, the purpose of which would be to determine whether an idea might exist or arise which could sufficiently interest both of us enough to want to collaborate on a screenplay?
This account of the conversation differs slightly from Caras’s recollection—Kubrick doesn’t say that they were actively discussing potential writers for a film project, and he may have been flattering Clarke slightly with the statement that he had “always wanted” to talk about a movie with him. But it worked. Clarke wrote back to confirm his interest, and the two men finally met in New York on April 22, where the author did his best to talk Kubrick out of his newfound interest in flying saucers.
But why Clarke? At the time, Kubrick was living on the Upper East Side, which placed him within walking distance of many science fiction authors who were considerably closer than Ceylon, and it’s tempting to wonder what might have happened if he had approached Heinlein or Asimov, both of whom would have been perfectly sensible choices. A decade earlier, Heinlein made a concerted effort to break into Hollywood with the screenplays for Destination Moon and Project Moon Base, and the year before, he had written an unproduced teleplay for a proposed television show called Century XXII. (Kubrick studied Destination Moon for its special effects, if not for its story, as we learn from the correspondence of none other than Roger Caras, who had gone to work for Kubrick’s production company.) Asimov, for his part, was more than willing to explore such projects—in years to come, he would meet to discuss movies with Woody Allen and Paul McCartney, and I’ve written elsewhere about his close encounter with Steven Spielberg. But if Kubrick went with Clarke instead, it wasn’t just because they had a friend in common. At that point, Clarke was a highly respected writer, but not yet a celebrity outside the genre, and the idea of a “Big Three” consisting of Asimov, Clarke, and Heinlein was still a decade away. His talent was undeniable, but he was also a more promising candidate for the kind of working relationship that the director had in mind, which Kubrick later estimated as “four hours a day, six days a week” for more than three years. I suspect that Kubrick recognized what might best be described as a structural inefficiency in the science fiction market. The time and talents of one of the most qualified writers imaginable happened to be undervalued and available at just the right moment. When the opportunity came, Kubrick seized it. And it turned out to be one hell of a bargain.
The ultimate trip
On Saturday, I was lucky enough to see 2001: A Space Odyssey on the big screen at the Music Box Theatre in Chicago. I’ve seen this movie well over a dozen times, but watching it on a pristine new print from the fourth row allowed me to pick up on tiny details that I’d never noticed before, such as the fact that David Bowman, stranded at the end in his celestial hotel room, ends up wearing a blue velvet robe startlingly like Isabella Rossellini’s. I was also struck by the excellence of the acting, which might sound like a joke, but it isn’t. Its human protagonists have often been dismissed—Roger Ebert, who thought it was one of the greatest films of all time, called it “a bloodless movie with faceless characters”—and none of the actors, aside from Douglas Rain as the voice of HAL, are likely to stick in the memory. (As Noël Coward reputedly said: “Keir Dullea, gone tomorrow.”) But on an objective level, these are nothing less than the most naturalistic performances of any studio movie of the sixties. There isn’t a trace of the affectation or overacting that you see in so much science fiction, and Dullea, Gary Lockwood, and particularly William Sylvester, in his nice dry turn as Heywood Floyd, are utterly believable. You could make a strong case that their work here has held up better than most of the more conventionally acclaimed performances from the same decade. This doesn’t make them any better or worse, but it gives you a sense of what Kubrick, who drew his characters as obsessively as his sets and special effects, was trying to achieve. He wanted realism in his acting, along with everything else, and this is how it looks, even if we aren’t used to seeing it in space.
The result is still the most convincing cinematic vision of space exploration that we have, as well as the most technically ambitious movie ever made, and its impact, like that of all great works of art, appears in surprising places. By coincidence, I went to see 2001 the day after Donald Trump signed an executive order to reinstate the National Space Council, at a very peculiar ceremony that was held with a minimum of fanfare. The event was attended by Buzz Aldrin, who has played scenes across from Homer Simpson and Optimus Prime, and I can’t be sure that this didn’t strike him as the strangest stage he had ever shared. Here are a few of Trump’s remarks, pulled straight from the official transcript:
Security is going to be a very big factor with respect to space and space exploration. At some point in the future, we’re going to look back and say, how did we do it without space? The Vice President will serve as the council’s chair….Some of the most successful people in the world want to be on this board…Our journey into space will not only make us stronger and more prosperous, but will unite us behind grand ambitions and bring us all closer together. Wouldn’t that be nice? Can you believe that space is going to do that? I thought politics would do that. Well, we’ll have to rely on space instead…We will inspire millions of children to carry on this proud tradition of American space leadership—and they’re excited—and to never stop wondering, hoping, and dreaming about what lies beyond the stars.
Taking a seat, Trump opened the executive order, exclaiming: “I know what this is. Space!” Aldrin then piped up with what was widely reported as a reference to Toy Story: “Infinity and beyond!” Trump seemed pleased: “This is infinity here. It could be infinity. We don’t really don’t know. But it could be. It has to be something—but it could be infinity, right?”
As HAL 9000 once said: “Yes, it’s puzzling.” Aldrin may have been quoting Toy Story, but he might well have been thinking of 2001, too, the last section of which is titled “Jupiter and Beyond the Infinite.” (As an aside, I should note that the line “To infinity and beyond” makes its first known appearance, as far as I can tell, in John W. Campbell’s 1934 serial The Mightiest Machine.) It’s an evocative but meaningless phrase, with the same problems that led Arthur C. Clarke to express doubts about Kubrick’s working title, Journey Beyond the Stars—which Trump, you’ll notice, also echoed. Its semantic content is nonexistent, which is only fitting for a ceremony that underlined the intellectual bankruptcy of this administration’s approach to space. I don’t think I’m overstating the matter when I say that Trump and Mike Pence have shown nothing but contempt for other forms of science. The science division of the Office of Science and Technology Policy lies empty. Pence has expressed bewilderment at the fact that climate change has emerged, “for some reason,” as an issue on the left. And Trump has proposed significant cuts to science and technology funding agencies. Yet his excitement for space seems unbounded and apparently genuine. He asked eagerly of astronaut Peggy Whitson: “Tell me, Mars, what do you see a timing for actually sending humans to Mars? Is there a schedule and when would you see that happening?” And the reasons behind his enthusiasm are primarily aesthetic and emotional. One of his favorite words is “beautiful,” in such phrases as “big, beautiful wall” and “beautiful military equipment,” and it was much in evidence here: “It is America’s destiny to be at the forefront of humanity’s eternal quest for knowledge and to be the leader amongst nations on our adventure into the great unknown. And I could say the great and very beautiful unknown. Nothing more beautiful.”
But the truly scary thing is that if Trump believes that the promotion of space travel can be divorced from any concern for science itself, he’s absolutely right. As I’ve said here before, in the years when science fiction was basically a subcategory of adventure fiction, with ray guns instead of revolvers, space was less important in itself than as the equivalent of the unexplored frontier of the western: it stood for the unknown, and it was a perfect backdrop for exciting plots. Later, when the genre began to take itself more seriously as a predictive literature, outer space was grandfathered in as a setting, even if it had little to do with any plausible vision of the future. Space exploration seemed like an essential part of our destiny as a species because it happened to be part of the genre already. As a result, you can be excited by the prospect of going to Mars while actively despising or distrusting everything else about science—which may be the only reason that we managed to get to the moon at all. (These impulses may have less to do with science than with religion. The most haunting image from the Apollo 11 mission, all the more so because it wasn’t televised, may be that of Aldrin taking communion on the lunar surface.) Science fiction made it possible, and part of the credit, or blame, falls on Kubrick. Watching 2001, I had tears in my eyes, and I felt myself filled with all my old emotions of longing and awe. As Kubrick himself stated: “If 2001 has stirred your emotions, your subconscious, your mythological yearnings, then it has succeeded.” And it did, all too well, at the price of separating our feelings for space even further from science, and of providing a place for those subconscious urges to settle while leaving us consciously indifferent to problems closer to home. Kubrick might not have faked the moon landing, but he faked a Jupiter mission, and he did it beautifully. And maybe, at least for now, it should save us the expense of doing it for real.
My alternative canon #1: A Canterbury Tale
Note: I’ve often discussed my favorite movies on this blog, but I also love films that are relatively overlooked or unappreciated. Over the next two weeks, I’ll be looking at some of the neglected gems, problem pictures, and flawed masterpieces that have shaped my inner life, and which might have become part of the standard cinematic canon if the circumstances had been just a little bit different.
I’ve frequently said that The Red Shoes is my favorite movie of all time, but it isn’t even the most remarkable film directed by Michael Powell and Emeric Pressburger. The Red Shoes succeeds in large part by following through on its promises: it takes place in a fascinating world and tells a story of high melodrama, with an obvious determination to deliver as much color and atmosphere to the audience as possible, and its brilliance emerges from how consistently it lives up to its own impossible standards. A Canterbury Tale, which came out four years earlier, is in many respects more astonishing, because it doesn’t seem to have any conventional ambitions at all. It’s a deliberately modest film with a story so inconsequential that it verges on a commentary on the arbitrariness of all narrative: three young travelers, stranded at a small village near Canterbury during World War II, attempt to solve the mystery of “the glue man,” an unseen figure who throws glue at the hair of local women to discourage them from going out at night—and that, incredibly, is it. When the glue man’s identity is revealed, it’s handled so casually that the moment is easy to miss, and not even the protagonists themselves seem all that interested in the plot, which occupies about ten minutes of a film that runs over two hours in its original cut. And the fact that the movie itself was openly conceived as a light propaganda picture doesn’t seem to work in its favor.
Yet this is one of the most beautiful movies ever made, a languid series of funny, moving, and evocative set pieces that reminded me, when I first saw it, of Wong Kar-Wai magically set loose in wartime Britain. There are the usual flourishes of cinematic playfulness from Powell and Pressburger—including a cut from a medieval falcon to a modern warplane that anticipates Kubrick in 2001—but the tone is atypically relaxed and gentle, with even less plot than in its spiritual sequel I Know Where I’m Going! Despite the title, it doesn’t have much to do with Chaucer, except that the lead characters are all pilgrims who have been damaged in different ways and are healed by a journey to Canterbury. (Years later, I stayed at a tiny hotel within sight of the cathedral, where I verified that the movie was on sale at its gift shop.) It’s nostalgic and vaguely conservative, but it also looks ahead to the New Wave with its visual zest, greediness for location detail, and willingness to take happy digressions. The cast includes the lovely ingenue Sheila Sim, who later married Richard Attenborough, and Eric Portman as Colpeper, the local magistrate, who, in a typically perverse touch from the Archers, is both their virtuous embodiment of high Tory ideals and kind of a creepy weirdo. Sim died earlier this year, but when she looks up at the clouds in the tall grass with Portman, she lives forever in my heart—along with the film itself, which keeps one foot in the past while somehow managing to seem one step ahead of every movie that came after it.
The Coco Chanel rule
“Before you leave the house,” the fashion designer Coco Chanel is supposed to have said, “look in the mirror and remove one accessory.” As much as I like it, I’m sorry to say that this quote is most likely apocryphal: you see it attributed to Chanel everywhere, but without the benefit of an original source, which implies that it’s one of those pieces of collective wisdom that have attached themselves parasitically to a famous name. Still, it’s valuable advice. It’s usually interpreted, correctly enough, as a reminder that less is more, but I prefer to think of it as a statement about revision. The quote isn’t about reaching simplicity from the ground up, but about taking something and improving it by subtracting one element, like the writing rule that advises you to cut ten percent from every draft. And what I like the most about it is that its moment of truth arrives at the very last second, when you’re about to leave the house. That final glance in the mirror, when it’s almost too late to make additional changes, is often when the true strengths and weaknesses of your decisions become clear, if you’re smart enough to distinguish it from the jitters. (As Jeffrey Eugenides said to The Paris Review: “Usually I’m turning the book in at the last minute. I always say it’s like the Greek Olympics—’Hope the torch lights.'”)
But which accessory should you remove? In the indispensable book Behind the Seen, the editor Walter Murch gives us an important clue, using an analogy from filmmaking:
An interior might have four different sources of light in it: the light from the window, the light from the table lamp, the light from the flashlight that the character is holding, and some other remotely sourced lights. The danger is that, without hardly trying, you can create a luminous clutter out of all that. There’s a shadow over here, so you put another light on that shadow to make it disappear. Well, that new light casts a shadow in the other direction. Suddenly there are fifteen lights and you only want four.
As a cameraman what you paradoxically do is have the gaffer turn off the main light, because it is confusing your ability to really see what you’ve got. Once you do that, you selectively turn off some of the lights and see what’s left. And you discover that, “OK, those other three lights I really don’t need at all—kill ’em.” But it can also happen that you turn off the main light and suddenly, “Hey, this looks great! I don’t need that main light after all, just these secondary lights. What was I thinking?”
This principle, which Murch elsewhere calls “blinking the key,” implies that you should take away the most important piece, or the accessory that you thought you couldn’t live without.
This squares nicely with a number of principles that I’ve discussed here before. I once said that ambiguity is best created out of a network of specifics with one crucial piece removed, and when you follow the Chanel rule, on a deeper level, the missing accessory is still present, even after you’ve taken it off. The remaining accessories were presumably chosen with it in mind, and they preserve its outlines, resulting in a kind of charged negative space that binds the rest together. This applies to writing, too. “The Cask of Amontillado” practically amounts to a manual on how to wall up a man alive, but Poe omits the one crucial detail—the reason for Montresor’s murderous hatred—that most writers would have provided up front, and the result is all the more powerful. Shakespeare consistently leaves out key explanatory details from his source material, which renders the behavior of his characters more mysterious, but no less concrete. And the mumblecore filmmaker Andrew Bujalski made a similar point a few years ago to The New York Times Magazine: “Write out the scene the way you hear it in your head. Then read it and find the parts where the characters are saying exactly what you want/need them to say for the sake of narrative clarity (e.g., ‘I’ve secretly loved you all along, but I’ve been too afraid to tell you.’) Cut that part out. See what’s left. You’re probably close.”
This is a piece of advice that many artists could stand to take to heart, especially if they’ve been blessed with an abundance of invention. I like Interstellar, for instance, but I have a hunch that it would have been an even stronger film if Christopher Nolan had made a few cuts. If he had removed Anne Hathaway’s speech on the power of love, for instance, the same point would have come across in the action, but more subtly, assuming that the rest of the story justified its inclusion in the first place. (Of course, every film that Nolan has ever made strives valiantly to strike a balance between action and exposition, and in this case, it stumbled a little in the wrong direction. Interstellar is so openly indebted to 2001 that I wish it had taken a cue from that movie’s script, in which Kubrick and Clarke made the right strategic choice by minimizing the human element wherever possible.) What makes the Chanel rule so powerful is that when you glance in the mirror on your way out the door, what catches your eye first is likely to be the largest, flashiest, or most obvious component, which often adds the most by its subtraction. It’s the accessory that explains too much, or draws attention to itself, rather than complementing the whole, and by removing it, we’re consciously saying no to what the mind initially suggests. As Chanel is often quoted as saying: “Elegance is refusal.” And she was right—even if it was really Diana Vreeland who said it.
The dancer from the dance
Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What one piece of pop culture would you use to teach an artificial intelligence what it means to be human?”
When I was growing up, one of the books I browsed through endlessly was Murmurs of Earth by Carl Sagan, which told the story behind the Voyager golden records. Attached to the two Voyager spacecraft and engraved with instructions for playback, each record was packed with greetings in multiple languages, sounds, encoded images of life on earth, and, most famously, music. The musical selection opens with the first movement of Bach’s Brandenburg Concerto No. 2, which is about as solid a choice as it gets, and the remaining tracks are eclectic and inspired, ranging from a Pygmy girls’ initiation song to Blind Willie Johnson’s “Dark Was the Night, Cold Was the Ground.” (The inclusion of “Johnny B. Goode” led to a legendary joke on Saturday Night Live, purporting to predict the first message from an alien civilization: “Send more Chuck Berry.”) Not included, alas, was “Here Comes the Sun,” which the Beatles were happy to contribute, only to be vetoed by their record company. Evidently, EMI was concerned about the distribution of royalties from any commercial release of the disc—which says more about our society than we’d like any alien culture to know.
Of course, the odds of either record ever being found and played are infinitesimal, but it was still a valuable exercise. What, exactly, does it mean to be us, and how can we convey this to a nonhuman intelligence? Other solutions have been proposed, some simpler and more elegant than others. In The Lives of a Cell, Lewis Thomas writes:
Perhaps the safest thing to do at the outset, if technology permits, is to send music. This language may be the best we have for explaining what we are like to others in space, with least ambiguity. I would vote for Bach, all of Bach, streamed out into space, over and over again. We would be bragging of course, but it is surely excusable to put the best possible face on at the beginning of such an acquaintance. We can tell the harder truths later.
If such thought experiments so often center on music, it’s because we intuitively see it as our most timeless, universal production, even if that’s as much a cultural construct as anything else. All art, Walter Pater says, aspires to the condition of music, in which form and content can’t be separated, so it’s natural to regard it as the best we have to offer.
Yet music, for all its merits, only hints at a crucial aspect of human existence: its transience. It’s true that every work of music has a beginning and an end, but once written, it potentially exists forever—if not as a single performance, then as an act of crystallized thought—and it can be experienced in pretty much the form that Bach or Beethoven intended. In that sense, it’s an idealized, aspirational, and not particularly accurate representation of human life, in which so much of what matters is ephemeral and irreproducible. We may never have a chance to explain this to an alien civilization, but it’s likely that we’ll have to convey it sooner or later to another form of nonhuman consciousness that arises closer to home. Assuming we’re not convinced, like John Searle, of the philosophical impossibility of artificial intelligence, it’s only a matter of time before we have to take this problem seriously. And when we do, it’s our sense of mortality and impermanence that might pose the greatest obstacle to mutual comprehension. Unless its existence is directly threatened, as with HAL in 2001, an A.I., which is theoretically immortal, might have trouble understanding how we continue to find meaning in a life that is defined largely by the fact that it ends.
When I ask myself what form of art expresses this fact the most vividly, it has to be dance. And although I’d be tempted to start with The Red Shoes, my favorite movie of all time, there’s an even better candidate: the extraordinary documentary Ballets Russes, which celebrates its tenth anniversary this year and is available now for streaming on Hulu. (I didn’t even realize this until I looked up its release date shortly before typing this sentence, which is just another reminder of how quickly time slips away.) Just as the Voyager record was a kind of exercise to determine what art we find most worthy of preservation, the question of what to show a nonhuman intelligence is really a question of which works can teach us something about what it means to be human. Ballets Russes qualifies as few other movies do: I welled up with tears within the first minute, which juxtaposes archival footage of dancers in their prime with the same men and women sixty years later. In the space of a cut, we see the full mystery of human existence, and it’s all the more powerful when we reflect that these artists have devoted their lives to creating a string of moments that can’t be recaptured—as we all do, in our different ways. An artificial intelligence might wonder if there was any point. I don’t have an answer to that. But if one exists at all, it’s here.
A cut above the rest
The other day, my wife pointed me to a recent poll by the Motion Picture Editors Guild of the best-edited movies of all time. Most of the usual suspects are here, although not, curiously, The Usual Suspects: Raging Bull and Citizen Kane top the list, followed by the likes of Bonnie and Clyde, Psycho, and Raiders of the Lost Ark, as well as a few enticing surprises. (I’ve never seen All That Jazz, which sits at number four, although the fact that a subplot revolves around the protagonist’s attempts to edit a movie of his own makes me wonder if there’s a touch of sentiment involved.) What struck me the most about the ranking is its fundamental oddity: it seems natural that a list like this would exist for movies, but it’s hard to imagine a similar one for books or albums, which are as intensely edited as any motion picture. So, for that matter, are plays, songs, magazine articles, and podcasts. Nearly any work of art, in fact, has undergone an editing process, if we take this to mean only the systematic arrangement of its component parts. To take a slightly offbeat example: Meghan Trainor’s “All About that Bass” might seem like a trifle, but it’s ruthlessly organized, with a lot of ideas—some, admittedly, lifted from Chuck Berry—flowing seamlessly together. The editing, if we’re willing to grant that a pop song can be as consciously constructed as a film by Martin Scorsese, is brilliant. So why are we so used to talking about it in movies and nowhere else?
A few possible explanations come to mind, starting with the fact that the roles of movie editor and director usually, although not always, reside in two different people. Choices about editing can be hard to separate from earlier choices about structure, and the division of labor in movie production—with structural decisions shared among the screenwriter, editor, director, and others—makes film editing feel like a pursuit in itself, which is less obvious in a novel or album. (Literary editors and music producers play a crucial role in the arrangement of the pieces in their respective fields, but their contribution is harder to define.) It doesn’t hurt that movie editors are probably the only ones we’ve ever seen accepting an award on television, or that books on film editing considerably outnumber those of any other kind. Perhaps most relevant of all is the very nature of editing a movie, which differs from other types of editorial work in that the amount of raw material is fixed. When you’re writing a book, it’s possible to write new chapters to fill in the gaps in the story; a recording artist can always lay down a fresh version of a track; but a movie editor is stuck with the dailies that the director delivers. These days, this isn’t necessarily true: directors like Peter Jackson plan for reshoots even before principal photography begins, and modern software allows for considerable freedom in creating new shots in post. But the image still persists of the editor exercising his or her will on a resistant mass of footage, solving narrative problems under enormous constraints. Which is what makes it so fascinating.
So what do we mean when we say that a movie had great editing? There’s an old chestnut, which isn’t any less true for being so familiar, that if you’ve noticed the editing in a movie, the editor has done a poor job. That’s right as far as it goes, and it’s equally correct that the showier moments in a smartly edited movie have a way of obscuring more meaningful work. The multiple film stocks in JFK might grab the eye, but they’re much less impressive than the massive amount of information that the movie allows the viewer to absorb. Famous cuts, like the one from the match to the desert in Lawrence of Arabia or the time jump in 2001, are the ones we recall, but we’re less prone to take notice of how expertly those films keep us oriented in two of the most confusing environments imaginable—the desert and outer space. And we’re often barely aware of how much of a movie has been constructed in postproduction. When you compare the script of The Usual Suspects with the final result, it’s hard not to conclude that the movie’s secret hero, its true Keyser Soze, is editor John Ottman: the whole closing montage of sounds, images, and dialogue, which is the first thing many of us remember, isn’t even hinted at in the screenplay. But we aren’t meant to see any of this. We’re left with the stubborn, redundant axiom that if a movie is great, its editing was great as well. That’s why the Editors Guild poll is foremost a list of terrific movies, and one of the first such lists that I’d recommend to anyone who was interested in learning more about film.
That said, as I’ve suggested above, there are times when we can’t help but be grateful for the problems that a movie’s editor has solved. Managing the delivery of complicated information, as we often see in the movies of David Fincher, poses tremendous challenges, and Gone Girl and The Girl With the Dragon Tattoo play like thrillers in which most of the drama is unfolding in the editing room. Casino, which I recently watched again just for my own pleasure, does this kind of thing so beautifully that it makes The Wolf of Wall Street seem a little lame by comparison. When it comes to keeping the audience grounded during complex action, we’re likely to think first of the films of Paul Greengrass, who has ruined much of modern action filmmaking by chopping up the footage so fluently that he encourages less talented filmmakers to do the same—hence the vast divide between The Bourne Supremacy and Quantum of Solace. (Although if I had to name one movie that still fills me with awe at how expertly it choreographs and assembles action on a large scale, it would have to be Titanic.) And editors have often been called upon to pull shape and logic out of seemingly unworkable footage. Annie Hall wasn’t even a love story before Ralph Rosenblum, by his own account, saw what its three hours of raw material were really about, and the result is a film that seems perfect, even if it was anything but preordained. Elsewhere, I’ve described creativity as the conversion of the arbitrary into the inevitable. And that, really, is what editors do.
Stellar mass
Note: This post does its best to avoid spoilers for Interstellar. I hope to have a more detailed consideration up next week.
Halfway through the first showing of Interstellar at the huge IMAX theater at Chicago’s Navy Pier, the screen abruptly went black. At a pivotal moment, the picture cut out first, followed immediately by the sound, and it took the audience a second to realize that the film had broken. Over the five minutes or so that followed, as we waited for the movie to resume, I had time to reflect on the sheer physicality of the technology involved. As this nifty featurette points out, a full print of Interstellar weighs six hundred pounds, mounted on a six-foot platter, and just getting it to move smoothly through the projector gate presents considerable logistical challenges, as we found out yesterday. (The film itself is so large that there isn’t room on the platter for any previews or extraneous features: it’s the first movie I’ve ever seen that simply started at the scheduled time, without any tedious preliminaries, and its closing credits are startlingly short.) According to Glenn Newland, the senior director of operations at IMAX, the company started making calls eighteen months ago to theater owners who were converting from film to digital, saying, in effect: Please hold on to that projector. You’re going to need it.
And they were right. I’ve noted before that if Christopher Nolan has indelibly associated himself with the IMAX format, that’s no accident. Nolan’s intuition about his large-scale medium seems to inform the narrative choices he makes: he senses, for instance, that plunging across a field of corn can be as visually thrilling as a journey through a wormhole or the skyline of Gotham City. Watching it, I got the impression that Nolan is drawn to IMAX as a kind of corrective to his own naturally hermetic style of storytelling: the big technical problems that the format imposes force him to live out in the world, not simply in his own head. And if the resulting image is nine times larger than that of conventional celluloid, that squares well with his approach to screenwriting, which packs each story with enough ideas for nine ordinary movies. Interstellar sometimes groans under the weight of its own ambitions; it lacks the clean lines provided by the heist plot of Inception or the superhero formula of his Batman films. It wants to be a popcorn movie, a visionary epic, a family story, and a scientifically rigorous adventure that takes a serious approach to relativity and time dilation, and it succeeds about two-thirds of the time.
Given the loftiness of its aims, that’s not too bad. Yet it might have worked even better if it had taken a cue from the director whose influence it struggles so hard to escape. Interstellar is haunted by 2001 in nearly every frame, from small, elegant touches, like the way a single cut is used to cover a vast stretch of time—in this case, the two-year journey from Earth to Saturn—to the largest of plot points. Like Kubrick’s film, it pauses in its evocation of vast cosmic vistas for a self-contained interlude of intimate, messy drama, which in both cases seems designed to remind us that humanity, or what it creates, can’t escape its most primitive impulses for self-preservation. Yet it also suffers a little in the comparison. Kubrick was shrewd enough to understand that a movie showing mankind in its true place in the universe had no room for ordinary human plots, and if his characters seem so drained of personality, it’s only a strategy for eliminating irrelevant distractions. Nolan wants to have it all, so he ends up with a film in which the emotional pieces sit uneasily alongside the spectacle, jostling for space when they should have had all the cosmos at their disposal.
Like most of Nolan’s recent blockbuster films, Interstellar engages in a complicated triangulation between purity of vision and commercial appeal, and the strain sometimes shows. It suffers, though much less glaringly, from the same tendency as Prometheus, in which characters stand around a spacecraft discussing information, like what the hell a wormhole is, that should have probably been covered long before takeoff. And while it may ultimately stand as Nolan’s most personal film—it was delivered to theaters under the fake title Flora’s Letter, which is named after his daughter—its monologues on the transcendent power of love make a less convincing statement than the visual wonders on display. (All praise and credit, by the way, are due to Matthew McConaughey, who carries an imperfectly conceived character with all the grace and authority he brought to True Detective, which also found him musing over the existence of dimensions beyond our own.) For all its flaws, though, it still stands as a rebuke to more cautious entertainments, a major work from a director who hardly seems capable of anything else. In an age of massless movies, it exerts a gravitational pull all its own, and if it were any larger, the theater wouldn’t be able to hold it.
The best closing shots in film
Note: Since I’m taking a deserved break for the holidays, I’m reposting a couple of my favorite entries from early in this blog’s run. This post was originally published, in a slightly different form, on January 13, 2011. Visual spoilers follow. Cover your eyes!
As I’ve noted before, the last line of a novel is almost always of interest, but the last line of a movie generally isn’t. It isn’t hard to understand why: movies are primarily a visual medium, and there’s a sense in which even the most brilliant dialogue can often seem beside the point. And as much as the writer in me wants to believe otherwise, audiences don’t go to the movies to listen to words: they go to look at pictures.
Perhaps inevitably, then, there are significantly more great closing shots in film than there are great curtain lines. Indeed, the last shot of nearly every great film is memorable, so the list of finalists can easily expand into the dozens. Here, though, in no particular order, are twelve of my favorites. Click for the titles:
Goodbye, cinephilia
For as long as I can remember, the movies have been a huge part of my life. Growing up, I raided my parents’ videocassette collection on a regular basis, and may have been the only eleven-year-old in my fifth grade class whose favorite movie was 2001: A Space Odyssey. In high school, I was lucky enough to live only a short train ride away from the wonderful UC Theater in Berkeley, and spent many a happy weekend there taking in a double feature. (The only time I ever cut class on purpose was to catch an afternoon screening of Last Tango in Paris.) I continued this tradition in college, with countless visits to the Brattle and the Harvard Film Archive, and in New York I had an embarrassment of riches at the Film Forum, Lincoln Center, Landmark Sunshine, the Ziegfeld, and even the Angelika, despite its awful seats, screens, and location above a rumbling subway line. Chicago, meanwhile, offered the Music Box, Landmark Century, and many others. As a result, for the past fifteen years, I’ve probably averaged a movie a week, and sometimes more.
And although I’ve tried to make my mark in a different sort of art form, I’ve learned a tremendous amount as a storyteller from the movies, to the point where I sometimes feel, to misquote Ishmael, that the local movie theater was my Yale College and my Harvard. Part of the reason is that the movies allow us to experience a wide range of styles and subjects more quickly than a lifetime of reading: I can watch most of the movies on the Sight & Sound poll in the time it takes me to read War and Peace. The movies have omnivorously stolen whatever useful tricks were available from the other arts, and have raided the literary corpus for stories, often transforming them in fascinating ways. Of course, there’s some danger in taking the lessons of cinema too literally: a novel isn’t a movie, and both forms are capable of effects that can’t be achieved in the other. As a novelist, I have far more control over the finished product than I would as a director or screenwriter. But there’s no doubt that the play of my imagination on the page has been deeply shaped by my love of such filmmakers as Kubrick and the Archers, to the point where I can only echo John Irving: “When I feel like being a director, I write a novel.”
Which is why one of the hardest adjustments I’ve had to make as the father of a newborn baby centers on the fact that I’ll no longer be able to go to the movies as often as I’d like. For someone who has long been used to seeing the latest releases on opening weekend, and plenty of art house and revival movies on a regular basis, this is a real shock. The last movie I saw on the big screen was The Hobbit, and I’m not sure when I’ll have the chance to catch another. This wouldn’t be as big of a deal if Beatrix had happened to arrive, say, in early February, when there isn’t much worth seeing in any case. But as luck would have it, she was born at the height of Oscar season, which means I haven’t been able to see a wide range of movies that I otherwise would have caught on opening day: I haven’t seen Django Unchained or Silver Linings Playbook or Zero Dark Thirty or Amour or Les Misérables or even This is 40. I owe Christopher McQuarrie a personal apology for failing to at least check out Jack Reacher. And in a year that was already shaping up to be one of the best for popular filmmaking in a long time, it’s a loss that I feel deeply.
Nevertheless, beginning tomorrow, I’ll be counting down my ten favorite movies of the year, as I’ve done every year since starting this blog, despite the fact that the list will contain a number of startling omissions. At first, I was tempted to skip this year’s ranking, or to hold off on the outside chance that I’d at least see a couple of the movies mentioned above before Oscar night. At the moment, this doesn’t seem likely, so I’m going ahead with what can only be seen as an incomplete pool of contenders. Yet even if you arbitrarily cut the movie year off in the middle of December, as I’ve effectively done, you’re still left with an extraordinary year for cinema, and especially for big popular movies—a better year, in some ways, than either of the two I’ve covered here in the past. As such, it was perhaps the best year imaginable for me to say goodbye to cinema, at least for now: I’ve missed a lot, but I feel blessed to have seen the movies I did. The ones I’ve been forced to omit will still be waiting for me when the time comes, even if I end up watching them months from now, at home, with a baby in my arms. And when I put it that way, it doesn’t sound so bad at all.
Prometheus and the perils of secrecy
I’m tired of secrets. Over the past few years, ever since the release of the teaser trailer for Cloverfield, an increasing number of movies have shifted from the entirely reasonable attempt to keep certain plot elements a surprise to making a fetish of secrecy for its own sake. I blame J.J. Abrams, a talented director and producer who often puts more thought into a movie’s marketing campaign than into the story itself—witness Super 8, which shrouded in great intrigue a plot that turned out to be utterly conventional. Ridley Scott’s Prometheus is perhaps the most disappointing victim of this tendency to date, a movie that comes cloaked in secrecy—is it a prequel to Alien, or isn’t it?—only to stand revealed as a total narrative nonevent. (It may not be a coincidence that one of the film’s writers is frequent Abrams collaborator Damon Lindelof, whose Lost displayed a similar inability to deliver on the revelations that the hype had led us to expect.)
Prometheus, to put it mildly, has some script problems. The trouble begins in one of the very first scenes, in which Noomi Rapace and Logan Marshall-Green, as a pair of startlingly incompetent archaeologists, discover an array of remarkable cave paintings at a site in Scotland, only to begin blithely tromping around with flashlights, no doubt destroying thousands of years of material in the process. The paintings, we’re told, are 35,000 years old—the age of the earliest human settlement in Scotland is usually dated closer to 15,000 years, but never mind—and depict a constellation that has appeared in works of art in every human culture, a configuration the archaeologists have confidently identified with a single star system many light years away (the arrangement of the stars in the sky having evidently remained unchanged across thirty millennia). Such plot holes are far from unusual in a big summer movie, of course, but none of these issues make us especially optimistic about the quality of the story we’re about to be told.
Our concerns are not without foundation. Rapace and Marshall-Green end up traveling on the most casually organized interstellar voyage of all time, a trillion-dollar project whose members not only haven’t been told the purpose of the mission, but haven’t even met yet, or been told anything about the chain of command, before awakening from hibernation on their arrival. Upon landing, they do, in fact, make the greatest archaeological discovery in human history, stumbling at once on the remains of a massive alien civilization, a result which is somehow seen as disappointing, because none of the aliens there are still alive. (This is after a single day of exploration at one random site, which is sort of like aliens landing at Chichen Itza at night and bemoaning the fact that the humans there have gone extinct.) But of course, there is life here, of a particularly unpleasant kind, and Prometheus soon turns into less a coherent horror movie than a series of disconnected ideas about scenes it might be cool to have in an undeclared Alien prequel.
In interviews, Scott and Lindelof have spoken about the supposed profundity of the film’s ideas, and their decision to leave certain elements unexplained, with a nod toward such works as 2001: A Space Odyssey. But 2001, for all its obscurities, gives us the pieces for a perfectly straightforward explanation, which the novel makes even more clear, while Prometheus consists of such ill-fitting parts that any coherent reading seems impossible. There are occasional pleasures to be found here: Michael Fassbender is particularly good as an android who draws his personal style from Peter O’Toole’s Lawrence of Arabia, and there’s one nifty scene involving Rapace, an automated medical pod, and a particularly traumatic surgical procedure. For the most part, however, the astronauts are such idiots that one finds oneself missing the cult of competence that James Cameron brought to Aliens. And that’s the heart of the problem. If we had characters that we cared about, the movie’s incoherencies wouldn’t matter. Because in the end, I don’t want answers. I want Ripley.
The way of Coppola, the way of Kubrick
Since yesterday’s posting on The Shining and Apocalypse Now, I’ve been thinking a lot about Stanley Kubrick and Francis Ford Coppola, who arguably had the two greatest careers in the past half century of American film. There have been other great directors, of course, but what sets Kubrick and Coppola apart is a matter of scale: each had a golden age—for Coppola, less than a decade, while for Kubrick, it lasted more than thirty years—when they were given massive budgets, studio resources, and creative control to make intensely, almost obsessively personal movies. The results are among the pillars of world cinema: aside from the two movies mentioned above, those years gave us the Godfather films, 2001: A Space Odyssey, A Clockwork Orange, and more.
And yet these two men are also very different, both in craft and temperament. I’ve been listening to Coppola’s commentary tracks for the better part of a week now, and it’s hard to imagine a warmer, more inviting, almost grandfatherly presence—but even the most superficial look at his career reveals a streak of all but suicidal darkness. As David Thomson puts it:
[Coppola] tries to be everything for everyone; yet that furious effort may mask some inner emptiness. For he is very gregarious and very withdrawn, the life and soul of some parties, and a depressive. He is Sonny and Michael Corleone, for sure, but there are traces of Fredo, too—and he is at his best when secretly telling a part of his own story, or working out his fearful fantasies.
Kubrick, in some respects, is the opposite: a superficially cold and clinical director, deeply pessimistic about the human condition, who nonetheless was able to work happily and with almost complete creative freedom for the better part of his career. His films are often dark, but there’s also an abiding sense of a director tickled by the chance to play with such wonderful toys—whether the spaceships of 2001 or the fantastically detailed dream set of New York in Eyes Wide Shut. Coppola, by contrast, never seems entirely content unless the film stock is watered with his own blood.
These differences are also reflected in their approaches to filmmaking. Coppola and Kubrick have made some of the most visually ravishing movies of all time, but the similarities end there. Kubrick was controlling and precise—one assumes that every moment has been worked out in advance in script and storyboard—while Coppola seemed willing to follow the inner life of the movie wherever it led, whether through actors, the input of valued collaborators like Walter Murch, or the insane workings of chance or fate. This allowed him to make astonishing discoveries on set or in the editing room, but it also led to ridiculous situations like the ending of Apocalypse Now, where he paid Marlon Brando three million dollars to spend three weeks in the Philippines, but didn’t know what would happen when he got there. (And as the last scenes of the movie imply, he never did entirely figure it out.)
So what do these men have to tell us? Kubrick’s career is arguably greater: while you can debate the merits of the individual movies, there’s no doubt that he continued to make major films over the course of four decades. Coppola, alas, had eight miraculous years where he changed film forever, and everything since has been one long, frustrating, sometimes enchanting footnote (even if, like me, you love his Dracula and One From the Heart). It’s possible that Coppola, who spent such a long time in bankruptcy after his delirious dreams had passed, wishes he’d been more like Kubrick the clinician. And yet Coppola is the one who seems to have the most lessons for the rest of us. He’s the model of all true artists and directors: technically astounding, deeply humane, driven to find something personal in the most unlikely subjects, visionary, loyal, sometimes crazy, and finally, it seems, content. We’re all Coppola’s children. Kubrick, for all his genius, is nothing but Kubrick.
The Shining, Apocalypse Now, and the uses of allegory
On Saturday, in what seemed like an appropriate way to celebrate the completion of Part I of my new novel, my wife and I caught a midnight showing of The Shining at the Music Box in Chicago. Watching The Shining again was a reminder of how central this extraordinary film is to my experience of the movies: while 2001 may be Kubrick’s most ambitious film, and Eyes Wide Shut his most narratively intricate (as well as underrated), The Shining strikes me as his most purely satisfying work, and as such, it has always occupied a peculiar place in my imagination. The Overlook Hotel, as conceived by Stephen King and brought to life by Kubrick, is one of the greatest locations in all of cinema, and it’s the perfect stage for a series of unparalleled set pieces that are frightening, beautiful, and often very funny.
After the movie was over, I showed off a bit to my wife by pointing out the symbols that Kubrick uses to imply that the story of the Overlook is, in fact, an allegory for the history of America: it was built on an Indian burial ground, occupied by the British (as symbolized by the incongruously English ghost of Grady, the hotel’s previous caretaker), and inherited by American pioneers (hence Jack’s lumberman’s jacket and axe). And this network of symbols informs many aspects of the film, both large, like the uncomfortable fate of Scatman Crothers’s black psychic, who makes the long trip back to the Overlook only to be slaughtered on arrival, and small, like the designs on Danny Torrance’s sweaters, with their handmade versions of Mickey Mouse and the Apollo 11 spacecraft. It all ends with a closeup of a single date: July 4, 1921. And I believe that Kubrick’s use of such images is very intentional.
But then my wife asked a question that brought me up short: “So what is it trying to say?” Which caught me at a bit of a loss. My first response was that trying to sum up The Shining into a single message was doing the movie a disservice. After all, if Kubrick had meant it to be an allegory, clearly the movie itself was the simplest possible expression of the message he had in mind. But the more I thought about it, the less certain I became that there even was a message, which raises the question of what the allegorical elements were doing there at all. The question seemed all the more urgent because I’d had a similar experience, earlier that week, while watching Apocalypse Now Redux on Blu-ray. Coppola’s flawed masterpiece openly evokes not only Heart of Darkness but also the Odyssey—the river patrol boat encounters the Cyclops, the Sirens, Hades, and (in the extended version) the Lotus-Eaters. Which is great for critics playing a game of spot-the-reference. But what does it really mean for the viewer?
My more considered response, which I’m still working through in my own head, is simply this: it doesn’t necessarily need to mean anything. The role of allegory, at least in terms of my own reactions, isn’t so much to convey a message as to set up a chain of associations in the viewer’s mind. The Shining and Apocalypse Now are echo chambers in which images and symbols can jangle against one another, evoking other myths and works of art, and setting off unexpected vibrations within the story. The best allegories should be all but invisible, at least at first viewing, and even afterward, they continue to resist verbalization, because any allegory sounds weak and reductive when boiled down to a sentence or two. If we say that The Shining is about the violence inherent in the American experience, we risk two responses: first, a sense that this message isn’t exactly original, and second, a stubborn insistence that the movie isn’t about this, but rather a series of images and moments that can take up their own life in the experience of the viewer.
Which brings us to perhaps the most useful aspect of allegory: it helps the author find his way. I’ve written before about how structural constraints allow a writer to make unexpected discoveries about his own story, and though I was referring mostly to genre and plot, it also applies to allegory—which is only another way of bringing the reader from point A to B. And it seems clear that Coppola and Kubrick, using their allegorical elements as a guide, made artistic discoveries that they wouldn’t have made otherwise. Coppola admits that he didn’t have an ending to Apocalypse Now until almost the day they shot it, when he saw that a mythic journey had to have an equally mythic ending—that is, the sacrifice of the divine king. And The Shining is full of design choices that owe their existence to an almost subterranean allegory, invisible at first, but imperceptibly enriching the viewer’s experience. Is there a deeper meaning? Sure. But not one that can easily be put into words—at least not when it’s all there in Nicholson’s eyes.
“When Maddy emerged from the train at Southampton…”
(Note: This post is the eighteenth installment in my author’s commentary for The Icon Thief, covering Chapter 17. You can read the earlier installments here.)
One of my favorite works on creativity of any kind is a short essay titled “Fantasy and Faculty X,” by the British author Colin Wilson, which I first encountered in the excellent collection How to Write Tales of Horror, Fantasy, and Science Fiction, edited by J.N. Williamson. Wilson believes that because the left and right hemispheres of the brain operate at different speeds, it’s necessary for both readers and writers to bring the two halves into sync, usually by slowing the left brain down, in order to fully immerse themselves in a fictional world. With respect to the writing process, this partially explains why writers often get their best ideas in the bus, bath, or bed, when a state of relaxation naturally allows both hemispheres to move at the same pace. And for readers, it sheds light on why a long, slow, descriptive section of a novel can plunge us into its world far better than nonstop action ever can—as long as we’re willing to follow the story wherever it’s trying to go.
This is why authors like Proust or Thomas Mann can immerse us in the details of a party or other social gathering, sometimes for a hundred pages, and leave us feeling as if we’d attended it ourselves. And it also applies to more mainstream works of art. For readers and audiences to really believe in the world they’re about to enter, it’s often useful to slow things down, which is why the languorous shots of spacecraft in movies like 2001 and the early Star Trek movies are so crucial in setting the tone for the story. (As much as I liked the J.J. Abrams reboot of Star Trek, I felt it was missing some of this fundamental sense of awe, which it might have achieved if it had eased up on the action for a moment or two.) And this is part of the reason why both Thomas Harris and Jonathan Demme spend so much time on those long walks down the hallway to Hannibal Lecter’s cell. It builds suspense, but it also puts us squarely into a particular state of mind before introducing us to the monster at the end of the corridor.
In a thriller, such a change of pace can be tricky to manage, which is why it’s often best to save it for times when the reader knows that something big is coming. This is why Chapter 17 of The Icon Thief, in which Maddy finally attends the party at the Hamptons that has been built up for much of Part I, is structured entirely as one long scene of arrival. If I were operating entirely by the principle of starting each scene as late as possible, I could have begun the chapter at the gate of the mansion, or even halfway through the party itself. In this case, however, it seemed better to take my time: I’ve spent several chapters leading up to this moment, establishing that this is where the various threads of the plot will finally converge, and if I’ve done my work properly, the reader will see this chapter as not just another transitional scene, but the overture to arguably the most important set piece in the entire novel. And having invested so much time and energy in preparing the reader for what follows, it doesn’t make sense to hurry past it.
This is why the chapter begins, not at the mansion itself, but with Maddy’s arrival at the train station in Southampton, and why I devote several pages to her preparations for the party, all of which I might have covered elsewhere in a paragraph or two. It helps that the details here are a lot of fun: the contrast between the sketchy share house, in which Maddy has arranged to sleep in a walk-in closet, and the opulence of the party itself, and between her own insecurity and the guests she encounters. In fact, this is one of the rare sections in the novel in which both my agent and editor actively encouraged me to add more detail, both visual and sociological, until the reader fully saw it in his or her mind’s eye. (In an earlier draft, Maddy overhears a guest say, enunciating carefully, “Fuck the endangered piping plover”—which my editor rightly flagged as being a little too on the nose.) As a result, when Maddy finally passes through the ranks of guests and comes face to face with the man she has come to find, the oligarch Anzor Archvadze, the moment has the impact it deserves. And I hope the reader also senses that there are some big things around the corner…