Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Stanley Kubrick’

The soul of a new machine


Over the weekend, I took part in a panel at Windycon titled “Evil Computers: Why Didn’t We Just Pull the Plug?” Naturally, my mind turned to the most famous evil computer in all of fiction, so I’ve been thinking a lot about HAL, which made me all the more sorry to learn yesterday of the death of voice actor Douglas Rain. (Stan Lee also passed away, of course, which is a subject for a later post.) I knew that Rain had been hired to record the part after Stanley Kubrick was dissatisfied by an earlier attempt by Martin Balsam, but I wasn’t aware that the director had a particular model in mind for the elusive quality that he was trying to evoke, as Kate McQuiston reveals in the book We’ll Meet Again:

Would-be HALs included Alistair Cooke and Martin Balsam, who read for the part but was deemed too emotional. Kubrick set assistant Benn Reyes to the task of finding the right actor, and expressly not a narrator, to supply the voice. He wrote, “I would describe the quality as being sincere, intelligent, disarming, the intelligent friend next door, the Winston Hibler/Walt Disney approach. The voice is neither patronizing, nor is it intimidating, nor is it pompous, overly dramatic, or actorish. Despite this, it is interesting. Enough said, see what you can do.” Even Kubrick’s U.S. lawyer, Louis Blau, was among those making suggestions, which included Richard Basehart, José Ferrer, Van Heflin, Walter Pidgeon, and Jason Robards. In Douglas Rain, who had experience both as an actor and a narrator, Kubrick found just what he was looking for: “I have found a narrator…I think he’s perfect, he’s got just the right amount of the Winston Hibler, the intelligent friend next door quality, with a great deal of sincerity, and yet, I think, an arresting quality.”

Who was Winston Hibler? He was the producer and narrator for Disney who provided voiceovers for such nature documentaries as Seal Island, In Beaver Valley, and White Wilderness, and the fact that Kubrick used him as a touchstone is enormously revealing. On one level, the initial characterization of HAL as a reassuring, friendly voice of information has obvious dramatic value, particularly as the situation deteriorates. (It’s the same tactic that led Richard Kiley to figure in both the novel and movie versions of Jurassic Park. And I have to wonder whether Kubrick ever weighed the possibility of hiring Hibler himself, since in other ways, he clearly spared no expense.) But something more sinister is also at play. As I’ve mentioned before, Disney and its aesthetic feel weirdly central to the problem of modernity, with its collision between the sentimental and the calculated, and the way in which its manufactured feeling can lead to real memories and emotion. Kubrick, a famously meticulous director who looked everywhere for insights into craft, seems to have understood this. And I can’t resist pointing out that Hibler did the voiceover for White Wilderness, which won the Academy Award for Best Documentary Feature, but also included a scene in which the filmmakers deliberately herded lemmings off a cliff into the water in a staged mass suicide. As Hibler smoothly narrates in the original version: “A kind of compulsion seizes each tiny rodent and, carried along by an unreasoning hysteria, each falls into step for a march that will take them to a strange destiny. That destiny is to jump into the ocean. They’ve become victims of an obsession—a one-track thought: ‘Move on! Move on!’ This is the last chance to turn back, yet over they go, casting themselves out bodily into space.”

And I think that Kubrick’s fixation on Hibler’s voice, along with the version later embodied by Rain, gets at something important about our feelings toward computers and their role in our lives. In 2001, the astronauts are placed in an artificial environment in which their survival depends on the outwardly benevolent HAL, and one of the central themes of science fiction is what happens when this situation expands to encompass an entire civilization. It’s there at the very beginning of the genre’s modern era, in John W. Campbell’s “Twilight,” which depicts a world seven million years in the future in which “perfect machines” provide for our every need, robbing the human race of all initiative. (Campbell would explore this idea further in “The Machine,” and he even offered an early version of the singularity—in which robots learn to build better versions of themselves—in “The Last Evolution.”) Years later, Campbell and Asimov put that relationship at the heart of the Three Laws of Robotics, the first of which states: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” This sounds straightforward enough, but as writers realized almost right away, it hinges on the definition of certain terms, including “human being” and “harm,” that are slipperier than they might seem. Its ultimate expression was Jack Williamson’s story “With Folded Hands,” which carried the First Law to its terrifying conclusion. His superior robots believe that their Prime Directive is to prevent all forms of unhappiness, which prompts them to drug or lobotomize any human beings who seem less than content. As Williamson said much later in an interview with Larry McCaffery: “The notion I was consciously working on specifically came out of a fragment of a story I had worked on for a while about an astronaut in space who is accompanied by a robot obviously superior to him physically…Just looking at the fragment gave me the sense of how inferior humanity is in many ways to mechanical creations.”

Which brings us back to the singularity. Its central assumption was vividly expressed by the mathematician I.J. Good, who also served as a consultant on 2001:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.

That last clause is a killer, but even if we accept that such a machine would be “docile,” it also embodies the fear, which Campbell was already exploring in the early thirties, of a benevolent dictatorship of machines. And the very Campbellian notion of “the last invention” should be frightening in itself. The prospect of immortality may be enticing, but not if it emerges through a technological singularity that leaves us unprepared to deal with the social consequences, rather than through incremental scientific and medical progress—and the public debate that it ought to inspire—that human beings have earned for themselves. I can’t imagine anything more nightmarish than a world in which we can all live forever without having gone through the necessary ethical, political, and ecological stages to make such a situation sustainable. (When I contemplate living through the equivalent of the last two years over the course of millennia, the notion of eternal life becomes considerably less attractive.) Our fear of computers taking over our lives, whether on a spacecraft or in society as a whole, is really about the surrender of control, even in the benevolent form embodied by Disney. And when I think of the singularity now, I seem to hear it speaking with Winston Hibler’s voice: “Move on! Move on!”

The pencil and paper level


The best education in film is to make one. I would advise any neophyte director to try to make a film by himself. A three-minute short will teach him a lot. I know that all the things I did at the beginning were, in microcosm, the things I’m doing now as a director and producer. There are a lot of noncreative aspects to filmmaking which have to be overcome, and you will experience them all when you make even the simplest film: business, organization, taxes, etc., etc. It is rare to be able to have an uncluttered artistic environment when you make a film, and being able to accept this is essential.

The point to stress is that anyone seriously interested in making a film should find as much money as he can as quickly as he can and go out and do it. And this is no longer as difficult as it once was. When I began making movies as an independent in the early 1950s I received a fair amount of publicity because I was something of a freak in an industry dominated by a handful of huge studios. Everyone was amazed that it could be done at all. But anyone can make a movie who has a little knowledge of cameras and tape recorders, a lot of ambition and—hopefully—talent. It’s gotten down to the pencil and paper level. We’re really on the threshold of a revolutionary new era in film.

Stanley Kubrick, in an interview with Joseph Gelmis in The Film Director as Superstar

Written by nevalalee

August 25, 2018 at 7:30 am

Thinkers of the unthinkable


At the symposium that I attended over the weekend, the figure whose name seemed to come up the most was Herman Kahn, the futurologist and military strategist best known for his book On Thermonuclear War. Kahn died in 1983, but he still looms large over futures studies, and there was a period in which he was equally inescapable in the mainstream. As Louis Menand writes in a harshly critical piece in The New Yorker: “Herman Kahn was the heavyweight of the Megadeath Intellectuals, the men who, in the early years of the Cold War, made it their business to think about the unthinkable, and to design the game plan for nuclear war—how to prevent it, or, if it could not be prevented, how to win it, or, if it could not be won, how to survive it…The message of [his] book seemed to be that thermonuclear war will be terrible but we’ll get over it.” And it isn’t surprising that Kahn engaged in a dialogue throughout his life with science fiction. In her book The Worlds of Herman Kahn, Sharon Ghamari-Tabrizi relates:

Early in life [Kahn] discovered science fiction, and he remained an avid reader throughout adulthood. While it nurtured in him a rich appreciation for plausible possibilities, [his collaborator Anthony] Wiener observed that Kahn was quite clear about the purposes to which he put his own scenarios. “Herman would say, ‘Don’t imagine that it’s an arbitrary choice as though you were writing science fiction, where every interesting idea is worth exploring.’ He would have insisted on that. The scenario must focus attention on a possibility that would be important if it occurred.” The heuristic or explanatory value of a scenario mattered more to him than its accuracy.

Yet Kahn’s thinking was inevitably informed by the genre. Ghamari-Tabrizi, who refers to nuclear strategy as an “intuitive science,” sees hints in On Thermonuclear War of “the scientist-sleuth pulp hero”—which is just another name for the competent man—and Kahn himself openly acknowledged the speculative thread in his work: “What you are doing today fundamentally is organizing a Utopian society. You are sitting down and deciding on paper how a society at war works.” On at least one occasion, he invoked psychohistory directly. In the revised edition of the book Thinking About the Unthinkable, Kahn writes of one potential trigger for a nuclear war:

Here we turn from historical fact to science fiction. Isaac Asimov’s Foundation novels describe a galaxy where there is a planet of technicians who have developed a long-term plan for the survival of civilization. The plan is devised on the basis of a scientific calculation of history. But the plan is upset and the technicians are conquered by an interplanetary adventurer named the Mule. He appears from nowhere, a biological mutant with formidable personal abilities—an exception to the normal laws of history. By definition, such mutants rarely appear but they are not impossible. In a sense, we have already seen a “mule” in this century—Hitler—and another such “mutant” could conceivably come to power in the Soviet Union.

And it’s both frightening and revealing, I think, that Kahn—even as he was thinking about the unthinkable—doesn’t take the next obvious step, and observe that such a mutant could also emerge in the United States.

Asimov wouldn’t have been favorably inclined toward the notion of a “winnable” nuclear war, but Kahn did become friendly with a writer whose attitudes were more closely aligned with his own. In the second volume of Robert A. Heinlein: In Dialogue with His Century, William H. Patterson describes the first encounter between the two men:

By September 20, 1962, [the Heinleins] were in Las Vegas…[They] met Dr. Edward Teller, who had been so supportive of the Patrick Henry campaign, as well as one of Teller’s colleagues, Herman Kahn. Heinlein’s ears pricked up when he was introduced to this jolly, bearded fat man who looked, he said, more like a young priest than one of the sharpest minds in current political thinking…Kahn was a science fiction reader and most emphatically a Heinlein fan.

Three years later, Heinlein attended a seminar, “The Next Ten Years: Scenarios and Possibilities,” that Kahn held at the Hudson Institute in New York. Heinlein—who looked like Quixote to Kahn’s Sancho Panza—was flattered by the reception:

If I attend an ordinary cocktail party, perhaps two or three out of a large crowd will know who I am. If I go to a political meeting or a church or such, I may not be spotted at all…But at Hudson Institute, over two-thirds of the staff and over half of the students button-holed me. This causes me to have a high opinion of the group—its taste, IQ, patriotism, sex appeal, charm, etc. Writers are incurably conceited and pathologically unsure of themselves; they respond to stroking the way a cat does.

And it wasn’t just the “stroking” that Heinlein liked, of course. He admired Thinking About the Unthinkable and On Thermonuclear War, both of which would be interesting to read alongside Farnham’s Freehold, which was published just a few years later. Both Heinlein and Kahn thought about the future through stories, in a pursuit that carried a slightly disreputable air, as Kahn implied in his use of the word “scenario”:

As near as I can tell, the term scenario was first used in this sense in a group I worked with at the RAND Corporation. We deliberately chose the word to deglamorize the concept. In writing the scenarios for various situations, we kept saying “Remember, it’s only a scenario,” the kind of thing that is produced by Hollywood writers, both hacks and geniuses.

You could say much the same about science fiction. And perhaps it’s appropriate that Kahn’s most lasting cultural contribution came out of Hollywood. Along with Wernher von Braun, he was one of the two most likely models for the title character in Dr. Strangelove. Stanley Kubrick immersed himself in Kahn’s work—the two men met a number of times—and Kahn’s reaction to the film was that of a writer, not a scientist. As Ghamari-Tabrizi writes:

The Doomsday Machine was Kahn’s idea. “Since Stanley lifted lines from On Thermonuclear War without change but out of context,” Kahn told reporters, he thought he was entitled to royalties from the film. He pestered him several times about it, but Kubrick held firm. “It doesn’t work that way!” he snapped, and that was that.

The dawn of man


Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

Almost from the moment that critics began to write about 2001, it became fashionable to observe that the best performance in the movie was by an actor playing a computer. In his review in Analog, for example, P. Schuyler Miller wrote:

The actors, except for the gentle voice of HAL, are thoroughly wooden and uninteresting, and I can’t help wondering whether this isn’t Kubrick’s subtle way of suggesting that the computer is really more “human” than they and fully justified in trying to get rid of them before they louse up an important mission. Someday we may know whether the theme of this part is a Clarke or a Kubrick contribution. I suspect it was the latter…perhaps just because Stanley Kubrick is said to like gadgets.

This criticism is often used to denigrate the other performances or the film’s supposed lack of humanity, but I prefer to take it as a tribute to the work of actor Douglas Rain, Kubrick and Clarke’s script, and the brilliant design of HAL himself. The fact that a computer is the character we remember best isn’t a flaw in the movie, but a testament to its skill and imagination. And as I’ve noted elsewhere, the acting is excellent—it’s just so understated and naturalistic that it seems vaguely incongruous in such spectacular settings. (Compare it to the performances in Destination Moon, for instance, and you see how good Keir Dullea and William Sylvester really are here.)

But I also think that the best performance in 2001 isn’t by Douglas Rain at all, but by Vivian Kubrick, in her short appearance on the phone as Heywood Floyd’s daughter. It’s a curious scene that breaks many of the rules of good storytelling—it doesn’t lead anywhere, it’s evidently designed to do nothing but show off a piece of hardware, and it peters out even as we watch it. The funniest line in the movie may be Floyd’s important message:

Listen, sweetheart, I want you to tell mommy something for me. Will you remember? Well, tell mommy that I telephoned. Okay? And that I’ll try to telephone tomorrow. Now will you tell her that?

But that’s oddly true to life as well. And when I watch the scene today, with a five-year-old daughter of my own, it seems to me that there’s no more realistic little girl in all of movies. (Kubrick shot the scene himself, asking the questions from offscreen, and there’s a revealing moment when the camera rises to stay with Vivian as she stands. This is sometimes singled out as a goof, although there’s no reason why a sufficiently sophisticated video phone wouldn’t be able to track her automatically.) It’s a scene that few other films would have even thought to include, and now that video chat is something that we all take for granted, we can see through the screen to the touchingly sweet girl on the other side. On some level, Kubrick simply wanted his daughter to be in the movie, and you can’t blame him.

At the time, 2001 was criticized as a soulless hunk of technology, but now it seems deeply human, at least compared to many of its imitators. Yesterday in the New York Times, Bruce Handy shared a story from Keir Dullea, who explained why he breaks the glass in the hotel room at the end, just before he comes face to face with himself as an old man:

Originally, Stanley’s concept for the scene was that I’d just be eating and hear something and get up. But I said, “Stanley, let me find some slightly different way that’s kind of an action where I’m reaching—let me knock the glass off, and then in mid-gesture, when I’m bending over to pick it up, let me hear the breathing from that bent-over position.” That’s all. And he says, “Oh, fine. That sounds good.” I just wanted to find a different way to play the scene than blankly hearing something. I just thought it was more interesting.

I love this anecdote, not just because it’s an example of an evocative moment that arose from an actor’s pragmatic considerations, but because it feels like an emblem of the production of the movie as a whole. 2001 remains the most technically ambitious movie of all time, but it was also a project in which countless issues were being figured out on the fly. Every solution was a response to a specific problem, and it covered a dizzying range of challenges—from the makeup for the apes to the air hostess walking upside down—that might have come from different movies entirely.

2001, in short, was made by hand—and it’s revealing that many viewers assume that computers had to be involved, when they didn’t figure in the process at all. (All of the “digital” readouts on the spacecraft, for instance, were individually animated, shot on separate reels of film, and projected onto those tiny screens on set, which staggers me even to think about it. And even after all these years, I still can’t get my head around the techniques behind the Star Gate sequence.) It reminds me, in fact, of another movie that happens to be celebrating an anniversary this year. As a recent video essay pointed out, if the visual effects in Jurassic Park have held up so well, it’s because most of them aren’t digital at all. The majority consist of a combination of practical effects, stop motion, animatronics, raptor costumes, and a healthy amount of misdirection, with computers used only when absolutely necessary. Each solution is targeted at the specific problems presented by a snippet of film that might last just for a few seconds, and it moves so freely from one trick to another that we rarely have a chance to see through it. It’s here, not in A.I., that Spielberg got closest to Kubrick, and it hints at something important about the movies that push the technical aspects of the medium. They’re often criticized for an absence of humanity, but in retrospect, they seem achingly human, if only because of the total engagement and attention that was required for every frame. Most of their successors lack the same imaginative intensity, which is a greater culprit than the use of digital tools themselves. Today, computers are used to create effects that are perfect, but immediately forgettable. And one of the wonderful ironies of 2001 is that it used nothing but practical effects to create a computer that no viewer can ever forget.

The cosmic order


Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

On April 2, 1968, the world premiere of 2001: A Space Odyssey was held at the Uptown Theater, a movie palace in the Cleveland Park neighborhood of Washington, D.C. Two days later, Martin Luther King, Jr. was assassinated in Memphis, sparking riots throughout the country, including the nation’s capital. At first, this might seem like another reminder of how we unconsciously compartmentalize the past, filing events into separate categories, like the moon landing and the Manson killings, that actually unfolded in a confused present tense. Three years ago, the artist Edgar Arceneaux released an experimental film, A Time to Break Silence, that tried to go deeper, explaining in an interview:

Stanley Kubrick, Arthur C. Clarke and Dr. King were formulating their ideas about the duality of technology, which can be used as both a weapon and tool, during the same time period. As the psychic trauma of Dr. King’s death had the nation in a raw state of anger and uncertainty, a film chronicling the genealogy of humanity’s troubled future with technology is released in theaters.

More often, however, we tend to picture the political upheavals of the sixties as moving along a separate track from the decade’s scientific and technological achievements. In his book on the making of 2001, the journalist Piers Bizony writes of its visual effects team: “The optimism of Kubrick’s technologists seemed unquenchable. Perhaps, like their counterparts at Cape Kennedy, they were just too busy in their intense and closed-off little world to notice Vietnam, Martin Luther King, LSD, the counterculture?”

But that isn’t really true. John W. Campbell liked to remind his authors: “The future doesn’t happen one at a time.” And neither does the past or the present. We find, for instance, that King himself—who was a man who thought about everything—spoke and wrote repeatedly about the space program. At first, like many others, he saw it through the lens of national security, saying in a speech on February 2, 1959: “In a day when Sputniks and Explorers dash through outer space and guided ballistic missiles are carving highways of death through the stratosphere, nobody can win a war.” Yet it remained on his mind, and images of space began to appear more often in his public statements over the following year. A few months later, in a sermon titled “Unfulfilled Hopes,” he said:

We look out at the stars; we find ourselves saying that these stars shine from their cold and serene and passionless height, totally indifferent to the joys and sorrows of men. We begin to ask, is man a plaything of a callous nature, sometimes friendly and sometimes inimical? Is man thrown out as a sort of orphan in the terrifying immensities of space, with nobody to guide him on and nobody concerned about him? These are the questions we ask, and we ask them because there is an element of tragedy in life.

And King proclaimed in a commencement speech at Morehouse College in June: “Man through his scientific genius has been able to dwarf distance and place time in chains. He has been able to carve highways through the stratosphere, and is now making preparations for a trip to the moon. These revolutionary changes have brought us into a space age. The world is now geographically one.”

King’s attitude toward space was defined by a familiar tension. On one hand, space travel is a testament to our accomplishments as a species; on the other, it diminishes our achievements by forcing us to confront the smallness of our place in the universe. On December 11, 1960, King emphasized this point in a sermon at the Unitarian Church of Germantown, Pennsylvania:

All of our new developments can banish God neither from the microcosmic compass of the atom nor from the vast unfathomable ranges of interstellar space, living in a universe in which we are forced to measure stellar distance by light years, confronted with the illimitable expanse of the universe in which stars are five hundred million billion miles from the Earth, in which heavenly bodies travel at incredible speed and in which the ages of planets are reckoned in terms of billions of years. Modern man is forced to cry out with the psalmist of old: “When I behold the heavens, the work of thy hands, the moon, the stars, and all that thou hast created, what is man that thou art mindful of him and the son of man that thou remembereth him?”

In 1963, King made the comparison more explicit in his book The Strength to Love: “Let us notice, first, that God is able to sustain the vast scope of the physical universe. Here again, we are tempted to feel that man is the true master of the physical universe. Manmade jet planes compress into minutes distances that formerly required weeks of tortuous effort. Manmade spaceships carry cosmonauts through outer space at fantastic speeds. Is God not being replaced in the mastery of the cosmic order?” But after reminding us of the scale of the distances involved, King concludes: “We are forced to look beyond man and affirm anew that God is able.”

This seems very much in the spirit of 2001, which is both a hymn to technology and a meditation on human insignificance. For King, however, the contrast between the triumphs of engineering and the vulnerability of the individual wasn’t just an abstract notion, but a reflection of urgent practical decisions that had to be made here and now. Toward the end of his life, he framed it as a choice of priorities, as he did in a speech in 1967: “John Kenneth Galbraith said that a guaranteed national income could be done for about twenty billion dollars a year. And I say to you today, that if our nation can spend thirty-five billion dollars to fight an unjust, evil war in Vietnam, and twenty billion dollars to put a man on the moon, it can spend billions of dollars to put God’s children on their own two feet right here on Earth.” The following year, speaking to the Rabbinical Assembly in the Catskills, he was even more emphatic: “It must be made clear now that there are some programs that we can cut back on—the space program and certainly the war in Vietnam—and get on with this program of a war on poverty.” And on March 18, 1968, King said to the striking sanitation workers in Memphis, whom he would visit again on the day before he died:

I will hear America through her historians, years and generations to come, saying, “We built gigantic buildings to kiss the skies. We built gargantuan bridges to span the seas. Through our spaceships we were able to carve highways through the stratosphere. Through our airplanes we are able to dwarf distance and place time in chains. Through our submarines we were able to penetrate oceanic depths.” It seems that I can hear the God of the universe saying, “Even though you have done all of that, I was hungry and you fed me not, I was naked and you clothed me not. The children of my sons and daughters were in need of economic security and you didn’t provide it for them. And so you cannot enter the kingdom of greatness.”

How the solar system was won


Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

When Stanley Kubrick hired Arthur C. Clarke to work on the project that became 2001: A Space Odyssey, they didn’t have a title, a plot, or even much in the way of a premise. In Kubrick’s introductory letter to the author, he had written only that his interest lay in “these broad areas, naturally assuming great plot and character”:

1. The reasons for believing in the existence of intelligent extraterrestrial life.
2. The impact (and perhaps even lack of impact in some quarters) such discovery would have on earth in the near future.
3. A space probe with a landing and exploration of the moon and Mars.

If you’ve seen the movie, you know that almost none of what Kubrick describes here ended up in the finished film. The existence of extraterrestrial life is the anthropic assumption on which the entire story rests; there’s no real attempt to sketch in the larger social context; and the discovery of the alien artifact—far from having any impact on society—remains a secret until the end to all but a few scientists. There’s already a thriving colony on the moon when the main action of the story really starts, and Heywood Floyd only turns up after the monolith has been found. All that remains of Kubrick’s original conception, in fact, is a vague feeling that he tried to convey early in their partnership, which Clarke remembered later as the desire to make “a movie about man’s relation to the universe—something which had never been attempted, still less achieved, in the history of motion pictures.”

In this respect, they undoubtedly succeeded, and a lot of it had to do with Kubrick’s choice of collaborator. Yesterday, I suggested that Kubrick settled on Clarke because he was more likely than the other obvious candidates to be available for the extended writing process that the director had in mind. (This was quite an assumption, since it meant that Clarke had to be away from his home in Ceylon for more than a year, but it turned out to be right.) Yet Clarke was also uniquely qualified to write about “man’s relation to the universe,” and in particular about aliens who were far in advance of the human race. As Isaac Asimov has memorably explained, this was a plot point that was rarely seen in Astounding, mostly because of John W. Campbell’s personal prejudices:

[Campbell] was a devout believer in the inequality of man and felt that the inequality could be detected by outer signs such as skin and hair coloring…In science fiction, this translated itself into the Campbellesque theory that earthmen (all of whom, in the ideal Campbell story, resembled people of northwestern European extraction) were superior to all other intelligent races.

Clarke had broken through in Astounding after the war—his stories “Loophole” and “Rescue Party” appeared in 1946—but geographical distance and foreign rights issues had kept him from being shaped by Campbell to any real extent. As a result, he was free to indulge in such works as Childhood’s End, the ultimate story about superior aliens, which was inspired by Campbell’s novel The Mightiest Machine but ran its first installment in the British magazine New Worlds.

Clarke, in short, was unquestionably part of the main sequence of hard science fiction that Campbell had inaugurated, but he was also open to exploring enormous, borderline mystical questions that emphasized mankind’s insignificance. (At his best, in such stories as “The Star” and “The Nine Billion Names of God,” he managed to combine clever twist endings with a shattering sense of scale in a way that no other writer has ever matched.) It was this unlikely combination of wit, technical rigor, and awareness of the infinite that made him ideally suited to Kubrick, and they promptly embarked on one of the most interesting collaborations in the history of the genre. The only comparable example of such a symbiotic relationship is Campbell and the young Asimov, except that Clarke and Kubrick were both mature artists at the peak of their talents. Fortunately for us, Clarke kept a journal, and he provided excerpts in two fascinating essays, “Christmas, Shepperton” and “Monoliths and Manuscripts,” which were published in the collection The Lost Worlds of 2001. The entries offer a glimpse of a process that ranged freely in all directions, with both men pursuing trains of thought as far as they would go before abandoning them for something better. As Clarke writes:

It was [Kubrick’s] suggestion that, before embarking on the drudgery of the script, we let our imaginations soar freely by developing the story in the form of a complete novel…After various false starts and twelve-hour talkathons, by early May 1964 Stanley agreed that [Clarke’s short story] “The Sentinel” would provide good story material. But our first concept—and it is hard now for me to focus on such an idea, though it would have been perfectly viable—involved working up to the discovery of an extraterrestrial artifact as the climax, not the beginning, of the story. Before that, we would have a series of incidents or adventures devoted to the exploration of the moon and planets…[for which] our private title (never of course intended for public use) was How the Solar System Was Won.

And while 2001 arguably made its greatest impact on audiences with its meticulous art direction and special effects, Kubrick’s approach to writing was equally obsessive. He spent a full year developing the story with Clarke before approaching the studio for financing, and although they soon realized that the premise of “The Sentinel” would work better as an inciting incident, rather than as the ending, the notion of “incidents or adventures” persisted in the finished script. The film basically consists of four loosely connected episodes, the most memorable of which—the story of HAL 9000—could be eliminated without fundamentally affecting the others. But if it feels like an organic whole, this is largely thanks to the decision to develop far more material than could ever fit into a novel, much less a movie. (Clarke’s diary entries are filled with ideas that were dropped or transformed in the final version: “The people we meet on the other star system are humans who were collected from earth a hundred thousand years ago, and hence are virtually identical to us.” “What if our E.T.s are stranded on earth and need the ape-men to help them?” And then there’s the startling line, which Clarke, who was discreetly gay, records without comment: “Stanley has invented the wild idea of slightly fag robots who create a Victorian environment to put our heroes at their ease.”) It verged on a private version of development hell, without any studio notes or interference, and it’s hard to imagine any other director who could have done it. 2001 started a revolution in visual effects, but its writing process was just as remarkable, and we still haven’t caught up to it yet. Even Clarke, whose life it changed, found Kubrick’s perfectionism hard to take, and he concluded: “In the long run, everything came out all right—exactly as Stanley had predicted. But I can think of easier ways of earning a living.”

When Clarke Met Kubrick


Note: To celebrate the fiftieth anniversary of the release of 2001: A Space Odyssey, which held its premiere on April 2, 1968, I’ll be spending the week looking at various aspects of what remains the greatest science fiction movie ever made.

“I’m reading everything by everybody,” Stanley Kubrick said one day over lunch in New York. It was early 1964, and he was eating at Trader Vic’s with Roger A. Caras, a wildlife photographer and studio publicist who was working at the time for Columbia Pictures. Dr. Strangelove had just been released, and after making small talk about their favorite brand of telescope, Caras asked the director what he had in mind for his next project. Kubrick replied that he was thinking about “something on extraterrestrials,” but he didn’t have a writer yet, and in the meantime, he was consuming as much science fiction as humanly possible. Unfortunately, we don’t know much about what he was reading, which is a frustrating omission in the career of a filmmaker whose archives have been the subject of so many exhaustive studies. In his biography of Kubrick, Vincent Lobrutto writes tantalizingly of this period: “Every day now boxes of science fiction and fact books were being delivered to his apartment. Kubrick was immersing himself in a subject he would soon know better than most experts. His capacity to grasp and disseminate information stunned many who worked with him.” Lobrutto notes that Kubrick took much the same approach a decade later on the project that became The Shining, holing up in his office with “stacks of horror books,” and the man with whom he would eventually collaborate on 2001 recalled of their first meeting: “[Kubrick] had already absorbed an immense amount of science fact and science fiction, and was in some danger of believing in flying saucers.” At their lunch that day at Trader Vic’s, however, Caras seemed to think that all of this work was unnecessary, and he told this to Kubrick in no uncertain terms: “Why waste your time? Why not just start with the best?”

Let’s pause the tape here for a moment to consider what other names Caras might plausibly have said. A year earlier, in his essay “The Sword of Achilles,” Isaac Asimov provided what we can take as a fairly representative summary of the state of the genre:

Robert A. Heinlein is usually considered the leading light among good science fiction writers. Others with a fine grasp of science and a fascinatingly imaginative view of its future possibilities are Arthur C. Clarke, Frederik Pohl, Damon Knight, James Blish, Clifford D. Simak, Poul Anderson, L. Sprague de Camp, Theodore Sturgeon, Walter Miller, A.J. Budrys…These are by no means all.

Even accounting for the writer and the time period, there are a few noticeable omissions—it’s surprising not to see Lester del Rey, for instance, and A.E. van Vogt, who might not have qualified as what Asimov saw as “good science fiction,” had been voted one of the top four writers in the field in a pair of polls a few years earlier. It’s also necessary to add Asimov himself, who at the time was arguably the science fiction writer best known to general readers. (In 1964, he would even be mentioned briefly in Saul Bellow’s novel Herzog, which was the perfect intersection of the highbrow and the mainstream.) Arthur C. Clarke’s high ranking wasn’t just a matter of personal affection, either—he and Asimov later became good friends, but when the article was published, they had only met a handful of times. Clarke, in other words, was clearly a major figure. But it seems fair to say that anyone claiming to name “the best” science fiction writer in the field might very well have gone with Asimov or Heinlein instead.

Caras, of course, recommended Clarke, whom he had first met five years earlier at a weekend in Boston with Jacques Cousteau. Kubrick was under the impression that Clarke was a recluse, “a nut who lives in a tree in India someplace,” and after being reassured that he wasn’t, the director became excited: “Jesus, get in touch with him, will you?” Caras sent Clarke a telegram to ask about his availability, and when the author said that he was “frightfully interested,” Kubrick wrote him a fateful letter:

It’s a very interesting coincidence that our mutual friend Caras mentioned you in a conversation we were having about a Questar telescope. I had been a great admirer of your books for quite a time and had always wanted to discuss with you the possibility of doing the proverbial “really good” science-fiction movie…Roger tells me you are planning to come to New York this summer. Do you have an inflexible schedule? If not, would you consider coming sooner with a view to a meeting, the purpose of which would be to determine whether an idea might exist or arise which could sufficiently interest both of us enough to want to collaborate on a screenplay?

This account of the conversation differs slightly from Caras’s recollection—Kubrick doesn’t say that they were actively discussing potential writers for a film project, and he may have been flattering Clarke slightly with the statement that he had “always wanted” to talk about a movie with him. But it worked. Clarke wrote back to confirm his interest, and the two men finally met in New York on April 22, where the author did his best to talk Kubrick out of his newfound interest in flying saucers.

But why Clarke? At the time, Kubrick was living on the Upper East Side, which placed him within walking distance of many science fiction authors who were considerably closer than Ceylon, and it’s tempting to wonder what might have happened if he had approached Heinlein or Asimov, both of whom would have been perfectly sensible choices. A decade earlier, Heinlein made a concerted effort to break into Hollywood with the screenplays for Destination Moon and Project Moon Base, and the year before, he had written an unproduced teleplay for a proposed television show called Century XXII. (Kubrick studied Destination Moon for its special effects, if not for its story, as we learn from the correspondence of none other than Roger Caras, who had gone to work for Kubrick’s production company.) Asimov, for his part, was more than willing to explore such projects—in years to come, he would meet to discuss movies with Woody Allen and Paul McCartney, and I’ve written elsewhere about his close encounter with Steven Spielberg. But if Kubrick went with Clarke instead, it wasn’t just because they had a friend in common. At that point, Clarke was a highly respected writer, but not yet a celebrity outside the genre, and the idea of a “Big Three” consisting of Asimov, Clarke, and Heinlein was still a decade away. His talent was undeniable, but he was also a more promising candidate for the kind of working relationship that the director had in mind, which Kubrick later estimated as “four hours a day, six days a week” for more than three years. I suspect that Kubrick recognized what might best be described as a structural inefficiency in the science fiction market. The time and talents of one of the most qualified writers imaginable happened to be undervalued and available at just the right moment. When the opportunity came, Kubrick seized it. And it turned out to be one hell of a bargain.

American Stories #6: The Shining


Note: As we enter what Joe Scarborough justifiably expects to be “the most consequential political year of our lives,” I’m looking back at ten works of art—books, film, television, and music—that deserve to be reexamined in light of where America stands today. You can find the earlier installments here.

“Vanderbilts have stayed here, and Rockefellers, and Astors, and Du Ponts,” Stuart Ullman, the manager of the Overlook Hotel, smugly informs Jack Torrance in the opening pages of Stephen King’s The Shining. “Four presidents have stayed in the Presidential Suite. Wilson, Harding, Roosevelt, and Nixon.” After Torrance replies that they shouldn’t be too proud of Harding and Nixon, Ullman adds, frowning, that the hotel was later purchased by a man named Horace Derwent, “millionaire inventor, pilot, film producer, and entrepreneur.” Just in case we don’t make the connection, here’s what Torrance, now the caretaker, thinks to himself about Derwent hundreds of pages later, while leafing through the scrapbook that he finds in the hotel’s basement:

[Derwent was] a balding man with eyes that pierced you even from an old newsprint photo. He was wearing rimless spectacles and a forties-style pencil mustache that did nothing at all to make him look like Errol Flynn. His face was that of an accountant. It was the eyes that made him look like someone or something else…[His movie studio] ground out sixty movies, fifty-five of which glided right into the face of the Hays Office and spit on its large blue nose…During one of them an unnamed costume designer had jury-rigged a strapless bra for the heroine to appear in during the Grand Ball scene, where she revealed everything except possibly the birthmark just below the cleft of her buttocks. Derwent received credit for this invention as well, and his reputation—or notoriety—grew…Living in Chicago, seldom seen except for Derwent Enterprises board meetings…it was supposed by many that he was the richest man in the world.

There’s only one mogul who fits that description, and it isn’t William Randolph Hearst. By hitching his story to the myth of Howard Hughes, who died shortly before the novel’s publication but would have been alive during much of its conception and writing, King taps into an aspect of the American experience symbolized by his reclusive subject, the aviator, engineer, and movie producer who embodied all of his nation’s virtues and vices before succumbing gradually to madness. It’s no surprise that Hughes has fascinated directors as obsessive as Martin Scorsese, Warren Beatty, Christopher Nolan—who shelved a Hughes biopic to focus instead on the similar figure of Batman—and even Orson Welles, whose last film, F for Fake, included an extended meditation on the Clifford Irving hoax. As for Stanley Kubrick, who once listed Hughes’s Hell’s Angels among his favorite movies, he could hardly have missed the implication. (If we see the Overlook’s mysterious owner at all in the movie, it’s in the company of the otherwise inexplicable man in the dog costume, who is identified in the novel as Derwent’s lover, while in the sequel Doctor Sleep, which I haven’t read, King evidently associates him with the ghost who offers the toast to Wendy: “Great party, isn’t it?”) The film’s symbols have been analyzed to death, but they only externalize themes that are there in the novel, and although King was dissatisfied by the result, his attempt to treat this material more explicitly in the later miniseries only shows how right Kubrick was to use them instead as the building blocks of a visual language. The Overlook is a stage for reenacting the haunted history of its nation, much of which can only be expressed as a ghost story, and it isn’t finished yet. Looking at the pictures in the scrapbook from the hotel’s grand opening in 1945, Torrance thinks: “The war was over, or almost over. The future lay ahead, clean and shining.”

Written by nevalalee

January 8, 2018 at 7:46 am

The act of killing


Over the weekend, my wife and I watched the first two episodes of Mindhunter, the new Netflix series created by Joe Penhall and produced by David Fincher. We took in the installments over successive nights, but if you can, I’d recommend viewing them back to back—they really add up to a single pilot episode, arbitrarily divided in half, and they amount to a new movie from one of the five most interesting American directors under sixty. After the first episode, I was a little mixed, but I felt better after the next one, and although I still have some reservations, I expect that I’ll keep going. The writing tends to spell things out a little too clearly; it doesn’t always avoid clichés; and there are times when it feels like a first draft of a stronger show to come. Fincher, characteristically, sometimes seems less interested in the big picture than in small, finicky details, like the huge titles used to identify the locations onscreen, or the fussily perfect sound that the springs of the chair make whenever the bulky serial killer Ed Kemper sits down. (He also gives us two virtuoso sequences of the kind that he does better than just about anyone else—a scene in a noisy club with subtitled dialogue, which I’ve been waiting to see for years, and a long, very funny montage of two FBI agents on the road.) For long stretches, the show is about little else than the capabilities of the Red Xenomorph digital camera. Yet it also feels like a skeleton key for approaching the work of a man who, in fits and starts, has come to seem like the crucial director of our time, in large part because of his own ambivalence toward his fantasies of control.

Mindhunter is based on a book of the same name by John Douglas and Mark Olshaker about the development of behavioral science at the FBI. I read it over twenty years ago, at the peak of my morbid interest in serial killers, which is a phase that a lot of us pass through and that Fincher, revealingly, has never outgrown. Apart from Alien 3, which was a project that he barely understood and couldn’t control, his real debut was Seven, in which he benefited from a mechanical but undeniably compelling script by Andrew Kevin Walker and a central figure who has obsessed him ever since. John Doe, the killer, is still the greatest example of the villain who seems to be writing the screenplay for the movie in which he appears. (As David Thomson says of Donald Sutherland’s character in JFK: “[He’s] so omniscient he must be the scriptwriter.”) Doe’s notebooks, rendered in comically lavish detail, are like a nightmare version of the notes, plans, and storyboards that every film generates, and he alternately assumes the role of writer, art director, prop master, and producer. By the end, with the hero detectives reduced to acting out their assigned parts in his play, the distinction between Doe and the director—a technical perfectionist who would later become notorious for asking his performers for hundreds of takes—seems to disappear completely. It seems to have simultaneously exhilarated and troubled Fincher, much as it did Christopher Nolan as he teased out his affinities with the Joker in The Dark Knight, and both men have spent much of their subsequent careers working through the implications of that discovery.

Fincher hasn’t always been comfortable with his association with serial killers, to the extent that he made a point of having the characters in The Girl With the Dragon Tattoo refer to “a serial murderer,” as if we’d be fooled by the change in terminology. Yet the main line of his filmography is an attempt by a surprisingly smart, thoughtful director to come to terms with his own history of violence. There were glimpses of it as early as The Game, and Zodiac, his masterpiece, is a deconstruction of the formula that turned out to be so lucrative in Seven—the killer, wearing a mask, appears onscreen for just five minutes, and some of the scariest scenes don’t have anything to do with him at all, even as his actions reverberate outward to affect the lives of everyone they touch. Dragon Tattoo, which is a movie that looks a lot better with time, identifies its murder investigation with the work of the director and his editors, who seemed to be asking us to appreciate their ingenuity in turning the elements of the book, with its five acts and endless procession of interchangeable suspects, into a coherent film. And while Gone Girl wasn’t technically a serial killer movie, it gave us his most fully realized version to date of the antagonist as the movie’s secret writer, even if she let us down with the ending that she wrote for herself. In each case, Fincher was processing his identity as a director who was drawn to big technical challenges, from The Curious Case of Benjamin Button to The Social Network, without losing track of the human thread. And he seems to have sensed how easily he could become a kind of John Doe, a master technician who toys sadistically with the lives of others.

And although Mindhunter takes a little while to reveal its strengths, it looks like it will be worth watching as Fincher’s most extended attempt to literally interrogate his assumptions. (Fincher only directed the first two episodes, but this doesn’t detract from what might have attracted him to this particular project, or the role that he played in shaping it as a producer.) The show follows two FBI agents as they interview serial killers in search of insights into their madness, with the tone set by a chilling monologue by Ed Kemper:

People who hunt other people for a vocation—all we want to talk about is what it’s like. The shit that went down. The entire fucked-upness of it. It’s not easy butchering people. It’s hard work. Physically and mentally, I don’t think people realize. You need to vent…Look at the consequences. The stakes are very high.

Take out the references to murder, and it might be the director talking. Kemper later casually refers to his “oeuvre,” leading one of the two agents to crack: “Is he Stanley Kubrick?” It’s a little out of character, but also enormously revealing. Fincher, like Nolan, has spent his career in dialogue with Kubrick, who, fairly or not, still sets the standard for obsessive, meticulous, controlling directors. Kubrick never made a movie about a serial killer, but he took the equation between the creative urge and violence—particularly in A Clockwork Orange and The Shining—as far as anyone ever has. And Mindhunter will only become the great show that it has the potential to be if it asks why these directors, and their fans, are so drawn to these stories in the first place.

The ultimate trip


On Saturday, I was lucky enough to see 2001: A Space Odyssey on the big screen at the Music Box Theatre in Chicago. I’ve seen this movie well over a dozen times, but watching it on a pristine new print from the fourth row allowed me to pick up on tiny details that I’d never noticed before, such as the fact that David Bowman, stranded at the end in his celestial hotel room, ends up wearing a blue velvet robe startlingly like Isabella Rossellini’s. I was also struck by the excellence of the acting, which might sound like a joke, but it isn’t. Its human protagonists have often been dismissed—Roger Ebert, who thought it was one of the greatest films of all time, called it “a bloodless movie with faceless characters”—and none of the actors, aside from Douglas Rain as the voice of HAL, are likely to stick in the memory. (As Noël Coward reputedly said: “Keir Dullea, gone tomorrow.”) But on an objective level, these are nothing less than the most naturalistic performances of any studio movie of the sixties. There isn’t a trace of the affectation or overacting that you see in so much science fiction, and Dullea, Gary Lockwood, and particularly William Sylvester, in his nice dry turn as Heywood Floyd, are utterly believable. You could make a strong case that their work here has held up better than most of the more conventionally acclaimed performances from the same decade. This doesn’t make them any better or worse, but it gives you a sense of what Kubrick, who drew his characters as obsessively as his sets and special effects, was trying to achieve. He wanted realism in his acting, along with everything else, and this is how it looks, even if we aren’t used to seeing it in space.

The result is still the most convincing cinematic vision of space exploration that we have, as well as the most technically ambitious movie ever made, and its impact, like that of all great works of art, appears in surprising places. By coincidence, I went to see 2001 the day after Donald Trump signed an executive order to reinstate the National Space Council, at a very peculiar ceremony that was held with a minimum of fanfare. The event was attended by Buzz Aldrin, who has played scenes across from Homer Simpson and Optimus Prime, and I can’t be sure that this didn’t strike him as the strangest stage he had ever shared. Here are a few of Trump’s remarks, pulled straight from the official transcript:

Security is going to be a very big factor with respect to space and space exploration.  At some point in the future, we’re going to look back and say, how did we do it without space? The Vice President will serve as the council’s chair….Some of the most successful people in the world want to be on this board…Our journey into space will not only make us stronger and more prosperous, but will unite us behind grand ambitions and bring us all closer together. Wouldn’t that be nice? Can you believe that space is going to do that? I thought politics would do that. Well, we’ll have to rely on space instead…We will inspire millions of children to carry on this proud tradition of American space leadership—and they’re excited—and to never stop wondering, hoping, and dreaming about what lies beyond the stars.

Taking a seat, Trump opened the executive order, exclaiming: “I know what this is. Space!” Aldrin then piped up with what was widely reported as a reference to Toy Story: “Infinity and beyond!” Trump seemed pleased: “This is infinity here. It could be infinity. We don’t really don’t know. But it could be. It has to be something—but it could be infinity, right?”

As HAL 9000 once said: “Yes, it’s puzzling.” Aldrin may have been quoting Toy Story, but he might well have been thinking of 2001, too, the last section of which is titled “Jupiter and Beyond the Infinite.” (As an aside, I should note that the line “To infinity and beyond” makes its first known appearance, as far as I can tell, in John W. Campbell’s 1934 serial The Mightiest Machine.) It’s an evocative but meaningless phrase, with the same problems that led Arthur C. Clarke to express doubts about Kubrick’s working title, Journey Beyond the Stars—which Trump, you’ll notice, also echoed. Its semantic content is nonexistent, which is only fitting for a ceremony that underlined the intellectual bankruptcy of this administration’s approach to space. I don’t think I’m overstating the matter when I say that Trump and Mike Pence have shown nothing but contempt for other forms of science. The science division of the Office of Science and Technology Policy lies empty. Pence has expressed bewilderment at the fact that climate change has emerged, “for some reason,” as an issue on the left. And Trump has proposed significant cuts to science and technology funding agencies. Yet his excitement for space seems unbounded and apparently genuine. He asked eagerly of astronaut Peggy Whitson: “Tell me, Mars, what do you see a timing for actually sending humans to Mars? Is there a schedule and when would you see that happening?” And the reasons behind his enthusiasm are primarily aesthetic and emotional. One of his favorite words is “beautiful,” in such phrases as “big, beautiful wall” and “beautiful military equipment,” and it was much in evidence here: “It is America’s destiny to be at the forefront of humanity’s eternal quest for knowledge and to be the leader amongst nations on our adventure into the great unknown. And I could say the great and very beautiful unknown. Nothing more beautiful.”

But the truly scary thing is that if Trump believes that the promotion of space travel can be divorced from any concern for science itself, he’s absolutely right. As I’ve said here before, in the years when science fiction was basically a subcategory of adventure fiction, with ray guns instead of revolvers, space was less important in itself than as the equivalent of the unexplored frontier of the western: it stood for the unknown, and it was a perfect backdrop for exciting plots. Later, when the genre began to take itself more seriously as a predictive literature, outer space was grandfathered in as a setting, even if it had little to do with any plausible vision of the future. Space exploration seemed like an essential part of our destiny as a species because it happened to be part of the genre already. As a result, you can be excited by the prospect of going to Mars while actively despising or distrusting everything else about science—which may be the only reason that we managed to get to the moon at all. (These impulses may have less to do with science than with religion. The most haunting image from the Apollo 11 mission, all the more so because it wasn’t televised, may be that of Aldrin taking communion on the lunar surface.) Science fiction made it possible, and part of the credit, or blame, falls on Kubrick. Watching 2001, I had tears in my eyes, and I felt myself filled with all my old emotions of longing and awe. As Kubrick himself stated: “If 2001 has stirred your emotions, your subconscious, your mythological yearnings, then it has succeeded.” And it did, all too well, at the price of separating our feelings for space even further from science, and of providing a place for those subconscious urges to settle while leaving us consciously indifferent to problems closer to home. Kubrick might not have faked the moon landing, but he faked a Jupiter mission, and he did it beautifully. And maybe, at least for now, it should save us the expense of doing it for real.

The last tango

with 5 comments

Bernardo Bertolucci, Marlon Brando, and Maria Schneider on the set of Last Tango in Paris

When I look back at many of my favorite movies, I’m troubled by a common thread that they share. It’s the theme of the control of a vulnerable woman by a man in a position of power. The Red Shoes, my favorite film of all time, is about artistic control, while Blue Velvet, my second favorite, is about sexual domination. Even Citizen Kane has that curious subplot about Kane’s attempt to turn Susan into an opera star, which may have originated as an unkind reference to William Randolph Hearst and Marion Davies, but which survives in the final version as an emblem of Kane’s need to collect human beings like playthings. It’s also hard to avoid the feeling that some of these stories secretly mirror the relationship between the director and his actresses on the set. Vertigo, of course, can be read as an allegory for Hitchcock’s own obsession with his leading ladies, whom he groomed and remade as meticulously as Scotty attempts to do with Madeleine. In The Shining, Jack’s abuse of Wendy feels only slightly more extreme than what we know Kubrick—who even resembles Jack a bit in the archival footage that survives—imposed on Shelley Duvall. (Duvall’s mental health issues have cast a new pall on those accounts, and the involvement of Kubrick’s daughter Vivian has done nothing to clarify the situation.) And Roger Ebert famously hated Blue Velvet because he felt that David Lynch’s treatment of Isabella Rossellini had crossed an invisible moral line.

The movie that has been subjected to this kind of scrutiny most recently is Last Tango in Paris, after interview footage resurfaced of Bernardo Bertolucci discussing its already infamous rape scene. (Bertolucci originally made these comments three years ago, and the fact that they’ve drawn attention only now is revealing in itself—it was hiding in plain sight, but it had to wait until we were collectively prepared to talk about it.) Since the story first broke, there has been some disagreement over what Maria Schneider knew on the day of the shoot. You can read all about it here. But it seems undeniable that Bertolucci and Brando deliberately withheld crucial information about the scene from Schneider until the cameras were rolling. Even the least offensive version makes me sick to my stomach, all the more so because Last Tango in Paris has been an important movie to me for most of my life. In online discussions of the controversy, I’ve seen commenters dismissing the film as an overrated relic, a vanity project for Brando, or one of Pauline Kael’s misguided causes célèbres. If anything, though, this attitude lets us off the hook too easily. It’s much harder to admit that a film that genuinely moved audiences and changed lives might have been made under conditions that taint the result beyond retrieval. It’s a movie that has meant a lot to me, as it did to many other viewers, including some I knew personally. And I don’t think I can ever watch it again.

Marlon Brando in Last Tango in Paris

But let’s not pretend that it ends there. It reflects a dynamic that has existed between directors and actresses since the beginning, and all too often, we’ve forgiven it, as long as it results in great movies. We write critical treatments of how Vertigo and Psycho masterfully explore Hitchcock’s ambivalence toward women, and we overlook the fact that he sexually assaulted Tippi Hedren. When we think of the chummy partnerships that existed between men like Cary Grant and Howard Hawks, or John Wayne and John Ford, and then compare them with how directors have regarded their female collaborators, the contrast couldn’t be more stark. (The great example here is Gone With the Wind: George Cukor, the original director, was fired because he made Clark Gable uncomfortable, and he was replaced by Gable’s buddy Victor Fleming. Vivien Leigh and Olivia de Havilland were forced to consult with Cukor in secret.) And there’s an unsettling assumption on the part of male directors that this is the only way to get a good performance from a woman. Bertolucci says that he and Brando were hoping to get Schneider’s raw reaction “as a girl, instead of as an actress.” You can see much the same impulse in Kubrick’s treatment of Duvall. Even Michael Powell, one of my idols, writes of how he and the other actors frightened Moira Shearer to the point of tears for the climactic scene of The Red Shoes—“This was no longer acting”—and says elsewhere: “I never let love interfere with business, or I would have made love to her. It would have improved her performance.”

So what’s a film buff to do? We can start by acknowledging that the problem exists, and that it continues to affect women in the movies, whether in the process of filmmaking itself or in the realities of survival in an industry that is still dominated by men. Sometimes it leads to abuse or worse. We can also honor the work of those directors, from Ozu to Almodóvar to Wong Kar-Wai, who have treated their actresses as partners in craft. Above all else, we can come to terms with the fact that sometimes even a masterpiece fails to make up for the choices that went into it. Thinking of Last Tango in Paris, I was reminded of Norman Mailer, who wrote one famous review of the movie and was linked to it in another. (Kael wrote: “On the screen, Brando is our genius as Mailer is our genius in literature.”) Years later, Mailer supported the release from prison of a man named Jack Henry Abbott, a gifted writer with whom he had corresponded at length. Six weeks later, Abbott stabbed a stranger to death. Afterward, Mailer infamously remarked:

I’m willing to gamble with a portion of society to save this man’s talent. I am saying that culture is worth a little risk.

But it isn’t—at least not like this. Last Tango in Paris is a masterpiece. It contains the single greatest male performance I’ve ever seen. But it wasn’t worth it.

My alternative canon #7: Vanilla Sky

with 5 comments

Tom Cruise and Penelope Cruz in Vanilla Sky

Note: I’ve often discussed my favorite movies on this blog, but I also love films that are relatively overlooked or unappreciated. For the rest of the week, I’ll be looking at some of the neglected gems, problem pictures, and flawed masterpieces that have shaped my inner life, and which might have become part of the standard cinematic canon if the circumstances had been just a little bit different. You can read the previous installments here.

I’ve always been an unabashed Tom Cruise fan, less for the actor than for the world’s finest producer and packager of talent who happens to occupy the body of a star, and after Edge of Tomorrow and the last two Mission: Impossible films, there are signs that the overall culture is coming around to the realization that he’s simply the most reliable brand in movies. Over the last decade, though, he has shown signs of diminished ambition. Cruise seems increasingly content to be nothing but an action hero, and there’s no question that he still delivers great entertainments. But for a while, starting in the late nineties, there were tantalizing hints of something more. Between 1999 and 2004, he made a series of movies that were essentially about being Tom Cruise, beginning with Eyes Wide Shut, a grueling experience that seems to have catalyzed his interest in pushing against his own aura. Stanley Kubrick always knew that he wanted a married couple to play Bill and Alice Harford, and the result is a movie that only becomes more complex and intriguing—at least to my eyes—the more we learn about how that marriage unraveled. Cruise never quite managed to pull off the same trick again, but his performances in movies from Magnolia to Collateral feel like a series of exploratory maneuvers, played out for an audience of millions. After War of the Worlds, the effort faded, and he spends most of his time now leveraging his history and presence in ways that are more obvious, which isn’t to say that they aren’t effective.

But I miss the Cruise of the turn of the millennium, a peerless creation that received its definitive statement in Vanilla Sky, which I still regard as criminally unappreciated and misunderstood. It feels like a snapshot now of a lost moment, both in history—you can see the Twin Towers looming in the background of a crucial shot—and in my own life: I saw it just before moving to New York after college, and it’s my favorite portrait of that city as it existed in those days. I’m not sure what drew Cruise to attempt a remake of Abre Los Ojos, or to recruit Cameron Crowe to direct it, but the sheer impersonality of the project seems to have freed Crowe, who transformed it from a straight thriller into a pop cultural phantasmagoria. It’s really an allegory about how we all construct ourselves out of fragments of songs, album covers, and old movies, and it captured something essential for me in a year when I was building an adult life out of little more than a few precious notions. (I ended up seeing it four times in the theater, a personal record, although it was mostly just so I could listen again to the first five notes of Radiohead’s “Everything in Its Right Place” as they played over the opening cut to black.) And it wouldn’t work at all without the presence of the world’s biggest movie star. Cruise plays much of it in a mask, a visual device that appears in films as different as Eyes Wide Shut and the Mission: Impossible franchise, but as time goes on, Vanilla Sky feels like the movie in which he comes the closest to revealing who he really is, even if it’s nothing more than the sum of his roles. But isn’t that true of everyone?


My alternative canon #1: A Canterbury Tale

with 3 comments

Sheila Sim and Eric Portman in A Canterbury Tale

Note: I’ve often discussed my favorite movies on this blog, but I also love films that are relatively overlooked or unappreciated. Over the next two weeks, I’ll be looking at some of the neglected gems, problem pictures, and flawed masterpieces that have shaped my inner life, and which might have become part of the standard cinematic canon if the circumstances had been just a little bit different.

I’ve frequently said that The Red Shoes is my favorite movie of all time, but it isn’t even the most remarkable film directed by Michael Powell and Emeric Pressburger. The Red Shoes succeeds in large part by following through on its promises: it takes place in a fascinating world and tells a story of high melodrama, with an obvious determination to deliver as much color and atmosphere to the audience as possible, and its brilliance emerges from how consistently it lives up to its own impossible standards. A Canterbury Tale, which came out four years earlier, is in many respects more astonishing, because it doesn’t seem to have any conventional ambitions at all. It’s a deliberately modest film with a story so inconsequential that it verges on a commentary on the arbitrariness of all narrative: three young travelers, stranded at a small village near Canterbury during World War II, attempt to solve the mystery of “the glue man,” an unseen figure who throws glue at the hair of local women to discourage them from going out at night—and that, incredibly, is it. When the glue man’s identity is revealed, it’s handled so casually that the moment is easy to miss, and not even the protagonists themselves seem all that interested in the plot, which occupies about ten minutes of a film that runs over two hours in its original cut. And the fact that the movie itself was openly conceived as a light propaganda picture doesn’t seem to work in its favor.

Yet this is one of the most beautiful movies ever made, a languid series of funny, moving, and evocative set pieces that reminded me, when I first saw it, of Wong Kar-Wai magically set loose in wartime Britain. There are the usual flourishes of cinematic playfulness from Powell and Pressburger—including a cut from a medieval falcon to a modern warplane that anticipates Kubrick in 2001—but the tone is atypically relaxed and gentle, with even less plot than in its spiritual sequel I Know Where I’m Going! Despite the title, it doesn’t have much to do with Chaucer, except that the lead characters are all pilgrims who have been damaged in different ways and are healed by a journey to Canterbury. (Years later, I stayed at a tiny hotel within sight of the cathedral, where I verified that the movie was on sale at its gift shop.) It’s nostalgic and vaguely conservative, but it also looks ahead to the New Wave with its visual zest, greediness for location detail, and willingness to take happy digressions. The cast includes the lovely ingenue Sheila Sim, who later married Richard Attenborough, and Eric Portman as Colpeper, the local magistrate, who, in a typically perverse touch from the Archers, is both their virtuous embodiment of high Tory ideals and kind of a creepy weirdo. Sim died earlier this year, but when she looks up at the clouds in the tall grass with Portman, she lives forever in my heart—along with the film itself, which keeps one foot in the past while somehow managing to seem one step ahead of every movie that came after it.

The Coco Chanel rule

with 4 comments

Coco Chanel

“Before you leave the house,” the fashion designer Coco Chanel is supposed to have said, “look in the mirror and remove one accessory.” As much as I like it, I’m sorry to say that this quote is most likely apocryphal: you see it attributed to Chanel everywhere, but without the benefit of an original source, which implies that it’s one of those pieces of collective wisdom that have attached themselves parasitically to a famous name. Still, it’s valuable advice. It’s usually interpreted, correctly enough, as a reminder that less is more, but I prefer to think of it as a statement about revision. The quote isn’t about reaching simplicity from the ground up, but about taking something and improving it by subtracting one element, like the writing rule that advises you to cut ten percent from every draft. And what I like the most about it is that its moment of truth arrives at the very last second, when you’re about to leave the house. That final glance in the mirror, when it’s almost too late to make additional changes, is often when the true strengths and weaknesses of your decisions become clear, if you’re smart enough to distinguish it from the jitters. (As Jeffrey Eugenides said to The Paris Review: “Usually I’m turning the book in at the last minute. I always say it’s like the Greek Olympics—’Hope the torch lights.'”)

But which accessory should you remove? In the indispensable book Behind the Seen, the editor Walter Murch gives us an important clue, using an analogy from filmmaking:

An interior might have four different sources of light in it: the light from the window, the light from the table lamp, the light from the flashlight that the character is holding, and some other remotely sourced lights. The danger is that, without hardly trying, you can create a luminous clutter out of all that. There’s a shadow over here, so you put another light on that shadow to make it disappear. Well, that new light casts a shadow in the other direction. Suddenly there are fifteen lights and you only want four.

As a cameraman what you paradoxically do is have the gaffer turn off the main light, because it is confusing your ability to really see what you’ve got. Once you do that, you selectively turn off some of the lights and see what’s left. And you discover that, “OK, those other three lights I really don’t need at all—kill ’em.” But it can also happen that you turn off the main light and suddenly, “Hey, this looks great! I don’t need that main light after all, just these secondary lights. What was I thinking?”

This principle, which Murch elsewhere calls “blinking the key,” implies that you should take away the most important piece, or the accessory that you thought you couldn’t live without.

Walter Murch

This squares nicely with a number of principles that I’ve discussed here before. I once said that ambiguity is best created out of a network of specifics with one crucial piece removed, and when you follow the Chanel rule, on a deeper level, the missing accessory is still present, even after you’ve taken it off. The remaining accessories were presumably chosen with it in mind, and they preserve its outlines, resulting in a kind of charged negative space that binds the rest together. This applies to writing, too. “The Cask of Amontillado” practically amounts to a manual on how to wall up a man alive, but Poe omits the one crucial detail—the reason for Montresor’s murderous hatred—that most writers would have provided up front, and the result is all the more powerful. Shakespeare consistently leaves out key explanatory details from his source material, which renders the behavior of his characters more mysterious, but no less concrete. And the mumblecore filmmaker Andrew Bujalski made a similar point a few years ago to The New York Times Magazine: “Write out the scene the way you hear it in your head. Then read it and find the parts where the characters are saying exactly what you want/need them to say for the sake of narrative clarity (e.g., ‘I’ve secretly loved you all along, but I’ve been too afraid to tell you.’) Cut that part out. See what’s left. You’re probably close.”

This is a piece of advice that many artists could stand to take to heart, especially if they’ve been blessed with an abundance of invention. I like Interstellar, for instance, but I have a hunch that it would have been an even stronger film if Christopher Nolan had made a few cuts. If he had removed Anne Hathaway’s speech on the power of love, for instance, the same point would have come across in the action, but more subtly, assuming that the rest of the story justified its inclusion in the first place. (Of course, every film that Nolan has ever made strives valiantly to strike a balance between action and exposition, and in this case, it stumbled a little in the wrong direction. Interstellar is so openly indebted to 2001 that I wish it had taken a cue from that movie’s script, in which Kubrick and Clarke made the right strategic choice by minimizing the human element wherever possible.) What makes the Chanel rule so powerful is that when you glance in the mirror on your way out the door, what catches your eye first is likely to be the largest, flashiest, or most obvious component, which often adds the most by its subtraction. It’s the accessory that explains too much, or draws attention to itself, rather than complementing the whole, and by removing it, we’re consciously saying no to what the mind initially suggests. As Chanel is often quoted as saying: “Elegance is refusal.” And she was right—even if it was really Diana Vreeland who said it. 

The prankster principle

leave a comment »

Totoro in Toy Story 3

In an interview with McKinsey Quarterly, Ed Catmull of Pixar was recently asked: “How do you, as the leader of a company, simultaneously create a culture of doubt—of being open to careful, systematic introspection—and inspire confidence?” He replied:

The fundamental tension [at Pixar] is that people want clear leadership, but what we’re doing is inherently messy. We know, intellectually, that if we want to do something new, there will be some unpredictable problems. But if it gets too messy, it actually does fall apart. And adhering to the pure, original plan falls apart, too, because it doesn’t represent reality. So you are always in this balance between clear leadership and chaos; in fact that’s where you’re supposed to be. Rather than thinking, “Okay, my job is to prevent or avoid all the messes,” I just try to say, “well, let’s make sure it doesn’t get too messy.”

Which sounds a lot like the observation from the scientist Max Delbrück that I never tire of quoting: “If you’re too sloppy, then you never get reproducible results, and then you never can draw any conclusions; but if you are just a little sloppy, then when you see something startling, you [can] nail it down…I called it the ‘Principle of Limited Sloppiness.’”

Most artists are aware that creativity requires a certain degree of controlled messiness, and scientists—or artists who work in fields where science and technology play a central role, as they do at Pixar—seem to be particularly conscious of this. As the zoologist John Zachary Young said:

Each individual uses the store of randomness, with which he was born, to build during his life rules which are useful and can be passed on…We might therefore take as our general picture of the universe a system of continuity in which there are two elements, randomness and organization, disorder and order, if you like, alternating with one another in such a fashion as to maintain continuity.

I suspect that scientists feel compelled to articulate this point so explicitly because there are so many other factors that discourage it in the pursuit of ordinary research. Order, cleanliness, and control are regarded as scientific virtues, and for good reason, which makes it all the more important to introduce a few elements of disorder in a systematic way. Or, failing that, to acknowledge the usefulness of disorder and to tolerate it to a certain extent.

Werner Herzog Eats His Shoe

When you’re working by yourself, you find that both your headspace and your workspace tend to arrive at whatever level of messiness works best for you. On any given day, the degree of clutter in my office is more or less the same, with occasional deviations toward greater or lesser neatness: it’s a nest that I’ve feathered into a comfortable setting for productivity—or inactivity, which often amounts to the same thing. It’s trickier when different personalities have to work together. What sets Pixar apart is its ability to preserve that healthy alternation between order and disorder, while still releasing a blockbuster movie every year. It does this, in part, by limiting the number of feature films that it has in production at any one time, and by building in systems for feedback and deconstruction, with an environment that encourages artists to start again from scratch. There’s also a tradition of prankishness that the company has tried to preserve. As Catmull says:

For example, when we were building Pixar, the people at the time played a lot of practical jokes on each other, and they loved that. They think it’s awesome when there are practical jokes and people do things that are wild and crazy…Without intending to, the culture slowly shifts. How do you keep the shift from happening? I can’t go out and say, “Okay, we’re going to organize some wild and crazy activities.” Top-down organizing of spontaneous activities isn’t a good idea.

It’s hard to scale up a culture of practical jokes, and Pixar has faced the same challenges here as elsewhere. The mixed outcomes of Brave and, to some extent, The Good Dinosaur show that the studio isn’t infallible, and a creative process that depends on a movie sucking for three out of four years can run into trouble when you shift that timeline. But the fact that Pixar places so much importance on this kind of prankishness is revealing in itself. It arises in large part from its roots in the movies, which have been faced with the problem of maintaining messiness in the face of big industrial pressures almost from the beginning. (Orson Welles spoke of “the orderly disorder” that emerges from the need to make quick decisions while moving large amounts of people and equipment, and Stanley Kubrick was constantly on the lookout for collaborators like Ken Adam who would allow him to be similarly spontaneous.) There’s a long tradition of pranks on movie sets, shading imperceptibly from the gags we associate with the likes of George Clooney to the borderline insane tactics that Werner Herzog uses to keep that sense of danger alive. The danger, as Herzog is careful to assure us, is more apparent than real, and it’s more a way of fruitfully disordering what might otherwise become safe and predictable. But just by the right amount. As the artist Frank Stella has said of his own work: “I disorder it a little bit or, I should say, I reorder it. I wouldn’t be so presumptuous to claim that I had the ability to disorder it. I wish I did.”

“Open the bomb bay doors, please, Ken…”

leave a comment »

Slim Pickens in Dr. Strangelove

After the legendary production designer Ken Adam died last week, I found myself browsing through the book Ken Adam: The Art of Production Design, a wonderfully detailed series of interviews that he conducted with the cultural historian Christopher Frayling. It’s full of great stories, but the one I found myself pondering the most is from the making of Dr. Strangelove. Stanley Kubrick had just cast Slim Pickens in the role of Major Kong, the pilot of the B-52 bomber that inadvertently ends up triggering the end of the world, and it led the director to a sudden brainstorm. Here’s how Adam tells it:

[The bomber set] didn’t have practical bomb doors—we didn’t need them in the script at that time—and the set was almost ready to shoot. And Stanley said, “We need practical bomb doors.” He wanted this Texan cowboy to ride the bomb like a bronco into the Russian missile site. I did some setups, sketches for the whole thing, and Stanley asked me when it would be ready. I said, “If I work three crews twenty-four hours a day, you still won’t have it for at least a week, and that’s too late.” So now I arrive at Shepperton and I’m having kittens because I knew it was a fantastic idea but physically, mechanically, we couldn’t get it done. So again it was Wally Veevers, our special effects man, who saved the day, saying he’d sleep on it and come up with an idea. He always did that, even though he was having heart problems and wasn’t well. Wally came back and said, “We’re going to take a ten-by-eight still of the bomb bay interior, cut out the bomb-door opening, and shoot the bomb coming down against blue backing.” And that’s the way they did it.

I love this story for a lot of reasons. The first is the rare opportunity it affords to follow Kubrick’s train of thought. He had cast Peter Sellers, who was already playing three other lead roles, as Major Kong, but the performance wasn’t working, and when Sellers injured his ankle, Kubrick used this as an excuse to bring in another actor. Slim Pickens brought his own aura of associations, leading Kubrick to the movie’s single most memorable image, which now seems all but inevitable. And he seemed confident that any practical difficulties could be overcome. As Adam says elsewhere:

[Kubrick] had this famous theory in those days that the director had the right to change his mind up until the moment the cameras started turning. But he changed his mind after the cameras were rolling! For me, it was enormously demanding, because until then I was basically a pretty organized person. But I wasn’t yet flexible enough to meet these sometimes impossible demands that he came up with. So I was going through an anxiety crisis. But at the same time I knew that every time he changed his mind, he came up with a brilliant idea. So I knew I had to meet his demands in some way, even if it seemed impossible from a practical point of view.

Which just serves as a reminder that for Kubrick, who is so often characterized as the most meticulous and obsessive of directors, an intense level of preparation existed primarily to enable those moments in which the plan could be thrown away—a point that even his admirers often overlook.

Design by Ken Adam for Dr. Strangelove

It’s also obvious that Kubrick couldn’t have done any of this if he hadn’t surrounded himself with brilliant collaborators, and his reliance on Adam testifies to his belief that he had found someone who could translate his ideas into reality. (He tried and failed to get Adam to work with him on 2001, and the two reunited for Barry Lyndon, for which Adam deservedly won an Oscar.) We don’t tend to think of Dr. Strangelove as a movie that solved enormous technical problems in the way that some of Kubrick’s other projects did, but like any film, it presented obstacles that most viewers will never notice. Creating the huge maps in the war room, for instance, required a thousand hundred-watt bulbs installed behind perspex, along with an improvised air-conditioning system to prevent the heat from blistering the transparencies. Like the bomb bay doors, it’s the sort of issue that would probably be solved today with digital effects, but the need to address it on the set contributes to the air of authenticity that the story demands. Dr. Strangelove wouldn’t be nearly as funny if its insanities weren’t set against a backdrop of painstaking realism. Major Kong is a loving caricature, but the bomber he flies isn’t: it was reconstructed down to the tiniest detail from photos in aeronautical magazines. And there’s a sense in which Kubrick, like Christopher Nolan, embraced big logistical challenges as a way to combat a tendency to live in his own head—which is the one thing that these two directors, who are so often mentioned together, really do have in common.

There’s also no question that this was hard on Ken Adam, who was driven to something close to a nervous breakdown during the filming of Barry Lyndon. He says:

I became so neurotic that I bore all of Stanley’s crazy decisions on my own shoulders. I was always apologizing to actors for something that had gone wrong. I felt responsible for every detail of Stanley’s film, for all his mistakes and neuroses. I was apologizing to actors for Stanley’s unreasonable demands.

In Frayling’s words, Adam was “the man in the middle, with a vengeance.” And if he ended up acting as the ambassador, self-appointed or otherwise, between Kubrick and the cast and crew, it isn’t hard to see why: the production designer, then as now, provides the primary interface between the vision on the page—or in the director’s head—and its realization as something that can be captured on film. It’s a role that deserves all the more respect at a time when physical sets are increasingly being replaced by digital environments that live somewhere on a hard drive at Weta Digital. A director is not a designer, and even Adam says that Kubrick “didn’t know how to design,” although he also states that the latter could have taken over any number of the other technical departments. (This wasn’t just flattery, either. Years later, Adam would call Kubrick, in secret, to help him light the enormous supertanker set for The Spy Who Loved Me.) A director has to be good at many things, but it all emerges from a willingness to confront the problems that arise where the perfect collides with the possible. And it’s to the lasting credit of both Kubrick and Adam that they never flinched from that single combat, toe to toe with reality.

My ten great movies #4: The Shining

with 3 comments

For most of the past decade, the Kubrick film on this list would have been Eyes Wide Shut, and while my love for that movie remains undiminished—I think it’s Kubrick’s most humane and emotionally complex work, and endlessly inventive in ways that most viewers tend to underestimate—it’s clear now that The Shining is more central to my experience of the movies. The crucial factor, perhaps unsurprisingly, was my decision to become a writer. Because while there have been a lot of movies about novelists, The Shining is by far our greatest storehouse of images about the inside of a writer’s head. The huge, echoing halls of the Overlook are as good a metaphor as I’ve ever seen for writer’s block or creative standstill, and there isn’t a writer who hasn’t looked at a pile of manuscript and wondered, deep down, if it isn’t basically the same as the stack of pages that Jack Torrance lovingly ruffles in his climactic scene with Wendy.

The visual, aural, and visceral experience of The Shining is so overwhelming that there’s no need to describe it here. Instead, I’d like to talk about the performances, which are the richest that Kubrick—often underrated in his handling of actors—ever managed to elicit. (Full credit should also be given to Stephen King’s original novel, to which the movie is more indebted than is generally acknowledged.) At one point, I thought that the film’s only major flaw is that it was impossible to imagine Jack Nicholson and Shelley Duvall as a married couple, but I’m no longer sure about this: there are marriages this strange and mismatched, and the glimpses of their relationship early on are depressingly plausible. Duvall gives what is simply one of the major female performances in the history of movies, and as David Thomson was among the first to point out, Nicholson is great when he plays crazy, but he’s also strangely tender in his few quiet scenes with his son. “You’ve always been the caretaker,” Grady’s ghost tells Torrance, and his personality suffuses every frame of this incredible labyrinth.

Tomorrow: A masterpiece in six weeks.


Altered states of conscientiousness

with 2 comments

Bob Dylan in Don't Look Back

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What pop culture is best consumed in an altered state?”

When Bob Dylan first met the Beatles, the story goes, he was astonished to learn that they’d never used drugs. (Apparently, the confusion was all caused by a mondegreen: Dylan misheard a crucial lyric from “I Want to Hold Your Hand” as “I get high” instead of “I can’t hide.”) This was back in the early days, of course, and later, the Beatles would become part of the psychedelic culture in ways that can’t be separated from their greatest achievements. Still, it’s revealing that their initial triumphs emerged from a period of clean living. Drugs can encourage certain qualities, but musicianship and disciplined invention aren’t among them, and I find it hard to believe that Lennon and McCartney would have gained much, if anything, from controlled substances without that essential foundation—certainly not to the point where Dylan would have wanted to meet them in the first place. For artists, drugs are a kind of force multiplier, an ingredient that can enhance elements that are already there, but can’t generate something from nothing. As Norman Mailer, who was notably ambivalent about his own drug use, liked to say, drugs are a way of borrowing on the future, but those seeds can wither and die if they don’t fall on soil that has been prepared beforehand.

Over the years, I’ve read a lot written by or about figures in the drug culture, from Carlos Castaneda to Daniel Pinchbeck to The Electric Kool-Aid Acid Test, and I’m struck by a common pattern: if drugs lead to a state of perceived insight, it usually takes the form of little more than a conviction that everyone should try drugs. Drug use has been a transformative experience for exceptional individuals as different as Aldous Huxley, Robert Anton Wilson, and Steve Jobs, but it tends to be private, subjective, and uncommunicable. As such, it doesn’t have much to do with art, which is founded on its functional objectivity—that is, on its capacity to be conveyed more or less intact from one mind to the next. And it creates a lack of critical discrimination that can be dangerous to artists when extended over time. If marijuana, as South Park memorably pointed out, makes you fine with being bored, it’s the last thing artists need, since art boils down to nothing but a series of deliberate strategies for dealing with, confronting, or eradicating boredom. When you’re high, you’re easily amused, which makes you less likely to produce anything that can sustain the interest of someone who isn’t in the same state of chemical receptivity.

2001: A Space Odyssey

And the same principle applies to the artistic experience from the opposite direction. When someone says that 2001 is better on pot, that isn’t saying much, since every movie seems better on pot. Again, however, this has a way of smoothing out and trivializing a movie’s real merits. Kubrick’s film comes as close as any ever made to encouraging a transcendent state without the need of mind-altering substances, and his own thoughts on the subject are worth remembering:

[Drug use] tranquilizes the creative personality, which thrives on conflict and on the clash and ferment of ideas…One of the things that’s turned me against LSD is that all the people I know who use it have a peculiar inability to distinguish between things that are really interesting and stimulating and things that appear so in the state of universal bliss the drug induces on a good trip. They seem to completely lose their critical faculties and disengage themselves from some of the most stimulating areas of life.

Which isn’t to say that a temporary relaxation of the faculties doesn’t have its place. I’ll often have a beer while watching a movie or television show, and my philosophy here is similar to that of chef David Chang, who explains his preference for “the lightest, crappiest beer”:

Let me make one ironclad argument for shitty beer: It pairs really well with food. All food. Think about how well champagne pairs with almost anything. Champagne is not a flavor bomb! It’s bubbly and has a little hint of acid and is cool and crisp and refreshing. Cheap beer is, no joke, the champagne of beers.

And a Miller Lite—which I’m not embarrassed to proclaim as my beer of choice—pairs well with almost any kind of entertainment, since it both gives and demands so little. At minimum, it makes me the tiniest bit more receptive to whatever I’m being shown, not enough to forgive its flaws, but enough to encourage me to meet it halfway. For much the same reason, I no longer drink while working: even that little extra nudge can be fatal when it comes to evaluating whether something I’ve written is any good. Because Kubrick, as usual, deserves the last word: “Perhaps when everything is beautiful, nothing is beautiful.”


The adaptation game

with 2 comments

Nicole Kidman in Eyes Wide Shut

Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s question: “Have you ever had a movie (or other media) experience enhanced by a lack of familiarity with the source material?”

There was a time in my life when I took it as an article of faith that if I wanted to see a movie based on a novel, I had to read the book first. When I was working as a film critic in college, this policy made sense—I wanted my reviews to seem reasonably informed—so I devoured the likes of Harry Potter and the Sorcerer’s Stone and Bridget Jones’s Diary mere days before seeing their adaptations in theaters. Later, I tackled the original material out of a vague sense of guilt or obligation, as I did with Watchmen, a comparison that did Zack Snyder’s movie version no favors. In almost every instance, though, it meant that I watched the resulting film through a kind of double exposure, constantly comparing the events on screen with their equivalents, or lack thereof, on the page. It’s how I imagine fans of Twilight or The Hunger Games regard the adaptations of their own favorite properties, the quality of which is often judged by how faithfully they follow their sources. And it wasn’t until recently that I gave up on the idea of trying to read every book before seeing the movie, in part because I have less free time, but also because my attitudes toward the issue have changed, hopefully for the better.

In fact, I’d like to propose a general rule: the priority of one version of a story over another is a fact, not a value judgment. This applies to remakes and homages as much as to more straightforward adaptations. After enough time has passed, the various approaches that different artists take to the same underlying narrative cease to feel like points on a timeline and come to seem more like elements of a shared constellation of ideas. I saw The Silence of the Lambs long before reading Thomas Harris’s original novels, later added Manhunter to the mix, and have been having a hell of a good time going back to the books with the cast of Hannibal in mind. I don’t know how I’d feel about these characters and stories if I’d read each book as it came out and watched the adaptations later, but I’d like to think that I’d have ended up in more or less the same place, with each element sustaining and enriching every other. The same is true of a movie like L.A. Confidential, which is less a faithful translation of the book into film than a rearrangement of the pieces that James Ellroy provided, an “alternate life,” as the author himself puts it, for the men and women he had imagined. Would I feel the same way if I’d read the book first? Maybe—but only if enough time had passed to allow me to regard the movie in its own right.

Anthony Hopkins and Jodie Foster in The Silence of the Lambs

Ultimately, I’ve come to think that out of all the ways of experiencing works of art with a common origin, the best option is to absorb them all, but to leave sufficient space between each encounter. I watched Infernal Affairs long before The Departed, but the earlier movie had faded almost entirely when I saw the remake, and now I find that I can switch back and forth between the two films in full appreciation of each one’s strengths. (The Departed is less a remake than an expansion of the tightly focused original: its bones are startlingly similar, but fleshed out with an hour’s worth of digressions and elaborations, all of which I love.) Occasionally, of course, the memory of one version is so strong that its alternate incarnations can’t compete, and this doesn’t always work to the benefit of the original. A few years ago, I tried to read Mario Puzo’s The Godfather for the first time, and I found that I just couldn’t finish it: Coppola’s movie is remarkably faithful, while elevating the material in almost every way, to the point where the novel itself seems oddly superfluous. This isn’t the case with The Silence of the Lambs, which I’m reading again now for maybe the tenth time with undiminished delight, but it’s a reminder of how unpredictable the relationship between the source and its adaptation can be.

And in retrospect, I’m grateful that I experienced certain works of art without any knowledge of the originals. I’ve enjoyed movies as different as The Name of the Rose and Lolita largely because I didn’t have a point of reference: the former because I didn’t know how much I was missing, the latter because I realized only later how much it owed to the book. And if you have the patience, it can be rewarding to delay the moment of comparison for as long as possible. I’ve loved Eyes Wide Shut ever since its initial release, fifteen years ago, when I saw it twice in a single day. A few months ago, I finally got around to reading Arthur Schnitzler’s Traumnovelle, and I was struck by the extent to which Kubrick’s movie is nearly a point-for-point adaptation. (The only real interpolation is the character of Ziegler, played by Sydney Pollack, who looms in the memory like a significant figure, even though he only appears in a couple of scenes.) Kubrick was famously secretive about his movie’s plot, and having read the novel, I can see why: faithful or not, he wanted it to be seen free of expectations—although I have a hunch that the film might have been received a little more warmly if viewers had been given a chance to acclimate themselves to its origins. But that doesn’t make him wrong. Stories have to rise or fall on their own terms, and when it comes to evaluating how well a movie works, a little knowledge can be a dangerous thing.

Left brain, right brain, samurai brain

leave a comment »

Seven Samurai

The idea that the brain can be neatly divided into its left and right hemispheres, one rational, the other intuitive, has been largely debunked, but that doesn’t make it any less useful as a metaphor. You could play an instructive game, for instance, by placing movie directors on a spectrum defined by, say, Kubrick and Altman as the quintessence of left-brained filmmaking and its right-brained opposite, and although such distinctions may be artificial, they can generate their own kind of insight. Christopher Nolan, for one, strikes me as a fundamentally left-brained director who makes a point of consciously willing himself into emotion. (Citing some of the cornier elements of Interstellar, the writer Ta-Nehisi Coates theorizes that they were imposed by the studio, but I think it’s more likely that they reflect Nolan’s own efforts, not always successful, to nudge the story into recognizably human places. He pulled it off beautifully in Inception, but it took him ten years to figure out how.) And just as Isaiah Berlin saw Tolstoy as a fox who wanted to be a hedgehog, many of the recent films of Wong Kar-Wai feel like the work of a right-brained director trying to convince himself that the left hemisphere is where he belongs.

Of all my favorite directors, the one who most consistently hits the perfect balance between the two is Akira Kurosawa. I got to thinking about this while reading the editor and teacher Richard D. Pepperman’s appealing new book Everything I Know About Filmmaking I Learned Watching Seven Samurai, which often reads like the ultimate tribute to Kurosawa’s left brain. It’s essentially a shot for shot commentary, cued up to the definitive Criterion Collection release, that takes us in real time through the countless meaningful decisions made by Kurosawa in the editing room: cuts, dissolves, wipes, the interaction between foreground and background, the use of music and sound, and the management of real and filmic space, all in service of story. It’s hard to imagine a better movie for a study like this, and with its generous selection of stills, the book is a delight to browse through—it reminds me a little of Richard J. Anobile’s old photonovels, which in the days before home video provided the most convenient way of revisiting Casablanca or The Wrath of Khan. I’ve spoken before of the film editor as a kind of Apollonian figure, balancing out the Dionysian personality of the director on the set, and this rarely feels so clear as it does here, even, or especially, when the two halves are united in a single man.

Seven Samurai

As for Kurosawa’s right brain, the most eloquent description I’ve found appears in Donald Richie’s The Films of Akira Kurosawa, which is still the best book of its kind ever written. In his own discussion of Seven Samurai, Richie speaks of “the irrational rightness of an apparently gratuitous image in its proper place,” and continues:

Part of the beauty of such scenes…is just that they are “thrown away” as it were, that they have no place, that they do not ostensibly contribute, that they even constitute what has been called bad filmmaking. It is not the beauty of these unexpected images, however, that captivates…but their mystery. They must remain unexplained. It has been said that after a film is over all that remains are a few scattered images, and if they remain then the film was memorable…Further, if one remembers carefully one finds that it is only the uneconomical, mysterious images which remain…

Kurosawa’s films are so rigorous and, at the same time, so closely reasoned, that little scenes such as this appeal with the direct simplicity of water in the desert…[and] in no other single film are there as many as in Seven Samurai.

What one remembers best from this superbly economical film then are those scenes which seem most uneconomical—that is, those which apparently add nothing to it.

Richie goes on to list several examples: the old crone tottering forward to avenge the death of her son, the burning water wheel, and, most beautifully, the long fade to black before the final sequence of the villagers in the rice fields. My own favorite moment, though, occurs in the early scene when Kambei, the master samurai, rescues a little boy from a thief. In one of the greatest character introductions in movie history, Kambei shaves his head to disguise himself as a priest, asking only for two rice balls, which he’ll use to lure the thief out of the barn where the boy has been taken hostage. This information is conveyed in a short conversation between the farmers and the townspeople, who exit the frame—and after the briefest of pauses, a woman emerges from the house in the background, running directly toward the camera with the rice balls in hand, looking back for a frantic second at the barn. It’s the boy’s mother. There’s no particular reason to stage the scene like this; another director might have done it in two separate shots, if it had occurred to him to include it at all. Yet the way in which Kurosawa films it, with the crowd giving way to the mother’s isolated figure, is both formally elegant and strangely moving. It offers up a miniature world of story and emotion without a single cut, and like Kurosawa himself, it resists any attempt, including this one, to break it down into parts.
