Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Archive for the ‘Movies’ Category

Cutty Sark and the semicolon


Vladimir Nabokov

Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on December 22, 2015.

In an interview that was first published in The Paris Review, the novelist Herbert Gold asked Vladimir Nabokov if an editor had ever offered him any useful advice. This is what Nabokov said in response:

By “editor” I suppose you mean proofreader. Among these I have known limpid creatures of limitless tact and tenderness who would discuss with me a semicolon as if it were a point of honor—which, indeed, a point of art often is. But I have also come across a few pompous avuncular brutes who would attempt to “make suggestions” which I countered with a thunderous “stet!”

I’ve always adored that thunderous stet, which tells us so much about Nabokov and his imperious resistance to being edited by anybody. Today, however, I’m more interested in the previous sentence. A semicolon, as Nabokov puts it, can indeed be a point of honor. Nabokov was perhaps the most painstaking of all modern writers, and it’s no surprise that the same perfectionism that produced such conceptual and structural marvels as Lolita and Pale Fire would filter down to the smallest details. But I imagine that even ordinary authors can relate to how a single punctuation mark in a manuscript can start to loom as large as the finger of God on the Sistine Chapel ceiling.

And there’s something about the semicolon that seems to inspire tussles between writers and their editors—or at least allows it to stand as a useful symbol of the battles that can occur during the editorial process. Here’s an excerpt from a piece by Charles McGrath in The New York Times Magazine about the relationship between Robert Caro, author of The Years of Lyndon Johnson, and his longtime editor Robert Gottlieb:

“You know that insane old expression, ‘The quality of his defect is the defect of his quality,’ or something like that?” Gottlieb asked me. “That’s really true of Bob. What makes him such a genius of research and reliability is that everything is of exactly the same importance to him. The smallest thing is as consequential as the biggest. A semicolon matters as much as, I don’t know, whether Johnson was gay. But unfortunately, when it comes to English, I have those tendencies, too, and we could go to war over a semicolon. That’s as important to me as who voted for what law.”

It’s possible that the semicolon keeps cropping up in such stories because its inherent ambiguity lends itself to disagreement. As Kurt Vonnegut once wrote: “Here is a lesson in creative writing. First rule: Do not use semicolons. They are transvestite hermaphrodites representing absolutely nothing. All they do is show you’ve been to college.” And I’ve more or less eliminated semicolons from my own work for much the same reason.

Robert De Niro and Martin Scorsese on the set of Raging Bull

But the larger question here is why artists fixate on things that even the most attentive reader would pass over without noticing. On one level, you could take a fight over a semicolon as an illustration of the way that the creative act—in which the artist is immersed in the work for months on end—tends to turn molehills into mountains. Here’s one of my favorite stories about the making of Raging Bull:

One night, when the filmmakers were right up against the deadline to make their release date, they were working on a nothing little shot that takes place in a nightclub, where a minor character turns to the bartender and orders a Cutty Sark. “I can’t hear what he’s saying,” [Martin Scorsese] said. Fiddling ensued—extensive fiddling—without satisfying him. [Producer Irwin] Winkler, who was present, finally deemed one result good enough and pointed out that messengers were standing by to hand-carry release prints to the few theaters where the picture was about to premiere. At which point, Scorsese snapped. “I want my name taken off the picture,” he cried—which bespeaks his devotion to detail. It also bespeaks his exhaustion at the end of Raging Bull, not to mention the craziness that so often overtakes movies as they wind down. Needless to say, he was eventually placated. And you can more or less hear the line in the finished print.

And you could argue that this kind of microscopic attention is the only thing that can lead to a work that succeeds on the largest possible scale.

But there’s yet another story that gets closer to the truth. In Existential Errands, Norman Mailer describes a bad period in his life—shortly after he was jailed for stabbing his second wife, Adele—in which he found himself descending into alcoholism and unable to work. His only source of consolation was the scraps of paper, “little crossed communications from some wistful outpost of my mind,” that he would find in his jacket pocket after a drunken night. Mailer writes of these poems:

I would go to work, however, on my scraps of paper. They were all I had for work. I would rewrite them carefully, printing in longhand and ink, and I would spend hours whenever there was time going over these little poems…And since I wasn’t doing anything else very well in those days, I worked the poems over every chance I had. Sometimes a working day would go by, and I might put a space between two lines and remove a word. Maybe I was mending.

Which just reminds us that a seemingly minuscule change can be the result of a prolonged confrontation with the work as a whole. You can’t obsess over a semicolon without immersing yourself in the words around it, and there are times when you need such a focal point to structure your engagement with the rest. It’s a little like what is called a lakshya in yoga: the tiny spot on the body or in the mind on which you concentrate while meditating. In practice, the lakshya can be anything or nothing, but without it, your attention tends to drift. In art, it can be a semicolon, a word, or a line about Cutty Sark. It may not be much in itself. But when you need to tether yourself to something, even a semicolon can be a lifeline.

Falls the Shadow


Over the last year or so, I’ve found myself repeatedly struck by the parallels between the careers of John W. Campbell and Orson Welles. At first, the connection might seem tenuous. Campbell and Welles didn’t look anything alike, although they were about the same height, and their politics couldn’t have been more different—Welles was a staunch progressive and defender of civil rights, while Campbell, to put it mildly, wasn’t. Welles was a wanderer, while Campbell spent most of his life within driving distance of his birthplace in New Jersey. But they’re inextricably linked in my imagination. Welles was five years younger than Campbell, but they flourished at exactly the same time, with their careers peaking roughly between 1937 and 1942. Both owed significant creative breakthroughs to the work of H.G. Wells, who inspired Campbell’s story “Twilight” and Welles’s Mercury Theatre adaptation of The War of the Worlds. In 1938, Campbell saw Welles’s famous modern-dress production of Julius Caesar with the writer L. Sprague de Camp, of which he wrote in a letter:

It represented, in a way, what I’m trying to do in the magazine. Those humans of two thousand years ago thought and acted as we do—even if they did dress differently. Removing the funny clothes made them more real and understandable. I’m trying to get away from funny clothes and funny-looking people in the pictures of the magazine. And have more humans.

And I suspect that the performance started a train of thought in both men’s minds that led to de Camp’s novel Lest Darkness Fall, which is about a man from the present who ends up in ancient Rome.

Campbell was less pleased by Welles’s most notable venture into science fiction, which he must have seen as an incursion on his turf. He wrote to his friend Robert Swisher: “So far as sponsoring that War of [the] Worlds thing—I’m damn glad we didn’t! The thing is going to cost CBS money, what with suits, etc., and we’re better off without it.” In Astounding, he said that the ensuing panic demonstrated the need for “wider appreciation” of science fiction, in order to educate the public about what was and wasn’t real:

I have long been an exponent of the belief that, should interplanetary visitors actually arrive, no one could possibly convince the public of the fact. These stories wherein the fact is suddenly announced and widespread panic immediately ensues have always seemed to me highly improbable, simply because the average man did not seem ready to visualize and believe such a statement.

Undoubtedly, Mr. Orson Welles felt the same way.

Their most significant point of intersection was The Shadow, who was created by an advertising agency for Street & Smith, the publisher of Astounding, as a fictional narrator for the radio series Detective Story Hour. Before long, he became popular enough to star in his own stories. Welles, of course, voiced The Shadow from September 1937 to October 1938, and Campbell plotted some of the magazine installments in collaboration with the writer Walter B. Gibson and the editor John Nanovic, who worked in the office next door. And his identification with the character seems to have run even deeper. In a profile published in the February 1946 issue of Pic magazine, the reporter Dickson Hartwell wrote of Campbell: “You will find him voluble, friendly and personally depressing only in what his friends claim is a startling physical resemblance to The Shadow.”

It isn’t clear if Welles was aware of Campbell, although it would be more surprising if he wasn’t. Welles flitted around science fiction for years, and he occasionally crossed paths with other authors in that circle. To my lasting regret, he never met L. Ron Hubbard, which would have been an epic collision of bullshitters—although Philip Seymour Hoffman claimed that he based his performance in The Master mostly on Welles, and Theodore Sturgeon once said that Welles and Hubbard were the only men he had ever met who could make a room seem crowded simply by walking through the door. In 1946, Isaac Asimov received a call from a lawyer whose client wanted to buy all rights to his robot story “Evidence” for $250. When he asked Campbell for advice, the editor said that he thought it seemed fair, but Asimov’s wife told him to hold out for more. Asimov called back to ask for a thousand dollars, adding that he wouldn’t discuss it further until he found out who the client was. When the lawyer told him that it was Welles, Asimov agreed to the sale, delighted, but nothing ever came of it. (Welles also owned the story in perpetuity, making it impossible for Asimov to sell it elsewhere, a point that Campbell, who took a notoriously casual attitude toward rights, had neglected to raise.) Twenty years later, Welles made inquiries into the rights for Robert A. Heinlein’s The Puppet Masters, which were tied up at the time with Roger Corman, but he never followed up. And it’s worth noting that both stories are concerned with the problem of knowing whether other people are what they claim to be, which Campbell had brilliantly explored in “Who Goes There?” It’s a theme to which Welles obsessively returned, and it’s fascinating to speculate what he might have done with it if Howard Hawks and Christian Nyby hadn’t gotten there first with The Thing From Another World. Who knows what evil lurks in the hearts of men?

But their true affinities were spiritual ones. Both Campbell and Welles were child prodigies who reinvented an art form largely by being superb organizers of other people’s talents—although Campbell always downplayed his own contributions, while Welles appears to have done the opposite. Each had a spectacular early success followed by what was perceived as decades of decline, which they seem to have seen coming. (David Thomson writes: “As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent.” And you could say much the same thing about “Twilight.”) Both had a habit of abandoning projects as soon as they realized that they couldn’t control them, and they both managed to seem isolated while occupying the center of attention in any crowd. They enjoyed staking out unreasonable positions in conversation, just to get a rise out of listeners, and they ultimately drove away their most valuable collaborators. What Pauline Kael writes of Welles in “Raising Kane” is equally true of Campbell:

He lost the collaborative partnerships that he needed…He was alone, trying to be “Orson Welles,” though “Orson Welles” had stood for the activities of a group. But he needed the family to hold him together on a project and to take over for him when his energies became scattered. With them, he was a prodigy of accomplishments; without them, he flew apart, became disorderly.

Both men were alone when they died, and both filled their friends, admirers, and biographers with intensely mixed feelings. I’m still coming to terms with Campbell. But I have a hunch that I’ll end up somewhere close to Kael’s ambivalence toward Welles, who, at the end of an essay that was widely seen as puncturing his myth, could only conclude: “In a less confused world, his glory would be greater than his guilt.”

The imperious and tyrannical images


You can’t throw in images the way you throw in a fishhook, at random! These obedient images are, in a film constructed according to the dark and mysterious rules of the unconscious, necessary images, imperious and tyrannical images…It can be useful for a while to rediscover by methods that are unusual, excessive, arbitrary, methods that are primitive, direct, and stripped of nonessentials, polished to the bone, the laws of eternal poetry, but these laws are always the same, and the goal of poetry cannot be simply to play with the laws by which it is made…Just because with the help of psychoanalysis the rules of the game have become infinitely clear, and because the technique of poetry has revealed its secrets, the point is not to show that we are extraordinarily intelligent and that we now know how to go about it.

Antonin Artaud, in a letter to Jean Paulhan

Written by nevalalee

March 11, 2017 at 7:29 am

Who we are in the moment


Jordan Horowitz and Barry Jenkins

By now, you’re probably sick of hearing about what happened at the Oscars. I’m getting a little tired of it, too, even though it was possibly the strangest and most riveting two minutes I’ve ever seen on live television. It left me feeling sorry for everyone involved, but there are at least three bright spots. The first is that it’s going to make a great case study for somebody like Malcolm Gladwell, who is always looking for a showy anecdote to serve as a grabber opening for a book or article. So many different things had to go wrong for it to happen—on the levels of design, human error, and simple dumb luck—that you can use it to illustrate just about any point you like. A second silver lining is that it highlights the basically arbitrary nature of all such awards. As time passes, the list of Best Picture winners starts to look inevitable, as if Cimarron and Gandhi and Chariots of Fire had all been canonized by a comprehensible historical process. If anything, the cycle of inevitability is accelerating, so that within seconds of any win, the narratives are already locking into place. As soon as La La Land was announced as the winner, a story was emerging about how Hollywood always goes for the safe, predictable choice. The first thing that Dave Itzkoff, a very smart reporter, posted on the New York Times live chat was: “Of course.” Within a couple of minutes, however, that plot line had been yanked away and replaced with one for Moonlight. And the fact that the two versions were all but superimposed onscreen should warn us against reading too much into outcomes that could have gone any number of ways.

But what I want to keep in mind above all else is the example of La La Land producer Jordan Horowitz, who, at a moment of unbelievable pressure, simply said: “I’m going to be really proud to hand this to my friends from Moonlight.” It was the best thing that anybody could have uttered under those circumstances, and it tells us a lot about Horowitz himself. If you were going to design a psychological experiment to test a subject’s reaction under the most extreme conditions imaginable, it’s hard to think of a better one—although it might strike a grant committee as possibly too expensive. It takes what is undoubtedly one of the high points of someone’s life and instantly twists it into what, if not quite the worst moment, at least amounts to a savage correction. Everything that the participants onstage did or said, down to the facial expressions of those standing in the background, has been subjected to a level of scrutiny worthy of the Zapruder film. At the end of an event in which very little occurs that hasn’t been scripted or premeditated, a lot of people were called upon to figure out how to act in real time in front of an audience of hundreds of millions. It’s proverbial that nobody tells the truth in Hollywood, an industry that inspires insider accounts with titles like Hello, He Lied and Which Lie Did I Tell? A mix-up like the one at the Oscars might have been expressly conceived as a stress test to bring out everyone’s true colors. Yet Horowitz said what he did. And I suspect that it will do more for his career than even an outright win would have accomplished.

Kellyanne Conway

It also reminds me of other instances over the last year in which we’ve learned exactly what someone thinks. When we get in trouble for a remark picked up on a hot mike, we often say that it doesn’t reflect who we really are—which is just another way of stating that it doesn’t live up to the versions of ourselves that we create for public consumption. It’s far crueler, but also more convincing, to argue that it’s exactly in those unguarded, unscripted moments that our true selves emerge. (Freud, whose intuition on such matters was uncanny, was onto something when he focused on verbal mistakes and slips of the tongue.) The justifications that we use are equally revealing. Maybe we dismiss it as “locker room talk,” even if it didn’t take place anywhere near a locker room. Kellyanne Conway excused her reference to the nonexistent Bowling Green Massacre by saying “I misspoke one word,” even though she misspoke it on three separate occasions. It doesn’t even need to be something said on the spur of the moment. At his confirmation hearing for the position of ambassador to Israel, David M. Friedman apologized for an opinion piece he had written before the election: “These were hurtful words, and I deeply regret them. They’re not reflective of my nature or my character.” Friedman also said that “the inflammatory rhetoric that accompanied the presidential campaign is entirely over,” as if it were an impersonal force that briefly took possession of its users and then departed. We ask to be judged on our most composed selves, not the ones that we reveal at our worst.

To some extent, that’s a reasonable request. I’ve said things in public and in private that I’ve regretted, and I wouldn’t want to be judged solely on my worst moments as a writer or parent. At a time when a life can be ruined by a single tweet, it’s often best to err on the side of forgiveness, especially when there’s any chance of misinterpretation. But there’s also a place for common sense. You don’t refer to an event as a “massacre” unless you really think of it that way or want to encourage others to do so. And we judge our public figures by what they say when they think that nobody is listening, or when they let their guard down. It might seem like an impossibly high standard, but it’s also the one that’s effectively applied in practice. You can respond by becoming inhumanly disciplined, like Obama, who in a decade of public life has said maybe five things he has reason to regret. Or you can react like Trump, who says five regrettable things every day and trusts that their sheer volume will reduce them to a kind of background noise—which has awakened us, as Trump has in so many other ways, to a political option that we didn’t even know existed. Both strategies are exhausting, and most of us don’t have the energy to pursue either path. Instead, we’re left with the practical solution of cultivating the inner voice that, as I wrote last week, allows us to act instinctively. Kant writes: “Live your life as though your every act were to become a universal law.” Which is another way of saying that we should strive to be the best version of ourselves at all times. It’s probably impossible. But it’s easier than wearing a mask.

Written by nevalalee

February 28, 2017 at 9:00 am

One breath, one blink


Gene Hackman in The Conversation

A few weeks ago, my wife, who is a professional podcaster, introduced me to the concept of the “breath” in audio editing. When you’re putting together an episode, you often find yourself condensing an interview or splicing together two segments, and you can run into trouble when those edits interfere with the speaker’s natural breathing rhythms. As an excellent tutorial from NPR explains it:

Breaths are a problem when they are upcut or clipped. An upcut breath is one that is edited so it’s incomplete (or “chopped”)—only the first or last part is audible…Missing breaths are just that—breaths that have been removed or silenced. They sound unnatural and can cause some listeners to feel tense…Breaths are also problematic when they don’t match the cadence of the speech (i.e. a short, quick breath appears in the middle of a slower passage)…

When editing breaths, listen closely to the beginning and end. If replacing a breath, choose one that matches the cadence and tone of the words around it.

For example, a short, quick breath is useful during an interruption or an excited, quick-paced reply. A longer breath is appropriate for a relaxed, measured response…As a rule of thumb, do not remove breaths—it sounds unnatural.

As I read this, I grew particularly interested in the idea that a poorly edited breath can make the listener feel anxious without knowing it, which reminded me of what the film editor Walter Murch says in his book In the Blink of an Eye. Murch writes that when he was editing Francis Ford Coppola’s The Conversation, he noticed that Harry Caul, the character played by Gene Hackman, would frequently blink around the point where Murch had decided to make a cut. “It was interesting,” Murch says, “but I didn’t know what to make of it.” Then he happened to read an interview with the director John Huston that shed an unexpected light on the subject:

To me, the perfect film is as though it were unwinding behind your eyes…Look at that lamp across the room. Now look back at me. Look back at that lamp. Now look back at me again. Do you see what you did? You blinked. Those are cuts. After the first look, you know that there’s no reason to pan continuously from me to the lamp because you know what’s in between. Your mind cut the scene. First you behold the lamp. Cut. Then you behold me.

Murch was fascinated by this, and he began to pay closer attention to blinking’s relationship to emotional or cognitive states. He concluded that blinks tend to occur at instants in which an internal separation of thought has taken place, either to help it along or as an involuntary reflex that coincides with a moment of transition.

Walter Murch

As Murch writes: “Start a conversation with somebody and watch when they blink. I believe you will find that your listener will blink at the precise moment he or she ‘gets’ the idea of what you are saying, not an instant earlier or later…And that blink will occur where a cut could have happened, had the conversation been filmed.” This doesn’t necessarily mean that an editor should worry about when the actors are blinking, but that if he or she is making the cut in the right spot, as a kind of visual punctuation, the blinks and the cuts will coincide anyway. Apart from Murch’s anecdotal observations, I don’t know if this phenomenon has ever been studied in detail, but it’s intriguing. It’s also evident that breathing in audio and blinking in film are two aspects of the same thing. Both are physiological phenomena, but they’re also connected with cognition in profound ways, especially when we’re trying to communicate with others. When we’re talking to someone else, we don’t stop to breathe in arbitrary places, but at moments when the sense of what we’re saying has reached a natural break. Hence the function of the comma, which is a visual marker that sets apart clauses or units of information on the page, as well as a vestigial trace of the pause that would have occurred in conversation, even if we don’t stop when we’re reading it silently to ourselves. And I’ve spoken elsewhere of the relationship between breathing and the length of sentences or lines of poetry, in which the need to breathe is inseparable from the necessity of pausing for consolidation or comprehension.

What makes these issues important to editors is that they’re essentially playing a confidence trick. They’re trying to create an impression of continuity while assembling many discrete pieces, and if they fail to honor the logic of the breath or the blink, the listener or viewer will subconsciously sense it. This is the definition of a thankless task, because you’ll never notice it when it works, and when it doesn’t, you probably won’t even be able to articulate the problem. I suspect that the uneasiness produced by a badly edited stretch of audio or film comes from the rhythms of one’s own body falling out of sync with the story: when a work of art is flowing properly, we naturally adjust ourselves to its rhythms, and a dropped or doubled breath can shake us out of that sense of harmony. After a while, addressing this becomes a matter of instinct, and a skilled editor will unconsciously take these factors into account, much as an author eventually learns to write smoothly without worrying about it too much. We only become aware of it when something feels wrong. (It’s also worth paying close attention to it during the revision phase. The NPR tutorial notes that problems with breaths can occur when the editor tries to “nickel and dime” an interview to make it fit within a certain length. And when James Cameron tried to cut Terminator 2 down to its contractual length by removing just a single frame per second from the whole movie, he found that the result was unwatchable.) When we’re awake, no matter what else we might be doing, we’re breathing and blinking. And it’s a testament to the challenges that editors face that they can’t even take breathing for granted.

Written by nevalalee

February 14, 2017 at 9:08 am


The temple of doom


Steven Spielberg on the set of Indiana Jones and the Temple of Doom

I think America is going through a paroxysm of rage…But I think there’s going to be a happy ending in November.

—Steven Spielberg, to Sky News, July 17, 2016

Last month, Steven Spielberg celebrated his seventieth birthday. Just a few weeks later, Yale University Press released Steven Spielberg: A Life in Films by the critic Molly Haskell, which has received a surprising amount of attention for a relatively slender book from an academic publisher, including a long consideration by David Denby in The New Yorker. I haven’t read Haskell’s book, but it seems likely that its reception is partially a question of good timing. We’re in the mood to talk about Spielberg, and not just because of his merits as a filmmaker or the fact that he’s entering the final phase of his career. Spielberg, it’s fair to say, is the most quintessentially American of all directors, despite a filmography that ranges freely between cultures and seems equally comfortable in the past and in the future. He’s often called a mythmaker, and if there’s a place where his glossy period pieces, suburban landscapes, and visionary adventures meet, it’s somewhere in the nation’s collective unconscious: its secret reveries of what it used to be, what it is, and what it might be again. Spielberg country, as Stranger Things was determined to remind us, is one of small towns and kids on bikes, but it also still vividly remembers how it beat the Nazis, and it can’t keep from turning John Hammond from a calculating billionaire into a grandfatherly, harmless dreamer. No other artist of the last half century has done so much to shape how we feel about ourselves. He took over where Walt Disney left off. But what has he really done?

To put it in the harshest possible terms, it’s worth asking whether Spielberg—whose personal politics are impeccably liberal—is responsible in part for our current predicament. He taught the New Hollywood how to make movies that force audiences to feel without asking them to think, that encourage an illusion of empathy instead of the real thing, and that create happy endings that confirm viewers in their complacency. You can’t appeal to all four quadrants, as Spielberg did to a greater extent than anyone who has ever lived, without consistently telling people exactly what they want to hear. I’ve spoken elsewhere of how film serves as an exercise ground for the emotions, bringing us closer on a regular basis to the terror, wonder, and despair that many of us would otherwise experience only rarely. It reminds the middle class of what it means to feel pain or awe. But I worry that when we discharge these feelings at the movies, it reduces our capacity to experience them in real life, or, even more insidiously, makes us think that we’re more empathetic and compassionate than we actually are. Few movies have made viewers cry as much as E.T., and few have presented a dilemma further removed from anything a real person is likely to face. (Turn E.T. into an illegal alien being sheltered from a government agency, maybe, and you’d be onto something.) Nearly every film from the first half of Spielberg’s career can be taken as a metaphor for something else. But great popular entertainment has a way of referring to nothing but itself, in a cognitive bridge to nowhere, and his images are so overwhelming that it can seem superfluous to give them any larger meaning.

Steven Spielberg on the set of Jaws

If Spielberg had been content to be nothing but a propagandist, he would have been the greatest one who ever lived. (Hence, perhaps, his queasy fascination with the films of Leni Riefenstahl, who has affinities with Spielberg that make nonsense out of political or religious labels.) Instead, he grew into something that is much harder to define. Jaws, his second film, became the most successful movie ever made, and when he followed it up with Close Encounters, it became obvious that he was in a position with few parallels in the history of art—he occupied a central place in the culture and was also one of its most advanced craftsmen, at a younger age than Damien Chazelle is now. If you’re talented enough to assume that role and smart enough to stay there, your work will inevitably be put to uses that you never could have anticipated. It’s possible to pull clips from Spielberg’s films that make him seem like the cuddliest, most repellent reactionary imaginable, of the sort that once prompted Tony Kushner to say:

Steven Spielberg is apparently a Democrat. He just gave a big party for Bill Clinton. I guess that means he’s probably idiotic…Jurassic Park is sublimely good, hideously reactionary art. E.T. and Close Encounters of the Third Kind are the flagship aesthetic statements of Reaganism. They’re fascinating for that reason, because Spielberg is somebody who has just an astonishing ear for the rumblings of reaction, and he just goes right for it and he knows exactly what to do with it.

Kushner, of course, later became Spielberg’s most devoted screenwriter. And the total transformation of the leading playwright of his generation is the greatest testament imaginable to this director’s uncanny power and importance.

In reality, Spielberg has always been more interesting than he had any right to be, and if his movies have been used to shake people up in the dark while numbing them in other ways, or to confirm the received notions of those who are nostalgic for an America that never existed, it’s hard to conceive of a director of his stature for whom this wouldn’t have been the case. To his credit, Spielberg clearly grasps the uniqueness of his position, and he has done what he could with it, in ways that can seem overly studied. For the last two decades, he has worked hard to challenge some of our assumptions, and at least one of his efforts, Munich, is a masterpiece. But if I’m honest, the film that I find myself thinking about the most is Indiana Jones and the Temple of Doom. It isn’t my favorite Indiana Jones movie—I’d rank it a distant third. For long stretches, it isn’t even all that good. It also trades in the kind of casual racial stereotyping that would be unthinkable today, and it isn’t any more excusable because it deliberately harks back to the conventions of an earlier era. (The fact that it’s even watchable now only indicates how much ground East and South Asians have yet to cover.) But its best scenes are so exciting, so wonderful, and so conducive to dreams that I’ve never gotten over it. Spielberg himself was never particularly pleased with the result, and if asked, he might express discomfort with some of the decisions he made. But there’s no greater tribute to his artistry, which executed that misguided project with such unthinking skill that he exhilarated us almost against his better judgment. It tells us how dangerous he might have been if he hadn’t been so deeply humane. And we should count ourselves lucky that he turned out to be as good a man as he did, because we’d never have known if he hadn’t.
