Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘David Thomson’

The greatest trick


In the essay collection Candor and Perversion, the late critic Roger Shattuck writes: “The world scoffs at old ideas. It distrusts new ideas. It loves tricks.” He never explains what he means by “trick,” but toward the end of the book, in a chapter on Marcel Duchamp, he quotes a few lines from the poet Charles Baudelaire from the unpublished preface to Flowers of Evil:

Does one show to a now giddy, now indifferent public the working of one’s devices? Does one explain all the revision and improvised variations, right down to the way one’s sincerest impulses are mixed in with tricks and with the charlatanism indispensable to the work’s amalgamation?

Baudelaire is indulging here in an analogy from the theater—he speaks elsewhere of “the dresser’s and the decorator’s studio,” “the actor’s box,” and “the wrecks, makeup, pulleys, chains.” A trick, in this sense, is a device that the artist uses to convey an idea that also draws attention to itself, in the same way that we can simultaneously notice and accept certain conventions when we’re watching a play. In a theatrical performance, the action and its presentation are so intermingled that we can’t always say where one leaves off and the other begins, and we’re always aware, on some level, that we’re looking at actors on a stage behaving in a fashion that is necessarily stylized and artificial. In other art forms, we’re conscious of these tricks to a greater or lesser extent, and while artists are usually advised that such technical elements should be subordinated to the story, in practice, we often delight in them for their own sake.

For an illustration of the kind of trick that I mean, I can’t think of any better example than the climax of The Godfather, in which Michael Corleone attends the baptism of his godson—played by the infant Sofia Coppola—as his enemies are executed on his orders. This sequence seems as inevitable now as any scene in the history of cinema, but it came about almost by accident. The director Francis Ford Coppola had the idea to combine the christening with the killings after all of the constituent parts had already been shot, which left him with the problem of assembling footage that hadn’t been designed to fit together. As Michael Sragow recounts in The New Yorker:

[Editor Peter] Zinner, too, made a signal contribution. In a climactic sequence, Coppola had the stroke of genius (confirmed by Puzo) to intercut Michael’s serving as godfather at the christening of Connie’s baby with his minions’ savagely executing the Corleone family’s enemies. But, Zinner says, Coppola left him with thousands of feet of the baptism, shot from four or five angles as the priest delivered his litany, and relatively few shots of the assassins doing their dirty work. Zinner’s solution was to run the litany in its entirety on the soundtrack along with escalating organ music, allowing different angles of the service to dominate the first minutes, and then to build to an audiovisual crescendo with the wave of killings, the blaring organ, the priest asking Michael if he renounces Satan and all his works—and Michael’s response that he does renounce them. The effect sealed the movie’s inspired depiction of the Corleones’ simultaneous, duelling rituals—the sacraments of church and family, and the murders in the street.

Coppola has since described Zinner’s contribution as “the inspiration to add the organ music,” but as this account makes clear, the editor seems to have figured out the structure and rhythm of the entire sequence, building unforgettably on the director’s initial brainstorm.

The result speaks for itself. It’s hard to think of a more powerful instance in movies of the form of a scene, created by cuts and juxtaposition, merging with the power of its storytelling. As we watch it, consciously or otherwise, we respond both to its formal audacity and to the ideas and emotions that it expresses. It’s the ultimate trick, as Baudelaire defines it, and it also inspired one of my favorite passages of criticism, in David Thomson’s entry on Coppola in The Biographical Dictionary of Film:

When The Godfather measured its grand finale of murder against the liturgy of baptism, Coppola seemed mesmerized by the trick, and its nihilism. A Buñuel, by contrast, might have made that sequence ironic and hilarious. But Coppola is not long on those qualities, and he could not extricate himself from the engineering of scenes. The identification with Michael was complete and stricken.

Before reading these lines, I had never considered the possibility that the baptism scene could be “ironic and hilarious,” or indeed anything other than how it so overwhelmingly presents itself, although it might easily have played that way without the music. And I’ve never forgotten Thomson’s assertion that Coppola was mesmerized by his own trick, as if it had arisen from somewhere outside of himself. (It might be even more accurate to say that coming up with the notion that the sequences ought to be cut together is something altogether different from actually witnessing the result, after Zinner assembled all the pieces and added Bach’s Passacaglia and Fugue in C minor—which, notably, entwines three different themes.) Coppola was so taken by the effect that he reused it, years later, for a similar sequence in Bram Stoker’s Dracula, admitting cheerfully on the commentary track that he was stealing from himself.

It was a turning point both for Coppola and for the industry as a whole. Before The Godfather, Coppola had been a novelistic director of small, quirky stories, and afterward, like Michael coming into his true inheritance, he became the engineer of vast projects, following up on the clues that he had planted here for himself. (It’s typical of the contradictions of his career that he placed his own baby daughter at the heart of this sequence, which means that he could hardly keep from viewing the most technically nihilistic scene in all his work as something like a home movie.) And while this wasn’t the earliest movie to invite the audience to revel in its structural devices—half of Citizen Kane consists of moments like this—it may have been the first since The Birth of a Nation to do so while also becoming the most commercially successful film of all time. Along the way, it subtly changed us. In our movies, as in our politics, we’ve become used to thinking as much about how our stories are presented as about what they say in themselves. We can even come to prefer trickery, as Shattuck warns us, to true ideas. This doesn’t mean that we should renounce genuine artistic facility of the kind that we see here, as opposed to its imitation or its absence, any more than Michael can renounce Satan. But the consequences of this confusion can be profound. Coppola, the orchestrator of scenes, came to identify with the mafioso who executed his enemies with ruthless efficiency, and the beauty of Michael’s moment of damnation went a long way toward turning him into an attractive, even heroic figure, an impression that Coppola spent most of The Godfather Parts II and III trying in vain to correct. Pacino’s career was shaped by this moment as well. And we have to learn to distinguish between tricks and the truth, especially when they take pains to conceal themselves. As Baudelaire says somewhere else: “The greatest trick the devil ever pulled was convincing the world he didn’t exist.”

The space between us all


In an interview published in the July 12, 1970 issue of Rolling Stone, the rock star David Crosby said: “My time has gotta be devoted to my highest priority projects, which starts with tryin’ to save the human race and then works its way down from there.” The journalist Ben Fong-Torres prompted him gently: “But through your music, if you affect the people you come in contact with in public, that’s your way of saving the human race.” And I’ve never forgotten Crosby’s response:

But somehow operating on that premise for the last couple of years hasn’t done it, see? Somehow Sgt. Pepper’s did not stop the Vietnam War. Somehow it didn’t work. Somebody isn’t listening. I ain’t saying stop trying; I know we’re doing the right thing to live, full on. Get it on and do it good. But the inertia we’re up against, I think everybody’s kind of underestimated it. I would’ve thought Sgt. Pepper’s could’ve stopped the war just by putting too many good vibes in the air for anybody to have a war around.

He was right about one thing—the Beatles didn’t stop the war. And while it might seem as if there’s nothing new left to say about Sgt. Pepper’s Lonely Hearts Club Band, which celebrates its fiftieth anniversary today, it’s worth asking what it tells us about the inability of even our greatest works of art to inspire lasting change. It’s probably ridiculous to ask this of any album. But if a test case exists, it’s here.

It seems fair to say that if any piece of music could have changed the world, it would have been Sgt. Pepper. As the academic Langdon Winner famously wrote:

The closest Western Civilization has come to unity since the Congress of Vienna in 1815 was the week the Sgt. Pepper album was released…At the time I happened to be driving across the country on Interstate 80. In each city where I stopped for gas or food—Laramie, Ogallala, Moline, South Bend—the melodies wafted in from some far-off transistor radio or portable hi-fi. It was the most amazing thing I’ve ever heard. For a brief while, the irreparably fragmented consciousness of the West was unified, at least in the minds of the young.

The crucial qualifier, of course, is “at least in the minds of the young,” which we’ll revisit later. To the critic Michael Bérubé, it was nothing less than the one week in which there was “a common culture of widely shared values and knowledge in the United States at any point between 1956 and 1976,” which seems to undervalue the moon landing, but never mind. Yet even this transient unity is more apparent than real. By the end of the sixties, the album had sold about three million copies in America alone. It’s a huge number, but even if you multiply it by ten to include those who were profoundly affected by it on the radio or on a friend’s record player, you end up with a tiny fraction of the population. To put it another way, three times as many people voted for George Wallace for president as bought a copy of Sgt. Pepper in those years.

But that’s just how it is. Even our most inescapable works of art seem to fade into insignificance when you consider the sheer number of human lives involved, in which even an apparently ubiquitous phenomenon is statistically unable to reach a majority of adults. (Fewer than one in three Americans paid to see The Force Awakens in theaters, which is as close as we’ve come in recent memory to total cultural saturation.) The art that feels axiomatic to us barely touches the lives of others, and it may leave only the faintest of marks on those who listen to it closely. The Beatles undoubtedly changed lives, but they were more likely to catalyze impulses that were already there, providing a shape and direction for what might otherwise have remained unexpressed. As Roger Ebert wrote in his retrospective review of A Hard Day’s Night:

The film was so influential in its androgynous imagery that untold thousands of young men walked into the theater with short haircuts, and their hair started growing during the movie and didn’t get cut again until the 1970s.

We shouldn’t underestimate this. But if you were eighteen when A Hard Day’s Night came out, it also means that you were born the same year as Donald Trump, who decisively won voters who were old enough to buy Sgt. Pepper on its initial release. Even if you took its message to heart, there’s a difference between the kind of change that marshals you the way that you were going and the sort that realigns society as a whole. It just isn’t what art is built to do. As David Thomson writes in Rosebud, alluding to Trump’s favorite movie: “The world is very large and the greatest films so small.”

If Sgt. Pepper failed to get us out of Vietnam, it was partially because those who were most deeply moved by it were more likely to be drafted and shipped overseas than to affect the policies of their own country. As Winner says, it united our consciousness, “at least in the young,” but all the while, the old men, as George McGovern put it, were dreaming up wars for young men to die in. But it may not have mattered. Wars are the result of forces that care nothing for what art has to say, and their operations are often indistinguishable from random chance. Sgt. Pepper may well have been “a decisive moment in the history of Western civilization,” as Kenneth Tynan hyperbolically claimed, but as Harold Bloom reminds us in The Western Canon:

Reading the very best writers—let us say Homer, Dante, Shakespeare, Tolstoy—is not going to make us better citizens. Art is perfectly useless, according to the sublime Oscar Wilde, who was right about everything.

Great works of art exist despite, not because of, the impersonal machine of history. It’s only fitting that the anniversary of Sgt. Pepper happens to coincide with a day on which our civilization’s response to climate change will be decided in a public ceremony with overtones of reality television—a more authentic reflection of our culture, as well as a more profound moment of global unity, willing or otherwise. If the opinions of rock stars or novelists counted for anything, we’d be in a very different situation right now. In “Within You Without You,” George Harrison laments “the people who gain the world and lose their soul,” which neatly elides the accurate observation that they, not the artists, are the ones who do in fact tend to gain the world. (They’re also “the people who hide themselves behind a wall.”) All that art can provide is private consolation, and joy, and the reminder that there are times when we just have to laugh, even when the news is rather sad.

On a wing and a prayer


“It was the greatest career move in the history of entertainment,” David Thomson writes in an entry in The New Biographical Dictionary of Film. He’s speaking, of course, of Ronald Reagan:

He was a hugely successful and evasive president, as blind to disaster, inquiry, and humiliation as he was to the Constitution. And he was as lucky as he had been a loser in pictures…To paraphrase Gore Vidal, the wisdom and integrity of someone told where to stand and what to say for twenty years were made manifest. The fraudulence of the presidency was revealed so that the office could never quite be honored again.

When I look at these lines now, especially that last sentence, they can start to seem rather quaint. But Reagan has a lot to tell us about Trump, and not simply because he looks so much better by comparison. “An actor is playing the president,” Paul Slansky lamented in The Clothes Have No Emperor, a book—with its painstaking chronology of the unlikely events of the Reagan Administration—that looks increasingly funny, resonant, and frightening these days. Yet the presidency has always been something of a performance. As Malcolm Gladwell recently noted to The Undefeated, most presidents have been white men of a certain age and height:

Viewed statistically it’s absurd. Why would you limit your search for the most important job in the land to this tiny group of people? But it’s an incredibly common thing. We do a category selection before we do individual analysis.

In other words, we cast men who look the part, and then we judge them by how well they fulfill our idea of the role.

Reagan, like Trump, was unusually prone to improvising, or, in Thomson’s words, “deftly feeding the lines and situations of Warner Brothers in the 1940s back into world affairs.” Occasionally, he would tell a story to put himself in a favorable light, as when he made the peculiar claim—to Yitzhak Shamir and Simon Wiesenthal, no less—that he had personally shot documentary film of the concentration camps after World War II. (In reality, Reagan spent the war in Hollywood, where he assisted in processing footage taken by others in Europe.) But sometimes his reasons were harder to pin down. On December 12, 1983, Reagan told a story in a speech to the annual convention of the Congressional Medal of Honor Society:

A B-17 was coming back across the channel from a raid over Europe, badly shot up by anti-aircraft; the ball turret that hung underneath the belly of the plane had taken a hit. The young ball-turret gunner was wounded, and they couldn’t get him out of the turret there while flying. But over the channel, the plane began to lose altitude, and the commander had to order, “Bail out.” And as the men started to leave the plane, the last one to leave—the boy, understandably, knowing he was being left behind to go down with the plane, cried out in terror—the last man to leave the plane saw the commander sit down on the floor. He took the boy’s hand and said, “Never mind, son, we’ll ride it down together.” Congressional Medal of Honor posthumously awarded.

Reagan recounted this story on numerous other occasions. But as Lars-Erik Nelson, the Washington bureau chief for the New York Daily News, subsequently determined, after checking hundreds of Medal of Honor citations from World War II: “It didn’t happen. It’s a Reagan story…The president of the United States went before an audience of three hundred real Congressional Medal of Honor winners and told them about a make‑believe Medal of Honor winner.”

There’s no doubt that Reagan, who often grew visibly moved as he recounted this story, believed that it was true, and it has even been used as a case study in the creation of false memories. Nelson traced it back to a scene in the 1944 movie Wing and a Prayer, as well as to a similar apocryphal item that appeared that year in Reader’s Digest. (The same story, incidentally, later became the basis for an episode of Amazing Stories, “The Mission,” starring Kevin Costner and Kiefer Sutherland and directed by Steven Spielberg. Tony Kushner once claimed that Spielberg’s movies “are the flagship aesthetic statements of Reaganism,” and this is the most compelling point I’ve seen in that argument’s favor.) But the most Trumpian aspect of the entire incident was the response of Reagan’s staff. As the Washington Post reported a few days later:

A determined White House is searching the records of American servicemen awarded the Medal of Honor in an effort to authenticate a disputed World War II story President Reagan told last week at a ceremony honoring recipients of the medal…The White House then began checking records to document the episode. Reagan is said by aides to be certain that he saw the citation exactly as he recounted it. The citations are summarized in a book published by Congress, but none of these summaries seem to fit precisely the episode Reagan described, although some are similar…The White House is now attempting to look beyond the summaries to more detailed accounts to see if one of the episodes may be the one Reagan mentioned. “We will find it,” said Misty Church, a researcher for the White House.

They never did. And the image of White House staffers frantically trying to justify something that the president said off the cuff certainly seems familiar today.

But what strikes me the most about this story is that Reagan himself had nothing to gain from it. Most of Trump’s fabrications are designed to make him look better, more successful, or more impressive than he actually is, while Reagan’s fable is rooted in a sentimental ideal of heroism itself. (It’s hard to even imagine a version of this story that Trump might have told, since the most admirable figure in it winds up dead. As Trump might say, he likes pilots who weren’t shot down.) Which isn’t to say that Reagan’s mythologizing isn’t problematic in itself, as Nelson pointed out:

[It’s] the difference between a make-believe pilot, dying nobly and needlessly to comfort a wounded boy, and the real-life pilots, bombardiers and navigators who struggled to save their planes, their crews and themselves and died trying. It’s the difference between war and a war story.

And while this might seem preferable to Trump’s approach, which avoids any talk of sacrifice in favor of scenarios in which everybody wins, or we stick other people with the cost of our actions, it still closes off higher levels of thought in favor of an appeal to emotion. Reagan was an infinitely more capable actor than Trump, and he was much easier to love, which shouldn’t blind us to what they have in common. They were both winging it. And the most characteristic remark to come out of the whole affair is how Larry Speakes, the White House spokesman under Reagan, responded when asked if the account was accurate: “If you tell the same story five times, it’s true.”

Listen without prejudice


George Michael

In The Biographical Dictionary of Film, David Thomson says of Tuesday Weld: “If she had been ‘Susan Weld’ she might now be known as one of our great actresses.” The same point might hold true of George Michael, who was born Georgios Kyriacos Panayiotou and chose a nom de mike—with its unfortunate combination of two first names—that made him seem frothy and lightweight. If he had called himself, say, George Parker, he might well have been regarded as one of our great songwriters, which he indisputably was. In the past, I’ve called Tom Cruise a brilliant producer who happened to be born into the body of a movie star, and George Michael had the similar misfortune of being a perversely inventive and resourceful recording artist who was also the most convincing embodiment of a pop superstar that anybody had ever seen. It’s hard to think of another performer of that era who had so complete a package: the look, the voice, the sexuality, the stage presence. The fact that he was gay and unable to acknowledge it for so long was an undeniable burden, but it also led him to transform himself into what would have been almost a caricature of erotic assertiveness if it hadn’t been delivered so earnestly. Like Cary Grant, a figure with whom he might otherwise seem to have little in common, he turned himself into exactly what he thought everyone wanted, and he did it so well that he was never allowed to be anything else.

But consider the songs. Michael was a superb songwriter from the very beginning, and “Everything She Wants,” “Last Christmas,” “Careless Whisper,” and “A Different Corner,” all of which he wrote in his early twenties, should be enough to silence any doubts about his talent. His later songs could be exhausting in their insistence on doubling as statements of purpose. But it’s Faith, and particularly the first side of the album and the coda of “Kissing a Fool,” that never fails to fill me with awe. It was a clear declaration that this was a young man, not yet twenty-five, who was capable of anything, and he wasn’t shy about alerting us to the fact: the back of the compact disc reads “Written, Arranged, and Produced by George Michael.” In those five songs, Michael nimbly tackles so many different styles and tones that it threatens to make the creation of timeless pop music seem as mechanical a process as it really is. A little less sex and a lot more irony, and you’d be looking at as skilled a chameleon as Stephin Merritt—which is another comparison that I didn’t think I’d ever make. But on his best day, Michael was the better writer. “One More Try” has meant a lot to me since the moment I first heard it, while “I Want Your Sex” is one of those songs that would sound revolutionary in any decade. When you listen to the Monogamy Mix, which blends all three sections together into a monster track of thirteen minutes, you start to wonder if we’ve caught up to it even now.

George Michael and Andrew Ridgeley

These songs have been part of the background of my life for literally as long as I can remember—the music video for “Careless Whisper” was probably the first one I ever saw, except maybe for “Thriller,” and I can’t have been more than five years old. Yet I never felt like I understood George Michael in the way I thought I knew, say, the Pet Shop Boys, who also took a long time to get the recognition they deserved. (They also settled into their roles as elder statesmen a little too eagerly, while Michael never seemed comfortable with his cultural position at any age.) For an artist who told us what he thought in plenty of songs, he remained essentially unknowable. Part of it was due to that glossy voice, one of the best of its time, especially when it verged on Alison Moyet territory. But it often seemed like just another instrument, rather than a piece of himself. Unlike David Bowie, who assumed countless personas that still allowed the man underneath to peek through, Michael wore his fame, in John Updike’s words, like a mask that ate into the face. His death doesn’t feel like a personal loss to me, in the way that Bowie did, but I’ve spent just about as much time listening to his music, even if you don’t count all the times I’ve played “Last Christmas” in an endless loop on Infinite Jukebox.

In the end, it was a career that was bound to seem unfinished no matter when or how it ended. Its back half was a succession of setbacks and missed opportunities, and you could argue that its peak lasted for less than four years. The last album of his that I owned was the oddball Songs from the Last Century, in which he tried on a new role—a lounge singer of old standards—that would have been ludicrous if it hadn’t been so deeply heartfelt. It wasn’t a persuasive gesture, because he didn’t need to sing somebody else’s songs to sound like part of the canon. That was seventeen years ago, or almost half my lifetime. There were long stretches when he dropped out of my personal rotation, but he always found his way back: “Wake Me Up Before You Go-Go” even played at my wedding. “One More Try” will always be my favorite, but the snippet that has been in my head the most is the moment in “Everything She Wants” when Michael just sings: Uh huh huh / Oh, oh / Uh huh huh / Doo doo doo / La la la la… Maybe he’s just marking time, or he wanted to preserve a melodic idea that didn’t lend itself to words, or it was a reflection of the exuberance that Wesley Morris identifies in his excellent tribute in the New York Times: “There aren’t that many pop stars with as many parts of as many songs that are as exciting to sing as George Michael has—bridges, verses, the fillips he adds between the chorus during a fade-out.” But if I were trying to explain what pop music was all about to someone who had never heard it, I might just play this first.

The steady hand


Danny Lloyd in The Shining

Forty years ago, the cinematographer Garrett Brown invented the Steadicam. It was a stabilizer attached to a harness that allowed a camera operator, walking on foot or riding in a vehicle, to shoot the kind of smooth footage that had previously only been possible using a dolly. Before long, it had revolutionized the way in which both movies and television were shot, and not always in the most obvious ways. When we think of the Steadicam, we’re likely to remember virtuoso extended takes like the Copacabana sequence in Goodfellas, but it can also be a valuable tool even when we aren’t supposed to notice it. As the legendary Robert Elswit said recently to the New York Times:

“To me, it’s not a specialty item,” he said. “It’s usually there all the time.” The results, he added, are sometimes “not even necessarily recognizable as a Steadicam shot. You just use it to get something done in a simple way.”

Like digital video, the Steadicam has had a leveling influence on the movies. Scenes that might have been too expensive, complicated, or time-consuming to set up in the conventional manner can be done on the fly, which has opened up possibilities both for innovative stylists and for filmmakers who are struggling to get their stories made at all.

Not surprisingly, there are skeptics. In On Directing Film, which I think is the best book on storytelling I’ve ever read, David Mamet argues that it’s a mistake to think of a movie as a documentary record of what the protagonist does, and he continues:

The Steadicam (a hand-held camera), like many another technological miracle, has done injury; it has injured American movies, because it makes it so easy to follow the protagonist around, one no longer has to think, “What is the shot?” or “Where should I put the camera?” One thinks, instead, “I can shoot the whole thing in the morning.”

This conflicts with Mamet’s approach to structuring a plot, which hinges on dividing each scene into individual beats that can be expressed in purely visual terms. It’s a method that emerges naturally from the discipline of selecting shots and cutting them together, and it’s the kind of hard work that we’re often tempted to avoid. As Mamet adds in a footnote: “The Steadicam is no more capable of aiding in the creation of a good movie than the computer is in the writing of a good novel—both are labor-saving devices, which simplify and so make more attractive the mindless aspects of creative endeavor.” The casual use of the Steadicam seduces directors into conceiving of the action in terms of “little plays,” rather than in fundamental narrative units, and it removes some of the necessity of disciplined thinking beforehand.

Michael Keaton and Edward Norton in Birdman

But it isn’t until toward the end of the book that Mamet delivers his most ringing condemnation of what the Steadicam represents:

“Wouldn’t it be nice,” one might say, “if we could get this hall here, really around the corner from that door there; or to get that door here to really be the door that opens on the staircase to that door there? So we could just move the camera from one to the next?”

It took me a great deal of effort and still takes me a great deal and will continue to take me a great deal of effort to answer the question thusly: no, not only is it not important to have those objects literally contiguous; it is important to fight against this desire, because fighting it reinforces an understanding of the essential nature of film, which is that it is made of disparate shots, cut together. It’s a door, it’s a hall, it’s a blah-blah. Put the camera “there” and photograph, as simply as possible, that object. If we don’t understand that we both can and must cut the shots together, we are sneakily falling victim to the mistaken theory of the Steadicam.

This might all sound grumpy and abstract, but it isn’t. Take Birdman. You might well love Birdman—plenty of viewers evidently did—but I think it provides a devastating confirmation of Mamet’s point. By playing as a single, seemingly continuous shot, it robs itself of the ability to tell the story with cuts, and it inadvertently serves as an advertisement of how most good movies come together in the editing room. It’s an audacious experiment that never needs to be tried again. And it wouldn’t exist at all if it weren’t for the Steadicam.

But the Steadicam can also be a thing of beauty. I don’t want to discourage its use by filmmakers for whom it means the difference between making a movie under budget and never making it at all, as long as they don’t forget to think hard about all of the constituent parts of the story. There’s also a place for the bravura long take, especially when it depends on our awareness of the unfaked passage of time, as in the opening of Touch of Evil—a long take, made without benefit of a Steadicam, that runs the risk of looking less astonishing today because technology has made this sort of thing so much easier. And there’s even room for the occasional long take that exists only to wow us. De Palma has a fantastic one in Raising Cain, which I watched again recently, that deserves to be ranked among the greats. At its best, it can make the filmmaker’s audacity inseparable from the emotional core of the scene, as David Thomson observes of Goodfellas: “The terrific, serpentine, Steadicam tracking shot by which Henry Hill and his girl enter the Copacabana by the back exit is not just his attempt to impress her but Scorsese’s urge to stagger us and himself with bravura cinema.” The best example of all is The Shining, with its tracking shots of Danny pedaling his Big Wheel down the deserted corridors of the Overlook. It’s showy, but it also expresses the movie’s basic horror, as Danny is inexorably drawn to the revelation of his father’s true nature. (And it’s worth noting that much of its effectiveness is due to the sound design, with the alternation of the wheels against the carpet and floor, which is one of those artistic insights that never grows dated.) The Steadicam is a tool like any other, which means that it can be misused. It can be wonderful, too. But it requires a steady hand behind the camera.

The low road to Xanadu


Orson Welles in Citizen Kane

It was a miracle of rare device,
A sunny pleasure-dome with caves of ice!

—Samuel Taylor Coleridge, “Kubla Khan”

A couple of weeks ago, I wrote of Donald Trump: “He’s like Charles Foster Kane, without any of the qualities that make Kane so misleadingly attractive.” If anything, that’s overly generous to Trump himself, but it also points to a real flaw in what can legitimately be called the greatest American movie ever made. Citizen Kane is more ambiguous than it was ever intended to be, because we’re distracted throughout by our fondness for the young Orson Welles. He’s visible all too briefly in the early sequences at the Inquirer; he winks at us through his makeup as an older man; and the aura he casts was there from the beginning. As David Thomson points out in The New Biographical Dictionary of Film:

Kane is less about William Randolph Hearst—a humorless, anxious man—than a portrait and prediction of Welles himself. Given his greatest opportunity, [screenwriter Herman] Mankiewicz could only invent a story that was increasingly colored by his mixed feelings about Welles and that, he knew, would be brought to life by Welles the overpowering actor, who could not resist the chance to dress up as the old man he might one day become, and who relished the young showoff Kane just as he loved to hector and amaze the Mercury Theater.

You can see Welles in the script when Susan Alexander asks Kane if he’s “a professional magician,” or when Kane, asked if he’s still eating, replies: “I’m still hungry.” And although his presence deepens and enhances the movie’s appeal, it also undermines the story that Welles and Mankiewicz set out to tell in the first place.

As a result, the film that Hearst wanted to destroy turned out to be the best thing that could have happened to his legacy—it makes him far more interesting and likable than he ever was. The same factor tends to obscure the movie’s politics. As Pauline Kael wrote in the early seventies in the essay “Raising Kane”: “At some campus showings, they react so gullibly that when Kane makes a demagogic speech about ‘the underprivileged,’ stray students will applaud enthusiastically, and a shout of ‘Right on!’ may be heard.” But in an extraordinary review that was published when the movie was first released, Jorge Luis Borges saw through to the movie’s icy heart:

Citizen Kane…has at least two plots. The first, pointlessly banal, attempts to milk applause from dimwits: a vain millionaire collects statues, gardens, palaces, swimming pools, diamonds, cars, libraries, men and women…The second plot is far superior…At the end we realize that the fragments are not governed by any apparent unity: the detested Charles Foster Kane is a simulacrum, a chaos of appearances…In a story by Chesterton—“The Head of Caesar,” I think—the hero observes that nothing is so frightening as a labyrinth with no center. This film is precisely that labyrinth.

Borges concludes: “We all know that a party, a palace, a great undertaking, a lunch for writers and journalists, an enterprise of cordial and spontaneous camaraderie, are essentially horrendous. Citizen Kane is the first film to show such things with an awareness of this truth.” He might well be talking about the Trump campaign, which is also a labyrinth without a center. And Trump already seems to be preparing for defeat with the same defense that Kane did.

Everett Sloane in Citizen Kane

Yet if we’re looking for a real counterpart to Kane, it isn’t Trump at all, but someone standing just off to the side: his son-in-law, Jared Kushner. I’ve been interested in Kushner’s career for a long time, in part because we overlapped at college, although I doubt we’ve ever been in the same room. Ten years ago, when he bought the New York Observer, it was hard not to think of Kane, and not just because Kushner was twenty-five. It recalled the effrontery in Kane’s letter to Mr. Thatcher: “I think it would be fun to run a newspaper.” And I looked forward to seeing what Kushner would do next. His marriage to Ivanka Trump was a twist worthy of Mankiewicz, who married Kane to the president’s niece, and as Trump lurched into politics, I wasn’t the only one wondering what Ivanka and Kushner—whose father was jailed after an investigation by Chris Christie—made of it all. Until recently, you could kid yourself that Kushner was torn between loyalty to his wife’s father and whatever else he might be feeling, even after he published his own Declaration of Principles in the Observer, writing: “My father-in-law is not an anti-Semite.” But that’s no longer possible. As the Washington Post reports, Kushner, along with former Breitbart News chief Stephen K. Bannon, personally came up with the idea of seating Bill Clinton’s accusers in the family box at the second debate. The plan failed, but there’s no question that Kushner has deliberately placed himself at the center of Trump’s campaign, and that he bears an active, not passive, share of the responsibility for what promises to be the ugliest month in the history of presidential politics.

So what happened? If we’re going to press the analogy to its limit, we can picture the isolated Kane in his crumbling estate in Xanadu. It was based on Hearst Castle in San Simeon, and the movie describes it as standing on the nonexistent desert coast of Florida—but it could just as easily be a suite in Trump Tower. We all tend to surround ourselves with people with whom we agree, whether it’s online or in the communities in which we live, and if you want to picture this as a series of concentric circles, the ultimate reality distortion field must come when you’re standing in a room next to Trump himself. Now that Trump has purged his campaign of all reasonable voices, it’s easy for someone like Kushner to forget that there is a world elsewhere, and that his actions may not seem sound, or even sane, beyond those four walls. Eventually, this election will be over, and whatever the outcome, I feel more pity for Kushner than I do for his father-in-law. Trump can only stick around for so much longer, while Kushner still has half of his life ahead of him, and I have a feeling that it’s going to be defined by his decisions over the last three months. Maybe he’ll realize that he went straight from the young Kane to the old without any of the fun in between, and that his only choice may be to wall himself up in Xanadu in his thirties, with the likes of Christie, Giuliani, and Gingrich for company. As the News on the March narrator says in Kane: “An emperor of newsprint continued to direct his failing empire, vainly attempted to sway, as he once did, the destinies of a nation that had ceased to listen to him, ceased to trust him.” It’s a tragic ending for an old man. But it’s even sadder for a young one.

The excerpt opinion


Norman Mailer

“It’s the rare writer who cannot have sentences lifted from his work,” Norman Mailer once wrote. What he meant is that if a reviewer is eager to find something to mock, dismiss, or pick apart, any interesting book will provide plenty of ammunition. On a simple level of craft, it’s hard for most authors to sustain a high pitch of technical proficiency in every line, and if you want to make a novelist seem boring or ordinary, you can just focus on the sentences that fall between the high points. In his famously savage takedown of Thomas Harris’s Hannibal, Martin Amis quotes another reviewer who raved: “There is not a single ugly or dead sentence.” Amis then acidly observes:

Hannibal is a genre novel, and all genre novels contain dead sentences—unless you feel the throb of life in such periods as “Tommaso put the lid back on the cooler” or “Eric Pickford answered” or “Pazzi worked like a man possessed” or “Margot laughed in spite of herself” or “Bob Sneed broke the silence.”

Amis knows that this is a cheap shot, and he glories in it. But it isn’t so different from what critics do when they list the awful sentences from a current bestseller or nominate lines for the Bad Sex in Fiction Award. I laugh at this along with anyone else, but I also wince a little, because there are few authors alive who aren’t vulnerable to that sort of treatment. As G.K. Chesterton pointed out: “You could compile the worst book in the world entirely out of selected passages from the best writers in the world.”

This is even more true of authors who take considerable stylistic or thematic risks, which usually result in individual sentences that seem crazy or, worse, silly. The fear of seeming ridiculous is what prevents a lot of writers from taking chances, and it isn’t always unjustified. An ambitious novel opens itself up to savaging from all sides, precisely because it provides so much material that can be turned against the author when taken out of context. And it doesn’t need to be malicious, either: even objective or actively sympathetic critics can be seduced by the ease with which a writer can be excerpted to make a case. I’ve become increasingly daunted by the prospect of distilling the work of Robert A. Heinlein, for example, because his career was so long, varied, and often intentionally provocative that you can find sentences to support any argument about him that you want to make. (It doesn’t help that his politics evolved drastically over time, and they probably would have undergone several more transformations if he had lived for longer.) This isn’t to say that his opinions aren’t a fair target for criticism, but any reasonable understanding of who Heinlein was and what he believed—which I’m still trying to sort out for myself—can’t be conveyed by a handful of cherry-picked quotations. Literary biography is useful primarily to the extent that it can lay out a writer’s life in an orderly fashion, providing a frame that tells us something about the work that we wouldn’t know by encountering it out of order. But even that involves a process of selection, as does everything else about a biography. The biographer’s project isn’t essentially different from that of a working critic or reviewer: it just takes place on a larger scale.

John Updike

And it’s worth noting that prolific critics themselves are particularly susceptible to this kind of treatment. When Renata Adler described Pauline Kael’s output as “not simply, jarringly, piece by piece, line by line, and without interruption, worthless,” any devotee of Kael’s work had to disagree—but it was also impossible to deny that there was plenty of evidence for the prosecution. If you’re determined to hate Roger Ebert, you just have to search for the reviews in which his opinions, written on deadline, weren’t sufficiently in line with the conclusions reached by posterity, as when he unforgivably gave only three stars to The Godfather Part II. And there isn’t a single page in the work of David Thomson, who is probably the most interesting movie critic who ever lived, that couldn’t be mined for outrageous, idiotic, or infuriating statements. I still remember a review on The A.V. Club of How to Watch a Movie that quoted lines like this:

Tell me a story, we beg as children, while wanting so many other things. Story will put off sleep (or extinction) and the child’s organism hardly trusts the habit of waking yet.

And this:

You came into this book under deceptive promises (mine) and false hopes (yours). You believed we might make decisive progress in the matter of how to watch a movie. So be it, but this was a ruse to make you look at life.

The reviewer quoted these sentences as examples of the book’s deficiencies, and they were duly excoriated in the comments. But anyone who has really read Thomson knows that such statements are part of the package, and removing them would also deny most of what makes him so fun, perverse, and valuable.

So what’s a responsible reviewer to do? We could start, maybe, by quoting longer or complete sections, rather than sentences in isolation, and by providing more context when we offer up just a line or two. We can also respect an author’s feelings, explicit or otherwise, about what sections are actually important. In the passage I mentioned at the beginning of this post, which is about John Updike, Mailer goes on to quote a few sentences from Rabbit, Run, and he adds:

The first quotation is taken from the first five sentences of the book, the second is on the next-to-last page, and the third is nothing less than the last three sentences of the novel. The beginning and end of a novel are usually worked over. They are the index to taste in the writer.

That’s a pretty good rule, and it ensures that the critic is discussing something reasonably close to what the writer intended to say. Best of all, we can approach the problem of excerpting with a kind of joy in the hunt: the search for the slice of a work that will stand as a synecdoche of the whole. In the book U & I, which is also about Updike, Nicholson Baker writes about the “standardized ID phrase” and “the aphoristic consensus” and “the jingle we will have to fight past at some point in the future” to see a writer clearly again, just as fans of Joyce have to do their best to forget about “the ineluctable modality of the visible” and “yes I said yes I will Yes.” For a living author, that repository of familiar quotations is constantly in flux, and reviewers might approach their work with a greater sense of responsibility if they realized that they were playing a part in creating it—one tiny excerpt at a time.
