Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Werner Herzog’

Broyles’s Law and the Ken Burns effect


For most of my life as a moviegoer, I’ve followed a rule that has served me pretty well. Whenever the director of a documentary narrates the story in the first person, or, worse, appears on camera, I start to get suspicious. I’m not talking about movies like Roger and Me or even the loathsome Catfish, in which the filmmakers, for better or worse, are inherently part of the action, but about films in which the director inserts himself into the frame for no particular reason. Occasionally, I can forgive this, as I did with the brilliant The Cove, but usually, I feel a moment of doubt whenever the director’s voiceover begins. (In its worst form, it opens the movie with a redundant narration: “I first came across the story that you’re about to hear in the summer of 1990…”) But while I still think that this is a danger sign, I’ve recently concluded that I was wrong about why. I had always assumed that it was a sign of ego—that these directors were imposing themselves on a story that was really about other people, because they thought that it was all about them. In reality, it seems more likely that it’s a solution to a technical problem. What happens, I think, is that the director sits down to review his footage and discovers that it can’t be cut together as a coherent narrative. Perhaps there are crucial scenes or beats missing, but the events that the movie depicts are long over, or there’s no budget to go back and shoot more. An interview might bridge the gaps, but maybe this isn’t logistically feasible. In the end, the director is left with just one person who is available to say all the right things on the soundtrack to provide the necessary transitions and clarifications. It’s himself. In a perfect world, if he had gotten the material that he needed, he wouldn’t have to be in his own movie at all, but he doesn’t have a choice. It isn’t a failure of character, but of technique, and the result ends up being much the same.

I got to thinking about this after reading a recent New Yorker profile by Ian Parker of the documentarian Ken Burns, whose upcoming series on the Vietnam War is poised to become a major cultural event. The article takes an irreverent tone toward Burns, whose cultural status inclines him toward speechification in private: “His default conversational setting is Commencement Address, involving quotation from nineteenth-century heroes and from his own previous commentary, and moments of almost rhapsodic self-appreciation. He is readier than most people to regard his creative decisions as courageous.” But Parker also shares a fascinating anecdote about which I wish I knew more:

In the mid-eighties, Burns was working on a deft, entertaining documentary about Huey Long, the populist Louisiana politician. He asked two historians, William Leuchtenburg and Alan Brinkley, about a photograph he hoped to use, as a part of the account of Long’s assassination; it showed him protected by a phalanx of state troopers. Brinkley told him that the image might mislead; Long usually had plainclothes bodyguards. Burns felt thwarted. Then Leuchtenburg spoke. He’d just watched a football game in which Frank Broyles, the former University of Arkansas coach, was a commentator. When the game paused to allow a hurt player to be examined, Broyles explained that coaches tend to gauge the seriousness of an injury by asking a player his name or the time of day; if he can’t answer correctly, it’s serious. As Burns recalled it, Broyles went on, “But, of course, if the player is important to the game, we tell him what his name is, we tell him what time it is, and we send him back in.”

Hence Broyles’s Law: “If it’s super-important, if it’s working, you tell him what his name is, and you send him back into the game.” Burns decided to leave the photo in the movie. Parker continues:

Was this, perhaps, a terrible law? Burns laughed. “It’s a terrible law!” But, he went on, it didn’t let him off the hook, ethically. “This would be Werner Herzog’s ‘ecstatic truth’—‘I can do anything I want. I’ll pay the town drunk to crawl across the ice in the Russian village.’” He was referring to scenes in Herzog’s Bells from the Deep, which Herzog has been happy to describe, and defend, as stage-managed. “If he chooses to do that, that’s okay. And then there are other people who’d rather do reenactments than have a photograph that’s vague.” Instead, Burns said, “We do enough research that we can pretty much convince ourselves—in the best sense of the word—that we’ve done the honorable job.”

The reasoning in this paragraph is a little muddled, but Burns seems to be saying that he isn’t relying on “the ecstatic truth” of Herzog, who blurs the line between fiction and reality, or the reenactments favored by Errol Morris, who sometimes seems to be making a feature film interspersed with footage of talking heads. Instead, Burns is assembling a narrative solely out of primary sources, and if an image furthers the viewer’s intellectual understanding or emotional engagement, it can be included, even if it isn’t strictly accurate. These are the compromises that you make when you’re determined to use nothing but the visuals that you have available, and you trust in your understanding of the material to tell whether or not you’ve made the “honorable” choice.

On some level, this is basically what every author of nonfiction has to consider when assembling sources, which involves countless judgment calls about emphasis, order, and selection, as I’ve discussed here before. But I’m more interested in the point that this emerges from a technical issue inherent to the form of the documentary itself, in which the viewer always has to be looking at something. When the perfect image isn’t available, you have a few different options. You can ignore the problem; you can cut to an interview subject who tells the viewers about what they’re not seeing; or you can shoot a reenactment. (Recent documentaries seem to lean heavily on animation, presumably because it’s cheaper and easier to control in the studio.) Or, like Burns, you can make do with what you have, because that’s how you’ve defined the task for yourself. Burns wants to use nothing but interviews, narration, and archival materials, and the technical tricks that we’ve come to associate with his style—like the camera pan across photos that Apple actually calls the Ken Burns effect—arise directly out of those constraints. The result is often brilliant, in large part because Burns has no choice but to think hard about how to use the materials that he has. Broyles’s Law may be “terrible,” but it’s better than most of the alternatives. Burns has the luxury of big budgets, a huge staff, and a lot of time, which allows him to be fastidious about his solutions to such problems. But a desperate documentary filmmaker, faced with no money and a hole in the story to fill, may have no other recourse than to grab a microphone, sit down in the editing bay, and start to speak: “I first came across the story that you’re about to hear in the summer of 1990…”

Written by nevalalee

September 11, 2017 at 9:12 am

Blazing the trail


When I’m looking for insights into writing, I often turn to the nonliterary arts, and the one that I’ve found the most consistently stimulating is film editing. This is partially because the basic problem that a movie editor confronts—the arrangement and distillation of a huge mass of unorganized material into a coherent shape—is roughly analogous to what a writer does, but at a larger scale and under conditions of greater scrutiny and pressure, which encourages the development of pragmatic technical solutions. This was especially true in the era before digital editing. As Walter Murch, my hero, has pointed out, one minute of film equals a pound of celluloid. A movie like Apocalypse Now generates something like seven tons of raw footage, so an editor, as Murch notes, needs “a strong back and arms.” At the same time, incredibly, he or she also has to keep track of the location of individual frames, which weigh just a few thousandths of an ounce. With such software tools as Final Cut Pro, this kind of bookkeeping becomes much easier, and I doubt that many professional editors are inclined to be sentimental about the old days. But there’s also a sense in which wrestling with celluloid required habits of mind and organization that are slowly being lost. In A Guide for the Perplexed, which I once described as the first book I’d recommend to anyone about almost anything, Werner Herzog writes:

I can edit almost as fast as I can think because I’m able to sink details of fifty hours of footage into my mind. This might have something to do with the fact that I started working on film, when there was so much celluloid about the place that you had to know where absolutely every frame was. But my memory of all this footage never lasts long, and within two days of finishing editing it becomes a blur in my mind.

On a more practical level, editing a movie means keeping good notes, and all editors eventually come up with their own system. Here’s how Herzog describes his method:

The way I work is to look through everything I have—very quickly, over a couple of days—and make notes. For all my films over the past decade I have kept a logbook in which I briefly describe, in longhand, the details of every shot and what people are saying. I know there’s a particularly wonderful moment at minute 4:13 on tape eight because I have marked the description of the action with an exclamation point. These days my editor Joe Bini and I just move from one exclamation point to the next; anything unmarked is almost always bypassed. When it comes to those invaluable clips with three exclamation marks, I tell Joe, “If these moments don’t appear in the finished film, I have lived in vain.”

What I like about Herzog’s approach to editing is its simplicity. Other editors, including Murch, keep detailed notes on each take, but Herzog knows that all he has to do is flag it and move on. When the time comes, he’ll remember why it seemed important, and he has implicit faith in the instincts of his past self, which he trusts to steer him in the right direction. It’s like blazing a trail through the woods. A few marks on a tree or a pile of stones, properly used, are all you need to indicate the path, but instead of trying to communicate with hikers who come after you, you’re sending a message to yourself in the future. As Herzog writes: “I feel safe in my skills of navigation.”

Reading Herzog’s description of his editorial notes, I realized that I do much the same thing with the books that I read for my work, whether it’s fiction or nonfiction. Whenever I go back to revisit a source, I’ll often see underlinings or other marks that I left on a previous pass, and I naturally look at those sections more closely, in order to remind myself why it seemed to matter. (I’ve learned to mark passages with a single vertical line in the outer margin, which allows me to flip quickly through the book to scan for key sections.) The screenwriter William Goldman describes a similar method of signaling to himself in his great book Which Lie Did I Tell?, in which he talks about the process of adapting novels to the screen:

Here is how I adapt and it’s very simple: I read the text again. And I read it this time with a pen in my hand—let’s pick a color, blue. Armed with that, I go back to the book, slower this time than when I was a traveler. And as I go through the book word by word, page by page, every time I hit anything I think might be useful—dialogue line, sequence, description—I make a mark in the margin…Then maybe two weeks later, I read the book again, this time with a different color pen…And I repeat the same marking process—a line in the margin for anything I think might make the screenplay…When I am done with all my various color-marked readings—five or six of them—I should have the spine. I should know where the story starts, where it ends. The people should be in my head now.

Goldman doesn’t say this explicitly, but he implies that if a passage struck him on multiple passes, which he undertook at different times and in different states of mind, it’s likely to be more useful than one that caught his eye only once. Speaking of a page in Stephen King’s novel Misery that ended up with six lines in the margin—it’s the scene in which Annie cuts off Paul’s foot—Goldman writes: “It’s pretty obvious that whatever the spine of the piece was, I knew from the start it had to pass through this sequence.”

And a line or an exclamation point is sometimes all you need. Trying to keep more involved notes can even be a hindrance: not only do they slow you down, but they can distort your subsequent impressions. If a thought is worth having, it will probably occur to you each time you encounter the same passage. You often won’t know its true significance until later, and in the meantime, you should just keep going. (This is part of the reason why Walter Mosley recommends that writers put a red question mark next to any unresolved questions in the first draft, rather than trying to work them out then and there. Stopping to research something the first time around can easily turn into a form of procrastination, and when you go back, you may find that you didn’t need it at all.) Finally, it’s worth remembering that an exclamation point, a line in the margin, or a red question mark is subtly different on paper than on a computer screen. There are plenty of ways to flag sections in a text document, and I often use the search function in Microsoft Word that allows me to review everything I’ve underlined. But having a physical document that you periodically mark up in ink has benefits of its own. When you repeatedly go back to the same book, manuscript, or journal over the course of a project, you find that you’ve changed, but the pages have stayed the same. It starts to feel like a piece of yourself that you’ve externalized and put in a safe place. You’ll often be surprised by the clues that your past self has left behind, like a hobo leaving signs for others, or Leonard writing notes to himself in Memento, and it helps if the hints are a little opaque. Faced with that exclamation point, you ask yourself: “What was I thinking?” And there’s no better way to figure out what you’re thinking right now.

Written by nevalalee

April 20, 2017 at 9:08 am

The glory of being a clown


Ringling Bros. and Barnum & Bailey Circus souvenir program

When I was younger, there was a period in which I seriously considered becoming a clown. To understand why, you need to know two things. The first is that a clown lives out of a trunk. I was probably about six years old when I saw the Ringling Bros. and Barnum & Bailey Circus for the first time—it was the year that featured the notorious “living unicorn”—and I don’t remember much about the show itself. What I recall most vividly is the souvenir program, which I brought home and read to pieces. It was packed with information about the performers and their lives, but the tidbit that made the greatest impression on me was the fact that they’re always on the road: they travel by train, and if you want to be a clown, you need to fit all your possessions into that trunk. As a kid, I was always fantasizing about running off to live with nothing, relying on luck and my wits, and this seemed like the ultimate example. It fascinated me for the same reason that I’ve always been intrigued by buskers, except that a clown doesn’t work alone: he’s part of a community of circus folk with their own language and traditions who have managed to survive while moving from one gig to the next. I’ve written here before of how ephemeral the career of a dancer can seem, but clowning takes it to another level. It lacks even the superficial glamor of dance, leaving you with nothing but the life of which Homer Simpson once lamented: “When I started this clown thing, I thought it would be nothing but glory. You know, the glory of being a clown?”

The other important thing about clowns is that they have a college. (Or at least they did when I was growing up, although the original Ringling Bros. and Barnum & Bailey Clown College closed its doors nearly twenty years ago.) I first read about clown college in that souvenir program, and I don’t think I’ve ever gotten over the discovery that it existed. Even now, I can recite the description of the curriculum almost from memory. Tuition was free, but students had to pay for their own room, board, and grease paint. Subjects included costume and makeup design, tumbling, acrobatics, pantomime, juggling, stilt walking, and the history of comedy. The one catch was that if the circus offered you a contract at the end of the term, you were obliged to accept it for a year. Its graduates, I later learned, included Penn Jillette, Bill Irwin, and David Strathairn. But what struck me the most was that this was a place where instructors and students could come together to discuss something as peculiar as the theory and practice of clowning. When I look back at it, it seems possible that it was my first exposure to the idea of college of any kind, and the basic appeal of it never changed, even when I traded my fantasies of Venice, Florida, for Cambridge, Massachusetts. You could argue that learning how to become a clown is more practical than majoring in creative writing or film studies. And in each case, it allows an unlikely community to form around people who are otherwise persistently odd.

Ringling Bros. and Barnum & Bailey Circus souvenir program

It’s that sense of collective effort in the pursuit of strangeness, I think, that makes the circus so enticing. When we talk about running off to join the circus, what we’re actually saying is that we want to leave our responsibilities behind and join a troupe of likeminded individuals: free artists of themselves who require nothing but a vacant lot in order to put on a show. If I was saddened by the recent news that Ringling Bros. is closing after well over a century of operation, it’s because I feel a sense of loss at the end of the dream that it represented. There were aspects of the circus that deserved to be retired, and I wasn’t sorry when they finally put an end to their animal acts. But I never dreamed about being a lion tamer. I identified with the clowns, the trapeze artists, the acrobats, the contortionists, and all the others who symbolized the romance of devoting a life to a form of art that is inherently transient. To some extent, this is true of every artist, but what sets the circus apart is that its performers do it together, on the road, and for years on end. Directors like Fellini and Max Ophüls have been instinctively drawn to circus imagery, because it captures something fundamental about what they do for a living: they’re ringmasters with the ability to harness chaos for just long enough to make a movie. Yet this isn’t quite the same thing. A film, in theory, is something permanent, but a circus is over as soon as the show ends, and to make it last, you have to keep up the act forever.

And I’ve gradually come to realize that I did become a clown, at least in all the ways that count. (As Werner Herzog observes in Werner Herzog Eats His Shoe: “As you see [filmmaking] makes me into a clown. And that happens to everyone—just look at Orson Welles or look at even people like Truffaut. They have become clowns.”) I spent four years at college studying two dead languages that I haven’t used since graduation, which is either a cosmic joke in itself or an acknowledgment that the knowledge you acquire is less important than the fact that you’ve pursued it in the company of others. Later, I left my job to become a writer, an activity that I’ve since begun to understand is as ephemeral, in some respects, as that of a clown or ballet dancer: few of its fruits last for any longer than the time it takes to write them down, and you’re left with nothing but the process. Along the way, I’ve successively joined and departed from various communities of people who share the same goals. We’ve never traveled on a real train together, but we’re all bound for a common destination, and we’ve developed the same set of strategies to get there. The promise of the circus is that you can get paid for being a clown, if you’re willing to sacrifice every practical consideration and assume every indignity along the way, and that you’re not alone. In the end, the joke might be on you. But the joke is ultimately on all of us. And maybe the clowns are the only ones sane enough to understand this.

Written by nevalalee

January 16, 2017 at 9:32 am

Tales from The Far Side


"They're lighting their arrows!"

Last week, I finally saw The Revenant. I know that I’m pretty late to the party here, but I don’t have a chance to watch a lot of movies for grownups in the theater these days, and it wasn’t a film that my wife particularly wanted to see, so I had to wait for one of the rare weekends when she was out of town. At this point, a full review probably isn’t of much interest to anyone, so I’ll confine myself to observing that it’s an exquisitely crafted movie that I found very hard to take seriously. Alejandro G. Iñárritu, despite his obvious visual gifts, may be the most pretentious and least self-aware director at work today—which is one reason why Birdman fell so flat for me—and I would have liked The Revenant a lot more if it had allowed itself to smile a little at how absurd it all was. (Even the films of someone like Werner Herzog include flashes of dark humor, and I suspect that Herzog actively seeks out these moments, even if he maintains a straight face.) And it took me about five minutes to realize that the movie and I were fundamentally out of sync. It happened during the scene in which the fur trappers find themselves under attack by an Arikara war party, which announces itself, in classic fashion, with a sudden arrow through a character’s throat. A few seconds later, the camera pans up to show more arrows, now on fire, arcing through the trees overhead. It’s an eerie sight, and it’s given the usual glow by Emmanuel Lubezki’s luminous cinematography. But I’ll confess that when I first saw it, I said to myself: “Hey! They’re lighting their arrows! Can they do that?”

It’s a caption from a Far Side cartoon, of course, and it started me thinking about the ways in which the work of Gary Larson has imperceptibly shaped my inner life. I’ve spoken here before about how quotations from The Simpsons provide a kind of complete metaphorical language for fans, like the one that Captain Picard learns in “Darmok.” You could do much the same thing with Larson’s captions, and there are probably more fluent speakers alive than you might think. Peanuts is still the comic strip that has meant the most to me, and I count myself lucky that I grew up at a time when I could read most of Calvin and Hobbes in its original run. Yet both of these strips, like Bloom County, lived most vividly for me in the form of collections, and in the case of Peanuts, its best years were long behind it. The Far Side, by contrast, obsessed me on a daily basis, more than any other comic strip of its era. When I was eight years old, I spent a few months diligently cutting out all the panels from my local paper and pasting them into a scrapbook, which is an impulse that I hadn’t felt before and haven’t felt since. Two decades later, I got a copy of The Complete Far Side for Christmas, which might still be my favorite present ever. Every three years or so, I get bitten by the bug again, and I spend an evening or two with one of those huge volumes on my lap, going through the strip systematically from beginning to end. Its early years are rough and a little uncertain, but they’re still wonderful, and it went out when it was close to its peak. And when I’m reading it in the right mood, there’s nothing else in the world that I’d rather be doing.

"Think there are any bears in this old cave?"

A gag panel might seem like the lowest form of comic, but The Far Side also had a weirdly novelistic quality that I’ve always admired as a writer. Larson’s style seemed easy to imitate—I think that every high school newspaper had a strip that was either an homage or outright plagiarism—but his real gift was harder to pin down. It was the ability to take what feels like an ongoing story, pause it, and offer it up to readers at a moment of defining absurdity. (Larson himself says in The Prehistory of The Far Side: “Cartoons are, after all, little stories themselves, frozen at an interesting point in time.”) His ideas stuck in the brain because we couldn’t help but wonder what happened before or afterward. Part of this is because he cleverly employed all the usual tropes of the gag cartoon, which are fun precisely because of the imaginative fertility of the clichés they depict: the cowboys singing around a campfire, the explorers in pith helmets hacking their way through the jungle, the castaway on the desert island. But the snapshots in time that Larson captures are both so insane and so logical that the reader has no choice but to make up a story. The panel is never the inciting incident or the climax, but a ticklish moment somewhere in the middle. It can be the gigantic mailman knocking over buildings while a dog exhorts a crowd of his fellows: “Listen! The authorities are helpless! If the city’s to be saved, I’m afraid it’s up to us! This is our hour!” Or the duck hunter with a shotgun confronted by a row of apparitions in a hall of mirrors: “Ah, yes, Mr. Frischberg, I thought you’d come…but which of us is the real duck, Mr. Frischberg, and not just an illusion?”

As a result, you could easily go through a Far Side collection and use it as a series of writing prompts, like a demented version of The Mysteries of Harris Burdick. I’ve occasionally thought about writing a story revolving around the sudden appearance of Professor DeArmond, “the epitome of evil among butterfly collectors,” or expanding on the incomparable caption: “Dwayne paused. As usual, the forest was full of happy little animals—but this time something seemed awry.” It’s hard to pick just one favorite, but the panel I’ve thought about the most is probably the one with the elephant in the trench coat, speaking in a low voice out of the darkness of the stairwell:

Remember me, Mr. Schneider? Kenya. 1947. If you’re going to shoot at an elephant, Mr. Schneider, you better be prepared to finish the job.

Years later, I spent an ungodly amount of time working on a novel, still unpublished, about an elephant hunt, and while I wouldn’t go so far as to say that it was inspired by this cartoon, I’m also not prepared to say that it wasn’t. I should also note Larson’s mastery of perfect proper names, which are harder to come up with than you might think: “Mr. Frischberg” and “Mr. Schneider” were so nice that he said them twice. It’s that inimitable mixture of the ridiculous and the specific that makes Larson such a model for storytellers. He made it to the far side thirty years ago, and we’re just catching up to him now.

Written by nevalalee

September 27, 2016 at 8:58 am

The prankster principle


Totoro in Toy Story 3

In an interview with McKinsey Quarterly, Ed Catmull of Pixar was recently asked: “How do you, as the leader of a company, simultaneously create a culture of doubt—of being open to careful, systematic introspection—and inspire confidence?” He replied:

The fundamental tension [at Pixar] is that people want clear leadership, but what we’re doing is inherently messy. We know, intellectually, that if we want to do something new, there will be some unpredictable problems. But if it gets too messy, it actually does fall apart. And adhering to the pure, original plan falls apart, too, because it doesn’t represent reality. So you are always in this balance between clear leadership and chaos; in fact that’s where you’re supposed to be. Rather than thinking, “Okay, my job is to prevent or avoid all the messes,” I just try to say, “well, let’s make sure it doesn’t get too messy.”

Which sounds a lot like the observation from the scientist Max Delbrück that I never tire of quoting: “If you’re too sloppy, then you never get reproducible results, and then you never can draw any conclusions; but if you are just a little sloppy, then when you see something startling, you [can] nail it down…I called it the ‘Principle of Limited Sloppiness.’”

Most artists are aware that creativity requires a certain degree of controlled messiness, and scientists—or artists who work in fields where science and technology play a central role, as they do at Pixar—seem to be particularly conscious of this. As the zoologist John Zachary Young said:

Each individual uses the store of randomness, with which he was born, to build during his life rules which are useful and can be passed on…We might therefore take as our general picture of the universe a system of continuity in which there are two elements, randomness and organization, disorder and order, if you like, alternating with one another in such a fashion as to maintain continuity.

I suspect that scientists feel compelled to articulate this point so explicitly because there are so many other factors that discourage it in the pursuit of ordinary research. Order, cleanliness, and control are regarded as scientific virtues, and for good reason, which makes it all the more important to introduce a few elements of disorder in a systematic way. Or, failing that, to acknowledge the usefulness of disorder and to tolerate it to a certain extent.

Werner Herzog Eats His Shoe

When you’re working by yourself, you find that both your headspace and your workspace tend to arrive at whatever level of messiness works best for you. On any given day, the degree of clutter in my office is more or less the same, with occasional deviations toward greater or lesser neatness: it’s a nest that I’ve feathered into a comfortable setting for productivity—or inactivity, which often amounts to the same thing. It’s trickier when different personalities have to work together. What sets Pixar apart is its ability to preserve that healthy alternation between order and disorder, while still releasing a blockbuster movie every year. It does this, in part, by limiting the number of feature films that it has in production at any one time, and by building in systems for feedback and deconstruction, with an environment that encourages artists to start again from scratch. There’s also a tradition of prankishness that the company has tried to preserve. As Catmull says:

For example, when we were building Pixar, the people at the time played a lot of practical jokes on each other, and they loved that. They think it’s awesome when there are practical jokes and people do things that are wild and crazy…Without intending to, the culture slowly shifts. How do you keep the shift from happening? I can’t go out and say, “Okay, we’re going to organize some wild and crazy activities.” Top-down organizing of spontaneous activities isn’t a good idea.

It’s hard to scale up a culture of practical jokes, and Pixar has faced the same challenges here as elsewhere. The mixed outcomes of Brave and, to some extent, The Good Dinosaur show that the studio isn’t infallible, and a creative process that depends on a movie sucking for three out of four years can run into trouble when you shift that timeline. But the fact that Pixar places so much importance on this kind of prankishness is revealing in itself. It arises in large part from its roots in the movies, which have been faced with the problem of maintaining messiness in the face of big industrial pressures almost from the beginning. (Orson Welles spoke of “the orderly disorder” that emerges from the need to make quick decisions while moving large amounts of people and equipment, and Stanley Kubrick was constantly on the lookout for collaborators like Ken Adam who would allow him to be similarly spontaneous.) There’s a long tradition of pranks on movie sets, shading imperceptibly from the gags we associate with the likes of George Clooney to the borderline insane tactics that Werner Herzog uses to keep that sense of danger alive. The danger, as Herzog is careful to assure us, is more apparent than real, and it’s more a way of fruitfully disordering what might otherwise become safe and predictable. But just by the right amount. As the artist Frank Stella has said of his own work: “I disorder it a little bit or, I should say, I reorder it. I wouldn’t be so presumptuous to claim that I had the ability to disorder it. I wish I did.”

Forever and ever


The cover of David Bowie's Hours

I knew this day would come, but I allowed myself to hope that it never would. When I first became aware of David Bowie, it happened to be at a point in his career when it seemed as if he had been around forever, and he was everywhere you looked. My dad, a longtime fan, had bought Let’s Dance just like everyone else—he and my mom even saw Bowie perform on the Serious Moonlight tour—and my parents still talk about watching me sing along as a toddler to “Modern Love.” Later, of course, there was Labyrinth, along with so much else that is so deeply embedded in my subconscious that I can’t imagine a world without it. But it took me a long time to realize that I was encountering Bowie at a moment that was a clear outlier in the larger story of his life. The massive success of Let’s Dance, which had originally been intended as a one-off detour, transformed him into a mainstream pop superstar for the first time, and he followed it with a string of commercially minded albums that most critics, along with Bowie himself, rank low in his body of work. But I still love what Sasha Frere-Jones has called “the blocky drums and sports-bar guitars” of this period. It’s richer, weirder stuff than it initially seems, and it’s the first version that comes to mind whenever I think about David Bowie. Which is an awful lot. In fact, as the years pass, I find that I’ve spent most of my life thinking about Bowie pretty much all the time.

When an artist has such a long, productive career and you tune in halfway through, you tend to see his or her music in two parallel chronologies. There’s the true chronology, which you start to piece together as you work backward and forward through the discography and listen to the songs in the order in which they were written and recorded. And there’s the autobiographical chronology, in which the albums assume positions in your memory based on when you listened to them the most. This doesn’t have much to do with their proper release dates: the songs situate themselves in your life wherever they can fit, like enzymes locking onto substrates, and they end up spelling out a new message. If the Bowie of the eighties takes me back to my childhood, I can’t listen to Scary Monsters without being plunged right away into my senior year of high school, in which I listened to it endlessly on a Discman and headphones while riding the train up to Berkeley. My arrival in New York after college was scored to Hours, an album often seen as forgettable, but which contains a handful of Bowie’s loveliest songs, especially “Thursday’s Child” and “Survive.” “Modern Love” played at my wedding. And it’s hard to think of a chapter in my life when he wasn’t important. He was such a given, in fact, that it took me a long time to get a sense of the shape of his career as a whole, in the same way that there are enormous swaths in the lives of your parents that you’ve never bothered to ask about because they’ve always been there.

David Bowie

I saw Bowie perform live twice. The first was the Outside tour with Nine Inch Nails as his opening act, and it was my first rock concert ever: Bowie came onstage to the sound of “Subterraneans” and intoned the lyrics to “Scary Monsters” as a spoken-word piece, an unforgettable moment that I was recently delighted to find online. Much later, I saw him in New York with my brother, with whom I’d also caught a retrospective at the Museum of Television and Radio—this was in the years before YouTube—that collected many of his old videos and performance clips, playing continuously on a screen in a tiny darkened room. By then, Bowie was an institution. He was so established that he had issued bonds secured by royalties from his back catalog, and going back over pictures and footage from his early days was like looking at snapshots of your father and marveling at how long his hair used to be. And occasionally it occurred to me that Bowie would have to die one day, much as I still think the same about Francis Coppola or Werner Herzog. It seemed inconceivable, although hints of mortality are woven throughout his catalog. (As I wrote on this blog once: “And the skull grins through even his most unabashedly mainstream moments. If you listen carefully to ‘Let’s Dance,’ you can hear something rattling in the background, alongside the slick horns and synthetic percussion. It’s the sound of Bowie’s false teeth.”) If David Bowie can die, it means that none of us are safe.

After reading the news, the first song I played was “Starman.” I don’t think I’m alone. But the way that song came back into my life is revealing in itself. I’d always been vaguely aware of it, from The Life Aquatic if nothing else—which links Bowie indelibly in my mind with Bill Murray, another celebrity whose departure I anticipate with dread. But I didn’t listen to it closely until I got a copy of his recent greatest hits album Nothing Has Changed. (It was a Christmas present from my brother, which is just another reminder of how entwined Bowie has been in the story of my family.) It’s an eclectic collection of songs on two chunky vinyl discs, with different track listings depending on the format, and it both reminded me of some old favorites and reintroduced me to songs that, for whatever reason, had never been integrated into my internal playlist. The best part was playing it for my two-year-old daughter, who has since been known to ask for Bowie by name. She can sing along to “Changes,” as she did unprompted when I pulled out the album this morning, and to “Heroes,” with her little voice sounding strong and clear: “We can beat dem / Forevah and evah…” It makes me feel like I’m maintaining some kind of continuity. And the phrase “forever and ever” has become a regular part of her vocabulary. She’ll ask: “Am I going to be three forever and ever?” And when it’s time to turn off the lights, and I sit on the edge of her bed, she asks: “Will you stay with me forever and ever?” I want to say yes, but of course I can’t. And neither could David Bowie.

The certainty of salvation


Lotte Eisner and Werner Herzog

We German filmmakers were still a fragile group in 1974, so when a friend called me from Paris to say that Lotte [Eisner] had suffered a massive stroke and I should get on the next airplane, I started looking for flights, before realizing it wasn’t the correct way to proceed. I was unable to accept that Lotte might die, and though it was the start of the onslaught of an early winter, I decided to walk from Munich to Paris. My pilgrimage was a million steps in rebellion against her death.

I stuffed a bundle of clothes and a map into a duffel bag, then set off in the straightest line possible, sleeping under bridges, in farms and abandoned houses. I made only one detour—to the town of Troyes, where I marveled at the cathedral—and ended up walking across the Vosges mountains for about twenty miles…I’m not superstitious, but did feel that coming by foot would prevent Lotte’s death. The Catholic Church has a wonderful term for this: Heilsgewissheit, the certainty of salvation.

I moved with the faith of a pilgrim, convinced that Lotte would be alive when I got to Paris four weeks later. When I arrived in town I stopped at a friend’s place to take shelter from the rain and sat in his office, steam coming off my clothes, utterly exhausted after having walked the last fifty miles without a break. I gave him my compass, which I no longer needed, and walked to Lotte’s home. She was very surprised, but happy to see me.

Years later, bedridden and nearly blind, unable to read or see films, Lotte wrote to me, asking if I would visit her. I went to Paris, where she told me, “Werner, there is still some spell cast that prevents me from dying. But I can barely walk. I am saturated with life [lebenssatt]. It would be a good time for me now.” Jokingly, I said, “Lotte, I hereby lift the spell.” Two weeks later she died. It was the right moment for her.

Werner Herzog, A Guide for the Perplexed

Written by nevalalee

January 1, 2016 at 6:34 am
