Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

The fairy tale theater

It must have all started with The Princess Switch, although that’s so long ago now that I can barely remember. Netflix was pushing me hard to watch an original movie with Vanessa Hudgens in a dual role as a European royal and a baker from Chicago who trade places and end up romantically entangled with each other’s love interests at Christmas, and I finally gave in. In the weeks since, my wife and I have watched Pride, Prejudice, and Mistletoe; The Nine Lives of Christmas; Crown for Christmas; The Holiday Calendar; Christmas at the Palace; and possibly one or two others that I’ve forgotten. A few were on Netflix, but most were on Hallmark, which has staked out this space so aggressively that it can seem frighteningly single-minded in its pursuit of Yuletide cheer. By now, it airs close to forty original holiday romances between Thanksgiving and New Year’s Eve, and like its paperback predecessors, it knows better than to tinker with a proven formula. As two of its writers anonymously reveal in an interview with Entertainment Weekly:

We have an idea and it maybe takes us a week or so just to break it down into a treatment, a synopsis of the story; it’s like a beat sheet where you pretty much write what’s going to happen in every scene you just don’t write the scene. If we have a solid beat sheet done and it’s approved, then it’s only going to take us about a week and a half to finish a draft. Basically, an act or two a day and there’s nine. They’re kind of simple because there are so many rules so you know what you can and can’t do, and if you have everything worked out it comes together.

And the rules are revealing in themselves. As one writer notes: “The first rule is snow. We really wanted to do one where the basic conflict was a fear that there will not be snow on Christmas. We were told you cannot do that, there must be snow. They can’t be waiting for the snow, there has to be snow. You cannot threaten them with no snow.” And the conventions that make these movies so watchable are built directly into the structure:

There cannot be a single scene that does not acknowledge the theme. Well, maybe a scene, but you can’t have a single act that doesn’t acknowledge it and there are nine of them, so there’s lots of opportunities for Christmas. They have a really rigid nine-act structure that makes writing them a lot of fun because it’s almost like an exercise. You know where you have to get to: People have to be kissing for the first time, probably in some sort of a Christmas setting, probably with snow falling from the sky, probably with a small crowd watching. You have to start with two people who, for whatever reason, don’t like each other and you’re just maneuvering through those nine acts to get them to that kiss in the snow.

The result, as I’ve learned firsthand, is a movie that seems familiar before you’ve even seen it. You can watch with one eye as you’re wrapping presents, or tune in halfway through with no fear of becoming confused. It allows its viewers to give it exactly as much attention as they’re willing to spare, and at a time when the mere act of watching prestige television can be physically exhausting, there’s something to be said for an option that asks nothing of us at all.

After you’ve seen two or three of these movies, of course, the details start to blur, particularly when it comes to the male leads. The writers speak hopefully of making the characters “as unique and interesting as they can be within the confines of Hallmark land,” but while the women are allowed an occasional flash of individuality, the men are unfailingly generic. This is particularly true of the subgenre in which the love interest is a king or prince, who doesn’t get any more personality than his counterpart in fairy tales. Yet this may not be a flaw. In On Directing Film, which is the best work on storytelling that I’ve ever read, David Mamet provides a relevant word of advice:

In The Uses of Enchantment, Bruno Bettelheim says of fairy tales the same thing Alfred Hitchcock said about thrillers: that the less the hero of the play is inflected, identified, and characterized, the more we will endow him with our own internal meaning—the more we will identify with him—which is to say the more we will be assured that we are that hero. “The hero rode up on a white horse.” You don’t say “a short hero rode up on a white horse,” because if the listener isn’t short he isn’t going to identify with that hero. You don’t say “a tall hero rode up on a white horse,” because if the listener isn’t tall, he won’t identify with the hero. You say “a hero,” and the audience subconsciously realize they are that hero.

Yet Mamet also overlooks the fact that the women in fairy tales, like Snow White, are often described with great specificity—it’s the prince who is glimpsed only faintly. Hallmark follows much the same rule, which implies that it’s less important for the audience to identify with the protagonist than to fantasize without constraint about the object of desire.

This also leads to some unfortunate decisions about diversity, which is more or less what you might expect. As one writer says candidly to Entertainment Weekly:

On our end, we just write everybody as white, we don’t even bother to fight that war. If they want to put someone of color in there, that would be wonderful, but we don’t have control of that…I found out Meghan Markle had been in some and she’s biracial, but it almost seems like they’ve tightened those restrictions more recently. Everything’s just such a white, white, white, white world. It’s a white Christmas after all—with the snow and the people.

With more than thirty original movies coming out every year, you might think that Hallmark could make a few exceptions, especially since the demand clearly exists, but this isn’t about marketing at all. It’s a reflection of the fact that nonwhiteness is still seen as a token of difference, or a deviation from an assumed norm, and it’s the logical extension of the rules that I’ve listed above. White characters have the privilege—which is invisible but very real—of seeming culturally uninflected, which is the baseline that allows the formula to unfold. This seems very close to John W. Campbell’s implicit notion that all characters in science fiction should be white males by default, and while other genres have gradually moved past this point, it’s still very much the case with Hallmark. (There can be nonwhite characters, but they have to follow the rules: “Normally there’ll be a black character that’s like a friend or a boss, usually someone benevolent because you don’t want your one person of color to not be positive.”) With diversity, as with everything else, Hallmark is very mindful of how much variation its audience will accept. It thinks that it knows the formula. And it might not even be wrong.

Ghosts and diversions

Over the weekend, after I heard that the magician Ricky Jay had died, I went back to revisit the great profile, “Secrets of the Magus,” that Mark Singer wrote over a quarter of a century ago for The New Yorker. Along with Daniel Zalewski’s classic piece on Werner Herzog, it’s one of the articles in that magazine that I’ve thought about and reread the most, but what caught my attention this time around was a tribute from David Mamet:

I’ll call Ricky on the phone. I’ll ask him—say, for something I’m writing—“A guy’s wandering through upstate New York in 1802 and he comes to a tavern and there’s some sort of mountebank. What would the mountebank be doing?” And Ricky goes to his library and then sends me an entire description of what the mountebank would be doing. Or I’ll tell him I’m having a Fourth of July party and I want to do some sort of disappearance in the middle of the woods. He says, “That’s the most bizarre request I’ve ever heard. You want to do a disappearing effect in the woods? There’s nothing like that in the literature. I mean, there’s this one 1760 pamphlet—Jokes, Tricks, Ghosts, and Diversions by Woodland, Stream and Campfire. But, other than that, I can’t think of a thing.” He’s unbelievably generous. Ricky’s one of the world’s great people. He’s my hero. I’ve never seen anybody better at what he does.

Coming from Mamet, this is high praise indeed, and it gets at most of the reasons why Ricky Jay was one of my heroes, too. Elsewhere in the article, Mamet says admiringly: “I regard Ricky as an example of the ‘superior man,’ according to the I Ching definition. He’s the paradigm of what a philosopher should be: someone who’s devoted his life to both the study and the practice of his chosen field.”

And what struck me on reading these lines again was how deeply Jay’s life and work were tied up in books. A bookseller quoted in Singer’s article estimates that Jay spent more of his disposable income on rare books than anyone else he knew, and his professional legacy may turn out to be even greater as a writer, archivist, and historian than as a master of sleight of hand. (“Though Jay abhors the notion of buying books as investments, his own collection, while it is not for sale and is therefore technically priceless, more or less represents his net worth,” Singer writes. And I imagine that a lot of his fellow collectors are very curious about what will happen to his library now.) His most famous book as an author, Learned Pigs & Fireproof Women, includes a chapter on Arthur Lloyd, “The Human Card Index,” a vaudevillian renowned for his ability to produce anything printed on paper—a marriage license, ringside seats to a boxing match, menus, photos of royalty, membership cards for every club imaginable—from his pockets on demand. This feels now like a metaphor for the mystique of Jay himself, who fascinated me for many of the same reasons. Like most great magicians, he exuded an aura of arcane wisdom, but in his case, this impression appears to have been nothing less than the truth. Singer quotes the magician Michael Weber:

Magic is not about someone else sharing the newest secret. Magic is about working hard to discover a secret and making something out of it. You start with some small principle and you build a theatrical presentation out of it. You do something that’s technically artistic that creates a small drama. There are two ways you can expand your knowledge—through books and by gaining the confidence of fellow magicians who will explain these things. Ricky to a large degree gets his information from books—old books—and then when he performs for magicians they want to know, “Where did that come from?” And he’s appalled that they haven’t read this stuff.

As a result, Jay had the paradoxical image of a man who was immersed in the lore of magic while also keeping much of that world at arm’s length. “Clearly, Jay has been more interested in the craft of magic than in the practical exigencies of promoting himself as a performer,” Singer writes, and Jay was perfectly fine with that reputation. In Learned Pigs, Jay writes admiringly of the conjurer Max Malini:

Yet far more than Malini’s contemporaries, the famous conjurers Herrmann, Kellar, Thurston, and Houdini, Malini was the embodiment of what a magician should be—not a performer who requires a fully equipped stage, elaborate apparatus, elephants, or handcuffs to accomplish his mysteries, but one who can stand a few inches from you and with a borrowed coin, a lemon, a knife, a tumbler, or a pack of cards convince you he performs miracles.

This was obviously how Jay liked to see himself, as he says with equal affection of the magician Dai Vernon: “Making money was only a means of allowing him to sit in a hotel room and think about his art, about cups and balls and coins and cards.” Yet the reality must have been more complicated. You don’t become as famous or beloved as Ricky Jay without an inhuman degree of ambition, however carefully hidden, and he cultivated attention in ways that allowed him to maintain his air of remove. Apart from Vernon, his other essential mentor was Charlie Miller, who seems to have played the same role in the lives of other magicians that Joe Ancis, “the funniest man in New York City,” did for Lenny Bruce. Both were geniuses who hated to perform, so they practiced their art for a small handful of confidants and fellow obsessives. And the fact that Jay, by contrast, lived the kind of life that would lead him to be widely mourned by the public indicates that there was rather more to him than the reticent persona that he projected.

Jay did perform for paying audiences, of course, and Singer’s article closes with his preparations for a show, Ricky Jay and His 52 Assistants, that promises to relieve him from the “tenuous circumstances” that result from his devotion to art. (A decade later, my brother and I went to see his second Broadway production, On the Stem, which is still one of my favorite memories from a lifetime of theatergoing.) But he evidently had mixed feelings about the whole enterprise, which left him even more detached from the performers with whom he was frequently surrounded. As Weber notes: “Ricky won’t perform for magicians at magic shows, because they’re interested in things. They don’t get it. They won’t watch him and be inspired to make magic of their own. They’ll be inspired to do that trick that belongs to Ricky…There’s this large body of magic lumpen who really don’t understand Ricky’s legacy—his contribution to the art, his place in the art, his technical proficiency and creativity. They think he’s an élitist and a snob.” Or as the writer and mentalist T.A. Waters tells Singer:

Some magicians, once they learn how to do a trick without dropping the prop on their foot, go ahead and perform in public. Ricky will work on a routine a couple of years before even showing anyone. One of the things that I love about Ricky is his continued amazement at how little magicians seem to care about the art. Intellectually, Ricky seems to understand this, but emotionally he can’t accept it. He gets as upset about this problem today as he did twenty years ago.

If the remarkable life that he lived is any indication, Jay never did get over it. According to Singer, Jay once asked Dai Vernon how he dealt with the intellectual indifference of other magicians to their craft. Vernon responded: “I forced myself not to care.” And after his friend’s death, Jay said wryly: “Maybe that’s how he lived to be ninety-eight years old.”

The confidence tricksters

When I look back at my life, I find that I’ve always been fascinated by a certain type of personality, at least when observed from a safe distance. I may as well start with Orson Welles, who has been on my mind a lot recently. As David Thomson writes in Rosebud: “Yes, he was a trickster, a rather nasty operator, a credit thief, a bully, a manipulator, a shallow genius…a less than wholesome great man…oh, very well, a habitual liar, a liar of genius.” But in his discussion of the late masterwork F for Fake, Thomson also hints at the essence of Welles’s appeal:

The happiness in F for Fake, the exhilaration, comes from the discovery and the jubilation that knows there is no higher calling than being a magician, a storyteller, a fake who passes the time. This is the work in which Welles finally reconciled the lofty, European, intellectual aspect of himself and the tent show demon who sawed cute dames and wild dreams in half. For it can be very hard to live with the belief that nothing matters in life, that nothing is solid or real, that everything is a show in the egotist’s head. It loses friends, trust, children, home, money, security, and maybe reason. So it is comforting indeed, late in life, to come upon a proof that the emptiness and the trickery are valid and sufficient.

Welles claimed afterward that he had been “faking” his confession of being a charlatan, as if it were somehow incompatible with being an artist—although the great lesson of his life is that it can be possible and necessary to be both at the same time.

This is the kind of figure to whom I’m helplessly drawn—the genius who is also a con artist. You could even make much the same case, with strong reservations, for L. Ron Hubbard. I don’t like him or most of his work, and he caused more pain to other people than anyone else in Astounding. Yet the best clue I’ve ever found to figuring out his character is a passage by Lawrence Wright, who writes shrewdly in Going Clear:

The many discrepancies between Hubbard’s legend and his life have overshadowed the fact that he genuinely was a fascinating man…The tug-of-war between Scientologists and anti-Scientologists over Hubbard’s biography has created two swollen archetypes: the most important person who ever lived and the world’s greatest con man. Hubbard himself seemed to revolve on this same axis…But to label him a pure fraud is to ignore the complex, charming, delusional, and visionary features of his character that made him so compelling.

I’ve spent more time thinking about this than I ever wanted, and I’ve grudgingly concluded that Wright has a point. Hubbard was frankly more interesting than most of his detractors, and he couldn’t have accomplished half of what he did if it weren’t for his enormous, slippery gifts for storytelling, in person if not on the page. (On some level, he also seems to have believed in his own work, which complicates our picture of him as a con artist—although he certainly wasn’t averse to squeezing as much money out of his followers as possible.) I’ve often compared Welles to Campbell, but he has equally profound affinities with Hubbard, whose favorite film was Citizen Kane, and who perpetuated a science fiction hoax that dwarfed The War of the Worlds.

But I’m also attracted by such examples because they get at something crucial about the life of any artist, in which genius and trickery are often entwined. I don’t think of myself as a particularly devious person, but I’ve had to develop certain survival skills just to keep working, and a lot of writers come to think of themselves in the fond terms that W.H. Auden uses in The Dyer’s Hand:

All those whose success in life depends neither upon a job which satisfies some specific and unchanging social need, like a farmer’s, nor, like a surgeon’s, upon some craft which he can be taught by others and improve by practice, but upon “inspiration,” the lucky hazard of ideas, live by their wits, a phrase which carries a slightly pejorative meaning. Every “original” genius, be he an artist or a scientist, has something a bit shady about him, like a gambler or madman.

The similarities between the artist and the confidence man tend to appeal to authors with a high degree of technical facility, like David Mamet, who returns to the subject obsessively. In the lovely essay “Pool Halls,” Mamet writes: “The point of the pool hall was the intersection of two American Loves: the Game of Skill and the Short Con…Well, I guess that America is gone. We no longer revere skill, and the short con of the pool hustle and the Murphy Man and the Fuller Brush Man. The short con, which flourished in a life lived on the street and among strangers, has been supplanted by the Big Con of a life with no excitement in it at all.”

As Mamet implies, there’s something undeniably American about these figures. The confidence man has been part of this country’s mythology from the beginning, undoubtedly because it was a society that was inventing itself as it went along. There’s even an element of nostalgia at work. But I also don’t want to romanticize it. Most of our trickster heroes are white and male, which tells us something about the privilege that underlies successful fakery. A con man, like a startup founder, has to evade questions for just long enough to get away with it. That’s true of most artists, too, and the quintessentially American advice to fake it till you make it applies mostly to those who have the cultural security to pull it off. (If we’re so fascinated by confidence tricksters who were women, it might be because they weren’t held back by impostor syndrome.) Of course, the dark side of this tradition, which is where laughter dies in the throat, can be seen in the White House, which is currently occupied by the greatest con artist in American history. I don’t even mean this as an insult, but as a fundamental observation. If we’re going to venerate the con man as an American archetype, we have to acknowledge that Trump has consistently outplayed us all, even when the trick, or troll, was unfolding in plain sight. This also says something about our national character, and if Trump reminds me of Hubbard, he’s also forced me to rethink Citizen Kane. But there’s another side to the coin. During times of oppression and reaction, a different kind of deviousness can emerge, one that channels these old impulses toward ingenuity, inventiveness, resourcefulness, humor, and trickery, which are usually used to further the confidence man’s private interests, toward very different goals. If we’re going to make it through the next two years, we need to draw deeply on this tradition of genius. I’ll be talking about this more tomorrow.

Written by nevalalee

November 8, 2018 at 8:32 am

Quote of the Day

One must look honestly at what one has done, and compare it to what one was trying to do. To learn useful mechanical lessons from the comparison is difficult; many workers in the theater never learn to do it.

David Mamet, Writing in Restaurants

Written by nevalalee

October 10, 2018 at 7:30 am

The writer’s defense

“This book will be the death of me,” the writer Jose Chung broods to himself halfway through my favorite episode of Millennium. “I just can’t write anymore. What possessed me to be a writer anyway? What kind of a life is this? What else can I do now, with no other skills or ability? My life has fizzled away. Only two options left: suicide, or become a television weatherman.” I’ve loved this internal monologue—written by Darin Morgan and delivered by the great Charles Nelson Reilly—ever since I first heard it more than two decades ago. (As an aside, it’s startling for me to realize that just four short years separated the series premiere of The X-Files from “Jose Chung’s Doomsday Defense,” which was enough time for an entire fictional universe to be born, splinter apart, and reassemble itself into a better, more knowing incarnation.) And I find that I remember Chung’s words every time I sit down to write something new. I’ve been writing for a long time now, and I’m better at it than I am at pretty much anything else, but I still have to endure something like a moment of existential dread whenever I face the blank page for the first time. For the duration of the first draft, I regret all of my decisions, and I wonder whether there’s still a chance to try something else instead. Eventually, it passes. But it always happens. And after spending over a decade doing nothing else but writing, I’ve resigned myself to the fact that it’s always going to be this way.

Which doesn’t mean that there aren’t ways of dealing with it. In fact, I’ve come to realize that most of my life choices are designed to minimize the amount of time that I spend writing first drafts. This means nothing else but the physical act of putting down words for the first time, which is when I tend to hit my psychological bottom. Everything else is fine by comparison. As a result, I’ve shunted aspects of my creative process to one side or the other of the rough draft, which persists as a thin slice of effort between two huge continents of preparation and consolidation. I prefer to do as much research in advance as I can, and I spend an ungodly amount of time on outlines, which I’ve elsewhere described as a stealth first draft that I can trick myself into thinking doesn’t matter. My weird, ritualistic use of mind maps and other forms of random brainstorming is another way to generate as many ideas as possible before I need to really start writing. When I finally start the first draft, I make a point of never going back to read it until I’ve physically typed out the entire thing, with my outline at my elbow, as if I’m just transcribing something that already exists. Ideally, I can crank out that part of the day’s work in an hour or less. Once it’s there on the screen, I can begin revising, taking as many passes as possible without worrying too much about any given version. In the end, I somehow end up with a draft that I can stand to read. It isn’t entirely painless, but it involves less pain than any other method that I can imagine.

And these strategies are all just specific instances of my favorite piece of writing advice, which I owe to the playwright David Mamet. I haven’t quoted it here for a while, so here it is again:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

As I’ve noted before, I badly wish that I could somehow send this paragraph back in time to my younger self, because it would have saved me years of wasted effort. But what Mamet doesn’t mention, perhaps because he thought that it was obvious, is that buried in that list of “achievable steps” is a monster of a task that can’t be eliminated, only reduced. There’s no getting around the time that you spend in front of the blank page, and even the best outline in the world can only take away so much of the pain. (An overly detailed outline may even cause problems later, if it leads to a work that seems lifeless and overdetermined—which leaves us with the uncomfortable fact that a certain amount of pain at the writing stage is necessary to avoid even greater trouble in the future.)

Of course, if you’re just looking to minimize the agony of writing that first draft, there are easier ways to anesthetize yourself. Jose Chung pours himself a glass of whiskey, and I’ve elsewhere characterized the widespread use of mind-altering chemicals by writers—particularly caffeine, nicotine, and alcohol—as a pragmatic survival tactic, like the other clichés that we associate with the bohemian life. And I haven’t been immune. For years, I’d often have a drink while working at night, and it certainly didn’t hurt my productivity. (A ring of discolored wood eventually appeared on the surface of my desk from the condensation on the glass, which said more about my habits than I realized at the time.) After I got married, and especially after I became a father, I had to drastically rethink my writing schedule. I was no longer writing long into the evening, but trying to cram as much work as I could into a few daylight hours, leaving me and my wife with a little time to ourselves after our daughter went to bed. As a result, the drinking stopped, and the more obsessive habits that I’ve developed in the meantime are meant to reduce the pain of writing with a clear head. This approach isn’t for everyone, and it may not work for anyone else at all. But it’s worth remembering that when you look at a reasonably productive writer, you’re really seeing a collection of behaviors that have accrued around the need to survive that daily engagement with the empty page. And if they tend to exhibit such an inexplicable range of strategies, vices, and rituals, ultimately, they’re all just forms of defense.

Written by nevalalee

September 12, 2018 at 8:21 am

My ten creative books #10: A Guide for the Perplexed

Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.

As regular readers know, I’m a Werner Herzog fan, but not a completist—I’ve seen maybe five of his features and three or four of his documentaries, which leaves a lot of unexplored territory, and I’m not ashamed to admit that Woyzeck put me to sleep. Yet Herzog himself is endlessly fascinating. Daniel Zalewski’s account of the making of Rescue Dawn is one of my five favorite articles ever to appear in The New Yorker, and if you’re looking for an introduction to his mystique, there’s no better place to start. For a deeper dive, you can turn to A Guide for the Perplexed, an expanded version of a collection of the director’s interviews with Paul Cronin, which was originally published more than a decade ago. As I’ve said here before, I regret the fact that I didn’t pick up the first edition when I had the chance, and I feel that my life would have been subtly different if I had. Not only is it the first book I’d recommend to anyone considering a career in filmmaking, it’s almost the first book I’d recommend to anyone considering a career in anything at all. It’s huge, but every paragraph explodes with insight, and you can open it to any page and find yourself immediately transfixed. Here’s one passage picked at random:

Learn to live with your mistakes. Study the law and scrutinize contracts. Expand your knowledge and understanding of music and literature, old and modern. Keep your eyes open. That roll of unexposed celluloid you have in your hand might be the last in existence, so do something impressive with it. There is never an excuse not to finish a film. Carry bolt cutters everywhere.

Or take Herzog’s description of his relationship with his cinematographer: “Peter Zeitlinger is always trying to sneak ‘beautiful’ shots into our films, and I’m forever preventing it…Things are more problematic when there is a spectacular sunset on the horizon and he scrambles to set up the camera to film it. I immediately turn the tripod 180 degrees in the other direction.”

And this doesn’t even touch on Herzog’s stories, which are inexhaustible. He provides his own point of view on many famous anecdotes, like the time he was shot on camera while being interviewed by the BBC—the bullet was stopped by a catalog in his jacket pocket, and he asked to keep going—or how he discouraged Klaus Kinski from abandoning the production of Aguirre: The Wrath of God. (“I told him I had a rifle…and that he would only make it as far as the next bend in the river before he had eight bullets in his head. The ninth would be for me.”) We see Herzog impersonating a veterinarian at the airport to rescue the monkeys that he needed for Aguirre; forging an impressive document over the signature of the president of Peru to gain access to locations for Fitzcarraldo; stealing his first camera; and shooting oil fires in Kuwait under such unforgiving conditions that the microphone began to melt. Herzog is his own best character, and he admits that he can sometimes become “a clown,” but his example is enough to sustain and nourish the rest of us. In On Directing Film, David Mamet writes:

But listen to the difference between the way people talk about films by Werner Herzog and the way they talk about films by Frank Capra, for example. One of them may or may not understand something or other, but the other understands what it is to tell a story, and he wants to tell a story, which is the nature of dramatic art—to tell a story. That’s all it’s good for.

Herzog, believe it or not, would agree, and he recommends Casablanca and The Treasure of the Sierra Madre as examples of great storytelling. And the way in which Herzog and Capra’s reputations have diverged since Mamet wrote those words, over twenty years ago, is illuminating in itself. A Guide for the Perplexed may turn out to be as full of fabrications as Capra’s own memoirs, but they’re the kind of inventions, like the staged moments in Herzog’s “documentaries,” that get at a deeper truth. As Herzog says of another great dreamer: “The difference between me and Don Quixote is, I deliver.”

My ten creative books #7: On Directing Film

with 2 comments


Note: I’m counting down ten books that have influenced the way that I think about the creative process, in order of the publication dates of their first editions. It’s a very personal list that reflects my own tastes and idiosyncrasies, and I’m always looking for new recommendations. You can find the earlier installments here.

When it comes to giving advice on something as inherently unteachable as writing, books on the subject tend to fall into one of three categories. The first treats the writing manual as an extension of the self-help genre, offering what amounts to an extended pep talk that is long on encouragement but short on specifics. A second, more useful approach is to consolidate material on a variety of potential strategies, either through the voices of multiple writers—as George Plimpton did so wonderfully in The Writer’s Chapbook, which assembles the best of the legendary interviews given to The Paris Review—or through the perspective of a writer and teacher, like John Gardner, generous enough to consider the full range of what the art of fiction can be. And the third, exemplified by David Mamet’s On Directing Film, is to lay out a single, highly prescriptive recipe for constructing stories. This last approach might seem unduly severe. Yet after a lifetime of reading what other writers have to say on the subject, Mamet’s little book is still the best I’ve ever found, not just for film, but for fiction and narrative nonfiction as well. On one level, it can serve as a starting point for your own thoughts about how the writing process should look: Mamet provides a strict, almost mathematical set of tools for building a plot from first principles, and even if you disagree with his methods, they clarify your thinking in a way that a more generalized treatment might not. But even if you just take it at face value, it’s still the closest thing I know to a foolproof formula for generating rock-solid first drafts. (If Mamet himself has a flaw as a director, it’s that he often stops there.) In fact, it’s so useful, so lucid, and so reliable that I sometimes feel reluctant to recommend it, as if I were giving away an industrial secret to my competitors.

Mamet’s principles are easy to grasp, but endlessly challenging to follow. You start by figuring out what every scene is about, mostly by asking one question: “What does the protagonist want?” You then divide each scene up into a sequence of beats, consisting of an immediate objective and a logical action that the protagonist takes to achieve it, ideally in a form that can be told in visual terms, without the need for expository dialogue. And you repeat the process until the protagonist succeeds or fails at his or her ultimate objective, at which point the story is over. This may sound straightforward, but as soon as you start forcing yourself to think this way consistently, you discover how tough it can be. Mamet’s book consists of a few simple examples, teased out in a series of discussions at a class he taught at Columbia, and it’s studded with insights that once heard are never forgotten: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” “Here is a tool—choose your shots, beats, scenes, objectives, and always refer to them by the names you chose.” “Keep it simple, stupid, and don’t violate those rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.” “The audience doesn’t want to read a sign; they want to watch a motion picture.” “A good writer gets better only by learning to cut, to remove the ornamental, the descriptive, the narrative, and especially the deeply felt and meaningful.” “Now, why did all those Olympic skaters fall down? The only answer I know is that they hadn’t practiced enough.” And my own personal favorite: “The nail doesn’t have to look like a house; it is not a house. It is a nail. If the house is going to stand, the nail must do the work of a nail. To do the work of the nail, it has to look like a nail.”

Written by nevalalee

August 7, 2018 at 9:00 am

Life on the last mile

with 2 comments

In telecommunications, there's a concept called "the last mile," which holds that the final leg of a network—the one that actually reaches the user's home, school, or office—is the most difficult and expensive to build. It's one thing to construct a massive trunkline, which is basically a huge but relatively straightforward feat of engineering, and quite another to deal with the tangle of equipment, wiring, and specifications on the level of thousands of individual households. More recently, the concept has been extended to public transportation, delivery and distribution services, and other fields that depend on connecting an industrial operation on the largest imaginable scale with specific situations on the retail side. (For instance, Amazon has been trying to cross the last mile through everything from its acquisition of Whole Foods to drone delivery, and the fact that these are seen as alternative approaches to the same problem points to how complicated it really is.) This isn't just a matter of infrastructure, either, but of the difficulties inherent to any system in which a single pipeline has to split into many smaller branches, whether it's carrying blood, water, mail, or data. Ninety percent of the wiring can be in that last mile, and success lies less in any overall principles than in the irritating particulars. It has to be solved on the ground, rather than in a design document, and you'll never be able to anticipate all of the obstacles that you'll face once those connections start to multiply. It's literally about the ramifications.

I often feel the same way when it comes to writing. When I think back on how I've grown as a writer over the last decade or so, I see clear signs of progress. Thanks mostly to the guidelines that David Mamet presents in On Directing Film, it's much easier for me to write a decent first draft than it was when I began. I rarely leave anything unfinished; I know how to outline and how to cut; and I'm unlikely to make any huge technical mistakes. In his book Which Lie Did I Tell?, William Goldman says something similar about screenwriting:

Stephen Sondheim once said this: “I cannot write a bad song. You begin it here, build, end there. The words will lay properly on the music so they can be sung, that kind of thing. You may hate it, but it will be a proper song.” I sometimes feel that way about my screenplays. I’ve been doing them for so long now, and I’ve attempted most genres. I know about entering the story as late as possible, entering each scene as late as possible, that kind of thing. You may hate it, but it will be a proper screenplay.

Craft, in other words, can take you most of the way—but it’s the final leg that kills you. As Goldman concludes of his initial pass on the script for Absolute Power: “This first draft was proper as hell—you just didn’t give a shit.” And sooner or later, most writers find that they spend most of their time on that last mile.

Like most other art forms, creative writing can indeed be taught—but only to the point that it still resembles an engineering problem. There are a few basic tricks of structure and technique that will improve almost anyone’s work, much like the skills that you learn in art books like Drawing on the Right Side of the Brain, and that kind of advancement can be enormously satisfying. When it comes to the last mile between you and your desired result, however, many of the rules start to seem useless. You aren’t dealing with the general principles that have gotten you this far, but with problems that arise on the level of individual words or sentences, each one of which needs to be tackled on its own. There’s no way of knowing whether or not you’ve made the right choice until you’ve looked at them all in a row, and even if something seems wrong, you may not know how to fix it. The comforting shape of the outline, which can be assembled in a reasonably logical fashion, is replaced by the chaos of the text, and the fact that you’ve done good work on this level before is no guarantee that you can do it right now. I’ve learned a lot about writing over the years, but to the extent that I’m not yet the writer that I want to be, it lies almost entirely in that last mile, where the ideal remains tantalizingly out of reach.

As a result, I end up revising endlessly, even at a late stage, and although the draft always gets better, it never reaches perfection. After a while, you have to decide that it's as good as it's going to get, and then move on to something else—which is why it helps to have a deadline. But you can take comfort in the fact that the last mile affects even the best of us. In a recent New York Times profile of the playwright Tony Kushner, Charles McGrath writes:

What makes Angels in America so complicated to stage is not just Mr. Kushner’s need to supervise everything, but that Perestroika, the second part, is to a certain extent a work in progress and may always be. The first part, Millennium Approaches, was already up and running in the spring of 1991, when, with a deadline looming, Mr. Kushner retreated to a cabin in Northern California and wrote most of Perestroika in a feverish eight-day stint, hardly sleeping and living on junk food. He has been tinkering with it ever since…Even during rehearsal last month he was still cutting, rewriting, restructuring.

If Tony Kushner is still revising Angels in America, it makes me feel a little better about spending my life on that last mile. Or as John McPhee says about knowing when to stop: “What I know is that I can’t do any better; someone else might do better, but that’s all I can do; so I call it done.”

In the cards

leave a comment »


Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on October 21, 2016.

It’s been said that all of the personal financial advice that most people need to know can fit on a single index card. In fact, that’s pretty much true—which didn’t stop the man who popularized the idea from writing a whole book about it. But the underlying principle is sound enough. When you’re dealing with a topic like your own finances, instead of trying to master a large body of complicated material, you’re better off focusing on a few simple, reliable rules until you aren’t likely to break them by mistake. Once you’ve internalized the basics, you can move on. The tricky part is identifying the rules that will get you the furthest per unit of effort. In practice, no matter what we’re doing, nearly all of us operate under only a handful of conscious principles at any given moment. We just can’t keep more than that in our heads at any one time. (Unconscious principles are another matter, and you could say that intuition is another word for all the rules that we’ve absorbed to the point where we don’t need to think about them explicitly.) If the three or four rules that you’ve chosen to follow are good ones, it puts you at an advantage over a rival who is working with an inferior set. And while this isn’t enough to overcome the impact of external factors, or dumb luck, it makes sense to maximize the usefulness of the few aspects that you can control. This implies, in turn, that you should think very carefully about a handful of big rules, and let experience and intuition take care of the rest.

Recently, I’ve been thinking about what I’d include on a similar index card for a writer. In my own writing life, a handful of principles have far outweighed the others. I’ve spent countless hours discussing the subject on this blog, but you could throw away almost all of it: a single index card’s worth of advice would have gotten me ninety percent of the way to where I am now. For instance, there’s the simple rule that you should never go back to read what you’ve written until you’ve finished a complete rough draft, whether it’s a short story, an essay, or a novel—which is more responsible than any other precept for the fact that I’m still writing at all. The principle that you should cut at least ten percent from a first draft, in turn, is what helped me sell my first stories, and in my experience, it’s more like twenty percent. Finally, there’s the idea that you should structure your plot as a series of objectives, and that you should probably make some kind of outline to organize your thoughts before you begin. This is arguably more controversial than the other two, and outlines aren’t for everybody. But they’ve allowed me to write more intricate and ambitious stories than I could have managed otherwise, and they make it a lot easier to finish what I’ve started. (The advice to write an outline is a little like the fifth postulate of Euclid: it’s uglier than the others, and you get interesting results when you get rid of it, but most of us are afraid to drop it completely.)


Then we get to words of wisdom that aren’t as familiar, but which I think every writer should keep in mind. If I had to pick one piece of advice to send back in time to my younger self, along with the above, it’s what David Mamet says in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

It isn’t as elegantly phrased as I might like, but it gets at something so important about the writing process that I’ve all but memorized it. A real writer has to be good at everything, and it’s unclear why we should expect all those skills to manifest themselves in a single person. As I once wrote about Proust: “It seems a little unfair that our greatest writer on the subject of sexual jealousy and obsession should also be a genius at describing, say, a seascape.” How can we reasonably expect our writers to create suspense, tell stories about believable characters, advance complicated ideas, and describe the bedroom curtains?

The answer—and while it’s obvious, it didn’t occur to me for years—is that the writer doesn’t need to do all of this at once. A work of art is experienced in a comparative rush, but it doesn’t need to be written that way. (As Homer Simpson was once told: “Very few cartoons are broadcast live. It’s a terrible strain on the animators’ wrists.”) You do one thing at a time, as Mamet says, and divide up your writing schedule so that you don’t need to be clever and careful at the same time. This applies to nonfiction as well. When you think about the work that goes into writing, say, a biography, it can seem absurd that we expect a writer to be the drudge who tracks down the primary sources, the psychologist who interprets the evidence, and the stylist who writes it up in good prose. But these are all roles that a writer plays at different points, and it’s a mistake to conflate them, even as each phase informs all the rest. Once you’ve become a decent stylist and passable psychologist, you’re also a more efficient drudge, since you’re better at figuring out what is and isn’t useful. Which implies that a writer isn’t dealing with just one index card of rules, but with several, and you pick and choose between them based on where you are in the process. Mamet’s point, I think, is that this kind of switching is central to getting things done. You don’t try to do everything simultaneously, and you don’t overthink whatever you’re doing at the moment. As Mamet puts it elsewhere: “Keep it simple, stupid, and don’t violate the rules that you do know. If you don’t know which rule applies, just don’t muck up the more general rules.”

Written by nevalalee

February 27, 2018 at 9:00 am

Posted in Writing


The fictional sentence

with one comment

Of all the writers of the golden age of science fiction, the one who can be hardest to get your head around is A.E. van Vogt. He isn’t to everyone’s taste—many readers, to quote Alexei and Cory Panshin’s not unadmiring description, find him “foggy, semi-literate, pulpish, and dumb”—but he’s undoubtedly a major figure, and he was second only to Robert A. Heinlein and Isaac Asimov when it came to defining what science fiction became in the late thirties and early forties. (If he isn’t as well known as they are, it’s largely because he was taken out of writing by dianetics at the exact moment that the genre was breaking into the mainstream.) Part of his appeal is that his stories remain compelling and readable despite their borderline incoherence, and he was unusually open about his secret. In the essay “My Life Was My Best Science Fiction Story,” which was originally published in the volume Fantastic Lives, van Vogt wrote:

I learned to write by a system propounded in a book titled The Only Two Ways to Write a Story by John W. Gallishaw (meaning by flashback or in consecutive sequence). Gallishaw had made an in-depth study of successful stories by great authors. He observed that the best of them wrote in what he called “presentation units” of about eight hundred words. Each of these units contained five steps. And every sentence in it was a “fictional sentence.” Which means that it was written either with imagery, or emotion, or suspense, depending on the type of story.

So what did these units look like? Used copies of Gallishaw’s book currently go for well over a hundred dollars online, but van Vogt helpfully summarized the relevant information:

The five steps can be described as follows: 1) Where, and to whom, is it happening? 2) Make clear the scene purpose (What is the immediate problem which confronts the protagonist, and what does it require him to accomplish in this scene?) 3) The interaction with the opposition, as he tries to achieve the scene purpose. 4) Make the reader aware that he either did accomplish the scene purpose, or did not accomplish it. 5) In all the early scenes, whether protagonist did or did not succeed in the scene purpose, establish that things are going to get worse. Now, the next presentation unit-scene begins with: Where is all this taking place. Describe the surroundings, and to whom it is happening. And so forth.

Over the years, this formula was distorted and misunderstood, so that a critic could write something like “Van Vogt admits that he changes the direction of his plot every eight hundred words.” And even when accurately stated, it can come off as bizarre. Yet it’s really nothing more than the principle that every narrative should consist of a series of objectives, which I’ve elsewhere listed among the most useful pieces of writing advice that I know. Significantly, it’s one of the few elements of craft that can be taught and learned by example. Van Vogt learned it from Gallishaw, while I got it from David Mamet’s On Directing Film, and I’ve always seen it as a jewel of wisdom that can be passed in almost apostolic fashion from one writer to another.

When we read van Vogt’s stories, of course, we aren’t conscious of this structure, and if anything, we’re more aware of their apparent lack of form. (As John McPhee writes in his wonderful new book on writing: “Readers are not supposed to notice the structure. It is meant to be about as visible as someone’s bones.”) Yet we still keep reading. It’s that sequence of objectives that keeps us oriented through the centrifugal wildness that we associate with van Vogt’s work—and it shouldn’t come as a surprise that he approached the irrational side as systematically as he did everything else. I’d heard at some point that van Vogt based many of his plots on his dreams, but it wasn’t until I read his essay that I understood what this meant:

When you’re writing, as I was, for one cent a word, and are a slow writer, and the story keeps stopping for hours or days, and your rent is due, you get anxious…I would wake up spontaneously at night, anxious. But I wasn’t aware of the anxiety. I thought about story problems—that was all I noticed then. And so back to sleep I went. In the morning, often there would be an unusual solution. All my best plot twists came in this way…It was not until July 1943 that I suddenly realized what I was doing. That night I got out our alarm clock and moved into the spare bedroom. I set the alarm to ring at one and one-half hours. When it awakened me, I reset the alarm for another one and one-half hours, thought about the problems in the story I was working on—and fell asleep. I did that altogether four times during the night. And in the morning, there was the unusual solution, the strange plot twist…So I had my system for getting to my subconscious mind.

This isn’t all that different from Salvador Dali’s advice on how to take a nap. But the final sentence is the kicker: “During the next seven years, I awakened myself about three hundred nights a year four times a night.” When I read this, I felt a greater sense of kinship with van Vogt than I have with just about any other writer. Much of my life has been spent searching for tools—from mind maps to tarot cards—that can be used to systematically incorporate elements of chance and intuition into what is otherwise a highly structured process. Van Vogt’s approach comes as close as anything I’ve ever seen to the ideal of combining the two on a reliable basis, even if we differ on some of the details. (For instance, I don’t necessarily buy into Gallishaw’s notion that every action taken by the protagonist needs to be opposed, or that the situation needs to continually get worse. As Mamet writes in On Directing Film: “We don’t want our protagonist to do things that are interesting. We want him to do things that are logical.” And that’s often enough.) But it’s oddly appropriate that we find such rules in the work of a writer who frequently came across as chronically disorganized. Van Vogt pushed the limits of form further than any other author of the golden age, and it’s hard to imagine Alfred Bester or Philip K. Dick without him. But I’m sure that there were equally visionary writers who never made it into print because they lacked the discipline, or the technical tricks, to get their ideas under control. Van Vogt’s stories always seem on the verge of flying apart, but the real wonder is that they don’t. And his closing words on the subject are useful ones indeed: “It is well to point out again that these various systems were, at base, just automatic reactions to the writing of science fiction. The left side of the brain got an overdose of fantasizing flow from the right side, and literally had to do something real.”

Are you sitting down?

with one comment


Note: I’m taking a few days off, so I’ll be republishing some of my favorite pieces from earlier in this blog’s run. This post originally appeared, in a slightly different form, on October 26, 2016.

In the past, I’ve often mentioned what I’ve come to see as the most valuable piece of writing advice that I know, which is what David Mamet says in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

You don’t try to do everything at once, which is probably impossible anyway. Instead, there are days in which you do “careful” jobs that are the artistic equivalent of housekeeping—research, making outlines of physical actions, working out the logic of the plot—and others in which you perform “inventive” tasks that rely on intuition. This seems like common sense: it’s hard enough to be clever or imaginative, without factoring in the switching costs associated with moving from one frame of mind to another. The writer Colin Wilson believed that the best ideas emerge when your left and right hemispheres are moving at the same rate, which tends to occur in moments of either reverie or high excitement. This is based on an outdated model of how the human brain works, but the phenomenon it describes is familiar enough, and it’s just a small step from there to acknowledging that neither ecstatic nor dreamlike mental states are particularly suited for methodical work. When you’re laying the foundations for future creative activity, you usually end up somewhere in the middle, in a state of mind that is focused but not heightened, less responsive to connections than to discrete units, and concerned more with thoroughness than with inspiration. It’s an important stage, but it’s also the last place where you’d expect real insights to appear.

Clearly, a writer should strive to work with, rather than against, this natural division of labor. It’s also easy to agree with Mamet’s advice that it’s best to tackle one kind of thinking per day. (Mental switching costs of any kind are usually minimized when you’ve had a good night’s sleep in the meantime.) The real question is how to figure out what sort of work you should be doing at any given moment, and, crucially, whether it’s possible to predict this in advance. Any writer can tell you that there’s an enormous difference between getting up in the morning without any idea of what you’re doing that day, which is the mark of an amateur, and having a concrete plan—which is why professional authors use such tools as outlines and calendars. Ideally, it would be nice to know when you woke up whether it was going to be a “careful” day or an “inventive” day, which would allow you to prepare yourself accordingly. Sometimes the organic life cycle of a writing project supplies the answer: depending on where you are in the process, you engage in varying proportions of careful or inventive thought. But every stage requires some degree of both. As Mamet implies, you’ll often alternate between them, although not as neatly as in his hypothetical example. And while it might seem pointless to allocate time for inspiration, which appears according to no fixed schedule, you can certainly create the conditions in which it’s more likely to appear. But how do you know when?


I’ve come up with a simple test to answer this question: I ask myself how much time I expect to spend sitting down. Usually, before a day begins, I have a pretty good sense of how much sitting or standing I’ll be doing, and that’s really all I need to make informed decisions about how to use my time. There are some kinds of creative work that demand sustained concentration at a desk or in a seated position. This includes most of the “careful” tasks that Mamet describes, but also certain forms of intuitive, nonlinear thinking, like making a mind map. By contrast, there are other sorts of work that not only don’t require you to be at your desk, but are actively stifled by it: daydreaming, brooding over problems, trying to sketch out large blocks of the action. You often do a better job of it when you’re out taking a walk, or in the bus, bath, or bed. When scheduling creative work, then, you should start by figuring out what your body is likely to be doing that day, and then use this to plan what to do with your mind. Your brain has no choice but to tag along with your body when it’s running errands or standing in line at the bank, but if you structure your time appropriately, those moments won’t go to waste. And it’s often such external factors, rather than the internal logic of where you should be in the process, that determine what you should be doing.

At first glance, this doesn’t seem that much different from the stock advice that you should utilize whatever time you have available, whether you’re washing the dishes or taking a shower. But I think it’s a bit more nuanced than this, and that it’s more about matching the work to be done to the kind of time you have. If you try to think systematically and carefully while taking a walk in the park, you’ll feel frustrated when your mind wanders to other subjects. Conversely, if you try to daydream at your desk, not only are you likely to feel boxed in by your surroundings, but you’re also wasting valuable time that would be better spent on work that only requires the Napoleonic virtues of thoroughness and patience. Inspiration can’t be forced, and you don’t know in advance if you’re better off being careful or inventive on any given day—but the amount of time that you’ll be seated provides an important clue. (You can also reverse the process, and arrange to be seated as little as possible on days when you hope to get some inventive thinking done. For most of us, unfortunately, this isn’t entirely under our control, which makes it all the more sensible to take advantage of such moments when they present themselves.) And it doesn’t need to be planned beforehand. If you’re at work on a problem and you’re not sure what kind of thinking you should be doing, you can look at yourself and ask: Am I sitting down right now? And that’s all the information you need.

Written by nevalalee

January 19, 2018 at 9:00 am

Posted in Books


Raising the roof

leave a comment »

Whenever I return from a walk with my daughter and catch my first glimpse of our house at the corner, I feel happy. It was built over a hundred years ago, and although it isn’t any nicer than the houses to either side, it’s a little bit taller, and the high peak of its roof gives it a distinctive silhouette—as soon as I see it, I know that I’m home. Years ago, when my wife and I were looking for a place to start our family, I knew that I wanted a roof like this. I was partially inspired by the architect Christopher Alexander’s A Pattern Language, which may be the best book that I’ve ever read on any subject. Alexander writes:

We believe that [the] connection between the geometry of roofs, and their capacity to provide psychological shelter, can be put on empirical grounds: first, there is a kind of evidence which shows that both children and adults naturally incline toward the sheltering roofs, almost as if they had archetypal properties…Despite fifty years of the flat roofs of the “modern movement,” people still find the simple pitched roof the most powerful symbol of shelter.

In fact, my own roof doesn’t quite meet those standards. As Alexander notes: “This sheltering function cannot be created by a pitched roof, or a large roof, which is merely added to the top of an existing structure. The roof itself only shelters if it contains, embraces, covers, surrounds the process of living.” Instead of coming down to the rooms themselves, the roof of my house covers an attic that we never use. And sometimes this means that our living space feels slightly incomplete.

But maybe I should be grateful that I have a roof like this at all. In his essay “The Inevitable Box,” reprinted in his recent collection Four Walls and a Roof, Reinier de Graaf writes of the triumph of the architectural cube, which he calls “the natural outcome of all rational parameters combined”:

When did the pitched roof stop being a necessity? The dirty secret of modern architecture is that it never did. We stopped using it without any superior solution having presented itself. The omission of the pitched roof is an intentional technological regression, a deliberate forgoing of the best solution in favor of an aesthetic ideal, eschewing function for form—the symbol of a desire for progress instead of progress itself. We choose to endure the inconvenience. After all, architecture and the box have had an inconvenient relation for centuries. The pitched roof helped them avoid seeing eye to eye. It was what stood between architecture and the naked truth, what prevented the box from being a box. In our drift toward the box, the pitched roof was a necessary casualty—no progress without cruelty! With bigger things at stake, the pitched roof had to go.

Yet the psychological power of the pitched roof still persists. Alexander quotes the French psychiatrist Menie Gregoire, who wrote in the early seventies: “At Nancy the children from the apartments were asked to draw a house. These children had been born in these apartment slabs which stand up like a house of cards upon an isolated hill. Without exception they each drew a small cottage with two windows and smoke curling up from a chimney on the roof.”

Alexander concedes that this preference might be “culturally induced,” but he also makes a strong case for why the pitched roof is an inherently superior form. When properly conceived—so that the interior ceilings come right up to the roof itself—it seems to surround and shelter the living space, rather than sitting on top like a cap; it becomes a distinctive element that defines the house from a distance; and it even forms a connection with people on the ground, if the eaves come low enough around the entrance to be touched. There are also practical advantages. In On Directing Film, David Mamet contrasts the “unlivable” designs of countercultural architecture with the patterns of traditional design, which he uses to make a point about storytelling:

If you want to tell a story, it might be a good idea to understand a little bit about the nature of human perception. Just as, if you want to know how to build a roof, it might be a good idea to understand a little bit about the effects of gravity and the effects of precipitation. If you go up into Vermont and build a roof with a peak, the snow will fall off. You build a flat roof, the roof will fall down from the weight of the snow—which is what happened to a lot of the countercultural architecture of the 1960s. “There may be a reason people have wanted to hear stories for ten million years,” the performance artist says, “but I really don’t care, because I have something to say.”

But the opposite of a box isn’t necessarily a house with a pitched roof. It can also be what de Graaf calls “the antibox,” in which straight lines of any kind have been omitted. He argues that such buildings, exemplified by the work of Frank Gehry, have turned architecture “into a game of chance,” relying on computer models to determine what is possible: “Authorship has become relative: with creation now delegated to algorithms, the antibox’s main delight is the surprise it causes to the designers.” And he concludes:

The antibox celebrates the death of the ninety-degree angle—in fact, of every angle. Only curves remain. Floor, walls, and roof smoothly morph into a single continuous surface that only the most complex geometrical equations can capture. In its attempts to achieve a perfect ergonomic architecture—enveloping the body and its movement like a glove—the antibox falls into an age-old trap, only with more sophistication and virtuosity. The antibox is nothing more than form follows function 2.0, that is, a perfectly executed mistake.

I think that Gehry is a genius, even if some of his buildings do look like a big pile of trash, and that what he does is necessary and important. But it’s also revealing that the triumph of the box generated a reaction that didn’t consist of a return to the sensible pitched roof, but of the antibox that disregards all angles. Neither seems to have been conceived with an eye to those who will actually live or work there, any more than most performance art is concerned with the audience’s need for storytelling. Stories take on certain forms for a reason, and so should houses, embodied by the pitched roof—which is the point where two extremes meet. For all its shortcomings, when I look at my own house, I don’t just see a building. I see the story of my life.

Written by nevalalee

December 5, 2017 at 9:43 am

The art of preemptive ingenuity

Yesterday, my wife drew my attention to the latest episode of the podcast 99% Invisible, which irresistibly combines two of my favorite topics—film and graphic design. Its subject is Annie Atkins, who has designed props and visual materials for such works as The Tudors and The Grand Budapest Hotel. (Her account of how a misspelled word nearly made it onto a crucial prop in the latter film is both hilarious and horrifying.) But my favorite story that she shares is about a movie that isn’t exactly known for its flashy art direction:

The next job I went onto—it would have been Spielberg’s Bridge of Spies, which was a true story. We made a lot of newspapers for that film, and I remember us beginning to check the dates against the days, because I wanted to get it right. And then eventually the prop master said to me, “Do you know what, I think we’re just going to leave the dates off.” Because it wasn’t clear [what] sequence…these things were going to be shown in. And he said, you know, if you leave the dates off altogether, nobody will look for it. But if you put something there that’s wrong, then it might jump out. So we went with no dates in the end for those newspapers.

As far as filmmaking advice is concerned, this is cold, hard cash, even if I’ll never have the chance to put it into practice for myself. And I especially like the fact that it comes out of Bridge of Spies, a writerly movie with a screenplay by none other than the Coen Brothers, but which was still subject to decisions about its structure as late in the process as the editing stage.

Every movie, I expect, requires some degree of editorial reshuffling, and experienced directors will prepare for this during the production itself. The absence of dates on newspapers is one good example, and there’s an even better one in the book The Conversations, which the editor Walter Murch relates to the novelist Michael Ondaatje:

One thing that made it possible to [rearrange the order of scenes] in The Conversation was Francis [Coppola]’s belief that people should wear the same clothes most of the time. Harry is almost always wearing that transparent raincoat and his funny little crepe-soled shoes. This method of using costumes is something Francis had developed on other films, quite an accurate observation. He recognized that, first of all, people don’t change clothes in real life as often as they do in film. In film there’s a costume department interested in showing what it can do—which is only natural—so, on the smallest pretext, characters will change clothes. The problem is, that locks filmmakers into a more rigid scene structure. But if a character keeps the same clothes, you can put a scene in a different place and it doesn’t stand out.

Murch observes: “There’s a delicate balance between the timeline of a film’s story—which might take place over a series of days or weeks or months—and the fact that the film is only two hours long. You can stretch the amount of time somebody is in the same costume because the audience is subconsciously thinking, Well, I’ve only been here for two hours, so it’s not strange that he hasn’t changed clothes.”

The editor concludes: “It’s amazing how consistent you can make somebody’s costume and have it not stand out.” (Occasionally, a change of clothes will draw attention to editorial manipulation, as one scene is lifted out from its original place and slotted in elsewhere. One nice example is in Bullitt, where we see Steve McQueen in one scene at a grocery store in his iconic tweed coat and blue turtleneck, just before he goes home, showers, and changes into those clothes, which he wears for the rest of the movie.) The director Judd Apatow achieves the same result in another way, as his longtime editor Brent White notes: “[He’ll] have something he wants to say, but he doesn’t know exactly where it goes in the movie. Does it service the end? Does it go early? So he’ll shoot the same exact scene, the same exchange, with the actors in different wardrobes, so that I can slot it in at different points.” Like the newspapers in Bridge of Spies, this all assumes that changes to the plan will be necessary later on, and it prepares for them in advance. Presumably, you always hope to keep the order of scenes from the script when you cut the movie together, but the odds are that something won’t quite work when you sit down to watch the first assembly, so you build in safeguards to allow you to fix these issues when the time comes. If your budget is high enough, you can include reshoots in your shooting schedule, as Peter Jackson does, while the recent films of David Fincher indicate the range of problems that can be solved with digital tools in postproduction. But when you lack the resources for such expensive solutions, your only recourse is to be preemptively ingenious on the set, which forces you to think in terms of what you’ll want to see when you sit down to edit the footage many months from now.

This is the principle behind one of my favorite pieces of directorial advice ever, which David Mamet provides in the otherwise flawed Bambi vs. Godzilla:

Always get an exit and an entrance. More wisdom for the director in the cutting room. The scene involves the hero sitting in a café. Dialogue scene, blah blah blah. Well and good, but when you shoot it, shoot the hero coming in and sitting down. And then, at the end, shoot him getting up and leaving. Why? Because the film is going to tell you various things about itself, and many of your most cherished preconceptions will prove false. The scene that works great on paper will prove a disaster. An interchange of twenty perfect lines will be found to require only two, the scene will go too long, you will discover another scene is needed, and you can’t get the hero there if he doesn’t get up from the table, et cetera. Shoot an entrance and an exit. It’s free.

I learned a corollary from John Sayles: at the end of the take, in a close-up or one-shot, have the speaker look left, right, up, and down. Why? Because you might just find you can get out of the scene if you can have the speaker throw the focus. To what? To an actor or insert to be shot later, or to be found in (stolen from) another scene. It’s free. Shoot it, ’cause you just might need it.

This kind of preemptive ingenuity, in matters both large and small, is what really separates professionals from amateurs. Something always goes wrong, and the plan that we had in mind never quite matches what we have in the end. Professionals don’t always get it right the first time, either—but they know this, and they’re ready for it.

Writing with scissors

Over the last few years, one of my great pleasures has been reading the articles on writing that John McPhee has been contributing on an annual basis to The New Yorker. I’ve written here about my reactions to McPhee’s advice on using the dictionary, on “greening” or cutting a piece by an arbitrary length, on structure, on frames of reference. Now his full book on the subject is here, Draft No. 4, and it’s arriving in my life at an opportune time. I’m wrapping up a draft of my own book, with two months to go before deadline, and I have a daunting set of tasks ahead of me—responding to editorial comments, preparing the notes and bibliography, wrestling the whole thing down to size. McPhee’s reasonable voice is a balm at such times, although he never minimizes the difficulty of the process itself, which he calls “masochistic, mind-fracturing self-enslaved labor,” even as he speaks of the writer’s “animal sense of being hunted.” And when you read Sam Anderson’s wonderful profile on McPhee in this week’s issue of The New York Times Magazine, it’s like listening to an old soldier who has been in combat so many times that everything that he says carries the weight of long experience. (Reading it, I was reminded a little of the film editor Walter Murch, whom McPhee resembles in certain ways—they look sort of alike, they’re both obsessed with structure, and they both seem to know everything. I was curious to see whether anyone else had made this connection, so I did a search for their names together on Google. Of the first five results, three were links from this blog.)

Anderson’s article offers us the portrait of a man who, at eighty-six, has done a better job than just about anyone else of organizing his own brain: “Each of those years seems to be filed away inside of him, loaded with information, ready to access.” I would have been equally pleased to learn that McPhee was as privately untidy as his writing is intricately patterned, but it makes sense that his interest in problems of structure—to which he returns endlessly—would manifest itself in his life and conversation. He’s interested in structure in the same way that the rest of us are interested in the lives of our own children. I never tire of hearing how writers deal with structural issues, and I find passages like the following almost pornographically fascinating:

The process is hellacious. McPhee gathers every single scrap of reporting on a given project—every interview, description, stray thought and research tidbit—and types all of it into his computer. He studies that data and comes up with organizing categories: themes, set pieces, characters and so on. Each category is assigned a code. To find the structure of a piece, McPhee makes an index card for each of his codes, sets them on a large table and arranges and rearranges the cards until the sequence seems right. Then he works back through his mass of assembled data, labeling each piece with the relevant code. On the computer, a program called “Structur” arranges these scraps into organized batches, and McPhee then works sequentially, batch by batch, converting all of it into prose. (In the old days, McPhee would manually type out his notes, photocopy them, cut up everything with scissors, and sort it all into coded envelopes. His first computer, he says, was “a five-thousand-dollar pair of scissors.”)

Anderson writes: “[McPhee] is one of the world’s few remaining users of a program called Kedit, which he writes about, at great length, in Draft No. 4.” The phrase “at great length” excites me tremendously—I’m at a point in my life where I’d rather hear about a writer’s favorite software program than his or her inspirational thoughts on creativity—and McPhee’s process doesn’t sound too far removed from the one that I’ve worked out for myself. As I read it, though, I found myself thinking in passing of what might be lost when you move from scissors to a computer. (Scissors appear in the toolboxes of many of the writers and artists I admire. In The Elements of Style, E.B. White advises: “Quite often the writer will discover, on examining the completed work, that there are serious flaws in the arrangement of the material, calling for transpositions. When this is the case, he can save himself much labor and time by using scissors on his manuscript, cutting it to pieces and fitting the pieces together in a better order.” In The Silent Clowns, Walter Kerr describes the narrative challenges of filmmaking in the early fifties and concludes: “The problem was solved, more or less, with a scissors.” And Paul Klee once wrote in his diary: “What I don’t like, I cut away with the scissors.”) But McPhee isn’t sentimental about the tools themselves. In Anderson’s profile, the New Yorker editor David Remnick, who took McPhee’s class at Princeton, recalls: “You were in the room with a craftsman of the art, rather than a scholar or critic—to the point where I remember him passing around the weird mechanical pencils he used to use.” Yet there’s no question in my mind that McPhee would drop that one brand of pencil if he found one that he thought was objectively better. As soon as he had Kedit, he got rid of the scissors. When you’re trying to rethink structure from the ground up, you don’t have much time for nostalgia.

And when McPhee explains the rationale behind his methods, you can hear the pragmatism of fifty years of hard experience:

If this sounds mechanical, its effect was absolutely the reverse. If the contents of the seventh folder were before me, the contents of twenty-nine other folders were out of sight. Every organizational aspect was behind me. The procedure eliminated nearly all distraction and concentrated only the material I had to deal with in a given day or week. It painted me into a corner, yes, but in doing so it freed me to write.

This amounts to an elaboration of what I’ve elsewhere called my favorite piece of writing advice, which David Mamet offers in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

Mamet might as well have come out of the same box as Walter Murch and McPhee, which implies that I have a definite type when it comes to looking for advice. And what they all have in common, besides the glasses and beard, is the air of having labored at a craft for decades, survived, and returned to tell the tale. Of the three, McPhee’s career may be the most enviable of all, if only because he spent it in Princeton, not Hollywood. It’s nice to be able to structure an essay. The tricky part is structuring a life.

Shoot the piano player

In his flawed but occasionally fascinating book Bambi vs. Godzilla, the playwright and director David Mamet spends a chapter discussing the concept of aesthetic distance, which is violated whenever viewers remember that they’re simply watching a movie. Mamet provides a memorable example:

An actor portrays a pianist. The actor sits down to play, and the camera moves, without a cut, to his hands, to assure us, the audience, that he is actually playing. The filmmakers, we see, have taken pains to show the viewers that no trickery has occurred, but in so doing, they have taught us only that the actor portraying the part can actually play the piano. This addresses a concern that we did not have. We never wondered if the actor could actually play the piano. We accepted the storyteller’s assurances that the character could play the piano, as we found such acceptance naturally essential to our understanding of the story.

Mamet imagines a hypothetical dialogue between the director and the audience: “I’m going to tell you a story about a pianist.” “Oh, good: I wonder what happens to her!” “But first, before I do, I will take pains to reassure you that the actor you see portraying the hero can actually play the piano.” And he concludes:

We didn’t care till the filmmaker brought it up, at which point we realized that, rather than being told a story, we were being shown a demonstration. We took off our “audience” hat and put on our “judge” hat. We judged the demonstration conclusive but, in so doing, got yanked right out of the drama. The aesthetic distance had been violated.

Let’s table this for now, and turn to a recent article in The Atlantic titled “The Remarkable Laziness of Woody Allen.” To prosecute the case laid out in the headline, the film critic Christopher Orr draws on Eric Lax’s new book Start to Finish: Woody Allen and the Art of Moviemaking, which describes the making of Irrational Man—a movie that nobody saw, which doesn’t make the book sound any less interesting. For Orr, however, it’s “an indictment framed as an encomium,” and he lists what he evidently sees as devastating charges:

Allen’s editor sometimes has to live with technical imperfections in the footage because he hasn’t shot enough takes for her to choose from…As for the shoot itself, Allen has confessed, “I don’t do any preparation. I don’t do any rehearsals. Most of the times I don’t even know what we’re going to shoot.” Indeed, Allen rarely has any conversations whatsoever with his actors before they show up on set…In addition to limiting the number of takes on any given shot, he strongly prefers “master shots”—those that capture an entire scene from one angle—over multiple shots that would subsequently need to be edited together.

For another filmmaker, all of these qualities might be seen as strengths, but that’s beside the point. Here’s the relevant passage:

The minimal commitment that appearing in an Allen film entails is a highly relevant consideration for a time-strapped actor. Lax himself notes the contrast with Mike Leigh—another director of small, art-house films—who rehearses his actors for weeks before shooting even starts. For Damien Chazelle’s La La Land, Stone and her co-star, Ryan Gosling, rehearsed for four months before the cameras rolled. Among other chores, they practiced singing, dancing, and, in Gosling’s case, piano. The fact that Stone’s Irrational Man character plays piano is less central to that movie’s plot, but Allen didn’t expect her even to fake it. He simply shot her recital with the piano blocking her hands.

So do we shoot the piano player’s hands or not? The boring answer, unfortunately, is that it depends—but perhaps we can dig a little deeper. It seems safe to say that it would be impossible to make The Pianist with Adrien Brody’s hands conveniently blocked from view for the whole movie. But I’m equally confident that it doesn’t matter the slightest bit in Irrational Man, which I haven’t seen, whether or not Emma Stone is really playing the piano. La La Land is a slightly trickier case. It would be hard to envision it without at least a few shots of Ryan Gosling playing the piano, and Damien Chazelle isn’t above indulging in exactly the camera move that Mamet decries, in which the camera tilts down to reassure us that it’s really Gosling playing. Yet the fact that we’re even talking about this gets down to a fundamental problem with the movie, which I mostly like and admire. Its characters are archetypes who draw much of their energy from the auras of the actors who play them, and in the case of Stone, who is luminous and moving as an aspiring actress suffering through an endless series of auditions, the film gets a lot of mileage from our knowledge that she’s been in the same situation. Gosling, to put it mildly, has never been an aspiring jazz pianist. This shouldn’t even matter, but every time we see him playing the piano, he briefly ceases to be a struggling artist and becomes a handsome movie star who has spent three months learning to fake it. And I suspect that the movie would have been elevated immensely by casting a real musician. (This ties into another issue with La La Land, which is that it resorts to telling us that its characters deserve to be stars, rather than showing it to us in overwhelming terms through Gosling and Stone’s singing and dancing, which is merely passable. It’s in sharp contrast to Martin Scorsese’s New York, New York, one of its clear spiritual predecessors, in which it’s impossible to watch Liza Minnelli without becoming convinced that she ought to be the biggest star in the world. And when you think of how quirky, repellent, and individual Minnelli and Robert De Niro are allowed to be in that film, La La Land starts to look a little schematic.)

And I don’t think I’m overstating it when I argue that the seemingly minor dilemma of whether to show the piano player’s hands shades into the larger problem of how much we expect our actors to really be what they pretend that they are. I don’t think any less of Bill Murray because he had to employ Terry Fryer as a “hand double” for his piano solo in Groundhog Day, and I don’t mind that the most famous movie piano player of them all—Dooley Wilson in Casablanca—was faking it. And there’s no question that you’re taken out of the movie a little when you see Richard Chamberlain playing Tchaikovsky’s Piano Concerto No. 1 in The Music Lovers, however impressive it might be. (I’m willing to forgive De Niro learning to mime the saxophone for New York, New York, if only because it’s hard to imagine how it would look otherwise. The piano is just about the only instrument in which it can plausibly be left at the director’s discretion. And in his article, revealingly, Orr fails to mention that none other than Woody Allen was insistent that Sean Penn learn the guitar for Sweet and Lowdown. As Allen himself might say, it depends.) On some level, we respond to an actor playing the piano much like the fans of Doctor Zhivago, whom Pauline Kael devastatingly called “the same sort of people who are delighted when a stage set has running water or a painted horse looks real enough to ride.” But it can serve the story as much as it can detract from it, and the hard part is knowing how and when. As one director notes:

Anybody can learn how to play the piano. For some people it will be very, very difficult—but they can learn it. There’s almost no one who can’t learn to play the piano. There’s a wide range in the middle, of people who can play the piano with various degrees of skill; a very, very narrow band at the top, of people who can play brilliantly and build upon a technical skill to create great art. The same thing is true of cinematography and sound mixing. Just technical skills. Directing is just a technical skill.

This is Mamet writing in On Directing Film, which is possibly the single best work on storytelling I know. You might not believe him when he says that directing is “just a technical skill,” but if you do, there’s a simple way to test if you have it. Do you show the piano player’s hands? If you know the right answer for every scene, you just might be a director.

Frogs for snakes

If you’re the sort of person who can’t turn away from a show business scandal with leaked memos, insider anecdotes, and accusations of bad conduct on both sides, the last two weeks have offered a pair of weirdly similar cases. The first involves Frank Darabont, the former showrunner of The Walking Dead, who was fired during the show’s second season and is now suing the network for a share of profits from the most popular series in the history of cable television. In response, AMC released a selection of Darabont’s emails intended to demonstrate that his firing was justified, and it makes for queasily riveting reading. Some are so profane that I don’t feel comfortable quoting them here, but this one gives you a sense of the tone:

If it were up to me, I’d have not only fired [these two writers] when they handed me the worst episode three script imaginable, I’d have hunted them down and f—ing killed them with a brick, then gone and burned down their homes. I haven’t even spoken to those worthless talentless hack sons-of-bitches since their third draft was phoned in after five months of all their big talk and promises that they’d dig deep and have my back covered…Calling their [script] “phoned-in” would be vastly overstating, because they were too busy wasting my time and your money to bother picking the damn phone up. Those f—ing overpaid con artists.

In an affidavit, Darabont attempted to justify his words: “Each of these emails must be considered in context. They were sent during an intense and stressful two-year period of work during which I was fighting like a mother lion to protect the show from harm…Each of these emails was sent because a ‘professional’ showed up whose laziness, indifference, or incompetence threatened to sink the ship. My tone was the result of the stress and magnitude of this extraordinary crisis. The language and hyperbole of my emails were harsh, but so were the circumstances.”

Frankly, I don’t find this quite as convincing as the third act of The Shawshank Redemption. As it happened, the Darabont emails were released a few days before a similar dispute engulfed Steve Whitmire, the puppeteer who had been performing Kermit the Frog since the death of Jim Henson. After the news broke last week that Whitmire had been fired, accounts soon emerged of his behavior that strikingly echoed the situation with Darabont: “He’d send emails and letters attacking everyone, attacking the writing and attacking the director,” Brian Henson told the New York Times. Whitmire has disputed the characterization: “Nobody was yelling and screaming or using inappropriate language or typing in capitals. It was strictly that I was sending detailed notes. I don’t feel that I was, in any way, disrespectful by doing that.” And his defense, like Darabont’s, stems from what he frames as a sense of protectiveness toward the show and its characters. Of a plot point involving Kermit and his nephew Robin on the defunct series The Muppets, Whitmire said to the Hollywood Reporter:

I don’t think Kermit would lie to him. I think that as Robin came to Kermit, he would say “Things happen, people go their separate ways, but that doesn’t mean we don’t care about you.” Kermit is too compassionate to lie to him to spare his feelings…We have been doing these characters for a long, long time and we know them better than anybody. I thought I was aiding to keep it on track, and I think a big reason why the show was canceled…was because that didn’t happen. I am not saying my notes would have saved it, but I think had they listened more to all of the performers, it would have made a really big difference.

Unfortunately, the case of Whitmire, like that of Darabont, is more complicated than it might seem. Henson’s children have come out in support of the firing, with Brian Henson, the public face of the company, saying that he had reservations about Whitmire’s behavior for over a decade:

I have to say, in hindsight, I feel pretty guilty that I burdened Disney by not having recast Kermit at that point because I knew that it was going to be a real problem. And I have always offered that if they wanted to recast Kermit, I was all for it, and I would absolutely help. I am very glad we have done this now. I think the character is better served to remove this destructive energy around it.

Elsewhere, Lisa Henson told the Times that Whitmire had become increasingly controlling, refusing to hire an understudy and blackballing aspiring puppeteers after the studio tried to cast alternate performers, as a source said to Gizmodo: “[Steve] told Disney that the people who were in the audition room are never allowed to work with the Muppets again.” For a Muppet fan, this is all very painful, so I’ll stop here, except to venture two comments. One is that Darabont and Whitmire may well have been right to be concerned. The second is that in expressing their thoughts, they alienated a lot of the people around them, and their protectiveness toward the material ended in them being removed from the creative process altogether. If they were simply bad at giving notes—and the evidence suggests that at least Darabont was—they weren’t alone. No one gives or takes notes well. You could even argue that the whole infrastructure of movie and television production exists to make the exchange of notes, which usually goes in just one direction, incrementally less miserable. And it doesn’t work.

Both men responded by trying to absorb more creative control into themselves, which is a natural response. Brian Henson recalls Whitmire saying: “I am now Kermit, and if you want the Muppets, you better make me happy, because the Muppets are Kermit.” And the most fascinating passage in Darabont’s correspondence is his proposal for how the show ought to be run in the future:

The crew goes away or stands there silently without milling or chattering about bullshit that doesn’t apply to the job at hand…The director [and crew]…stand there and carefully read the scene out loud word for word. Especially and including all description…The important beats are identified and discussed in terms of how they are to be shot. In other words, sole creative authority is being taken out of the director’s hands. It doesn’t matter that our actors are doing good work if the cameras fail to capture it. Any questions come straight to me by phone or text. If necessary I will shoot the coverage on my iPhone and text it to the set. The staging follows the script to the letter and is no longer willy-nilly horseshit with cameras just hosing it down from whatever angle…If the director tries to not shoot what is written, the director is beaten to death on the spot. A trained monkey is brought in to complete the job.

Reading this, I found myself thinking of an analogous situation that arose when David Mamet was running The Unit. (I’m aware that The Unit wasn’t exactly a great show—I don’t think I got through more than two episodes—but my point remains the same.) Mamet, like Darabont, was unhappy with the scripts that he was getting, but instead of writing everything himself, he wrote a memo on plot structure so lucid and logical that it has been widely shared online as a model of how to tell a story. Instead of raging at those around him, he did what he could to elevate them to his level. It strikes me as the best possible response. But as Kermit might say, that’s none of my business.

Written by nevalalee

July 19, 2017 at 9:02 am

Avocado’s number

with 2 comments

Earlier this month, you may have noticed a sudden flurry of online discussion around avocado toast. It was inspired by a remark by a property developer named Tim Gurner, who said to the Australian version of 60 Minutes: “When I was trying to buy my first home, I wasn’t buying smashed avocados for nineteen bucks and four coffees at four dollars each.” Gurner’s statement, which was fairly bland and unmemorable in itself, was promptly transformed into the headline “Millionaire to Millennials: Stop Buying Avocado Toast If You Want to Buy a Home.” From there, it became the target of widespread derision, with commentators pointing out that if owning a house seems increasingly out of reach for many young people, it has more to do with rising real estate prices, low wages, and student loans than with their irresponsible financial habits. And the fact that such a forgettable sentiment became the focal point for so much rage—mostly from people who probably didn’t see the original interview—implies that it merely catalyzed a feeling that had been building for some time. Millennials, it’s fair to say, have been getting it from both sides. When they try to be frugal by using paper towels as napkins, they’re accused of destroying the napkin industry, but they’re also scolded over spending too much at brunch. They’re informed that their predicament is their own fault, unless they’re also being idealized as “joyfully engaged in a project of creative destruction,” as Laura Marsh noted last year in The New Republic. “There’s nothing like being told precarity is actually your cool lifestyle choice,” Marsh wrote, unless it’s being told, as the middle class likes to maintain to the poor, that financial stability is only a matter of hard work and a few small sacrifices.

It also reflects an overdue rejection of what used to be called the latte factor, as popularized by the financial writer David Bach in such books as Smart Women Finish Rich. As Helaine Olen writes in Slate:

Bach calculated that eschewing a five-dollar daily bill at Starbucks—because who, after all, really needs anything at Starbucks?—for a double nonfat latte and biscotti with chocolate could net a prospective saver $150 a month, or $2,000 a year. If she then took that money and put it all in stocks that Bach, ever an optimist, assumed would grow at an average annual rate of eleven percent a year, “chances are that by the time she reached sixty-five, she would have more than $2 million sitting in her account.”

There are a lot of flaws in this argument. Bach rounds up his numbers, assumes an unrealistic rate of return, and ignores taxes and inflation. Most problematic of all is his core assumption that tiny acts of indulgence are what prevent the average investor from accumulating wealth. In fact, big, unpredictable risk factors and fixed expenses play a much larger role, as Olen points out:

Buying common luxury items wasn’t the issue for most Americans. The problem was the fixed costs, the things that are difficult to cut back on. Housing, health care, and education cost the average family seventy-five percent of their discretionary income in the 2000s. The comparable figure in 1973: fifty percent. Indeed, studies demonstrate that the quickest way to land in bankruptcy court was not by buying the latest Apple computer but through medical expenses, job loss, foreclosure, and divorce.
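Bach's arithmetic is easy to check. The sketch below is mine, not Bach's or Olen's: it computes the future value of a fixed monthly contribution, using Bach's eleven percent assumption alongside an illustrative, more conservative five percent return.

```python
def future_value(monthly, annual_rate, years):
    """Future value of a fixed monthly contribution, compounded monthly."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # total number of contributions
    return monthly * ((1 + r) ** n - 1) / r

# Bach's assumption: $150 a month at eleven percent for forty years.
print(round(future_value(150, 0.11, 40)))   # roughly $1.3 million
# The same savings at a more modest five percent.
print(round(future_value(150, 0.05, 40)))   # roughly $230,000
```

Even on Bach's own optimistic terms, the total falls short of two million dollars, and a more plausible rate of return shrinks it by four fifths—before taxes and inflation take their share.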

It turns out that incremental acts of daily discipline are powerless in the face of systemic factors that have a way of erasing all your efforts—and this applies to more than just personal finance. Back when I was trying to write my first novel, I was struck by the idea that if I managed to write just one hundred words every day, I’d have a manuscript in less than three years. I was so taken by this notion that I wrote it down on an index card and stuck it to my bathroom mirror. That was over a decade ago, and while I can’t quite remember how long I stuck with that regimen, it couldn’t have been more than a few weeks. Novels, I discovered, aren’t written a hundred words at a time, at least not in a fashion that can be banked in the literary equivalent of a penny jar. They’re the product of hard work combined with skills that can only be developed after a period of sustained engagement. There’s a lot of trial and error involved, and you can only arrive at a workable system through the kind of experience that comes from addressing issues of craft with maximal attention. Luck and timing also play a role, particularly when it comes to navigating the countless pitfalls that lie between a finished draft and its publication. In finance, we’re inclined to look at a historical return series and attribute it after the fact to genius, rather than to variables that are out of our hands. Similarly, every successful novel creates its own origin story. We naturally underestimate the impact of factors that can’t be credited to individual initiative and discipline. As a motivational tool, there’s a place for this kind of myth. But if novels were written using the literary equivalent of the latte factor, we’d have more novels, just as we’d have more millionaires.

Which isn’t to say that routine doesn’t play a crucial role. My favorite piece of writing advice ever is what David Mamet writes in Some Freaks:

As a writer, I’ve tried to train myself to go one achievable step at a time: to say, for example, “Today I don’t have to be particularly inventive, all I have to be is careful, and make up an outline of the actual physical things the character does in Act One.” And then, the following day to say, “Today I don’t have to be careful. I already have this careful, literal outline, and all I have to do is be a little bit inventive,” et cetera, et cetera.

A lot of writing comes down to figuring out what to do on any given morning—but it doesn’t mean doing the same thing each day. Knowing what achievable steps are appropriate at every stage is as important here as it is anywhere else. You can acquire this knowledge as systematically or haphazardly as you like, but you can also do everything right and still fail in the end. (If we define failure as spending years on a novel that will never be published, it’s practically a requirement of the writer’s education.) Books on writing and personal finance continue to take up entire shelves at bookstores, and they can sound very much alike. In “The Writer’s Process,” an unusually funny recent humor piece in The New Yorker, Hallie Cantor expertly skewers their tone—“I give myself permission to write a clumsy first draft and vigorously edit it later”—and concludes: “Anyway, I guess that’s my process. It’s all about repetition, really—doing the same thing every single day.” We’ve all heard this advice. I’ve been guilty of it myself. But when you don’t take the big picture into account, it’s just a load of smashed avocado.

Hollywood in Limbo

with 2 comments

In his essay on the fourth canto of Dante’s Inferno, which describes the circle of Limbo populated by the souls of virtuous pagans, Jorge Luis Borges discusses the notion of the uncanny, which has proven elusively hard to define:

Toward the beginning of the nineteenth century, or the end of the eighteenth, certain adjectives of Saxon or Scottish origin (eerie, uncanny, weird) came into circulation in the English language, serving to define those places or things that vaguely inspire horror…In German, they are perfectly translated by the word unheimlich; in Spanish, the best word may be siniestro.

I was reminded of this passage while reading, of all things, Benjamin Wallace’s recent article in Vanity Fair on the decline of National Lampoon. It’s a great piece, and it captures the sense of uncanniness that I’ve always associated with a certain part of Hollywood. Writing of the former Lampoon head Dan Laikin, Wallace says:

Poor choice of partners proved a recurring problem. Unable to get traction with the Hollywood establishment, Laikin appeared ready to work with just about anyone. “There were those of us who’d been in the business a long time,” [development executive Randi] Siegel says, “who told him not to do business with certain people. Dan had a tendency to trust people that were probably not the best people to trust. I think he wanted to see the good in it and change things.” He didn’t necessarily have much choice. If you’re not playing in Hollywood’s big leagues, you’re playing in its minors, which teem with marginal characters…“Everyone Danny hung out with was sketchy,” says someone who did business with Laikin. Laikin, for his part, blames the milieu: “I’m telling you, I don’t surround myself with these people. I don’t search them out. They’re all over this town.”

Years ago, I attended a talk by David Mamet in which he said something that I’ve never forgotten. Everybody gets a break in Hollywood after twenty-five years, but some get it at the beginning and others at the end, and the important thing is to be the one who stays after everyone else has gone home. Wallace’s article perfectly encapsulates that quality, which I’ve always found fascinating, perhaps because I’ve never had to live with it. It results in a stratum of players in the movie and television industry who haven’t quite broken through, but also haven’t reached the point where they drop out entirely. They end up, in short, in a kind of limbo, which Borges vividly describes in the same essay:

There is something of the oppressive wax museum about this still enclosure: Caesar, armed and idle; Lavinia, eternally seated next to her father…A much later passage of the Purgatorio adds that the shades of the poets, who are barred from writing, since they are in the Inferno, seek to distract their eternity with literary discussions.

You could say that the inhabitants of Hollywood’s fourth circle of hell, who are barred from actually making movies, seek to distract their eternity by talking about the movies that they wish they could make. It’s easy to mock them, but there’s also something weirdly ennobling about their sheer persistence. They’re survivors in a profession where few of us would have lasted, if we even had the courage to go out there in the first place, and at a time when such people seem more likely to end up at something like the Fyre Festival, it’s nice to see that they still exist in Hollywood.

So what is it about the movie industry that draws and retains such personalities? One of its most emblematic figures is Robert Towne, who, despite his Oscar for Chinatown and his reputation as the dean of American screenwriters, has spent his entire career looking like a man on the verge of his big break. If Hollywood is Limbo, Towne is its Caesar, “armed and idle,” and he’s been there for five decades. Not surprisingly, he has a lot of insight into the nature of that hell. In his interview with John Brady in The Craft of the Screenwriter, Towne says:

You are often involved with a producer who is more interested in making money on the making of the movie than he is on the releasing of the movie. There is a lot of money to be made on the production of a movie, not just in salary, but all sorts of ways that are just not altogether honest. So he’s going to make his money on the making, which is really reprehensible.

“Movies are so difficult that you should really make movies that you feel you absolutely have to make,” Towne continues—and the fact that this happens so rarely implies that the studio ecosystem is set up for something totally different. Towne adds:

It’s easier for a director and an actor to be mediocre and get away with it than it is for a writer. Even a writer who happens to be mediocre has to work pretty hard to get through a script, whereas a cameraman will say to the director, “Where do you think you want to put the camera? You want it here? All right, I’m going to put it here.” In other words, a director can be carried along by the production if he’s mediocre, to some extent; and that’s true of an actor, too.

Towne tosses off these observations without dwelling on them, knowing that there’s plenty more where they came from, but if you put them together, you end up with a pretty good explanation of why Hollywood is the way it is. It’s built to profit from the making of movies, rather than from the movies themselves, which is only logical: if it depended on success at the box office, everybody would be out of a job. The industry also has structures in place that allow people to skate by for years without any particular skills, if they manage to stick to the margins. (In any field where past success is no guarantee of future performance, it’s the tall poppies that get their heads chopped off.) Under such conditions, survival isn’t a matter of talent, but of something much less definable. A brand like National Lampoon, which has been leveled by time but retains some of its old allure, draws such people like a bright light draws fish in the abyss, and it provides a place where they can be studied. The fact that Kato Kaelin makes an appearance in these circles shouldn’t be surprising—he’s the patron saint of those who hang on for decades for no particular reason. And it’s hard not to relate to the hope that sustains them:

“What everyone always does at the company is feel like something big is about to happen, and I want to be here for it,” [creative director] Marty Dundics says. “We’re one hit movie away from, or one big thing away from, being back on top. It’s always this underdog you’re rooting for. And you don’t want to miss it. That big thing that’s about to happen. That was always the mood.”

Extend that mood across a quarter of a century, and you have Hollywood, which also struggles against the realization that Borges perceives in Limbo: “The certainty that tomorrow will be like today, which was like yesterday, which was like every day.”

The sound and the furry

with 3 comments

Last week, the podcast 99% Invisible devoted an episode to the editing and sound design tricks used by the makers of nature documentaries. For obvious reasons, most footage in the wild is captured from a distance using zoom lenses, and there’s no equivalent for sound, which means that unless David Attenborough himself is standing in the shot, the noises that you’re hearing were all added later. Foley artists will recreate hoofbeats or the footsteps of lions by running their hands over pits filled with gravel, while animal vocalizations can be taken from sound catalogs or captured by recordists working nowhere near the original shoot. This kind of artifice strikes me as forgivable, but there are times when the manipulation of reality crosses a line. In the fifties Disney documentary White Wilderness, lemmings were shown hurling themselves into the ocean, which required a helping hand: “The producers took the lemmings to a cliff in Alberta and, in some scenes, used a turntable device to throw them off the edge. Not only was it staged, but lemmings don’t even do this on their own. Scientists now know that the idea of a mass lemming suicide ritual is entirely apocryphal.” And then there’s the movie Wolves, which rented wolves from a game farm and filmed them in an artificial den. When Chris Palmer, the director, was asked about the scene at a screening, it didn’t go well:

Palmer’s heart sank, but he decided to come clean, and when he did, he could feel the excitement leave the room. Up to this moment, he had assumed people wouldn’t care. “But they do care,” he realized. “They are assuming they are seeing the truth…things that are authentic and genuine.”

When viewers realize that elements of nature documentaries utilize the same techniques as other genres of filmmaking, they tend to feel betrayed. When you think about the conditions under which such movies are produced, however, it shouldn’t be surprising. If every cut is a lie, as Godard famously said, that’s even more true when you’re dealing with animals in the wild. As David Mamet writes in On Directing Film:

Documentaries take basically unrelated footage and juxtapose it in order to give the viewer the idea the filmmaker wants to convey. They take footage of birds snapping a twig. They take footage of a fawn raising its head. The two shots have nothing to do with each other. They were shot days or years, and miles, apart. And the filmmaker juxtaposes the images to give the viewer the idea of great alertness. The shots have nothing to do with each other. They are not a record of what the protagonist did. They are not a record of how the deer reacted to the bird. They’re basically uninflected images. But they give the viewer the idea of alertness to danger when they are juxtaposed. That’s good filmmaking.

Mamet is trying to make a point about how isolated images—which have little choice but to be “uninflected” when the actors are some birds and a deer—can be combined to create meaning, and he chose this example precisely because the narrative emerges from nothing but that juxtaposition. But it also gets at something fundamental about the grammar of the wildlife documentary itself, which trains us to think about nature in terms of stories. And that’s a fiction in itself.

You could argue that a movie that purports to be educational or “scientific” has no business engaging in artifice of any kind, but in fact, it’s exactly in that context that this sort of manipulation is most justified. Scientific illustration is often used when a subject can’t be photographed directly—as in Ken Marschall’s wonderful paintings for Dr. Robert D. Ballard’s The Discovery of the Titanic—or when more information can be conveyed through an idealized situation. In Sociobiology, Edward O. Wilson writes of Sarah Landry’s detailed drawings: “In the case of the vertebrate species, her compositions are among the first to represent entire societies, in the correct demographic proportions, with as many social interactions displayed as can plausibly be included in one scene.” Landry’s compositions of a troop of baboons or a herd of elephants could never have been captured in a photograph, but they get at a truth that is deeper than reality, or at least more useful. As the nature illustrator Jonathan Kingdon writes in Field Notes on Science and Nature:

Even an outline sketch that bears little relationship to the so-called objectivity of a photograph might actually transmit information to another human being more selectively, sometimes even more usefully, than a photograph. For example, a few quick sketches of a hippopotamus allow the difference between sexes, the peculiar architecture of amphibious existence in a giant quadruped, and the combination of biting and antlerlike clashing of enlarged lower jaws to be appreciated at a glance…”Outline drawings”…can represent, in themselves, artifacts that may correspond more closely with what the brain seeks than the charts of light-fall that photographs represent.

On some level, nature documentaries fall into much the same category, providing us with idealized situations and narratives in order to facilitate understanding. (You could even say that the impulse to find a story in nature is a convenient tool in itself. It’s no more “true” than the stories that we tell about human history, but those narratives, as Walter Pater observes of philosophical theories, “may help us to gather up what might otherwise pass unregarded by us.”) If anything, our discomfort with more extreme kinds of artifice has more to do with an implicit violation of the contract between the filmmaker and the audience. We expect that the documentarian will go into the field and shoot hundreds of hours of footage in search of the few minutes—or seconds—that will amaze us. As Jesse David Fox of Vulture wrote of the stunning iguana and snake chase from the new Planet Earth series: “This incredible footage is the result of the kind of extreme luck that only comes with hard work. A camera crew worked from dusk to dawn for weeks filming the exact spot, hoping something would happen, and if it did, that the camera would be in focus.” After shooting the hatchlings for weeks, they finally ended up with their “hero” iguana, and this combination of luck and preparation is what deserves to be rewarded. Renting wolves or throwing lemmings off a cliff seems like a form of cheating, an attempt to fit the story to the script, rather than working with what nature provided. But the boundary isn’t always clear. Every documentary depends on a sort of artificial selection, with the best clips making it into the finished result in a kind of survival of the fittest. But there’s also a lot of intelligent design.

The Mule and the Beaver

leave a comment »

If you wanted to construct the most prolific writer who ever lived, working from first principles, what features would you include? (We’ll assume, for the purposes of this discussion, that he’s a man.) Obviously, he would need to be capable of turning out clean, publishable prose at a fast pace and with a minimum of revision. He would be contented—even happy—within the physical conditions of writing itself, which requires working indoors at a desk alone for hours on end. Ideally, he would operate within a genre, either fiction or nonfiction, that lent itself to producing pages fairly quickly, but with enough variety to prevent burnout, since he’d need to maintain a steady clip every day for years. His most productive period would coincide with an era that gave him steady demand for his work, and he would have a boundless energy that was diverted early on toward the goal of producing more books. If you were particularly clever, you’d even introduce a psychological wrinkle: the act of writing would become his greatest source of satisfaction, as well as an emotional refuge, so that he would end up taking more pleasure in it than almost anything else in life. Finally, you’d provide him with cooperative publishers and an enthusiastic, although not overwhelming, readership, granting him a livelihood that was comfortable but not quite lavish enough to be distracting. Wind him up, let him run unimpeded for three or four decades, and how many books would you get? In the case of Isaac Asimov, the total comes to something like five hundred. Even if it isn’t quite enough to make him the most productive writer of all time, it certainly places him somewhere in the top ten. And it’s a career that followed all but axiomatically from the characteristics that I’ve listed above.

Let’s take these points one at a time. Asimov, like all successful pulp writers, learned how to crank out decent work on deadline, usually limiting himself to a first draft and a clean copy, with very little revision that wasn’t to editorial order. (And he wasn’t alone here. The pulps were an unforgiving school, and they quickly culled authors who weren’t able to write a sentence well enough the first time.) From a young age, Asimov was also drawn to enclosed, windowless spaces, like the kitchen at the back of his father’s candy store, and he had a persistent daydream about running a newsstand in the subway, where he could put up the shutter and read magazines in peace. After he began to write for a living, he was equally content to work in his attic office for up to ten hours a day. Yet it wasn’t fiction that accounted for the bulk of his output—which is a common misconception about his career—but a specific kind of nonfiction. Asimov was a prolific fiction writer, but no more so than many of his contemporaries. It was in nonfiction for general readers that he really shone, initially with such scientific popularizations as The Chemicals of Life and Inside the Atom. At first, his work drew on his academic and professional background in chemistry and biochemistry, but before long, he found that he was equally adept at explaining concepts from the other sciences, as well as such unrelated fields as history and literature. His usual method was to work straight from reference books, dictionaries, and encyclopedias, translating and organizing their concepts for a lay audience. As he once joked to Martin Gardner: “You mean you’re in the same racket I am? You just read books by the professors and rewrite them?”

This kind of writing is harder than it sounds. Asimov noted, correctly, that he added considerable value in arranging and presenting the material, and he was better at it than just about anyone else. (A faculty member at Boston University once observed him at work and exclaimed: “Why, you’re just copying the dictionary!” Asimov, annoyed, handed the dictionary to him and said: “Here. The dictionary is yours. Now go write the book.”) But it also lent itself admirably to turning out a lot of pages in a short period of time. Unlike fiction, it didn’t require him to come up with original ideas from scratch. As soon as he had enough projects in the hopper, he could switch between them freely to avoid becoming bored by any one subject. He could write treatments of the same topic for different audiences and cannibalize unsold material for other venues. In the years after Sputnik, there was plenty of demand for what he had to offer, and he had a ready market for short articles that could be collected into books. And since these were popular treatments of existing information, he could do all of the work from the comfort of his own office. Asimov hated to fly, and he actively avoided assignments that would require him to travel or do research away from home. Before long, his productivity became a selling point in itself, and when his wife told him that life was passing him by, Asimov responded: “If I do manage to publish a hundred books, and if I then die, my last words are likely to be, ‘Only a hundred!’” Writing became a place of security, both from life’s small crises and as an escape from an unhappy marriage, and it was also his greatest source of pleasure. When his daughter asked him what he would do if he had to choose between her and writing, Asimov said: “Why, I would choose you, dear.” He added: “But I hesitated—and she noticed that, too.”

Asimov was a complicated man—certainly more so than in the version of himself that he presented to the public—and he can’t be reduced to a neat set of factors. He wasn’t a robot. But those five hundred books represent an achievement so overwhelming that it cries out for explanation, and it wouldn’t exist if certain variables, both external and internal, hadn’t happened to align. In terms of his ability and ambition, Asimov was the equal of Campbell, Heinlein, or Hubbard, but in place of their public entanglements, he channeled his talents into a safer direction, where it grew to gargantuan proportions that only hint at how monstrous that energy and passion really were. (He was also considerably younger than the others, as well as more naturally cautious, and I’d like to believe that he drew a negative lesson from their example.) The result, remarkably, made him the most beloved writer of them all. It was a cultural position, outside the world of science fiction, that was due almost entirely to the body of his nonfiction work as a whole. He never had a bestseller until late in his career, but the volume and quality of his overall output were enough to make him famous. Asimov was the Mule, the unassuming superman of the Foundation series, but he conquered a world from his typewriter. He won the game. And when I think of how his talent, productivity, and love of enclosed spaces combined to produce a fortress made of books, I think of what David Mamet once said to The Paris Review. When asked to explain why he wrote, Mamet replied: “I’ve got to do it anyway. Like beavers, you know. They chop, they eat wood, because if they don’t, their teeth grow too long and they die. And they hate the sound of running water. Drives them crazy. So, if you put those two ideas together, they are going to build dams.”

Written by nevalalee

March 22, 2017 at 9:54 am
