Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.


The monotonous periodicity of genius


Yesterday, I read a passage from the book Music and Life by the critic and poet W.J. Turner that has been on my mind ever since. He begins with a sentence from the historian Charles Sanford Terry, who says of Bach’s cantatas: “There are few phenomena in the record of art more extraordinary than this unflagging cataract of inspiration in which masterpiece followed masterpiece with the monotonous periodicity of a Sunday sermon.” Turner objects to this:

In my enthusiasm for Bach I swallowed this statement when I first met it, but if Dr. Terry will excuse the expression, it is arrant nonsense. Creative genius does not work in this way. Masterpieces are not produced with the monotonous periodicity of a Sunday sermon. In fact, if we stop to think we shall understand that this “monotonous periodicity” was exactly what was wrong with a great deal of Bach’s music. Bach, through a combination of natural ability and quite unparalleled concentration on his art, had arrived at the point of being able to sit down at any minute of any day and compose what had all the superficial appearance of being a masterpiece. It is possible that even Bach himself did not know which was a masterpiece and which was not, and it is abundantly clear to me that in all his large-sized works there are huge chunks of stuff to which inspiration is the last word that one could apply.

All too often, Turner implies, Bach leaned on his technical facility when inspiration failed or he simply felt indifferent to the material: “The music shows no sign of Bach’s imagination having been fired at all; the old Leipzig Cantor simply took up his pen and reeled off this chorus as any master craftsman might polish off a ticklish job in the course of a day’s work.”

I first encountered the Turner quotation in The New Listener’s Companion and Record Guide by B.H. Haggin, who cites his fellow critic approvingly and adds: “This seems to me an excellent description of the essential fact about Bach—that one hears always the operation of prodigious powers of invention and construction, but frequently an operation that is not as expressive as it is accomplished.” Haggin continues:

Listening to the six sonatas or partitas for unaccompanied violin, the six sonatas or suites for unaccompanied cello, one is aware of Bach’s success with the difficult problem he set himself, of contriving for the instrument a melody that would imply its underlying harmonic progressions between the occasional chords. But one is aware also that solving this problem was not equivalent to writing great or even enjoyable music…I hear only Bach’s craftsmanship going through the motions of creation and producing the external appearances of expressiveness. And I suspect that it is the name of Bach that awes listeners into accepting the appearance as reality, into hearing an expressive content which isn’t there, and into believing that if the content is difficult to hear, this is only because it is especially profound—because it is “the passionate, yet untroubled meditation of a great mind” that lies beyond “the composition’s formidable technical frontiers.”

Haggin confesses that he regards many pieces in The Goldberg Variations or The Well-Tempered Clavier as “examples of competent construction that are, for me, not interesting pieces of music.” And he sums up: “Bach’s way of exercising the spirit was to exercise his craftsmanship; and some of the results offer more to delight an interest in the skillful use of technique than to delight the spirit.”

As I read this, I was inevitably reminded of Christopher Orr’s recent article in The Atlantic, “The Remarkable Laziness of Woody Allen,” which I discussed here last week. Part of Orr’s case against Allen involves “his frenetic pace of one feature film a year,” which can only be described as monotonous periodicity. This isn’t laziness, of course—it’s the opposite—but Orr implies that the director’s obsession with productivity has led him to cut corners in the films themselves: “Ambition simply isn’t on the agenda.” Yet the funny thing is that this approach to making art, while extreme, is perfectly rational. Allen writes, directs, and releases three movies in the time it would take most directors to finish one, and when you look at his box office and awards history, you see that about one in three breaks through to become a financial success, an Oscar winner, or both. And Orr’s criticism of this process, like Turner’s, could only have been made by a professional critic. If you’re obliged to see every Woody Allen movie or have an opinion on every Bach cantata, it’s easy to feel annoyed by the lesser efforts, and you might even wish that the artist had only released the works in which his inspiration was at its height. For the rest of us, though, this really isn’t an issue. We get to skip Whatever Works or Irrational Man in favor of the occasional Match Point or Midnight in Paris, and most of us are happy if we can even recognize the cantata that has “Jesu, Joy of Man’s Desiring.” If you’re a fan, but not a completist, a skilled craftsman who produces a lot of technically proficient work in hopes that some of it will stick is following a reasonable strategy. As Malcolm Gladwell writes of Bach:

The difference between Bach and his forgotten peers isn’t necessarily that he had a better ratio of hits to misses. The difference is that the mediocre might have a dozen ideas, while Bach, in his lifetime, created more than a thousand full-fledged musical compositions. A genius is a genius, [Dean] Simonton maintains, because he can put together such a staggering number of insights, ideas, theories, random observations, and unexpected connections that he almost inevitably ends up with something great.

As Simonton puts it: “Quality is a probabilistic function of quantity.” But if there’s a risk involved, it’s that an artist will become so used to producing technically proficient material on a regular basis that he or she will fall short when the circumstances demand it. Which brings us back to Bach. Turner’s remarks appear in a chapter on the Mass in B minor, which was hardly a throwaway—it’s generally considered to be one of Bach’s major works. For Turner, however, the virtuosity expressed in the cantatas allowed Bach to take refuge in cleverness even when there was more at stake: “I say that the pretty trumpet work in the four-part chorus of the Gloria, for example, is a proof that Bach was being consciously clever and brightening up his stuff, and that he was not at that moment writing with the spontaneity of those really creative moments which are popularly called inspired.” And he writes of the Kyrie, which he calls “monotonous”:

It is still impressive, and no doubt to an academic musician, with the score in his hands and his soul long ago defunct, this charge of monotony would appear incredible, but then his interest is almost entirely if not absolutely technical. It is a source of everlasting amazement to him to contemplate Bach’s prodigious skill and fertility of invention. But what do I care for Bach’s prodigious skill? Even such virtuosity as Bach’s is valueless unless it expresses some ulterior beauty or, to put it more succinctly, unless it is as expressive as it is accomplished.

And I’m not sure that he’s even wrong. It might seem remarkable to make this accusation of Bach, who is our culture’s embodiment of technical skill as a vehicle of spiritual expression, but if the charge is going to have any weight at all, it has to hold at the highest level. William Blake once wrote: “Mechanical excellence is the only vehicle of genius.” He was right. But it can also be a vehicle, by definition, for literally everything else. And sometimes the real genius lies in being able to tell the difference.

Shoot the piano player


In his flawed but occasionally fascinating book Bambi vs. Godzilla, the playwright and director David Mamet spends a chapter discussing the concept of aesthetic distance, which is violated whenever viewers remember that they’re simply watching a movie. Mamet provides a memorable example:

An actor portrays a pianist. The actor sits down to play, and the camera moves, without a cut, to his hands, to assure us, the audience, that he is actually playing. The filmmakers, we see, have taken pains to show the viewers that no trickery has occurred, but in so doing, they have taught us only that the actor portraying the part can actually play the piano. This addresses a concern that we did not have. We never wondered if the actor could actually play the piano. We accepted the storyteller’s assurances that the character could play the piano, as we found such acceptance naturally essential to our understanding of the story.

Mamet imagines a hypothetical dialogue between the director and the audience: “I’m going to tell you a story about a pianist.” “Oh, good: I wonder what happens to her!” “But first, before I do, I will take pains to reassure you that the actor you see portraying the hero can actually play the piano.” And he concludes:

We didn’t care till the filmmaker brought it up, at which point we realized that, rather than being told a story, we were being shown a demonstration. We took off our “audience” hat and put on our “judge” hat. We judged the demonstration conclusive but, in so doing, got yanked right out of the drama. The aesthetic distance had been violated.

Let’s table this for now, and turn to a recent article in The Atlantic titled “The Remarkable Laziness of Woody Allen.” To prosecute the case laid out in the headline, the film critic Christopher Orr draws on Eric Lax’s new book Start to Finish: Woody Allen and the Art of Moviemaking, which describes the making of Irrational Man—a movie that nobody saw, which doesn’t make the book sound any less interesting. For Orr, however, it’s “an indictment framed as an encomium,” and he lists what he evidently sees as devastating charges:

Allen’s editor sometimes has to live with technical imperfections in the footage because he hasn’t shot enough takes for her to choose from…As for the shoot itself, Allen has confessed, “I don’t do any preparation. I don’t do any rehearsals. Most of the times I don’t even know what we’re going to shoot.” Indeed, Allen rarely has any conversations whatsoever with his actors before they show up on set…In addition to limiting the number of takes on any given shot, he strongly prefers “master shots”—those that capture an entire scene from one angle—over multiple shots that would subsequently need to be edited together.

For another filmmaker, all of these qualities might be seen as strengths, but that’s beside the point. Here’s the relevant passage:

The minimal commitment that appearing in an Allen film entails is a highly relevant consideration for a time-strapped actor. Lax himself notes the contrast with Mike Leigh—another director of small, art-house films—who rehearses his actors for weeks before shooting even starts. For Damien Chazelle’s La La Land, Stone and her co-star, Ryan Gosling, rehearsed for four months before the cameras rolled. Among other chores, they practiced singing, dancing, and, in Gosling’s case, piano. The fact that Stone’s Irrational Man character plays piano is less central to that movie’s plot, but Allen didn’t expect her even to fake it. He simply shot her recital with the piano blocking her hands.

So do we shoot the piano player’s hands or not? The boring answer, unfortunately, is that it depends—but perhaps we can dig a little deeper. It seems safe to say that it would be impossible to make The Pianist with Adrien Brody’s hands conveniently blocked from view for the whole movie. But I’m equally confident that it doesn’t matter the slightest bit in Irrational Man, which I haven’t seen, whether or not Emma Stone is really playing the piano. La La Land is a slightly trickier case. It would be hard to envision it without at least a few shots of Ryan Gosling playing the piano, and Damien Chazelle isn’t above indulging in exactly the camera move that Mamet decries, in which the camera tilts down to reassure us that it’s really Gosling playing. Yet the fact that we’re even talking about this gets down to a fundamental problem with the movie, which I mostly like and admire. Its characters are archetypes who draw much of their energy from the auras of the actors who play them, and in the case of Stone, who is luminous and moving as an aspiring actress suffering through an endless series of auditions, the film gets a lot of mileage from our knowledge that she’s been in the same situation. Gosling, to put it mildly, has never been an aspiring jazz pianist. This shouldn’t even matter, but every time we see him playing the piano, he briefly ceases to be a struggling artist and becomes a handsome movie star who has spent three months learning to fake it. And I suspect that the movie would have been elevated immensely by casting a real musician. (This ties into another issue with La La Land, which is that it resorts to telling us that its characters deserve to be stars, rather than showing it to us in overwhelming terms through Gosling and Stone’s singing and dancing, which is merely passable. It’s in sharp contrast to Martin Scorsese’s New York, New York, one of its clear spiritual predecessors, in which it’s impossible to watch Liza Minnelli without becoming convinced that she ought to be the biggest star in the world. And when you think of how quirky, repellent, and individual Minnelli and Robert De Niro are allowed to be in that film, La La Land starts to look a little schematic.)

And I don’t think I’m overstating it when I argue that the seemingly minor dilemma of whether to show the piano player’s hands shades into the larger problem of how much we expect our actors to really be what they pretend that they are. I don’t think any less of Bill Murray because he had to employ Terry Fryer as a “hand double” for his piano solo in Groundhog Day, and I don’t mind that the most famous movie piano player of them all—Dooley Wilson in Casablanca—was faking it. And there’s no question that you’re taken out of the movie a little when you see Richard Chamberlain playing Tchaikovsky’s Piano Concerto No. 1 in The Music Lovers, however impressive it might be. (I’m willing to forgive De Niro learning to mime the saxophone for New York, New York, if only because it’s hard to imagine how it would look otherwise. The piano is just about the only instrument for which the choice can plausibly be left to the director’s discretion. And in his article, revealingly, Orr fails to mention that none other than Woody Allen was insistent that Sean Penn learn the guitar for Sweet and Lowdown. As Allen himself might say, it depends.) On some level, we respond to an actor playing the piano much like the fans of Doctor Zhivago, whom Pauline Kael devastatingly called “the same sort of people who are delighted when a stage set has running water or a painted horse looks real enough to ride.” But it can serve the story as much as it can detract from it, and the hard part is knowing how and when. As one director notes:

Anybody can learn how to play the piano. For some people it will be very, very difficult—but they can learn it. There’s almost no one who can’t learn to play the piano. There’s a wide range in the middle, of people who can play the piano with various degrees of skill; a very, very narrow band at the top, of people who can play brilliantly and build upon a technical skill to create great art. The same thing is true of cinematography and sound mixing. Just technical skills. Directing is just a technical skill.

This is Mamet writing in On Directing Film, which is possibly the single best work on storytelling I know. You might not believe him when he says that directing is “just a technical skill,” but if you do, there’s a simple way to test if you have it. Do you show the piano player’s hands? If you know the right answer for every scene, you just might be a director.

Asimov’s close encounter


By the early seventies, Isaac Asimov had achieved the cultural status, which he still retains, of being the first—and perhaps the only—science fiction writer whom most ordinary readers would be able to name. As a result, he ended up on the receiving end of a lot of phone calls from famous newcomers to the field. In 1973, for example, he was contacted by a representative for Woody Allen, who asked if he’d be willing to look over the screenplay of the movie Sleeper. Asimov gladly agreed, and when he met with Allen over lunch, he told him that the script was perfect as it was. Allen didn’t seem to believe him: “How much science fiction have you written?” Asimov responded: “Not much. Very little, actually. Perhaps thirty books of it altogether. The other hundred books aren’t science fiction.” Allen was duly impressed, turning to ask his friends: “Did you hear him throw that line away?” Asimov turned down the chance to serve as a technical director, recommending Ben Bova instead, and the movie did just fine without him, although he later expressed irritation that Allen had never sent him a letter of thanks. Another project with Paul McCartney, whom Asimov met the following year, didn’t go anywhere, either:

McCartney wanted to do a fantasy, and he wanted me to write a story out of the fantasy out of which a screenplay would be prepared. He had the basic idea for the fantasy, which involved two sets of musical groups: a real one, and a group of extraterrestrial imposters…He had only a snatch of dialogue describing the moment when a real group realized they were being victimized by imposters.

Asimov wrote up what he thought was an excellent treatment, but McCartney rejected it: “He went back to his one scrap of dialogue, out of which he apparently couldn’t move, and wanted me to work with that.”

Of all of Asimov’s brushes with Hollywood, however, the most intriguing involved a director to whom he later referred as “Steve Spielberg.” In his memoir In Joy Still Felt, Asimov writes:

On July 18, 1975, I visited Steve Spielberg, a movie director, at his room in the Sherry-Netherland. He had done Jaws, a phenomenally successful picture, and now he planned to do another, involving flying saucers. He wanted me to work with him on it, but I didn’t really want to. The visual media are not my bag, really.

In a footnote, Asimov adds: “He went on to do it without me and it became the phenomenally successful Close Encounters of the Third Kind. I have no regrets.” For an autobiography that devotes enormous amounts of wordage to even the most trivial incidents, it’s a remarkably terse and unrevealing anecdote, and it’s hard not to wonder if something else might have been involved—because when Asimov finally saw Close Encounters, which is celebrating its fortieth anniversary this week with a new theatrical release, he hated it. A year after it came out, he wrote in Isaac Asimov’s Science Fiction Magazine:

Science Digest asked me to see the movie Close Encounters of the Third Kind and write an article for them on the science it contained. I saw the picture and was appalled. I remained appalled even after a doctor’s examination had assured me that no internal organs had been shaken loose by its ridiculous sound waves. (If you can’t be good, be loud, some say, and Close Encounters was very loud.) To begin with there was no accurate science in it; not a trace; and I said so in the article I wrote and which Science Digest published. There was also no logic in it; not a trace; and I said that, too.

Asimov’s essay on Close Encounters, in fact, might be the most unremittingly hostile piece of writing I’ve seen by him on any subject, and I’ve read a lot of it. He seems to have regarded it as little more than a cynical commercial ploy: “It made its play for Ufolators and mystics and, in its chase for the buck, did not scruple to violate every canon of good sense and internal consistency.” In response to readers who praised the special effects, he shot back:

Seeing a rotten picture for the special effects is like eating a tough steak for the smothered onions, or reading a bad book for the dirty parts. Optical wizardry is something a movie can do that a book can’t, but it is no substitute for a story, for logic, for meaning. It is ornamentation, not substance. In fact, whenever a science fiction picture is praised overeffusively for its special effects, I know it’s a bad picture. Is that all they can find to talk about?

Asimov was aware that his negative reaction had hurt the feelings of some of his fans, but he was willing to accept it: “There comes a time when one has to put one’s self firmly on the side of Good.” And he seemed particularly incensed at the idea that audiences might dare to think that Close Encounters was science fiction, and that it implied that the genre was allowed to be “silly, and childish, and stupid,” with nothing more than “loud noise and flashing lights.” He wasn’t against all instances of cinematic science fiction—he had liked Planet of the Apes and Star Wars, faintly praising the latter as “entertainment for the masses [that] did not try to do anything more,” and he even served as a technical consultant on Star Trek: The Motion Picture. But he remained unrelenting toward Close Encounters to the last: “It is a marvelous demonstration of what happens when the workings of extraterrestrial intelligence are handled without a trace of skill.”

And the real explanation comes in an interview that Asimov gave to the Los Angeles Times in 1988, in which he recalled of his close encounter with Spielberg: “I didn’t know who he was at the time, or what a hit the film would be, but I certainly wasn’t interested in a film that glorified flying saucers. I still would have refused, only with more regret.” The italics are mine. Asimov, as I’ve noted before, despised flying saucers, and he would have dismissed any movie that took them seriously as inherently unworthy of consideration. (The editor John W. Campbell was unusually cautious on the subject, writing of the UFO phenomenon in Astounding in 1959: “Its nature and cause are totally indeterminable from the data and the technical understanding available to us at the time.” Yet Asimov felt that even this was going too far, writing that Campbell “seemed to take seriously such things as flying saucers [and] psionic talents.”) From his point of view, he may well have been right to worry about the “glorification” of flying saucers in Close Encounters—its impact on the culture was so great that it seems to have fixed the look of aliens as reported by alleged abductees. And as a man whose brand as a science popularizer and explainer depended on his reputation for rationality and objectivity, he couldn’t allow himself to be associated with such ideas in any way, which may be why he attacked the movie with uncharacteristic savagery. As I’ve written elsewhere, a decade earlier, Asimov had been horrified when his daughter Robyn told him one night that she had seen a flying saucer. When he rushed outside and saw “a perfect featureless metallic circle of something like aluminum” in the sky, he was taken aback, and as he ran into the house for his glasses, he said to himself: “Oh no, this can’t happen to me.” It turned out to be the Goodyear blimp, and Asimov recalled: “I was incredibly relieved!” But his daughter may have come even closer to the truth when she said years later to the New York Times: “He thought he saw his career going down the drain.”

Live from Twin Peaks


What does Twin Peaks look like without Agent Cooper? It was a problem that David Lynch and his writing team were forced to solve for Fire Walk With Me, when Kyle MacLachlan declined to come back for much more than a token appearance, and now, in the show’s third season, Lynch and Mark Frost seem determined to tackle the question yet again, even though they’ve given their leading man more screen time than anyone could ever want. MacLachlan’s name is the first thing that we see in the closing credits, in large type, to the point where it’s starting to feel like a weekly punchline—it’s the only way that we’d ever know that the episode was over. He’s undoubtedly the star of the show. Yet even as we’re treated to an abundance of Dark Cooper and Dougie Jones, we’re still waiting to see the one character that I, and a lot of other fans, have been awaiting most impatiently. Dale Cooper, it’s fair to say, is one of the most peculiar protagonists in television history. As the archetypal outsider coming into an isolated town to investigate a murder, he seems at first like a natural surrogate for the audience, but, if anything, he’s quirkier and stranger than many of the locals he encounters. When we first meet Cooper, he comes across as an almost unplayable combination of personal fastidiousness, superhuman deductive skills, and childlike wonder. But if you’re anything like me, you wanted to be like him. I ordered my coffee black for years. And if he stood for the rest of us, it was as a representative of the notion, which crumbles in the face of logic but remains emotionally inescapable, that the town of Twin Peaks would somehow be a wonderful place to live, despite all evidence to the contrary.

In the third season, this version of Cooper, whom I’ve been waiting for a quarter of a century to see again, is nowhere in sight. And the buildup to his return, which I still trust will happen sooner or later, has been so teasingly long that it can hardly be anything but a conscious artistic choice. With every moment of recognition—the taste of coffee, the statue of the gunfighter in the plaza—we hope that the old Cooper will suddenly reappear, but the light in his eyes always fades. On some level, Lynch and Frost are clearly having fun with how long they can get away with this, but by removing the keystone of the original series, they’re also leaving us with some fascinating insights into what kind of show this has been from the very beginning. Let’s tick off its qualities one by one. Over the course of any given episode, it cuts between what seems like about a dozen loosely related plotlines. Most of the scenes last between two and four minutes, with about the same number of characters, and the components are too far removed from one another to provide anything in the way of narrative momentum. They aren’t built around any obligation to advance the plot, but around striking images or odd visual or verbal gags. The payoff, as in the case of Dr. Jacoby’s golden shovels, often doesn’t come for hours, and when it does, it amounts to the end of a shaggy dog story. (The closest thing we’ve had so far to a complete sequence is the sad case of Sam, Tracey, and the glass cube, which didn’t even make it past the premiere.) If there’s a pattern, it isn’t visible, but the result is still strangely absorbing, as long as you don’t approach it as a conventional drama but as something more like Twenty-Two Short Films About Twin Peaks.

You know what this sounds like to me? It sounds like a sketch comedy show. I’ve always seen Twin Peaks as a key element in a series of dramas that stretches from The X-Files through Mad Men, but you could make an equally strong case for it as part of a tradition that runs from SCTV to Portlandia, which went so far as to cast MacLachlan as its mayor. They’re set in a particular location with a consistent cast of characters, but they’re essentially sketch comedies, and when one scene is over, they simply cut to the next. In some ways, the use of a fixed setting is a partial solution to the problem of transitions, which shows from Monty Python onward have struggled to address, but it also creates a beguiling sense of encounters taking place beyond the edges of the frame. (Matt Groening has pointed to SCTV as an inspiration for The Simpsons, with its use of a small town in which the characters were always running into one another. Groening, let’s not forget, was born in Portland, just two hours away from Springfield, which raises the intriguing question of why such shows are so drawn to the atmosphere of the Pacific Northwest.) Without Cooper, the show’s affinities to sketch comedy are far more obvious—and this isn’t the first time this has happened. After Laura’s murderer was revealed in the second season, the show seemed to lose direction, and many of the subplots, like James’s interminable storyline with Evelyn, became proverbial for their pointlessness. But in retrospect, that arid middle stretch starts to look a lot like an unsuccessful sketch comedy series. And it’s worth remembering that Lynch and Frost originally hoped to keep the identity of the killer a secret forever, knowing that it was all that was holding together the rest.

In the absence of a connective thread, it takes a genius to make this kind of thing work, and the lack of a controlling hand is a big part of what made the second season so markedly unsuccessful. Fortunately, the third season has a genius readily available. The sketch format has always been David Lynch’s comfort zone, a fact that has been obscured by contingent factors in his long career. Lynch, who was trained as a painter and conceptual artist, thinks naturally in small narrative units, like the video installations that we glimpse for a second as we wander between rooms in a museum. Eraserhead is basically a bunch of sketches linked by its titular character, and he returned to that structure in Inland Empire, which, thanks to the cheapness of digital video, was the first movie in decades that he was able to make entirely on his own terms. In between, the inclination was present but constrained, sometimes for the better. In its original cut of three hours, Blue Velvet would have played much the same way, but in paring it down to its contractually mandated runtime, Lynch and editor Duwayne Dunham ended up focusing entirely on its backbone as a thriller. (It’s an exact parallel to Annie Hall, which began as a three-hour series of sketches called Anhedonia that assumed its current form after Woody Allen and Ralph Rosenblum threw out everything that wasn’t a romantic comedy.) Most interesting of all is Mulholland Drive, which was originally shot as a television pilot, with fragmented scenes that were clearly supposed to lead to storylines of their own. When Lynch recut it into a movie, they became aspects of Betty’s dream, which may have been closer to what he wanted in the first place. And in the third season of Twin Peaks, it is happening again.

Peak television and the future of stardom


Kevin Costner in The Postman

Earlier this week, I devoured the long, excellent article by Josef Adalian and Maria Elena Fernandez of Vulture on the business of peak television. It’s full of useful insights and even better gossip—and it names plenty of names—but there’s one passage that really caught my eye, in a section about the huge salaries that movie stars are being paid to make the switch to the small screen:

A top agent defends the sums his clients are commanding, explaining that, in the overall scheme of things, the extra money isn’t all that significant. “Look at it this way,” he says. “If you’re Amazon and you’re going to launch a David E. Kelley show, that’s gonna cost $4 million an episode [to produce], right? That’s $40 million. You can have Bradley Whitford starring in it, [who is] gonna cost you $150,000 an episode. That’s $1.5 million of your $40 million. Or you could spend another $3.5 million [to get Costner] on what will end up being a $60 million investment by the time you market and promote it. You can either spend $60 [million] and have the Bradley Whitford show, or $63.5 [million] and have the Kevin Costner show. It makes a lot of sense when you look at it that way.”

With all due apologies to Bradley Whitford, I found this thought experiment fascinating, and not just for the reasons that the agent presumably shared it. It implies, for one thing, that television—which is often said to be overtaking Hollywood in terms of quality—is becoming more like feature filmmaking in another respect: it’s the last refuge of the traditional star. We frequently hear that movie stardom is dead and that audiences are drawn more to franchises than to recognizable faces, so the fact that cable and streaming networks seem intensely interested in signing film stars, in a post-True Detective world, implies that their model is different. Some of it may be due to the fact, as William Goldman once said, that no studio executive ever got fired for hiring a movie star: as the new platforms fight to establish themselves, it makes sense that they’d fall back on the idea of star power, which is one of the few things that corporate storytelling has ever been able to quantify or understand. It may also be because the marketing strategy for television inherently differs from that for film: an online series is unusually dependent on media coverage to stand out from the pack, and signing a star always generates headlines. Or at least it once did. (The Vulture article notes that Woody Allen’s new series for Amazon “may end up marking peak Peak TV,” and it seems a lot like a deal that was made for the sake of the coverage it would produce.)

Kevin Costner in JFK

But the most plausible explanation lies in simple economics. As the article explains, Netflix and the other streaming companies operate according to a “cost-plus” model: “Rather than holding out the promise of syndication gold, the company instead pays its studio and showrunner talent a guaranteed up-front profit—typically twenty or thirty percent above what it takes to make a show. In exchange, it owns all or most of the rights to distribute the show, domestically and internationally.” This limits the initial risk to the studio, but also the potential upside: nobody involved in producing the show itself will see any money on the back end. In addition, it means that even the lead actors of the series are paid a flat dollar amount, which makes them a more attractive investment than they might be for a movie. Most of the major stars in Hollywood earn gross points, which means that they get a cut of the box office receipts before the film turns a profit—a “first dollar” deal that makes the mathematics of breaking even much more complicated. The thought experiment about Bradley Whitford and Kevin Costner only makes sense if you can get Costner at a fixed salary per episode. In other words, movie stars are being actively courted by television because its model is a throwback to an earlier era, when actors were held under contract by a studio without any profit participation, and before stars and their agents negotiated better deals that ended up undermining the economic basis of the star system entirely.

And it’s revealing that Costner, of all actors, appears in this example. His name came up mostly because multiple sources told Vulture that he was offered $500,000 per episode to star in a streaming series: “He passed,” the article says, “but industry insiders predict he’ll eventually say ‘yes’ to the right offer.” But he also resonates because he stands for a kind of movie stardom that was already on the wane when he first became famous. It has something to do with the quintessentially American roles that he liked to play—even JFK is starting to seem like the last great national epic—and an aura that somehow kept him in leading parts two decades after his career as a major star was essentially over. That’s weirdly impressive in itself, and it testifies to how intriguing a figure he remains, even if audiences aren’t likely to pay to see him in a movie. Whenever I think of Costner, I remember what the studio executive Mike Medavoy once claimed to have told him right at the beginning of his career:

“You know,” I said to him over lunch, “I have this sense that I’m sitting here with someone who is going to become a great big star. You’re going to want to direct your own movies, produce your own movies, and you’re going to end up leaving your wife and going through the whole Hollywood movie-star cycle.”

Costner did, in fact, end up leaving his first wife. And if he also leaves film for television, even temporarily, it may reveal that “the whole Hollywood movie-star cycle” has a surprising final act that few of us could have anticipated.

Written by nevalalee

May 27, 2016 at 9:03 am

The life of a title

Track listing for Kanye West's Waves

So I haven’t heard all of Kanye West’s new album yet—I’m waiting until I can actually download it for real—but I’m excited about what looks to be a major statement from the artist responsible for some of my favorite music of the last decade. Predictably, it was also the target of countless barbs in the weeks leading up to its release, mostly because of what have been portrayed as its constant title changes: it was originally announced as So Help Me God, changed to Swish, made a brief stopover at Waves, and finally settled on The Life of Pablo. And this was all spun as yet another token of West’s flakiness, even from media outlets that have otherwise been staunch advocates of his work. (A typical headline on The A.V. Club was “Today in god, we’re tired: Kanye West announces album title (again).” This was followed a few days later by the site’s rave review of the same album, which traces a familiar pattern of writers snarking at West’s foibles for months, only to fall all over themselves in the rush to declare the result a masterpiece. The only comparable figure who inspires the same disparity in his treatment during the buildup and the reception is Tom Cruise, who, like Kanye, is a born producer who happens to occupy the body of a star.) And there’s a constant temptation for those who cover this kind of thing for a living to draw conclusions from the one scrap of visible information they have, as if the changes in the title were symptoms of some deeper confusion.

Really, though, the shifting title is less a reflection of West’s weirdness, of which we have plenty of evidence elsewhere, than of his stubborn insistence on publicizing even those aspects of the creative process that most others would prefer to keep private. Title changes are a part of any artist’s life, and it’s rare for any work of art to go from conception to completion without a few such transformations along the way: Hemingway famously wrote up fifty potential titles for his Spanish Civil War novel, notably The Undiscovered Country, before finally deciding on For Whom the Bell Tolls. As long as we’re committed to the idea that everything needs a title, we’ll always struggle to find one that adequately represents the work—or at least catalyzes our thoughts about it—while keeping one eye on the market. Each of my novels was originally written and sold with a different title than the one that ended up on its cover, and I’m mostly happy with how it all turned out. (Although I’ll admit that I still think that The Scythian was a better title for the book that wound up being released as Eternal Empire.) And I’m currently going through the same thing again, in full knowledge that whatever title I choose for my next project will probably change before I’m done. I don’t take the task any less seriously, and if anything, I draw comfort from the knowledge that the result will reflect a lot of thought and consideration, and that a title change isn’t necessarily a sign that the process is going wrong. Usually, in fact, it’s the opposite.

Track listing for Kanye West's The Life of Pablo

The difference between a novel and an album by a massive pop star, of course, is that the latter is essentially being developed in plain sight, and any title change is bound to be reported as news. There’s also a tendency, inherited from movie coverage, to see it as evidence of a troubled production. When The Hobbit: There and Back Again was retitled The Battle of the Five Armies, it was framed, credibly enough, as a more accurate reflection of the movie itself, which spins about ten pages of Tolkien into an hour of battle, but it was also perceived as a defensive move in response to the relatively disappointing reception of The Desolation of Smaug. In many cases, nobody wins: All You Need Is Kill was retitled Edge of Tomorrow for its theatrical release and Live Die Repeat on video, a series of equivocations that only detracted from what turned out to be a superbly confident and focused movie—which is all the evidence we need that title trouble doesn’t have much correlation, if any, with the quality of the finished product. And occasionally, a studio will force a title change that the artist refuses to acknowledge: Paul Thomas Anderson consistently refers to his first movie as Sydney, rather than Hard Eight, and you can hear a touch of resignation in director Nicholas Meyer’s voice whenever he talks about Star Trek II: The Wrath of Khan. (In fact, Meyer’s initial pitch for the title was The Undiscovered Country, which, unlike Hemingway, he eventually got to use.)

But if the finished product is worthwhile, all is forgiven, or forgotten. If I can return for the second time in two days to editor Ralph Rosenblum’s memoir When the Shooting Stops, even as obvious a title as Annie Hall went through its share of incarnations:

[Co-writer Marshall] Brickman came up to the cutting room, and he and Woody [Allen] engaged in one of their title sessions, Marshall spewing forth proposals—Rollercoaster Named Desire, Me and My Goy, It Had to be Jew—with manic glee. This seemed to have little impact on Woody, though, for he remained committed to Anhedonia until the very end. “He first sprung it on me at an early title session,” remembers Brickman. “Arthur Krim, who was the head of United Artists then, walked over to the window and threatened to jump…”

Woody, meanwhile, was adjusting his own thinking, and during the last five screenings, he had me try out a different title each night in my rough-cut speech. The first night it was Anhedonia, and a hundred faces looked at me blankly. The second night it was Anxiety, which roused a few chuckles from devoted Allen fans. Then Anhedonia again. Then Annie and Alvy. And finally Annie Hall, which, thanks to a final burst of good sense, held. It’s hard now to suppose it could ever have been called anything else.

He’s right. And I suspect that we’ll feel the same way about The Life of Pablo before we know it—which won’t stop it from happening again.

“The mirror shattered into spiderwebs…”

Note: This post is the forty-fourth installment in my author’s commentary for Eternal Empire, covering Chapter 43. You can read the previous installments here.

“I am truly at my happiest not when I am writing an aria for an actor or making a grand political or social point,” Aaron Sorkin said a while back to Vanity Fair. “I am at my happiest when I’ve figured out a fun way for somebody to slip on a banana peel.” I know what he means. In fact, nothing makes me happier than when an otherwise sophisticated piece of entertainment cheerfully decides to go for the oldest, corniest, most obvious pratfall—which is a sign of an even greater sophistication. My favorite example is the most famous joke in Raiders of the Lost Ark, when Indy nonchalantly draws his gun and shoots the swordsman. It’s the one gag in the movie that most people remember best, and if you’re a real fan, you probably know that the scene was improvised on the set to solve an embarrassing problem: they’d originally scheduled a big fight scene, but Harrison Ford was too sick to shoot it, so he proposed the more elegant, and funnier, solution. But the most profound realization of all is that the moment works precisely because the film around it depends so much on craft and clockwork timing to achieve its most memorable effects. If every joke in the movie were pitched on that level, not only wouldn’t we remember that scene, but we probably wouldn’t be talking about Raiders at all, just as most of us don’t look back fondly on 1941. It’s the intelligence, wit, and technical proficiency of the rest of the movie that allows that one cornball moment to triumphantly emerge.

You often see the same pattern when you look at the movies in which similar moments occur. For instance, there’s a scene in Annie Hall—recently voted the funniest screenplay of all time—in which the audience needs to be told that Alvy and Annie are heading for Los Angeles. To incorporate that information, which had been lost when a previous scene was cut, Woody Allen quickly wrote and shot the bit in which he sneezes into a pile of cocaine. It included all the necessary exposition in the dialogue, but as editor Ralph Rosenblum writes in his memoir When the Shooting Stops:

Although this scene was written and shot just for this information, audiences were always much more focused on the cocaine, and when Woody sneezes into what we’ve just learned is a two-thousand-dollar cache, blowing white powder all over the living room—an old-fashioned, lowest-common-denominator, slip-on-the-banana-peel joke—the film gets its single largest laugh. (“A complete unplanned accident,” says Woody.) The laughter was so great at each of our test screenings that I kept having to add more and more feet of dead film to keep the laughter from pushing the next scene right off the screen…Even so, the transitional information was lost on many viewers: when they stop laughing and spot Alvy and Annie in a car with Rob, who’s discussing how life has changed for him since he emigrated to Beverly Hills, they are momentarily uncertain about how or why the couple got there.

"As they ran, neither woman spoke..."

And while the two moments are very different, it’s revealing that in both cases, an improvised moment of slapstick was introduced to crack an unanticipated narrative problem. It’s no surprise that when writers have to think their way out of a dilemma, they often turn to the hoariest, most proven building blocks of story, as if they’d briefly written a scene using the reptile brain—while keeping all the other levels of the brain alive and activated. This is why scenes like this are so delightful: they aren’t gratuitous, but represent an effective way of getting a finely tuned narrative to where it needs to be. And I’d also argue that this runs in both directions, particularly in genre fiction. Those big, obvious moments exist to enable the more refined touches, but also the other way around: a large part of any writer’s diligence and craft is devoted to arranging the smaller pieces so that those huge elements can take shape. As Shane Black pointed out years ago, a lot of movies seem to think that audiences want nothing but those high points, but in practice, it quickly grows exhausting. (Far too many comedies these days seem to consist of nothing but the equivalent of Alvy sneezing into the cocaine, over and over and over again.)

My novels contain a few of these banana peels, although not as many as I’d like. (One that I still enjoy is the moment in City of Exiles when Wolfe trips over the oversized chess pieces during the chase scene at the London Chess Classic.) And while it’s not quite the same thing, there’s something similar at work in Chapter 43 of Eternal Empire, which features nothing less than a knock-down, drag-out fight between two women, one a runaway bride, the other still wearing her bridesmaid’s dress. If I’ve done my job properly, the scene should work both on its own terms and as an homage to something you’d see on a soapy network or basic cable show like Revenge. And I kind of love the result. I like it, in part, because I know exactly how much thought was required to line up the pieces of the plot to get to this particular payoff: it’s the kind of set piece that you spend ninety percent of the novel trying to reach, only to hope that it all works in the end. The resulting fight lasts for about a page—I’m not trying to write Kill Bill here—but I still think it’s one of the half dozen or so most satisfying moments in the entire trilogy, and it works mostly because it isn’t afraid to go for a big, borderline ridiculous gesture. (If Eternal Empire is my favorite of the three books, and on most days it is, it’s because it contains as many of those scenes as the previous two installments combined, and only because of the groundwork that comes with two volumes’ worth of accumulated backstory.) And although there’s no banana peel, both Wolfe and Asthana are falling now, and they won’t land until the book is over…

Laugh and let die

Jennifer Lawrence in American Hustle

Note: Minor spoilers follow for American Hustle and The Wolf of Wall Street.

Ever since the Golden Globes, there’s been a lot of talk about the state of modern cinematic comedy, and especially about how the category has expanded to include films that we wouldn’t necessarily classify with the likes of Airplane! Two of the year’s presumptive Oscar frontrunners, American Hustle and The Wolf of Wall Street, are ostensible comedies that are really closer in tone to Goodfellas, and along with the other nominees for the Golden Globe for Best Musical or Comedy—Her, Nebraska, and Inside Llewyn Davis—they made for a rather melancholy slate. Which isn’t to say that these movies aren’t consistently, brutally funny. David O. Russell has become the hottest director in America thanks largely to his ability to marry a compassionate view of his characters to a prankish, almost anarchic humor, and Scorsese has long been a stealth comic master. (Most of Scorsese’s great classics, with the possible exception of Raging Bull, could be recut into savage comedies, although probably at the expense of a “Layla” montage or two.) And what we’re seeing here is less a new development than a confirmation that comedy can, and should, emerge from some unexpectedly dark places.

I’ve noted before that the line between comedy and tragedy is finer than you might suspect, even at the highest levels: give Romeo and Juliet a happy ending, and you have a play that is tonally indistinguishable from All’s Well That Ends Well. Shakespeare incorporates the threat of death into many of his problem comedies, and although it’s narrowly averted in the end, we’re still left with a sense that it could have gone either way. You might even argue that it’s the relative absence of death that allows American Hustle and Wolf to squeak into comedic territory. Nobody dies in American Hustle—unless you count a brief flashback, almost too quick to process, to an unrelated contract killing—and the stakes are exclusively emotional: Russell prefers to mine conflict from his characters, rather than generating suspense in more conventional ways, and we’re too interested in their interactions to be overly concerned about whether they’ll get away with their central con, much less get whacked by the mob. The Wolf of Wall Street doesn’t contain much in the way of death, either, and the most lamented character is a distant relative whose offscreen demise leaves millions of dollars inconveniently stranded in Switzerland. (Jordan Belfort’s grief at this, needless to say, is perfectly genuine.)

Leonardo DiCaprio in The Wolf of Wall Street

And yet the idea of risk, physical and emotional, is central to both movies, as it is to many of the greatest comedies. If contemporary comedies suffer from one flaw, it’s that they often take place in a sanitized world devoid of danger, when it’s really in response to danger that laughter is most cathartic. Many of the biggest laughs I’ve had at the movies have been at lines or moments that stand in contrast to a mood of mounting tension or excitement: think of the Indiana Jones trilogy, the films of Quentin Tarantino, or the Bruce Willis movie of your choice. It’s perhaps no accident that both American Hustle and The Wolf of Wall Street are joined, oddly, by musical homages to James Bond: a cover of “Goldfinger” plays in the background of Belfort’s lavish wedding, and Jennifer Lawrence’s showstopping rendition of “Live and Let Die” may be Hustle’s single most memorable moment. The Bond movies, many of which are thinly disguised comedies in themselves, know that we’re more likely to be amused by a gag when it emerges in counterpoint to action or violence. Bond’s frequently derided one-liners—“Shocking!”—have become a cliché, but like most other clichés in these movies, they exist because they fundamentally work.

That may be why there are surprisingly few “pure” comedies among my own favorite movies. When a film wants nothing more than to make us laugh, I’m likely to find it a little unsatisfying: the best jokes are all about surprise, or catching us with our guard down, which is why a movie that tries to spring a gag every minute can start to seem thin and forced. (This also works the other way around: a movie that is unrelentingly grim can feel equally untrue to life.) Humor is at its most powerful when it’s set against a dramatic baseline, however exaggerated, that provides a contrast to the moments when the comedy erupts. The best movies of Wes Anderson, not to mention Woody Allen, are strangely preoccupied with death, and Kubrick’s genius lay in constructing movies that were so finely poised between comedy and tragedy that they evolve in our own minds between viewings: The Shining becomes a richer, more baroque comedy each time I see it, and Eyes Wide Shut is really a farce played at the speed of a dirge. My favorite description of any of Kubrick’s films is Paul Thomas Anderson’s take on Barry Lyndon: “When I saw it, I thought it was very serious, and then I saw it the second time, and I said, ‘This is fucking hilarious!'” And that’s the zone in which real comedy thrives.

Comedy and the high-functioning psychotic

Patton Oswalt

Last week, I noted that there isn’t a lot of humor in my writing, which I chalked up to the fact that I’ve spent most of my life as an author developing other elements of fiction. However, there’s an alternative explanation: maybe I’m just not wired for comedy, at least not compared to those who are good enough to do it for a living. A recent study in the British Journal of Psychiatry reports that comedians tend to score higher on four categories of psychotic traits, measured against a control group of actors—who are used to going on stage—and individuals working in noncreative fields. These tendencies include “unusual experiences,” such as a belief in psychic phenomena; impulsive or antisocial behavior; difficulty in focusing for long periods of time; and anhedonia, or the inability to feel pleasure. (This last quality won’t come as a surprise to anyone familiar with the work of Woody Allen, whose working title for the film that eventually became Annie Hall was Anhedonia.) And although any study like this needs to be taken with a grain of salt, it certainly seems consistent with what we know about the inner lives of most comedians.

But there’s also a paradox here, which is that performing comedy on stage is one of the most demanding of all creative professions, and those who succeed at it have invested inhuman amounts of time and energy into perfecting their craft, which requires all the traits of focus and organization that true psychotics might seem to lack. In an interview last week with The A.V. Club, Patton Oswalt described his process of preparation for his first HBO special in terms more suited to an athletic event:

It was my dream to do a half-hour on HBO, so that was a big deal for me. I know this is such a cliché and other comedians have said this, but I did treat it like a prizefight. I treated it like I’m ready to go in to the most perfect half-hour that I could, and I was doing sets non-stop. In clubs, I just booked myself everywhere leading up to it. On the nights that I had off, I would find a stage somewhere where I could work on a big chunk of it over and over and over again.

Trevor Noah

Later, however, Oswalt was told by his director that he’d been preparing so obsessively that he’d drained the life out of the material—he’d performed the same routines so often that all the spontaneity was gone. The director’s advice: “Tomorrow night I want you to go out, go get dinner, go see a movie, forget that you’re doing the special, and then on Friday, when you do the special, I guarantee you, it’ll all come rushing back into your head, and you’ll have a great set.” And it worked. Which gets close to the heart of the inherent contradiction, and the challenge, of being a good comedian. If comedy’s origins lie in the kinds of lateral, nonlinear thinking that we associate in their more extreme forms with certain kinds of mental illness, its execution depends on systematic rehearsal, revision, and refinement, all of which has to remain invisible to the audience. The perfect comedy set has an organic, inevitable structure, but it also feels as if it’s being thought up by the performer on the fly, no matter how many weeks or months of preparation really lie behind every line.

That’s true of all great works of art, of course; if we’re aware of the effort that has gone into an artistic production, it implies that the effort wasn’t enough. (Or as Whistler puts it: “Industry in art is a necessity—not a virtue—and any evidence of the same, in the production, is a blemish, not a quality; a proof, not of achievement, but of absolutely insufficient work, for work alone will efface the footsteps of work.”) In comedy, that balance between the spontaneous and the structured is especially crucial, and difficult, which is why there are so many more psychotics than there are comedians, and so few great comedians overall. Watching a flawless comedy set, like Trevor Noah’s eight perfect minutes on Live at the Apollo in London, allows us to see a virtuoso craftsman at work, and it’s all the more remarkable when we consider how seamlessly these raw emotional materials have been transmuted into a skilled performance. It requires a personality both sane enough to master a nearly impossible skill and crazy enough to be drawn to it in the first place. And although that combination may be rare, it’s more amazing that it happens at all.

Written by nevalalee

January 20, 2014 at 9:47 am

How to think in the shower

I realized recently that what one thinks about in the shower in the morning is more important than I’d thought. I knew it was a good time to have ideas. Now I’d go further: now I’d say it’s hard to do a really good job on anything you don’t think about in the shower.

Paul Graham

I know what he means. For as long as I can remember, my morning shower has been my best thinking time, the protected space in which I can most comfortably work through whatever problems I’m trying to solve. And while it’s easy to let your mind wander, which, as Graham points out, is a good way of discovering what really matters to you at the moment, I’ve decided that this time is too precious to be left entirely to chance. When I’m writing a novel, I try to look over my notes for the day just before I turn on the water, and I usually find that I’ve come up with a number of new ideas before it shuts off. If I’m stuck for a topic for a blog post, I’ll take whatever sliver of inspiration I can—often in the form of one of Brian Eno’s Oblique Strategies—and mull it over for five minutes as the shower runs. More often than not, I’ll emerge with something useful. It works so consistently, in fact, that I’ve come to see it as an essential part of my writing routine, an extension of my office or brain. And I’m far from alone in this. Woody Allen, for instance, takes his showers very seriously:

I’ve found over the years that any momentary change stimulates a fresh burst of mental energy…The shower is particularly good in cold weather. This sounds so silly, but I’ll be working dressed as I am and I’ll want to get into the shower for a creative stint. So I’ll take off some of my clothes and make myself an English muffin or something and try to give myself a little chill so I want to get in the shower. I’ll stand there with steaming hot water coming down for thirty minutes, forty-five minutes, just thinking out ideas and working on plot. Then I get out and dry myself and dress and then flop down on the bed and think there.

Allen here is as insightful as always—if you haven’t checked out Eric Lax’s Conversations With Woody Allen, from which this quote is taken, you really should—but he’s particularly shrewd on identifying a shower as a moment of change. In the shower, we’re taken out of our usual environment; we become semiaquatic creatures, in a humid little cube, and it’s at such points of transition that our minds are likely to move in promising directions.

There are other ways of encouraging this kind of mental and physical shift, most of them linked to relaxing, unconscious activities: taking a walk, doing routine chores, shaving. But there’s also something about the shower itself that seems especially conducive to mental activity. Alone, unclothed, we’re in a particularly vulnerable state, which is what makes the shower’s most famous cinematic appearance so effective. All the same, we’re in a state of relaxation, but also standing, and although I know that a lot of writers have done good thinking in the bathtub, I don’t think it’s quite as conducive to the kind of focused mental trip that the shower provides. You can read in the bathtub, after all, as long as you’re careful with the pages, while the shower is an enforced citadel of quiet. Hanging a radio or, worse, an iPad on the tile robs us of one of our last remaining fortresses of solitude. It’s best just to stand there in the cone of white noise that the cascade of water creates, as removed from the world as we can be while still remaining awake, and it’s the best time I know for uninterrupted, right-brained, intuitive thought.

And keeping an eye on your thoughts in the shower isn’t just a way of working through problems, but of clarifying which problems really matter. To close on Paul Graham once again:

I suspect a lot of people aren’t sure what’s the top idea in their mind at any given time. I’m often mistaken about it. I tend to think it’s the idea I’d want to be the top one, rather than the one that is. But it’s easy to figure this out: just take a shower. What topic do your thoughts keep returning to? If it’s not what you want to be thinking about, you may want to change something.

In the shower, we come as close as we can to who we really are when all the masks are gone, and we can learn a lot about ourselves by seeing where our minds wander. My own shower has a little window that looks out on my backyard, and I’ll often catch myself looking out at the square of lawn behind my house, thinking over my life, what I’ve accomplished, and what still remains to be done. It’s something like the state we enter as we’re drifting off to sleep, but with our eyes wide open. When we emerge, we’re refreshed and at peace, with a new perspective on the tasks ahead. If this were a new invention, it would seem like magic. And it is.

Written by nevalalee

December 4, 2013 at 8:44 am

Keeping it short

Elie Wiesel

Yesterday, I noted that Shoah, Claude Lanzmann’s epic film about the Holocaust, uses its own enormous length as a narrative strategy: its nine-hour runtime is a way of dramatizing, assimilating, and ultimately transforming the incomprehensible vastness of its subject. But there are other valid approaches as well, even to similar material. Here’s Elie Wiesel talking to The Paris Review:

I reduced nine hundred pages [the original length of Night] to one hundred sixty pages. I also enjoy cutting. I do it with a masochistic pleasure although even when you cut, you don't. Writing is not like painting where you add. It is not what you put on the canvas that the reader sees. Writing is more like a sculpture where you remove, you eliminate in order to make the work visible. Even those pages you remove somehow remain. There is a difference between a book of two hundred pages from the very beginning, and a book of two hundred pages which is the result of an original eight hundred pages. The six hundred pages are there. Only you don't see them.

Instead of expanding his work to encompass the enormity of the events involved, Wiesel cuts it down to its core. It’s just one of millions of such stories that could have been told, and its power is only increased by the sense that it’s a single volume in an invisible library of libraries.

A big book is immediately impressive, even newsworthy, but if anything, the author's hand is more visible in shorter works. The implicit premise of a long book is that it's giving us an entire world, and in many of the great social epics—from War and Peace to A Suitable Boy—the writer himself is invisible by design. A short work, by contrast, is more about selection, and it foregrounds the author's choices: the boundaries of the narrative are set within a narrow window, and the result is just as evocative for what it omits as for what it includes. Every painter knows that one of the hardest decisions in making a new composition is knowing where to put the frame. If a big novel is the literary equivalent of a huge pane of plate glass, a short book is more like what the great architect Christopher Alexander has called a Zen view, a tiny opening in a wall that only exposes a fraction of the landscape. When we see a spectacular panorama all at once, it becomes dead to us after a day or two, as if it were part of the wallpaper; if we view it through a tiny opening, or glimpse it only as we pass from one room to the next, it remains vital forever, even if we live with it for fifty years. A short work of narrative sets up some of the same vibrations, with a sense that there's more taking place beyond the edge of the pane, if only we could see it.

Woody Allen

A shorter length is also more suited for stories that hinge on the reader's suspension of disbelief, or on the momentary alignment of a few extraordinary factors. This includes both comedy and its darker cousin noir. Great comic works, whether in fiction, film, or drama, tend to be relatively short, both because it's hard to sustain the necessary pitch for long and because the story often hinges on elements that can't be spun out forever: coincidence, misunderstanding, an elaborate series of mistakes. Another turn of the screw and you've got a thriller, which tends to be similarly concise. Some of the best suspense novels in the language were written to fit in a pocket: The Postman Always Rings Twice is maybe 120 pages long, Double Indemnity even shorter, the Travis McGee books a reliable 150 or so. Like comedy, noir and suspense are built on premises that would fall apart, either narratively or logically, if spun out to six hundred pages: characters are presented to us at their lowest point, or at a moment of maximum intensity, and it doesn't particularly matter what they were doing before or after the story began. That kind of concentration and selectiveness is what separates great writers from the rest: the secret of both comedy and suspense is knowing what to leave out.

And that’s equally true of the movies, even if it’s something that a filmmaker discovers only after hard experience. Cutting a novel can be agonizing, but it’s all the more painful to excise scenes from a movie, when the footage you’re removing represents hundreds or thousands of hours of collective effort—which is why an editor like Walter Murch never visits the set, allowing him to remain objective. There’s no better contemporary model of cinematic brevity than Woody Allen, whose movies rarely run more than ninety minutes, partly because his own attention starts to wander: “For me, if I make a film which is one hour forty minutes, it’s long. I just run out of story impetus after a certain time.” And although he’s never said so in public, it’s clear that he arrived at this artistic philosophy in the late seventies, after laboring hard with the screenwriter Marshall Brickman on a three-hour monster of a comedy. Its working title was Anhedonia, and it was going to cover every aspect of its protagonist’s life—childhood, career, romance—with countless surreal sketches and fantasy sequences. The result was an unwatchable mess, so it was only with the help of editor Ralph Rosenblum that Allen was able to find its heart: a quirky, focused love story, with only two major characters, that ran a clean 93 minutes. It was Annie Hall.

Quote of the Day

with one comment

Written by nevalalee

October 4, 2013 at 7:30 am

Posted in Movies, Quote of the Day


“My biggest life lesson…”

leave a comment »

I made the statement years ago, which is often quoted, that eighty percent of life is showing up. People used to always say to me that they wanted to write a play, they wanted to write a movie, they wanted to write a novel, and the couple of people that did it were eighty percent of the way to having something happen. All the other people struck out without ever getting that far. They couldn't do it, that's why they don't accomplish a thing, they don't do the thing, so once you do it, if you actually write your film script, or write your novel, you are more than halfway towards something good happening. So I would say that is my biggest life lesson that has worked. All others have failed me.

Woody Allen, to Collider

Written by nevalalee

September 22, 2012 at 9:50 am

For love or money

with 2 comments

Whenever I think about the relationship between writing and money, I remember an exchange in What’s New Pussycat? between Peter O’Toole and Woody Allen:

O’Toole: Did you find a job?
Allen: Yeah, I got something at the striptease. I help the girls dress and undress.
O’Toole: Nice job.
Allen: Twenty francs a week.
O’Toole: Not very much.
Allen: It’s all I can afford.

It’s a great gag, but the reason I like it so much is that it points to a universal truth: when we’re doing what we love for a living, we’ll gladly pay for the privilege. (Incidentally, this exchange, which you can watch starting at the 2:53 mark here, forms part of Allen’s movie debut, which shows how fully realized his persona was from the very beginning.)

Here’s another example. I have a friend who loves to knit, and whenever I see her, she’s always working on scarves and socks as gifts for friends. (She even hopes to raise goats for their wool one day.) When she’s asked if she’d ever consider selling her work on Etsy, however, she says no. Why? Given how much effort and energy she invests in one pair of socks, she says, she’d have to sell them for something like three hundred dollars in order to be fairly compensated for her time. Knitting by hand is a losing proposition, at least in financial terms, but she does it because she enjoys it. This is true of a lot of hobbies, even when we get paid for our work. When we bring the tomatoes from our garden to sell at the farmer’s market, we don’t expect to break even on the transaction, but it’s still gratifying to make the sale.

And this is often true of writing as well. Even setting aside the fact that I do a lot of my writing for free—I haven’t seen a cent from this blog, for one thing—the writing I do for money doesn’t always make sense from a financial point of view. When I publish a story in Analog, for instance, I get paid, at most, seven cents a word. Given the fact that it takes me two solid weeks to research, outline, and write even a relatively short story, when I do the math, I find that I’m basically working for minimum wage. And this is one of the best possible outcomes for this kind of writing. Analog, as it happens, is at the high end of what science fiction magazines can pay these days, with many of the smaller magazines, in any genre, essentially asking authors to write for free. The days in which a writer like Isaac Asimov could make a comfortable living from his short fiction alone are long gone.

So why do I do it? Mostly because I grew up loving the kinds of stories that Analog publishes, and I’m still tickled by the prospect of appearing in its pages, to the point where I’ll more or less pay for the chance, at least when you measure my work in terms of its opportunity cost. For the past couple of years, I’ve been in the enviable position of having at least one story in the pipeline at all times, but after my novelette “The Voices” comes out next month in the September issue, I won’t have anything coming up. And although my schedule this year is uncomfortably packed as it is, I’ll almost certainly take a couple of weeks off at some point to knock out another story, without any guarantee of acceptance, even though my time could be more profitably spent in other ways. And if I could, I’d do this even more often. One short story a year isn’t very much. But it’s all I can afford.

Written by nevalalee

May 31, 2012 at 9:53 am

The pause that refreshes

leave a comment »

For most writers, working too hard is the least of their problems, but sometimes it’s necessary to slow down. In this respect, I’m a bigger offender than most. As regular readers will know, I’m a member of the cult of productivity: I believe that in order to write well, you need to write a lot, and I take pride in the fact that I can reliably crank out a few pages on demand. (Although not without the preliminary work of brainstorming, researching, and outlining, which effectively triples my writing time, without even counting revision.) Yet as I start the process of outlining The Scythian, I’m repeatedly reminded of the fact that it’s occasionally good to pause, look around, and see where you are. Because it’s in the moments between sessions of furious activity, when no visible work is being done, that some of our most important insights take place.

In the old days, writers found plenty of occasions to pause during the day, simply because their materials demanded it. You had quills to cut, inkwells to fill, or, later, typewriter ribbons to replace. (Not to mention figuring out how to reboot WordPerfect.) These tasks were tedious, but they also provided useful intervals of downtime. I never get tired of quoting these lines from Behind the Seen about the great film editor Walter Murch, who found moments of surprising introspection on an old-fashioned editing machine:

As Murch often points out, the simple act of having to rewind film on a flatbed editing machine gave him the chance to see footage in other contexts (high-speed, reverse) that could reveal a look, a gesture, or a completely forgotten shot. Likewise, the few moments he had to spend waiting for a reel to rewind injected a blank space into the process during which he could simply let his mind wander into subconscious areas.

These days, of course, with modern editing systems and word processing programs, such blank spaces have become harder to find. (Although it’s likely that later generations will look back with amazement on how we managed to get so much work done without the benefit of neural implants.) And while Word still crashes from time to time—in my case, for some reason, whenever I try to use the highlighting tool—that isn’t a substitute for more regular pauses.

In fact, I suspect that many of the brainstorming tools used by writers, including myself, are actually veiled ways of slowing down the creative process, which allows the two hemispheres of the brain to fall into line. Mind maps are a great example. I’ve found that mind maps drawn by hand are infinitely more useful than those made with a computer program, simply because they take longer to make. When I’m seated with a pad of cheap paper, letting my pen wander across the page, I have no choice but to slow down and let my thoughts wander at the same pace as the physical act of writing. As a result, when I’m reviewing the action of the scene I’m outlining, I find myself drilling deeper into individual moments, when I might have hurried past them if I were typing lines into a text box. The activity itself doesn’t really matter: the important thing is to ruminate for an hour or so at a fairly slow speed. Drawing a mind map conveniently gives my eye and hand something to do while my brain does the work.

Other writers will find their own ways of inserting a pause into the creative process. Often just the act of getting up from one’s desk, walking around the room, and doing a few chores—although nothing mentally taxing—will allow the brain to relax. I’ve spoken before of how shaving is the perfect activity for this sort of thing, and I’m not the only one. Here’s Laurence Sterne, author of Tristram Shandy, on dealing with writer’s block:

For if a pinch of snuff, or a stride or two across the room will not do the business for me—I take a razor at once; and having tried the edge of it upon the palm of my hand, without further ceremony, except that of first lathering my beard, I shave it off.

Woody Allen, as I’ve noted before, takes a shower or a walk in the park, and I’ll often get ideas while doing the dishes. Just about anything, in fact, can be used to insert a pause into one’s routine—except going online. Not every writer needs to go as far as Jonathan Franzen, who glued an Ethernet cable into his laptop and broke it off, but it’s worth remembering that nearly all the time you spend online could be more profitably used somewhere else, even if that means doing nothing at all. Which raises the question, of course, of why you’re even reading this post…but lucky for you, I’m done.

Written by nevalalee

February 2, 2012 at 9:51 am

The lost years of Alexander Payne

leave a comment »

It’s hard to believe that seven years have passed since the release of Alexander Payne’s Sideways. When I first saw it, I thought it was close to perfect, if resolutely minor, but if anything, it has grown even more impressive over time—and in retrospect, it’s more clearly a predictor of the last decade’s dominant strain in comedy. In the years since it first appeared, countless directors have tried to recreate its heady mixture of slapstick and intensely observed discomfort—much of Judd Apatow’s recent output feels like a younger, hipper version of Payne’s work, and both Jason Reitman and The Office owe a lot to it as well—but none has ever quite managed to satisfy its audience on so many levels. In fact, I loved it so much that I wrote at the time, without any sense that I was voicing an ironic prophecy: “If there’s any director who ought to make an annual movie for the next twenty years, it’s Alexander Payne.”

Cut to the present day, when Payne’s lack of productivity has been so notorious that it inspired its own article in the New York Times. Payne hasn’t been inactive—he developed various projects, shot the pilot for Hung, and was “credited” with longtime writing partner Jim Taylor on the script for I Now Pronounce You Chuck and Larry—but it’s still a startling gap between Sideways and his new film The Descendants. It’s true that the vagaries of Hollywood can lead to long hiatuses in careers for no discernible reason, but Payne himself seems to view his case as exceptional: he says that he hoped to have directed five more movies by now, and intends to pick up the pace. But there’s no escaping a sense that he may have lost some of the most productive years of his life. The Times article, by Frank Bruni, ends on a sobering note:

“They say you can do honest, sincere work for decades, but you’re given in general a 10-year period when what you do touches the zeitgeist—when you’re relevant,” [Payne] observed during another of our talks. “And I’m aware of that, and I don’t want my time to go by.”

Did the seven years between Sideways and The Descendants eat up some of his charmed decade, or is that decade just beginning now?

He was silent a few seconds.

“I have no idea,” he said.

That said, it’s hard to imagine Payne, or anyone else, making a movie like The Descendants every year. It’s precisely the film by Payne that everyone was hoping to see: small, intimate, agonizingly well-observed, yet emotionally and thematically ambitious in a way that sneaks up on you over time. It’s so modest in tone that it’s easy to overlook how beautifully shot and designed it is: its locations, its art direction, even the clothes by Wendy Chuck—all those Hawaiian shirts tucked into khakis!—are among the most subtly satisfying I’ve seen all year. And, not least of all, it features George Clooney’s most moving performance. Payne has always been great with actors, and watching what he does here with Clooney, who gets to indulge in everything from broad physical comedy to moments that draw on Brando’s scene with his dead wife in Last Tango in Paris, makes you feel the loss of the past seven years even more keenly.

Payne’s case is a difficult one, because he’s a formal perfectionist who tells shaggy human stories that feel as if they should be more numerous than they are. And while The Descendants was worth the wait, there’s still a sense of incompleteness to Payne’s filmography, as if a few lines had gone missing on IMDb. I don’t know if Payne, or his audience, would be any happier if he had been more like Woody Allen, who makes two minor films in a row so that everyone dismisses him, then comes roaring back with Midnight in Paris. But Allen’s career, with its amazing variety and productivity, comes closest to a model of what a director like Payne should be. Now that Spike Lee is taking a break, Allen is one of the few major directors who makes a virtue out of quantity—which, as I’ve noted here before, is often what makes quality possible. The Descendants is a great movie. And it makes me sincerely hope that we aren’t at the end of Payne’s ten years of relevance, but the beginning.

Written by nevalalee

November 21, 2011 at 9:59 am

Turn off, tune out, drop in

with 3 comments

For most of the past decade, I’ve been wearing white headphones. I got my first iPod nine years ago, when I was a senior in college, and at the time, I thought it was the most beautiful thing I’d ever seen. (Today, it looks like a big brick of lucite, but that’s another story.) I’ve updated my music player twice since then, and there’s rarely been a day when I didn’t put on those white earbuds. I drive only very rarely and walk or take public transit almost everywhere around Chicago, as I did when I was living in Boston and New York, so the iPod and its successors have always been a big part of my life. But now, reluctantly, I’m starting to let it go. And I’m writing this post partly as a way of reminding myself why.

I’d been thinking about taking the headphones off for a long time, but it was only last week, when I saw the documentary Public Speaking, that I decided to do something about it. Public Speaking is Martin Scorsese’s loving portrait of occasional writer and professional raconteur Fran Lebowitz. (On her legendary writer’s block: “It’s more of a writer’s blockade.”) Lebowitz doesn’t own a cell phone, a BlackBerry, or a computer, and seems vaguely puzzled by those who do. In the film, while miming someone texting furiously, she notes that when you’re down there, on your mobile device, you’re nowhere else, including wherever you happen to be. And much of Lebowitz’s own brilliance and charm comes from her intense engagement with her surroundings.

None of this is exactly groundbreaking, of course, but for whatever reason, it crystallized something in my own mind. For a while, I’ve been obsessed by the fact that every moment in a writer’s life is, potentially, a time that can be used for creation. A writer can’t be working all the time, of course—that way lies madness—but much of the art of surviving as an artist is knowing how to exploit what stray moments of creativity we’re given. Many of my best ideas have popped spontaneously into my head, as I’ve said in the past, while shaving, or while doing otherwise mindless chores like washing the dishes. I’ve quoted Woody Allen on this point before, but because it’s some of the most useful writing advice I know, I’ll quote him again, from Eric Lax’s great Conversations with Woody Allen:

I never like to let any time go unused. When I walk somewhere in the morning, I still plan what I’m going to think about, which problem I’m going to tackle. I may say, This morning I’m going to think of titles. When I get in the shower in the morning, I try to use that time. So much of my time is spent thinking because that’s the only way to attack these writing problems.

And walking alone, as Colin Fletcher and others have realized, is perhaps the best time for thinking. I’ve rarely had to deal with a plot problem that couldn’t be solved, all but unconsciously, by a short walk to the grocery store. And yet here’s the thing: when my iPod is playing, it doesn’t work. Music, I’m increasingly convinced, anesthetizes the right side of the brain. Sometimes it can help your mind drift and relax, which can lead to insight as well, but for the most part, it’s an excuse to avoid leaving yourself open to ideas—which is unacceptable when you’re counting on those ideas to survive. So from now on, whenever I go out, I’m leaving the headphones at home. Not all the time, perhaps: there are times when I just need to hear, I don’t know, “Blue Monday.” But for the most part, for the first time in years, I’m going to try and listen to my thoughts.

Written by nevalalee

July 26, 2011 at 9:04 am

Midnight in Paris and the true golden age

with one comment

Yesterday my wife and I finally caught a screening of Midnight in Paris, which is already on track to become Woody Allen’s highest-grossing movie since Hannah and Her Sisters. While it’s definitely one of Allen’s slighter films, it’s easy to see why it’s doing so well: it’s clever and fun, and by the end, it’s hard not to be charmed by its premise. I was especially envious of the fact that my wife managed to enter the theater without knowing the movie’s central twist, which is that—spoiler alert—the main character, played by Owen Wilson, travels back in time to Paris in the 1920s, allowing him to rub elbows with Hemingway, the Fitzgeralds, Gertrude Stein, and many other luminaries. (I was really hoping for a cameo by Duchamp, but had to settle for Dali and Man Ray.)

The funny thing is that even though I liked the movie a lot, I responded more to its air of Parisian romance (the cinematography, by the legendary Darius Khondji, is gorgeous) than to its underlying conceit, which is that it would be awesome to have the chance to hang out with your favorite writers. In my own experience, writers generally aren’t great company: the best ones put so much of themselves into their work that there isn’t much left for social niceties. And that applies to great writers as much as to anyone else. Joyce and Proust met only once, at a party thrown by art patrons Violet and Sydney Schiff, and while they evidently shared a carriage ride home, they didn’t have much to say to each other. (Proust, apparently, spent most of the night complaining about his health problems.)

And in the end, the books themselves are more than enough. It’s possible to know Proust more intimately than just about any other person, because he put so much of himself into his writing. Reading, as others have pointed out, is the only form of time travel that we’re currently afforded, and the nice thing about being a reader in the present is that you can access so much of previous eras. One of the messages of Midnight in Paris is that every generation, even the ones that we idealize today, has looked back to a lost golden age. But objectively speaking, if there’s a real golden age, it’s right now, even if you’re the kind of person—like me—who tends to be stuck in the past. There’s simply more past than ever before, in libraries, record shops, movie houses, and, yes, even online. And I’d never want to give up any of it.

That said, it’s still fun to think about what your own golden age might be (as the AV Club did last year, in one of my favorite Q&As). I’d happily spend an afternoon with any version of Orson Welles, or, if we’re going to restrict ourselves to a more recent period, with Coppola and the Zoetrope Studios, ideally in the narrow window after Apocalypse Now and before One From the Heart. As I’ve mentioned before, I’d love to go back in time to Berkeley of the 1970s. And there’s something very tempting about that party with Proust and Joyce, which was also attended by Picasso, Stravinsky, and Diaghilev. In the end, though, I’m happiest here, because I can enjoy the best of the past and look forward to more to come. The trouble with going back in time, after all, is that you’d know all that was coming, good and bad, and would never have the chance to be surprised by a masterpiece—or even just a very good Woody Allen movie. And where’s the fun in that?

Woody Allen on the discipline of writing

with one comment

Woody Allen: I used to get at it [writing] early in the morning and work at it and stay at it and write and rewrite and rethink and tear up my stuff and start over again. I came up with such a hard-line approach—I never waited for inspiration; I always had to go in and do it. You know, you gotta force it. So I could always do the writing and rewriting because I’d force myself. I found a million little tricks over the years to help get through that unpleasant time…

Eric Lax: What are some of the million little tricks you’ve found?

Woody Allen: Always setting myself something to think about for the project at any given free moment: When I go into the shower in the morning; when I go to sleep at night; when I’m waiting for an elevator. Somebody told me years ago about a major league pitcher who always wanted to be a pitcher. When he was growing up on his farm his father told him, “Whenever you’re sitting around pick up a stone and try and hit a blade of grass with it, try and hit a twig with it. Make use of every moment.” And that sounds very logical to me.

Eric Lax, Conversations with Woody Allen

Written by nevalalee

April 3, 2011 at 12:00 am

Beyond the valley of procrastination

with 4 comments

Whenever I think about the virtues of procrastination, and how misunderstood a part of the creative process it is, I remember a story that Roger Ebert tells of the late director Russ Meyer. When they were working on the screenplay for Beyond the Valley of the Dolls—and yes, I’m aware that we aren’t exactly talking The King’s Speech here—Ebert writes:

Working with Meyer was exhilarating but demanding. He equated writing with typing. He kept his office door open, and whenever he couldn’t hear my typewriter keys, he’d shout, “What’s the matter?”

Meyer, in other words, felt that when a writer wasn’t physically typing, he was just wasting time. (I imagine that a lot of editors and studio executives feel the same way.) Yet every professional writer knows that maybe ten percent of his or her workday—at most—is spent actually typing. The rest is spent pacing, staring into space, or, most likely, goofing off on the Internet. And yet, with the possible exception of that last example, these are the times when the real writing occurs. In most cases, typing is only the working out of a conception that has already arisen from a much less expected place.

As I’ve mentioned before, Woody Allen sets himself plot problems to solve while he’s taking a walk or in the shower. I can testify from my own experience that when I assign myself a problem before I go to the grocery store, by the time I get home again, I’ve almost invariably solved it. Why? It might be that a change of scene puts my brain to work. It may even be a case of Faculty X, in which the left brain slows down long enough to let the right brain catch up. Whatever the reason, it’s fair to say that an act of procrastination can be creatively liberating in ways that discipline alone never can.

This might be the final, most mysterious secret of good writing: that it takes place at the most unexpected times. It can happen on walks, in the shower, or, in Nicholas Meyer’s case, in the bathtub. And when procrastination calls, it’s important to let it do its work. Without the structure of a daily routine, procrastination can easily turn into an excuse to avoid the hard work of writing; within that structure, though, it’s an indispensable part of the process. That’s why it’s important to build breaks into your schedule, to use downtime judiciously, and to be brave enough, when necessary, to be lazy.

Written by nevalalee

March 4, 2011 at 9:07 am
