Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

“He had played his part admirably…”

"Laszlo, the bosun of the megayacht..."

Note: This post is the forty-first installment in my author’s commentary for Eternal Empire, covering Chapter 40. You can read the previous installments here.

A few weeks ago, I briefly discussed the notorious scene in The Dark Knight Rises in which Bruce Wayne reappears—without any explanation whatsoever—in Gotham City. Bane’s henchmen, you might recall, have blown up all the bridges and sealed the city off from the military and law enforcement, and the entire plot hinges on the city’s absolute isolation. Bruce, in turn, has just escaped from a foreign prison, and although its location is left deliberately unspecified, it sure seems like it was in a different hemisphere. Yet what must have been a journey of thousands of miles and a daring incursion is handled in the space of a single cut: Bruce simply shows up, and there isn’t even a line of dialogue acknowledging how he got there. Not surprisingly, this hiatus has inspired a lot of discussion online, with most explanations boiling down to “He’s Batman.” If asked, Christopher Nolan might reply that the specifics don’t really matter, and that the viewer’s attention is properly focused elsewhere, a point that the writer John Gardner once made with reference to Hamlet:

We naturally ask how it is that, when shipped off to what is meant to be his death, the usually indecisive prince manages to hoist his enemies with their own petard—an event that takes place off stage and, at least in the surviving text, gets no real explanation. If pressed, Shakespeare might say that he expects us to recognize that the fox out-foxed is an old motif in literature—he could make up the tiresome details if he had to…

Gardner concludes: “The truth is very likely that without bothering to think it out, Shakespeare saw by a flash of intuition that the whole question was unimportant, off the point; and so like Mozart, the white shark of music, he snapped straight to the heart of the matter, refusing to let himself be slowed for an instant by trivial questions of plot logic or psychological consistency—questions unlikely to come up in the rush of drama, though they do occur to us as we pore over the book.” And while this might seem to apply equally well to The Dark Knight Rises, it doesn’t really hold water. The absence of an explanation did yank many of us out of the movie, however briefly, and it took us a minute to settle back in. Any explanation at all would have been better than this, and it could have been conveyed in less than a sentence. It isn’t an issue of plausibility, but of narrative flow. You could say that Bruce’s return to the city ought to be omitted, in the same way a director like Kurosawa mercilessly cuts all transitional moments: when you just need to get a character from Point A to Point B, it’s best to trim the journey as much as you can. In this instance, however, Nolan erred too far on the side of omission, at least in the eyes of many viewers. And it’s a reminder that the rules of storytelling are all about context. You’ve got to judge each problem on its own terms and figure out the solution that makes the most sense in each case.

"He had played his part admirably..."

What’s really fascinating is how frequently Nolan himself seems to struggle with this issue. In terms of sheer technical proficiency, I’d rank him near the top of the list of all working directors, but if he has one flaw as a filmmaker, aside from his lack of humor, it’s his persistent difficulty in finding the right balance between action and exposition. Much of Inception, which is one of my ten favorite movies of all time, consists of the characters breathlessly explaining the plot to one another, and it more or less works. But he also spends much of Interstellar trying with mixed success to figure out how much to tell us about the science involved, leading to scenes like the one in which Dr. Romilly explains the wormhole to Cooper seemingly moments before they enter it. And Nolan is oddly prone to neglecting obligatory beats that the audience needs to assemble the story in their heads, as when Batman appears to abandon a room of innocent party guests to the Joker in The Dark Knight. You could say that such lapses simply reflect the complexity of the stories that Nolan wants to tell, and you might be right. But David Fincher, who is Nolan’s only peer among active directors, tells stories of comparable or greater complexity—indeed, they’re often about their own complexity—and we’re rarely lost or confused. And if I’m hard on Nolan about this, it’s only a reflection of how difficult such issues can be, when even the best mainstream director of his generation has trouble working out how much information the audience needs.

It all boils down to Thomas Pynchon’s arch aside in Gravity’s Rainbow: “You will want cause and effect. All right.” And knowing how much cause will yield the effect you need is a problem that every storyteller has to confront on a regular basis. Chapter 40 of Eternal Empire provides a good example. For the last hundred pages, the novel has been building toward the moment when Ilya sneaks onto the heavily guarded yacht at Yalta. There’s no question that he’s going to do it; otherwise, everything leading up to it would seem like a ridiculous tease. The mechanics of how he gets aboard don’t really matter, but I also couldn’t avoid the issue, or else readers would rightly object. All I needed was a solution that was reasonably plausible and that could be covered in a few pages. As it happens, the previous scene ends with this exchange between Maddy and Ilya: “But you can’t just expect to walk on board.” “That’s exactly what I intend to do.” When I typed those lines, I didn’t know what Ilya had in mind, but I knew at once that they pointed at the kind of simplicity that the story needed, at least at this point in the novel. (If it came later in the plot, as part of the climax, it might have been more elaborate.) So I came up with a short sequence in which Ilya impersonates a dockwalker looking for work on the yacht, cleverly ingratiates himself with the bosun, and slips below when Maddy provides a convenient distraction. It’s a cute scene—maybe a little too cute, in fact, for this particular novel. But it works exactly as well as it should. Ilya is on board. We get just enough cause and effect. And now we can move on to the really good stuff to come…

“And what does that name have to do with this?”

"The word on the side of your yacht..."

Note: This post is the thirtieth installment in my author’s commentary for Eternal Empire, covering Chapter 29. You can read the previous installments here.

Earlier this week, in response to a devastating article in the New York Times on the allegedly crushing work environment in Amazon’s corporate offices, Jeff Bezos sent an email to employees that included the following statement:

[The article] claims that our intentional approach is to create a soulless, dystopian workplace where no fun is had and no laughter is heard. Again, I don’t recognize this Amazon and I very much hope you don’t, either…I strongly believe that anyone working in a company that really is like the one described in the [Times] would be crazy to stay. I know I would leave such a company.

Predictably, the email resulted in numerous headlines along the lines of “Jeff Bezos to Employees: You Don’t Work in a Dystopian Hellscape, Do You?” Bezos, a very smart guy, should have seen it coming. As Richard Nixon learned a long time ago, whenever you tell people that you aren’t a crook, you’re really raising the possibility that you might be. If you’re concerned about the names that your critics might call you, the last thing you want to do is put words in their mouths—it’s why public relations experts advise their clients to avoid negative language, even in the form of a denial—and saying that Amazon isn’t a soulless, dystopian workplace is a little like asking us not to think of an elephant.

Writers have recognized the negative power of certain loaded terms for a long time, and many works of art go out of their way to avoid such words, even if they’re central to the story. One of my favorite examples is the film version of The Girl With the Dragon Tattoo. Coming off Seven and Zodiac, David Fincher didn’t want to be pigeonholed as a director of serial killer movies, so the dialogue exclusively uses the term “serial murderer,” although it’s doubtful how effective this was. Along the same lines, Christopher Nolan’s superhero movies are notably averse to calling their characters by their most famous names: The Dark Knight Rises never uses the name “Catwoman,” while Man of Steel, which Nolan produced, avoids “Superman,” perhaps following the example of Frank Miller’s The Dark Knight Returns, which indulges in similar circumlocutions. Robert Towne’s script for Greystoke never calls its central character “Tarzan,” and The Walking Dead uses just about every imaginable term for its creatures aside from “zombie,” for reasons that creator Robert Kirkman explains:

One of the things about this world is that…they’re not familiar with zombies, per se. This isn’t a world [in which] the Romero movies exist, for instance, because we don’t want to portray it that way…They’ve never seen this in pop culture. This is a completely new thing for them.

"And what does that name have to do with this?"

Kirkman’s reluctance to call anything a zombie, which has inspired an entire page on TV Tropes dedicated to similar examples, is particularly revealing. A zombie movie can’t use that word because an invasion of the undead needs to feel like something unprecedented, and falling back on a term we know conjures up all kinds of pop cultural connotations that an original take might prefer to avoid. In many cases, avoiding particular words subtly encourages us to treat the story on its own terms. In The Godfather, the term “Mafia” is never uttered—an aversion, incidentally, not shared by the original novel, the working title of which was actually Mafia. This quietly allows us to judge the Corleones according to the rules of their own closed world, and it circumvents any real reflection about what the family business actually involves. (According to one famous story, the mobster Joseph Colombo paid a visit to producer Al Ruddy, demanding that the word be struck from the script as a condition for allowing the movie to continue. Ruddy, who knew that the screenplay only used the word once, promptly agreed.) The Godfather Part II is largely devoted to blowing up the first movie’s assumptions, and when the word “Mafia” is uttered at a senate hearing, it feels like the real world intruding on a comfortable fantasy. And the moment wouldn’t be as effective if the first installment hadn’t been as diligent about avoiding the term, allowing it to build a new myth in its place.

While writing Eternal Empire, I found myself confronting a similar problem. In this case, the offending word was “Shambhala.” As I’ve noted before, I decided early on that the third novel in the series would center on the Shambhala myth, a choice I made as soon as I stumbled across an excerpt from Rachel Polonsky’s Molotov’s Magic Lantern, in which she states that Vladimir Putin had taken a particular interest in the legend. A little research, notably in Andrei Znamenski’s Red Shambhala, confirmed that the periodic attempts by Russia to verify the existence of that mythical kingdom, carried out in an atmosphere of espionage and spycraft in Central Asia, were a rich vein of material. The trouble was that the word “Shambhala” itself was so loaded with New Age connotations that I’d have trouble digging my way out from under it: a quick search online reveals that it’s the name of a string of meditation centers, a music festival, and a spa with its own line of massage oils, none of which is exactly in keeping with the tone that I was trying to evoke. My solution, predictably, was to structure the whole plot around the myth of Shambhala while mentioning it as little as possible: the name appears perhaps thirty times across four hundred pages. (The mythological history of Shambhala is treated barely at all, and most of the references occur in discussions of the real attempts by Russian intelligence to discover it.) The bulk of those references appear here, in Chapter 29, and I cut them all down as much as possible, focusing on the bare minimum I needed for Maddy to pique Tarkovsky’s interest. I probably could have cut them even further. But as it stands, it’s more or less enough to get the story to where it needs to be. And it doesn’t need to be any longer than it is…

Gatsby’s fortune and the art of ambiguity

In November 1924, the editor Maxwell Perkins received the manuscript of a novel tentatively titled Trimalchio in West Egg. He loved the book—he called it “extraordinary” and “magnificent”—but he also had a perceptive set of notes for its author. Here are a few of them:

Among a set of characters marvelously palpable and vital—I would know Tom Buchanan if I met him on the street and would avoid him—Gatsby is somewhat vague. The reader’s eyes can never quite focus upon him, his outlines are dim. Now everything about Gatsby is more or less a mystery, i.e. more or less vague, and this may be somewhat of an artistic intention, but I think it is mistaken. Couldn’t he be physically described as distinctly as the others, and couldn’t you add one or two characteristics like the use of that phrase “old sport”—not verbal, but physical ones, perhaps…

The other point is also about Gatsby: his career must remain mysterious, of course…Now almost all readers numerically are going to feel puzzled by his having all this wealth and are going to feel entitled to an explanation. To give a distinct and definite one would be, of course, utterly absurd. It did occur to me, though, that you might here and there interpolate some phrases, and possibly incidents, little touches of various kinds, that would suggest that he was in some active way mysteriously engaged.

The novel, of course, ultimately appeared under the title The Great Gatsby, and before it was published, F. Scott Fitzgerald took many of the notes from Perkins to heart, adding more descriptive material on Gatsby himself—along with several repetitions of the phrase “old sport”—and the sources of his mysterious fortune. Like Tay Hohoff, whose work on To Kill a Mockingbird has recently come back into the spotlight, Perkins was the exemplar of the editor as shaper, providing valued insight and active intervention for many of the major writers of his generation: Fitzgerald, Hemingway, Wolfe. But my favorite part of this story lies in Fitzgerald’s response, which I think is one of the most extraordinary glimpses into craft we have from any novelist:

I myself didn’t know what Gatsby looked like or was engaged in and you felt it. If I’d known and kept it from you you’d have been too impressed with my knowledge to protest. This is a complicated idea but I’m sure you’ll understand. But I know now—and as a penalty for not having known first, in other words to make sure, I’m going to tell more.

Which is only to say that there’s a big difference between what an author deliberately withholds and what he doesn’t know himself. And an intelligent reader, like Perkins, will sense it.

And it has important implications for the way we create our characters. I’ve never been a fan of the school that advocates working out every detail of a character’s background, from her hobbies to her childhood pets: the questionnaires and worksheets that spring up around this impulse always seem like an excuse for procrastination. My own sense of character is closer to what D’Arcy Wentworth Thompson describes in On Growth and Form, in which an animal’s physical shape is determined largely by the outside pressures to which it is subjected. Plot emerges from character, yes, but there’s also a sense in which character emerges from plot: these men and women are distinguished primarily by the fact that they’re the only people in the world to whom these particular events could happen. When I combine this with my natural distrust of backstory (even if I’ve retreated from it a bit), I’ll often find that there are important things about my characters I don’t know myself, even after I’ve lived with them for years. There can even be something a little false about keeping the past constantly present in a character’s mind, as we see in so much “realistic” fiction: even if we’re all the sum of our childhood experiences, in practice, we reveal more about ourselves in how we react to the pattern of forces in our lives at the moment, and our actions have a logic that can be worked out independently, as long as the situation is honestly developed.

But that doesn’t apply to issues, like the sources of Gatsby’s fortune, in which the reader’s curiosity might be reasonably aroused. If you’re going to hint at something, you’d better have a good idea of the answer, even if you don’t plan on sharing it. This applies especially to stories that generate a deliberate ambiguity, as Chris Nolan says of the ending of Inception:

Interviewer: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”

Nolan: Oh no, I’ve got an answer.

Interviewer: You do?!

Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated.

Ambiguity, as I’ve said elsewhere, is best created out of a network of specifics with one crucial piece removed. That specificity requires a great deal of knowledge on the author’s part, perhaps more here than anywhere else. And as Fitzgerald notes, if you do it properly, they’ll be too impressed by your knowledge to protest—or they’ll protest in all the right ways.

My ten great movies #10: Inception

Note: Four years ago, I published a series of posts here about my ten favorite movies. Since then, the list has evolved, as all such rankings do, with a few new titles and a reshuffling of the survivors, so it seems like as good a time as any to revisit it now.

Five years after its release, when we think of Inception, what we’re likely to remember first—aside from its considerable merits as entertainment—is its apparent complexity. With five or more levels of reality and a set of rules being explained to us, as well as to the characters, in parallel with breathless action, it’s no wonder that its one big laugh comes at Ariadne’s bewildered question: “Whose subconscious are we going into?” It’s a line that gives us permission to be lost. Yet it’s all far less confusing than it might have been, thanks largely to the work of editor Lee Smith, whose lack of an Oscar nomination, in retrospect, seems like an even greater scandal than Nolan’s snub as Best Director. This is one of the most comprehensively organized movies ever made. Yet a lot of credit is also due to Nolan’s script, and in particular to the shrewd choices it makes about where to walk back its own complications. As I’ve noted before, once the premise has been established, the action unfolds more or less as we’ve been told it will: there isn’t the third-act twist or betrayal that similar heist movies, or even Memento, have taught us to expect. Another nudge would cause it all to collapse.

It’s also in part for the sake of reducing clutter that the dream worlds themselves tend to be starkly realistic, while remaining beautiful and striking. A director like Terry Gilliam might have turned each level into a riot of production design, and although the movie’s relative lack of surrealism has been taken as a flaw, it’s really more of a strategy for keeping the clean lines of the story distinct. The same applies to the characters, who, with the exception of Cobb, are defined mostly by their roles in the action. Yet they’re curiously compelling, perhaps because we respond so instinctively to stories of heists and elaborate teamwork. I admire Interstellar, but I can’t say I need to spend another three hours in the company of its characters, while Inception leaves me wanting more. This is also because its premise is so rich: it hints at countless possible stories, but turns itself into a closed circle that denies any prospect of a sequel. (It’s worth noting, too, how ingenious the device of the totem really is, with the massive superstructure of one of the largest movies ever made coming to rest on the axis of a single trembling top.) And it’s that unresolved tension, between a universe of possibilities and a remorseless cut to black, that gives us the material for so many dreams.

Tomorrow: The greatest story in movies. 

The poster problem

Three years ago, while reviewing The Avengers soon after its opening weekend, I made the following remarks, which seem to have held up fairly well:

This is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many…so that a lot of the film, probably too much, is spent slotting all the components into place.

If the early reactions to Age of Ultron are any indication, I could copy and paste this text and make it the centerpiece of a review of any Avengers movie, past or future. This isn’t to say that the latest installment—which I haven’t seen—might not be fine in its way. But even the franchise’s fans, of whom I’m not really one, seem to admit that much of it consists of Whedon dealing with all those moving parts, and the extent of your enjoyment depends largely on how well you feel he pulls it off.

Whedon himself has indicated that he has less control over the process than he’d like. In a recent interview with Mental Floss, he says:

But it’s difficult because you’re living in franchise world—not just Marvel, but in most big films—where you can’t kill anyone, or anybody significant. And now I find myself with a huge crew of people and, although I’m not as bloodthirsty as some people like to pretend, I think it’s disingenuous to say we’re going to fight this great battle, but there’s not going to be any loss. So my feeling in these situations with Marvel is that if somebody has to be placed on the altar and sacrificed, I’ll let you guys decide if they stay there.

Which, when you think about it, is a startling statement to hear from one of Hollywood’s most powerful directors. But it accurately describes the situation. Any Avengers movie will always feel less like a story in itself than like a kind of anomalous weather pattern formed at the meeting point of several huge fronts: the plot, such as it is, emerges in the transition zone, and it’s dwarfed by the masses of air behind it. Marvel has made a specialty of exceeding audience expectations just ever so slightly, and given the gigantic marketing pressures involved, it’s a marvel that it works as well as it does.

It’s fair to ask, in fact, whether any movie with that poster—with no fewer than eight names above the title, most belonging to current or potential franchise bearers—could ever be more than an exercise in crowd control. There is, however, a telling counterexample, and it looks, as I’ve said elsewhere, increasingly impressive with time: Christopher Nolan’s Inception. As the years pass, Inception remains a model movie in many respects, but particularly when it comes to the problem of managing narrative complexity. Nolan picks his battles in fascinating ways: he’s telling a nested story with five or more levels of reality, and like Thomas Pynchon, he selectively simplifies the material wherever he can. There’s the fact, for instance, that once the logic of the plot has been explained, it unfolds more or less as we expect, without the twist or third-act betrayal that we’ve been trained to anticipate in most heist movies. The characters, with the exception of Cobb, are defined largely by their surfaces, with a specified role and a few identifying traits. Yet they don’t come off as thin or underdeveloped, and although the poster for Inception is even more packed than that for Age of Ultron, with nine names above the title, we don’t feel that the movie is scrambling to find room for everyone.

And a glance at the cast lists of these movies goes a long way toward explaining why. The Avengers has about fifty speaking parts; Age of Ultron has sixty; and Inception, incredibly, has only fifteen or so. Inception is, in fact, a remarkably underpopulated movie: aside from its leading actors, only a handful of other faces ever appear. Yet we don’t particularly notice this while watching. In all likelihood, there’s a threshold number of characters necessary for a movie to seem fully peopled—and to provide for enough interesting pairings—and any further increase doesn’t change our perception of the whole. If that’s the case, then it’s another shrewd simplification by Nolan, who gives us exactly the number of characters we need and no more. The Avengers movies operate on a different scale, of course: a movie full of superheroes needs some ordinary people for contrast, and there’s a greater need for extras when the stage is as big as the universe. (On paper, anyway. In practice, the stakes in a movie like this are always going to remain something of an abstraction, since we have eight more installments waiting in the wings.) But if Whedon had been more ruthless at paring down his cast at the margins, we might have ended up with a series of films that seemed, paradoxically, larger: each hero could have expanded to fill the space he or she deserved, rather than occupying one corner of a masterpiece of Photoshop.

Left brain, right brain, samurai brain

The idea that the brain can be neatly divided into its left and right hemispheres, one rational, the other intuitive, has been largely debunked, but that doesn’t make it any less useful as a metaphor. You could play an instructive game, for instance, by placing movie directors on a spectrum defined by, say, Kubrick and Altman as the quintessence of left-brained filmmaking and its right-brained opposite, and although such distinctions may be artificial, they can generate their own kind of insight. Christopher Nolan, for one, strikes me as a fundamentally left-brained director who makes a point of consciously willing himself into emotion. (Citing some of the cornier elements of Interstellar, the writer Ta-Nehisi Coates theorizes that they were imposed by the studio, but I think it’s more likely that they reflect Nolan’s own efforts, not always successful, to nudge the story into recognizably human places. He pulled it off beautifully in Inception, but it took him ten years to figure out how.) And just as Isaiah Berlin saw Tolstoy as a fox who wanted to be a hedgehog, many of the recent films of Wong Kar-Wai feel like the work of a right-brained director trying to convince himself that the left hemisphere is where he belongs.

Of all my favorite directors, the one who most consistently hits the perfect balance between the two is Akira Kurosawa. I got to thinking about this while reading the editor and teacher Richard D. Pepperman’s appealing new book Everything I Know About Filmmaking I Learned Watching Seven Samurai, which often reads like the ultimate tribute to Kurosawa’s left brain. It’s essentially a shot-for-shot commentary, cued up to the definitive Criterion Collection release, that takes us in real time through the countless meaningful decisions made by Kurosawa in the editing room: cuts, dissolves, wipes, the interaction between foreground and background, the use of music and sound, and the management of real and filmic space, all in service of story. It’s hard to imagine a better movie for a study like this, and with its generous selection of stills, the book is a delight to browse through—it reminds me a little of Richard J. Anobile’s old photonovels, which in the days before home video provided the most convenient way of revisiting Casablanca or The Wrath of Khan. I’ve spoken before of the film editor as a kind of Apollonian figure, balancing out the Dionysian personality of the director on the set, and this rarely feels so clear as it does here, even, or especially, when the two halves are united in a single man.

As for Kurosawa’s right brain, the most eloquent description I’ve found appears in Donald Richie’s The Films of Akira Kurosawa, which is still the best book of its kind ever written. In his own discussion of Seven Samurai, Richie speaks of “the irrational rightness of an apparently gratuitous image in its proper place,” and continues:

Part of the beauty of such scenes…is just that they are “thrown away” as it were, that they have no place, that they do not ostensibly contribute, that they even constitute what has been called bad filmmaking. It is not the beauty of these unexpected images, however, that captivates…but their mystery. They must remain unexplained. It has been said that after a film is over all that remains are a few scattered images, and if they remain then the film was memorable…Further, if one remembers carefully one finds that it is only the uneconomical, mysterious images which remain…

Kurosawa’s films are so rigorous and, at the same time, so closely reasoned, that little scenes such as this appeal with the direct simplicity of water in the desert…[and] in no other single film are there as many as in Seven Samurai.

What one remembers best from this superbly economical film then are those scenes which seem most uneconomical—that is, those which apparently add nothing to it.

Richie goes on to list several examples: the old crone tottering forward to avenge the death of her son, the burning water wheel, and, most beautifully, the long fade to black before the final sequence of the villagers in the rice fields. My own favorite moment, though, occurs in the early scene when Kambei, the master samurai, rescues a little boy from a thief. In one of the greatest character introductions in movie history, Kambei shaves his head to disguise himself as a priest, asking only for two rice balls, which he’ll use to lure the thief out of the barn where the boy has been taken hostage. This information is conveyed in a short conversation between the farmers and the townspeople, who exit the frame—and after the briefest of pauses, a woman emerges from the house in the background, running directly toward the camera with the rice balls in hand, looking back for a frantic second at the barn. It’s the boy’s mother. There’s no particular reason to stage the scene like this; another director might have done it in two separate shots, if it had occurred to him to include it at all. Yet the way in which Kurosawa films it, with the crowd giving way to the mother’s isolated figure, is both formally elegant and strangely moving. It offers up a miniature world of story and emotion without a single cut, and like Kurosawa himself, it resists any attempt, including this one, to break it down into parts.

The Ian Malcolm rule

A man is rich in proportion to the number of things he can afford to leave alone.

—Henry David Thoreau, Walden

Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:

I really want to clear my life to make it so I have to make as few decisions as possible about anything except how to best serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.

There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to always wear the same clothes than it is for a woman in the same situation. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.

Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:

Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.

If you were to draw a family tree connecting all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:

In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.

As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.

Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:

It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.

Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.
