Posts Tagged ‘Christopher Nolan’
The idea that the brain can be neatly divided into its left and right hemispheres, one rational, the other intuitive, has been largely debunked, but that doesn’t make it any less useful as a metaphor. You could play an instructive game, for instance, by placing movie directors on a spectrum defined by, say, Kubrick and Altman as the quintessence of left-brained filmmaking and its right-brained opposite, and although such distinctions may be artificial, they can generate their own kind of insight. Christopher Nolan, for one, strikes me as a fundamentally left-brained director who makes a point of consciously willing himself into emotion. (Citing some of the cornier elements of Interstellar, the writer Ta-Nehisi Coates theorizes that they were imposed by the studio, but I think it’s more likely that they reflect Nolan’s own efforts, not always successful, to nudge the story into recognizably human places. He pulled it off beautifully in Inception, but it took him ten years to figure out how.) And just as Isaiah Berlin saw Tolstoy as a fox who wanted to be a hedgehog, many of the recent films of Wong Kar-Wai feel like the work of a right-brained director trying to convince himself that the left hemisphere is where he belongs.
Of all my favorite directors, the one who most consistently hits the perfect balance between the two is Akira Kurosawa. I got to thinking about this while reading the editor and teacher Richard D. Pepperman’s appealing new book Everything I Know About Filmmaking I Learned Watching Seven Samurai, which often reads like the ultimate tribute to Kurosawa’s left brain. It’s essentially a shot-for-shot commentary, cued up to the definitive Criterion Collection release, that takes us in real time through the countless meaningful decisions made by Kurosawa in the editing room: cuts, dissolves, wipes, the interaction between foreground and background, the use of music and sound, and the management of real and filmic space, all in service of story. It’s hard to imagine a better movie for a study like this, and with its generous selection of stills, the book is a delight to browse through—it reminds me a little of Richard J. Anobile’s old photonovels, which in the days before home video provided the most convenient way of revisiting Casablanca or The Wrath of Khan. I’ve spoken before of the film editor as a kind of Apollonian figure, balancing out the Dionysian personality of the director on the set, and this rarely feels so clear as it does here, even, or especially, when the two halves are united in a single man.
As for Kurosawa’s right brain, the most eloquent description I’ve found appears in Donald Richie’s The Films of Akira Kurosawa, which is still the best book of its kind ever written. In his own discussion of Seven Samurai, Richie speaks of “the irrational rightness of an apparently gratuitous image in its proper place,” and continues:
Part of the beauty of such scenes…is just that they are “thrown away” as it were, that they have no place, that they do not ostensibly contribute, that they even constitute what has been called bad filmmaking. It is not the beauty of these unexpected images, however, that captivates…but their mystery. They must remain unexplained. It has been said that after a film is over all that remains are a few scattered images, and if they remain then the film was memorable…Further, if one remembers carefully one finds that it is only the uneconomical, mysterious images which remain…
Kurosawa’s films are so rigorous and, at the same time, so closely reasoned, that little scenes such as this appeal with the direct simplicity of water in the desert…[and] in no other single film are there as many as in Seven Samurai.
What one remembers best from this superbly economical film then are those scenes which seem most uneconomical—that is, those which apparently add nothing to it.
Richie goes on to list several examples: the old crone tottering forward to avenge the death of her son, the burning water wheel, and, most beautifully, the long fade to black before the final sequence of the villagers in the rice fields. My own favorite moment, though, occurs in the early scene when Kambei, the master samurai, rescues a little boy from a thief. In one of the greatest character introductions in movie history, Kambei shaves his head to disguise himself as a priest, asking only for two rice balls, which he’ll use to lure the thief out of the barn where the boy has been taken hostage. This information is conveyed in a short conversation between the farmers and the townspeople, who exit the frame—and after the briefest of pauses, a woman emerges from the house in the background, running directly toward the camera with the rice balls in hand, looking back for a frantic second at the barn. It’s the boy’s mother. There’s no particular reason to stage the scene like this; another director might have done it in two separate shots, if it had occurred to him to include it at all. Yet the way in which Kurosawa films it, with the crowd giving way to the mother’s isolated figure, is both formally elegant and strangely moving. It offers up a miniature world of story and emotion without a single cut, and like Kurosawa himself, it resists any attempt, including this one, to break it down into parts.
A man is rich in proportion to the number of things he can afford to leave alone.
—Henry David Thoreau, Walden
Last week, at the inaugural town hall meeting at Facebook headquarters, one brave questioner managed to cut through the noise and press Mark Zuckerberg on the one issue that really matters: what’s the deal with that gray shirt he always wears? Zuckerberg replied:
I really want to clear my life to make it so I have to make as few decisions as possible about anything except how to best serve this community…I’m in this really lucky position where I get to wake up every day and help serve more than a billion people. And I feel like I’m not doing my job if I spend any of my energy on things that are silly or frivolous about my life…So even though it kind of sounds silly—that that’s my reason for wearing a gray t-shirt every day—it also is true.
There’s a surprising amount to unpack here, starting with the fact, as Allison P. Davis of New York Magazine points out, that it’s considerably easier for a young white male to wear the same clothes every day than it is for a woman in the same position. It’s also worth noting that wearing the exact same shirt each day turns simplicity into a kind of ostentation: there are ways of minimizing the amount of time you spend thinking about your wardrobe without calling attention to it so insistently.
Of course, Zuckerberg is only the latest in a long line of high-achieving nerds who insist, rightly or wrongly, that they have more important things to think about than what they’re going to wear. There’s more than an echo here of the dozens of black Issey Miyake turtlenecks that were stacked in Steve Jobs’s closet, and in the article linked above, Vanessa Friedman of The New York Times also notes that Zuckerberg sounds a little like Obama, who told Michael Lewis in Vanity Fair: “You’ll see I wear only gray or blue suits. I’m trying to pare down decisions. I don’t want to make decisions about what I’m eating or wearing. Because I have too many other decisions to make.” Even Christopher Nolan gets into the act, as we learn in the recent New York Times Magazine profile by Gideon Lewis-Kraus:
Nolan’s own look accords with his strict regimen of optimal resource allocation and flexibility: He long ago decided it was a waste of energy to choose anew what to wear each day, and the clubbable but muted uniform on which he settled splits the difference between the demands of an executive suite and a tundra. The ensemble is smart with a hint of frowzy, a dark, narrow-lapeled jacket over a blue dress shirt with a lightly fraying collar, plus durable black trousers over scuffed, sensible shoes.
If you were to draw a family tree between all these monochromatic Vulcans, you’d find that, consciously or not, they’re all echoing their common patron saint, Ian Malcolm in Jurassic Park, who says:
In any case, I wear only two colors, black and gray…These colors are appropriate for any occasion…and they go well together, should I mistakenly put on a pair of gray socks with my black trousers…I find it liberating. I believe my life has value, and I don’t want to waste it thinking about clothing.
As Malcolm speaks, Crichton writes, “Ellie was staring at him, her mouth open”—apparently stunned into silence, as all women would be, at this display of superhuman rationality. And while it’s easy to make fun of it, I’m basically one of those guys. I eat the same breakfast and lunch every day; my daily uniform of polo shirt, jeans, and New Balance sneakers rarely, if ever, changes; and I’ve had the same haircut for the last eighteen years. If pressed, I’d probably offer a rationale more or less identical to the ones given above. As a writer, I’m called upon to solve a series of agonizingly specific problems each time I sit down at my desk, so the less headspace I devote to everything else, the better.
Which is all well and good. But it’s also easy to confuse the externals with their underlying intention. The world, or at least the Bay Area, is full of young guys with the Zuckerberg look, but it doesn’t matter how little time you spend getting dressed if you aren’t mindfully reallocating the time you save, or extending the principle beyond the closet. The most eloquent defense of minimizing extraneous thinking was mounted by the philosopher Alfred North Whitehead, who writes:
It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.
Whitehead isn’t talking about his shirts here; he’s talking about the Arabic number system, a form of “good notation” that frees the mind to think about more complicated problems. Which only reminds us that the shirts you wear won’t make you more effective if you aren’t being equally thoughtful about the decisions that really count. Otherwise, they’re only an excuse for laziness or indifference, which is just as contagious as efficiency. And it often comes to us as a wolf in nerd’s clothing.
By now, Interstellar has inspired plenty of conversation on subjects ranging from the accuracy of its science to the consistency of its intricate timelines, but I wanted to highlight one aspect of the film that hasn’t received as much attention: its use of physical miniatures. If you’re a visual effects nerd like me, Interstellar represents a welcome return to a style of filmmaking that other directors seem to have all but abandoned, with huge, detailed models—the one for the spacecraft Endurance was a full twenty-five feet across—shot against star fields in the studio, a tradition that stretches back through Star Wars to 2001. And the result speaks for itself. The effects are so good that they practically fade into the background; for long stretches of the film, we’re barely aware of them as effects at all, but as elements in a story that persuasively takes place on the largest imaginable scale. (There’s even a sense in which the film’s scientific rigor and its reliance on modelwork go hand in hand. Dealing with big, unwieldy miniatures and hydraulics can only make a filmmaker more aware of the physics involved.)
Last week, I suggested that Christopher Nolan, the most meticulous creator of blockbusters we have, is drawn to IMAX and the logistical problems it presents as a way of getting out of his own head, or of grounding his elaborate conceits in recognizably vivid environments, and much the same is true of his approach to effects. If Inception had unfolded in a flurry of digital imagery, as it might easily have done in the hands of a lesser filmmaker, the story itself would have been far less interesting. Dreams, as Cobb reminds Ariadne, feel real while you’re inside them, and it’s revealing that the most controlling of directors understands the value of techniques that force him to give up control, while paradoxically allowing for greater realism. As Nolan says:
These are things you could try to calculate into CG if you had to, but the wonderful thing about miniature shooting is that it shows you things you never knew were there or couldn’t plan for. I refer to it as serendipity—this random quality that gives the image a feeling of life.
And the randomness is key. Critics often speak of the uncanny valley when describing how virtual actors are never as convincing as the real thing, and a similar principle seems to be at work with other visual effects. Computers have made enormous advances in depicting anything a filmmaker likes, but there are still crucial details—artifacts of lighting, the behavior of surfaces seen against real backdrops—that digital artistry struggles to replicate, precisely because they’re so unpredictable.
Light, it seems, is a problem as intractable, in its own way, as the subtleties of human expression, and while we may feel less of a visceral reaction when the technology falls short, it still prevents us from immersing ourselves completely in the experience. Even in films like The Return of the King or Avatar, which look undeniably spectacular, we’re often conscious of how expertly the imagery has been constructed, with the uniform, unreal light of a world that exists only on a hard drive at Weta. It holds us at arm’s distance even as it draws us in. That said, technology marches on, and it’s telling that Interstellar arrives in theaters almost exactly one year after Gravity, a movie that takes a diametrically opposite approach to many of the same problems: few practical sets or models were built, and for much of the film, everything in sight, from the spacesuits to the interiors to the panorama of the earth in the background, is a digital creation. The result, to put it mildly, looks fantastic, even in IMAX, and it’s the first movie I’ve seen in a long time in which computer effects are truly indistinguishable from reality.
At first glance, then, it might seem like Interstellar arrives on the scene a few months too late, at a point where digital effects have met and exceeded what might be possible using painstaking practical techniques. Really, though, the two films have a great deal in common. If the effects in Gravity work so well, it’s in large part due to the obsessiveness that went into lighting and wirework during principal photography: Emmanuel Lubezki’s famous light box amounts to a complicated way of addressing the basic—and excruciatingly specific—challenge of keeping the actors’ faces properly lit, a detail destined to pass unnoticed until it goes wrong. Interstellar takes much the same approach, with enormous projections used on the sound stage, rather than green screens, in order to immerse the actors in the effects in real time. In other words, both films end up converging on similar solutions from opposite directions, ultimately meeting in the same place: on the set itself. They understand that visible magic only works when grounded in invisible craft, and if the tools they use are very different, they’re united in a common goal. And the cinematic universe, thankfully, is big enough for them both.
Note: This post does its best to avoid spoilers for Interstellar. I hope to have a more detailed consideration up next week.
Halfway through the first showing of Interstellar at the huge IMAX theater at Chicago’s Navy Pier, the screen abruptly went black. At a pivotal moment, the picture cut out first, followed immediately by the sound, and it took the audience a second to realize that the film had broken. Over the five minutes or so that followed, as we waited for the movie to resume, I had time to reflect on the sheer physicality of the technology involved. As this nifty featurette points out, a full print of Interstellar weighs six hundred pounds, mounted on a six-foot platter, and just getting it to move smoothly through the projector gate presents considerable logistical challenges, as we found out yesterday. (The film itself is so large that there isn’t room on the platter for any previews or extraneous features: it’s the first movie I’ve ever seen that simply started at the scheduled time, without any tedious preliminaries, and its closing credits are startlingly short.) According to Glenn Newland, the senior director of operations at IMAX, the company started making calls eighteen months ago to theater owners who were converting from film to digital, saying, in effect: Please hold on to that projector. You’re going to need it.
And they were right. I’ve noted before that if Christopher Nolan has indelibly associated himself with the IMAX format, that’s no accident. Nolan’s intuition about his large-scale medium seems to inform the narrative choices he makes: he senses, for instance, that plunging across a field of corn can be as visually thrilling as a journey through a wormhole or the skyline of Gotham City. Watching it, I got the impression that Nolan is drawn to IMAX as a kind of corrective to his own naturally hermetic style of storytelling: the big technical problems that the format imposes force him to live out in the world, not simply in his own head. And if the resulting image is nine times larger than that of conventional celluloid, that squares well with his approach to screenwriting, which packs each story with enough ideas for nine ordinary movies. Interstellar sometimes groans under the weight of its own ambitions; it lacks the clean lines provided by the heist plot of Inception or the superhero formula of his Batman films. It wants to be a popcorn movie, a visionary epic, a family story, and a scientifically rigorous adventure that takes a serious approach to relativity and time dilation, and it succeeds about two-thirds of the time.
Given the loftiness of its aims, that’s not too bad. Yet it might have worked even better if it had taken a cue from the director whose influence it struggles so hard to escape. Interstellar is haunted by 2001 in nearly every frame, from small, elegant touches, like the way a single cut is used to cover a vast stretch of time—in this case, the two-year journey from Earth to Saturn—to the largest of plot points. Like Kubrick’s film, it pauses in its evocation of vast cosmic vistas for a self-contained interlude of intimate, messy drama, which in both cases seems designed to remind us that humanity, or what it creates, can’t escape its most primitive impulses for self-preservation. Yet it also suffers a little in the comparison. Kubrick was shrewd enough to understand that a movie showing mankind in its true place in the universe had no room for ordinary human plots, and if his characters seem so drained of personality, it’s only a strategy for eliminating irrelevant distractions. Nolan wants to have it all, so he ends up with a film in which the emotional pieces sit uneasily alongside the spectacle, jostling for space when they should have had all the cosmos at their disposal.
Like most of Nolan’s recent blockbuster films, Interstellar engages in a complicated triangulation between purity of vision and commercial appeal, and the strain sometimes shows. It suffers, though much less glaringly, from the same tendency as Prometheus, in which characters stand around a spacecraft discussing information, like what the hell a wormhole is, that should have probably been covered long before takeoff. And while it may ultimately stand as Nolan’s most personal film—it was delivered to theaters under the fake title Flora’s Letter, named after his daughter—its monologues on the transcendent power of love make a less convincing statement than the visual wonders on display. (All praise and credit, by the way, are due to Matthew McConaughey, who carries an imperfectly conceived character with all the grace and authority he brought to True Detective, which also found him musing over the existence of dimensions beyond our own.) For all its flaws, though, it still stands as a rebuke to more cautious entertainments, a major work from a director who hardly seems capable of anything else. In an age of massless movies, it exerts a gravitational pull all its own, and if it were any larger, the theater wouldn’t be able to hold it.
Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.
The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so you often end up feeling the way Roger Ebert did upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.
As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.
At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the pleasures of following Hannibal this season has been watching the show acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history for it to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. I haven’t had a chance to catch Fargo on FX, but I’m curious about it, because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.
Last night, I watched The Lone Ranger. Given the fact that I haven’t yet seen 12 Years a Slave, Captain Phillips, or Before Midnight, this might seem like an odd choice. In my defense, I can only plead that on those rare evenings when my wife is out of the house, I usually seize the opportunity to watch something that I don’t think she’ll enjoy—the last time around it was Battle Royale. I’ve also been intrigued by The Lone Ranger ever since it flamed out in spectacular fashion last summer. Regular readers will know that I have a weakness for flops, and everything I’d read made me think that this was the kind of fascinating studio mess that I find impossible to resist. Quentin Tarantino’s guarded endorsement counted for a lot as well, and we’re already seeing the first rumblings of a revisionist take that sees the film as a neglected treasure. I wouldn’t go quite so far; it has significant problems, and I’m not surprised that the initial reaction was so underwhelming. But I liked it a lot all the same. It’s an engaging, sometimes funny, occasionally exciting movie with more invention and ambition than your average franchise installment, and I’d sooner watch its climactic train chase again than, say, most of The Avengers.
And what interests me the most is its most problematic element, which is the range of tones it encompasses. The Lone Ranger isn’t content just to be a Western; on some level, it wants to be all Westerns, quoting freely from Dead Man and Once Upon a Time in the West while also indulging in slapstick, adventure, gruesome violence, hints of the supernatural, and even moments of tragedy. It’s a revenge narrative by way of Blazing Saddles, and it’s no surprise that the result is all over the map. Part of this may be due to the sheer scale of the production—when someone gives you $200 million to make a Western, you may as well throw everything you can into the pot—but it’s also a reflection of the sensibilities involved. Director Gore Verbinski and screenwriters Ted Elliott and Terry Rossio had collaborated earlier, of course, on the Pirates of the Caribbean franchise, which gained a lot of mileage from a similar stylistic mishmash, though with drastically diminishing returns. And Verbinski at his best has the talent to pull it off: he combines the eye of Michael Bay with a real knack for comedy, and I predicted years ago that he’d win an Oscar one day. (He eventually did, for Rango.)
But playing with tone is a dangerous thing, as we see in the later Pirates films, and The Lone Ranger only gets maybe eighty percent of the way to pulling it off. Watching it, I was reminded of what the screenwriter Tony Gilroy says in his contribution to William Goldman’s Which Lie Did I Tell? Gilroy starts by listing examples of movies that experiment with tone, both good (Dr. Strangelove, The Princess Bride) and bad (Batman & Robin, Year of the Comet), and concludes:
But tone? Tone scares me…Why? Because when it goes wrong it just sucks out loud. I think the audience—the reader—I think they make some critical decisions in the opening movements of a film. How deeply do I invest myself here? How much fun can I have? Should I be consciously referencing the rest of my life during the next two hours, or is this an experience I need to surrender to? Are you asking for my heart or my head or both? Am I rooting for the hero or the movie? Just how many pounds of disbelief are you gonna ask me to suspend before this is through?
The Lone Ranger tramples on all these questions, asking us to contemplate the slaughter of Comanches a few minutes before burying our heroes up to their necks in a nest of scorpions, and the fact that it holds together even as well as it does is a testament both to the skill of the filmmakers and the power of a strong visual style. If nothing else, it looks fantastic, which helps us over some of the rough spots, although not all of them.
And it’s perhaps no accident that William Goldman’s first great discovery of a new tone came in Butch Cassidy and the Sundance Kid. It’s possible that there’s something about the Western that encourages this kind of experimentation: all it needs is a few men and horses, and the genre has been so commercially weakened in recent years that filmmakers have the freedom to try whatever they think might work. It’s true that The Lone Ranger works best in its last forty minutes, when the William Tell Overture blasts over the soundtrack and it seems content to return to its roots as a cliffhanging serial, but when you compare even its most misguided digressions to the relentless sameness of tone in a Liam Neeson thriller or a Bourne knockoff, it feels weirdly like a step forward. (Even Christopher Nolan, a director I admire immensely, has trouble operating outside of a narrow, fundamentally serious tonal range—it’s his one great shortcoming as a storyteller.) Going to the movies every summer would be more fun in general if more megabudgeted blockbusters looked and felt like The Lone Ranger, and its failure means that we’re more likely to see the opposite.
I’ve said more than once that whenever I start a new writing project, I’m trying to end up with a story that I want to read. That’s the kind of thing writers tend to say when asked why they’re drawn to certain types of material, and in a limited sense, it’s true. When I look back at my body of work, it’s clear that it reflects my own particular tastes in fiction: I like tight, detailed narratives with an emphasis on plot and unusual ideas, and for the most part, that’s the kind of story I’ve written. To the extent that the outcome ever surprises me, it’s usually because of the subject matter—I’ll often decide to write a story about a world I know little about, trusting in research and brainstorming to take me into unexpected places. That element of the unknown goes a long way toward keeping the process interesting, and one of the trickiest parts of being a writer is balancing the desire to express one’s own personality with the need to discover something new. The result, if I’ve done it right, is a story that contains a touch of the unanticipated while also looking more or less like the unwritten work I had in mind. Or, as the artist Carl Andre puts it: “A creative person is a person who simply has a desire…to add something to the world that’s not there yet, and goes about arranging for that to happen.”
But there’s an inherent shortcoming to this approach, which lies in the fact that the works of art that matter the most to us often show us something we never knew we needed. When I think about the movies I love, for instance, they tend to be films that blindsided me completely, either with their stories themselves or with the way in which they were told. Knowing what I know about myself, it doesn’t come as a surprise that I’d enjoy the hell out of a movie like Gravity or Inception, but I never would have expected that my favorite movie of all time would turn out to be The Red Shoes, or that I’d passionately love recent films as different as Once, In Bruges, and Certified Copy. These are movies that snuck into my heart, rather than selling me in advance on their intentions, and I feel all the more grateful because they modestly expanded my sense of the possible. As much as I admire a director like Christopher Nolan, there’s no question that he’s primarily adept at delivering exactly the kind of movie that I think I want: a big, expensive, formally ambitious entertainment with just enough complexity to set it apart from the work of other skilled popular filmmakers. And while Nolan’s career has been extraordinary, it’s of a different order entirely from that of, say, Wong Kar-Wai, who at his best made small, messy, gorgeous movies that I never could have imagined on my own.
The same is true of fiction. Looking back over the list of my own favorite novels, surprisingly few resemble the stories I’ve tried to write myself. I love these books because they come from places that I haven’t explored firsthand, whether it’s the sustained performance of a massive novel of ideas like The Magic Mountain or a bejeweled toy like Dictionary of the Khazars. When it comes to novels that stick more closely to the categories that I understand from the inside, like The Day of the Jackal or The Silence of the Lambs, my appreciation is a little different: it’s a respect for craft, for the flawless execution of a genre I know well, and although nothing can diminish my admiration for these books, it’s altogether different from the feeling I get from a novel that comes to us as a fantastic mythical beast, or as a dispatch from some heretofore unexplored country. And such a novel doesn’t need to be deliberately difficult or obscure. Books from Catch-22 to The Time Traveler’s Wife have left me with the sense that I’ve finished reading something that nobody else, least of all me, could have pulled off. (It’s also no accident that it took me a long time to get around to many of the books I’ve mentioned above. More than even the most difficult movies, few of which demand more than two or three hours of our attention, a novel that doesn’t resemble the ones we’ve read before demands a considerable leap of faith.)
That said, I don’t know if it’s possible for writers to feel that way about their own work, especially not for something the size of a novel, in which any flashes of outside inspiration need to share space with months or years of continuous effort. (A short story or poem, which can be conceived and written in a more compressed window of time, is more likely to retain some of that initial strangeness.) But it does imply that writing only the kinds of stories we already like goes only part of the way toward fulfilling our deepest artistic needs. A reader who spends his or her life reading only one kind of book—romance, fantasy, science fiction—ends up with a limited imaginative palate, and a big part of our literary education comes from striking out into books that might seem unfamiliar or uninviting. For writers, this means following a story wherever it takes us, giving up some measure of control, and even deliberately pushing forward into areas of writing that we don’t fully understand, trusting that we’ll find something new and worthwhile along the way. Like all ventures into the unknown, it carries a degree of risk, and we may find that we’ve invested time and energy that we can’t recover into a story that was never meant to be. But it’s far more dangerous to never take that risk in the first place.