Tone, as I’ve mentioned before, can be a tricky thing. On the subject of plot, David Mamet writes: “Turn the thing around in the last two minutes, and you can live quite nicely. Turn it around in the last ten seconds and you can buy a house in Bel Air.” And if you can radically shift tones within a single story and still keep the audience on board, you can end up with even more. If you look at the short list of the most exciting directors around—Paul Thomas Anderson, David O. Russell, Quentin Tarantino, David Fincher, the Coen Brothers—you find that what most of them have in common is the ability to alter tones drastically from scene to scene, with comedy giving way unexpectedly to violence or pathos. (A big exception here is Christopher Nolan, who seems happiest when operating within a fundamentally serious tonal range. It’s a limitation, but one we’re willing to accept because Nolan is so good at so many other things. Take away those gifts, and you end up with Transcendence.) Tonal variation may be the last thing a director masters, and it often only happens after a few films that keep a consistent tone most of the way through, however idiosyncratic it may be. The Coens started with Blood Simple, then Raising Arizona, and once they made Miller’s Crossing, they never had to look back.
The trouble with tone is that it imposes tremendous switching costs on the audience. As Tony Gilroy points out, during the first ten minutes of a movie, a viewer is making a lot of decisions about how seriously to take the material. Each time the level of seriousness changes gears, whether upward or downward, it demands a corresponding moment of consolidation, which can be exhausting. For a story that runs two hours or so, more than a few shifts in tone can alienate viewers to no end. You never really know where you stand, or whether you’ll be watching the same movie ten minutes from now, so your reaction is often how Roger Ebert felt upon watching Pulp Fiction for the first time: “Seeing this movie last May at the Cannes Film Festival, I knew it was either one of the year’s best films, or one of the worst.” (The outcome is also extremely subjective. I happen to think that Vanilla Sky is one of the most criminally underrated movies of the last two decades—few other mainstream films have accommodated so many tones and moods—but I’m not surprised that so many people hate it.) It also annoys marketing departments, who can’t easily explain what the movie is about; it’s no accident that one of the worst trailers I can recall was for In Bruges, which plays with tone as dexterously as any movie in recent memory.
As a result, tone is another element in which television has considerable advantages. Instead of two hours, a show ideally has at least one season, maybe more, to play around with tone, and the number of potential switching points is accordingly increased. A television series is already more loosely organized than a movie, which allows it to digress and go off on promising tangents, and we’re used to being asked to stop and start from week to week, so we’re more forgiving of departures. That said, this rarely happens all at once; like a director’s filmography, a show often needs a season or two to establish its strengths before it can go exploring. When we think back to a show’s pivotal episodes—the ones in which the future of the series seemed to lock into place—they’re often installments that discovered a new tone that worked within the rules that the show had laid down. Community was never the same after “Modern Warfare,” followed by “Abed’s Uncontrollable Christmas,” demonstrated how much it could push its own reality while still remaining true to its characters, and The X-Files was altered forever by Darin Morgan’s “Humbug,” which taught the show how far it could kid itself while probing into ever darker places.
At its best, this isn’t just a matter of having a “funny” episode of a dramatic series, or a very special episode of a sitcom, but of building a body of narrative that can accommodate surprise. One of the pleasures of following Hannibal this season has been watching the show acknowledge its own absurdity while drawing the noose ever tighter, which only happens after a show has enough history for it to engage in a dialogue with itself. Much the same happened to Breaking Bad, which had the broadest tonal range imaginable: it was able to move between borderline slapstick and the blackest of narrative developments because it could look back and reassure itself that it had already done a good job with both. (Occasionally, a show will emerge with that kind of tone in mind from the beginning. I haven’t had a chance to catch Fargo on FX, but I’m curious about it, because it draws its inspiration from one of the most virtuoso experiments with tone in movie history.) If it works, the result starts to feel like life itself, which can’t be confined easily within any one genre. Maybe that’s because learning to master tone is like putting together the pieces of one’s own life: first you try one thing, then something else, and if you’re lucky, you’ll find that they work well side by side.
Last night, I watched The Lone Ranger. Given the fact that I haven’t yet seen 12 Years a Slave, Captain Phillips, or Before Midnight, this might seem like an odd choice. In my defense, I can only plead that on those rare evenings when my wife is out of the house, I usually seize the opportunity to watch something that I don’t think she’ll enjoy—the last time around it was Battle Royale. I’ve also been intrigued by The Lone Ranger ever since it flamed out in spectacular fashion last summer. Regular readers will know that I have a weakness for flops, and everything I’d read made me think that this was the kind of fascinating studio mess that I find impossible to resist. Quentin Tarantino’s guarded endorsement counted for a lot as well, and we’re already seeing the first rumblings of a revisionist take that sees the film as a neglected treasure. I wouldn’t go quite so far; it has significant problems, and I’m not surprised that the initial reaction was so underwhelming. But I liked it a lot all the same. It’s an engaging, sometimes funny, occasionally exciting movie with more invention and ambition than your average franchise installment, and I’d sooner watch its climactic train chase again than, say, most of The Avengers.
And what interests me the most is its most problematic element, which is the range of tones it encompasses. The Lone Ranger isn’t content just to be a Western; on some level, it wants to be all Westerns, quoting freely from Dead Man and Once Upon a Time in the West while also indulging in slapstick, adventure, gruesome violence, hints of the supernatural, and even moments of tragedy. It’s a revenge narrative by way of Blazing Saddles, and it’s no surprise that the result is all over the map. Part of this may be due to the sheer scale of the production—when someone gives you $200 million to make a Western, you may as well throw everything you can into the pot—but it’s also a reflection of the sensibilities involved. Director Gore Verbinski and screenwriters Ted Elliott and Terry Rossio had collaborated earlier, of course, on the Pirates of the Caribbean franchise, which gained a lot of mileage from a similar stylistic mishmash, though with drastically diminishing returns. And Verbinski at his best has the talent to pull it off: he combines the eye of Michael Bay with a real knack for comedy, and I predicted years ago that he’d win an Oscar one day. (He eventually did, for Rango.)
But playing with tone is a dangerous thing, as we see in the later Pirates films, and The Lone Ranger only gets maybe eighty percent of the way to pulling it off. Watching it, I was reminded of what the screenwriter Tony Gilroy says in his contribution to William Goldman’s Which Lie Did I Tell? Gilroy starts by listing examples of movies that experiment with tone, both good (Dr. Strangelove, The Princess Bride) and bad (Batman and Robin, Year of the Comet) and concludes:
But tone? Tone scares me…Why? Because when it goes wrong it just sucks out loud. I think the audience—the reader—I think they make some critical decisions in the opening movements of a film. How deeply do I invest myself here? How much fun can I have? Should I be consciously referencing the rest of my life during the next two hours, or is this an experience I need to surrender to? Are you asking for my heart or my head or both? Am I rooting for the hero or the movie? Just how many pounds of disbelief are you gonna ask me to suspend before this is through?
The Lone Ranger tramples on all these questions, asking us to contemplate the slaughter of Comanches a few minutes before burying our heroes up to their necks in a nest of scorpions, and the fact that it holds together even as well as it does is a testament both to the skill of the filmmakers and the power of a strong visual style. If nothing else, it looks fantastic, which helps us over some of the rough spots, although not all of them.
And it’s perhaps no accident that William Goldman’s first great discovery of a new tone came in Butch Cassidy and the Sundance Kid. It’s possible that there’s something about the Western that encourages this kind of experimentation: all it needs is a few men and horses, and the genre has been so commercially weakened in recent years that filmmakers have the freedom to try whatever they think might work. It’s true that The Lone Ranger works best in its last forty minutes, when The William Tell Overture blasts over the soundtrack and it seems content to return to its roots as a cliffhanging serial, but when you compare even its most misguided digressions to the relentless sameness of tone in a Liam Neeson thriller or a Bourne knockoff, it feels weirdly like a step forward. (Even Christopher Nolan, a director I admire immensely, has trouble operating outside of a narrow, fundamentally serious tonal range—it’s his one great shortcoming as a storyteller.) Going to the movies every summer would be more fun in general if more megabudgeted blockbusters looked and felt like The Lone Ranger, and its failure means that we’re more likely to see the opposite.
I’ve said more than once that whenever I start a new writing project, I’m trying to end up with a story that I want to read. That’s the kind of thing writers tend to say when asked why they’re drawn to certain types of material, and in a limited sense, it’s true. When I look back at my body of work, it’s clear that it reflects my own particular tastes in fiction: I like tight, detailed narratives with an emphasis on plot and unusual ideas, and for the most part, that’s the kind of story I’ve written. To the extent that the outcome ever surprises me, it’s usually because of the subject matter—I’ll often decide to write a story about a world I know little about, trusting in research and brainstorming to take me into unexpected places. That element of the unknown goes a long way toward keeping the process interesting, and one of the trickiest parts of being a writer is balancing the desire to express one’s own personality with the need to discover something new. The result, if I’ve done it right, is a story that contains a touch of the unanticipated while also looking more or less like the unwritten work I had in mind. Or, as the artist Carl Andre puts it: “A creative person is a person who simply has a desire…to add something to the world that’s not there yet, and goes about arranging for that to happen.”
But there’s an inherent shortcoming to this approach, which lies in the fact that the works of art that matter the most to us often show us something we never knew we needed. When I think about the movies I love, for instance, they tend to be films that blindsided me completely, either with their stories themselves or with the way in which they were told. Knowing what I know about myself, it doesn’t come as a surprise that I’d enjoy the hell out of a movie like Gravity or Inception, but I never would have expected that my favorite movie of all time would turn out to be The Red Shoes, or that I’d passionately love recent films as different as Once, In Bruges, and Certified Copy. These are movies that snuck into my heart, rather than selling me in advance on their intentions, and I feel all the more grateful because they modestly expanded my sense of the possible. As much as I admire a director like Christopher Nolan, there’s no question that he’s primarily adept at delivering exactly the kind of movie that I think I want: a big, expensive, formally ambitious entertainment with just enough complexity to set it apart from the work of other skilled popular filmmakers. And while Nolan’s career has been extraordinary, it’s of a different order entirely from that of, say, Wong Kar-Wai, who at his best made small, messy, gorgeous movies that I never could have imagined on my own.
The same is true of fiction. Looking back over the list of my own favorite novels, surprisingly few resemble the stories I’ve tried to write myself. I love these books because they come from places that I haven’t explored firsthand, whether it’s the sustained performance of a massive novel of ideas like The Magic Mountain or a bejeweled toy like Dictionary of the Khazars. When it comes to novels that stick more closely to the categories that I understand from the inside, like The Day of the Jackal or The Silence of the Lambs, my appreciation is a little different: it’s a respect for craft, for the flawless execution of a genre I know well, and although nothing can diminish my admiration for these books, it’s altogether different from the feeling I get from a novel that comes to us as a fantastic mythical beast, or as a dispatch from some heretofore unexplored country. And it doesn’t need to be deliberately difficult or obscure. Books from Catch-22 to The Time Traveler’s Wife have left me with the sense that I’ve finished reading something that nobody else, least of all me, could have pulled off. (It’s also no accident that it took me a long time to get around to many of the books I’ve mentioned above. More than even the most difficult movies, few of which demand more than two or three hours of our attention, a novel that doesn’t resemble the ones we’ve read before demands a considerable leap of faith.)
That said, I don’t know if it’s possible for writers to feel that way about their own work, especially not for something the size of a novel, in which any flashes of outside inspiration need to share space with months or years of continuous effort. (A short story or poem, which can be conceived and written in a more compressed window of time, is more likely to retain some of that initial strangeness.) But it does imply that writing only the kinds of stories we already like goes only part of the way toward fulfilling our deepest artistic needs. A reader who spends his or her life reading only one kind of book—romance, fantasy, science fiction—ends up with a limited imaginative palette, and a big part of our literary education comes from striking out into books that might seem unfamiliar or uninviting. For writers, this means following a story wherever it takes us, giving up some measure of control, and even deliberately pushing forward into areas of writing that we don’t fully understand, trusting that we’ll find something new and worthwhile along the way. Like all ventures into the unknown, it carries a degree of risk, and we may find that we’ve invested time and energy that we can’t recover into a story that was never meant to be. But it’s far more dangerous to never take that risk in the first place.
For Christmas, I got my wife a copy of The Wes Anderson Collection by Matt Zoller Seitz, which is one of those ideal presents that the giver buys for the recipient because he secretly wants it for himself—I’ve spent at least as much time browsing through it as she has. It’s a beautiful book of interviews with a fascinating subject, and I suspect that it will provide a lot of material for this blog. Today, though, I’d like to focus on one short exchange, which occurs during a discussion of Anderson’s use of extended tracking shots. Seitz points to the drinking contest in Raiders of the Lost Ark as an example of a great director subtly shooting a long scene in a single take without cuts, and shrewdly notes that our knowledge that the action is unfolding in real time subliminally increases the suspense. Anderson agrees: “You’re not only waiting to see who’s going to get knocked out with the liquor; you’re waiting to see who’s going to screw up the take.” Elsewhere, Seitz has written of how the way the scene was shot adds “a second, subtle layer of tension to an already snappy scene…our subliminal awareness that we’re seeing a filmed live performance, and our sporting interest in seeing how long they can keep it going.”
This is a beautiful notion, because it exemplifies a quality that many of my favorite films share: the fictional story that the movie is telling shades imperceptibly into the factual story of how the movie itself was made, which unfolds in parallel to the main action, both invisibly and right in front of our eyes. It’s something like Truffaut’s statement that a movie should simultaneously express “an idea of life and an idea of cinema,” but it’s less about any specific philosophical idea than a sense that the narrative that the movie presents to us is a metaphor for its own creation. We see this in a movie like Citizen Kane, in which it’s hard not to read the youthful excitement of Kane’s early days at the Inquirer as a portrait of Orson Welles arriving on the RKO lot, and its later, disillusioned passages as a weird prefiguring of what would happen to Welles decades down the line; or even a movie like Inception, in which the roles of the participants in the mind heist correspond to those of the team behind the camera—the director, the producer, the production designer—and the star looks a little like Chris Nolan himself. (Someone, possibly me, should really make a slideshow on how directors tend to cast leading roles with their own doubles, as Anderson often does as well.)
And the ultimate expression of the marriage between the filmed story and the story of its creation is the extended shot. It’s a moment in which the movie we’re watching fuses uncannily with its own behind-the-scenes documentary: for a minute or two, we’re on the set, watching the action at the director’s side, and the result is charged with the excitement of live performance. If every cut, as Godard says, is a lie, a continuous take brings us as close to the truth—or at least to a clever simulacrum of it—as the movies can manage. It doesn’t need to be overtly flashy, either: I’ve never seen a better use of an extended take than in the party scene in 4 Months, 3 Weeks, and 2 Days, in which the camera remains stationary for an entire reel. But there’s also a childlike pleasure in seeing filmmakers taking a big risk and getting away with it. You see this in the massively choreographed long takes, involving dozens or hundreds of players, in movies as different as Absolute Beginners, Boogie Nights, and Hard Boiled. And if the hallway fight in Inception ranks among the most thrilling sequences of the decade, it’s because we’re witnessing something astonishing as it must have appeared that day on the set, with Joseph Gordon-Levitt getting battered by the walls of that rotating corridor.
So it’s worth taking a moment to remember that it’s not the long take itself that matters, but the fact that it puts us in the filmmaker’s shoes, which we lose when an extended take is the result of digital trickery. I’m as big a fan as any of the opening shot of Gravity, which recently made my updated list of the greatest movie openings of all time, but there’s no escaping the fact that we’re seeing something that has been invisibly stitched together over many different days of filming, and nearly everything in sight has been constructed through visual effects. This doesn’t make it any less miraculous: along with Life of Pi, it marks a turning point, at least for me, in which digital effects finally live up to their promise of giving us something that can’t be distinguished from reality. But it’s a triumph of vision, planning, and conceptual audacity, without the extra frisson that arises from the sustained tightrope act of an extended shot done in the camera. As time goes by, it will become easier to create this sort of effect from multiple takes, as Cuarón himself did so brilliantly in Children of Men. But it can’t compare to the conspiratorial tension we get from a true tracking shot, done with the full possibility of a disastrous mistake, in which the movies, so often crafted from tricks and illusions, really do seem to defy gravity.
I’m starting to come to terms with an uncomfortable realization: I don’t much like The Avengers. Watching it again recently on Netflix, I was impressed by how fluidly it constructs an engaging movie out of so many prefabricated parts, but I couldn’t help noticing how arbitrary much of it seems. Much of the second act, in particular, feels like it’s killing time, and nothing seems all that essential: it clips along nicely, but the action scenes follow on one another without building, and the stakes never feel especially high, even as the fate of the world hangs in the balance. And I don’t think this is Joss Whedon’s fault. He comes up with an entertaining package, but he’s caught between the need to play with all the toys he’s been given and the obligation to deliver them intact to their next three movies. Each hero has his or her own franchise where the real story development takes place, so The Avengers begins to play like a sideshow, rather than the main event it could have been. This is a story about these characters, not the story, and for all its color and energy, it’s a movie devoted to preserving the status quo. (Even its most memorable moment seems to have been retconned out of existence by the upcoming Agents of S.H.I.E.L.D.)
And while it may seem pointless to worry about this now, I think it’s worth asking what kind of comic book movies we really want, now that it seems that they’re going to dominate every summer for the foreseeable future. I’ve been pondering this even more since finally seeing Man of Steel, which I liked a lot. It has huge problems, above all the fact that its vision of Superman never quite comes into focus: by isolating him from his supporting cast for much of the movie, it blurs his identity to the point where major turning points, like his decision to embrace his role as a hero, flit by almost unnoticed. Yet once it ditches its awkward flashback structure, the movie starts to work, and its last hour has a real sense of awe, scale, and danger. And I’m looking forward to the inevitable sequel, even if it remains unclear if Henry Cavill—much less Zack Snyder or Christopher Nolan—can give the scenes set at the Daily Planet the necessary zest. At their best, the Superman films evoke a line of classic newspaper comedies that extends back to His Girl Friday and even Citizen Kane, and it’s in his ability to both wear the suit and occupy the skin of Clark Kent that Christopher Reeve is most sorely missed.
If nothing else, Man of Steel at least has a point of view about its material, however clouded it might be, which is exactly what most of the Marvel Universe movies are lacking. At this point, when dazzling special effects can be taken for granted, what we need more than anything is a perspective toward these heroes that doesn’t feel as if it were dictated solely by a marketing department. Marvel itself doesn’t have much of an incentive to change its way of doing business: it’s earned a ton of money with this approach, and these movies have made a lot of people happy. But I’d still rather watch Chris Nolan’s Batman films, or even an insanity like Watchmen or Ang Lee’s Hulk, than yet another impersonal raid on the Marvel toy chest. Whedon himself is more than capable of imposing an idiosyncratic take on his projects, and even though it only intermittently comes through in The Avengers itself, I’m hopeful that its success will allow him to express himself more clearly in the future—which is one reason why I’m looking forward to Agents of S.H.I.E.L.D., which seems more geared toward his strengths.
And although I love Nolan’s take on the material, it doesn’t need to be dark, or even particularly ambitious. For an illustration, we need look no further than Captain America, which increasingly seems to me like the best of the Marvel movies. Joe Johnston’s Spielberg imitation is the most credible we’ve seen in a long time—even better, in many ways, than Spielberg himself has managed recently with similar material—and you can sense his joy at being given a chance to make his own Raiders knockoff. Watching it again last night, even on the small screen, I was utterly charmed by almost every frame. It’s a goof, but charged with huge affection toward its sources, and I suspect that it will hold up better over time than anyone could have anticipated. Unfortunately, it already feels like an anomaly. Much of its appeal is due to the period setting, which we’ve already lost for the sequel, and it looks like we’ve seen the last of Hugo Weaving’s Red Skull, who may well turn out to be the most memorable villain the Marvel movies will ever see. Marvel’s future is unlikely to be anything other than hugely profitable for all concerned, but it’s grown increasingly less interesting.
Over the last few weeks, I’ve become fascinated with Brian Eno’s Oblique Strategies. I’ve always been drawn to the creative possibilities of randomness, and this is a particularly interesting example: in its original form, it’s a deck of cards, designed to be drawn from at random, each of which contains a single short aphorism, paradox, or suggestion intended to help break creative blocks. The tone of the aphorisms ranges from practical to gnomic to cheeky—“Overtly resist change,” “Turn it upside down,” “Is the tuning appropriate?”—but their overall intention is to gently disrupt the approach you’ve been taking toward the problem at hand, which often involves inverting your assumptions. This morning, for instance, when I drew a random card from the excellent online version, the result was: “Use clichés.” At first glance, this seems like strange advice, since most of us try to heed William Safire’s injunction to avoid clichés like the plague. In reality, though, it’s a useful reminder that clichés do have their place, at least for an artist who has the skill and experience to deploy them correctly.
A cliché, by definition, is a unit of language or narrative that is already familiar to the reader, often to the point of losing all meaning. At their worst, clichés shut down thought by substituting a stereotyped formula for actual engagement with the subject. Still, there are times when this kind of conceptual invisibility can be useful. Songwriters, in particular, know that clichés can be an invaluable way of managing complexity within a piece of music, which often incorporates lulls or repetition as a courtesy to the listener. Paul Simon says it best:
So when I begin, I usually improvise a melody and sing words—and often those words are just clichés. If it is an old songwriting cliché, most of the time I throw it away, but sometimes I keep it, because they’re nice to have. They’re familiar. They’re like a breather for the listener. You can stop wondering or thinking for a little while and just float along with the music.
This kind of pause is one of the subtlest of all artistic tools: it provides a moment of consolidation, allowing the listener—or reader—to process the information presented so far. When we hear or read a cliché, we don’t need to pay attention to it, and that license to relax can be crucial in a work of art that is otherwise dense and challenging.
This is simply a particular case of a larger point I’ve made elsewhere, which is that not every page of a story can be pitched at the same level of complexity or intensity. With few exceptions, even the most compressed narratives need to periodically rise and fall, both to give the reader a break and to provide a contrast or baseline for more dramatic moments. As the blogger Mike Meginnis has pointed out, this is one reason that we find flat, cartoonish characters in the fiction of Thomas Pynchon: any attempt to create conventionally plausible personalities when the bounds of complexity are being pushed in every other direction would quickly become unmanageable. And I’ve pointed out before that the plot of a movie like Inception needs to be simpler than it seems at first glance: the characters are mostly defined by type, without any real surprises after they’ve been introduced, and once the premise has been established, the plot unfolds in a fairly straightforward way. Christopher Nolan is particularly shrewd at using the familiar tropes of the story he’s telling—the thriller, the comic book movie, the heist film—to ground us on one level while challenging us on others, which is one reason why I embedded a conventional procedural story at the heart of The Icon Thief.
If there’s one place where clichés don’t work, however, it’s in the creation of character. Given the arguments above, it might seem fine to use stereotypes or stock characters in the supporting cast, which allows the reader to tune them out in favor of the more important players, but in practice, this approach can easily backfire. Simple characters have their place, but it’s best to convey this through clean, uncomplicated motivations: characters who fall too easily into familiar categories often reflect a failure of craft or diligence on the author’s part, and they tend to cloud the story—by substituting a list of stock behaviors for clear objectives—rather than to clarify it. And this applies just as much to attempts to avoid clichés by turning them on their heads. In an excellent list of rules for writing science fiction and fantasy, the author Terry Bisson notes: “Racial and sexual stereotypes are (still) default SF. Avoiding them takes more than reversals.” It isn’t enough, in other words, to make your lead female character really good at archery. Which only hints at the most important point of all: as Niels Bohr said, the opposite of a great truth is another great truth, and the opposite of a cliché is, well, another cliché.
Earlier this month, faced with a break between projects, I began reading Infinite Jest for the first time. If you’re anything like me, this is a book you’ve been regarding with apprehension for a while now—I bought my copy five or six years ago, and it’s followed me through at least three moves without being opened beyond the first page. At the moment, I’m a couple of hundred pages in, and although I’m enjoying it, I’m also glad I waited: Wallace is tremendously original, but he also pushes against his predecessors, particularly Pynchon, in fascinating ways, and I’m better equipped to engage him now than I would have been earlier on. The fact that I’ve published two novels in the meantime also helps. As a writer, I’m endlessly fascinated by the problem of managing complexity—of giving a reader enough intermediate rewards to justify the demands the author makes—and Wallace handles this beautifully. Dave Eggers, in the introduction to the edition I’m reading now, does a nice job of summing it up:
A Wallace reader gets the impression of being in a room with a very talkative and brilliant uncle or cousin who, just when he’s about to push it too far, to try our patience with too much detail, has the good sense to throw in a good lowbrow joke.
And the ability to balance payoff with frustration is a quality shared by many of our greatest novels. It’s relatively easy to write an impenetrable book that tries the reader’s patience, just as it’s easy to create a difficult video game that drives players up the wall, but parceling out small satisfactions to balance out the hard parts takes craft and experience. Mike Meginnis of Uncanny Valley makes a similar point in an excellent blog post about the narrative lessons of video games. While discussing the problem of rules and game mechanics, he writes:
In short, while it might seem that richness suggests excess and maximal inclusion, we actually need to be selective about the elements we include, or the novel will not be rich so much as an incomprehensible blur, a smear of language. Think about the very real limitations of Pynchon as a novelist: many complain about his flat characters and slapstick humor, but without those elements to manage the text and simplify it, his already dangerously complex fiction would become unreadable.
Pynchon, of course, casts a huge shadow over Wallace—sometimes literally, as when two characters in Infinite Jest contemplate their vast silhouettes while standing on a mountain range, as another pair does in Gravity’s Rainbow. And I’m curious to see how Wallace, who seems much more interested than Pynchon in creating plausible human beings, deals with this particular problem.
The problem of managing complexity is one that has come up on this blog several times, notably in my discussion of the work of Christopher Nolan: Inception’s characters, however appealing, are basically flat, and the action is surprisingly straightforward once we’ve accepted the premise. Otherwise, the movie would fall apart from trying to push complexity in more than one direction at once. Even works that we don’t normally consider accessible to a casual reader often incorporate elements of selection or order into their design. The Homeric parallels in Joyce’s Ulysses are sometimes dismissed as an irrelevant trick—Borges, in particular, didn’t find them interesting—but they’re very helpful for a reader trying to cut a path through the novel for the first time. When Joyce dispensed with that device, the result was Finnegans Wake, a novel greatly admired and rarely read. That’s why encyclopedic fictions, from The Divine Comedy to Moby-Dick, tend to be structured around a journey or some other familiar pattern, which gives the reader a compass and map to navigate the authorial wilderness.
On a more modest level, I’ve frequently found myself doing this in my own work. I’ve mentioned before that I wanted one of the three narrative strands in The Icon Thief to be a police procedural, which, with its familiar beats and elements, would serve as a kind of thread to pull the reader past some of the book’s complexities. More generally, this is the real purpose of plot. Kurt Vonnegut, who was right about almost everything, says as much in one of those writing aphorisms that I never tire of quoting:
I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading.
The emphasis is mine. Plot is really a way of easing the reader into that greatest of imaginative leaps, which all stories, whatever their ambitions, have in common: the illusion that these events are really taking place, and that characters who never existed are worthy of our attention and sympathy. Plot, structure, and other incidental pleasures are what keep the reader nourished while the real work of the story is taking place. If we take it for granted, it’s because it’s a trick that most storytellers learned a long time ago. But the closer we look at its apparent simplicity, the sooner we realize that, well, it’s complicated.