Over the last few weeks, I’ve become fascinated with Brian Eno’s Oblique Strategies. I’ve always been drawn to the creative possibilities of randomness, and this is a particularly interesting example: in its original form, it’s a deck of cards, designed to be drawn from at random, each of which contains a single short aphorism, paradox, or suggestion intended to help break creative blocks. The tone of the aphorisms ranges from practical to gnomic to cheeky—“Overtly resist change,” “Turn it upside down,” “Is the tuning appropriate?”—but their overall intention is to gently disrupt the approach you’ve been taking toward the problem at hand, which often involves inverting your assumptions. This morning, for instance, when I drew a random card from the excellent online version, the result was: “Use clichés.” At first glance, this seems like strange advice, since most of us try to follow William Safire’s rule to avoid clichés like the plague. In reality, though, it’s a useful reminder that clichés do have their place, at least for an artist who has the skill and experience to deploy them correctly.
A cliché, by definition, is a unit of language or narrative that is already familiar to the reader, often to the point of losing all meaning. At their worst, clichés shut down thought by substituting a stereotyped formula for actual engagement with the subject. Still, there are times when this kind of conceptual invisibility can be useful. Songwriters, in particular, know that clichés can be an invaluable way of managing complexity within a piece of music, which often incorporates lulls or repetition as a courtesy to the listener. Paul Simon says it best:
So when I begin, I usually improvise a melody and sing words—and often those words are just clichés. If it is an old songwriting cliché, most of the time I throw it away, but sometimes I keep it, because they’re nice to have. They’re familiar. They’re like a breather for the listener. You can stop wondering or thinking for a little while and just float along with the music.
This kind of pause is one of the subtlest of all artistic tools: it provides a moment of consolidation, allowing the listener—or reader—to process the information presented so far. When we hear or read a cliché, we don’t need to pay attention to it, and that license to relax can be crucial in a work of art that is otherwise dense and challenging.
This is simply a particular case of a larger point I’ve made elsewhere, which is that not every page of a story can be pitched at the same level of complexity or intensity. With few exceptions, even the most compressed narratives need to periodically rise and fall, both to give the reader a break and to provide a contrast or baseline for more dramatic moments. As the blogger Mike Meginnis has pointed out, this is one reason that we find flat, cartoonish characters in the fiction of Thomas Pynchon: any attempt to create conventionally plausible personalities when the bounds of complexity are being pushed in every other direction would quickly become unmanageable. And I’ve pointed out before that the plot of a movie like Inception needs to be simpler than it seems at first glance: the characters are mostly defined by type, without any real surprises after they’ve been introduced, and once the premise has been established, the plot unfolds in a fairly straightforward way. Christopher Nolan is particularly shrewd at using the familiar tropes of the story he’s telling—the thriller, the comic book movie, the heist film—to ground us on one level while challenging us on others, which is one reason why I embedded a conventional procedural story at the heart of The Icon Thief.
If there’s one place where clichés don’t work, however, it’s in the creation of character. Given the arguments above, it might seem fine to use stereotypes or stock characters in the supporting cast, which allows the reader to tune them out in favor of the more important players, but in practice, this approach can easily backfire. Simple characters have their place, but it’s best to convey this through clean, uncomplicated motivations: characters who fall too easily into familiar categories often reflect a failure of craft or diligence on the author’s part, and they tend to cloud the story—by substituting a list of stock behaviors for clear objectives—rather than to clarify it. And this applies just as much to attempts to avoid clichés by turning them on their heads. In an excellent list of rules for writing science fiction and fantasy, the author Terry Bisson notes: “Racial and sexual stereotypes are (still) default SF. Avoiding them takes more than reversals.” It isn’t enough, in other words, to make your lead female character really good at archery. Which only hints at the most important point of all: as Niels Bohr said, the opposite of a great truth is another great truth, and the opposite of a cliché is, well, another cliché.
Earlier this month, faced with a break between projects, I began reading Infinite Jest for the first time. If you’re anything like me, this is a book you’ve been regarding with apprehension for a while now—I bought my copy five or six years ago, and it’s followed me through at least three moves without being opened beyond the first page. At the moment, I’m a couple of hundred pages in, and although I’m enjoying it, I’m also glad I waited: Wallace is tremendously original, but he also pushes against his predecessors, particularly Pynchon, in fascinating ways, and I’m better equipped to engage him now than I would have been earlier on. The fact that I’ve published two novels in the meantime also helps. As a writer, I’m endlessly fascinated by the problem of managing complexity—of giving a reader enough intermediate rewards to justify the demands the author makes—and Wallace handles this beautifully. Dave Eggers, in the introduction to the edition I’m reading now, does a nice job of summing it up:
A Wallace reader gets the impression of being in a room with a very talkative and brilliant uncle or cousin who, just when he’s about to push it too far, to try our patience with too much detail, has the good sense to throw in a good lowbrow joke.
And the ability to balance payoff with frustration is a quality shared by many of our greatest novels. It’s relatively easy to write an impenetrable book that tries the reader’s patience, just as it’s easy to create a difficult video game that drives players up the wall, but parceling out small satisfactions to balance out the hard parts takes craft and experience. Mike Meginnis of Uncanny Valley makes a similar point in an excellent blog post about the narrative lessons of video games. While discussing the problem of rules and game mechanics, he writes:
In short, while it might seem that richness suggests excess and maximal inclusion, we actually need to be selective about the elements we include, or the novel will not be rich so much as an incomprehensible blur, a smear of language. Think about the very real limitations of Pynchon as a novelist: many complain about his flat characters and slapstick humor, but without those elements to manage the text and simplify it, his already dangerously complex fiction would become unreadable.
Pynchon, of course, casts a huge shadow over Wallace—sometimes literally, as when two characters in Infinite Jest contemplate their vast silhouettes while standing on a mountain range, as another pair does in Gravity’s Rainbow. And I’m curious to see how Wallace, who seems much more interested than Pynchon in creating plausible human beings, deals with this particular problem.
The problem of managing complexity is one that has come up on this blog several times, notably in my discussion of the work of Christopher Nolan: Inception’s characters, however appealing, are basically flat, and the action is surprisingly straightforward once we’ve accepted the premise. Otherwise, the movie would fall apart from trying to push complexity in more than one direction at once. Even works that we don’t normally consider accessible to a casual reader often incorporate elements of selection or order into their design. The Homeric parallels in Joyce’s Ulysses are sometimes dismissed as an irrelevant trick—Borges, in particular, didn’t find them interesting—but they’re very helpful for a reader trying to cut a path through the novel for the first time. When Joyce dispensed with that device, the result was Finnegans Wake, a novel greatly admired and rarely read. That’s why encyclopedic fictions, from The Divine Comedy to Moby-Dick, tend to be organized around a journey or other familiar structure, which gives the reader a compass and map to navigate the authorial wilderness.
On a more modest level, I’ve frequently found myself doing this in my own work. I’ve mentioned before that I wanted one of the three narrative strands in The Icon Thief to be a police procedural, which, with its familiar beats and elements, would serve as a kind of thread to pull the reader past some of the book’s complexities. More generally, this is the real purpose of plot. Kurt Vonnegut, who was right about almost everything, says as much in one of those writing aphorisms that I never tire of quoting:
I guarantee you that no modern story scheme, even plotlessness, will give a reader genuine satisfaction, unless one of those old-fashioned plots is smuggled in somewhere. I don’t praise plots as accurate representations of life, but as ways to keep readers reading.
The emphasis is mine. Plot is really a way of easing the reader into that greatest of imaginative leaps, which all stories, whatever their ambitions, have in common: the illusion that these events are really taking place, and that characters who never existed are worthy of our attention and sympathy. Plot, structure, and other incidental pleasures are what keep the reader nourished while the real work of the story is taking place. If we take plot for granted, it’s because it’s a trick that most storytellers learned a long time ago. But the closer we look at its apparent simplicity, the sooner we realize that, well, it’s complicated.
Jordan Goldberg: In closing, what would you guys say you’ve learned through this experience?
Christopher Nolan: I’ve learned to get more reaction shots. [All laugh.] I’ve learned you can never have too many reaction shots to something extraordinary. Just on a technical level. In order to portray an extraordinary figure in an ordinary world, you have to really invest in the reality of the ordinary and in the reactions of people to him. That, to me, was what was fun about taking on this character because it hadn’t been done before. He is such an extraordinary figure, but if you can believe in the world he’s in, you can really enjoy that extraordinariness and that theatricality.
On Saturday, my wife and I went to the Siskel Center in Chicago to see the engaging new documentary Side by Side, which focuses on the recent shift toward digital filmmaking and its implications for movies as a whole. Despite some soporific narration by producer and interviewer Keanu Reeves—who is not a man who should ever be allowed to do voiceover—this is a smart, interesting film that treats us to a dazzling range of perspectives, many of them from artists I’ve discussed repeatedly on this blog: David Lynch, Christopher Nolan, David Fincher, George Lucas, Steven Soderbergh, Lars von Trier, and the indispensable Walter Murch, not to mention Martin Scorsese, James Cameron, Michael Ballhaus, Robert Rodriguez, the Wachowskis, and many more. And while the interviewees come down on various sides of the digital issue—Rodriguez is probably the most unapologetic defender, Nolan the greatest skeptic—there’s one clear message: digital filmmaking is here to stay, and movies will never be the same.
If there’s one thread that runs through the entire movie, it’s the tradeoffs that come when you exchange an expensive, cumbersome, highly challenging medium for something considerably cheaper and easier. At first glance, the benefits are enormous: you can run the camera for as long as you like for next to nothing, allowing you to capture more material, and the relatively small size of digital cameras lets you bring them places and achieve effects that might have been impossible before. Digital photography allows for greater control over technical details like color correction; makes editing far less difficult, at least on a practical level; and offers access to advanced tools to filmmakers with limited budgets. Yet there are tradeoffs as well. Film is still capable of visual glories that digital can’t match, and it’s curious that a movie that features Nolan and his genius cinematographer Wally Pfister lacks a single mention of IMAX. (Despite the multiplicity of voices here, I would have loved to hear from Brad Bird, who became famous working in an exclusively digital medium but still chose IMAX to film much of Mission: Impossible—Ghost Protocol.)
Still, as the movie demonstrates, resolution and image quality for digital video are advancing at an exponential rate, and within the next ten years or so, it’s possible that we won’t notice the difference between digital photography and even the highest-resolution images available on film. Even then, however, something vital threatens to be lost. As Greta Gerwig, of all people, points out, when there’s real film running through the camera, everyone on set takes the moment very seriously, an intensity that tends to be diminished when video is cheap. The end of constraints comes at the cost of a certain kind of serendipity: as Anne V. Coates, the editor of Lawrence of Arabia, reveals, the greatest cut in the history of movies was originally meant as a dissolve, but was discovered by accident in the editing room. And as both David Lynch and producer Lorenzo di Bonaventura note, the increased availability of digital filmmaking doesn’t necessarily mean that we’ll see a greater number of good movies. In fact, the opposite is more likely to be true, as digital technology lowers the barriers to entry for artists who may not be ready to release movies in the first place—the cinematic equivalent of Kindle publishing.
The answer, clearly, is that we need to continue to impose constraints even as we’re liberated by new technology. That sense of intensity that Gerwig mentions is something that directors can still create, but only if they consciously choose to do so. As I’ve argued before, with a nod to Walter Murch, it’s important to find analog moments in a digital world, by intentionally slowing down the process, using pen and paper, and embracing randomness and restriction whenever possible. Most of all, we need to find time to render, to acknowledge that even when digital technology cuts the production schedule in half, there’s still a necessary period in which works of art must be given time to ripen. David Lynch says he’s done with film, and he’s earned the right to make movies in any way he likes. But when I look at Inland Empire, I see an extraordinary movie that could have been far greater—and central to my own life—if, like Blue Velvet, it had been cut from three hours down to two. Digital technology makes it possible to avoid these hard choices. But that doesn’t mean we should.
Let’s talk about scale. For much of the past decade, the major movie studios have waged a losing battle to keep audiences in theaters, while competing with the vast array of more convenient entertainment options available at home. Hollywood’s traditional response to the threat of new media has always been to offer greater spectacle, these days in the form of IMAX or 3D, with an additional surcharge, of course. But as the new formats bring us closer to the action, computerized effects push us further away. No matter how beautifully rendered a digital landscape may be, it’s still strangely airless and sterile, with a sense that we’re being given a view of more megapixels, not a window on the world. Even so immersive a film as Avatar ultimately keeps us at arm’s length: Pandora is a universe unto itself, yes, but it still sits comfortably on a hard drive at Weta. And for all their size and expense, most recent attempts to create this kind of immersion, from John Carter to The Avengers, fail to understand the truth about spectacle: large-scale formats are most exciting when they give us a vision of a real, tangible, photographed world.
This is why The Dark Knight Rises is such a landmark. Christopher Nolan, who cited the films of David Lean as an influence on Batman Begins, understands that the real appeal of the great Hollywood epics in VistaVision and Cinerama was the startling clarity and scope of the world they presented. It’s the kind of thing that can only be achieved on location, with practical effects, real stunts, aerial photography, and a cast of thousands. The Dark Knight Rises is packed with digital effects, but we’re never aware of them. Instead, we’re in the presence of a director luxuriating in the huge panoramic effects that IMAX affords—with image, with music, with sound—when trained on the right material on real city streets. As a result, it feels big in a way that no other movie has in a long time. Brad Bird achieved some of the same effect in Mission: Impossible—Ghost Protocol, but while Bird invited us to marvel at his surfaces, Nolan wants to plunge us into a world he’s created, and he uses the medium as it was meant to be used: to tell a rich, dense story about an entire city.
Even more than The Dark Knight, this final installment makes it clear that Nolan’s twin obsessions with epic filmmaking and narrative complexity aren’t two different impulses, but opposite sides of the same coin: the massive IMAX screen, which surrounds us with images of staggering detail, is the visual equivalent of what Nolan is trying to do with the stories he tells. One thinks of The Last Judgment, of Bruegel, of Bosch. And his narrative skills have only improved with time. The Dark Knight had a great script, but it occasionally seemed to strain under the weight of its ideas, until it came off as two hugely eventful movies packed into one. The new movie doesn’t quite reach the heights of its predecessor, but it’s also more confident and assured: we’re sucked in at once and held rapt for two hours and forty minutes. And Nolan seems to have gotten over his ambivalence about the character of Batman himself. He’s always been shy about the Batsuit, which served as a kinky reminder of the story’s comic book origins, but here, he keeps Bruce Wayne vulnerable and unmasked for as long as possible, until he becomes more of a hero than ever before.
This is, in short, something close to a masterpiece—not just a worthy conclusion to the best series of comic book movies ever made, but the year’s first really great studio film. And yet I do have one big complaint. I’ve spoken before about Hollywood’s weird obsession with secrets, in which it refuses to disclose simple information about a movie for no other reason than a fetish over secrecy for its own sake, when in fact the film itself has no interesting surprises. (See: Prometheus and Super 8.) The same impulse often applies to casting rumors. For The Dark Knight Rises, the studio adamantly refused to confirm who Anne Hathaway would be playing, despite it being fairly obvious, and did the same with the characters played by Tom Hardy and Joseph Gordon-Levitt. Yet even at the earliest point in the film’s production, it was made very clear that a certain character was going to be appearing in the film—thus ruining the movie’s one big surprise. In short, Hollywood has no idea what a secret is: it routinely hides information to no purpose, but then, when it really counts for once, it reveals it in a way that utterly destroys the filmmaker’s intentions. And there’s no other living director whose intentions deserve greater respect and admiration.
Over the weekend, along with everyone else in the Northern Hemisphere, my wife and I saw The Avengers. I’m not going to bother with a formal review, since there are plenty to go around, and in any case, if you haven’t already seen it, your mind is probably made up either way. I’ll just say that while I enjoyed it, this is a movie that comes across as a triumph more of assemblage and marketing than of storytelling: you want to cheer, not for the director or the heroes, but for the executives at Marvel who brought it all off. Joss Whedon does a nice, resourceful job of putting the pieces together, but we’re left with the sense of a director gamely doing his best with the hand he’s been dealt, which is an odd thing to say for a movie that someone paid $200 million to make. Whedon has been saddled with at least two heroes too many, as well as a rather dull villain—far better if they had gone with the Red Skull of Captain America—so that a lot of the film, probably too much, is spent slotting all the components into place.
Still, once everything clicks, it moves along efficiently, if not always coherently, and it’s a bright, shiny toy for the eyes, certainly compared to the dreary Thor. It doesn’t force us to rethink what this genre is capable of doing, as The Dark Knight did, but it’s a movie that delivers exactly what audiences want, and perhaps a touch more, which is more than enough to deliver the highest opening weekend in history. And this, more than anything else, puts its director in a peculiar position. Joss Whedon has made a career out of seeming to work against considerable obstacles, and never quite succeeding, except in the eyes of his devoted fans. Buffy switched networks; Firefly was canceled before its time; Dollhouse struggled on for two seasons in the face of considerable interference. All of his projects carry a wistful sense of what might have been, and throughout it all, Whedon has been his own best character, unfailingly insightful in interviews, gracious, funny and brave, the underdog whose side he has always so eloquently taken.
So what happens when the underdog becomes responsible for a record-shattering blockbuster? The Avengers isn’t all that interesting as a movie—far less so than The Cabin in the Woods—but it’s fascinating as a portent of things to come. Whedon has delivered the kind of big popular success that can usually be cashed in for the equivalent of one free movie with unlimited studio resources, as if all the holes in his frequent shopper’s card had finally been punched. For most of his career, at least since Buffy, Whedon has had everything—charm, talent, an incredibly avid fanbase—except the one thing that a creative type needs to survive in Hollywood: power. Now, abruptly, he has oodles of it, obtained in the only way possible, by making an ungodly amount of money for a major studio. Which means that he’s suddenly in a position, real or imaginary, to make every fanboy’s dreams come true.
The question is what he intends to do with it. Unlike Christopher Nolan, he isn’t a director who seems to gain personal satisfaction from deepening and heightening someone else’s material, so The Avengers 2 doesn’t seem like the best use of his talents. Personally, I hope he pulls a Gary Ross, takes the money, and runs. He could probably make another Firefly movie, although that doesn’t seem likely at this point. He could make Goners. He could pick up an ailing franchise with fewer moving parts and do wonderful things with it—I hear that Green Lantern is available. Or, perhaps, he’ll surprise us. The Avengers isn’t a bad film, but it gives us only occasional glimpses of the full Whedon, peeking out from between those glossy toys, and those hints make you hunger for a big movie that he could control from beginning to end. For most of his career, fans have been wondering what he’d do with the full resources and freedom he’d long been denied—even as he seemed to thrive on the struggle. And if he’s as smart and brave as he’s always seemed, he won’t wait long to show us.
The release of the final trailer for The Dark Knight Rises gives me as good an excuse as any to talk once more about the work of Christopher Nolan, who, as I’ve said before, is the contemporary director who fills me with the most awe. Nolan has spent the past ten years pushing narrative complexity on the screenplay level as far as it will go while also mastering every aspect of large-scale blockbuster filmmaking, and along the way, he’s made some of the most commercially successful films of the decade while retaining a sensibility that remains uniquely his own. In particular, he returns repeatedly to issues of storytelling, and especially to the theme of how artists, for all their intelligence and preparation, can find themselves lost in their own labyrinths. Many works of art are ultimately about the process of their own creation, of course, but to a greater extent than usual, Nolan has subtly given us a portrait of the director himself—meticulous, resourceful, but also strangely ambivalent toward the use of his own considerable talents.
Yesterday, I referred to my notes toward a novel as urgent communications between my past and future selves, “a la Memento,” but it was only after typing that sentence that I realized how accurate it really is. Leonard Shelby, the amnesiac played by Guy Pearce, is really a surrogate for the screenwriter: he’s thrust into the middle of a story, without any context, and has to piece together not just what comes next, but what happened before. His notes, his visual aids, and especially the magnificent chart he hangs on his motel room wall are variations of the tools that a writer uses to keep himself oriented during a complex project—including, notably, Memento itself. It isn’t hard to imagine Nolan and his brother Jonathan, who wrote the original story on which the screenplay is based, using similar charts to keep track of their insanely intricate narrative, with a protagonist who finally turns his own body into a sort of corkboard, only to end up stranded in his own delusions.
This theme is explored repeatedly in Nolan’s subsequent films—notably The Prestige, in which the script’s endless talk about magic and sleight of hand is really a way of preparing us for the trick the movie is trying to play on the audience—but it reaches its fullest form in Inception. If Memento is a portrait of the independent screenwriter, lonely, paranoid, and surrounded by fragments of his own stories, Inception is an allegory for blockbuster moviemaking, with a central figure clearly based on the director himself. Many viewers have noted the rather startling visual similarity between Nolan and his hero, and it’s easy to assign roles to each of the major characters: Cobb is the director, Saito the producer, Ariadne the art director, all working toward the same goal as that of the movie itself—to transport the viewer into a reality where the strangest things seem inevitable. While Nolan has claimed that such an allegory wasn’t intentional, Inception couldn’t have been conceived, at least not in its current form, by a man who hadn’t made several huge movies. And at the end, we’re given the sense that the artist himself has been caught in a web of his own design.
In this light, Nolan’s Batman movies start to seem like his least personal work, which is probably true, but his sensibility comes through here as well. Batman Begins has an art director’s fascination with how things are really made—like Batman’s cowl, assembled from parts from China and Singapore—and The Dark Knight takes the figure of the director as antihero to its limit. The more we watch it, the more Nolan seems to uneasily identify, not with Batman, but with the Joker, the organized, methodical, nearly omniscient toymaker who can only express himself through violence. If the wintry, elegiac tone of our early glimpses of The Dark Knight Rises is any indication, Nolan seems ready to move beyond this, much as Francis Coppola—also fond of directorial metaphors in his work—came both to identify with Michael Corleone and to dislike the vision of the world he had expressed in The Godfather. And if Nolan evolves in similar ways, it implies that the most interesting phase of his career is yet to come.
I didn’t want to see Captain America. The trailer wasn’t great, Joe Johnston wasn’t exactly my idea of a dream director, and most of all, I was getting a little tired of superheroes. The fact that we’ve seen four major comic book adaptations this summer alone wasn’t the only reason. Ten years ago, a movie like Spider-Man felt like a cultural event, a movie that I’d been waiting decades to see. Today, they’ve become the norm, to the point where a movie that isn’t driven by digital effects and an existing comic book property seems strangely exotic. At worst, such movies come off as the cynical cash grabs that, frankly, most of them are, a trend epitomized by Green Lantern, a would-be marketing bonanza so calculated that an A.V. Club headline summed it up as “Superhero movies are popular right now. Here’s another one.”
Which is why it gives me no small pleasure to report that Captain America is a pretty good movie, and in ways that seem utterly reproducible. This isn’t a film like The Dark Knight, which seems like an increasingly isolated case of a genius director being given all the resources he needed to make a singular masterpiece. Captain America is more the work of talented journeymen, guys who like what they do and are reasonably skilled at it, and who care enough to give the audience a good time—presumably with the kind of movie that they’d enjoy seeing themselves. Joe Johnston is no Chris Nolan, but in his own way, he does an even more credible Spielberg imitation than the J.J. Abrams of Super 8, and to more of a purpose. If this is ultimately a cash grab—as its closing minutes make excruciatingly clear—it’s also full-blooded and lovingly rendered.
As a result, it’s probably the comic book movie I enjoyed most this year. While it doesn’t have the icy elegance of X-Men: First Class, it has a better script (credited to Christopher Markus and Stephen McFeely), and it’s far superior to the muddled, halfhearted, and overpraised Thor. Part of this is due to the fact that it’s the only recent superhero movie to manage a credible supervillain: in retrospect, Hugo Weaving’s Red Skull doesn’t do much more than strut around, but he’s still mostly glorious. And it’s also one of the rare modern comic book movies that remembers that the audience might still like to see some occasional action. As Thor failed to understand, special effects alone aren’t enough: I’ve had my mind blown too many times before. Yet it’s still fun to see an expertly staged action scene that arises organically from the story, and Captain America has a good handful of those, at a time when I’ve almost forgotten what it was like to see one.
What Captain America does, then, isn’t rocket science: it’s what you’d expect from any big studio movie, done with a modicum of care, aiming to appeal to the largest possible audience. So why aren’t there more movies like this? Perhaps because it’s harder to do than it looks: for one thing, it requires a decent script, which, more than anything else, is the limiting factor in a movie’s quality, and can’t be fixed by throwing money at it. The more movies I see, the more I respect mainstream entertainment that tries to be more than disposable, an effort that can seem quixotic in an industry where Pirates of the Caribbean: On Stranger Tides earns a billion dollars worldwide. Like it or not, movies are going to look increasingly like this, which is why it’s a good idea to welcome quality wherever we find it. Because it isn’t enough for a superhero to be super anymore; he also needs to be special.
Never has a city been more lovingly destroyed on camera than Chicago in Transformers: Dark of the Moon. By the time the movie reaches its crowning orgy of destruction, my wife and I had been staring at the screen for close to ninety minutes, along with an enthusiastic crowd in the IMAX theater at Navy Pier. My wife had seen much of the movie being shot on Michigan Avenue, just up the street from her office at the Tribune Tower, and I think it was with a sort of grim amusement, or satisfaction, that she watched her own building crumble to pieces as an alien robot spacecraft crashed into its beautiful gothic buttresses. It’s an image that merits barely five seconds in the movie’s final hour, which devastates most of downtown Chicago in gorgeous, even sensual detail, but it still struck me as a pivotal moment in our personal experience of the movies. (And hasn’t the Tribune already suffered enough?)
Like its immediate predecessor, Transformers 3 is generally pretty lousy. (I actually liked the first one, which had the advantage of comparative novelty, as well as a genuinely nimble comic performance by Shia LaBeouf that both director and star have struggled to recreate ever since.) As a story, it’s ridiculous; as a perfunctory attempt at a coherent narrative, it’s vaguely insulting. It’s also staggeringly beautiful. For the first ten minutes, in particular, the IMAX screen becomes a transparent window onto the universe, delivering the kind of transcendent experience, with its view of millions of miles, that even Avatar couldn’t fully provide. And even after its nonstop visual and auditory assault has taken its toll on your senses, it still gives new meaning to the phrase “all the money is there on the screen.” Here, it feels like the cash used to render just one jaw-dropping frame could have been used to pay down much of the national debt.
As I watched Dark of the Moon, or rather was pummeled into submission by it, I had the nagging feeling that Armond White’s notoriously glowing review of Revenge of the Fallen deserved some kind of reappraisal. At the time, White was dismissed, not without reason, as a troll, for issuing such pronouncements as “In the history of motion pictures, Bay has created the best canted angles—ever.” And yet I don’t think he was trolling, or even entirely wrong: it’s just that he was one movie too early. Michael Bay’s genius, and I use this word deliberately, is visible in every shot of Dark of the Moon, but it’s weirdly overdeveloped in just one direction. Bay is like one of those strange extinct animals that got caught in an evolutionary arms race until they became all horns, claws, or teeth. While a director like Christopher Nolan continues to develop along every parameter of storytelling, Bay is nothing but a massive eye: cold, brilliant, and indifferent to story or feeling. And it’s pointless to deny his talents, as ridiculously squandered as they might be.
So what exactly am I saying here? To steal a phrase from Roger Ebert’s review of The Life Aquatic, I can’t recommend Transformers: Dark of the Moon, but I would not for one second discourage you from seeing it—provided that you shell out fifteen dollars or more for the real experience in IMAX 3-D, which Bay has lovingly bullied the nation’s projectionists into properly presenting. (On video, I suspect that you might have the same reaction that my wife and I did when we rented Revenge of the Fallen: within forty minutes, both of us had our laptops out.) It’s an objectively terrible movie that, subjectively, I can’t get out of my head. As an author, I’m horrified by it: it’s a reminder of how useless, or disposable, writers can be. I won’t go as far as to say that it’s a vision of my own obsolescence, or that I believe that the robots are our future. But at this point in history, the burden is on writers to demonstrate that we’re necessary. And the momentum isn’t on our side.
On Saturday, my wife and I finally saw Source Code, the new science fiction thriller directed by Moon’s Duncan Jones. I liked Moon a lot, but wasn’t sure what to expect from his latest film, and was pleasantly surprised when it turned out to be the best new movie I’ve seen this year. Admittedly, this is rather faint praise—by any measure, this has been a slow three months for moviegoers. And Source Code has its share of problems. It unfolds almost perfectly for more than an hour, then gets mired in an ending that tries, not entirely successfully, to be emotionally resonant and tie up all its loose ends, testing the audience’s patience at the worst possible time. Still, I really enjoyed it. The story draws you in viscerally and is logically consistent, at least up to a point, and amounts to a rare example of real science fiction in a mainstream Hollywood movie.
By “real” science fiction, of course, I don’t mean that the science is plausible. The science in Source Code is cheerfully absurd, explained with a bit of handwaving about quantum mechanics and parabolic calculus, but the movie is unusual in having the courage to follow a tantalizing premise—what if you could repeatedly inhabit the mind of a dead man eight minutes before he died?—through most of its possible variations. This is what the best science fiction does: it starts with an outlandish idea and follows it relentlessly through all its implications, while never violating the rules that the story has established. And one of the subtlest pleasures of Ben Ripley’s screenplay for Source Code lies in its gradual reveal of what the rules actually are. (If anything, I wish I’d known less about the story before entering the theater.)
This may sound like a modest accomplishment, but it’s actually extraordinarily rare. Most of what we call science fiction in film is thinly veiled fantasy with a technological sheen. A movie like Avatar could be set almost anywhere—the futuristic trappings are incidental to a story that could have been lifted from any western or war movie. (Walter Murch even suggests that George Lucas based the plot of Star Wars on the work he did developing Apocalypse Now.) Star Trek was often a show about ideas, but its big-screen incarnation is much more about action and spectacle: Wrath of Khan, which I think is the best science fiction film ever made, has been aptly described as Horatio Hornblower in space. And many of the greatest sci-fi movies—Children of Men, Blade Runner, Brazil—are more about creating the look and feel of a speculative future than any sense of how it might actually work.
And this is exactly how it should be. Movies, after all, aren’t especially good at conveying ideas; a short story, or even an episode of a television show, is a much better vehicle for working out a clever premise than a feature film. Because movies are primarily about action, character, and image, it isn’t surprising that Hollywood has appropriated certain elements of science fiction and left the rest behind. What’s heartening about Source Code, especially so soon after the breakthrough of Inception, is how it harnesses its fairly ingenious premise to a story that works as pure entertainment. There’s something deeply satisfying about seeing the high and low aspects of the genre joined so seamlessly, and it requires a peculiar set of skills on the part of the director, who needs to be both fluent with action and committed to ideas. Chris Nolan is one; Duncan Jones, I’m excited to say, looks very much like another.
Daniel Zalewski’s recent New Yorker piece on Guillermo del Toro, director of Pan’s Labyrinth and the Hellboy movies, is the most engaging profile I’ve read of any filmmaker in a long time. Much of this is due to the fact that del Toro himself is such an engaging character: enthusiastic and overweight, he’s part auteur and part fanboy, living in a house packed with ghouls and monsters, including many of the maquettes from his own movies. And the article itself is equally packed with insights into the creative process. On creature design:
Del Toro thinks that monsters should appear transformed when viewed from a fresh angle, lest the audience lose a sense of awe. Defining silhouettes is the first step in good monster design, he said. “Then you start playing with movement. The next element of design is color. And then finally—finally—comes detail. A lot of people go the other way, and just pile up a lot of detail.”
On Ray Harryhausen:
“He used to say, ‘Whenever you think of a creature, think of a lion—how a lion can be absolutely malignant or benign, majestic, depending on what it’s doing. If your creature cannot be in repose, then it’s a bad design.’”
And in an aside that might double as del Toro’s personal philosophy:
“In emotional genres, you cannot advocate good taste as an argument.”
Reading this article makes me freshly mourn the fact that del Toro won’t be directing The Hobbit. I like Peter Jackson well enough, but part of me feels that if del Toro had been allowed to apply his practical, physical approach to such a famous property—much as Christopher Nolan did with the effects in Inception—the history of popular filmmaking might have been different. As it stands, I can only hope that Universal gives the green light to del Toro’s adaptation of At the Mountains of Madness, a prospect that fills me with equal parts joy and eldritch terror. Judging from what I’ve heard so far, it sounds like del Toro is planning to make the monster movie to end all monster movies. Let’s all hope that he gets the chance.
The publication of the fifth edition of David Thomson’s New Biographical Dictionary of Film, the best book ever written on the movies, is cause for celebration, and an excuse for me to talk about one of the weirdest books in all of literature. Thomson is a controversial figure, and for good reason: his film writing isn’t conventional criticism so much as a single huge work of fiction, with Thomson himself as both protagonist and nemesis. It isn’t a coincidence that one of Thomson’s earliest books was a biography of Laurence Sterne, author of Tristram Shandy: his entire career can be read as one long Shandean exercise, in which Thomson, as a fictional character in his own work, is cheerfully willing to come off as something of a creep, as long as it illuminates our reasons for going to the movies.
First, a word about the book’s shortcomings. As in previous editions, instead of revising the entries for living subjects in their entirety, Thomson simply adds another paragraph or two to the existing filmographies, so that the book seems to grow by accretion, like a coral reef. This leads to inconsistencies in tone within individual articles, and also to factual mistakes when the entry hasn’t been updated recently enough—like the article on George Lucas, for instance, in which the latter two Star Wars prequels still evidently lie in the future. And the book is full of the kind of errors that occur when one tries to keep up, in print, with the vagaries of movie production—as when it credits David O. Russell with the nonexistent Nailed and omits The Fighter. (Now that this information is readily available online, Thomson should really just delete all of the detailed filmographies in the next edition, which would cut the book’s size by a quarter or more.)
And then, of course, there are Thomson’s own opinions, which are contrarian in a way that can often seem perverse. He’s lukewarm on Kurosawa, very hard on Kubrick (The Shining is the only movie he admires), and thinks that Christopher Nolan’s work “has already become progressively less interesting.” He thinks that The Wrestler is “a wretched, interminable film,” but he loves Nine. He displays next to no interest in animation or international cinema. There’s something to be outraged about on nearly every page, which is probably why the Dictionary averages barely more than three stars from reviewers on Amazon. And if you’re the sort of person who thinks that a critic whose opinions differ from your own must be corrupt, crazy, or incompetent—as many of Roger Ebert’s correspondents apparently do—then you should stay far, far away from Thomson, who goes out of his way to infuriate even his most passionate defenders.
Yet Thomson’s perversity is part of his charm. Edmund Wilson once playfully speculated that George Saintsbury, the great English critic, invented his own Toryism “in the same way that a dramatist or novelist arranges contrasting elements,” and there are times when I suspect that Thomson is doing the same thing. And it’s impossible not to be challenged and stirred by his opinions. There is a way, after all, in which Kurosawa is a more limited director than Ozu—although I know which one I ultimately prefer. Kubrick’s alienation from humanity would have crippled any director who was not Kubrick. Until The Dark Knight and Inception, Nolan’s movies were, indeed, something of a retreat from the promise of Memento. And for each moment of temporary insanity on Thomson’s part, you get something equally transcendent. Here he is on Orson Welles, for example, in a paragraph that has forever changed how I watch Citizen Kane:
Kane is less about William Randolph Hearst—a humorless, anxious man—than a portrait and prediction of Welles himself…As if Welles knew that Kane would hang over his own future, regularly being used to denigrate his later works, the film is shot through with his vast, melancholy nostalgia for self-destructive talent…Kane is Welles, just as every apparent point of view in the film is warmed by Kane’s own memories, as if the entire film were his dream in the instant before death.
On Spielberg and Schindler’s List:
Schindler’s List is the most moving film I have ever seen. This does not mean it is faultless. To take just one point: the reddening of one little girl’s coat in a black-and-white film strikes me as a mistake, and a sign of how calculating a director Spielberg is. For the calculations reveal themselves in these few errors that escape. I don’t really believe in Spielberg as an artist…But Schindler’s List is like an earthquake in a culture of gardens. And it helps persuade this viewer that cinema—or American film—is not a place for artists. It is a world for producers, for showmen, and Schindlers.
And, wonderfully, on what is perhaps my own favorite bad movie of all time:
Yet in truth, I think Kevin [Spacey] himself is the biggest experiment, and to substantiate that one has only to call to the stand Beyond the Sea, written, produced and directed by Kev and with himself as Bobby Darin. The result is intoxicating, one of the really great dreadful films ever made, worthy of an annual Beyond the Sea award (why not give it on Oscar night?), as well as clinching evidence that this man is mad. Anything could happen.
The result, as I note above, is a massive Proustian novel in which nearly every major figure in the history of film plays a role. (Thomson has already written a novel, Suspects, that does this more explicitly, and his book-length study of Nicole Kidman is manifestly a novel in disguise.) Reading the Dictionary, which is as addictive as Wikipedia or TV Tropes, is like diving headfirst into a vast ocean, and trying to see how deep you can go before coming up for air. Although if it really is a novel, it’s less like Proust than like Pale Fire, in which Thomson plays the role of Kinbote, and every article seems to hint darkly at some monstrous underlying truth. (In that light, even the book’s mistakes seem to carry a larger meaning. What does it mean, for instance, that Thomson’s brilliant article on Heath Ledger, in which he muses on “the brief purchasing power” of fame, was “inadvertently dropped” from the fifth edition?)
And what monstrous truth does the Dictionary conceal? It’s the same truth, which applies as much to Thomson himself as it does to you and me, as the one that he spells out, unforgettably, at the end of Rosebud, his study of Orson Welles:
So film perhaps had made a wasted life?
One has to do something.
To continue my recent run of stating the obvious: I know I’m not alone in considering Christopher Nolan to be the most interesting director of the past ten years. In just over a decade, he’s gone from Memento to Inception, with The Dark Knight as one big step along the way, a stretch that ranks with Powell and Pressburger’s golden period as one of the most impressive runs in the history of movies. And his excellent interview with Wired last week, timed to coincide with Inception’s release on DVD, serves as a reminder that Nolan’s example is valuable for reasons that go far beyond his intelligence, skill, and massive popular success.
Nolan’s artistic trajectory has been a fascinating one. While most artists start with passion and gradually work their way toward craft, Nolan has always been a consummate craftsman, and is just now starting to piece together the emotional side of the equation. He’s been accused of being overly cold and cerebral, a criticism that has some basis in fact. But his careful, deliberate efforts to invest his work with greater emotion—and humor—have been equally instructive. As he says to Wired:
The problem was that I started [Inception] with a heist film structure. At the time, that seemed the best way of getting all the exposition into the beginning of the movie—heist is the one genre where exposition is very much part of the entertainment. But I eventually realized that heist films are usually unemotional. They tend to be glamorous and deliberately superficial. I wanted to deal with the world of dreams, and I realized that I really had to offer the audience a more emotional narrative, something that represents the emotional world of somebody’s mind. So both the hero’s story and the heist itself had to be based on emotional concepts. That took years to figure out. [Italics mine.]
Nolan’s masterstroke, of course, was to make the ghost that haunts Inception—originally that of a dead business partner—the main character’s wife. He also made strategic choices about where to keep things simple, in order to pump up the complexity elsewhere: the supporting cast is clearly and simply drawn, as is the movie’s look, which gives necessary breathing room to the story’s multiple layers. For a writer, the lesson is obvious: if you’re going to tell a complicated story, keep an eye out for ways to ease up on the reader in other respects.
In the case of Inception, the result is a film that is both intellectually dense and emotionally involving, and which famously rewards multiple viewings. In that light, this exchange is especially interesting:
Wired: I know that you’re not going to tell me [what the ending means], but I would have guessed that really, because the audience fills in the gaps, you yourself would say, “I don’t have an answer.”
Nolan: Oh no, I’ve got an answer.
Wired: You do?!
Nolan: Oh yeah. I’ve always believed that if you make a film with ambiguity, it needs to be based on a sincere interpretation. If it’s not, then it will contradict itself, or it will be somehow insubstantial and end up making the audience feel cheated. I think the only way to make ambiguity satisfying is to base it on a very solid point of view of what you think is going on, and then allow the ambiguity to come from the inability of the character to know, and the alignment of the audience with that character.
Wired: Oh. That’s a terrible tease.
Well, yes. But it’ll be interesting to see where Nolan goes from here. After Inception and The Dark Knight, he has as much power as any director in Hollywood. (Worldwide, Inception is the fourth highest-grossing movie in history based on an original screenplay, behind only Avatar, Titanic, and Finding Nemo.) He continues to grow in ambition and skill with every film. He seems determined to test the limits of narrative complexity in movies intended for a mass audience.
And he’s still only forty years old.
The only person standing in your way is you.
—Thomas Leroy (Vincent Cassel), in Black Swan
It’s safe to say that no other movie this year, aside perhaps from Inception, filled me with so much unnatural anticipation as Black Swan. Ever since my first encounter with Michael Powell and Emeric Pressburger’s The Red Shoes, which I think is the best movie ever made, I’ve had an uninformed but highly emotional interest in ballet, especially ballet on film. Darren Aronofsky, coming off The Wrestler, is easily one of the ten most interesting directors in America. And while Natalie Portman has been making a career, as Pauline Kael once said of Meryl Streep, of seeming to overcome being miscast, she’s still an actress for whom I have a lot of affection and respect (even if she seems determined to squander it).
The result, unfortunately, comes precariously close to being a bad movie. It’s chilly and lurid at the same time; the story is both overcooked and underconceived; and it descends so rapidly into overwrought melodrama that it’s hard to take any of it seriously. (At its worst, it’s nothing but one long mirror scare.) And yet it’s a work of undeniable skill and commitment, with extraordinary images and moments, and even at its worst, it’s still more interesting to think about than many conventionally good movies. On our way home, my wife asked me if I thought it would become a midnight movie classic. I think it will become something even better: it’s the kind of movie where, if it had come out before I was born, I might have skipped school to see it in revival on the big screen. (I did that only once in high school, and that was to see Last Tango in Paris.)
But Black Swan is still a deeply problematic movie, in ways that I don’t think Aronofsky intended. The story, without giving too much away, is that of a young ballet dancer’s descent into madness. And it plunges you into that madness so quickly, almost from the very first shot, that there’s no sense of loss as her sanity slips away. From the beginning, Portman’s character, Nina, comes off as hopelessly fragile and neurotic, and she’s never given the kind of emotional grounding—a scene with friends, say, or even a moment of ordinary human behavior—that might have made her story genuinely tragic, rather than a chilling exercise. What Black Swan needs, above all else, is a first act, set in the real world, before Aronofsky releases all of his lovingly conceived visual and aural shocks.
As it stands, it’s tempting to see Nina as a surrogate for the director himself (though it should be noted that Aronofsky did not write Black Swan, which is based on a screenplay by Andres Heinz, Mark Heyman, and John J. McLaughlin). Nina is repeatedly told that she has perfect technique, but needs to lose herself in the moment, a criticism that can be leveled, not without reason, at Aronofsky. Even more than Christopher Nolan, Aronofsky is the most left-brained of all directors with access to stars and large budgets, and he might well argue that, objectively speaking, Black Swan is perfect. Which is probably true. But subjectively, in ordinary human terms, it’s dangerously close to ridiculous.
Aronofsky has obviously seen The Red Shoes, and includes one scene—an audition filmed from the point of view of a pirouetting ballerina—that is clearly intended as homage. And both movies are about dancers whose leading roles become tragically literal, and ultimately destroy their lives. The difference, though, is that The Red Shoes implicitly contains all of Black Swan, and embeds it in a much larger story about art, love, and the wider world that Aronofsky only shows us in fragments. Vicky, in The Red Shoes, is destroyed by the conflict between art and life. For Nina, there is no life, only art, and thus no conflict: she’s a creature of art in a movie that cares about nothing else. And by the end, it’s unclear why she, and nobody else, has gone crazy.