Over the last few decades, we’ve seen a series of mostly unheralded technological and cultural developments that have allowed movies to be shaped more like novels—that is, as works of art that remain malleable and open to revision almost up to the last minute. Digital editing tools allow for cuts and rearrangements to be made relatively quickly, and they open up the possibility of even more sophisticated adjustments. The Girl With the Dragon Tattoo, for instance, includes shots in which an actor’s performance from one take was invisibly combined with another, while the use of high-definition digital video made it possible to crop the frame, recenter images, and even create camera movement where none existed before. In the old days, the addition of new material in postproduction was mostly restricted to voiceovers that played over an existing shot to clarify a plot point, or to pickup shots, which are usually inserts that can be filmed on the cheap. (It would be amusing to make a list of closeup shots of hands in movies that actually belong to the editor or director. I can think of two examples off the top of my head: The Conversation and The Usual Suspects.) In some cases, you can get the main cast back for new scenes, and directors like Peter Jackson, who learned how useful it could be to have an actor constantly available during the shooting of The Lord of the Rings, have begun to build a few weeks into their schedules explicitly for reshoots.
As I was writing this post, my eye was caught by an article in the New York Times that notes that the famous subway grate scene in The Seven Year Itch was actually a reshoot, which reminds us that this isn’t anything new. But it feels more like a standard part of the blockbuster toolbox than it ever did before, and along with the resources provided by digital editing, it means that movies can be continually refined almost up to the release date. (It’s worth noting, of course, that the full range of such tools is available only to big tentpole pictures, which means that millions of dollars are required to recreate the kind of creative freedom that every writer possesses while sitting alone at his or her desk.) But we still tend to associate reshoots with a troubled production. Reports of new footage being shot dogged Rogue One for most of the summer, and the obvious rejoinder, which was made at the time, was that such reshoots are routine. In fact, the truth was a bit more complicated. As the Hollywood Reporter pointed out, the screenwriter Tony Gilroy was initially brought in for a rewrite, but his role quickly expanded:
Tony Gilroy…will pocket north of $5 million for his efforts…[He] first was brought in to help write dialogue and scenes for Rogue’s reshoots and was being paid $200,000 a week, according to several sources. That figure is fairly normal for a top-tier writer on a big-budget studio film. But as the workload (and the reshoots) expanded, so did Gilroy’s time and paycheck.
The article continued: “Gilroy started on Rogue One in June, and by August, he was taking a leading role with Edwards in postproduction, which lasted well into the fall. The reshoots are said to have tackled several issues in the film, including the ending.” This is fairly unprecedented, at least in the way it’s being presented here. You’ll occasionally hear about one director taking over for another, but not about one helming reshoots for a comparable salary and a writer’s credit alone. In part, this may be a matter of optics: Disney wouldn’t have wanted to openly replace the director for such an important release. It may also reflect Tony Gilroy’s peculiar position in Hollywood. Gilroy never seemed particularly content as a screenwriter, but his track record as a director has been mixed, so he responded by turning himself into a professional fixer, like a Robert Towne upgraded for the digital age. And the reshoots appear to have been both unusually extensive and conceived with a writerly touch. As editor John Gilroy—Tony’s brother—told Yahoo Movies UK:
[The reshoots] gave you the film that you see today. I think they were incredibly helpful. The story was reconceptualized to some degree, there were scenes that were added at the beginning and fleshed out. We wanted to make more of the other characters, like Cassian’s character, and Bodhi’s character…The scene with Cassian’s introduction with the spy, Bodhi traipsing through Jedha on his way to see Saw, these are things that were added. Also Jyn, how we set her up and her escape from the transporter, that was all done to set up the story better.
The editor Colin Goudie added: “The point with the opening scenes that John was just describing was that the introductions in the opening scene, in the prologue, [were] always the same. Jyn’s just a little girl, so when you see her as an adult, what you saw initially was her in a meeting. That’s not a nice introduction. So having her in prison and then a prison breakout, with Cassian on a mission…everybody was a bit more ballsy, or a bit more exciting, and a bit more interesting.” In other words, the new scenes didn’t just clarify what was already there, but brought out character points that didn’t exist at all, which is exactly the sort of thing that a writer does in a rewrite. And it worked. Rogue One can feel a little mechanical at times, but all of the pieces come together in a satisfying way, and it has a cleaner and more coherent narrative line than The Force Awakens. The strategies that it used to get there, from the story reel to the reshoot, were on a larger scale than usual, but that was almost certainly due to the tyranny of the calendar. Even more than its predecessor, Rogue One had to come out on schedule and live up to expectations: it’s the film that sets the pattern for an annual Star Wars movie between now and the end of time. The editorial team’s objective was to deliver it in the window available, and they succeeded. (Goudie notes that the first assembly was just ten minutes longer than the final cut, thanks largely to the insights that the story reel provided—it bought them time at the beginning that they could cash in at the end.) Every film requires some combination of time, money, and ingenuity, and as Rogue One demonstrates, a surplus of any two can make up for a lack of the third. As Goudie concludes: “It was like life imitating art. Let’s get a band of people and put them together on this secret mission.”
How do you release blockbusters like clockwork and still make each one seem special? It’s an issue that the movie industry is anxious to solve, and there’s a lot riding on the outcome. When I saw The Phantom Menace nearly two decades ago, there was an electric sense of excitement in the theater: we were pinching ourselves over the fact that we were about to see the opening crawl for a new Star Wars movie on the big screen. That air of expectancy diminished for the two prequels that followed, and not only because they weren’t very good. There’s a big difference, after all, between the accumulated anticipation of sixteen years and a schedule in which the installments are only a few years apart. The decade that elapsed between Revenge of the Sith and The Force Awakens was enough to ramp it up again, as if fan excitement were a battery that recovers some of its charge after it’s allowed to rest for a while. In the past, when we’ve watched a new chapter in a beloved franchise, our experience hasn’t just been shaped by the movie itself, but by the sudden release of energy that has been bottled up for so long. That kind of prolonged wait can prevent us from honestly evaluating the result—I wasn’t the only one who initially thought that The Phantom Menace had lived up to my expectations—but that isn’t necessarily a mistake. A tentpole picture is named for the support that it offers to the rest of the studio, but it also plays a central role in the lives of fans, which began long before the film starts and will continue after it ends. As Robert Frost once wrote about a different tent, it’s “loosely bound / By countless silken ties of love and thought / To every thing on earth the compass round.”
When you have too many tentpoles coming out in rapid succession, however, the outcome—if I can switch metaphors yet again—is a kind of wave interference that can lead to a weakening of the overall system. On Christmas Eve, I went to see Rogue One, which was preceded by what felt like a dozen trailers. One was for Spider-Man: Homecoming, which left me with a perplexing feeling of indifference. I’m not the only one to observe that the constant onslaught of Marvel movies makes each installment feel less interesting, but in the case of Spider-Man, we actually have a baseline for comparison. Two baselines, really. I can’t defend every moment of the three Sam Raimi films, but there’s no question that each of those movies felt like an event. There was even enough residual excitement lingering after the franchise was rebooted to make me see The Amazing Spider-Man in the theater, and even its sequel felt, for better or worse, like a major movie. (I wonder sometimes if audiences can sense the pressure when a studio has a lot riding on a particular film: even a mediocre movie can seem significant if a company has tethered all its hopes to it.) Spider-Man: Homecoming, by contrast, feels like just one more component in the Marvel machine, and not even a particularly significant one. It has the effect of diminishing a superhero who ought to be at the heart of any universe in which he appears, relegating one of the two or three most successful comic book characters of all time to a supporting role in a larger universe. And because we still remember how central he was to no fewer than two previous franchises, it feels like a demotion, as if Spider-Man were an employee who left the company, came back, and now reports to Iron Man.
It isn’t that I’m all that emotionally invested in the future of Spider-Man, but it’s a useful case study for what it tells us about the pitfalls of these films, which can take something that once felt like a milestone and reduce it to a midseason episode of an ongoing television series. What’s funny, of course, is that the attitude we’re now being asked to take toward these movies is actually closer to the way in which they were originally conceived. The word “episode” is right there in the title of every Star Wars movie, which George Lucas saw as an homage to classic serials, with one installment following another on a weekly basis. Superhero films, obviously, are based on comic books, which are cranked out by the month. The fact that audiences once had to wait for years between movies may turn out to have been a historical artifact caused by technological limitations and corporate inertia. Maybe the logical way to view these films is, in fact, in semiannual installments, as younger viewers are no doubt growing up to expect. In years to come, the extended gaps between these movies in prior decades will seem like a structural quirk, rather than an inherent feature of how we relate to them. This transition may not be as meaningful as, say, the shift from silent films to the talkies, but it implies a similar change in the way we relate to the film onscreen. Blockbusters used to be released with years of anticipation baked into the response from moviegoers, which is no longer something that can be taken for granted. It’s a loss, in its way, to fan culture, which had to learn how to sustain itself during the dry periods between films, but it also implies that the movies themselves face a new set of challenges.
To be fair, Disney, which controls both the Marvel and Star Wars franchises, has clearly thought a lot about this problem, and they’ve hit on approaches that seem to work pretty well. With the Marvel Universe, this means pitching most of the films at a level at which they’re just good enough, but no more, while investing real energy every few years into a movie that is first among equals. This leads to a lot of fairly mediocre installments, but also to the occasional Captain America: Civil War, which I think is the best Marvel movie yet—it pulls off the impossible task of updating us on a dozen important characters while also creating real emotional stakes in the process, which is even more difficult than it looks. Rogue One, which I also liked a lot, takes a slightly different tack. For most of the first half, I was skeptical of how heavily it was leaning on its predecessors, but by the end, I was on board, and for exactly the same reason. This is a movie that depends on our knowledge of the prior films for its full impact, but it does so with intelligence and ingenuity, and there’s a real satisfaction in how neatly it aligns with and enhances the original Star Wars, while also having the consideration to close itself off at the end. (A lot of the credit for this may be due to Tony Gilroy, the screenwriter and unbilled co-director, who pulled off a similar feat when he structured much of The Bourne Ultimatum to take place during gaps in The Bourne Supremacy.) Relying on nostalgia is a clever way to compensate for the reduced buildup between movies, as if Rogue One were drawing on the goodwill that Star Wars built up and that hasn’t yet dissipated, like a flywheel that serves as an uninterruptible power supply. Star Wars isn’t just a tentpole, but a source of energy. And it might just be powerful enough to keep the whole machine running forever.
Occasionally, a piece of technology appears in the real world that fits the needs of fiction so admirably that authors rush to adopt it in droves. My favorite example is the stun gun. The ability to immobilize characters without killing or permanently incapacitating them is one that most genre writers eventually require. It allows the hero to dispatch a henchman or two while removing the need to murder them in cold blood, which is essential if your protagonist is going to remain likable, and it also lets the villain temporarily disable the hero while still keeping him alive for future plot purposes. Hence the ubiquitous blow to the back of the head that causes unconsciousness, which was a cliché long before movies like Conspiracy Theory ostentatiously drew attention to it. The beauty of the stun gun is that it produces all of the necessary effects—instantaneous paralysis with no lasting consequences—that the convention requires, while remaining comfortably within the bounds of plausibility. In my case, it was the moment when Mathis is conveniently dispatched toward the end of Casino Royale that woke me up to its possibilities, and I didn’t hesitate to use it repeatedly in The Icon Thief. By now, though, it’s become so overused that writers are already seeking alternatives, and even so meticulous an entertainment as the David Fincher version of The Girl With the Dragon Tattoo falls back on the even hoarier device of knockout gas. But the stun gun is here to stay.
Much the same principle applies to the two most epochal technological developments of our time, which have affected fiction as much as they’ve transformed everyday life: the cell phone and the Internet. Even the simple flip phone was a game changer, instantly rendering obsolete all stories that depend on characters being unable to contact one another or the police—which is why service outages and spotty coverage seem so common in horror movies. It’s hard to watch movies or television from earlier in this century without reflecting on how so many problems could be solved by a simple phone call. (I’m catching up on The People v. O.J. Simpson, and I find myself thinking about the phones they’re using, or the lack thereof, as much as the story itself.) And the smartphone, with the instant access it provides to all the world’s information, generates just as many new problems and solutions, particularly for stories that hinge on the interpretation of obscure facts. Anyone writing conspiracy fiction these days has felt this keenly: there isn’t much call for professional symbologists when ordinary bystanders can solve the mystery by entering a couple of search terms. In City of Exiles, there’s a dramatic moment when Wolfe asks Ilya: “What is the Dyatlov Pass?” On reading it, my editor noted, not unreasonably: “Doesn’t anybody there have a cell phone?” In the end, I kept the line, and I justified it to myself by compressing the timeline: Wolfe has just been too busy to look it up herself. But I’m not sure if it works.
Search engines are a particularly potent weapon of storytelling, to the point where they’ve almost become dangerous. At their best, they can provide a neat way of getting the story from one plot point to the next: hence the innumerable movie scenes in which someone like Jason Bourne stops in an Internet café and conducts a few searches, cut into an exciting montage, that propel him to the next stage of his journey. Sometimes, it seems too easy, but as screenwriter Tony Gilroy has said on more than one occasion, for a complicated action movie, you want to get from one sequence to the next with the minimum number of intermediate steps—and the search engine was all but designed to provide such shortcuts. More subtly, a series of queries can be used to provide a glimpse into a character’s state of mind, while advancing the plot at the same time. (My favorite example is when Bella looks up vampires in the first Twilight movie.) Google itself was ahead of the curve in understanding that a search can provide a stealth narrative, in brilliant commercials like “Parisian Love.” We’re basically being given access to the character’s interior monologue, which is a narrative tool of staggering usefulness. Overhearing someone’s thoughts is easy enough in prose fiction, but not in drama or film, and conventions like the soliloquy and the voiceover have been developed to address the problem, not always with complete success. Showing us a series of search queries is about as nifty a solution as exists, to the point where it starts to seem lazy.
And an additional wrinkle is that our search histories don’t dissipate as our thoughts do: they linger, which means that other characters, as well as the viewer or reader, have potential access to them as well. (This isn’t just a convention of fiction, either: search histories have become an increasingly important form of evidence in criminal prosecutions. This worries me a bit, since anyone looking without the proper context at my own searches, which are often determined by whatever story I’m writing at the time, might conclude that I’m a total psychopath.) I made good use of this in Chapter 45 of Eternal Empire, in which Wolfe manages to access Asthana’s search history on her home computer and deduces that she was looking into Maddy Blume. It’s a crucial moment in the narrative, which instantly unites two widely separated plotlines, and this was the most efficient way I could devise of making the necessary connection. In fact, it might be a little too efficient: it verges on unbelievable that Asthana, who is so careful in all other respects, would fail to erase her search history. I tried to make it more acceptable by adding an extra step with a minimum of technical gobbledegook—Asthana has cleared her browser history, so Wolfe checks the contents of the disk and memory caches, which are saved separately to her hard drive—but it still feels like something of a cheat. But as long as search histories exist, authors will use them as a kind of trace evidence, like the flecks of cigarette ash that Sherlock Holmes uses to identify a suspect. And unlike most clues, they’re written for all to see…
Last night, I watched The Lone Ranger. Given the fact that I haven’t yet seen 12 Years a Slave, Captain Phillips, or Before Midnight, this might seem like an odd choice. In my defense, I can only plead that on those rare evenings when my wife is out of the house, I usually seize the opportunity to watch something that I don’t think she’ll enjoy—the last time around it was Battle Royale. I’ve also been intrigued by The Lone Ranger ever since it flamed out in spectacular fashion last summer. Regular readers will know that I have a weakness for flops, and everything I’d read made me think that this was the kind of fascinating studio mess that I find impossible to resist. Quentin Tarantino’s guarded endorsement counted for a lot as well, and we’re already seeing the first rumblings of a revisionist take that sees the film as a neglected treasure. I wouldn’t go quite so far; it has significant problems, and I’m not surprised that the initial reaction was so underwhelming. But I liked it a lot all the same. It’s an engaging, sometimes funny, occasionally exciting movie with more invention and ambition than your average franchise installment, and I’d sooner watch its climactic train chase again than, say, most of The Avengers.
And what interests me the most is its most problematic element, which is the range of tones it encompasses. The Lone Ranger isn’t content just to be a Western; on some level, it wants to be all Westerns, quoting freely from Dead Man and Once Upon a Time in the West while also indulging in slapstick, adventure, gruesome violence, hints of the supernatural, and even moments of tragedy. It’s a revenge narrative by way of Blazing Saddles, and it’s no surprise that the result is all over the map. Part of this may be due to the sheer scale of the production—when someone gives you $200 million to make a Western, you may as well throw everything you can into the pot—but it’s also a reflection of the sensibilities involved. Director Gore Verbinski and screenwriters Ted Elliott and Terry Rossio had collaborated earlier, of course, on the Pirates of the Caribbean franchise, which gained a lot of mileage from a similar stylistic mishmash, though with drastically diminishing returns. And Verbinski at his best has the talent to pull it off: he combines the eye of Michael Bay with a real knack for comedy, and I predicted years ago that he’d win an Oscar one day. (He eventually did, for Rango.)
But playing with tone is a dangerous thing, as we see in the later Pirates films, and The Lone Ranger only gets maybe eighty percent of the way to pulling it off. Watching it, I was reminded of what the screenwriter Tony Gilroy says in his contribution to William Goldman’s Which Lie Did I Tell? Gilroy starts by listing examples of movies that experiment with tone, both good (Dr. Strangelove, The Princess Bride) and bad (Batman and Robin, Year of the Comet) and concludes:
But tone? Tone scares me…Why? Because when it goes wrong it just sucks out loud. I think the audience—the reader—I think they make some critical decisions in the opening movements of a film. How deeply do I invest myself here? How much fun can I have? Should I be consciously referencing the rest of my life during the next two hours, or is this an experience I need to surrender to? Are you asking for my heart or my head or both? Am I rooting for the hero or the movie? Just how many pounds of disbelief are you gonna ask me to suspend before this is through?
The Lone Ranger tramples on all these questions, asking us to contemplate the slaughter of Comanches a few minutes before burying our heroes up to their necks in a nest of scorpions, and the fact that it holds together even as well as it does is a testament both to the skill of the filmmakers and the power of a strong visual style. If nothing else, it looks fantastic, which helps us over some of the rough spots, although not all of them.
And it’s perhaps no accident that William Goldman’s first great discovery of a new tone came in Butch Cassidy and the Sundance Kid. It’s possible that there’s something about the Western that encourages this kind of experimentation: all it needs is a few men and horses, and the genre has been so commercially weakened in recent years that filmmakers have the freedom to try whatever they think might work. It’s true that The Lone Ranger works best in its last forty minutes, when The William Tell Overture blasts over the soundtrack and it seems content to return to its roots as a cliffhanging serial, but when you compare even its most misguided digressions to the relentless sameness of tone in a Liam Neeson thriller or a Bourne knockoff, it feels weirdly like a step forward. (Even Christopher Nolan, a director I admire immensely, has trouble operating outside of a narrow, fundamentally serious tonal range—it’s his one great shortcoming as a storyteller.) Going to the movies every summer would be more fun in general if more megabudgeted blockbusters looked and felt like The Lone Ranger, and its failure means that we’re more likely to see the opposite.
You’ve seen this baguette before. In any movie or television show in which a character is shown carrying groceries, a big loaf of French bread is invariably seen peeking out over the top of the bag. On the few occasions when it isn’t there, a similar role is assumed by a leafy bunch of carrots, or, in exceptional cases, celery. As the comically detailed TV Tropes entry on the subject points out, you’ll see the baguette among groceries carried by the unlikeliest of characters, like Liam Neeson in Taken, who carries not one, but two. (He’s in Paris, after all.) And given how often this loaf of bread turns up, it was only a matter of time before a clever screenwriter, in this case Tony Gilroy in Michael Clayton, gave us a grocery bag full of nothing but baguettes. In this instance, it’s partially intended as a reflection of the unstable mental state of the character played by Tom Wilkinson, but it’s also a nod to a cinematic convention that, over time, has come to seem like a particularly ludicrous visual cliché.
And yet that baguette is there for a reason. For one thing, it’s a convenient prop that is unlikely to wilt under hot studio lights or after hours spent on location. It’s also a handy bit of narrative shorthand. If we see a character carrying a paper bag without any clues about what it contains, we immediately start to wonder what might be inside. The baguette poking out over the top is a visual flag that, paradoxically, actually makes the bag less visible: as soon as we understand that it’s just a bag of groceries, we stop worrying about it. (Thomas Harris, a shrewd exploiter and creator of narrative tropes, even utilizes it as a plot point in Red Dragon, when Francis Dolarhyde, the killer, uses a big bunch of leafy celery as camouflage in his escape from a crime scene: “He stuffed his books and clothing into the grocery bag, then the weapons. The celery stuck out the top.” And when he passes the police a moment later, carrying what is obviously just a bag of groceries, they don’t give him a second glance.)
Most clichés, after all, start out as a piece of authorial shorthand that allows the reader or viewer to focus on what really matters. William Goldman, who is close friends with Gilroy, makes a similar point in his wonderful book Which Lie Did I Tell? He ticks off some of the most notorious examples of how the movies depart from real life—the hero can always find a parking space when he needs one, the local news invariably happens to be talking about a necessary plot point when a character turns on the television, taxi fares can always be paid with the first bill you happen to grab without looking down at your wallet—and goes on to make an excellent observation: all of these clichés are about saving time. In a good movie, everything that isn’t relevant to the story goes out the window, which is why we see so many ridiculously convenient moments that allow us to move on to the next important scene without pausing. That baguette serves a useful purpose. If they gave awards to props, it would at least merit a nod for Best Supporting Actor.
The trouble, of course, is that as soon as a narrative device proves its usefulness, it’s immediately copied by every writer in sight. And it’s easy to understand why: such tricks are worth their weight in gold. In my own novels, I’m constantly trying to find the right balance between advancing the plot and avoiding story beats that seem too obvious or convenient. (For example, in both The Icon Thief and City of Exiles, there’s a scene in which a suspect cracks a bit too easily under interrogation, just because I wanted to get on to the next big thing. I try to disguise such moments as best as I can, but I can’t claim the effect is entirely successful.) And whenever a writer discovers a novel piece of shorthand, or a clever spin on an old cliché, it’s like stumbling across a new industrial process. You’d like to patent it, but once it’s in print, it’s there for anyone to use. So the search for new tropes goes on, as it should. Because a baguette, as we all know, doesn’t stay fresh for long.
William Goldman, the dean of American screenwriters, likes to tell the story of how Tony Gilroy saved the day. In Which Lie Did I Tell?—my favorite book on screenwriting, and one of the most entertaining books I’ve read of any kind—Goldman goes into great detail about his travails in adapting the novel Absolute Power, with its huge number of characters and infuriating structure, which kills off the protagonist halfway through and doesn’t have anything resembling a usable ending. Frustrated, Goldman found himself at a basketball game with Gilroy, a much younger writer who agreed to take a look at the project. The following day, Gilroy came in with a number of fixes, all of which diverged dramatically from the book. When Goldman objected, Gilroy shot back: “Forget about the novel—I haven’t read the novel—my main strength is that I haven’t read the novel—the novel is killing you.” In the end, Goldman saw the light, made the changes that Gilroy suggested, and finished the screenplay at last.
It’s a great story that has contributed significantly to Tony Gilroy’s current standing in Hollywood, which is similar to the one that Goldman occupied forty years ago—the smartest screenwriter in the room, the man who can fix any script. Yet there’s something deeply comic about the story as well. These are two incredibly smart, talented writers giving their all to the script of Absolute Power, a movie that didn’t exactly set the world on fire. When you look at Gilroy’s history ever since, you see a deep ambivalence toward his own reputation as a genius fixer. This comes through clearly in the title character of Michael Clayton, who says bitterly: “I’m not a miracle worker. I’m a janitor.” It’s made even more obvious by a famous New Yorker profile, which reveals that not only was Gilroy unhappy about how his work was treated on The Bourne Supremacy, but he wrote a draft of The Bourne Ultimatum only on the condition that he wouldn’t have to talk to director Paul Greengrass. Not surprisingly, then, his goal has long been to get to a place where he can direct his own movies.
And the results have been fascinating, if not always successful. Let’s start with The Bourne Legacy, which is a singular mix of expertise and almost unbelievable amateurishness. At its best, its set pieces are stunning: a grim workplace shooting in a government laboratory is almost too harrowing—it takes us right out of the movie—but the followup, in which Rachel Weisz’s character is visited by a pair of sinister psychologists, is a nice, nasty scene that Hitchcock would have relished. The movie, shot by the great Robert Elswit, looks terrific, and it holds our attention for well over two hours. But it never establishes a clear point of view or tells us who Jeremy Renner’s Bourne successor is supposed to be. Its attempt to layer its plot over events from The Bourne Ultimatum is interesting, but unnecessary: all of those clever connective scenes could be cut without any harm to the story. And its ending is ludicrously abrupt and unsatisfying: it concludes, like all the Bourne movies, by playing Moby’s “Extreme Ways,” but it might as well be a techno remix of “Is That All There Is?”
Still, I have huge admiration for Tony Gilroy, who has taught all of us a lot about storytelling. (In my limited experience, I’ve found that he’s the writer whose work tends to come up the most when literary agents talk about what they want in a suspense novel.) But his work as a director has been frustratingly uneven. Michael Clayton is a great movie that benefits, oddly, from its confusion over whether it’s a thriller or a character piece: its story is layered enough to encompass a satisfyingly wide range of tones. Duplicity was a real passion project, but so underwhelming that it became a key example in my formulation of the New Yorker feature curse. And what The Bourne Legacy demonstrates is that for all Gilroy’s considerable gifts, being a director may not be his first, best destiny. There’s no shame in that: Goldman, among others, was never tempted to direct, and the number of great screenwriters who became major directors is shatteringly small. Gilroy may not be a born director, but he’s one of the smartest writers of movies we’ve ever had. Is that really so bad a legacy?