Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘John Carter’

Inside the sweatbox


Yesterday, I watched a remarkable documentary called The Sweatbox, which belongs on the short list of films—along with Hearts of Darkness and the special features for The Lord of the Rings—that I would recommend to anyone who ever thought that it might be fun to work in the movies. It was never officially released, but a copy occasionally surfaces on YouTube, and I strongly suggest watching the version available now before it disappears yet again. For the first thirty minutes or so, it plays like a standard featurette of the sort that you might have found on the second disc of a home video release from two decades ago, which is exactly what it was supposed to be. Its protagonist, improbably, is Sting, who was approached by Disney in the late nineties to compose six songs for a movie titled Kingdom of the Sun. (One of the two directors of the documentary is Sting’s wife, Trudie Styler, a producer whose other credits include Lock, Stock and Two Smoking Barrels and Moon.) The feature was conceived by animator Roger Allers, who was just coming off the enormous success of The Lion King, as a mixture of Peruvian mythology, drama, mysticism, and comedy, with a central plot lifted from The Prince and the Pauper. After two years of production, the work in progress was screened for the first time for studio executives. As always, the atmosphere was tense, but no more than usual, and it inspired the standard amount of black humor from the creative team. As one artist jokes nervously before the screening: “You don’t want them to come in and go, ‘Oh, you know what, we don’t like that idea of the one guy looking like the other guy. Let’s get rid of the basis of the movie.’ This would be a good time for them to tell us.”

Of course, that’s exactly what happened. The top brass at Disney hated the movie, production was halted, and Allers left the project that was ultimately retooled into The Emperor’s New Groove, which reused much of the design work and finished animation while tossing out entire characters—along with most of Sting’s songs—and introducing new ones. It’s a story that has fascinated me ever since I first heard about it, around the time of the movie’s initial release, and I’m excited beyond words that The Sweatbox even exists. (The title of the documentary, which was later edited down to an innocuous special feature for the DVD, refers to the room at the studio in Burbank in which rough work is screened.) And while the events that it depicts are extraordinary, they represent only an extreme case of the customary process at Disney and Pixar, at least if you believe the way the studio likes to talk about itself. In a profile that ran a while back in The New Yorker, the director Andrew Stanton expressed it in terms that I’ve never forgotten:

“We spent two years with Eve getting shot in her heart battery, and Wall-E giving her his battery, and it never worked. Finally—finally—we realized he should lose his memory instead, and thus his personality…We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.”

This statement appeared in print six months before the release of Stanton’s live-action debut John Carter, which implies that this method is far from infallible. And the drama behind The Emperor’s New Groove was unprecedented even by the studio’s relentless standards. As executive Thomas Schumacher says at one point: “We always say, Oh, this is normal. [But] we’ve never been through this before.”

As it happens, I watched The Sweatbox shortly after reading an autobiographical essay by the artist Cassandra Smolcic about her experiences in the “weird, hermetically sealed freakazoid” environment of Pixar. It’s a long read, but riveting throughout, and it makes it clear that the issues at the studio went far beyond the actions of John Lasseter. And while I could focus on any number of details or anecdotes, I’d like to highlight one section, about the firing of director Brenda Chapman halfway through the production of Brave:

Curious about the downfall of such an accomplished, groundbreaking woman, I began taking the company pulse soon after Brenda’s firing had been announced. To the general population of the studio — many of whom had never worked on Brave because it was not yet in full-steam production — it seemed as though Brenda’s firing was considered justifiable. Rumor had it that she had been indecisive, unconfident and ineffective as a director. But for me and others who worked closely with the second-time director, there was a palpable sense of outrage, disbelief and mourning after Brenda was removed from the film. One artist, who’d been on the Brave story team for years, passionately told me how she didn’t find Brenda to be indecisive at all. Brenda knew exactly what film she was making and was very clear in communicating her vision, the story artist said, and the film she was making was powerful and compelling. “From where I was sitting, the only problem with Brenda and her version of Brave was that it was a story told about a mother and a daughter from a distinctly female lens,” she explained.

Smolcic adds: “During the summer of 2009, I personally worked on Brave while Brenda was still in charge. I likewise never felt that she was uncertain about the kind of film she was making, or how to go about making it.”

There are obvious parallels between what happened to Allers and to Chapman, which might seem to undercut the notion that the latter’s firing had anything to do with the fact that she was a woman. But there are a few other points worth raising. One is that no one seems to have applied the words “indecisive, unconfident, and ineffective” to Allers, who voluntarily left the production after his request to push back the release date was denied. And if The Sweatbox is any indication, the situation of women and other historically underrepresented groups at Disney during this period was just as bad as it was at Pixar—I counted exactly one woman who speaks onscreen, for less than fifteen seconds, and all the other faces that we see are white and male. (After Sting expresses concern about the original ending of The Emperor’s New Groove, in which the rain forest is cut down to build an amusement park, an avuncular Roy Disney confides to the camera: “We’re gonna offend somebody sooner or later. I mean, it’s impossible to do anything in the world these days without offending somebody.” Which betrays a certain nostalgia for a time when no one, apparently, was offended by anything that the studio might do.) One of the major players in the documentary is Thomas Schumacher, the head of Disney Animation, who has since been accused of “explicit sexual language and harassment in the workplace,” according to a report in the Wall Street Journal. In the footage that we see, Schumacher and fellow executive Peter Schneider don’t come off particularly well, which may just be a consequence of the perspective from which the story is told. But it’s equally clear that the mythical process that allows such movies to “suck” for three out of four years is only practicable for filmmakers who look and sound like their counterparts on the other side of the sweatbox, which grants them the necessary creative freedom to try and fail repeatedly—a luxury that women are rarely granted. 
What happened to Allers on Kingdom of the Sun is still astounding. But it might be even more noteworthy that he survived for as long as he did.

No man’s game



I first heard about the video game No Man’s Sky in an article by Raffi Khatchadourian that appeared last year in The New Yorker. This was probably a warning sign in itself. The New Yorker may be the best magazine in the world, but its coverage of gaming has often been disappointing, in part because it assigns novelty stories to gifted writers—like Nicholson Baker—with minimal knowledge of the subject, and because it’s difficult to thread the needle between being both informative to fans and comprehensible to readers with no firsthand experience of the creations of Shigeru Miyamoto. Years ago, I speculated half-seriously in Salon that there was a New Yorker feature curse, much like the more famous one that haunts the cover of Sports Illustrated. It seemed to me that a lot of artists who received coverage in the magazine, especially on the movie side, went on to spectacularly implode soon thereafter, often for the very project that had been glowingly described in the article itself. (John Carter is the first example that comes to mind, but there are plenty of others.) At the time, I suggested that this might be due to regression toward the mean: whenever a filmmaker attracts the attention of a magazine that only runs a handful of Hollywood profiles every year, it’s usually because of an outsized success in the recent past, which is generally followed by what seems like a failure in comparison. But now I think that this is only half the reason. In order to appear in a timely fashion, a feature article has to be written and edited long in advance of a work’s completion, and there’s no way to predict the quality of the result. A reporter might be able to watch the dailies or look at some demo footage, but that’s pretty much it. And if it’s ambitious enough to merit this kind of extended treatment, it’s no surprise that it frequently fails to live up to expectations. It’s just hard to spot a masterpiece before the fact.

No Man’s Sky is beginning to look like a perfect example of this phenomenon, and it’s likely to supplant even John Carter, at least in my own head, as the definitive case study. Khatchadourian’s article, which, by the way, is engaging and intelligent, opens with these lines:

The universe is being built in an old two-story building, in the town of Guildford, half an hour by train from London. About a dozen people are working on it. They sit at computer terminals in three rows on the building’s first floor and, primarily by manipulating lines of code, they make mathematical rules that will determine the age and arrangement of virtual stars, the clustering of asteroid belts and moons and planets, the physics of gravity, the arc of orbits, the density and composition of atmospheres—rain, clear skies, overcast. Planets in the universe will be the size of real planets, and they will be separated from one another by light-years of digital space. A small fraction of them will support complex life. Because the designers are building their universe by establishing its laws of nature, rather than by hand-crafting its details, much about it remains unknown, even to them. They are scheduled to finish at the end of this year; at that time, they will invite millions of people to explore their creation, as a video game, packaged under the title No Man’s Sky.

Khatchadourian goes on to describe how the game, the brainchild of developer Sean Murray, is meant to evoke a feeling of limitless discovery. It uses procedural generation—in which elements are created on the fly by algorithms, rather than manually by a human being—to populate its universe with quintillions of planets. “Because of the game’s near-limitless proportions, players will rarely encounter one another by chance,” Khatchadourian continues. “As they move toward the center, the game will get harder, and the worlds—the terrain, the fauna and flora—will become more alien, more surreal.”
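For readers curious about the underlying idea, the technique is simple to sketch. A toy procedural generator in Python might derive each planet’s traits deterministically from its coordinates, so the universe never has to be stored anywhere—only the rules exist. (This is a hypothetical illustration of the general technique, not a description of Hello Games’ actual code; the biome names, parameter ranges, and `planet_at` function are all invented for the example.)

```python
import hashlib
import random

# Hypothetical biomes and parameter ranges, chosen for illustration only.
BIOMES = ["barren", "frozen", "lush", "toxic", "scorched"]

def planet_at(x, y, z):
    """Derive a planet's traits deterministically from its coordinates."""
    # Hash the coordinates into a reproducible seed, so the same
    # location always produces the same planet without storing anything.
    seed = int(hashlib.sha256(f"{x},{y},{z}".encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return {
        "biome": rng.choice(BIOMES),
        "radius_km": round(rng.uniform(2000, 8000)),   # clamped to a "safe" range
        "gravity_g": round(rng.uniform(0.5, 1.8), 2),  # extreme values would break play
        "has_life": rng.random() < 0.1,                # only a small fraction support life
    }

# The same coordinates always yield the same planet:
assert planet_at(4, 8, 15) == planet_at(4, 8, 15)
```

Note that the parameters are drawn from deliberately narrow, clamped ranges—exactly the design constraint that, as discussed below, critics would later blame for the game’s sense of sameness.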


As many of you have probably heard, this isn’t exactly how it turned out. No Man’s Sky was released earlier this month to enormous early sales, followed immediately by a furious backlash as players realized that it wasn’t the game that they had been promised. (For the sake of full disclosure, I should note that I haven’t played it myself, and the game also has its defenders—although even the positive reviews tend to be carefully qualified.) There are vast numbers of planets, but they don’t orbit stars: they just sort of hang there in space. The characteristics of a planet, like its climate or natural resources, don’t depend on where it is in relation to anything else, or on any kind of physical logic. Nothing much changes as you get closer to the center. Your freedom to fly your spaceship is severely limited. A multiplayer experience isn’t just rare; it doesn’t even seem to be possible, based on the design of the game itself. It was billed as a fantasy of exploring the unknown, but every planet has already been colonized by an alien race with the same generic architectural style. Worst of all, the procedural generation that lies at the heart of No Man’s Sky results, by most accounts, in a boring, repetitive playing experience: all of the planets are technically different, but they mostly follow the same basic patterns and templates, with the knobs twiddled here and there to achieve minor variations. The result might be a decent game on its own merits, but as a damning thread on Reddit makes clear, it doesn’t bear much of a resemblance to the experience that players were sold. When you look at the promises that the developers made, compared to the version of the game that was actually released, it isn’t hard to conclude, as many players have, that they were lying through their teeth. But it seems more likely that at some point during the development process, Murray realized that he was unable to deliver on the grand conceptions that he had outlined to the media.
Features were cut or scaled back, ambitions were lowered, and the game was reduced to something more manageable. It’s easier to sketch out an expansive vision than to execute it in detail, and they just couldn’t give players a full universe in a box. As the Dean once said on Community: “Time travel is really hard to write about!”

Inevitably, the reaction to No Man’s Sky has also turned into a referendum on procedural generation itself. If you really want eighteen quintillion planets, you can’t make any of them particularly interesting, because extreme values for any of the relevant parameters could render the results broken or unplayable. Because the variations have to occur within a narrow range, they suffer from a certain sameness. It is possible to create levels that break all the rules, but it requires a human eye, an ability to tinker and revise, and endless user testing, which isn’t available when all your planets are being generated, as Khatchadourian puts it, “out of only fourteen hundred lines of code.” (To be fair, Khatchadourian also notes: “Games based on procedural generation often suffer from unrelenting sameness…or from visual turmoil.” But he adds that Murray hopes to find “a middle ground” that, in retrospect, was probably impossible.) A game like Super Mario Galaxy, which has never received a write-up in The New Yorker, has only a hundred levels or so, but each one has been lovingly conceived and burnished by a real person, taking infinite pains and thinking hard about how to delight the player, which is a far more impressive accomplishment. And there were reasons to be skeptical. When I first read the article, my eye was caught, for obvious reasons, by this passage:

The game is an homage to the science fiction that Murray loved when he was growing up—Asimov, Clarke, Heinlein—and to the illustrations that often accompanied the stories. In the nineteen-seventies and eighties, sci-fi book covers often bore little relation to the stories within; sometimes they were commissioned independently, and in bulk, and for an imaginative teen-ager it was a special pleasure to imbue the imagery with its own history and drama.

The italics are mine. No Man’s Sky claims to have been inspired by classic science fiction, but it’s really about the artwork, and the confusion between the two is the real problem. Khatchadourian’s profile contains one unforgivable sentence: “No Man’s Sky’s references may be dime-store fiction, but the game reimagines the work with a sense of nostalgia and a knowing style that is often more sophisticated than the original.” If anything, this statement seems even more ridiculous now than it did at the time. Genuine sophistication, as Heinlein and the others knew, isn’t just about creating a sense of wonder, but about using it to tell a real story. And Murray might have been better off if he had thought less about what was on the cover and more about what was inside.

Written by nevalalee

August 29, 2016 at 9:04 am

How Hollywood learned to love the bomb


Jeff Bridges and Ryan Reynolds in R.I.P.D.

Why are we so fascinated by flops? Each year, there’s a race in the media to declare one movie or another the biggest disaster of the summer, and these days, the speculation seems to start weeks or months before a film’s release. In some cases, as with World War Z, the result blows past expectations to turn into a “surprise” success; for others, like The Lone Ranger, the early forebodings turn out to have been more than justified. At times, the media’s salivation over an impending bomb verges on the unseemly: the knives were out for R.I.P.D. long before it underwhelmed over this past weekend, to a point where a lot of potential audience members may have been discouraged from attending by the negative press. Nobody likes to back a loser. (It’s also worth noting that the definition of a bomb is more subjective than you might think: Pacific Rim has already made a number of lists of big-budget disasters, but fans on Reddit and elsewhere are equally eager to declare it a success, and the real truth is likely to land somewhere in the middle, especially once international grosses are taken into account.)

I’m just as guilty of this as anyone else: God knows I’ve spent enough time on this blog writing about John Carter. And the media has been obsessed with high-profile flops for as long as we’ve been going to the movies. Still, there’s no denying that the cycle has accelerated in recent years, to an extent that goes hand in hand with our fixation with getting the weekend numbers as quickly as possible. I’ve always been a box office junkie: I’ve visited the sites Box Office Mojo and Box Office Guru twice a week for most of the last fifteen years, and I’ll frequently check the latter’s Twitter feed for the latest updates. For big releases, estimated numbers for opening day are often available online early Saturday morning, which leads to an extraordinarily rapid verdict on the movie’s ultimate prospects: Friday numbers can be used to project the weekend as a whole, which can give you a decent sense of the final gross, meaning that a picture that took a studio two or three years to conceive, produce, and market can have its success or failure decided in less than twenty-four hours.

Taylor Kitsch and friend in John Carter

In theory, that’s great drama—greater, in many cases, than what’s visible on the screen. And you’d think it would only get more intense as the major studios continue to place all of their bets on a handful of big tentpole pictures, rather than the traditional slate of small- to medium-sized releases. It’s an article of faith in contemporary Hollywood that it’s better to invest everything in a single movie that costs half a billion dollars to make and distribute rather than to spread the cost over half a dozen smaller films. The risks are enormous, but the returns can be equally great. Until, of course, they aren’t. And the stakes involved mean that even otherwise forgettable factory products can seem like insane acts of hubris. A few decades ago, an epochal flop worthy of a book like Final Cut or The Devil’s Candy—which respectively chronicle the stories behind Heaven’s Gate and The Bonfire of the Vanities—seemed to only appear every five years or so, and now, you could write a book like this every summer.

After a while, though, they all start to blur together. A movie like Heaven’s Gate may be a disaster—and make no mistake, it is, despite the recent attempts to reevaluate it as a neglected masterpiece—but at least it was the product of a particular crazed vision, however misguided it might have been. These days, flops are just part of the cost of doing business, and if there’s a story to be told here, it’s less about any one director’s megalomania than about the bumps in what has turned out to be a surprisingly viable corporate model. The blockbuster mentality can be hard to defend from an artistic perspective, and it leads to a glut of sequels and comic book franchises, but from a financial point of view, it works. Universal may have taken a hit from R.I.P.D., but that’s more than offset by the gains from Fast and Furious 6 and Despicable Me 2. It’s a model designed to absorb big disasters, which are a necessary side effect of an industry that rounds everything to the nearest hundred million. Flops, in short, are no longer interesting even as flops. And that’s the saddest part of all.

Written by nevalalee

July 22, 2013 at 9:00 am

Flight, Wreck-It Ralph, and the triumph of the mainstream


At first glance, it’s hard to imagine two movies more different than Flight and Wreck-It Ralph. The former is an emphatically R-rated message movie with a refreshing amount of nudity, drug and alcohol abuse, and what used to be known as adult situations, in the most literal sense of the term; the latter is a big, colorful family film that shrewdly joins the latest innovations in animation and digital effects to the best of classic Disney. On the surface, they appeal to two entirely separate audiences, and as a result, you’d expect them to coexist happily at the box office, which is precisely what happened: both debuted this weekend to numbers that exceeded even the most optimistic expectations. (This is, in fact, the first weekend in a long time when my wife and I went to the movies on two consecutive nights.) Yet these two films have more in common than first meets the eye, and in particular, they offer an encouraging snapshot of Hollywood’s current potential for creating great popular entertainment. And even if their proximity is just a fluke of scheduling, it’s one that should hearten a lot of mainstream moviegoers.

In fact, for all their dissimilarities, the creative team behind Flight would have been more than capable of making Wreck-It Ralph, and vice-versa, and under the right circumstances, they might well have done so. Flight is Robert Zemeckis’s first live-action movie in years, after a long, self-imposed exile in the motion-capture wilderness, and the script is by John Gatins, who spent a decade trying to get it made, while also slaving away for two years on the screenplay for Real Steel. It’s a handsome movie, old-fashioned in its insistence on big themes and complex characters, but it’s also a product of the digital age: Zemeckis’s Forrest Gump, whatever its other flaws, remains a landmark in the use of unobtrusive special effects to advance the story and fill in a movie’s canvas, and their use here allowed Flight to be brought in on a startlingly low budget of $31 million. At his best, Zemeckis is one of the most technically gifted of mainstream directors, and in some ways, he’s an important spiritual godfather for Wreck-It Ralph, whose true precursor isn’t Toy Story, as many critics have assumed, but Who Framed Roger Rabbit?

Similarly, Wreck-It Ralph is the product of a canny, often surprisingly mature set of sensibilities that only happens to have ended up in animation. Along with the usual stable of Pixar and Disney veterans, the creative team includes Rich Moore and Jim Reardon, a pair of directors whose work on The Simpsons collectively represents the best fusion of high and low art in my lifetime, and they’ve given us a movie that appeals to both adults and kids, and not just in the obvious ways. It’s full of video game in-jokes that will fly over or under the heads of many viewers—a reference to Metal Gear Solid represents one of the few times a joke in a movie had the audience laughing while I was scratching my head—but this is really the least impressive aspect of the movie’s sophistication. The script is very clever, with a number of genuinely ingenious surprises, and there are touches here that go well beyond nerd culture to something older and weirder, like Alan Tudyk’s brilliant Ed Wynn impression as the villainous King Candy. (The cast, which includes John C. Reilly, Jack McBrayer, and Sarah Silverman, all of them wonderful, is a modern version of the Disney trick of recruiting old pros like Ed Wynn and Phil Harris to bring its characters to life.)

It’s tempting to say that it all comes down to good storytelling, but there’s something else going on here. Last year, I predicted that the incursion of Pixar talent into live-action movies would represent a seismic shift in popular filmmaking, and although John Carter was a bust, Brad Bird’s work on Mission: Impossible—Ghost Protocol indicates that I wasn’t entirely off the mark. This weekend’s top two movies are a sign that, at its best, Hollywood is still capable of making solid movies for adults and children that come from essentially the same place—from good scripts, yes, but also from studios and creative teams that understand the potential of technology and draw on a similar pool of skilled professionals. This is how Hollywood should look: not a world neatly divided into summer tentpole pictures, Oscar contenders, and a lot of mediocrity, but a system capable of turning out mainstream entertainment for different audiences linked by a common respect for craft. The tools and the talent are there, led by directors like Zemeckis and backed up by studios like Pixar and Disney. This ought to be the future of moviemaking. And at least for one weekend, it’s already here.

The glorious fiasco of Cloud Atlas


A few months ago, I wrote a piece for Salon on whether there was such a thing as a New Yorker feature curse. I was largely inspired by the example of John Carter, in which the magazine’s highly positive profile of director Andrew Stanton was followed shortly thereafter by a debacle that deserves its own book, like Final Cut or The Devil’s Candy, to unpack in all its negative glory. Judging from the response, a lot of readers misunderstood the piece, with one commenter sniffing that I should read Carl Sagan’s The Demon-Haunted World before spreading so much superstition. My point, to the extent I had one, was that the New Yorker curse, like its counterpart at Sports Illustrated, was likely a case of regression to the mean: magazines like this have only a limited amount of feature space to devote to the movies, which means they tend to pick artists who have just had an outstanding outlier of a success—which often means that a correction is on the way. And although my theory has been badly tested by Seth MacFarlane’s Ted, which is now the highest-grossing R-rated comedy of all time, at first glance, the recent failure of Cloud Atlas, which follows a fascinating profile of the Wachowskis by Aleksandar Hemon, seems to indicate that the curse is alive and well.

Yet at the risk of sounding exactly as arbitrary as my critics have accused me of being, I can’t quite bring myself to lump it into the same category. This isn’t a movie like John Carter, which was undermined by a fundamentally flawed conception and a lot of tactical mistakes along the way. Cloud Atlas has its problems, but as directed by the Wachowskis and Tom Tykwer after the novel by David Mitchell, it’s a real movie, an ambitious, entertaining, often technically spellbinding film that probably never had a shot at finding a large popular audience. I’m not a huge fan of the Wachowskis, who over the past decade have often seemed more intelligent in their interviews than in their movies, but I give them and Tykwer full credit for pursuing this dazzling folly to its very end. Cloud Atlas is like The Tree of Life made in a jazzy, sentimental, fanboyish state of mind, and although it doesn’t succeed entirely, under the circumstances, it comes closer than I ever expected. It’s the kind of weird, personal, expensive project that gives fiascos a good name, and it’s one of the few movies released this year that I expect to watch again.

And with one exception, which I’ll mention in a moment, the movie’s flaws are inseparable from its fidelity to the underlying material. I liked Mitchell’s novel a lot, and as with the movie it inspired, it’s hard not to be impressed by the author’s talents and ambition. That said, not all of its nested novelettes are equally interesting, and its structure insists on a deeper network of resonance that isn’t always there. Some of its connections—the idea that Sonmi-451 would become a messianic figure for the world after the fall, for instance, or that she’d want to spend her last few moments in life catching up with the story of Timothy Cavendish—don’t quite hold water, and in general, its attempts to link the stories together symbolically, as with the comet-shaped birthmark that its primary characters share, are too facile to be worthy of Mitchell’s huge authorial intelligence. (You only need to compare Cloud Atlas to a book like Dictionary of the Khazars, which does keep the promises its structure implies, to see how the former novel falls short of the mark.) And the movie suffers from the same tendency to inform us that everything here is connected, when really, the stories are simply juxtaposed in the editing room.

All the same, the movie, like the book, is one that demands to be experienced. There are a few serious lapses, most unforgivably at the end, in which we’re given a new piece of information about the frame story—not present in the original novel—in the clumsiest way imaginable. For the most part, however, it’s fun to watch, and occasionally a blast. Somewhat to my surprise, my favorite sequences were the ones directed by Tykwer, an unreliable director who also offered up one of the best action scenes in recent years with the Guggenheim shootout in The International: he gives the Louisa Rey narrative a nice ’70s conspiracy feel, and the story of Timothy Cavendish, which I thought was unnecessary in the novel, turns out to be the most entertaining of all. (A lot of this is due to the presence of Jim Broadbent, who gives the best performance in the movie, and one of the few not hampered by elaborate but frequently distracting makeup.) The Wachowskis can’t do much with the journal of Adam Ewing, but the futuristic ordeal of Sonmi-451 is right in their wheelhouse. It’s a movie that takes great risks and succeeds an impressive amount of the time. And as far as I’m concerned, the curse is broken. At least for now.

Written by nevalalee

November 1, 2012 at 10:02 am

So what happened to John Carter?


In recent years, the fawning New Yorker profile has become the Hollywood equivalent of the Sports Illustrated cover—a harbinger of bad times to come. It isn’t hard to figure out why: both are awarded to subjects who have just reached the top of their game, which often foreshadows a humbling crash. Tony Gilroy was awarded a profile after the success of Michael Clayton, only to follow it up with the underwhelming Duplicity. For Steve Carell, it was Dinner for Schmucks. For Anna Faris, it was What’s Your Number? And for John Lasseter, revealingly, it was Cars 2. The latest casualty is Andrew Stanton, whose profile, which I discussed in detail last year, now seems laden with irony, as well as an optimism that reads in retrospect as whistling in the dark. “Among all the top talent here,” a Pixar executive is quoted as saying, “Andrew is the one who has a genius for story structure.” And whatever redeeming qualities John Carter may have, story structure isn’t one of them. (The fact that Stanton claims to have closely studied the truly awful screenplay for Ryan’s Daughter now feels like an early warning sign.)

If nothing else, the making of John Carter will provide ample material for a great case study, hopefully along the lines of Julie Salamon’s classic The Devil’s Candy. There are really two failures here, one of marketing, another of storytelling, and even the story behind the film’s teaser trailer is fascinating. According to Vulture’s Claude Brodesser-Akner, a series of lost battles and miscommunications led to the release of a few enigmatic images devoid of action and scored, in the manner of an Internet fan video, with Peter Gabriel’s dark cover of “My Body Is a Cage.” And while there’s more to the story than this—I actually found the trailer quite evocative, and negative responses to early marketing materials certainly didn’t hurt Avatar—it’s clear that this was one of the most poorly marketed tentpole movies in a long time. It began with the inexplicable decision to change the title from John Carter of Mars, on the assumption that women are turned off by science fiction, while making no attempt to lure in female viewers with the movie’s love story or central heroine, or even to explain who John Carter is. This is what happens when a four-quadrant marketing campaign goes wrong: when you try to please everybody, you please no one.

And the same holds true of the movie itself. While the story is fairly clear, and Stanton and his writers keep us reasonably grounded in the planet’s complex mythology, we’re never given any reason to care. Attempts to engage us with the central characters fall curiously flat: to convey that Princess Dejah is smart and resourceful, for example, the film shows her inventing the Barsoomian equivalent of nuclear power, evidently in her spare time. John Carter himself is a cipher. And while some of these problems might have been solved by miraculous casting, the blame lands squarely on Stanton’s shoulders. Stanton clearly loves John Carter, but forgets to persuade us to love him as well. What John Carter needed, more than anything else, was a dose of the rather stark detachment that I saw in Mission: Impossible—Ghost Protocol, as directed by Stanton’s former Pixar colleague Brad Bird. Bird clearly had no personal investment in the franchise, except to make the best movie he possibly could. John Carter, by contrast, founders on its director’s passion and good intentions, as well as a creative philosophy that evidently works in animation, but not live action. As Stanton says of Pixar:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

Which only makes us wonder what might have happened if John Carter had been granted a fourth year.

Stanton should take heart, however. If there’s one movie that John Carter calls to mind, it’s Dune, another financial and critical catastrophe that was doomed—as much as I love it—by fidelity to its source material. (In fact, if you take Roger Ebert’s original review of Dune, which came out in 1984, and replace the relevant proper names, you end up with something remarkably close to a review of John Carter: “Actors stand around in ridiculous costumes, mouthing dialogue with little or no context.”) Yet its director not only recovered, but followed it up with my favorite movie ever made in America. Failure, if it results in another chance, can be the opposite of the New Yorker curse. And while Stanton may not be David Lynch, he’s not without talent: the movie’s design is often impressive, especially its alien effects, and it displays occasional flashes of wit and humor that remind us of what Stanton can do. John Carter may go down as the most expensive learning experience in movie history, and while this may be cold comfort to Disney shareholders, it’s not bad for the rest of us, as long as Stanton gets his second chance. Hopefully far away from the New Yorker.

Written by nevalalee

March 15, 2012 at 10:31 am

Andrew Stanton and the world beyond Pixar

leave a comment »

Art is messy, art is chaos—so you need a system.

Andrew Stanton, to the New Yorker

For the second time in less than six months, the New Yorker takes on the curious case of Pixar, and this time around, the results are much more satisfying. In May, the magazine offered up a profile of John Lasseter that was close to a total failure, since critic Anthony Lane’s customary air of disdain left him unprepared to draw any useful conclusions about a studio that, at least up to that point, had gotten just about everything blessedly right. This week’s piece by Tad Friend is far superior, focusing on the relatively unsung talents of Andrew Stanton, director of Finding Nemo and Wall-E. And while the publication of a fawning New Yorker profile of a hot creative talent rarely bodes well for his or her next project—as witness the recent articles on Tony Gilroy, Steve Carell, Anna Faris, or even Lasseter himself, whose profile only briefly anticipated the release of the underwhelming Cars 2—I’m still excited by Stanton’s next project, the Edgar Rice Burroughs epic John Carter, which will serve as a crucial test of whether Pixar’s magic can extend to the world beyond animation.

Stanton’s case is particularly interesting because of the role he plays at the studio: to hear the article tell it, he’s Pixar’s resident storyteller. “Among all the top talent here,” says Jim Morris, the head of Pixar’s daily operations, “Andrew is the one who has a genius for story structure.” And what makes this all the more remarkable is the fact that Stanton seems to have essentially willed this talent into existence. Stanton was trained as an animator, and began, like most of his colleagues, by focusing on the visual side. As the script for Toy Story was being developed, however, he decided that his future would lie in narrative, and quietly began to train himself in the writer’s craft, reading classic screenplays—including, for some reason, the truly awful script for Ryan’s Daughter—and such texts as Lajos Egri’s The Art of Dramatic Writing. In the end, he was generally acknowledged as the senior writer at Pixar, which, given the caliber of talent involved, must be a heady position indeed.

And while the article is littered with Stanton’s aphorisms on storytelling—“Inevitable but not predictable,” “Conflict + contradiction,” “Do the opposite”—his main virtue as a writer seems to lie in the most universal rule of all: “Be wrong fast.” More than anything else, Stanton’s success so far has been predicated on an admirable willingness to throw things out and start again. He spent years, for instance, working on a second act for Wall-E that was finally junked completely, and while I’m not sure he ever quite cracked the plot for that movie—which I don’t think lives up to the promise of its first twenty minutes—there’s no question that his ruthlessness with structure did wonders for Finding Nemo, which was radically rethought and reconceived several times over the course of production. Pixar, like the rest of us, is making things up as it goes along, but is set apart by its refusal to let well enough alone. As Stanton concludes:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

The real question, of course, is whether this approach to storytelling, with its necessary false starts and extensive rendering time, can survive the transition to live action, in which the use of real actors and sets makes retakes—and thus revision—drastically more expensive. So far, it sounds like John Carter is doing fine, at least judging from the trailer and early audience response, which has reportedly been encouraging. And more rides on this movie’s success or failure than the fate of one particular franchise. Pixar’s story has been extraordinary, but its most lasting legacy may turn out to be the migration of its talent beyond the safety zone of animation—assuming, of course, that their kung fu can survive. With Brad Bird’s Mission: Impossible—Ghost Protocol and John Carter in the wings, we’re about to discover if the directors who changed animation at Pixar can do the same in live action. The New Yorker article is fine, but it buries the lede: Stanton and Bird are the first of many. And if their next movies are half as entertaining as the ones they’ve made so far, we’re looking at an earthquake in the world of pop culture.

Ray Bradbury on literature’s obligation of joy

leave a comment »

Interviewer: Does literature, then, have any social obligation?

Bradbury: Not a direct one. It has to be through reflection, through indirection. Nikos Kazantzakis says, “Live forever.” That’s his social obligation. The Saviors of God celebrates life in the world. Any great work does that for you. All of Dickens says live life at the top of your energy. Edgar Rice Burroughs never would have looked upon himself as a social mover and shaker with social obligations. But as it turns out—and I love to say it because it upsets everyone terribly—Burroughs is probably the most influential writer in the entire history of the world.

Interviewer: Why do you think that?

Bradbury: By giving romance and adventure to a whole generation of boys, Burroughs caused them to go out and decide to become special. That’s what we have to do for everyone, give the gift of life with our books. Say to a girl or boy at age ten, Hey, life is fun! Grow tall! I’ve talked to more biochemists and more astronomers and technologists in various fields, who, when they were ten years old, fell in love with John Carter and Tarzan and decided to become something romantic. Burroughs put us on the moon. All the technologists read Burroughs. I was once at Caltech with a whole bunch of scientists and they all admitted it. Two leading astronomers—one from Cornell, the other from Caltech—came out and said, Yeah, that’s why we became astronomers. We wanted to see Mars more closely.

Ray Bradbury, to The Paris Review

Written by nevalalee

March 12, 2011 at 9:22 am
