Alec Nevala-Lee

Thoughts on art, culture, and the writing life.

Posts Tagged ‘Pixar’

Blinn’s Law and the paradox of efficiency



Note: I seem to have come down with a touch of whatever stomach bug my daughter had this week, so I’m taking the day off. This post was originally published, in a somewhat different form, on August 9, 2011. 

As technology advances, rendering time remains constant.
—Blinn’s Law

Why isn’t writing easier? Looking at the resources that contemporary authors have at their disposal, it’s easy to conclude that we should all be perfect writing machines. Word processing software, from WordStar to Scrivener, has made the physical process of writing more streamlined than ever; Google and Amazon have given us access to a world of information that would have been inconceivable even fifteen years ago; and research, editing, and revision have been made immeasurably more efficient. And yet writing itself doesn’t seem all that much less difficult than before. The amount of time a professional novelist needs to spend writing each day—let’s say three or four hours—hasn’t decreased much since Trollope, and for most of us, it still takes a year or two to write a decent novel.

So what happened? In some ways, it’s an example of the paradox of labor-saving devices: instead of taking advantage of our additional leisure time, we create more work for ourselves. It also reflects the fact that the real work of writing a novel is rarely connected to how fast you can type. But I prefer to think of it as a variation on Blinn’s Law. As graphics pioneer James Blinn first pointed out, in animation, rendering time remains constant, even as computers get faster. An artist gets accustomed to waiting a certain number of hours for an image to render, so as hardware improves, instead of using it to save time, he employs it to render more complex graphics. This is why rendering time at Pixar has remained essentially constant over the past fifteen years. (Although the difference between Toy Story and Cars 2, or even Brave, is a reminder that rendering isn’t everything.)


Similarly, whatever time I save by writing on a laptop rather than a manual typewriter is canceled out by the hours I spend making additional small changes and edits along the way. The Icon Thief probably went through eighteen substantial drafts before the final version was delivered to my publisher, an amount of revision and rewriting that would have been unthinkable without Word. Is the novel better as a result? On a purely technical level, yes. Is the underlying story more interesting than if I’d written it by hand? Probably not. Blinn’s Law tells us that the leaves and grass in the background of a shot will look increasingly great, but it says nothing about the quality of storytelling. Which seems to imply that the countless tiny changes that a writer like me makes to each draft are only a waste of effort.

And yet here’s the thing: I still needed all that time. No matter how efficient the physical side of the process becomes, it’s still desirable for a writer to live with a novel, or a studio to live with a movie, for at least a year or so. (In the case of a film like Frozen, that gestational period can amount to a decade or more.) For most of us, there seems to be a fixed developmental period for decent art, a minimum amount of time that a story needs to simmer and evolve. The endless small revisions aren’t the point: the point is that while you’re altering a word or shifting a paragraph here or there, the story is growing in your head in unexpected ways. Even as you fiddle with the punctuation, seismic changes are taking place. But for this to happen, you need to be at your desk for a certain number of hours. So what do we do in the meantime? We do what Pixar does: we render. That’s the wonderful paradox of efficiency: it allows us, as artists, to get the inefficiency we all need.

Written by nevalalee

February 24, 2014 at 9:00 am

Toy Story of delight



Breaking Bad may be over, but last night, my wife and I watched what emphatically ranks as a close second in our most highly anticipated television events of the year: Toy Story of Terror, the first of what I hope will be many Pixar specials featuring the characters from my favorite animated franchise. Not surprisingly, I loved it, even if I’d rate it a notch below the sublime Partysaurus Rex. It’s constructed in the usual shrewd, slightly busy Pixar manner, with complication piled on complication, and it packs a startling amount of plot into a runtime of slightly over twenty minutes. A big part of the appeal of the Toy Story franchise has always been its narrative density: these aren’t long movies, but each installment, especially the latter two, is crammed with ideas, a tradition that the shorts have honored and maintained. And although it may not rank among the greatest holiday specials ever made, it gave me a hell of a lot of pleasure, mostly because I was delighted, as I always am, to see these characters again.

I’ve spoken frequently on this blog about the power of ensembles, which allow a television show to exploit different, surprising combinations of characters, but I don’t think I’ve really delved into its importance in film, which operates under a different set of constraints. Instead of multiple seasons, you have, at best, a handful of installments, and often just one movie. A rich supporting cast can lead to many satisfactions in the moment—think of the ensembles in Seven Samurai, The Godfather, L.A. Confidential—but it also allows you to dream more urgently of what else might have taken place, both in the runtime of the movie itself and after the story ends. When a great ensemble movie is over, it leaves you with a sense of loss: you feel that the characters were doing other things beyond the edges of the frame, pairing off in unexpected ways, and you wish there were time for more. (It’s no accident that the franchises that inspire the most devoted fanfic communities, from Harry Potter to Star Trek, are the ones that allow fans to play with the widest range of characters.)


And I don’t think I’ve ever felt this so keenly as I have with the characters from Toy Story. Over the course of three films and a handful of shorts, the franchise has created dozens of memorable characters, and it’s remarkable how vividly even briefly glimpsed figures—Wheezy, the Chatter Telephone—are drawn. Part of this is due to the fact that toys advertise their personalities to us at once, and you can mine a lot of material from either underlining or subverting that initial impression, as in the case of Lots-o’-Huggin’ Bear, who stands as one of the most memorable movie villains of the last decade. But it’s also thanks to some sensational writing, directing, and voice acting in the established Pixar style, as well as the ingenuity of the setting and premise itself. At its best, the franchise is an adventure series crossed with a workplace comedy, and much of its energy comes from the idea of these toys, literally from different worlds, thrown together into the same playroom. Andy’s bedroom, or Bonnie’s, or any child’s, is a stage on which an endless number of stories can be told, and they don’t need to be spectacular: I’d be happy just to watch these toys hanging out all day.

That’s the mark of great storytelling, and as time goes on, I’ve begun to suspect that this may be the best movie trilogy I’ve ever seen. I’ve loved this series for a long time, but it wasn’t until Toy Story 2 came out that it took up a permanent place in my heart. At the time, I was working as an online movie critic, and I was lucky enough to see it at a preview screening—I almost typed “screaming,” which is a revealing typo—packed with kids. And I don’t think any movie has ever left me feeling happier on leaving the theater, both because the film itself was a masterpiece, and because I knew that every child in the world was going to see it. Ten years later, Toy Story 3 provided the best possible conclusion to the central story, and I don’t think I want any more movies, as much as I want to spend more time with these characters. But the decision to release additional shorts and specials was a masterstroke. For any other franchise, it might have seemed like a cash grab, but I can’t help but read it as an act of generosity: it gives us a little more, but not too much, of what we need. And it makes me a little envious of my own daughter, who, if all goes according to plan, will grow up with Woody, Buzz, Rex, and the rest, not just as beloved characters, but as friends.

Written by nevalalee

October 17, 2013 at 9:00 am

A writer’s progress


Over the past few days, I’ve been engaged in a long conversation with my younger self, using what Stephen King has rightly called the only real form of time travel we have. Years ago, when I left my job in New York to become a professional writer, my first major project was a long novel about India. I spent two years writing and revising it in collaboration with an agent, only to abandon it unpublished in the end for reasons that I’ve described elsewhere. It was a bittersweet experience at best, one that taught me much of what I know about writing, while also leaving me with little to show for it, and as a result, I haven’t gone back and read that novel in a long time—more than four years, in fact. Now that I’ve finished Eternal Empire, however, I’ve got some time on my hands, and one of the first things I wanted to do, since I no longer have a book under contract, is go back and look at that early effort to see whether there’s anything there worth saving. And the prospect of reading this novel again after so long filled me with a lot of trepidation.

In some ways, I doubt I’ll ever have this kind of experience again. This first novel represents the very best that I could do at that time in my career: I lavished everything I had on it for two years of my life, and so it would be surprising, at least to me, if there wasn’t at least something worthwhile there. But I’ve changed a lot in the meantime, too. When I sat down to write my first book, it was largely to prove to myself that I could do it at all: I’d never written an original novel before, unless you count the science-fiction epic I cranked out in the summer between seventh and eighth grade, and I’d suffered through several unfinished projects in the meantime. Today, the situation couldn’t be more different: I have something like 350,000 words of professional work behind me, and I’ve gone through the process of writing, cutting, and revising a manuscript with an editor twice, with a third time just around the corner. I’m a better, smarter writer now, and this is probably the only chance I’ll have to confront the best work of my early days with the detachment that four years of distance affords.

And while reading the novel again, I discovered something fascinating: this manuscript, which is the final version of a book that went through countless edits, revisions, and iterations, is basically as good as the first drafts I write today. It isn’t a bad novel by any means: there’s a lot of interesting material, some exciting scenes, and many extended passages of decent writing. But it’s clearly the work of a novice. Most chapters go on for longer than they should; I spell out motivations and subtext rather than leaving them to the reader; and, much to my embarrassment, I even have long sections of backstory. At the time, this was the novel I wanted to write, and since then, my tastes have changed and developed in certain ways, which is precisely how it should be. Yet here’s the funny thing: I still write chapters that are too long, spell things out too explicitly, and tell more than I should about a character’s background. The difference now is that I cut them, usually before I’ve even printed out a copy to mark up with a red pencil. And whatever mistakes remain tend to be made, and addressed, more quickly.

Which gets me to an important point about progress. There’s no such thing as real progress in the arts, at least as far as storytelling is concerned, but there’s certainly room for an individual author to grow and improve over time—and the best thing that a writer can learn, as they say at Pixar, is that you need to be wrong fast. Sometimes I’m wrong only in my own head, while I’m working out a scene in my mind’s eye, and I’ve corrected the mistake long before I begin to type. More often, I need to write something out first and change it once I realize it isn’t working. But the fact that my first drafts now are as good as my final drafts of a few years ago implies, if nothing else, that I’ve accelerated the process, which is about all a writer can ask. I’m still fundamentally the same person I was when I wrote my first novel, but a lot more efficient, and I’ll be happy as long as I can continue in the same general direction. As David Belle, the founder of parkour, says: “First, do it. Second, do it well. Third, do it well and fast—that means you’re a professional.”

Written by nevalalee

November 8, 2012 at 9:32 am

Posted in Writing


Flight, Wreck-It Ralph, and the triumph of the mainstream


At first glance, it’s hard to imagine two movies more different than Flight and Wreck-It Ralph. The former is an emphatically R-rated message movie with a refreshing amount of nudity, drug and alcohol abuse, and what used to be known as adult situations, in the most literal sense of the term; the latter is a big, colorful family film that shrewdly joins the latest innovations in animation and digital effects to the best of classic Disney. On the surface, they appeal to two entirely separate audiences, and as a result, you’d expect them to coexist happily at the box office, which is precisely what happened: both debuted this weekend to numbers that exceeded even the most optimistic expectations. (This is, in fact, the first weekend in a long time when my wife and I went to the movies on two consecutive nights.) Yet these two films have more in common than first meets the eye, and in particular, they offer an encouraging snapshot of Hollywood’s current potential for creating great popular entertainment. And even if their proximity is just a fluke of scheduling, it’s one that should hearten a lot of mainstream moviegoers.

In fact, for all their dissimilarities, the creative team behind Flight would have been more than capable of making Wreck-It Ralph, and vice-versa, and under the right circumstances, they might well have done so. Flight is Robert Zemeckis’s first live-action movie in years, after a long, self-imposed exile in the motion-capture wilderness, and the script is by John Gatins, who spent a decade trying to get it made, while also slaving away for two years on the screenplay for Real Steel. It’s a handsome movie, old-fashioned in its insistence on big themes and complex characters, but it’s also a product of the digital age: Zemeckis’s Forrest Gump, whatever its other flaws, remains a landmark in the use of unobtrusive special effects to advance the story and fill in a movie’s canvas, and their use here allowed Flight to be brought in on a startlingly low budget of $31 million. At his best, Zemeckis is one of the most technically gifted of mainstream directors, and in some ways, he’s an important spiritual godfather for Wreck-It Ralph, whose true precursor isn’t Toy Story, as many critics have assumed, but Who Framed Roger Rabbit?

Similarly, Wreck-It Ralph is the product of a canny, often surprisingly mature set of sensibilities that only happens to have ended up in animation. Along with the usual stable of Pixar and Disney veterans, the creative team includes Rich Moore and Jim Reardon, a pair of directors whose work on The Simpsons collectively represents the best fusion of high and low art in my lifetime, and they’ve given us a movie that appeals to both adults and kids, and not just in the obvious ways. It’s full of video game in-jokes that will fly over or under the heads of many viewers—a reference to Metal Gear Solid represents one of the few times a joke in a movie had the audience laughing while I was scratching my head—but this is really the least impressive aspect of the movie’s sophistication. The script is very clever, with a number of genuinely ingenious surprises, and there are touches here that go well beyond nerd culture to something older and weirder, like Alan Tudyk’s brilliant Ed Wynn impression as the villainous King Candy. (The cast, which includes John C. Reilly, Jack McBrayer, and Sarah Silverman, all of them wonderful, is a modern version of the Disney trick of recruiting old pros like Ed Wynn and Phil Harris to bring its characters to life.)

It’s tempting to say that it all comes down to good storytelling, but there’s something else going on here. Last year, I predicted that the incursion of Pixar talent into live-action movies would represent a seismic shift in popular filmmaking, and although John Carter was a bust, Brad Bird’s work on Mission: Impossible—Ghost Protocol indicates that I wasn’t entirely off the mark. This weekend’s top two movies are a sign that, at its best, Hollywood is still capable of making solid movies for adults and children that come from essentially the same place—from good scripts, yes, but also from studios and creative teams that understand the potential of technology and draw on a similar pool of skilled professionals. This is how Hollywood should look: not a world neatly divided into summer tentpole pictures, Oscar contenders, and a lot of mediocrity, but a system capable of turning out mainstream entertainment for different audiences linked by a common respect for craft. The tools and the talent are there, led by directors like Zemeckis and backed up by studios like Pixar and Disney. This ought to be the future of moviemaking. And at least for one weekend, it’s already here.

Brave and the fate of Pixar


Note: Spoilers follow for Brave.

It pains me to say this, but there’s no other way: I no longer fully trust Pixar. While I’m aware that this may not be a popular opinion, Brave strikes me as their weakest movie of any kind, weaker even than Cars 2. As I said at the time, Cars 2 had big problems, but it was only a rewrite or two away from being an entertaining movie. Brave, by contrast, comes off as fundamentally misconceived, and in ways that raise troubling questions about Pixar’s vaunted storytelling skills. There’s no doubt Pixar takes its storytelling very seriously, and as we saw with the recent list of narrative tips shared by artist Emma Coats, it’s developed a formidable bag of tricks. But in the case of a movie like Brave, such tricks amount to smart tactics in the service of no strategy whatsoever. Much of Brave works fine on its own terms—it’s consistently beautiful, ambitious, and rendered with a lot of love. But the more I think about it, the more it looks like a story that could only be fixed by being thrown out and radically reconceived.

At its heart, Brave’s story is startlingly simple: a teenage princess, Merida, annoyed by her mother, Queen Elinor, casts a spell that turns her mother into a bear. This isn’t a bad premise in itself, but as handled by Brave, it suffers from three major problems: 1. Neither Merida nor her mother is developed strongly enough as a character to make the latter’s transformation meaningful. We don’t really know Queen Elinor before she’s transformed and can no longer speak, so the long sequences with Merida and Elinor as a bear can’t build on anything that came before. 2. Elinor’s metamorphosis is supposed to bring mother and daughter closer together, but there’s nothing in the situation that reveals anything new about their relationship. It’s just a generic crisis that doesn’t cast any light on the central conflict, which is that Merida is smarting under her mother’s expectations. 3. The movie’s treatment of magic is casual at best, with Merida essentially getting her spell from a witch who dispenses plot points, and the rules are never really explained, which undermines any narrative tension, especially near the end.

It isn’t hard to think of a version of this story that would have worked better than the one we’ve been given. We could make Elinor, not Merida, the central character, which automatically makes her transformation more interesting. We could turn Merida’s father, the king, into a bear, and have mother and daughter work together to save him. We could have Merida take a rebellious interest in magic, and be drawn to a witch—not the witch we see here, but perhaps someone more like Maleficent—as an alternative mother figure in place of the queen, with disastrous consequences. Or we could even keep the story we have and approach it with a lighter touch, as Miyazaki might have done. Totoro barely has any plot at all, yet the grace of its conception makes it seem elegant rather than half-baked. Brave’s technical splendor actually works against it here: it’s so visually compelling that it takes us a long time to realize that we’ve been given a rather simpleminded children’s movie, and that the studio devoted less effort to exploring Merida’s motivations than it did to developing her hair.

In the end, we’re left with a deeply muddled movie whose constant harping on themes of destiny only makes its confusions all the more clear. Merida, for all her talk about fate, doesn’t seem to have any particular sense of what she wants out of life, and neither does the movie around her. (Just repeating the word “fate” over and over won’t convince us that you have anything interesting to say on the subject.) And the result is a film that seems less like an ordinary misfire than a tragic waste of resources. It’s possible that the change of directors was to blame, or the fact that, contrary to what the filmmakers have said, the studio was so intent on making a movie with a female protagonist and a fairy tale setting that it forgot to make either distinctive—or to see that Tangled had already done a better job. If I’m being hard on Pixar, it’s because it’s capable of far more, and I’m afraid it may see Brave as the best it can do. But it isn’t: it’s the work of a great studio that has lost its way. And only time will tell if Pixar can manage to change its fate.

Written by nevalalee

June 25, 2012 at 9:52 am

The elusive magic of Hayao Miyazaki


Earlier this month, the Siskel Center in Chicago began presenting a loving retrospective of the work of Studio Ghibli and Hayao Miyazaki, the Japanese animator who, as I’ve argued before, may be our greatest living director in any medium. Of all the contemporary directors whose work I revisit on a regular basis, Miyazaki may be the one who fills me with the most awe, and he’s also the one whose mastery I find hardest to explain. His best films are totally accessible to viewers of all ages, and some, like My Neighbor Totoro, stand out for their apparent simplicity. But while the Pixar style of storytelling can be taken apart and analyzed—at their best, Pixar’s films are beautiful machines of narrative—the work of Miyazaki resists easy explanation. A set of narrative rules tweeted by Pixar storyboard artist Emma Coats recently made the rounds online, and they’re full of good advice: “What are the stakes?” “Give your characters opinions.” “Putting it on paper lets you start fixing it.” But what would the rules look like for Miyazaki?

As one possible way in, I’ll start by noting that Miyazaki’s work falls into two different categories, one of which is significantly greater than the other—although I know that a lot of fans would take issue with this. His best work, to my mind, has always been about children: My Neighbor Totoro, Spirited Away, and Ponyo are among the best animated movies ever made, and they’re all significantly different in tone, style, and mood. Totoro is a perfect tone poem about a child’s life in the satoyama, or Japanese countryside, with the gentle rhythms of a bedtime story; Spirited Away is a dense, superbly organized epic of fantasy seen through a child’s eyes; and Ponyo is sort of a hybrid of the two, with scenes of intense joy, humor, and lyricism paired with strange, goofy fantasy. Compared to these three, I find his work centering on older characters—such as Nausicaa, Princess Mononoke, and Howl’s Moving Castle—to be rather less interesting. These movies are often brilliant and visually distinctive, but Miyazaki has many rivals here, while there’s no one who matches him at capturing the inner lives of children.

Spirited Away is my favorite Miyazaki movie, but after watching Totoro again last night, I wonder if it might not be the greater accomplishment. I’ve spoken before about the American need to make movies centered on restless movement—on action that breaks out, to use David Thomson’s words. Spirited Away is almost like a Pixar film in this respect, although infinitely weirder and more graceful: it’s packed with incident, action, and spectacular images. Totoro, by contrast, takes its time. It contains only the tiniest sliver of plot or conflict. For most of the film, its magical creatures are offstage: Totoro himself appears for only a few minutes, and most of the movie is devoted to an idyllic but comparatively realistic depiction of the lives of two little girls. And yet the entire movie is riveting and magical. I can understand how Spirited Away works, but Totoro is beyond words. Ponyo lacks Totoro’s clean lines, but it, too, is full of gorgeous moments that are impossible to explain but indisputably right.

And the childlike perspective here is crucial, because it allows the film to slow down and take in the world with the eyes of a child to whom everything is interesting. What impresses me the most about Miyazaki these days aren’t his flights of fancy but his attention to the small details of everyday life. In Totoro, he notices how an old door or window sticks slightly before you open it for the first time, or how a girl of ten sleeps more or less like an adult while a girl of four sleeps in a tangle of blankets. Ponyo, in turn, mines poetry out of making ramen or starting a generator after a storm. That kind of perspective, when channeled through years of artistic experience, is truly precious, and I watch Miyazaki’s films again and again just for the chance to relive those moments. The craft on display here isn’t the kind that can be easily taught: it requires a good eye and steady hand as well as a generous heart. It can’t be reduced to a set of rules. But if it could, it wouldn’t be magical, would it?

Written by nevalalee

June 21, 2012 at 10:02 am

So what happened to John Carter?


In recent years, the fawning New Yorker profile has become the Hollywood equivalent of the Sports Illustrated cover—a harbinger of bad times to come. It isn’t hard to figure out why: both are awarded to subjects who have just reached the top of their game, which often foreshadows a humbling crash. Tony Gilroy was awarded a profile after the success of Michael Clayton, only to follow it up with the underwhelming Duplicity. For Steve Carell, it was Dinner for Schmucks. For Anna Faris, it was What’s Your Number? And for John Lasseter, revealingly, it was Cars 2. The latest casualty is Andrew Stanton, whose profile, which I discussed in detail last year, now seems laden with irony, as well as an optimism that reads in retrospect as whistling in the dark. “Among all the top talent here,” a Pixar executive is quoted as saying, “Andrew is the one who has a genius for story structure.” And whatever redeeming qualities John Carter may have, story structure isn’t one of them. (The fact that Stanton claims to have closely studied the truly awful screenplay for Ryan’s Daughter now feels like an early warning sign.)

If nothing else, the making of John Carter will provide ample material for a great case study, hopefully along the lines of Julie Salamon’s classic The Devil’s Candy. There are really two failures here, one of marketing, another of storytelling, and even the story behind the film’s teaser trailer is fascinating. According to Vulture’s Claude Brodesser-Akner, a series of lost battles and miscommunications led to the release of a few enigmatic images devoid of action and scored, in the manner of an Internet fan video, with Peter Gabriel’s dark cover of “My Body is a Cage.” And while there’s more to the story than this—I actually found the trailer quite evocative, and negative responses to early marketing materials certainly didn’t hurt Avatar—it’s clear that this was one of the most poorly marketed tentpole movies in a long time. It began with the inexplicable decision to change the title from John Carter of Mars, on the assumption that women are turned off by science fiction, while making no attempt to lure in female viewers with the movie’s love story or central heroine, or even to explain who John Carter is. This is what happens when a four-quadrant marketing campaign goes wrong: when you try to please everybody, you please no one.

And the same holds true of the movie itself. While the story itself is fairly clear, and Stanton and his writers keep us reasonably grounded in the planet’s complex mythology, we’re never given any reason to care. Attempts to engage us with the central characters fall curiously flat: to convey that Princess Dejah is smart and resourceful, for example, the film shows her inventing the Barsoomian equivalent of nuclear power, evidently in her spare time. John Carter himself is a cipher. And while some of these problems might have been solved by miraculous casting, the blame lands squarely on Stanton’s shoulders. Stanton clearly loves John Carter, but forgets to persuade us to love him as well. What John Carter needed, more than anything else, was a dose of the rather stark detachment that I saw in Mission: Impossible—Ghost Protocol, as directed by Stanton’s former Pixar colleague Brad Bird. Bird clearly had no personal investment in the franchise, except to make the best movie he possibly could. John Carter, by contrast, falls apart on its director’s passion and good intentions, as well as a creative philosophy that evidently works in animation, but not live action. As Stanton says of Pixar:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

Which only makes us wonder what might have happened if John Carter had been granted a fourth year.

Stanton should take heart, however. If there’s one movie that John Carter calls to mind, it’s Dune, another financial and critical catastrophe that was doomed—as much as I love it—by fidelity to its source material. (In fact, if you take Roger Ebert’s original review of Dune, which came out in 1985, and replace the relevant proper names, you end up with something remarkably close to a review of John Carter: “Actors stand around in ridiculous costumes, mouthing dialogue with little or no context.”) Yet its director not only recovered, but followed it up with my favorite movie ever made in America. Failure, if it results in another chance, can be the opposite of the New Yorker curse. And while Stanton may not be David Lynch, he’s not without talent: the movie’s design is often impressive, especially its alien effects, and it displays occasional flashes of wit and humor that remind us of what Stanton can do. John Carter may go on record as the most expensive learning experience in history, and while this may be cold comfort to Disney shareholders, it’s not bad for the rest of us, as long as Stanton gets his second chance. Hopefully far away from the New Yorker.

Written by nevalalee

March 15, 2012 at 10:31 am

Thinking in groups, thinking alone


Where do good ideas come from? A recent issue of the New Yorker offers up a few answers, in a fascinating article on the science of groupthink by Jonah Lehrer, who debunks some widely cherished notions about creative collaboration. Lehrer suggests that brainstorming—narrowly defined as a group activity in which a roomful of people generates as many ideas as possible without pausing to evaluate or criticize—is essentially useless, or at least less effective than spirited group debate or working alone. The best kind of collaboration, he says, occurs when people from diverse backgrounds are thrown together in an environment where they can argue, share ideas, or simply meet by chance, and he backs this up with an impressive array of data, ranging from studies of the genesis of Broadway musicals to the legendary Building 20 at MIT, where individuals as different as Amar Bose and Noam Chomsky thrived in an environment in which the walls between disciplines could literally be torn down.

What I love about Lehrer’s article is that its vision of productive group thinking isn’t that far removed from my sense of what writers and other creative artists need to do on their own. The notion of subjecting the ideas generated in brainstorming sessions to a rigorous winnowing process has close parallels to Dean Simonton’s Darwinian model of creativity: quality, he notes, is a probabilistic function of quantity, so the more ideas you have, the better—but only if they’re subjected to the discipline of natural selection. This selection can occur in the writer’s mind, in a group, or in the larger marketplace, but the crucial thing is that it take place at all. Free association or productivity isn’t enough without that extra step of revision, or rendering, which in most cases requires a strong external point of view. Hence the importance of outside readers and editors to every writer, no matter how successful.

The premise that creativity flowers most readily from interactions between people from different backgrounds has parallels in one’s inner life as well. In The Act of Creation, Arthur Koestler concludes that bisociation, or the intersection of two unrelated areas of knowledge in unexpected ways, is the ultimate source of creativity. On the highest plane, the most profound innovations in science and the arts often occur when an individual of genius changes fields. On a more personal level, nearly every good story idea I’ve ever had came from the juxtaposition of two previously unrelated concepts, either done on purpose—as in my focused daydreaming with science magazines, which led to stories like “Kawataro,” “The Boneless One,” and “Ernesto”—or by accident. Even accidents, however, can benefit from careful planning, as in the design of the Pixar campus, as conceived by Steve Jobs, in which members of different departments have no choice but to cross paths on their way to the bathroom or cafeteria.

Every creative artist needs to find ways of maximizing this sort of serendipity in his or her own life. My favorite personal example is my own home library: partially out of laziness, my bookshelves have always been a wild jumble of volumes in no particular order, an arrangement that sometimes makes it hard to find a specific book when I need it, but also leads to serendipitous arrangements of ideas. I’ll often be looking for one book when another catches my eye, even if I haven’t read it in years, which takes me, in turn, in unexpected directions. Even more relevant to Lehrer’s article is the importance of talking to people from different fields: writers benefit enormously from working around people who aren’t writers, which is why college tends to be a more creatively fertile period than graduate school. “It is the human friction,” Lehrer concludes, “that makes the sparks.” And we should all arrange our lives accordingly.

Written by nevalalee

February 1, 2012 at 10:26 am

Ghost in the machine


I’ve always had a soft spot for the Mission: Impossible franchise, which feels less like a coherent series of feature films than a sandbox for a succession of gifted directors to play with the idea of the spy movie itself. Aside from the title and Lalo Schifrin’s indispensable theme, the movies have little in common with the show of the same name, but these elements, along with a star who seems admirably willing to try variations on his screen persona, have allowed for a wide range of approaches, from impersonal puzzle box to fiery action extravaganza to TV-inspired ensemble piece. And while Brad Bird’s Mission: Impossible—Ghost Protocol has the least personal stamp of any movie in the series, it’s perhaps the most culturally significant: besides being something of a personal triumph for Tom Cruise, it’s the opening salvo from a generation of Pixar directors who seem destined to shake up the world of live-action film.

To take the most obvious example: the massively hyped action scene at the Burj Khalifa isn’t merely as good as they say, it’s the best use of IMAX I’ve ever seen. As far as I’m concerned, it definitively establishes the supremacy of IMAX over 3D as a medium for generating thrills: the entire sequence, with its crystalline cinematography and breathtaking stunts, is as close to an out-of-body experience as I’ve had at the movies. Like Christopher Nolan, Bird knows how to ground sensational action in what feels like reality—there are only a handful of obvious special effects shots in the entire film. And throughout, he shows a preternatural gift for staging and executing the best kind of action scene: one conceived at the script and storyboard stage, with cleanly defined beats and a real beginning, middle, and end, rather than a Michael Bay-style nightmare of second-unit footage assembled after the fact in the editing room. (In recent years, only the Guggenheim shootout in The International comes close to what Bird offers here in terms of inventiveness and excitement.)

If Ghost Protocol has a flaw, it’s that it never manages to come up with an overarching narrative of the same fluency as its individual parts. It’s true that story has never been this franchise’s strong point—the first installment, in particular, plays like an attempt to spin a feature film from the most gossamer of plot threads. But I’ve always thought that the script for Mission: Impossible II, still my favorite, was surprisingly engaging and self-aware, with a central love triangle profitably copied from Notorious and a lot of witty details. Mission: Impossible III, in turn, was a calculated attempt to humanize the franchise, as well as the only time that J.J. Abrams, as a feature director or producer, has bothered to deliver on the twists that he constantly promises. Ghost Protocol has a lot of cute touches, but it lacks that kind of surprise, and the basic elements have been even more casually assembled than usual, with a vaguely deployed threat of nuclear annihilation and an off-the-shelf bad guy. (The absence of a great villain from Bird, who gave us the hateful Syndrome in The Incredibles, is perhaps the film’s only real disappointment.)

In the end, then, Ghost Protocol comes off as the world’s greatest demo reel, a chance for Bird to demonstrate that he has the willingness and technical ability to do almost anything, as if the real drama here were being played out in the context of the director’s résumé. As I watched it, my mind was curiously divided: while my lower brain was tingling with adrenaline, my higher functions remained relatively detached. For all the film’s excitement, its sense of risk is more visceral than narrative: despite an appealing cast—and this is by far the best team that Ethan Hunt has ever had—the movie never creates any real sense of danger to the characters themselves. Still, it’s a movie that I’d happily see again and again, and I doubt that many viewers will complain. As Walter Kerr might have said, this is a machine for exciting the audience, a watch that thrills. And it makes me all the more curious to see the next movie from Brad Bird, who emerges here as a director of great skill and assurance. Once he gets a real story, he’ll be unstoppable.

Written by nevalalee

December 19, 2011 at 10:42 am

Andrew Stanton and the world beyond Pixar


Art is messy, art is chaos—so you need a system.

—Andrew Stanton, to the New Yorker

For the second time in less than six months, the New Yorker takes on the curious case of Pixar, and this time around, the results are much more satisfying. In May, the magazine offered up a profile of John Lasseter that was close to a total failure, since critic Anthony Lane, with his customary air of disdain, seemed unprepared to draw any useful conclusions about a studio that, at least up to that point, had gotten just about everything blessedly right. This week’s piece by Tad Friend is far superior, focusing on the relatively unsung talents of Andrew Stanton, director of Finding Nemo and Wall-E. And while the publication of a fawning New Yorker profile of a hot creative talent rarely bodes well for his or her next project—as witness the recent articles on Tony Gilroy, Steve Carell, Anna Faris, or even Lasseter himself, whose profile appeared only shortly before the release of the underwhelming Cars 2—I’m still excited by Stanton’s next project, the Edgar Rice Burroughs epic John Carter, which will serve as a crucial test of whether Pixar’s magic can extend to the world beyond animation.

Stanton’s case is particularly interesting because of the role he plays at the studio: to hear the article tell it, he’s Pixar’s resident storyteller. “Among all the top talent here,” says Jim Morris, the head of Pixar’s daily operations, “Andrew is the one who has a genius for story structure.” And what makes this all the more remarkable is the fact that Stanton seems to have essentially willed this talent into existence. Stanton was trained as an animator, and began, like most of his colleagues, by focusing on the visual side. As the script for Toy Story was being developed, however, he decided that his future would lie in narrative, and quietly began to train himself in the writer’s craft, reading classic screenplays—including, for some reason, the truly awful script for Ryan’s Daughter—and such texts as Lajos Egri’s The Art of Dramatic Writing. In the end, he was generally acknowledged as the senior writer at Pixar, which, given the caliber of talent involved, must be a heady position indeed.

And while the article is littered with Stanton’s aphorisms on storytelling—“Inevitable but not predictable,” “Conflict + contradiction,” “Do the opposite”—his main virtue as a writer seems to lie in the most universal rule of all: “Be wrong fast.” More than anything else, Stanton’s success so far has been predicated on an admirable willingness to throw things out and start again. He spent years, for instance, working on a second act for Wall-E that was finally junked completely, and while I’m not sure he ever quite cracked the plot for that movie—which I don’t think lives up to the promise of its first twenty minutes—there’s no question that his ruthlessness with structure did wonders for Finding Nemo, which was radically rethought and reconceived several times over the course of production. Pixar, like the rest of us, is making things up as it goes along, but is set apart by its refusal to let well enough alone. As Stanton concludes:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

The real question, of course, is whether this approach to storytelling, with its necessary false starts and extensive rendering time, can survive the transition to live action, in which the use of real actors and sets makes retakes—and thus revision—drastically more expensive. So far, it sounds like John Carter is doing fine, at least judging from the trailer and early audience response, which has reportedly been encouraging. And more rides on this movie’s success or failure than the fate of one particular franchise. Pixar’s story has been extraordinary, but its most lasting legacy may turn out to be the migration of its talent beyond the safety zone of animation—assuming, of course, that their kung fu can survive. With Brad Bird’s Mission: Impossible—Ghost Protocol and John Carter in the wings, we’re about to discover if the directors who changed animation at Pixar can do the same in live action. The New Yorker article is fine, but it buries the lede: Stanton and Bird are the first of many. And if their next movies are half as entertaining as the ones they’ve made so far, we’re looking at an earthquake in the world of pop culture.

Blinn’s Law and the paradox of efficiency


As technology advances, rendering time remains constant.
—Blinn’s Law

Why isn’t writing easier? Looking at the resources that contemporary authors have at their disposal, it’s easy to conclude that we should all be perfect writing machines: word processing software has made the physical process of writing more streamlined than ever, Google and Amazon have given us access to a world of information that would have been inconceivable even fifteen years ago, and research, editing, and revision have been made immeasurably more efficient. And yet writing itself doesn’t seem all that much easier than before. The amount of time a professional novelist needs to spend writing each day—let’s say three or four hours—hasn’t decreased since Trollope, and for most of us, it still takes a year or two to write a decent novel.

So what happened? In some ways, it’s an example of the paradox of labor-saving devices: instead of having more leisure time, we create more work for ourselves. It also reflects the fact that the real work of writing a novel is rarely connected to how fast you can type. But I prefer to think of it as a variation on Blinn’s Law. As graphics pioneer James Blinn first pointed out, in animation, rendering time remains constant, even as computers get faster. An artist gets accustomed to waiting a certain number of hours for an image to render, so as hardware improves, instead of using it to save time, he employs it to render more complex graphics. This is why rendering time at Pixar has remained essentially constant over the past fifteen years. (Although the difference between Toy Story and Cars 2 is a reminder that rendering isn’t everything.)

Similarly, whatever time I save by writing on a laptop rather than a manual typewriter is canceled out by the hours I spend making additional small changes and edits along the way. The Icon Thief probably went through eighteen substantial drafts before the final version was delivered to my publisher, an amount of revision and rewriting that would have been unthinkable without Word. Is the novel better as a result? On a purely technical level, yes. Is the underlying story more interesting than if I’d written it by hand? Probably not. Blinn’s Law tells us that the leaves and grass in the background of a shot will look increasingly great, but it says nothing about the quality of storytelling. Which seems to imply that the countless tiny changes that a writer like me makes to each draft are only a waste of effort.

And yet here’s the thing: I still needed all that time. No matter how efficient the physical side of the process becomes, it’s still desirable for a writer to live with a novel, or a studio to live with a movie, for at least a year or so. For most of us, there seems to be a fixed gestation period for decent art, a minimum amount of time that a story needs to simmer and evolve. The endless small revisions aren’t the point: the point is that while you’re shifting a paragraph here or there, the story is growing in your head in unexpected ways. Even as you fiddle with the punctuation, seismic changes are taking place. But for this to happen, you need to be at your desk for a certain number of hours. So what do we do in the meantime? We do what Pixar does: we render. That’s the wonderful paradox of efficiency: it allows us, as artists, to get the inefficiency we sometimes need.

Written by nevalalee

August 9, 2011 at 10:43 am

So what happened to Cars 2?


On Saturday afternoon, at my insistence, my wife and I ended up in a theater full of excited kids and obliging parents at a screening of Cars 2, which had already received the worst reviews in the history of Pixar. Rather to my surprise, my wife enjoyed it more than I did, and the kids seemed okay with it as well (aside from the one who kicked my chair repeatedly throughout most of the last twenty minutes). Yet the film itself is undeniably underwhelming: a bright, shiny mediocrity. Cars 2 isn’t a bad movie, exactly—it’s watchable and reasonably fun—but it’s a disappointment, not only in comparison to Pixar’s past heights but also to a strong recent showing from DreamWorks, which includes How to Train Your Dragon and the sublime Kung Fu Panda series. And while Pixar can take comfort in good box office and decent audience reactions, I hope that the negative critical response inspires some introspection at the studio as to how things went wrong.

It’s important to note that it wouldn’t have taken a miracle to make Cars 2 a better movie. While the original Cars struck me as somewhat misconceived, the basic elements of the sequel are all sound: the shift in tone from nostalgic Americana to international thriller is a masterstroke, and the underlying story and premise are fine, although never particularly involving. The trouble is that the script, by writer Ben Queen, never really sparkles, at least not by the standards we’ve come to expect: there are some laughs, but only a few hit home, and the movie seems content to coast on the level of cleverness rather than taking the leap to really inspired comedy or action. Cars 2 is constantly on the verge of breaking through to something more engaging, but never quite makes it, when I suspect that another pass on the screenplay, and some honest notes, would have made all the difference.

This brings us to the second big problem: it’s hard to give notes to the man who founded the studio. John Lasseter is undeniably a genius—he’s the rare example of a great creative artist who has also demonstrated a willingness to tackle the practical problems of building a major corporation—but it was probably too much to ask one man to oversee Pixar, Disney animation, and a movie of his own. A recent New York Magazine profile makes it clear that the process left Lasseter pressed for time, which would have made it hard for him to address his own movie’s more glaring flaws. Even more importantly, it seems likely that his status as a Pixar legend and founding father prevented him from receiving the feedback he needed. Just a glance at the history of movies reminds us that the heads of studios can make remarkable producers—just look at David O. Selznick—but that even the greatest directors can’t operate entirely without accountability.

I’ve talked about Pixar’s singular culture before, in a much more comprehensive post, so I won’t repeat the same points here. But it seems clear that Pixar’s previous excellence was due to a process that allowed its central brain trust to mercilessly criticize and improve the studio’s works in progress. For Cars 2, this process seems to have broken down, partly because of Lasseter’s deserved stature, and also because of his personal attachment to the Cars franchise. (Pixar has famously canceled other projects, such as Newt, deep into the planning stages because of quality concerns, something it’s hard to imagine happening to Cars 2.) Judging from the outcome, Lasseter needs to return to what he does better than anyone else alive: overseeing the work of the world’s greatest animation studio. If not, he will end up with a legacy more like that of George Lucas than Walt Disney. And that would be a shame.

Written by nevalalee

June 27, 2011 at 10:05 am

Posted in Movies


The Pixar problem


A week ago, in my appreciation of Hayao Miyazaki, I wrote the following about Pixar:

Pixar has had an amazing run, but it’s a singularly corporate excellence. The craft, humor, and love of storytelling that we see in the best Pixar movies feels learned, rather than intuitive; it’s the work of a Silicon Valley company teaching itself to be compassionate.

Which I still believe is true. But the more I think about this statement, the more I realize that it raises as many questions as it answers. Yes, Pixar’s excellence is a corporate one—but why does it strive to be compassionate and creative, when so many other studios seem ready to settle for less? Faced with Pixar’s historic run of eleven quality blockbusters in fifteen years, it’s easy to fall into the trap of saying that Pixar’s culture is simply different from that of other studios, or that it has a special, mysterious genius for storytelling, which, again, simply avoids the question. So what is it, really, that sets Pixar apart?

It’s tempting to reduce it to a numbers game. Pixar releases, at most, one movie per year, while the other major studios release dozens. This means that Pixar can devote all of its considerable resources to a single flagship project, rather than spreading them across a larger slate of films. If every studio released only one picture a year, it’s nice to think that, instead of a hundred mostly forgettable movies, we’d get a handful of big, ambitious films like Inception, or even Avatar. Of course, we might also end up with a dozen variations on Transformers: Revenge of the Fallen. So I suspect that there’s something else going on here that can’t be explained by the numbers alone.

And as much as I hate to say it, Pixar’s special quality does, in fact, seem to boil down to a question of culture. So where does culture come from? Two places. The first, more accidental source is history: studios, like artists, tend to be subconsciously defined by their first successful works. In Pixar’s case, it was Toy Story; for DreamWorks, it was Shrek. And the contrast between these two films goes a long way toward accounting for the differences between their respective studios. Because its first movie was a classic, Pixar was encouraged to aim high, especially once it saw how audiences responded. If the first Pixar movie had been, say, Cars, I don’t think we’d be having this conversation.

The second factor is even more important. For reasons of luck, timing, and corporate politics, the creative side of Pixar is essentially run by John Lasseter, a director of genius. And his genius is less important than the fact that he’s a director at all. Most studios are run by men and women who have never directed a movie or written a screenplay, and as talented as some of these executives may be, there’s a world of difference between receiving notes from a Wharton MBA and from the man who directed Toy Story. The result, at best, is a climate where criticism is seen as a chance to make a movie better, rather than as interference from above. As a recent Wired article on Pixar pointed out:

The upper echelons also subject themselves to megadoses of healthy criticism. Every few months, the director of each Pixar film meets with the brain trust, a group of senior creative staff. The purpose of the meeting is to offer comments on the work in progress, and that can lead to some major revisions. “It’s important that nobody gets mad at you for screwing up,” says Lee Unkrich, director of Toy Story 3. “We know screwups are an essential part of making something good. That’s why our goal is to screw up as fast as possible.” [Italics mine.]

In other words, it isn’t true that Pixar has never made a bad movie: it makes bad movies—or parts of movies—all the time. The difference is that the bad movies are reworked until they get better, which isn’t the case at most other studios. (And at Pixar, if they still aren’t any good, they get canceled.) And because the cultural factors that made this climate possible are as much the result of timing and luck as intentional planning, the situation is more fragile than it seems. A real Pixar flop, with its ensuing loss of confidence, could change things overnight. Which is why, in the end, what I said of Miyazaki is also true of Pixar: if it goes away, we may never see anything like it again.

Written by nevalalee

January 14, 2011 at 12:02 pm

The best movies of the year


First, the bad news. This was a terrible year for movies. Some combination of recessionary cutbacks, the delayed effects of the writers’ strike, and a determination to convert every imaginable movie to muddy 3D resulted in stretches of up to two or three months when multiplexes were basically a wasteland. And even if this cinematic dead zone turns out to be temporary, it’s hard not to see it as karmic comeuppance for the Academy’s recent decision to bump the number of Best Picture nominees to ten, an act of desperation that is looking more misguided with every passing day. Still, there were some very good movies released this year, including one that ranks among the best I’ve ever seen. It’s almost enough to make me think that this year was better than it actually was:

1. Inception. After a decade of extraordinary productivity, Christopher Nolan is beginning to look like nothing so much as two great directors working as one: the first is obsessed with pushing the bounds of filmic complexity on the narrative level, while the other has devoted himself to mastering every aspect of modern blockbuster filmmaking. Inception is the ultimate result of this paradoxical partnership: it’s one of those rare movies in which every aspect of the production—acting, story, visual effects, art direction, stunts, music, editing, even costume design—is both immediately exhilarating and endless to meditation. I only wish there were more of it.

2. Toy Story 3. I was hard on this movie yesterday, so let’s set the record straight: this is the best Pixar film since Finding Nemo, and one of the finest animated movies ever made. It’s touching, exciting, thematically rich, and very funny, with an enormous cast of characters—both existing and new—who are so engaging that I’m sad we won’t have a chance to see them in other stories. (Fanfic, as usual, is ready to come to the rescue.) It’s enough to make me wish that I were ten years younger, just so I could have grown up with these toys—and movies—on my playroom shelves.

3. The Social Network. Over the past few years, David Fincher has gone from being a stylish but chilly visual perfectionist to a director who can seemingly do anything. Zodiac was the best movie ever made about serial killers and journalism, as well as the best Bay Area picture since Vertigo; The Social Network, in turn, is the best Harvard movie of all time, as well as a layered, trashy story of money and friendship, with an Aaron Sorkin script that manages to evoke both John Hughes and Citizen Kane. It’s almost enough to make me excited about The Girl With the Dragon Tattoo.

4. Exit Through the Gift Shop. Even more than Inception, this was the best film of the year for inspiring endless heated debate. Months later, I’m still not sure what to think about the strange case of Banksy and Mr. Brainwash, which is some combination of cautionary tale, Horatio Alger story, fascinating reportage, and practical joke. I do know that it’s impossible to watch it without questioning your deepest assumptions about art, commerce, and the nature of documentary filmmaking. And even if it’s something of a put-on, which I think at least part of it is, it’s still the best movie of its kind since F for Fake.

5. The Ghost Writer. Roman Polanski’s modest but wickedly sophisticated thriller is a reminder that a movie doesn’t need to be big to be memorable. The ingredients couldn’t be simpler: a tight story, an impeccable cast (aside from Kim Cattrall’s distractingly plummy British accent), and an isolated house on the beach. The result is one of the great places in the movies, as real as Hannibal Lecter’s cell or the detective’s office in The Usual Suspects. By the end, we feel as if we could find our way around this house on our own, and the people inside it—especially the devastating Olivia Williams—have taken up residence in our dreams.

6. Fair Game. Aside from a pair of appealingly nuanced performances by Naomi Watts (as Valerie Plame) and Sean Penn (as Joseph Wilson), Fair Game doesn’t even try to be balanced: it’s a story of complex good against incredible, mustache-twirling evil, which would be objectionable from a narrative perspective if it weren’t so close to the truth. At its best, it’s reminiscent of The Insider, both in its sense of outrage and in the massive technical skill that it lavishes on intimate spaces. It’s impossible to watch it without being swept up again by renewed indignation.

7. The Town. True, it’s slightly confused about its main character, who comes off as more of a sociopath than the film wants to admit, and I have problems with the last ten minutes, in which Ben Affleck, as both director and star, slips from an admirable objectivity into a strange sort of self-regard. Still, for most of its length, this is a terrific movie, with one of the best supporting casts in years—notably Jeremy Renner, Rebecca Hall, Jon Hamm, and the late Pete Postlethwaite. The result is a genre piece that is both surprisingly layered and hugely entertaining, with a fine sense of Boston atmosphere.

8. The Secret in Their Eyes. Technically, this Argentine movie—which won the Academy Award for Best Foreign Language Film—came out last year, but I’d feel irresponsible if I didn’t include it here. Like The Lives of Others, which it superficially resembles, it’s one of those foreign films, aware of but unimpressed by the conventions of Hollywood, that seems so rich and full of life that it passes beyond genre: it’s funny, romantic, and unbearably tense, and contains one of the most virtuoso action sequences this side of Children of Men. I don’t know what to call it, but I love it.

9. Scott Pilgrim vs. The World. A week doesn’t go by in which I don’t think fondly of Knives Chau, Scott Pilgrim’s hapless but unexpectedly resourceful Chinese-Canadian love interest. The film in which Knives finds herself is equally adorable: it has enough wit and invention for three ordinary movies, and it’s one of the few comedies of recent years that knows what to do with Michael Cera. It’s something of a mess, and its eagerness to please can be exhausting, but it still contains more delights per reel than any number of tidier films.

10. The American. Despite opening at the top of the box office over Labor Day weekend, this odd, nearly perfect little movie was mostly hated or dismissed by audiences soon after its release. The crucial thing is to adjust your expectations: despite what the commercials say, this isn’t a thriller so much as a loving portrait of a craftsman—in this case, an assassin—at work, as well as a visual essay on such important subjects as the Italian countryside, a woman’s naked body, and George Clooney’s face. It’s perilously close to ridiculous, but until its ludicrous final shot, it casts its own kind of peculiar spell.

Honorable mention goes to Winter’s Bone, A Prophet, Tangled, and How to Train Your Dragon, as well as to parts of The Kids Are All Right, The King’s Speech, and even Black Swan, which really deserves a category of its own. (As for Tron: Legacy, well, the less said about that, the better.)

Hayao Miyazaki and the future of animation

with 6 comments

Yesterday was the seventieth birthday of Japanese filmmaker Hayao Miyazaki, the director of Spirited Away, which makes this as appropriate a time as any to ask whether Miyazaki might be, in fact, the greatest living director in any medium. He certainly presents a strong case. My own short list, based solely on ongoing quality of output rather than the strength of past successes, includes Miyazaki, Martin Scorsese, Wong Kar-Wai, and Errol Morris, but after some disappointing recent work by the last three, Miyazaki is the only one who seems incapable of delivering anything less than a masterpiece. And he's also going to be the hardest to replace.

Why is that? Trying to pin down what makes Miyazaki so special is hard for the same reason that it’s challenging to analyze any great work of children’s fiction: it takes the fun out of it. I’m superstitiously opposed to trying to figure out how the Alice books work, for example, in a way that I’m not for Joyce or Nabokov. Similarly, the prospect of taking apart a Miyazaki movie makes me worry that I’ll come off as a spoilsport—or, worse, that the magic will somehow disappear. That’s one reason why I ration out my viewings of Ponyo, one of the most magical movies ever made, so carefully. And it’s why I’m going to tread cautiously here. But it’s still possible to hint at some of the qualities that set Miyazaki apart from even the greatest animators.

The difference, and I apologize in advance for my evasiveness, comes down to a quality of spirit. Miyazaki is as technically skilled as any animator in history, of course, but his craft would mean little without his compassion, and what I might also call his eccentricity. Miyazaki has a highly personal attachment to the Japanese countryside—the depiction of the satoyama is much of what makes My Neighbor Totoro so charming—as well as to the inner lives of small children, especially girls. He knows how children think, look, and behave, and that knowledge shapes both his characters and the movies that surround them. His films can seem as capricious and odd as the stories that very young children tell themselves, so that Spirited Away feels both beguilingly strange and like a story that you've always known and only recently rediscovered.

Which is why Miyazaki is greater than Pixar. Don't get me wrong: Pixar has had an amazing run, but it's a singularly corporate excellence. The craft, humor, and love of storytelling that we see in the best Pixar movies feel learned, rather than intuitive; it's the work of a Silicon Valley company teaching itself to be compassionate. Even the interest in children, which is very real, seems like it has been deliberately cultivated. Pixar, I suspect, is run by men who love animation for its own sake, and who care about children only incidentally, which was also true of Walt Disney himself. (If they could make animated movies solely for adults, I think they would, as the career trajectory of Brad Bird seems to indicate. If nothing else, it would make it easier for them to win an Oscar for Best Picture.)

By contrast, the best Miyazaki movies, like the Alice books, are made for children without a hint of condescension, or any sense that children are anything but the best audience in the world. And as traditional animation is replaced by monsters of CGI that can cost $200 million or more, I’m afraid that this quality will grow increasingly rare. We’ve already seen a loss of personality that can’t be recovered: it’s impossible to be entirely original, not to mention eccentric, with so much money on the line. The result, at best, is a technically marvelous movie that seems to have been crafted by committee, even if it’s a committee of geniuses. Toy Story 3 is a masterpiece, and not good enough.

Miyazaki is seventy now, and judging from Ponyo, he’s still at the top of his game. I hope he keeps making movies for a long time to come. Because it’s unclear if the world of animation, as it currently exists, will ever produce anyone quite like him again.

The Legend of Miyamoto

leave a comment »

For reasons known only to itself, The New Yorker has evidently decided that the best way to write about video games is to assign these stories to writers who emphatically have no gaming experience. This approach, which wouldn’t be tolerated for any other art form, high or low, has already resulted in this notorious article by Nicholson Baker—one of my favorite living writers, but clearly unequipped to say anything interesting about Red Dead Redemption. And now we have Nick Paumgarten’s disappointing profile of Shigeru Miyamoto, which is a huge missed opportunity, in more ways than one.

Miyamoto, the creator of the Mario and Zelda franchises and the greatest video game designer of all time, has often been compared to Walt Disney, an accolade he shares with his fellow genius Hayao Miyazaki. (Miyamoto and Miyazaki also share a deep nostalgia for the forests and villages of rural Japan, an abiding affection that shows up throughout their work.) Miyamoto is an artist, a storyteller, an engineer, and a visionary, and he’s exactly the sort of creative force that the readers of The New Yorker ought to know more about. The fact that Paumgarten scored only a brief interview with Miyamoto, which he pads out to feature length with pages of unenlightening digressions, is only the most disappointing thing about the profile. A single glimpse of one of Miyamoto’s sketches for Zelda would be more interesting than anything on display here.

Still, there are a few moments worth mentioning. Here’s Miyamoto on calibrating the difficulty of a game, and how important it is to incorporate quiet moments alongside every challenge:

A lot of the so-called action games are not made that way…All the time, players are forced to do their utmost. If they are challenged to the limit, is it really fun for them?…[In Miyamoto's own games] you are constantly providing the players with a new challenge, but at the same time providing them with some stages or some occasions where they can simply, repeatedly, do something again and again. And that can be a joy.

This is especially good advice for writers in genres, such as suspense, that place a premium on intensity. A few strategically timed breaks in the action, which give the reader a moment of breathing room, can make the rest of the novel read much more quickly. The key, as Miyamoto knows, is putting yourself in the position of a person approaching a work of art for the first time:

I always remind myself, when it comes to a game I’m developing, that I’m the perfect, skillful player. I can manipulate all this controller stuff. So sometimes I ask the younger game creators to try playing the games they are making by switching their left and right hands. In that way, they can understand how inexperienced the first-timer is.

Similarly, once a writer has internalized the plot of a novel, it can be hard to see it with fresh eyes. One solution is to set the book aside for a month and read it again once the memory of the story has faded. Another approach, which I’ve done a few times, is to read a sequence of chapters in reverse, or at random, which often reveals problems or repetitions that I wouldn’t have noticed otherwise.
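
If your draft happens to live in one file per chapter, this kind of out-of-order reread is easy to automate. The following is only a minimal sketch, not anything I actually use: the "draft" folder and the chapter_01.txt naming scheme are assumptions for the sake of illustration. It simply prints the chapters back in reverse order, so you can read the book against the grain.

    from pathlib import Path

    # Hypothetical setup: one plain-text file per chapter in a "draft" folder,
    # named so that alphabetical order matches chapter order (chapter_01.txt, ...).
    chapters = sorted(Path("draft").glob("chapter_*.txt"))

    # Reread the draft back to front; swap in random.shuffle(chapters) for a random pass.
    for chapter in reversed(chapters):
        print(f"--- {chapter.name} ---")
        print(chapter.read_text(encoding="utf-8"))

Reading the output from top to bottom then gives you the last chapter first, which is the whole point: the seams and repetitions show up precisely because the story no longer carries you along in order.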

Finally, here’s Paumgarten on one of my favorite topics, the importance of constraints as a creative tool:

Mario, [Miyamoto's] most famous creation, owes his appearance to the technological limitations of the first Donkey Kong game. The primitive graphics—there were hardly enough pixels to approximate a human form—compelled Miyamoto to give Mario white gloves and red overalls (so that you could see his arms swing), a big bushy mustache and a red hat (to hide the fact that engineers couldn’t yet do mouths or hair that moved), and a big head (to exaggerate his collisions). Form has always followed functionality. The problem now, if you want to call it one, is the degree of functionality. [Italics mine.]

This is a nice, crucial point. And it applies to more than video games. The limitations that made Mario so distinctive are the same ones that led to the look of Mickey Mouse, among so many other stars of early animation. One problem with the recent availability of beautifully rendered computer graphics is that character design is becoming a lost art. Even the best recent Pixar, Disney, and DreamWorks films have suffered from this: they can render every hair on a character’s head, but can’t make the character itself a memorable one. (Kung Fu Panda may be the last computer-animated movie with really distinctive character designs.)

So are video games art? Paumgarten glances at the subject only briefly, but with all due respect to Roger Ebert, there’s no doubt in my mind that the best video games are indeed art. At least, that’s the only explanation I have for something like Super Mario Galaxy, which is one of the few recent works, in any medium, that has filled me with something like my childhood envy for those who get to spend their lives telling stories. (The J.J. Abrams reboot of Star Trek is another.) Miyamoto’s great skill, as the article reminds us, is to bring us back to the best moments of our childhood. And while not all art needs to aspire to this, the world definitely needs art that does.

Tangled’s web

with 2 comments

The big news in pop culture this week, of course, is the unexpected resurgence, in the form of Tangled, of the classic Walt Disney brand. As many critics have noted, Tangled is the closest thing to the full Disney package—fairy tale setting, beautiful princess, dashing hero, amusing animal sidekicks, Alan Menken—that we’ve had in at least fifteen years.  The result, while sentimental, undeniably works. Watching Tangled, I felt something like what Pauline Kael described when reviewing a very different movie: “The pieces of the story fit together so beautifully that eventually the director has you wrapped up in the foolishness. By the end, all the large, sappy, satisfying emotions get to you.”

So what are the lessons for writers? It’s easy, and definitely accurate, to credit John Lasseter, the Pixar genius who shepherded Tangled throughout its entire production, with much of the movie’s success. But it’s also worth spotlighting the contribution of animator Glen Keane, who would have directed Tangled had he not suffered a heart attack midway through production. Den of Geek has a really fine interview with Keane, which is worth reading in its entirety, but especially for this story, which describes something to which any writer can relate:

The amazing thing was that in May, this year, we were only at 40% finished in our animation. We had to have 60% of the movie by the middle of July. And it was impossible. And it was all of the most subtle, difficult, stuff.

I remember telling [the animators], “look, we have an impossible amount of work to do, none of you will be the animators you are now by the end of the film, you will have grown. You will have animated scenes that you can’t even imagine that you did. And I can’t tell you how you will do them. But you will do them, and there’s just something that is happening right now, and I call it collective learning.”

The history of animation, in general, is something that every writer should study, because it’s by far the best documented of any of the narrative arts. Because every stage in the animation process—initial sketches, concept art, storyboards, backgrounds—is fun to look at in itself (which isn’t true, for example, of most novelists’ first drafts), the creative process is exceptionally well chronicled. A book like Paper Dreams: The Art and Artists of Disney Storyboards is an inspiring read for any writer who needs a reminder of how tentative and exploratory the artistic process can be, especially in its earliest stages.

Animation is also worth studying because it tells stories simply and cleanly, as most writers should strive to do. It’s especially good at breaking stories down into their basic units, which, as I’ve noted already, is the first and most important rule of writing. Any writer, for example, would benefit from the sort of advice that Shamus Culhane gives in Animation: From Script to Screen:

One good method of developing a story is to make a list of details. For example [for a cartoon about elves as clock cleaners in a cathedral], what architectural features come to mind—steeples, bells, windows, gargoyles? What props would the elves use—brushes, pails, mops, sponges…what else? Keep on compiling lists without stopping to think about them. Let your mind flow effortlessly, and don’t try to be neat or orderly. Scribble as fast as you can until you run out of ideas.

Disney can be accused of tastelessness and commercialism, to put it mildly, but it’s also better than anyone I know (except, perhaps, the team of Michael Powell and Emeric Pressburger, about whom I’ll have much more to say later) at creating works of art that exemplify the most fundamental reasons we go to the movies, or seek out any kind of art at all. The success of Tangled is only the most recent reminder of how powerful those elements can be.

Written by nevalalee

December 1, 2010 at 11:45 am
