Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Dune’

The search for the zone


Note: This post discusses plot points from Twin Peaks.

Last night’s episode of Twin Peaks featured the surprise return of Bill Hastings, the high school principal in Buckhorn, South Dakota who is somehow connected to the headless body of Major Garland Briggs. We hadn’t seen Hastings, played by Matthew Lillard, since the season premiere, and his reappearance marked one of the first times that the show has gone back to revisit an earlier subplot. Hastings, we’re told, maintained a blog called The Search for the Zone, in which he chronicled his attempts to contact other planes of reality, and the site really exists, of course, in the obligatory manner of such online ephemera as Save Walter White and the defunct What Badgers Eat. It’s a marketing impulse that seems closer to Mark Frost than David Lynch—if either of them were even involved—and I normally wouldn’t even mention it at all. Along with its fake banner ads and retro graphics, however, the page includes a section titled “Heinlein Links,” with a picture of Robert A. Heinlein and a list of a few real sites, including my friends over at The Heinlein Society. As “Hastings” writes: “Science Fiction has been a source of enjoyment for me since I was ten years old, when I read Orphans of the Sky.” Frankly, this already feels like a dead end, and, like the references to L. Ron Hubbard and Jack Parsons in The Secret History of Twin Peaks, it recalls some of the show’s least intriguing byways. (Major Briggs and the villainous Windom Earle, you might recall, were involved in Project Blue Book, the study of unidentified flying objects conducted by the Air Force, but the thread didn’t really lead anywhere, except perhaps to set off a train of thought for Chris Carter.) I enjoyed last night’s episode, but it was the most routine installment of the season so far, and this attempt at virality might be the most conventional touch of all. 
But since this might represent the only time in which my love of Twin Peaks will overlap with my professional interests, I should probably dig into it.

Orphans of the Sky, which was originally published as the two novellas “Universe” and “Common Sense” in Astounding Science Fiction in 1941, is arguably the most famous treatment of one of the loveliest ideas in science fiction—the generation starship, a spacecraft designed to travel for centuries or millennia until it reaches its destination. (Extra points if the passengers forget that they’re on a spaceship at all.) It’s also one of the few stories by Heinlein that can be definitively traced back to an idea provided by the editor John W. Campbell. On September 20, 1940, Campbell wrote to Heinlein with a detailed outline of the premise:

Sometime along about 3763, an expedition is finally launched from Earth to outer space—and I mean outer space…[The ship is] five miles in diameter, intended for about two thousand inhabitants, and equipped with gardens, pasturage, etc., for animals. It’s a self-sustaining economy…They’re bound for Alpha Centauri at a gradually building speed…The instruments somehow develop a systematic error, due to imperfect compensation for the rotation; they miss Centauri, plunging past it too rapidly and too far away to make landing. A brief revolt leads to the death of the few men aboard fully competent to make the necessary changes of mechanism for changing course and backtracking to Centauri. The ship can only plunge on.

But the story would be laid somewhere about 1430 After the Beginning. The characters are the remote descendants of those who took off, centuries before, from Earth. And they’re savages. The High Chiefs are the priest-engineers, who handle the small amount of non-automatic machinery…There are princes and nobles—and dull peasants. There are monsters, too, who are usually killed at birth, since every woman giving birth is required to present her baby before an inspector. That’s because of mutations, some of which are unspeakably hideous. One of which might, however, be a superman, and the hero of the story.

If you’ve read “Universe,” you can see that Campbell laid out most of it here, and that Heinlein used nearly all of it, down to the smallest details, although he later played down the extent of Campbell’s influence. (Decades later, in the collection Expanded Universe, Heinlein flatly, and falsely, stated that the unrelated serial Sixth Column “was the only story of mine ever influenced to any marked degree by John W. Campbell, Jr.”) But the two men also chose to emphasize different aspects of the narrative, in ways that reflected their interests and personalities. Most of Campbell’s letter, when it wasn’t talking about the design of the spacecraft itself, was devoted to the idea of the “scientisthood,” or a religion founded on a misinterpretation of science:

They’ve lost science, save for the priest class, who study it as a religion, and horribly misunderstand it because they learn from books written by and for people who dwelt on a planet near a sun. Here, the laws of gravity are meaningless, astronomy senseless, most of science purely superstition from a forgotten time. Naturally, there was a religious schism, a building-up of a new bastard science-religion that based itself on a weird and unnatural blending of the basic laws of science and the basic facts of their own experience…Anything is possible, and might be darned interesting. Particularly the queer, fascinating system of science-religion and so forth they’d have to live by.

The idea of a religion based on a misreading of the textbook Basic Modern Physics is a cute inversion of one of Campbell’s favorite plot devices—a fake religion deliberately dreamed up by scientists, which we see in such stories as the aforementioned Sixth Column, Isaac Asimov’s “Bridle and Saddle,” and Fritz Leiber, Jr.’s Gather, Darkness. In “Universe,” Heinlein touches on this briefly, but he was far more interested in the jarring perceptual and conceptual shift that the premise implied, which tied back into his interest in Alfred Korzybski and General Semantics: how do you come to terms with the realization that the only world you’ve ever known is really a small part of an incomprehensibly vaster reality?

“Universe” is an acknowledged landmark of the genre, although its sequel, “Common Sense,” feels more like work for hire. It isn’t hard to relate the premise to Hastings, whose last blog post reads in part:

We will have to reconcile with the question that if someone from outside our familiar world gains access to our plane of existence, what ramifications will that entail? There might be forces at work from deep dimensional space, or from the future…or are these one in [sic] the same?

But I’d hesitate to take the Heinlein connection too far. Twin Peaks—and most of David Lynch’s other work—has always asked us to look past the surface of things to an underlying pattern that is stranger than we can imagine, but it has little in common with the kind of cold, slightly dogmatic rationalism that we tend to see in Campbell and early Heinlein. Both men, like Korzybski or even Ayn Rand, claimed that they were only trying to get readers to think for themselves, but in practice, they were markedly impatient of anyone who disagreed with their answers. Lynch and Mark Frost’s brand of transcendence is looser, more dreamlike, and more intuitive, and its insights are more likely to be triggered by a song, the taste of coffee, or a pair of red high heels than by logical analysis. (When the show tries to lay out the pieces in a more systematic fashion, as it did last night, it doesn’t always work.) But there’s something to be said for the idea that beyond our familiar world, there’s an objective reality that would be blindingly obvious if we only managed to see it. With all the pop cultural baggage carried by Twin Peaks, it’s easy to forget that it’s also from the director and star of Dune, which took the opposite approach, with a unified past and future visible to the superhuman Kwisatz Haderach. Yet Lynch’s own mystical inclinations are more modest and humane, and neither Heinlein nor Frank Herbert have much in common with the man whose favorite psychoactive substances have always been coffee and cigarettes. And I’d rather believe in a world in which the owls are not what they seem than one in which nothing at all is what it seems. But there’s one line from “Universe” that can serve as a quiet undertone to much of Lynch’s career, and I’d prefer to leave it there: “He knew, subconsciously, that, having seen the stars, he would never be happy again.”

The analytical laboratory


The Martian

Over the last few months, there’s been a surprising flurry of film and television activity involving the writers featured in my upcoming book Astounding. Syfy has announced plans to adapt Robert A. Heinlein’s Stranger in a Strange Land as a miniseries, with an imposing creative team that includes Hollywood power broker Scott Rudin and Zodiac screenwriter James Vanderbilt. Columbia is aiming to reboot Starship Troopers with producer Neal H. Moritz of The Fast and the Furious, prompting Paul Verhoeven, the director of the original, to comment: “Going back to the novel would fit very much in a Trump presidency.” The production company Legendary has bought the film and television rights to Dune, which first appeared as a serial edited by John W. Campbell in Analog. Meanwhile, Jonathan Nolan is apparently still attached to an adaptation of Isaac Asimov’s Foundation, although he seems rather busy at the moment. (L. Ron Hubbard remains relatively neglected, unless you want to count Leah Remini’s new show, which the Church of Scientology would probably hope you wouldn’t.) The fact that rights have been purchased and press releases issued doesn’t necessarily mean that anything will happen, of course, although the prospects for Stranger in a Strange Land seem strong. And while it’s possible that I’m simply paying more attention to these announcements now that I’m thinking about these writers all the time, I suspect that there’s something real going on.

So why the sudden surge of interest? The most likely, and also the most heartening, explanation is that we’re experiencing a revival of hard science fiction. Movies like Gravity, Interstellar, The Martian, and Arrival—which I haven’t seen yet—have demonstrated that there’s an audience for films that draw more inspiration from Clarke and Kubrick than from Star Wars. Westworld, whatever else you might think of it, has done much the same on television. And there’s no question that the environment for this kind of story is far more attractive now than it was even ten years ago. For my money, the most encouraging development is the movie Life, a horror thriller set on the International Space Station, which is scheduled to come out next summer. I’m tickled by it because, frankly, it doesn’t look like anything special: the trailer starts promisingly enough, but it ends by feeling very familiar. It might turn out to be better than it looks, but I almost hope that it doesn’t. The best sign that a genre is reaching maturity isn’t a series of singular achievements, but the appearance of works that are content to color inside the lines, consciously evoking the trappings of more visionary movies while remaining squarely focused on the mainstream. A film like Interstellar is always going to be an outlier. What we need are movies like what Life promises to be: a science fiction film of minimal ambition, but a certain amount of skill, and a willingness to copy the most obvious features of its predecessors. That’s when you’ve got a trend.

Jake Gyllenhaal in Life

The other key development is the growing market for prestige dramas on television, which is the logical home for Stranger in a Strange Land and, I think, Dune. It may be the case, as we’ve been told in connection with Star Trek: Discovery, that there isn’t a place for science fiction on a broadcast network, but there’s certainly room for it on cable. Combine this with the increased appetite for hard science fiction on film, and you’ve got precisely the conditions in which smart production companies should be snatching up the rights to Asimov, Heinlein, and the rest. Given the historically rapid rise and fall of such trends, they shouldn’t expect this window to remain open for long. (In a letter to Asimov on February 3, 1939, Frederik Pohl noted the flood of new science fiction magazines on newsstands, and he concluded: “Time is indeed of the essence…Such a condition can’t possibly last forever, and the time to capitalize on it is now; next month may be too late.”) What they’re likely to find, in the end, is that many of these stories are resistant to adaptation, and that they’re better off seeking out original material. There’s a reason that there have been so few movies derived from Heinlein and Asimov, despite the temptation that they’ve always presented. Heinlein, in particular, seems superficially amenable to the movies: he certainly knew how to write action in a way that Asimov couldn’t. But he also liked to spend the second half of a story picking apart the assumptions of the first, after sucking in the reader with an exciting beginning, and if you aren’t going to include the deconstruction, you might as well write something from scratch.

As it happens, the recent spike of action on the adaptation front has coincided with another announcement. Analog, the laboratory in which all these authors were born, is cutting back its production schedule to six double issues every year. This is obviously intended to manage costs, and it’s a reminder of how close to the edge the science fiction digests have always been. (To be fair, the change also coincides with a long overdue update of the magazine’s website, which is very encouraging. If this reflects a true shift from print to online, it’s less a retreat than a necessary recalibration.) It’s easy to contrast the game of pennies being played at the bottom with the expenditure of millions of dollars at the top, but that’s arguably how it has to be. Analog, like Astounding before it, was a machine for generating variations, which needs to be done on the cheap. Most stories are forgotten almost at once, and the few that survive the test of time are the ones that get the lion’s share of resources. All the while, the magazine persists as an indispensable form of research and development—a sort of skunk works that keeps the entire enterprise going. That’s been true since the beginning, and you can see this clearly in the lives of the writers involved. Asimov, Heinlein, Herbert, and their estates became wealthy from their work. Campbell, who more than any other individual was responsible for the rise of modern science fiction, did not. Instead, he remained in his little office, lugging manuscripts in a heavy briefcase twice a week on the train. He was reasonably well off, but not in a way that creates an empire of valuable intellectual property. Instead, he ran the lab. And we can see the results all around us.

The great scene theory


The Coronation of Napoleon by Jacques-Louis David

“The history of the world is but the biography of great men,” Thomas Carlyle once wrote, and although this statement was criticized almost at once, it accurately captures the way many of us continue to think about historical events, both large and small. There’s something inherently appealing about the idea that certain exceptional personalities—Alexander the Great, Julius Caesar, Napoleon—can seize and turn the temper of their time, and we see it today in attempts to explain, say, the personal computing revolution through the life of someone like Steve Jobs. The alternate view, which was expressed forcefully by Herbert Spencer, is that history is the outcome of impersonal social and economic forces, in which a single man or woman can do little more than catalyze trends that are already there. If Napoleon had never lived, the theory goes, someone very much like him would have taken his place. It’s safe to say that any reasonable view of history has to take both theories into account: Napoleon was extraordinary in ways that can’t be fully explained by his environment, even if he was inseparably a part of it. But it’s also worth remembering that much of our fascination with such individuals arises from our craving for narrative structures, which demand a clear hero or villain. (The major exception, interestingly, is science fiction, in which the “protagonist” is often humanity as a whole. And the transition from the hard science fiction of the golden age to messianic stories like Dune, in which the great man reasserts himself with a vengeance, is a critical turning point in the genre’s development.)

You can see a similar divide in storytelling, too. One school of thought implicitly assumes that a story is a delivery system for great scenes, with the rest of the plot serving as a scaffold to enable a handful of awesome moments. Another approach sees a narrative as a series of small, carefully chosen details designed to create an emotional effect greater than the sum of its parts. When it comes to the former strategy, it’s hard to think of a better example than Game of Thrones, a television series that often seems to be marking time between high points: it can test a viewer’s patience, but to the extent that it works, it’s because it constantly promises a big payoff around the corner, and we can expect two or three transcendent set pieces per season. Mad Men took the opposite tack: it was made up of countless tiny but riveting choices that gained power from their cumulative impact. Like the theories of history I mentioned above, neither type of storytelling is necessarily correct or complete in itself, and you’ll find plenty of exceptions, even in works that seem to fall clearly into one category or the other. It certainly doesn’t mean that one kind of story is “better” than the other. But it provides a useful way to structure our thinking, especially when we consider how subtly one theory shades into the other in practice. The director Howard Hawks famously said that a good movie consisted of three great scenes and no bad scenes, which seems like a vote for the Game of Thrones model. Yet a great scene doesn’t exist in isolation, and the closer we look at stories that work, the more important those nonexistent “bad scenes” start to become.

Leo Tolstoy

I got to thinking about this last week, shortly after I completed the series about my alternative movie canon. Looking back at those posts, I noticed that I singled out three of these movies—The Night of the Hunter, The Limey, and Down with Love—for the sake of one memorable scene. But these scenes also depend in tangible ways on their surrounding material. The river sequence in The Night of the Hunter comes out of nowhere, but it’s also the culmination of a language of dreams that the rest of the movie has established. Terence Stamp’s unseen revenge in The Limey works only because we’ve been prepared for it by a slow buildup that lasts for more than twenty minutes. And Renée Zellweger’s confessional speech in Down with Love is striking largely because of how different it is from the movie around it: the rest of the film is relentlessly active, colorful, and noisy, and her long, unbroken take stands out for how emphatically it presses the pause button. None of the scenes would play as well out of context, and it’s easy to imagine a version of each movie in which they didn’t work at all. We remember them, but only because of the less showy creative decisions that have already been made. And at a time when movies seem more obsessed than ever with “trailer moments” that can be spliced into a highlight reel, it’s important to honor the kind of unobtrusive craft required to make a movie with no bad scenes. (A plot that consists of nothing but high points can be exhausting, and a good story both delivers on the obvious payoffs and maintains our interest in the scenes when nothing much seems to be happening.)

Not surprisingly, writers have spent a lot of time thinking about these issues, and it’s noteworthy that one of the most instructive examples comes from Leo Tolstoy. War and Peace is nothing less than an extended criticism of the great man theory of history: Tolstoy brings Napoleon onto the scene expressly to emphasize how insignificant he actually is, and the novel concludes with a lengthy epilogue in which the author lays out his objections to how history is normally understood. History, he argues, is a pattern that emerges from countless unobservable human actions, like the sum of infinitesimals in calculus, and because we can’t see the components in isolation, we have to content ourselves with figuring out the laws of their behavior in the aggregate. But of course, this also describes Tolstoy’s strategy as a writer: we remember the big set pieces in War and Peace and Anna Karenina, but they emerge from the diligent, seemingly impersonal collation of thousands of tiny details, recorded with what seems like a minimum of authorial interference. (As Victor Shklovsky writes: “[Tolstoy] describes the object as if he were seeing it for the first time, an event as if it were happening for the first time.”) And the awesome moments in his novels gain their power from the fact that they arise, as if by historical inevitability, from the details that came before them. Anna Karenina was still alive at the end of the first draft, and it took her author a long time to reconcile himself to the tragic climax toward which his story was driving him. Tolstoy had good reason to believe that great scenes, like great men, are the product of invisible forces. But it took a great writer to see this.

Quote of the Day


Frank Herbert

For Dune, I also used what I call a “camera position” method—playing back and forth (and in varied orders depending on the required pace) between long shot, medium, closeup, and so on…The implications of color, position, word root, and prosodic suggestion—all are taken into account when a scene has to have maximum impact. And what scene doesn’t if a book is tightly written?

Frank Herbert

Written by nevalalee

August 27, 2015 at 7:20 am

Posted in Quote of the Day, Writing


“Make it recognizable!”


David Mamet

I’ve mentioned before how David Mamet’s little book On Directing Film rocked my world at a time when I thought I’d already figured out storytelling to my own satisfaction. It provides the best set of tools for constructing a plot I’ve ever seen, and to the extent that I can call any book a writer’s secret weapon, this is it. But I don’t think I’ve ever talked about the moment when I realized how powerful Mamet’s advice really is. The first section of the book is largely given over to a transcript of one of the author’s seminars at Columbia, in which the class breaks down the beats of a simple short film: a student approaches a teacher to request a revised grade. The crucial prop in the scene, which is told entirely without dialogue, is the student’s notebook, its contents unknown—and, as Mamet points out repeatedly, unimportant. Then he asks:

Mamet: What answer do we give to the prop person who says “what’s the notebook look like?” What are you going to say?

The students respond with a number of suggestions: put a label on it, make it look like a book report, make it look “prepared.” Mamet shoots them down one by one, saying that they’re things that the audience can’t be expected to care about, if they aren’t intrinsically impossible:

Mamet: No, you can’t make the book look prepared. You can make it look neat. That might be nice, but that’s not the most important thing for your answer to the prop person…To make it prepared, to make it neat, to make it convincing, the audience ain’t going to notice. What are they going to notice?
Student: That it’s the same book they’ve seen already.
Mamet: So what’s your answer to the prop person?
Student: Make it recognizable.
Mamet: Exactly so! Good. You’ve got to be able to recognize it. That is the most important thing about this report. This is how you use the principle of throughline to answer questions about the set and to answer questions about the costumes.

A recognizable notebook

Now, this might seem like a small thing, but to me, this was an unforgettable moment: it was a powerful illustration of how close attention to the spine of the plot—the actions and images you use to convey the protagonist’s sequence of objectives—can result in immediate, practical answers to seemingly minor story problems, as long as you’re willing to rigorously apply the rules. “Make it recognizable,” in particular, is a rule whose true value I’ve only recently begun to understand. In writing a story, regardless of the medium, you only have a finite number of details that you can emphasize, so it doesn’t hurt to focus on ones that will help the reader recognize and remember important elements—a character, a prop, an idea—when they recur over the course of the narrative. Mamet notes that you can’t expect a viewer to read signs or labels designed to explain what isn’t clear in the action, and it took me a long time to see that this is equally true of the building blocks of fiction: if the reader needs to pause to remember who a character is or where a certain object has appeared before, you haven’t done your job as well as you could.

And like the instructions a director gives to the prop department, this rule translates into specific, concrete actions that a writer can take to keep the reader oriented. It’s why I try to give my characters names that can be readily distinguished from one another, to the point where I’ll often try to give each major player a name that begins with a different letter. This isn’t true to life, where, as James Wood points out, we’re likely to know three people named John and three more named Elizabeth, but it’s a useful courtesy to the reader. The same applies to other entities within the story: it can be difficult to keep track of the alliances in a novel like Dune, but Frank Herbert helps us tremendously by giving the families distinctive names like House Atreides and House Harkonnen. (Try to guess which house contains all the bad guys.) This is also why it’s useful to give minor characters some small characteristic to lock them in the reader’s mind: we may not remember that we’ve met Robert in Chapter 3 when he returns in Chapter 50, but we’ll recall his bristling eyebrows. Nearly every choice a writer makes should be geared toward making these moments of recognition as painless as possible, without the need for labels. As Mamet says: “The audience doesn’t want to read a sign; they want to watch a motion picture.” And to be told a story.

Written by nevalalee

June 19, 2013 at 9:02 am

Why hobbits need to be short


Ian McKellen in The Hobbit: An Unexpected Journey

It’s never easy to adapt a beloved novel for the screen. On the one hand, you have a book that has been widely acclaimed as one of the greatest works of speculative fiction of all time, with a devoted fanbase and an enormous invented backstory spread across many novels and appendices. On the other, you have a genius director who moved on from his early, bizarre, low-budget features to a triumphant mainstream success with multiple Oscar nominations, but whose skills as a storyteller have sometimes been less reliable than his unquestioned visual talents. The result, after a protracted development process clouded by rights issues, financial difficulties, and the departure of the previous director, is an overlong movie with too many characters that fails to capture the qualities that drew people to this story in the first place. By trying to appease fans of the book while also drawing in new audiences, it ends up neither here nor there. While it’s cinematically striking, and has its defenders, it leaves critics mostly cold, with few of the awards or accolades that greeted its director’s earlier work. And that’s why David Lynch had so much trouble with Dune.

But it’s what Lynch did next that is especially instructive. After Dune’s financial failure, he found himself working on his fourth movie under far greater constraints, with a tiny budget and a contractual runtime of no more than 120 minutes. The initial cut ran close to three hours, but eventually, with the help of editor Duwayne Dunham, he got it down to the necessary length, although it meant losing a lot of wonderful material along the way. And what we got was Blue Velvet, which isn’t just Lynch’s best film, but my favorite American movie of all time. I recently had the chance to watch all of the deleted scenes as part of the movie’s release on Blu-ray, and it’s clear that if Lynch had been allowed to retain whatever footage he wanted—as he clearly does these days—the result would have been a movie like Inland Empire: fascinating, important, but ultimately a film that I wouldn’t need to see more than once. The moral, surprisingly enough, is that even a director like Lynch, a genuine artist who has earned the right to pursue his visions wherever they happen to take him, can benefit from the need, imposed by a studio, to cut his work far beyond the level where he might have been comfortable.

Kyle MacLachlan in Blue Velvet

Obviously, the case of Peter Jackson is rather different. The Lord of the Rings trilogy was an enormous international success, and did as much as anything to prove that audiences will still sit happily through a movie of more than three hours if the storytelling is compelling enough. As a result, Jackson was able to make The Hobbit: An Unexpected Journey as long as he liked, which is precisely the problem. The Hobbit isn’t a bad movie, exactly; after an interminable first hour, it picks up considerably in the second half, and there are still moments I’m grateful to have experienced on the big screen. Yet I can’t help feeling that if Jackson had felt obliged, either contractually or artistically, to bring it in at under two hours, it would have been vastly improved. This would have required some hard choices, but even at a glance, there are entire sequences here that never should have made it past a rough cut. As it stands, we’re left with a meandering movie that trades largely on our affection for the previous trilogy—its actors, its locations, its music. And if this had been the first installment of a series, it’s hard to imagine it making much of an impression on anyone. Indeed, it might have justified all our worst fears about a cinematic adaptation of Tolkien.

And the really strange thing is that Jackson has no excuse. For one thing, it isn’t the first time he’s done this: I loved King Kong, but I still feel that it would have been rightly seen as a game changer on the level of Avatar if he’d cut it by even twenty minutes. And unlike David Lynch and Blue Velvet, whose deleted scenes remained unseen for decades before being miraculously rediscovered, Jackson knows that even if he has to cut a sequence he loves, he has an audience of millions that will gladly purchase the full extended edition within a year of the movie’s release. But it takes a strong artistic will to accept such constraints if they aren’t being imposed from the outside, and to acknowledge that sometimes an arbitrary limit is exactly what you need to force yourself to make those difficult choices. (My own novels are contractually required to come in somewhere around 100,000 words, and although I’ve had to cut them to the bone to get there, they’ve been tremendously improved by the process, to the point where I intend to impose the same limit on everything I ever write.) The Hobbit has two more installments to go, and I hope Jackson takes the somewhat underwhelming critical and commercial response to the first chapter to heart. Because an unwillingness to edit your work is a hard hobbit to break.

So what happened to John Carter?


In recent years, the fawning New Yorker profile has become the Hollywood equivalent of the Sports Illustrated cover—a harbinger of bad times to come. It isn’t hard to figure out why: both are awarded to subjects who have just reached the top of their game, which often foreshadows a humbling crash. Tony Gilroy was awarded a profile after the success of Michael Clayton, only to follow it up with the underwhelming Duplicity. For Steve Carell, it was Dinner for Schmucks. For Anna Faris, it was What’s Your Number? And for John Lasseter, revealingly, it was Cars 2. The latest casualty is Andrew Stanton, whose profile, which I discussed in detail last year, now seems laden with irony, as well as an optimism that reads in retrospect as whistling in the dark. “Among all the top talent here,” a Pixar executive is quoted as saying, “Andrew is the one who has a genius for story structure.” And whatever redeeming qualities John Carter may have, story structure isn’t one of them. (The fact that Stanton claims to have closely studied the truly awful screenplay for Ryan’s Daughter now feels like an early warning sign.)

If nothing else, the making of John Carter will provide ample material for a great case study, hopefully along the lines of Julie Salamon’s classic The Devil’s Candy. There are really two failures here, one of marketing, another of storytelling, and even the story behind the film’s teaser trailer is fascinating. According to Vulture’s Claude Brodesser-Akner, a series of lost battles and miscommunications led to the release of a few enigmatic images devoid of action and scored, in the manner of an Internet fan video, with Peter Gabriel’s dark cover of “My Body is a Cage.” And while there’s more to the story than this—I actually found the trailer quite evocative, and negative responses to early marketing materials certainly didn’t hurt Avatar—it’s clear that this was one of the most poorly marketed tentpole movies in a long time. It began with the inexplicable decision to change the title from John Carter of Mars, on the assumption that women are turned off by science fiction, while making no attempt to lure in female viewers with the movie’s love story or central heroine, or even to explain who John Carter is. This is what happens when a four-quadrant marketing campaign goes wrong: when you try to please everybody, you please no one.

And the same holds true of the movie itself. While the story is fairly clear, and Stanton and his writers keep us reasonably grounded in the planet’s complex mythology, we’re never given any reason to care. Attempts to engage us with the central characters fall curiously flat: to convey that Princess Dejah is smart and resourceful, for example, the film shows her inventing the Barsoomian equivalent of nuclear power, evidently in her spare time. John Carter himself is a cipher. And while some of these problems might have been solved by miraculous casting, the blame lands squarely on Stanton’s shoulders. Stanton clearly loves John Carter, but forgets to persuade us to love him as well. What John Carter needed, more than anything else, was a dose of the rather stark detachment that I saw in Mission: Impossible—Ghost Protocol, as directed by Stanton’s former Pixar colleague Brad Bird. Bird clearly had no personal investment in the franchise, except to make the best movie he possibly could. John Carter, by contrast, falls apart on its director’s passion and good intentions, as well as a creative philosophy that evidently works in animation, but not live action. As Stanton says of Pixar:

We’re in this weird, hermetically sealed freakazoid place where everybody’s trying their best to do their best—and the films still suck for three out of the four years it takes to make them.

Which only makes us wonder what might have happened if John Carter had been granted a fourth year.

Stanton should take heart, however. If there’s one movie that John Carter calls to mind, it’s Dune, another financial and critical catastrophe that was doomed—as much as I love it—by fidelity to its source material. (In fact, if you take Roger Ebert’s original review of Dune, which came out in 1984, and replace the relevant proper names, you end up with something remarkably close to a review of John Carter: “Actors stand around in ridiculous costumes, mouthing dialogue with little or no context.”) Yet its director not only recovered, but followed it up with my favorite movie ever made in America. Failure, if it results in another chance, can be the opposite of the New Yorker curse. And while Stanton may not be David Lynch, he’s not without talent: the movie’s design is often impressive, especially its alien effects, and it displays occasional flashes of wit and humor that remind us of what Stanton can do. John Carter may go on record as the most expensive learning experience in history, and while this may be cold comfort to Disney shareholders, it’s not bad for the rest of us, as long as Stanton gets his second chance. Hopefully far away from the New Yorker.

Written by nevalalee

March 15, 2012 at 10:31 am