Alec Nevala-Lee

Thoughts on art, creativity, and the writing life.

Posts Tagged ‘Man of Steel’

The way of the hedgehog



“Jean Renoir once suggested that most true creators have only one idea and spend their lives reworking it,” the director Peter Greenaway said in an interview a quarter of a century ago. “But then very rapidly he added that most people don’t have any ideas at all, so one idea is pretty amazing.” I haven’t been able to find the original version of this quote, but it remains true enough even if we attribute it to Greenaway himself, who might otherwise not seem to have much in common with Renoir. Over time, I’ve come to sympathize with the notion that the important thing for an artist is to have an idea, as long as it’s a good one. This wasn’t always how I felt. In college, I was deeply impressed by Isaiah Berlin’s The Hedgehog and the Fox, in which he drew a famous contrast between writers who are hedgehogs, with one overarching obsession that they pursue for all their lives, and the foxes who move restlessly from one idea to another. (Berlin took his inspiration from a fragment of Archilochus—“The fox knows many things, but the hedgehog knows one big thing”—which may mean nothing more than the fact that the fox, for all its cleverness, is ultimately defeated by the hedgehog’s one good defense of rolling itself into a ball.) My natural loyalty at the time was to such foxes as Shakespeare, Joyce, and Pushkin, as much as I came to love such hedgehogs as Dante and Proust. That’s probably how it should be at twenty, when most of us, as Berlin writes, “lead lives, perform acts, and entertain ideas that are centrifugal rather than centripetal.”

Even at the time, however, I sensed that there was a difference between a truly omnivorous intelligence and the simple inability to make up one’s mind. And as I’ve grown older, I’ve begun to feel more respect for the hedgehogs. (It’s worth noting, by the way, that this classification really only makes sense when applied to exceptional creative geniuses. For the rest of us, identifying as a fox is more likely to become an excuse for a lack of fixed ideas, while a hedgehog’s perspective can become indistinguishable from tunnel vision. I’m neither a hedgehog nor a fox, and neither are you—we’re just trying to muddle along and make sense of the world as best we can.) It takes courage to devote your entire career to a single idea, more so, in many ways, than building it around the act of creation itself. Neither approach is inherently better than the other, but both have their associated pitfalls. When you stick to one idea, you run the obvious risk of being unable to change your mind even if you’re wrong, and of distorting the evidence around you to fit your preconceived notions. But the danger of throwing in your lot with process is no less real. It can result in the sort of empty technical facility that levels all values until they become indistinguishable, and it can lead you astray just as surely as a fixation on a single argument can. These wrong turns may last just a year or two, rather than a lifetime, but a life made up of twenty dead ends in succession isn’t that much different from one spent tunneling for decades in the wrong direction. You wind up repeating the same behaviors in an endless cycle of tiny variations, and if it were a movie, you could call it Hedgehog Day.


I don’t mean to denigrate the acquisition of technical experience, which is a difficult and honorable calling in itself. But it’s necessary to remember that once we become competent in any art, the skills that we’ve acquired are largely fungible, and we become part of a stratum of practitioners who are mostly interchangeable with others at the same level. You can see this most clearly in the movies, which is the medium in which financial and market pressures tend to equalize talent the most ruthlessly. It’s rare to see a film these days that isn’t shot, lit, mixed, and scored with a high degree of proficiency, simply because the competition within those fields is so intense, and based solely on ability, that any movie with a reasonable budget can get excellent craftspeople to fill those roles. It’s in the underlying idea and its execution that films tend to fall short. (There are countless examples, but the one that has been on my mind the most is Batman v. Superman. There’s a perfectly legitimate story that could be told by a film of that title—in which Superman stands for unyielding law and order and Batman represents a more ambiguous form of vigilante justice—but the movie, for whatever reason, declines to use it. Instead, it tries to graft its showdown onto the alien messiah narrative of Man of Steel, which isn’t a bad concept in itself: it just happens to be fundamentally incompatible with the ethical conflict between these two superheroes. Zack Snyder has a great eye, the cast is excellent, and the technical elements are all exquisite. But it’s a movie so misconceived that it could only have been saved by throwing out the entire script and starting again.)

Good ideas, as I’ve often said before, are cheap, but the ones worthy of fueling a great novel or movie or even a lifetime are indescribably precious, and the whole point of developing technical proficiency is to defend those ideas from those who would destroy them, even inadvertently. There’s a reason why screenwriting is the one aspect of filmmaking that doesn’t seem to have advanced at all over the last century. It’s because most studio executives wouldn’t dream of trying to interfere with sound mixing, lighting, or cinematography, but they also believe that their story ideas are as good as anyone else’s. This attitude is particularly stark in the movies, but it’s present in almost any field where ideas are evaluated less on their own merits than on their convenience to the structures that are already in place. We claim to value ideas, but we’re all too willing to drop or ignore uncomfortable truths, or, even more damagingly, to quietly replace them with their counterfeit equivalents. Even a hedgehog needs to be something of a fox to keep an idea alive in the face of all the forces that would oppose it or kill it with indifference. Not every belief is worth fighting or dying for, and history is full of otherwise capable men and women—John W. Campbell among them—who sacrificed their reputations on the altar of an unexamined idea. We need to be willing to change course in light of new evidence and to be as crafty as Odysseus to find our way home. But all that cleverness and tenacity and tactical brilliance become worthless if they aren’t given shape by a clear vision, even if it’s a modest one. Not all of us can be hedgehogs or foxes. But we can’t afford to be ostriches, either.

Pictures at an exhibition



Note: Every Friday, The A.V. Club, my favorite pop cultural site on the Internet, throws out a question to its staff members for discussion, and I’ve decided that I want to join in on the fun. This week’s topic: “What piece of art has actually stopped you in your tracks?”

“All art constantly aspires toward the condition of music,” Walter Pater famously said, but these days, it seems more accurate to say that all art aspires toward the condition of advertising. There’s always been a dialogue between the two, of course, and it runs in both directions, with commercials and print ads picking up on advances in the fine arts, even as artists begin to utilize techniques initially developed on Madison Avenue. Advertising is a particularly ruthless medium—you have only a few seconds to grab the viewer’s attention—and the combination of quick turnover, rapid feedback, and intense financial pressure allows innovations to be adapted and refined with blinding speed, at least within a certain narrow range. (There’s a real sense in which the hard lessons that Jim Henson, say, learned while shooting commercials for Wilkins Coffee are what made Sesame Street so successful.) The difference today is that the push for virality—the need to attract eyeballs in brutal competition with countless potential diversions—has superseded all other considerations, including the ability to grow and maintain an audience. When thousands of “content providers” are fighting for our time on equal terms, there’s no particular reason to remain loyal to any one of them. Everything is an ad now, and it’s selling nothing but itself.

This isn’t a new idea, and I’ve written about it here at length before. What really interests me, though, is how even the most successful examples of storytelling are judged by how effectively they point to some undefined future product. The Marvel movies are essentially commercials or trailers for the idea of a superhero film: every installment builds to a big, meaningless battle that serves as a preview for the confrontation in an upcoming sequel, and we know that nothing can ever truly upset the status quo when the studio’s slate of tentpole releases has already been announced well into the next decade. They aren’t bad films, but they’re just ever so slightly better than they have to be, and I don’t have much of an interest in seeing any more. (Man of Steel has plenty of problems, but at least it represents an actual point of view and an attempt to work through its considerable confusions, and I’d sooner watch it again than The Avengers.) Marvel is fortunate enough to possess one of the few brands capable of maintaining an audience, and it’s petrified at the thought of losing it with anything so upsetting as a genuine surprise. And you can’t blame anyone involved. As Christopher McQuarrie aptly puts it, everyone in Hollywood is “terribly lost and desperately in need of help,” and the last thing Marvel or Disney wants is to turn one of the last reliable franchises into anything less than a predictable stream of cash flows. The pop culture pundits who criticize it—many of whom may not have jobs this time next year—should be so lucky.


But it’s unclear where this leaves the rest of us, especially with the question of how to catch the viewer’s eye while inspiring an engagement that lasts. The human brain is wired in such a way that the images or ideas that seize its attention most easily aren’t likely to retain it over the long term: the quicker the impression, the sooner it evaporates, perhaps because it naturally appeals to our most superficial impulses. Which only means that it’s worth taking a close look at works of art that both capture our interest and reward it. It’s like going to an art gallery. You wander from room to room, glancing at most of the exhibits for just a few seconds, but every now and then, you see something that won’t let go. Usually, it only manages to intrigue you for the minute it takes to read the explanatory text beside it, but occasionally, the impression it makes is a lasting one. Speaking from personal experience, I can think of two revelatory moments in which a glimpse of a picture out of the corner of my eye led to a lifelong obsession. One was Cindy Sherman’s Untitled Film Stills; the other was the silhouette work of Kara Walker. They could hardly be more different, but both succeed because they evoke something to which we instinctively respond—movie archetypes and clichés in Sherman’s case, classic children’s illustrations in Walker’s—and then force us to question why they appealed to us in the first place.

And they manage to have it both ways to an extent that most artists would have reason to envy. Sherman’s film stills both parody and exploit the attitudes that they meticulously reconstruct: they wouldn’t be nearly as effective if they didn’t also serve as pin-ups for readers of Art in America. Similarly, Walker’s cutouts fill us with a kind of uneasy nostalgia for the picture books we read growing up, even as they investigate the darkest subjects imaginable. (They also raise fascinating questions about intentionality. Sherman, like David Lynch, can come across as a naif in interviews, while Walker is closer to Michael Haneke, an artist who is nothing if not completely aware of how each effect was achieved.) That strange combination of surface appeal and paradoxical depth may be the most promising angle of attack that artists currently have. You could say much the same about Vijith Assar’s recent piece for McSweeney’s about ambiguous grammar, which starts out as the kind of viral article that we all love to pass around—the animated graphics, the prepackaged nuggets of insight—only to end on a sweet sucker punch. The future of art may lie in forms that seize on the tools of virality while making us think twice about why we’re tempted to click the share button. And it requires artists of unbelievable virtuosity, who are able to exactly replicate the conditions of viral success while infusing them with a white-hot irony. It isn’t easy, but nothing worth doing ever is. This is the game we’re all playing, like it or not, and the artists who are most likely to survive are the ones who can catch the eye while also burrowing into the brain.

“And what does that name have to do with this?”


"The word on the side of your yacht..."

Note: This post is the thirtieth installment in my author’s commentary for Eternal Empire, covering Chapter 29. You can read the previous installments here.

Earlier this week, in response to a devastating article in the New York Times on the allegedly crushing work environment in Amazon’s corporate offices, Jeff Bezos sent an email to employees that included the following statement:

[The article] claims that our intentional approach is to create a soulless, dystopian workplace where no fun is had and no laughter is heard. Again, I don’t recognize this Amazon and I very much hope you don’t, either…I strongly believe that anyone working in a company that really is like the one described in the [Times] would be crazy to stay. I know I would leave such a company.

Predictably, the email resulted in numerous headlines along the lines of “Jeff Bezos to Employees: You Don’t Work in a Dystopian Hellscape, Do You?” Bezos, a very smart guy, should have seen it coming. As Richard Nixon learned a long time ago, whenever you tell people that you aren’t a crook, you’re really raising the possibility that you might be. If you’re concerned about the names that your critics might call you, the last thing you want to do is put words in their mouths—it’s why public relations experts advise their clients to avoid negative language, even in the form of a denial—and saying that Amazon isn’t a soulless, dystopian workplace is a little like asking us not to think of an elephant.

Writers have recognized the negative power of certain loaded terms for a long time, and many works of art go out of their way to avoid such words, even if they’re central to the story. One of my favorite examples is the film version of The Girl With the Dragon Tattoo. Coming off Seven and Zodiac, David Fincher didn’t want to be pigeonholed as a director of serial killer movies, so the dialogue exclusively uses the term “serial murderer,” although it’s doubtful how effective this was. Along the same lines, Christopher Nolan’s superhero movies are notably averse to calling their characters by their most famous names: The Dark Knight Rises never uses the name “Catwoman,” while Man of Steel, which Nolan produced, avoids “Superman,” perhaps following the example of Frank Miller’s The Dark Knight Returns, which indulges in similar circumlocutions. Robert Towne’s script for Greystoke never calls its central character “Tarzan,” and The Walking Dead uses just about every imaginable term for its creatures aside from “zombie,” for reasons that creator Robert Kirkman explains:

One of the things about this world is that…they’re not familiar with zombies, per se. This isn’t a world [in which] the Romero movies exist, for instance, because we don’t want to portray it that way…They’ve never seen this in pop culture. This is a completely new thing for them.

"And what does that name have to do with this?"

Kirkman’s reluctance to call anything a zombie, which has inspired an entire page on TV Tropes dedicated to similar examples, is particularly revealing. A zombie movie can’t use that word because an invasion of the undead needs to feel like something unprecedented, and falling back on a term we know conjures up all kinds of pop cultural connotations that an original take might prefer to avoid. In many cases, avoiding particular words subtly encourages us to treat the story on its own terms. In The Godfather, the term “Mafia” is never uttered—an aversion, incidentally, not shared by the original novel, the working title of which was actually Mafia. This quietly allows us to judge the Corleones according to the rules of their own closed world, and it circumvents any real reflection about what the family business actually involves. (According to one famous story, the mobster Joseph Colombo paid a visit to producer Al Ruddy, demanding that the word be struck from the script as a condition for allowing the movie to continue. Ruddy, who knew that the screenplay only used the word once, promptly agreed.) The Godfather Part II is largely devoted to blowing up the first movie’s assumptions, and when the word “Mafia” is uttered at a senate hearing, it feels like the real world intruding on a comfortable fantasy. And the moment wouldn’t be as effective if the first installment hadn’t been as diligent about avoiding the term, allowing it to build a new myth in its place.

While writing Eternal Empire, I found myself confronting a similar problem. In this case, the offending word was “Shambhala.” As I’ve noted before, I decided early on that the third novel in the series would center on the Shambhala myth, a choice I made as soon as I stumbled across an excerpt from Rachel Polonsky’s Molotov’s Magic Lantern, in which she states that Vladimir Putin had taken a particular interest in the legend. A little research, notably in Andrei Znamenski’s Red Shambhala, confirmed that the periodic attempts by Russia to confirm the existence of that mythical kingdom, carried out in an atmosphere of espionage and spycraft in Central Asia, were a rich vein of material. The trouble was that the word “Shambhala” itself was so loaded with New Age connotations that I’d have trouble digging my way out from under it: a quick search online reveals that it’s the name of a string of meditation centers, a music festival, and a spa with its own line of massage oils, none of which is exactly in keeping with the tone that I was trying to evoke. My solution, predictably, was to structure the whole plot around the myth of Shambhala while mentioning it as little as possible: the name appears perhaps thirty times across four hundred pages. (The mythological history of Shambhala is treated barely at all, and most of the references occur in discussions of the real attempts by Russian intelligence to discover it.) The bulk of those references appear here, in Chapter 29, and I cut them all down as much as possible, focusing on the bare minimum I needed for Maddy to pique Tarkovsky’s interest. I probably could have cut them even further. But as it stands, it’s more or less enough to get the story to where it needs to be. And it doesn’t need to be any longer than it is…

Raising the stakes



If there’s one note that nearly every writer gets from an editor or reader at one point or another, it’s this: “Raise the stakes.” What makes this note so handy from a reader’s point of view—and beyond infuriating for the writer who receives it—is that it’s never wrong, and it doesn’t require much in the way of close reading or analysis of the story itself. The stakes in a story could always be a little higher, and it’s hard for an author to make a case that he’s calibrated the stakes just right, or that the story wouldn’t benefit from some additional risk or tension. It’s such a common note, in fact, that it’s turned into a running joke among screenwriters. In the commentary track for the Simpsons episode “Natural Born Kissers,” for instance, the legendary comedy writer George Meyer watches a scene in which Homer and Marge need to drive to the store to buy a new motor for their broken refrigerator, and he drily notes: “This is what’s known as ‘raising the stakes.'”

And the fact that development executives can give this note so unthinkingly explains a lot about the movies. Recently, the New York Times reporter Brooks Barnes circulated a fake proposal for an action movie called Red, White and Blood to a number of Hollywood insiders to see what they had to say. The response from producer Lynda Obst is particularly interesting:

The stakes need to be much, much higher. A gun battle? How cute. We need hotter weapons. Huge, big battle weapons—maybe an end-of-the-world device.

Hence the fact that every superhero movie seems to end with a crisis that threatens to wipe out all of humanity, or at least most of Gotham City. In itself, this isn’t necessarily a bad thing: the lack of a credible threat is part of what makes Superman Returns, for all its good intentions, a bit of a snooze. But after a while, the stakes become so high that they’re almost abstract. The final battle in The Avengers is theoretically supposed to determine the fate of the world, but it still comes down to our heroes fighting a bunch of aliens on flying scooters outside Grand Central Station.


Really, though, the problem isn’t raising the stakes, but finding ways to express them in immediate human terms. Take the ending of Man of Steel. After an epic fistfight that destroys entire skyscrapers and probably costs thousands of lives, the struggle between Zod and Superman comes down to the fate of a handful of innocent bystanders—also staged, interestingly enough, in Grand Central Station. In principle, a few more casualties shouldn’t matter much either way, but they do: it’s an undeniably powerful moment in a movie in which the emotional side is often puzzlingly opaque. And it isn’t hard to see why. Instead of the legions of digitized fatalities in a Michael Bay movie, we’re given a good look at a handful of real people. We’re close enough to see the fear on their faces, and we care. (One suspects that Snyder and Nolan took a cue from Richard Donner’s original Superman movie, in which the destruction of most of California seems insignificant compared to what happens to Lois Lane.)

And maybe it’s time filmmakers—and other storytellers—gave the world a break. In his great Biographical Dictionary of Film, David Thomson notes of Howard Hawks:

Like Monet forever painting lilies or Bonnard always re-creating his wife in her bath, Hawks made only one artwork. It is the principle of that movie that men are more expressive rolling a cigarette than saving the world.

Aside from the fact that Disney isn’t likely to show any of its Marvel characters smoking, this is still good advice to follow. You can raise the stakes as high as you want, but as disaster movies like 2012 have shown, you can destroy the entire planet and we still won’t care if you don’t give us characters to care about. Like most notes from readers, “raising the stakes” is less a way of solving a problem than an indication that deeper issues may lie elsewhere. And the real solution isn’t to blow up the world, or introduce hotter weapons, but to slow things down, show us a recognizable human being with needs we can understand, and maybe even let him roll a cigarette or two.

Written by nevalalee

July 10, 2013 at 9:17 am

Man and supermen



I’m starting to come to terms with an uncomfortable realization: I don’t much like The Avengers. Watching it again recently on Netflix, I was impressed by how fluidly it constructs an engaging movie out of so many prefabricated parts, but I couldn’t help noticing how arbitrary much of it seems. The second act, in particular, feels like it’s killing time, and nothing seems all that essential: it clips along nicely, but the action scenes follow on one another without building, and the stakes never feel especially high, even as the fate of the world hangs in the balance. And I don’t think this is Joss Whedon’s fault. He comes up with an entertaining package, but he’s caught between the need to play with all the toys he’s been given and the obligation to deliver them intact to their next three movies. Each hero has his or her own franchise where the real story development takes place, so The Avengers begins to play like a sideshow, rather than the main event it could have been. This is a story about these characters, not the story, and for all its color and energy, it’s a movie devoted to preserving the status quo. (Even its most memorable moment seems to have been retconned out of existence by the upcoming Agents of S.H.I.E.L.D.)

And while it may seem pointless to worry about this now, I think it’s worth asking what kind of comic book movies we really want, now that it seems that they’re going to dominate every summer for the foreseeable future. I’ve been pondering this even more since finally seeing Man of Steel, which I liked a lot. It has huge problems, above all the fact that its vision of Superman never quite comes into focus: by isolating him from his supporting cast for much of the movie, it blurs his identity to the point where major turning points, like his decision to embrace his role as a hero, flit by almost unnoticed. Yet once it ditches its awkward flashback structure, the movie starts to work, and its last hour has a real sense of awe, scale, and danger. And I’m looking forward to the inevitable sequel, even if it remains unclear if Henry Cavill—much less Zack Snyder or Christopher Nolan—can give the scenes set at the Daily Planet the necessary zest. At their best, the Superman films evoke a line of classic newspaper comedies that extends back to His Girl Friday and even Citizen Kane, and it’s in his ability to both wear the suit and occupy the skin of Clark Kent that Christopher Reeve is most sorely missed.


If nothing else, Man of Steel at least has a point of view about its material, however clouded it might be, which is exactly what most of the Marvel Universe movies are lacking. At this point, when dazzling special effects can be taken for granted, what we need more than anything is a perspective toward these heroes that doesn’t feel as if it were dictated solely by a marketing department. Marvel itself doesn’t have much of an incentive to change its way of doing business: it’s earned a ton of money with this approach, and these movies have made a lot of people happy. But I’d still rather watch Chris Nolan’s Batman films, or even an insanity like Watchmen or Ang Lee’s Hulk, than yet another impersonal raid on the Marvel toy chest. Whedon himself is more than capable of imposing an idiosyncratic take on his projects, and even though it only intermittently comes through in The Avengers itself, I’m hopeful that its success will allow him to express himself more clearly in the future—which is one reason why I’m looking forward to Agents of S.H.I.E.L.D., which seems more geared toward his strengths.

And although I love Nolan’s take on the material, it doesn’t need to be dark, or even particularly ambitious. For an illustration, we need look no further than Captain America, which increasingly seems to me like the best of the Marvel movies. Joe Johnston’s Spielberg imitation is the most credible we’ve seen in a long time—even better, in many ways, than Spielberg himself has managed recently with similar material—and you can sense his joy at being given a chance to make his own Raiders knockoff. Watching it again last night, even on the small screen, I was utterly charmed by almost every frame. It’s a goof, but charged with huge affection toward its sources, and I suspect that it will hold up better over time than anyone could have anticipated. Unfortunately, it already feels like an anomaly. Much of its appeal is due to the period setting, which we’ve already lost for the sequel, and it looks like we’ve seen the last of Hugo Weaving’s Red Skull, who may well turn out to be the most memorable villain the Marvel movies will ever see. Marvel’s future is unlikely to be anything other than hugely profitable for all concerned, but it’s grown increasingly less interesting.

Written by nevalalee

July 9, 2013 at 8:54 am
